<?xml version='1.0' encoding='UTF-8'?><?xml-stylesheet href='static/style.xsl' type='text/xsl'?><OAI-PMH xmlns="http://www.openarchives.org/OAI/2.0/" xmlns:xsi="http://www.w3.org/2001/XMLSchema-instance" xsi:schemaLocation="http://www.openarchives.org/OAI/2.0/ http://www.openarchives.org/OAI/2.0/OAI-PMH.xsd"><responseDate>2026-04-22T10:22:38Z</responseDate><request verb="GetRecord" identifier="oai:ebiltegia.mondragon.edu:20.500.11984/6567" metadataPrefix="rdf">https://ebiltegia.mondragon.edu/oai/request</request><GetRecord><record><header><identifier>oai:ebiltegia.mondragon.edu:20.500.11984/6567</identifier><datestamp>2024-07-03T06:15:35Z</datestamp><setSpec>com_20.500.11984_473</setSpec><setSpec>col_20.500.11984_478</setSpec></header><metadata><rdf:RDF xmlns:rdf="http://www.openarchives.org/OAI/2.0/rdf/" xmlns:ow="http://www.ontoweb.org/ontology/1#" xmlns:dc="http://purl.org/dc/elements/1.1/" xmlns:ds="http://dspace.org/ds/elements/1.1/" xmlns:xsi="http://www.w3.org/2001/XMLSchema-instance" xmlns:doc="http://www.lyncode.com/xoai" xsi:schemaLocation="http://www.openarchives.org/OAI/2.0/rdf/ http://www.openarchives.org/OAI/2.0/rdf.xsd">
   <ow:Publication rdf:about="oai:ebiltegia.mondragon.edu:20.500.11984/6567">
      <dc:title>Towards manufacturing robotics accuracy degradation assessment: A vision-based data-driven implementation</dc:title>
      <dc:creator>Izagirre, Unai</dc:creator>
      <dc:creator>Andonegui, Imanol</dc:creator>
      <dc:creator>Eciolaza, Luka</dc:creator>
      <dc:creator>Zurutuza, Urko</dc:creator>
      <dc:subject>robot health monitoring</dc:subject>
      <dc:subject>industrial robot</dc:subject>
      <dc:subject>PHM</dc:subject>
      <dc:subject>machine learning</dc:subject>
      <dc:subject>augmented reality</dc:subject>
      <dc:description>In this manuscript we report on a vision-based data-driven methodology for industrial robot health assessment. We provide experimental evidence of the usefulness of our methodology on a system comprising a 6-axis industrial robot, two monocular cameras and five binary squared fiducial markers. The fiducial marker system allows the deviation of the end-effector to be accurately tracked along a fixed non-trivial trajectory. Moreover, we monitor the trajectory deflection using three gradually increasing weights attached to the end-effector. When the robot is loaded with the maximum allowed payload, a deviation of 0.77 mm is identified in the Z-coordinate of the end-effector. Using the traced trajectory information, we train five supervised learning regression models. These models are afterwards used to predict the deviation of the end-effector from the pose estimation provided by the visual tracking system. As a result of this study, we show that this procedure is a stable, robust, rigorous and reliable tool for robot trajectory deviation estimation, and it even allows identification of the mechanical element producing non-kinematic errors.</dc:description>
      <dc:date>2024-07-02T09:28:30Z</dc:date>
      <dc:date>2024-07-02T09:28:30Z</dc:date>
      <dc:date>2021</dc:date>
      <dc:type>http://purl.org/coar/resource_type/c_6501</dc:type>
      <dc:identifier>0736-5845</dc:identifier>
      <dc:identifier>https://katalogoa.mondragon.edu/janium-bin/janium_login_opac.pl?find&amp;ficha_no=159488</dc:identifier>
      <dc:identifier>https://hdl.handle.net/20.500.11984/6567</dc:identifier>
      <dc:language>eng</dc:language>
      <dc:rights>© 2021 Elsevier</dc:rights>
      <dc:publisher>Elsevier</dc:publisher>
   </ow:Publication>
</rdf:RDF></metadata></record></GetRecord></OAI-PMH>