
Combining haptics and inertial motion capture to enhance remote control of a dual-arm robot

RiuNet: Institutional Repository of the Universidad Politécnica de Valencia

dc.contributor.author Girbés-Juan, Vicent es_ES
dc.contributor.author Schettino, Vinicius es_ES
dc.contributor.author Gracia Calandin, Luis Ignacio es_ES
dc.contributor.author Solanes, J. Ernesto es_ES
dc.contributor.author Demiris, Yiannis es_ES
dc.contributor.author Tornero, Josep es_ES
dc.date.accessioned 2023-12-15T19:01:19Z
dc.date.available 2023-12-15T19:01:19Z
dc.date.issued 2022-06 es_ES
dc.identifier.issn 1783-7677 es_ES
dc.identifier.uri http://hdl.handle.net/10251/200802
dc.description.abstract [EN] High dexterity is required in tasks involving contact between objects, such as surface conditioning (wiping, polishing, scuffing, sanding, etc.), especially when the location of the objects involved is unknown or highly inaccurate because they are moving, as with a car body on an automotive production line. These applications require both human adaptability and robot accuracy. However, sharing the same workspace is not possible in most cases due to safety issues. Hence, this work introduces a multi-modal teleoperation system combining haptics and an inertial motion capture system. The human operator gets the sense of touch through haptic feedback, while the motion capture device allows more naturalistic movements. Visual feedback assistance is also introduced to enhance immersion. A Baxter dual-arm robot is used to offer greater flexibility and manoeuvrability, allowing two independent operations to be performed simultaneously. Several tests have been carried out to assess the proposed system. As the experimental results show, the proposed teleoperation method reduces task duration and improves overall performance. es_ES
dc.description.sponsorship This research was funded by Generalitat Valenciana (Grants GV/2021/074 and GV/2021/181) and by the Spanish Government (Grants PID2020-118071GB-I00 and PID2020-117421RB-C21 funded by MCIN/AEI/10.13039/501100011033). This work was also supported by Coordenação de Aperfeiçoamento de Pessoal de Nível Superior (CAPES Brasil) under Finance Code 001, by CEFET-MG, and by a Royal Academy of Engineering Chair in Emerging Technologies to YD. es_ES
dc.language English es_ES
dc.publisher Springer-Verlag es_ES
dc.relation.ispartof Journal on Multimodal User Interfaces es_ES
dc.rights Attribution (by) es_ES
dc.subject Multimodal teleoperation es_ES
dc.subject Haptic feedback es_ES
dc.subject Motion capture es_ES
dc.subject Dual-arm robotics es_ES
dc.subject Collaborative robot es_ES
dc.subject Surface conditioning es_ES
dc.subject.classification INGENIERIA DE SISTEMAS Y AUTOMATICA es_ES
dc.title Combining haptics and inertial motion capture to enhance remote control of a dual-arm robot es_ES
dc.type Article es_ES
dc.identifier.doi 10.1007/s12193-021-00386-8 es_ES
dc.relation.projectID info:eu-repo/grantAgreement/AEI/Plan Estatal de Investigación Científica y Técnica y de Innovación 2013-2016/DPI2017-87656-C2-1-R/ES/VISION ARTIFICIAL Y ROBOTICA COLABORATIVA EN PULIDO DE SUPERFICIES EN LA INDUSTRIA/ es_ES
dc.relation.projectID info:eu-repo/grantAgreement/GENERALITAT VALENCIANA//AEST%2F2019%2F010//AYUDA ESTANCIA EN EMPRESA FORD ESPAÑA S.A. "ROBOTICA INDUSTRIAL/COLABORATIVA EN EL PROCESO DE LIJADO/PULIDO DE CARROCERIAS DE AUTOMOVIL"/ es_ES
dc.relation.projectID info:eu-repo/grantAgreement/AEI/Plan Estatal de Investigación Científica y Técnica y de Innovación 2017-2020/PID2020-117421RB-C21/ES/PULIDO ROBOTIZADO AVANZADO DE SUPERFICIES EN LA INDUSTRIA DEL AUTOMOVIL/ es_ES
dc.relation.projectID info:eu-repo/grantAgreement/GENERALITAT VALENCIANA//AEST%2F2021%2F079//ESTANCIA ALFATEC. DETECCIÓN Y CLASIFICACIÓN DE DEFECTOS.../ es_ES
dc.relation.projectID info:eu-repo/grantAgreement/AEI/Plan Estatal de Investigación Científica y Técnica y de Innovación 2017-2020/PID2020-118071GB-I00/ES/APRENDIZAJE AUTOMATICO BIOINSPIRADO/ es_ES
dc.relation.projectID info:eu-repo/grantAgreement/GENERALITAT VALENCIANA//GV%2F2021%2F181//Interacción humano-robot avanzada basada en realidad mixta y fusión sensorial para operaciones de tratamiento de superficies de productos manufacturados./ es_ES
dc.relation.projectID info:eu-repo/grantAgreement/GVA//GV%2F2021%2F074/ es_ES
dc.relation.projectID info:eu-repo/grantAgreement/CAPES//001/ es_ES
dc.rights.accessRights Open access es_ES
dc.contributor.affiliation Universitat Politècnica de València. Escuela Politécnica Superior de Alcoy - Escola Politècnica Superior d'Alcoi es_ES
dc.contributor.affiliation Universitat Politècnica de València. Instituto de Diseño para la Fabricación y Producción Automatizada - Institut de Disseny per a la Fabricació i Producció Automatitzada es_ES
dc.contributor.affiliation Universitat Politècnica de València. Escuela Técnica Superior de Ingeniería del Diseño - Escola Tècnica Superior d'Enginyeria del Disseny es_ES
dc.description.bibliographicCitation Girbés-Juan, V.; Schettino, V.; Gracia Calandin, LI.; Solanes, JE.; Demiris, Y.; Tornero, J. (2022). Combining haptics and inertial motion capture to enhance remote control of a dual-arm robot. Journal on Multimodal User Interfaces. 16(2):219-238. https://doi.org/10.1007/s12193-021-00386-8 es_ES
dc.description.accrualMethod S es_ES
dc.relation.publisherversion https://doi.org/10.1007/s12193-021-00386-8 es_ES
dc.description.upvformatpinicio 219 es_ES
dc.description.upvformatpfin 238 es_ES
dc.type.version info:eu-repo/semantics/publishedVersion es_ES
dc.description.volume 16 es_ES
dc.description.issue 2 es_ES
dc.relation.pasarela S\466231 es_ES
dc.contributor.funder GENERALITAT VALENCIANA es_ES
dc.contributor.funder Agencia Estatal de Investigación es_ES
dc.contributor.funder UNIVERSIDAD POLITECNICA DE VALENCIA es_ES
dc.contributor.funder Royal Academy of Engineering, United Kingdom es_ES
dc.contributor.funder Coordenação de Aperfeiçoamento de Pessoal de Nível Superior, Brazil es_ES
dc.description.references Hägele M, Nilsson K, Pires JN, Bischoff R (2016) Industrial robotics. Springer, Cham, pp 1385–1422. https://doi.org/10.1007/978-3-319-32552-1_54 es_ES
dc.description.references Hokayem PF, Spong MW (2006) Bilateral teleoperation: an historical survey. Automatica 42(12):2035–2057. https://doi.org/10.1016/j.automatica.2006.06.027 es_ES
dc.description.references Jones B, Maiero J, Mogharrab A, Aguliar IA, Adhikari A, Riecke BE, Kruijff E, Neustaedter C, Lindeman RW (2020) Feetback: augmenting robotic telepresence with haptic feedback on the feet. In: Proceedings of the 2020 international conference on multimodal interaction, pp 194–203 es_ES
dc.description.references Merrad W, Héloir A, Kolski C, Krüger A (2021) Rfid-based tangible and touch tabletop for dual reality in crisis management context. J Multimodal User Interfaces. https://doi.org/10.1007/s12193-021-00370-2 es_ES
dc.description.references Schettino V, Demiris Y (2019) Inference of user-intention in remote robot wheelchair assistance using multimodal interfaces. In: 2019 IEEE/RSJ international conference on intelligent robots and systems (IROS). IEEE, pp 4600–4606 es_ES
dc.description.references Casper J, Murphy RR (2003) Human–robot interactions during the robot-assisted urban search and rescue response at the world trade center. IEEE Trans Syst Man Cybern Part B (Cybern) 33(3):367–385. https://doi.org/10.1109/TSMCB.2003.811794 es_ES
dc.description.references Chen JY (2010) UAV-guided navigation for ground robot tele-operation in a military reconnaissance environment. Ergonomics 53(8):940–950. https://doi.org/10.1080/00140139.2010.500404 (PMID: 20658388) es_ES
dc.description.references Aleotti J, Micconi G, Caselli S, Benassi G, Zambelli N, Bettelli M, Calestani D, Zappettini A (2019) Haptic teleoperation of UAV equipped with gamma-ray spectrometer for detection and identification of radioactive materials in industrial plants. In: Tolio T, Copani G, Terkaj W (eds) Factories of the future: the Italian flagship initiative. Springer, Cham, pp 197–214. https://doi.org/10.1007/978-3-319-94358-9_9 es_ES
dc.description.references Santos Carreras L (2012) Increasing haptic fidelity and ergonomics in teleoperated surgery. PhD Thesis, EPFL, Lausanne, pp 1–188. https://doi.org/10.5075/epfl-thesis-5412 es_ES
dc.description.references Hatzfeld C, Neupert C, Matich S, Braun M, Bilz J, Johannink J, Miller J, Pott PP, Schlaak HF, Kupnik M, Werthschützky R, Kirschniak A (2017) A teleoperated platform for transanal single-port surgery: ergonomics and workspace aspects. In: IEEE world haptics conference (WHC), pp 1–6. https://doi.org/10.1109/WHC.2017.7989847 es_ES
dc.description.references Burns JO, Mellinkoff B, Spydell M, Fong T, Kring DA, Pratt WD, Cichan T, Edwards CM (2019) Science on the lunar surface facilitated by low latency telerobotics from a lunar orbital platform-gateway. Acta Astronaut 154:195–203. https://doi.org/10.1016/j.actaastro.2018.04.031 es_ES
dc.description.references Sivčev S, Coleman J, Omerdić E, Dooly G, Toal D (2018) Underwater manipulators: a review. Ocean Eng 163:431–450. https://doi.org/10.1016/j.oceaneng.2018.06.018 es_ES
dc.description.references Abich J, Barber DJ (2017) The impact of human–robot multimodal communication on mental workload, usability preference, and expectations of robot behavior. J Multimodal User Interfaces 11(2):211–225. https://doi.org/10.1007/s12193-016-0237-4 es_ES
dc.description.references Hong A, Lee DG, Bülthoff HH, Son HI (2017) Multimodal feedback for teleoperation of multiple mobile robots in an outdoor environment. J Multimodal User Interfaces 11(1):67–80. https://doi.org/10.1007/s12193-016-0230-y es_ES
dc.description.references Katyal KD, Brown CY, Hechtman SA, Para MP, McGee TG, Wolfe KC, Murphy RJ, Kutzer MDM, Tunstel EW, McLoughlin MP, Johannes MS (2014) Approaches to robotic teleoperation in a disaster scenario: from supervised autonomy to direct control. In: IEEE/RSJ international conference on intelligent robots and systems, pp 1874–1881. https://doi.org/10.1109/IROS.2014.6942809 es_ES
dc.description.references Niemeyer G, Preusche C, Stramigioli S, Lee D (2016) Telerobotics. Springer, Cham, pp 1085–1108. https://doi.org/10.1007/978-3-319-32552-1_43 es_ES
dc.description.references Li J, Li Z, Hauser K (2017) A study of bidirectionally telepresent tele-action during robot-mediated handover. In: Proceedings—IEEE international conference on robotics and automation, pp 2890–2896. https://doi.org/10.1109/ICRA.2017.7989335 es_ES
dc.description.references Peng XB, Kanazawa A, Malik J, Abbeel P, Levine S (2018) SFV: reinforcement learning of physical skills from videos. ACM Trans Graph 37(6):178:1–178:14. https://doi.org/10.1145/3272127.3275014 es_ES
dc.description.references Coleca F, State A, Klement S, Barth E, Martinetz T (2015) Self-organizing maps for hand and full body tracking. Neurocomputing 147:174–184. Special issue: advances in self-organizing maps, selected papers from the Workshop on Self-Organizing Maps 2012 (WSOM 2012). https://doi.org/10.1016/j.neucom.2013.10.041 es_ES
dc.description.references Von Marcard T, Rosenhahn B, Black MJ, Pons-Moll G (2017) Sparse inertial poser: automatic 3D human pose estimation from sparse IMUs. In: Computer graphics forum, vol 36. Wiley, pp 349–360 es_ES
dc.description.references Zhao J (2018) A review of wearable IMU (inertial-measurement-unit)-based pose estimation and drift reduction technologies. J Phys Conf Ser 1087:042003. https://doi.org/10.1088/1742-6596/1087/4/042003 es_ES
dc.description.references Malleson C, Gilbert A, Trumble M, Collomosse J, Hilton A, Volino M (2018) Real-time full-body motion capture from video and IMUs. In: Proceedings—2017 international conference on 3D vision, 3DV 2017 (September), pp 449–457. https://doi.org/10.1109/3DV.2017.00058 es_ES
dc.description.references Du G, Zhang P, Mai J, Li Z (2012) Markerless kinect-based hand tracking for robot teleoperation. Int J Adv Robot Syst 9(2):36. https://doi.org/10.5772/50093 es_ES
dc.description.references Çoban M, Gelen G (2018) Wireless teleoperation of an industrial robot by using myo arm band. In: International conference on artificial intelligence and data processing (IDAP), pp 1–6. https://doi.org/10.1109/IDAP.2018.8620789 es_ES
dc.description.references Lipton JI, Fay AJ, Rus D (2018) Baxter’s homunculus: virtual reality spaces for teleoperation in manufacturing. IEEE Robot Autom Lett 3(1):179–186. https://doi.org/10.1109/LRA.2017.2737046 es_ES
dc.description.references Zhang T, McCarthy Z, Jow O, Lee D, Chen X, Goldberg K, Abbeel P (2018) Deep imitation learning for complex manipulation tasks from virtual reality teleoperation. In: IEEE international conference on robotics and automation (ICRA), pp 5628–5635. https://doi.org/10.1109/ICRA.2018.8461249 es_ES
dc.description.references Hannaford B, Okamura AM (2016) Haptics. Springer, Cham, pp 1063–1084. https://doi.org/10.1007/978-3-319-32552-1_42 es_ES
dc.description.references Rodríguez J-L, Velázquez R (2012) Haptic rendering of virtual shapes with the Novint Falcon. Proc Technol 3:132–138. https://doi.org/10.1016/j.protcy.2012.03.014 es_ES
dc.description.references Teklemariam HG, Das AK (2017) A case study of phantom omni force feedback device for virtual product design. Int J Interact Des Manuf (IJIDeM) 11(4):881–892. https://doi.org/10.1007/s12008-015-0274-3 es_ES
dc.description.references Karbasizadeh N, Zarei M, Aflakian A, Masouleh MT, Kalhor A (2018) Experimental dynamic identification and model feed-forward control of Novint Falcon haptic device. Mechatronics 51:19–30. https://doi.org/10.1016/j.mechatronics.2018.02.013 es_ES
dc.description.references Georgiou T, Demiris Y (2017) Adaptive user modelling in car racing games using behavioural and physiological data. User Model User-Adapted Interact 27(2):267–311. https://doi.org/10.1007/s11257-017-9192-3 es_ES
dc.description.references Son HI (2019) The contribution of force feedback to human performance in the teleoperation of multiple unmanned aerial vehicles. J Multimodal User Interfaces 13(4):335–342. https://doi.org/10.1007/s12193-019-00292-0 es_ES
dc.description.references Ramírez-Fernández C, Morán AL, García-Canseco E (2015) Haptic feedback in motor hand virtual therapy increases precision and generates less mental workload. In: 2015 9th international conference on pervasive computing technologies for healthcare (PervasiveHealth), pp 280–286. https://doi.org/10.4108/icst.pervasivehealth.2015.260242 es_ES
dc.description.references Saito Y, Raksincharoensak P (2019) Effect of risk-predictive haptic guidance in one-pedal driving mode. Cognit Technol Work 21(4):671–684. https://doi.org/10.1007/s10111-019-00558-3 es_ES
dc.description.references Girbés V, Armesto L, Dols J, Tornero J (2016) Haptic feedback to assist bus drivers for pedestrian safety at low speed. IEEE Trans Haptics 9(3):345–357. https://doi.org/10.1109/TOH.2016.2531686 es_ES
dc.description.references Girbés V, Armesto L, Dols J, Tornero J (2017) An active safety system for low-speed bus braking assistance. IEEE Trans Intell Transp Syst 18(2):377–387. https://doi.org/10.1109/TITS.2016.2573921 es_ES
dc.description.references Escobar-Castillejos D, Noguez J, Neri L, Magana A, Benes B (2016) A review of simulators with haptic devices for medical training. J Med Syst 40(4):104. https://doi.org/10.1007/s10916-016-0459-8 es_ES
dc.description.references Coles TR, Meglan D, John NW (2011) The role of haptics in medical training simulators: a survey of the state of the art. IEEE Trans Haptics 4(1):51–66. https://doi.org/10.1109/TOH.2010.19 es_ES
dc.description.references Okamura AM, Verner LN, Reiley CE, Mahvash M (2010) Haptics for robot-assisted minimally invasive surgery. In: Kaneko M, Nakamura Y (eds) Robotics research. Springer tracts in advanced robotics, vol 66. Springer, Berlin, pp 361–372. https://doi.org/10.1007/978-3-642-14743-2_30 es_ES
dc.description.references Ehrampoosh S, Dave M, Kia MA, Rablau C, Zadeh MH (2013) Providing haptic feedback in robot-assisted minimally invasive surgery: a direct optical force-sensing solution for haptic rendering of deformable bodies. Comput Aided Surg 18(5–6):129–141. https://doi.org/10.3109/10929088.2013.839744 es_ES
dc.description.references Ju Z, Yang C, Li Z, Cheng L, Ma H (2014) Teleoperation of humanoid Baxter robot using haptic feedback. In: 2014 international conference on multisensor fusion and information integration for intelligent systems (MFI). IEEE, pp 1–6. https://doi.org/10.1109/MFI.2014.6997721 es_ES
dc.description.references Clark JP, Lentini G, Barontini F, Catalano MG, Bianchi M, O’Malley MK (2019) On the role of wearable haptics for force feedback in teleimpedance control for dual-arm robotic teleoperation. In: International conference on robotics and automation (ICRA), pp 5187–5193. https://doi.org/10.1109/ICRA.2019.8793652 es_ES
dc.description.references Gracia L, Solanes JE, Muñoz-Benavent P, Miro JV, Perez-Vidal C, Tornero J (2018) Adaptive sliding mode control for robotic surface treatment using force feedback. Mechatronics 52:102–118. https://doi.org/10.1016/j.mechatronics.2018.04.008 es_ES
dc.description.references Zhu D, Xu X, Yang Z, Zhuang K, Yan S, Ding H (2018) Analysis and assessment of robotic belt grinding mechanisms by force modeling and force control experiments. Tribol Int 120:93–98. https://doi.org/10.1016/j.triboint.2017.12.043 es_ES
dc.description.references Smith C, Karayiannidis Y, Nalpantidis L, Gratal X, Qi P, Dimarogonas DV, Kragic D (2012) Dual arm manipulation—a survey. Robot Auton Syst 60(10):1340–1353. https://doi.org/10.1016/j.robot.2012.07.005 es_ES
dc.description.references Girbés-Juan V, Schettino V, Demiris Y, Tornero J (2021) Haptic and visual feedback assistance for dual-arm robot teleoperation in surface conditioning tasks. IEEE Trans Haptics 14(1):44–56. https://doi.org/10.1109/TOH.2020.3004388 es_ES
dc.description.references Tunstel EW Jr, Wolfe KC, Kutzer MD, Johannes MS, Brown CY, Katyal KD, Para MP, Zeher MJ (2013) Recent enhancements to mobile bimanual robotic teleoperation with insight toward improving operator control. Johns Hopkins APL Tech Digest 32(3):584 es_ES
dc.description.references García A, Solanes JE, Gracia L, Muñoz-Benavent P, Girbés-Juan V, Tornero J (2021) Bimanual robot control for surface treatment tasks. Int J Syst Sci. https://doi.org/10.1080/00207721.2021.1938279 es_ES
dc.description.references Jasim IF, Plapper PW, Voos H (2014) Position identification in force-guided robotic peg-in-hole assembly tasks. Proc CIRP 23(C):217–222. https://doi.org/10.1016/j.procir.2014.10.077 es_ES
dc.description.references Song HC, Kim YL, Song JB (2016) Guidance algorithm for complex-shape peg-in-hole strategy based on geometrical information and force control. Adv Robot 30(8):552–563. https://doi.org/10.1080/01691864.2015.1130172 es_ES
dc.description.references Kramberger A, Gams A, Nemec B, Chrysostomou D, Madsen O, Ude A (2017) Generalization of orientation trajectories and force-torque profiles for robotic assembly. Robot Auton Syst 98:333–346. https://doi.org/10.1016/j.robot.2017.09.019 es_ES
dc.description.references Pliego-Jiménez J, Arteaga-Pérez MA (2015) Adaptive position/force control for robot manipulators in contact with a rigid surface with unknown parameters. In: European control conference (ECC), pp 3603–3608. https://doi.org/10.1109/ECC.2015.7331090 es_ES
dc.description.references Gierlak P, Szuster M (2017) Adaptive position/force control for robot manipulator in contact with a flexible environment. Robot Auton Syst 95:80–101. https://doi.org/10.1016/j.robot.2017.05.015 es_ES
dc.description.references Solanes JE, Gracia L, Muñoz-Benavent P, Miro JV, Girbés V, Tornero J (2018) Human–robot cooperation for robust surface treatment using non-conventional sliding mode control. ISA Trans 80:528–541. https://doi.org/10.1016/j.isatra.2018.05.013 es_ES
dc.description.references Ravandi AK, Khanmirza E, Daneshjou K (2018) Hybrid force/position control of robotic arms manipulating in uncertain environments based on adaptive fuzzy sliding mode control. Appl Soft Comput 70:864–874. https://doi.org/10.1016/j.asoc.2018.05.048 es_ES
dc.description.references Solanes JE, Gracia L, Muñoz-Benavent P, Esparza A, Miro JV, Tornero J (2018) Adaptive robust control and admittance control for contact-driven robotic surface conditioning. Robot Comput Integr Manuf 54:115–132. https://doi.org/10.1016/j.rcim.2018.05.003 es_ES
dc.description.references Perez-Vidal C, Gracia L, Sanchez-Caballero S, Solanes JE, Saccon A, Tornero J (2019) Design of a polishing tool for collaborative robotics using minimum viable product approach. Int J Comput Integr Manuf 32(9):848–857. https://doi.org/10.1080/0951192X.2019.1637026 es_ES
dc.description.references Chen F, Zhao H, Li D, Chen L, Tan C, Ding H (2019) Contact force control and vibration suppression in robotic polishing with a smart end effector. Robot Comput Integr Manuf 57:391–403. https://doi.org/10.1016/j.rcim.2018.12.019 es_ES
dc.description.references Mohammad AEK, Hong J, Wang D, Guan Y (2019) Synergistic integrated design of an electrochemical mechanical polishing end-effector for robotic polishing applications. Robot Comput Integr Manuf 55:65–75. https://doi.org/10.1016/j.rcim.2018.07.005 es_ES
dc.description.references Waldron KJ, Schmiedeler J (2016) Kinematics. Springer, Cham, pp 11–36. https://doi.org/10.1007/978-3-319-32552-1_2 es_ES
dc.description.references Featherstone R, Orin DE (2016) Dynamics. Springer, Cham, pp 37–66. https://doi.org/10.1007/978-3-319-32552-1_3 es_ES
dc.description.references Wen K, Necsulescu D, Sasiadek J (2008) Haptic force control based on impedance/admittance control aided by visual feedback. Multimed Tools Appl 37(1):39–52. https://doi.org/10.1007/s11042-007-0172-1 es_ES
dc.description.references Tzafestas C, Velanas S, Fakiridis G (2008) Adaptive impedance control in haptic teleoperation to improve transparency under time-delay. In: IEEE international conference on robotics and automation, pp 212–219. https://doi.org/10.1109/ROBOT.2008.4543211 es_ES
dc.description.references Chiaverini S, Oriolo G, Maciejewski AA (2016) Redundant robots. Springer, Cham, pp 221–242. https://doi.org/10.1007/978-3-319-32552-1_10 es_ES
dc.description.references Ogata K (1987) Discrete-time control systems. McGraw-Hill, New York es_ES
dc.description.references García A, Girbés-Juan V, Solanes JE, Gracia L, Perez-Vidal C, Tornero J (2020) Human–robot cooperation for surface repair combining automatic and manual modes. IEEE Access 8:154024–154035. https://doi.org/10.1109/ACCESS.2020.3014501 es_ES

