
Clasificación de objetos usando percepción bimodal de palpación única en acciones de agarre robótico

RiuNet: Repositorio Institucional de la Universidad Politécnica de Valencia


dc.contributor.author Velasco, E. es_ES
dc.contributor.author Zapata-Impata, B.S. es_ES
dc.contributor.author Gil, P. es_ES
dc.contributor.author Torres, F. es_ES
dc.date.accessioned 2020-03-04T08:55:25Z
dc.date.available 2020-03-04T08:55:25Z
dc.date.issued 2020-01-01
dc.identifier.issn 1697-7912
dc.identifier.uri http://hdl.handle.net/10251/138322
dc.description.abstract [ES] Este trabajo presenta un método para clasificar objetos agarrados con una mano robótica multidedo combinando en un descriptor híbrido datos propioceptivos y táctiles. Los datos propioceptivos se obtienen a partir de las posiciones articulares de la mano y los táctiles se extraen del contacto registrado por células de presión instaladas en las falanges. La aproximación propuesta permite identificar el objeto aprendiendo de forma implícita su geometría y rigidez usando los datos que facilitan los sensores. En este trabajo demostramos que el uso de datos bimodales con técnicas de aprendizaje supervisado mejora la tasa de reconocimiento. En la experimentación, se han llevado a cabo más de 3000 agarres de hasta 7 objetos domésticos distintos, obteniendo clasificaciones correctas del 95% con métrica F1, realizando una única palpación del objeto. Además, la generalización del método se ha verificado entrenando nuestro sistema con unos objetos y, posteriormente, clasificando otros nuevos similares sin conocimiento previo. es_ES
dc.description.abstract [EN] This work presents a method to classify grasped objects with a multi-fingered robotic hand by combining proprioceptive and tactile data in a hybrid descriptor. The proprioceptive data are obtained from the joint positions of the hand and the tactile data are obtained from the contact registered by pressure cells installed on the phalanges. The proposed approach allows us to identify the grasped object by implicitly learning its geometry and stiffness from the sensor readings. In this work, we show that using bimodal data of different nature along with supervised learning techniques improves the recognition rate. In experimentation, more than 3000 grasps of up to 7 different domestic objects have been carried out, obtaining an average F1 score of around 95% while performing just a single grasp. In addition, the generalization of the method has been verified by training our system with certain objects and classifying new, similar ones without any prior knowledge. es_ES
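
The abstracts above describe the full pipeline: concatenate proprioceptive readings (hand joint positions) and tactile readings (pressure cells on the phalanges) into a hybrid descriptor for each grasp, train a supervised classifier on those descriptors, and report the F1 score obtained from a single palpation. The snippet below is a minimal sketch of that idea only, not the authors' implementation: the feature dimensions, the synthetic placeholder data and the choice of scikit-learn's MLPClassifier are illustrative assumptions made so the example runs end to end.

# Hypothetical sketch of a hybrid proprioceptive-tactile descriptor and a
# supervised classifier evaluated with the F1 score (not the authors' code).
import numpy as np
from sklearn.model_selection import train_test_split
from sklearn.preprocessing import StandardScaler
from sklearn.neural_network import MLPClassifier
from sklearn.metrics import f1_score

N_JOINTS = 16     # assumed number of hand joint positions (proprioception)
N_TACTILE = 102   # assumed number of pressure cells on the phalanges (touch)
N_CLASSES = 7     # up to 7 domestic objects, as reported in the abstract

def hybrid_descriptor(joint_positions, tactile_pressures):
    """Concatenate both modalities of one grasp into a single feature vector."""
    return np.concatenate([joint_positions, tactile_pressures])

# Random placeholder grasps used only so the sketch is runnable; the real
# recordings would come from the hand joints and the pressure cells.
rng = np.random.default_rng(0)
n_grasps = 3000
X = np.stack([hybrid_descriptor(rng.uniform(-1.5, 1.5, N_JOINTS),
                                rng.uniform(0.0, 1.0, N_TACTILE))
              for _ in range(n_grasps)])
y = rng.integers(0, N_CLASSES, n_grasps)  # object label of each grasp

X_train, X_test, y_train, y_test = train_test_split(
    X, y, test_size=0.2, stratify=y, random_state=0)

# Normalize features and fit a small multilayer perceptron; any supervised
# learner could be plugged in at this step.
scaler = StandardScaler().fit(X_train)
clf = MLPClassifier(hidden_layer_sizes=(128, 64), activation="relu",
                    max_iter=500, random_state=0)
clf.fit(scaler.transform(X_train), y_train)

# Each test descriptor corresponds to a single palpation of one object.
y_pred = clf.predict(scaler.transform(X_test))
print("Macro F1 score:", f1_score(y_test, y_pred, average="macro"))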
dc.description.sponsorship Este trabajo ha sido financiado con Fondos Europeos de Desarrollo Regional (FEDER), Ministerio de Economía, Industria y Competitividad a través del proyecto DPI2015-68087-R y la ayuda pre-doctoral BES-2016-078290, y también gracias al apoyo de la Comisión Europea y del programa Interreg V. Sudoe a través del proyecto SOE2/P1/F0638. es_ES
dc.language Español es_ES
dc.publisher Universitat Politècnica de València es_ES
dc.relation.ispartof Revista Iberoamericana de Automática e Informática industrial es_ES
dc.rights Reconocimiento - No comercial - Sin obra derivada (by-nc-nd) es_ES
dc.subject Robotic manipulators es_ES
dc.subject Proprioceptive-tactile perception es_ES
dc.subject Proprioceptive-tactile learning es_ES
dc.subject Object classification es_ES
dc.subject Object recognition es_ES
dc.subject Manipuladores robóticos es_ES
dc.subject Percepción propioceptiva-táctil es_ES
dc.subject Aprendizaje propioceptivo-táctil es_ES
dc.subject Clasificación de objetos es_ES
dc.subject Reconocimiento de objetos es_ES
dc.title Clasificación de objetos usando percepción bimodal de palpación única en acciones de agarre robótico es_ES
dc.title.alternative Object classification using bimodal perception data extracted from single-touch robotic grasps es_ES
dc.type Artículo es_ES
dc.identifier.doi 10.4995/riai.2019.10923
dc.relation.projectID info:eu-repo/grantAgreement/MINECO//DPI2015-68087-R/ES/SISTEMA ROBOTICO MULTISENSORIAL CON MANIPULACION DUAL PARA TAREAS ASISTENCIALES HUMANO-ROBOT/ es_ES
dc.relation.projectID info:eu-repo/grantAgreement/EC/Interreg Sudoe/SOE2%2FP1%2FF0638/EU/Robotic treatment of deformable objects for industrial application/CoMManDIA/ es_ES
dc.relation.projectID info:eu-repo/grantAgreement/AEI//BES-2016-078290/ es_ES
dc.rights.accessRights Abierto es_ES
dc.description.bibliographicCitation Velasco, E.; Zapata-Impata, B.; Gil, P.; Torres, F. (2020). Clasificación de objetos usando percepción bimodal de palpación única en acciones de agarre robótico. Revista Iberoamericana de Automática e Informática industrial. 17(1):44-55. https://doi.org/10.4995/riai.2019.10923 es_ES
dc.description.accrualMethod OJS es_ES
dc.relation.publisherversion https://doi.org/10.4995/riai.2019.10923 es_ES
dc.description.upvformatpinicio 44 es_ES
dc.description.upvformatpfin 55 es_ES
dc.type.version info:eu-repo/semantics/publishedVersion es_ES
dc.description.volume 17 es_ES
dc.description.issue 1 es_ES
dc.identifier.eissn 1697-7920
dc.relation.pasarela OJS\10923 es_ES
dc.contributor.funder Ministerio de Economía y Competitividad es_ES
dc.contributor.funder European Commission
dc.contributor.funder Agencia Estatal de Investigación es_ES
dc.description.references Bae, J., Park, S., Park, J., Baeg, M., Kim, D., Oh, S., Oct 2012. Development of a low cost anthropomorphic robot hand with high capability. In: 2012 IEEE/RSJ International Conference on Intelligent Robots and Systems (IROS). pp. 4776-4782. https://doi.org/10.1109/IROS.2012.6386063 es_ES
dc.description.references Baishya, S. S., Bäuml, B., Oct 2016. Robust material classification with a tactile skin using deep learning. In: 2016 IEEE/RSJ International Conference on Intelligent Robots and Systems (IROS). pp. 8-15. https://doi.org/10.1109/IROS.2016.7758088 es_ES
dc.description.references Bergquist, T., Schenck, C., Ohiri, U., Sinapov, J., Griffith, S., Stoytchev, A., 2009. Interactive object recognition using proprioceptive feedback. In: IEEE/RSJ International Conference on Intelligent Robots and Systems (IROS)-Workshop: Semantic Perception for Robot Manipulation. URL: http://www.willowgarage.com/iros09spmm es_ES
dc.description.references Bishop, C., 2006. Pattern Recognition and Machine Learning. Springer-Verlag New York. es_ES
dc.description.references Cervantes, J., Taltempa, J., Lamont, F. G., Castilla, J. S. R., Rendon, A. Y., Jalili, L. D., 2017. Análisis comparativo de las técnicas utilizadas en un sistema de reconocimiento de hojas de planta. Revista Iberoamericana de Automática e Informática Industrial 14 (1), 104-114. https://doi.org/10.1016/j.riai.2016.09.005 es_ES
dc.description.references Delgado, A., Corrales, J., Mezouar, Y., Lequievre, L., Jara, C., Torres, F., 2017. Tactile control based on gaussian images and its application in bi-manual manipulation of deformable objects. Robotics and Autonomous Systems 94, 148 - 161. https://doi.org/10.1016/j.robot.2017.04.017 es_ES
dc.description.references Glorot, X., Bordes, A., Bengio, Y., 11-13 Apr 2011. Deep sparse rectifier neural networks. In: Gordon, G., Dunson, D., Dudík, M. (Eds.), Proceedings of the Fourteenth International Conference on Artificial Intelligence and Statistics. Vol. 15 of Proceedings of Machine Learning Research. PMLR, Fort Lauderdale, FL, USA, pp. 315-323. URL: http://proceedings.mlr.press/v15/glorot11a.html es_ES
dc.description.references Guo, D., Kong, T., Sun, F., Liu, H., May 2016. Object discovery and grasp detection with a shared convolutional neural network. In: 2016 IEEE International Conference on Robotics and Automation (ICRA). pp. 2038-2043. https://doi.org/10.1109/ICRA.2016.7487351 es_ES
dc.description.references Hastie, T., Tibshirani, R., Friedman, J., 2009. The elements of statistical learning: data mining, inference and prediction. Springer-Verlag New York. https://doi.org/10.1007/978-0-387-84858-7 es_ES
dc.description.references Homberg, B. S., Katzschmann, R. K., Dogar, M. R., Rus, D., Sep. 2015. Haptic identification of objects using a modular soft robotic gripper. In: 2015 IEEE/RSJ International Conference on Intelligent Robots and Systems (IROS). pp. 1698-1705. https://doi.org/10.1109/IROS.2015.7353596 es_ES
dc.description.references Homberg, B. S., Katzschmann, R. K., Dogar, M. R., Rus, D., Mar 2019. Robust proprioceptive grasping with a soft robot hand. Autonomous Robots 43 (3), 681-696. https://doi.org/10.1007/s10514-018-9754-1 es_ES
dc.description.references Ioffe, S., Szegedy, C., 2015. Batch normalization: Accelerating deep network training by reducing internal covariate shift. In: 32nd International Conference on International Conference on Machine Learning. Vol. 15. JMLR, pp. 448-456. es_ES
dc.description.references Kang, L., Ye, P., Li, Y., Doermann, D., June 2014. Convolutional neural networks for no-reference image quality assessment. In: 2014 IEEE Conference on Computer Vision and Pattern Recognition. pp. 1733-1740. https://doi.org/10.1109/CVPR.2014.224 es_ES
dc.description.references Kohavi, R., 1995. A study of cross-validation and bootstrap for accuracy estimation and model selection. In: Proceedings of the 14th International Joint Conference on Artificial Intelligence - Volume 2. IJCAI'95. Morgan Kaufmann Publishers Inc., San Francisco, CA, USA, pp. 1137-1143. URL: http://dl.acm.org/citation.cfm?id=1643031.1643047 es_ES
dc.description.references Krizhevsky, A., Sutskever, I., Hinton, G. E., 2012. Imagenet classification with deep convolutional neural networks. In: Proceedings of the 25th International Conference on Neural Information Processing Systems - Volume 1. NIPS'12. Curran Associates Inc., USA, pp. 1097-1105. URL: http://dl.acm.org/citation.cfm?id=2999134.2999257 es_ES
dc.description.references Liu, H., Wu, Y., Sun, F., Guo, D., 2017a. Recent progress on tactile object recognition. International Journal of Advanced Robotic Systems 14 (4), 1729881417717056. https://doi.org/10.1177/1729881417717056 es_ES
dc.description.references Liu, H., Yu, Y., Sun, F., Gu, J., 2017b. Visual-tactile fusion for object recognition. IEEE Transactions on Automation Science and Engineering 14 (2), 996-1008. https://doi.org/10.1109/TASE.2016.2549552 es_ES
dc.description.references Montaño, A., Suárez, R., 2013. Object shape reconstruction based on the object manipulation. 2013 16th International Conference on Advanced Robotics, ICAR 2013, 1-6. https://doi.org/10.1109/ICAR.2013.6766571 es_ES
dc.description.references Nasrabadi, N. M., 2007. Pattern recognition and machine learning. Journal of Electronic Imaging 16 (4). https://doi.org/10.1117/1.2819119 es_ES
dc.description.references National Instruments, 2018. The LabView website. http://www.ni.com/en-us/shop/labview.html, online; accedido 05 Noviembre 2018. es_ES
dc.description.references Navarro, S. E., Gorges, N., Wörn, H., Schill, J., Asfour, T., Dillmann, R., March 2012. Haptic object recognition for multi-fingered robot hands. In: 2012 IEEE Haptics Symposium (HAPTICS). pp. 497-502. https://doi.org/10.1109/HAPTIC.2012.6183837 es_ES
dc.description.references Pascanu, R., Montufar, G., Bengio, Y., April 2014. On the number of inference regions of deep feed forward networks with piece-wise linear activations. In: International Conference on Learning Representations (ICLR). URL: https://arxiv.org/abs/1312.6098 es_ES
dc.description.references Pezzementi, Z., Plaku, E., Reyda, C., Hager, G. D., June 2011. Tactile-object recognition from appearance information. IEEE Transactions on Robotics 27 (3), 473-487. https://doi.org/10.1109/TRO.2011.2125350 es_ES
dc.description.references Powers, D. M. W., 2011. Evaluation: From precision, recall and f-measure to ROC, informedness, markedness & correlation. Journal of Machine Learning Technologies 2 (1), 37-63. es_ES
dc.description.references Quigley, M., Conley, K., Gerkey, B., Faust, J., Foote, T., Leibs, J., Wheeler, R., Ng, A., May 2009. ROS: an open-source Robot Operating System. In: IEEE International Conference on Robotics and Automation (ICRA): Workshop on Open Source Software. URL: http://www.willowgarage.com/papers/ros-open-source-robot-operating-system es_ES
dc.description.references Reinecke, J., Dietrich, A., Schmidt, F., Chalon, M., May 2014. Experimental comparison of slip detection strategies by tactile sensing with the BioTac on the DLR Hand Arm System. In: IEEE International Conference on Robotics and Automation (ICRA). pp. 2742-2748. https://doi.org/10.1109/ICRA.2014.6907252 es_ES
dc.description.references Rispal, S., Rana, A. K., Duchaine, V., 2017. Texture roughness estimation using dynamic tactile sensing. 2017 3rd International Conference on Control, Automation and Robotics, ICCAR 2017, 555-562. https://doi.org/10.1109/ICCAR.2017.7942759 es_ES
dc.description.references Sanchez, J., Corrales, J.-A., Bouzgarrou, B.-C., Mezouar, Y., 2018. Robotic manipulation and sensing of deformable objects in domestic and industrial applications: a survey. The International Journal of Robotics Research 37 (7), 688-716. https://doi.org/10.1177/0278364918779698 es_ES
dc.description.references Schmitz, A., Bansho, Y., Noda, K., Iwata, H., Ogata, T., Sugano, S., Nov 2014. Tactile object recognition using deep learning and dropout. In: 2014 IEEE-RAS International Conference on Humanoid Robots. pp. 1044-1050. https://doi.org/10.1109/HUMANOIDS.2014.7041493 es_ES
dc.description.references Schneider, A., Sturm, J., Stachniss, C., Reisert, M., Burkhardt, H., Burgard,W., Oct 2009. Object identification with tactile sensors using bag-of-features. In: IEEE/RSJ International Conference on Intelligent Robots and Systems (IROS). pp. 243-248. https://doi.org/10.1109/IROS.2009.5354648 es_ES
dc.description.references Shalabi, L., Shaaban, Z., Kasasbeh, B., David, M., 2006. Data mining: A preprocessing engine. Journal of Computer Science 2 (9), 735-739. https://doi.org/10.3844/jcssp.2006.735.739 es_ES
dc.description.references Sinapov, J., Bergquist, T., Schenck, C., Ohiri, U., Griffith, S., Stoytchev, A., 2011. Interactive object recognition using proprioceptive and auditory feedback. The International Journal of Robotics Research 30 (10), 1250-1262. https://doi.org/10.1177/0278364911408368 es_ES
dc.description.references Spiers, A. J., Liarokapis, M. V., Calli, B., Dollar, A. M., apr 2016. Single-Grasp Object Classification and Feature Extraction with Simple Robot Hands and Tactile Sensors. IEEE Transactions on Haptics 9 (2), 207-220. URL: http://ieeexplore.ieee.org/document/7390277/ https://doi.org/10.1109/TOH.2016.2521378 es_ES
dc.description.references Srivastava, N., Hinton, G., Krizhevsky, A., Sutskever, I., Salakhutdinov, R., 2014. Dropout: A simple way to prevent neural networks from overfitting. Journal of Machine Learning Research 15, 1929-1958. URL: http://jmlr.org/papers/v15/srivastava14a.html es_ES
dc.description.references Tekscan, 2018. The Tekscan website. https://www.tekscan.com, online; accedido 05 Noviembre 2018. es_ES
dc.description.references Velasco-Sanchez, 2018. Base de datos de agarres con Allegro y Tekscan. https://github.com/EPVelasco/Descriptores de agares, online; accedido 05 Noviembre 2018. es_ES
dc.description.references Velasco-Sanchez, E., Zapata-Impata, B. S., Gil, P., Torres, F., 2018. Reconocimiento de objetos agarrados con sensorizado híbrido propioceptivo-táctil. In: XXXIX Jornadas de Automática. CEA-IFAC, pp. 224-232. URL: https://www.eweb.unex.es/eweb/ja2018/actas.html es_ES
dc.description.references Vásquez, A., Perdereau, V., 2017. Proprioceptive shape signatures for object manipulation and recognition purposes in a robotic hand. Robotics and Autonomous Systems 98, 135 - 146. URL: http://www.sciencedirect.com/science/article/pii/S092188901630700X https://doi.org/10.1016/j.robot.2017.06.001 es_ES
dc.description.references Zapata-Impata, B. S., Gil, P., Torres, F., 2018. Non-matrix tactile sensors: How can be exploited their local connectivity for predicting grasp stability? In: IEEE/RSJ International Conference on Intelligent Robots And Systems (IROS). Workshop on Robotac: New Progress in Tactile Perception And Learning in Robotics. IEEE. URL: https://arxiv.org/abs/1809.05551 es_ES
dc.description.references Zapata-Impata, B. S., Gil, P., Torres, F., 2019. Learning Spatio Temporal Tactile Features with a ConvLSTM for the Direction of Slip Detection. Sensors 19 (3), 1-16. URL: https://www.mdpi.com/1424-8220/19/3/523 https://doi.org/10.3390/s19030523 es_ES

