
Improving Robot Perception Skills Using a Fast Image-Labelling Method with Minimal Human Intervention

RiuNet: Institutional Repository of the Universitat Politècnica de València


Simple item record


dc.contributor.author Ricolfe Viala, Carlos es_ES
dc.contributor.author Blanes Campos, Carlos es_ES
dc.date.accessioned 2023-05-29T18:02:58Z
dc.date.available 2023-05-29T18:02:58Z
dc.date.issued 2022-02 es_ES
dc.identifier.uri http://hdl.handle.net/10251/193721
dc.description.abstract [EN] Featured Application: Natural interface to enhance human-robot interactions. The aim is to improve robot perception skills. Robot perception skills contribute to natural interfaces that enhance human-robot interactions. This can be notably improved by using convolutional neural networks. To train a convolutional neural network, the labelling process is the crucial first stage, in which image objects are marked with rectangles or masks. There are many image-labelling tools, but all require human interaction to achieve good results. Manual image labelling with rectangles or masks is labor-intensive and unappealing work, which can take months to complete, making the labelling task tedious and lengthy. This paper proposes a fast method to create labelled images with minimal human intervention, which is tested with a robot perception task. Images of objects taken with specific backgrounds are quickly and accurately labelled with rectangles or masks. In a second step, detected objects can be synthesized with different backgrounds to improve the training capabilities of the image set. Experimental results show the effectiveness of this method with an example of human-robot interaction using hand fingers. This labelling method generates a database to train convolutional networks to detect hand fingers easily with minimal labelling work. This labelling method can be applied to new image sets or used to add new samples to existing labelled image sets of any application. This proposed method improves the labelling process noticeably and reduces the time required to start the training process of a convolutional neural network model. es_ES
dc.description.sponsorship The Universitat Politecnica de Valencia has financed the open access fees of this paper with the project number 20200676 (Microinspeccion de superficies). es_ES
dc.language English es_ES
dc.publisher MDPI AG es_ES
dc.relation.ispartof Applied Sciences es_ES
dc.rights Attribution (by) es_ES
dc.subject Human-robot interactions es_ES
dc.subject Image labelling es_ES
dc.subject Deep learning es_ES
dc.subject Image classification es_ES
dc.subject.classification SYSTEMS ENGINEERING AND AUTOMATION es_ES
dc.title Improving Robot Perception Skills Using a Fast Image-Labelling Method with Minimal Human Intervention es_ES
dc.type Article es_ES
dc.identifier.doi 10.3390/app12031557 es_ES
dc.relation.projectID info:eu-repo/grantAgreement/UPV//20200676//Microinspección de superficies/ es_ES
dc.rights.accessRights Open access es_ES
dc.contributor.affiliation Universitat Politècnica de València. Escuela Técnica Superior de Ingeniería del Diseño - Escola Tècnica Superior d'Enginyeria del Disseny es_ES
dc.description.bibliographicCitation Ricolfe Viala, C.; Blanes Campos, C. (2022). Improving Robot Perception Skills Using a Fast Image-Labelling Method with Minimal Human Intervention. Applied Sciences. 12(3):1-14. https://doi.org/10.3390/app12031557 es_ES
dc.description.accrualMethod S es_ES
dc.relation.publisherversion https://doi.org/10.3390/app12031557 es_ES
dc.description.upvformatpinicio 1 es_ES
dc.description.upvformatpfin 14 es_ES
dc.type.version info:eu-repo/semantics/publishedVersion es_ES
dc.description.volume 12 es_ES
dc.description.issue 3 es_ES
dc.identifier.eissn 2076-3417 es_ES
dc.relation.pasarela S\476275 es_ES
dc.contributor.funder Universitat Politècnica de València es_ES
upv.costeAPC 1623,08 es_ES
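
The abstract above outlines a two-step procedure: objects are photographed against a specific background so that masks and bounding-box labels can be extracted automatically, and the segmented objects are then composited onto different backgrounds to enlarge the training set. The following minimal sketch illustrates that idea under assumed conditions (a green backdrop, hypothetical HSV thresholds and file names); it is not the authors' released implementation.

# Minimal sketch of the two-step idea from the abstract:
# (1) segment an object shot against a uniform backdrop to obtain a mask and
#     a bounding-box label, (2) composite the segmented object onto a
#     different background to enlarge the training set.
# The green-backdrop HSV range and the file names are illustrative assumptions.
import cv2
import numpy as np

def label_object(image_bgr, lower_hsv=(35, 60, 60), upper_hsv=(85, 255, 255)):
    """Return (mask, bounding box) for an object shot against a green backdrop."""
    hsv = cv2.cvtColor(image_bgr, cv2.COLOR_BGR2HSV)
    backdrop = cv2.inRange(hsv, np.array(lower_hsv), np.array(upper_hsv))
    mask = cv2.bitwise_not(backdrop)                      # object = everything that is not backdrop
    mask = cv2.morphologyEx(mask, cv2.MORPH_OPEN, np.ones((5, 5), np.uint8))
    ys, xs = np.where(mask > 0)
    if xs.size == 0:
        return mask, None
    return mask, (int(xs.min()), int(ys.min()), int(xs.max()), int(ys.max()))

def composite(object_bgr, mask, background_bgr, top_left=(0, 0)):
    """Paste the masked object onto a new background (background must be large enough)."""
    out = background_bgr.copy()
    h, w = mask.shape
    x, y = top_left
    roi = out[y:y + h, x:x + w]                           # view into `out`
    roi[mask > 0] = object_bgr[mask > 0]                  # copy only the object pixels
    return out

if __name__ == "__main__":
    img = cv2.imread("hand_on_green_backdrop.png")        # hypothetical source image
    bg = cv2.imread("new_background.png")                 # hypothetical new background
    mask, box = label_object(img)
    if box is not None:
        sample = composite(img, mask, bg, top_left=(50, 50))
        cv2.imwrite("augmented_sample.png", sample)
        print("bounding box (x_min, y_min, x_max, y_max):", box)

In practice the extracted mask, bounding box, and composited images would be written out in whatever annotation format the chosen CNN training pipeline expects; the sketch only shows the segmentation and compositing steps described in the abstract.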

