
PHAROS 2.0-A PHysical Assistant RObot System Improved

RiuNet: Repositorio Institucional de la Universidad Politécnica de Valencia


Show simple item record


dc.contributor.author Martinez-Martin, Ester es_ES
dc.contributor.author Araujo, Angelo es_ES
dc.contributor.author Cazorla, Miguel es_ES
dc.date.accessioned 2020-04-06T08:56:57Z
dc.date.available 2020-04-06T08:56:57Z
dc.date.issued 2019-10-02 es_ES
dc.identifier.uri http://hdl.handle.net/10251/140228
dc.description.abstract [EN] There are great physical and cognitive benefits for older adults who are engaged in active aging, a process that should involve daily exercise. In our previous work on the PHysical Assistant RObot System (PHAROS), we developed a system that proposed and monitored physical activities. The system used a social robot to analyse, by means of computer vision, the exercise a person was doing. Then, a recommender system analysed the exercise performed and indicated what exercise to perform next. However, the system needed certain improvements. On the one hand, the vision system captured the movement of the person and indicated only whether the exercise had been done correctly or not. On the other hand, the recommender system was based purely on a ranking system that did not take into account temporal evolution and preferences. In this work, we propose an evolution of PHAROS, PHAROS 2.0, incorporating improvements in both of the previously mentioned aspects. In the motion capture aspect, we are now able to indicate the degree of completeness of each exercise, to identify the part that has not been done correctly, and to provide real-time performance correction. In this way, the recommender system receives a greater amount of information and so can more accurately indicate the exercise to be performed. In terms of the recommender system, an algorithm was developed to weigh the performance, temporal evolution and preferences, providing a more accurate recommendation, as well as expanding the recommendation to a batch of exercises, instead of just one. es_ES
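The abstract describes a recommender that weighs performance, temporal evolution and preferences, and returns a batch of exercises rather than a single one. A minimal sketch of that idea is given below; the paper's actual algorithm is not reproduced here, and all names, weights and value ranges are assumptions for illustration only.

```python
# Illustrative sketch only: weighted combination of performance, temporal
# evolution (trend) and preference, returning a top-k batch of exercises.
# All field names, weights and ranges are hypothetical, not from the paper.
from dataclasses import dataclass

@dataclass
class ExerciseStats:
    performance: float  # degree of completeness of the exercise, 0..1
    trend: float        # temporal evolution, -1..1 (improving if > 0)
    preference: float   # user preference for this exercise, 0..1

def score(stats: ExerciseStats,
          w_perf: float = 0.5, w_trend: float = 0.3, w_pref: float = 0.2) -> float:
    """Combine the three signals into one score (weights are assumed)."""
    trend01 = (stats.trend + 1) / 2  # map trend from -1..1 into 0..1
    return w_perf * stats.performance + w_trend * trend01 + w_pref * stats.preference

def recommend_batch(candidates: dict, k: int = 3) -> list:
    """Return the k highest-scoring exercises: a batch, not just one."""
    ranked = sorted(candidates.items(), key=lambda kv: score(kv[1]), reverse=True)
    return [name for name, _ in ranked[:k]]
```

For example, given three candidate exercises, `recommend_batch(candidates, k=2)` returns the two with the highest combined score, which the robot could then propose as the next session.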
dc.description.sponsorship This work was partly supported by the FCT - Fundação para a Ciência e a Tecnologia through the Post-Doc scholarship SFRH/BPD/102696/2014 and by the Spanish Government TIN2016-76515-R Grant supported with FEDER funds. es_ES
dc.language English es_ES
dc.publisher MDPI AG es_ES
dc.relation.ispartof Sensors es_ES
dc.rights Attribution (by) es_ES
dc.subject Assistive robotics es_ES
dc.subject Active ageing es_ES
dc.subject Decision support system es_ES
dc.subject Cognitive assistant es_ES
dc.subject Deep learning es_ES
dc.title PHAROS 2.0-A PHysical Assistant RObot System Improved es_ES
dc.type Article es_ES
dc.identifier.doi 10.3390/s19204531 es_ES
dc.relation.projectID info:eu-repo/grantAgreement/FCT/SFRH/SFRH%2FBPD%2F102696%2F2014/PT/ es_ES
dc.relation.projectID info:eu-repo/grantAgreement/MINECO//TIN2016-76515-R/ES/RETORNO AL HOGAR: SISTEMA DE MEJORA DE LA AUTONOMIA DE PERSONAS CON DAÑO CEREBRAL ADQUIRIDO Y DEPENDIENTES EN SU INTEGRACION EN LA SOCIEDAD/ es_ES
dc.rights.accessRights Open access es_ES
dc.description.bibliographicCitation Martinez-Martin, E.; Araujo, A.; Cazorla, M. (2019). PHAROS 2.0-A PHysical Assistant RObot System Improved. Sensors. 19(20):1-18. https://doi.org/10.3390/s19204531 es_ES
dc.description.accrualMethod S es_ES
dc.relation.publisherversion https://doi.org/10.3390/s19204531 es_ES
dc.description.upvformatpinicio 1 es_ES
dc.description.upvformatpfin 18 es_ES
dc.type.version info:eu-repo/semantics/publishedVersion es_ES
dc.description.volume 19 es_ES
dc.description.issue 20 es_ES
dc.identifier.eissn 1424-8220 es_ES
dc.relation.pasarela S\402987 es_ES
dc.contributor.funder Fundação para a Ciência e a Tecnologia, Portugal es_ES
dc.contributor.funder Ministerio de Economía y Competitividad es_ES
dc.description.references World Alzheimer Report 2018—The State of the Art of Dementia Research: New Frontiers. https://www.alz.co.uk/research/WorldAlzheimerReport2018.pdf es_ES
dc.description.references World Report on Disability. https://www.who.int/disabilities/world_report/2011/report.pdf es_ES
dc.description.references Global Action Plan for the Prevention and Control of NCDs 2013–2020. https://apps.who.int/iris/bitstream/handle/10665/94384/9789241506236_eng.pdf es_ES
dc.description.references Management of Physical Health Conditions in Adults with Severe Mental Disorders: WHO Guidelines. http://apps.who.int/iris/bitstream/handle/10665/275718/9789241550383-eng.pdf es_ES
dc.description.references Paúl, C., Teixeira, L., & Ribeiro, O. (2017). Active Aging in Very Old Age and the Relevance of Psychological Aspects. Frontiers in Medicine, 4. doi:10.3389/fmed.2017.00181 es_ES
dc.description.references Caprara, M., Molina, M. Á., Schettini, R., Santacreu, M., Orosa, T., Mendoza-Núñez, V. M., … Fernández-Ballesteros, R. (2013). Active Aging Promotion: Results from the Vital Aging Program. Current Gerontology and Geriatrics Research, 2013, 1-14. doi:10.1155/2013/817813 es_ES
dc.description.references Exercises for Older People. https://www.nhs.uk/Tools/Documents/NHS_ExercisesForOlderPeople.pdf es_ES
dc.description.references Mura, G., & Carta, M. G. (2013). Physical Activity in Depressed Elderly. A Systematic Review. Clinical Practice & Epidemiology in Mental Health, 9(1), 125-135. doi:10.2174/1745017901309010125 es_ES
dc.description.references Unmet Needs: Improper Social Care Assessments for Older People in England. https://www.hrw.org/sites/default/files/report_pdf/uk0119_web3.pdf es_ES
dc.description.references Stefano, M., Patrizia, P., Mario, A., Ferlini, G., Rizzello, R., & Rosati, G. (2014). Robotic Upper Limb Rehabilitation after Acute Stroke by NeReBot: Evaluation of Treatment Costs. BioMed Research International, 2014, 1-5. doi:10.1155/2014/265634 es_ES
dc.description.references Amadeo. https://tyromotion.com/en/produkte/amadeo/ es_ES
dc.description.references Lokomat. https://www.hocoma.com/solutions/lokomat/ es_ES
dc.description.references Riablo. http://www.syncrospain.com/rehabilitacion/ es_ES
dc.description.references G-EO SYSTEM—An Advanced Robotic Gait Trainer. https://www.rehatechnology.com/en/products/g-eo-system es_ES
dc.description.references IREX—Immersive Rehabilitation EXercise. http://www.gesturetekhealth.com/products/irex es_ES
dc.description.references Fitbit. https://www.fitbit.com/es/home es_ES
dc.description.references Sergueeva, K., & Shaw, N. (2016). Wearable Technology in Hospitals: Overcoming Patient Concerns About Privacy. Lecture Notes in Computer Science, 446-456. doi:10.1007/978-3-319-39399-5_42 es_ES
dc.description.references Martinez-Martin, E., & Cazorla, M. (2019). Rehabilitation Technology: Assistance from Hospital to Home. Computational Intelligence and Neuroscience, 2019, 1-8. doi:10.1155/2019/1431509 es_ES
dc.description.references Toyra. http://www.toyra.org/en/ es_ES
dc.description.references Jintronix Rehabilitation System (JRS). http://www.jintronix.com/ es_ES
dc.description.references Gadde, P., Kharrazi, H., Patel, H., & MacDorman, K. F. (2011). Toward Monitoring and Increasing Exercise Adherence in Older Adults by Robotic Intervention: A Proof of Concept Study. Journal of Robotics, 2011, 1-11. doi:10.1155/2011/438514 es_ES
dc.description.references Görer, B., Salah, A. A., & Akın, H. L. (2016). An autonomous robotic exercise tutor for elderly people. Autonomous Robots, 41(3), 657-678. doi:10.1007/s10514-016-9598-5 es_ES
dc.description.references ENRICHME. http://www.enrichme.eu/ es_ES
dc.description.references Baillie, L., Breazeal, C., Denman, P., Foster, M. E., Fischer, K., & Cauchard, J. R. (2019). The Challenges of Working on Social Robots that Collaborate with People. Extended Abstracts of the 2019 CHI Conference on Human Factors in Computing Systems. doi:10.1145/3290607.3299022 es_ES
dc.description.references Canal, G., Escalera, S., & Angulo, C. (2016). A real-time Human-Robot Interaction system based on gestures for assistive scenarios. Computer Vision and Image Understanding, 149, 65-77. doi:10.1016/j.cviu.2016.03.004 es_ES
dc.description.references Makris, S., Karagiannis, P., Koukas, S., & Matthaiakis, A.-S. (2016). Augmented reality system for operator support in human–robot collaborative assembly. CIRP Annals, 65(1), 61-64. doi:10.1016/j.cirp.2016.04.038 es_ES
dc.description.references Tsarouchi, P., Makris, S., & Chryssolouris, G. (2016). Human–robot interaction review and challenges on task planning and programming. International Journal of Computer Integrated Manufacturing, 29(8), 916-931. doi:10.1080/0951192x.2015.1130251 es_ES
dc.description.references Tsarouchi, P., Athanasatos, A., Makris, S., Chatzigeorgiou, X., & Chryssolouris, G. (2016). High Level Robot Programming Using Body and Hand Gestures. Procedia CIRP, 55, 1-5. doi:10.1016/j.procir.2016.09.020 es_ES
dc.description.references Poppe, R. (2010). A survey on vision-based human action recognition. Image and Vision Computing, 28(6), 976-990. doi:10.1016/j.imavis.2009.11.014 es_ES
dc.description.references Pishchulin, L., Insafutdinov, E., Tang, S., Andres, B., Andriluka, M., Gehler, P., & Schiele, B. (2016). DeepCut: Joint Subset Partition and Labeling for Multi Person Pose Estimation. 2016 IEEE Conference on Computer Vision and Pattern Recognition (CVPR). doi:10.1109/cvpr.2016.533 es_ES
dc.description.references Guo, F., He, Y., & Guan, L. (2017). RGB-D camera pose estimation using deep neural network. 2017 IEEE Global Conference on Signal and Information Processing (GlobalSIP). doi:10.1109/globalsip.2017.8308674 es_ES
dc.description.references Schwarcz, S., & Pollard, T. (2018). 3D Human Pose Estimation from Deep Multi-View 2D Pose. 2018 24th International Conference on Pattern Recognition (ICPR). doi:10.1109/icpr.2018.8545631 es_ES
dc.description.references Cao, Z., Hidalgo Martinez, G., Simon, T., Wei, S.-E., & Sheikh, Y. A. (2019). OpenPose: Realtime Multi-Person 2D Pose Estimation using Part Affinity Fields. IEEE Transactions on Pattern Analysis and Machine Intelligence, 1-1. doi:10.1109/tpami.2019.2929257 es_ES
dc.description.references Cao, Z., Simon, T., Wei, S.-E., & Sheikh, Y. (2017). Realtime Multi-person 2D Pose Estimation Using Part Affinity Fields. 2017 IEEE Conference on Computer Vision and Pattern Recognition (CVPR). doi:10.1109/cvpr.2017.143 es_ES
dc.description.references Martinez-Martin, E., & Cazorla, M. (2019). A Socially Assistive Robot for Elderly Exercise Promotion. IEEE Access, 7, 75515-75529. doi:10.1109/access.2019.2921257 es_ES
dc.description.references He, K., Zhang, X., Ren, S., & Sun, J. (2016). Deep Residual Learning for Image Recognition. 2016 IEEE Conference on Computer Vision and Pattern Recognition (CVPR). doi:10.1109/cvpr.2016.90 es_ES
dc.description.references Hochreiter, S., & Schmidhuber, J. (1997). Long Short-Term Memory. Neural Computation, 9(8), 1735-1780. doi:10.1162/neco.1997.9.8.1735 es_ES
dc.description.references Example of the Glicko-2 System. http://www.glicko.net/glicko/glicko2.pdf es_ES
dc.description.references Bethancourt, H. J., Rosenberg, D. E., Beatty, T., & Arterburn, D. E. (2014). Barriers to and Facilitators of Physical Activity Program Use Among Older Adults. Clinical Medicine & Research, 12(1-2), 10-20. doi:10.3121/cmr.2013.1171 es_ES

