
Development and Calibration of an Eye-Tracking Fixation Identification Algorithm for Immersive Virtual Reality

RiuNet: Institutional Repository of the Universitat Politècnica de València


dc.contributor.author Llanes-Jurado, José es_ES
dc.contributor.author Marín-Morales, Javier es_ES
dc.contributor.author Guixeres Provinciale, Jaime es_ES
dc.contributor.author Alcañiz Raya, Mariano Luis es_ES
dc.date.accessioned 2021-06-12T03:33:42Z
dc.date.available 2021-06-12T03:33:42Z
dc.date.issued 2020-09 es_ES
dc.identifier.uri http://hdl.handle.net/10251/167865
dc.description.abstract [EN] Fixation identification is an essential task in the extraction of relevant information from gaze patterns, and various algorithms are used in the identification process. However, the thresholds used in these algorithms greatly affect their sensitivity. Moreover, the application of these algorithms to eye-tracking technologies integrated into head-mounted displays, where the subject's head position is unrestricted, is still an open issue. Therefore, the adaptation of eye-tracking algorithms and their thresholds to immersive virtual reality frameworks needs to be validated. This study presents the development of a dispersion-threshold identification algorithm applied to data obtained from an eye-tracking system integrated into a head-mounted display. Rule-based criteria are proposed to calibrate the thresholds of the algorithm through different features, such as the number of fixations and the percentage of points that belong to a fixation. The results show that distance-dispersion thresholds between 1 and 1.6 degrees and time windows between 0.25 and 0.4 s are the acceptable parameter ranges, with 1 degree and 0.25 s being the optimum. The work presents a calibrated algorithm to be applied in future experiments with eye-tracking integrated into head-mounted displays, as well as guidelines for calibrating fixation identification algorithms. (An illustrative sketch of the dispersion-threshold procedure follows this record.) es_ES
dc.description.sponsorship We thank Pepe Roda Belles for the development of the virtual reality environment and the integration of the HMD with the Unity platform. We also thank Masoud Moghaddasi for useful discussions and recommendations. es_ES
dc.language English es_ES
dc.publisher MDPI AG es_ES
dc.relation.ispartof Sensors es_ES
dc.rights Attribution (by) es_ES
dc.subject Eye-Tracking es_ES
dc.subject Fixation identification es_ES
dc.subject Virtual reality es_ES
dc.subject Immersive virtual reality es_ES
dc.subject Head-Mounted display es_ES
dc.subject Calibration es_ES
dc.subject Area of interest es_ES
dc.subject.classification GRAPHIC EXPRESSION IN ENGINEERING es_ES
dc.subject.classification STATISTICS AND OPERATIONS RESEARCH es_ES
dc.title Development and Calibration of an Eye-Tracking Fixation Identification Algorithm for Immersive Virtual Reality es_ES
dc.type Article es_ES
dc.identifier.doi 10.3390/s20174956 es_ES
dc.rights.accessRights Open access es_ES
dc.contributor.affiliation Universitat Politècnica de València. Departamento de Ingeniería Gráfica - Departament d'Enginyeria Gràfica es_ES
dc.contributor.affiliation Universitat Politècnica de València. Instituto Interuniversitario de Investigación en Bioingeniería y Tecnología Orientada al Ser Humano - Institut Interuniversitari d'Investigació en Bioenginyeria i Tecnologia Orientada a l'Ésser Humà es_ES
dc.description.bibliographicCitation Llanes-Jurado, J.; Marín-Morales, J.; Guixeres Provinciale, J.; Alcañiz Raya, ML. (2020). Development and Calibration of an Eye-Tracking Fixation Identification Algorithm for Immersive Virtual Reality. Sensors. 20(17):1-15. https://doi.org/10.3390/s20174956 es_ES
dc.description.accrualMethod S es_ES
dc.relation.publisherversion https://doi.org/10.3390/s20174956 es_ES
dc.description.upvformatpinicio 1 es_ES
dc.description.upvformatpfin 15 es_ES
dc.type.version info:eu-repo/semantics/publishedVersion es_ES
dc.description.volume 20 es_ES
dc.description.issue 17 es_ES
dc.identifier.eissn 1424-8220 es_ES
dc.identifier.pmid 32883026 es_ES
dc.identifier.pmcid PMC7547381 es_ES
dc.relation.pasarela S\417630 es_ES
dc.description.references Cipresso, P., Giglioli, I. A. C., Raya, M. A., & Riva, G. (2018). The Past, Present, and Future of Virtual and Augmented Reality Research: A Network and Cluster Analysis of the Literature. Frontiers in Psychology, 9. doi:10.3389/fpsyg.2018.02086 es_ES
dc.description.references Chicchi Giglioli, I. A., Pravettoni, G., Sutil Martín, D. L., Parra, E., & Raya, M. A. (2017). A Novel Integrating Virtual Reality Approach for the Assessment of the Attachment Behavioral System. Frontiers in Psychology, 8. doi:10.3389/fpsyg.2017.00959 es_ES
dc.description.references Marín-Morales, J., Higuera-Trujillo, J. L., De-Juan-Ripoll, C., Llinares, C., Guixeres, J., Iñarra, S., & Alcañiz, M. (2019). Navigation Comparison between a Real and a Virtual Museum: Time-dependent Differences using a Head Mounted Display. Interacting with Computers, 31(2), 208-220. doi:10.1093/iwc/iwz018 es_ES
dc.description.references Kober, S. E., Kurzmann, J., & Neuper, C. (2012). Cortical correlate of spatial presence in 2D and 3D interactive virtual reality: An EEG study. International Journal of Psychophysiology, 83(3), 365-374. doi:10.1016/j.ijpsycho.2011.12.003 es_ES
dc.description.references Borrego, A., Latorre, J., Llorens, R., Alcañiz, M., & Noé, E. (2016). Feasibility of a walking virtual reality system for rehabilitation: objective and subjective parameters. Journal of NeuroEngineering and Rehabilitation, 13(1). doi:10.1186/s12984-016-0174-1 es_ES
dc.description.references Clemente, M., Rodríguez, A., Rey, B., & Alcañiz, M. (2014). Assessment of the influence of navigation control and screen size on the sense of presence in virtual reality using EEG. Expert Systems with Applications, 41(4), 1584-1592. doi:10.1016/j.eswa.2013.08.055 es_ES
dc.description.references Borrego, A., Latorre, J., Alcañiz, M., & Llorens, R. (2018). Comparison of Oculus Rift and HTC Vive: Feasibility for Virtual Reality-Based Exploration, Navigation, Exergaming, and Rehabilitation. Games for Health Journal, 7(3), 151-156. doi:10.1089/g4h.2017.0114 es_ES
dc.description.references Jensen, L., & Konradsen, F. (2017). A review of the use of virtual reality head-mounted displays in education and training. Education and Information Technologies, 23(4), 1515-1529. doi:10.1007/s10639-017-9676-0 es_ES
dc.description.references Jost, T. A., Drewelow, G., Koziol, S., & Rylander, J. (2019). A quantitative method for evaluation of 6 degree of freedom virtual reality systems. Journal of Biomechanics, 97, 109379. doi:10.1016/j.jbiomech.2019.109379 es_ES
dc.description.references Chandrasekera, T., Fernando, K., & Puig, L. (2019). Effect of Degrees of Freedom on the Sense of Presence Generated by Virtual Reality (VR) Head-Mounted Display Systems: A Case Study on the Use of VR in Early Design Studios. Journal of Educational Technology Systems, 47(4), 513-522. doi:10.1177/0047239518824862 es_ES
dc.description.references Bălan, O., Moise, G., Moldoveanu, A., Leordeanu, M., & Moldoveanu, F. (2020). An Investigation of Various Machine and Deep Learning Techniques Applied in Automatic Fear Level Detection and Acrophobia Virtual Therapy. Sensors, 20(2), 496. doi:10.3390/s20020496 es_ES
dc.description.references Armstrong, T., & Olatunji, B. O. (2012). Eye tracking of attention in the affective disorders: A meta-analytic review and synthesis. Clinical Psychology Review, 32(8), 704-723. doi:10.1016/j.cpr.2012.09.004 es_ES
dc.description.references Rayner, K. (1998). Eye movements in reading and information processing: 20 years of research. Psychological Bulletin, 124(3), 372-422. doi:10.1037/0033-2909.124.3.372 es_ES
dc.description.references Irwin, D. E. (1992). Memory for position and identity across eye movements. Journal of Experimental Psychology: Learning, Memory, and Cognition, 18(2), 307-317. doi:10.1037/0278-7393.18.2.307 es_ES
dc.description.references Tanriverdi, V., & Jacob, R. J. K. (2000). Interacting with eye movements in virtual environments. Proceedings of the SIGCHI conference on Human factors in computing systems - CHI ’00. doi:10.1145/332040.332443 es_ES
dc.description.references Skulmowski, A., Bunge, A., Kaspar, K., & Pipa, G. (2014). Forced-choice decision-making in modified trolley dilemma situations: a virtual reality and eye tracking study. Frontiers in Behavioral Neuroscience, 8. doi:10.3389/fnbeh.2014.00426 es_ES
dc.description.references Juvrud, J., Gredebäck, G., Åhs, F., Lerin, N., Nyström, P., Kastrati, G., & Rosén, J. (2018). The Immersive Virtual Reality Lab: Possibilities for Remote Experimental Manipulations of Autonomic Activity on a Large Scale. Frontiers in Neuroscience, 12. doi:10.3389/fnins.2018.00305 es_ES
dc.description.references Hessels, R. S., Niehorster, D. C., Nyström, M., Andersson, R., & Hooge, I. T. C. (2018). Is the eye-movement field confused about fixations and saccades? A survey among 124 researchers. Royal Society Open Science, 5(8), 180502. doi:10.1098/rsos.180502 es_ES
dc.description.references Diaz, G., Cooper, J., Kit, D., & Hayhoe, M. (2013). Real-time recording and classification of eye movements in an immersive virtual environment. Journal of Vision, 13(12), 5-5. doi:10.1167/13.12.5 es_ES
dc.description.references Duchowski, A. T., Medlin, E., Gramopadhye, A., Melloy, B., & Nair, S. (2001). Binocular eye tracking in VR for visual inspection training. Proceedings of the ACM symposium on Virtual reality software and technology - VRST ’01. doi:10.1145/505008.505010 es_ES
dc.description.references Lim, J. Z., Mountstephens, J., & Teo, J. (2020). Emotion Recognition Using Eye-Tracking: Taxonomy, Review and Current Challenges. Sensors, 20(8), 2384. doi:10.3390/s20082384 es_ES
dc.description.references Manor, B. R., & Gordon, E. (2003). Defining the temporal threshold for ocular fixation in free-viewing visuocognitive tasks. Journal of Neuroscience Methods, 128(1-2), 85-93. doi:10.1016/s0165-0270(03)00151-1 es_ES
dc.description.references Salvucci, D. D., & Goldberg, J. H. (2000). Identifying fixations and saccades in eye-tracking protocols. Proceedings of the symposium on Eye tracking research & applications - ETRA ’00. doi:10.1145/355017.355028 es_ES
dc.description.references Duchowski, A., Medlin, E., Cournia, N., Murphy, H., Gramopadhye, A., Nair, S., … Melloy, B. (2002). 3-D eye movement analysis. Behavior Research Methods, Instruments, & Computers, 34(4), 573-591. doi:10.3758/bf03195486 es_ES
dc.description.references Bobic, V., & Graovac, S. (2016). Development, implementation and evaluation of new eye tracking methodology. 2016 24th Telecommunications Forum (TELFOR). doi:10.1109/telfor.2016.7818800 es_ES
dc.description.references Sidenmark, L., & Lundström, A. (2019). Gaze behaviour on interacted objects during hand interaction in virtual reality for eye tracking calibration. Proceedings of the 11th ACM Symposium on Eye Tracking Research & Applications. doi:10.1145/3314111.3319815 es_ES
dc.description.references Alghamdi, N., & Alhalabi, W. (2019). Fixation Detection with Ray-casting in Immersive Virtual Reality. International Journal of Advanced Computer Science and Applications, 10(7). doi:10.14569/ijacsa.2019.0100710 es_ES
dc.description.references Blignaut, P. (2009). Fixation identification: The optimum threshold for a dispersion algorithm. Attention, Perception, & Psychophysics, 71(4), 881-895. doi:10.3758/app.71.4.881 es_ES
dc.description.references Shic, F., Scassellati, B., & Chawarska, K. (2008). The incomplete fixation measure. Proceedings of the 2008 symposium on Eye tracking research & applications - ETRA ’08. doi:10.1145/1344471.1344500 es_ES
dc.description.references Vive Pro Eye. https://www.vive.com/us/ es_ES
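
The abstract above describes a dispersion-threshold identification (I-DT) approach, in which a window of gaze samples is classified as a fixation when its angular dispersion stays below a distance threshold for at least a minimum duration. The sketch below illustrates that general technique only; it is not the authors' implementation. It assumes gaze samples already expressed as (timestamp in seconds, x and y gaze angles in degrees), and the function and parameter names (identify_fixations, max_dispersion_deg, min_duration_s) are illustrative. The defaults follow the optimum reported in the abstract (1 degree, 0.25 s).

```python
# Minimal I-DT (dispersion-threshold identification) sketch.
# Assumption: each sample is (timestamp_s, x_deg, y_deg), with gaze
# direction already converted to angular coordinates in degrees.

from typing import List, Tuple

Sample = Tuple[float, float, float]  # (timestamp in s, x in deg, y in deg)


def _dispersion(window: List[Sample]) -> float:
    """Dispersion = (max(x) - min(x)) + (max(y) - min(y)), in degrees."""
    xs = [s[1] for s in window]
    ys = [s[2] for s in window]
    return (max(xs) - min(xs)) + (max(ys) - min(ys))


def identify_fixations(samples: List[Sample],
                       max_dispersion_deg: float = 1.0,
                       min_duration_s: float = 0.25) -> List[Tuple[float, float]]:
    """Return (start_time, end_time) intervals classified as fixations."""
    fixations = []
    i, n = 0, len(samples)
    while i < n:
        # Build an initial window covering at least the minimum duration.
        j = i
        while j < n and samples[j][0] - samples[i][0] < min_duration_s:
            j += 1
        if j >= n:
            break  # not enough remaining samples for a full window
        if _dispersion(samples[i:j + 1]) <= max_dispersion_deg:
            # Grow the window while dispersion stays within the threshold.
            while j + 1 < n and _dispersion(samples[i:j + 2]) <= max_dispersion_deg:
                j += 1
            fixations.append((samples[i][0], samples[j][0]))
            i = j + 1
        else:
            i += 1  # slide the window start by one sample
    return fixations
```

In an HMD setting, the angular coordinates would typically be derived from the combined head-and-eye gaze ray (for example, via ray-casting onto the virtual scene), a preprocessing step that is outside the scope of this sketch.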

