Details

Article title

MEMS mirror based eye tracking: simulation of the system parameter effect on the accuracy of pupil position estimation

Journal title

Metrology and Measurement Systems

Year

2021

Volume

vol. 28

Issue

No. 4

Affiliations

Pomianek, Mateusz : Military University of Technology, Institute of Optoelectronics, 2 Kaliskiego St., 00-908 Warsaw, Poland
Piszczek, Marek : Military University of Technology, Institute of Optoelectronics, 2 Kaliskiego St., 00-908 Warsaw, Poland
Maciejewski, Marcin : Military University of Technology, Institute of Optoelectronics, 2 Kaliskiego St., 00-908 Warsaw, Poland

Authors

Pomianek, Mateusz ; Piszczek, Marek ; Maciejewski, Marcin

Keywords

eye tracking ; MEMS mirror ; laser scanning ; head-mounted display

PAS Division

Technical Sciences

Pages

711-724

Publisher

Polish Academy of Sciences Committee on Metrology and Scientific Instrumentation

Bibliography

[1] Duchowski, A. T. (2017). Eye tracking methodology: Theory and practice. Springer. https://doi.org/10.1007/978-3-319-57883-5
[2] Judd, T., Ehinger, K., Durand, F., & Torralba, A. (2009, September). Learning to predict where humans look. IEEE 12th International Conference on Computer Vision (pp. 2106–2113). IEEE. https://doi.org/10.1109/ICCV.2009.5459462
[3] Goldberg, J. H., & Kotval, X. P. (1999). Computer interface evaluation using eye movements: methods and constructs. International Journal of Industrial Ergonomics, 24(6), 631–645. https://doi.org/10.1016/S0169-8141(98)00068-7
[4] Hansen, D. W., & Ji, Q. (2009). In the eye of the beholder: A survey of models for eyes and gaze. IEEE Transactions on Pattern Analysis and Machine Intelligence, 32(3), 478–500. https://doi.org/10.1109/TPAMI.2009.30
[5] Carvalho, N., Laurent, E., Noiret, N., Chopard, G., Haffen, E., Bennabi, D., & Vandel, P. (2015). Eye movement in unipolar and bipolar depression: A systematic review of the literature. Frontiers in Psychology, 6, 1809. https://doi.org/10.3389/fpsyg.2015.01809
[6] Bittencourt, J., Velasques, B., Teixeira, S., Basile, L. F., Salles, J. I., Nardi, A. E., Budde, H., Cagy, M., Piedade, R., & Ribeiro, P. (2013). Saccadic eye movement applications for psychiatric disorders. Neuropsychiatric Disease and Treatment, 9, 1393. https://doi.org/10.2147/NDT.S45931
[7] Duchowski, A. T., Medlin, E., Gramopadhye, A., Melloy, B., & Nair, S. (2001, November). Binocular eye tracking in VR for visual inspection training. Proceedings of the ACM symposium on Virtual reality software and technology (pp. 1–8). https://doi.org/10.1145/505008.505010
[8] Blattgerste, J., Renner, P., & Pfeiffer, T. (2018, June). Advantages of eye-gaze over head-gaze-based selection in virtual and augmented reality under varying field of views. Proceedings of the Workshop on Communication by Gaze Interaction (pp. 1–9). https://doi.org/10.1145/3206343.3206349
[9] Pasarica, A., Bozomitu, R. G., Cehan, V., Lupu, R. G., & Rotariu, C. (2015, October). Pupil detection algorithms for eye tracking applications. 2015 IEEE 21st International Symposium for Design and Technology in Electronic Packaging (SIITME) (pp. 161–164). IEEE. https://doi.org/10.1109/SIITME.2015.7342317
[10] Stengel, M., Grogorick, S., Eisemann, M., Eisemann, E., & Magnor, M. A. (2015, October). An affordable solution for binocular eye tracking and calibration in head-mounted displays. Proceedings of the 23rd ACM international conference on Multimedia (pp. 15–24). https://doi.org/10.1145/2733373.2806265
[11] Wen, Q., Bradley, D., Beeler, T., Park, S., Hilliges, O., Yong, J., & Xu, F. (2020). Accurate Real-time 3D Gaze Tracking Using a Lightweight Eyeball Calibration. Computer Graphics Forum, 39(2), 475–485. https://doi.org/10.1111/cgf.13945
[12] Lee, G. J., Jang, S. W., & Kim, G. Y. (2020). Pupil detection and gaze tracking using a deformable template. Multimedia Tools and Applications, 79(19), 12939–12958. https://doi.org/10.1007/s11042-020-08638-7
[13] Gegenfurtner, A., Lehtinen, E., & Säljö, R. (2011). Expertise differences in the comprehension of visualizations: A meta-analysis of eye-tracking research in professional domains. Educational Psychology Review, 23(4), 523–552. https://doi.org/10.1007/s10648-011-9174-7
[14] Sarkar, N., O’Hanlon, B., Rohani, A., Strathearn, D., Lee, G., Olfat, M., & Mansour, R. R. (2017, January). A resonant eye-tracking microsystem for velocity estimation of saccades and foveated rendering. IEEE 30th International Conference on Micro Electro Mechanical Systems (MEMS) (pp. 304–307). IEEE. https://doi.org/10.1109/MEMSYS.2017.7863402
[15] Bartuzel, M. M., Wróbel, K., Tamborski, S., Meina, M., Nowakowski, M., Dalasinski, K., Szkulmowska, A., & Szkulmowski, M. (2020). High-resolution, ultrafast, wide-field retinal eye-tracking for enhanced quantification of fixational and saccadic motion. Biomedical Optics Express, 11(6), 3164–3180. https://doi.org/10.1364/BOE.392849
[16] Meyer, J., Schlebusch, T., Fuhl, W., & Kasneci, E. (2020). A novel camera-free eye tracking sensor for augmented reality based on laser scanning. IEEE Sensors Journal, 20(24), 15204–15212. https://doi.org/10.1109/JSEN.2020.3011985
[17] Pomianek, M., Piszczek, M., Maciejewski, M., & Krukowski, P. (2020, October). Pupil Position Estimation Error in an Eye Tracking System Based on the MEMS Mirror Scanning Method. Proceedings of the 3rd International Conference on Microelectronic Devices and Technologies (MicDAT’ 2020) (pp. 28–30). IFSA.
[18] Pengfei, Y., Zhengming, C., Jing, T., & Lina, Q. (2016). Virtual Simulation System of Cutter Suction Dredger Based on Unity3D. Journal of Systems Simulation, 28(9), 2069–2075.
[19] Richards, D., & Taylor, M. (2015). A Comparison of learning gains when using a 2D simulation tool versus a 3D virtual world: An experiment to find the right representation involving the Marginal Value Theorem. Computers & Education, 86, 157–171. https://doi.org/10.1016/j.compedu.2015.03.009
[20] Müller, L. M., Mandon, K., Gliesche, P., Weiß, S., & Heuten, W. (2020, November). Visualization of Eye Tracking Data in Unity3D. 19th International Conference on Mobile and Ubiquitous Multimedia (pp. 343–344). https://doi.org/10.1145/3428361.3431194

Date

2021.12.22

Type

Article

Identifier

DOI: 10.24425/mms.2021.137704 ; ISSN 0860-8229