Utilization of eye tracking technology to control lights in the operating room
Abstract
Control-system technology is developing rapidly, particularly for applications that assist people with disabilities and ease the work of health personnel, where contactless operation also helps maintain the sterility of hospital equipment. Eye tracking technology has advanced quickly over the last few decades, and an eye-tracking control system lets users with mobility impairments operate devices through eye movements alone. This research aims to develop a light control system driven by eye activity using Google's MediaPipe framework. The study used two 10 W lamps, A and B. Lamp A is switched on with a right-eye blink and off with a left-eye blink, while lamp B's intensity is adjusted by how far both eyes are opened. The lighting control system, which combines eye tracking with image processing, was successfully implemented. The data collected cover eye activity, distance, eye position relative to the camera, and differences in participant backgrounds, and the system performed consistently across these conditions. With respect to distance, however, the system reads eye states precisely only at 50 cm and 60 cm.
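To make the control scheme above concrete, the sketch below shows one way eye openness could be measured with MediaPipe Face Mesh and OpenCV to toggle lamp A and scale lamp B's brightness. This is a minimal illustration, not the paper's implementation: the eyelid landmark indices, blink threshold, and brightness scaling are all assumptions.

```python
import cv2
import mediapipe as mp

# Illustrative values only: these landmark indices are commonly used
# upper/lower eyelid points in MediaPipe Face Mesh, and the threshold
# and brightness scaling are assumptions, not figures from the paper.
RIGHT_EYE = (159, 145)   # assumed upper/lower lid landmarks, right eye
LEFT_EYE = (386, 374)    # assumed upper/lower lid landmarks, left eye
BLINK_THRESHOLD = 0.02   # normalized openness below which the eye counts as closed

def eye_openness(landmarks, upper, lower):
    """Normalized vertical gap between the upper and lower eyelid landmarks."""
    return abs(landmarks[upper].y - landmarks[lower].y)

lamp_a_on = False      # state of lamp A (on/off)
brightness_b = 0       # brightness of lamp B, 0-100 %

cap = cv2.VideoCapture(0)
with mp.solutions.face_mesh.FaceMesh(refine_landmarks=True) as face_mesh:
    while cap.isOpened():
        ok, frame = cap.read()
        if not ok:
            break
        results = face_mesh.process(cv2.cvtColor(frame, cv2.COLOR_BGR2RGB))
        if results.multi_face_landmarks:
            lm = results.multi_face_landmarks[0].landmark
            right = eye_openness(lm, *RIGHT_EYE)
            left = eye_openness(lm, *LEFT_EYE)
            # Lamp A: right-eye blink switches on, left-eye blink switches off.
            if right < BLINK_THRESHOLD <= left:
                lamp_a_on = True
            elif left < BLINK_THRESHOLD <= right:
                lamp_a_on = False
            # Lamp B: combined eye openness mapped to a 0-100 % brightness
            # level (the scale factor is an arbitrary placeholder).
            brightness_b = min(100, int((right + left) * 2500))
        cv2.imshow("eye control", frame)
        if cv2.waitKey(1) & 0xFF == 27:  # Esc to quit
            break
cap.release()
cv2.destroyAllWindows()
```

In a working system, the lamp_a_on flag and brightness_b value would be forwarded to lamp-driving hardware each frame, for example over a serial link to a microcontroller. The normalized openness would also ideally be scaled by detected face size, since raw landmark distances shrink with viewing distance, which is consistent with the system reading precisely only at 50 cm and 60 cm.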
This work is licensed under a Creative Commons Attribution-ShareAlike 4.0 International License.