Calculate Target Position of Object in 3-Dimensional Area Based on the Perceived Locations Using EOG Signals

Abstract

EOG is a biosignal that occurs during eye activities such as eye movements and blinks, and it has a linear relationship with gaze distance. This research proposes the detection of an object's position in a 3-dimensional area using gaze motion. An affine transform method was developed to calculate the gaze distance in pixel units; its homogeneous matrix was determined from five geometric processes: translation-1, rotation, translation-2, shear, and dilatation. To provide tracking ability in the 3-dimensional area, two cameras were attached, one in front of the object and one above it. The cameras were accessed by voluntary blinks. A blink was identified from the EOG signal when the absolute ratio between its positive peak and negative peak was greater than 1, and every blink toggled the active camera. The position of the object was given by the locations perceived by the two cameras: every movement in pixel coordinates was converted into centimeters, and the perceived location was then transformed into the base coordinate frame. The results show that the blink method successfully switched the active camera, both cameras could show the location of the object from their respective sides, and calculating the gaze distance using the affine transform gave satisfactory results. Using this method, EOG-based control of a machine in a 3-dimensional area could be developed.
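As an illustration of the pipeline the abstract describes, the following Python sketch composes the five geometric processes into a single homogeneous matrix, implements the peak-ratio blink rule that toggles the active camera, and fuses the two camera views into a 3-dimensional position. It is a minimal sketch, not the authors' code: the composition order of the matrices, the axis assignments of the two cameras, the PX_TO_CM calibration factor, and the fuse_views helper are illustrative assumptions, not values taken from the paper.

    import numpy as np

    # --- Affine transform built from five geometric processes (3x3 homogeneous) ---

    def translation(tx, ty):
        return np.array([[1.0, 0.0, tx],
                         [0.0, 1.0, ty],
                         [0.0, 0.0, 1.0]])

    def rotation(theta):
        c, s = np.cos(theta), np.sin(theta)
        return np.array([[c, -s, 0.0],
                         [s,  c, 0.0],
                         [0.0, 0.0, 1.0]])

    def shear(kx, ky):
        return np.array([[1.0, kx, 0.0],
                         [ky, 1.0, 0.0],
                         [0.0, 0.0, 1.0]])

    def dilatation(sx, sy):
        return np.array([[sx, 0.0, 0.0],
                         [0.0, sy, 0.0],
                         [0.0, 0.0, 1.0]])

    def gaze_affine(t1, theta, t2, k, s):
        # Composition order is an assumption; the paper determines its own matrix.
        return dilatation(*s) @ shear(*k) @ translation(*t2) @ rotation(theta) @ translation(*t1)

    def map_gaze(H, px_point):
        # Map a raw EOG-derived gaze point (pixels) through the homogeneous matrix.
        x, y, _ = H @ np.array([px_point[0], px_point[1], 1.0])
        return x, y

    # --- Blink rule from the abstract: |positive peak| / |negative peak| > 1 ---

    def is_blink(eog_window):
        eog = np.asarray(eog_window, dtype=float)
        pos_peak, neg_peak = eog.max(), eog.min()
        return neg_peak < 0.0 and abs(pos_peak) / abs(neg_peak) > 1.0

    PX_TO_CM = 0.05        # hypothetical pixel-to-centimeter calibration factor

    def fuse_views(front_px, top_px):
        # Hypothetical axis assignment: the front camera perceives (x, z),
        # the top camera perceives (x, y); the shared x axis is averaged.
        fx, fz = front_px[0] * PX_TO_CM, front_px[1] * PX_TO_CM
        tx, ty = top_px[0] * PX_TO_CM, top_px[1] * PX_TO_CM
        return (fx + tx) / 2.0, ty, fz   # (x, y, z) in the base coordinate frame

    active_camera = 0      # 0 = front camera, 1 = top camera
    # if is_blink(window): active_camera ^= 1   # every voluntary blink toggles

The toggle mirrors the abstract's description that each detected blink switches which camera's perceived location is being read, while fuse_views stands in for the paper's conversion of the two perceived locations to the base coordinate.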

Share and Cite:

Rusydi, M., Sasaki, M. and Ito, S. (2014) Calculate Target Position of Object in 3-Dimensional Area Based on the Perceived Locations Using EOG Signals. Journal of Computer and Communications, 2, 53-60. doi: 10.4236/jcc.2014.211007.

Conflicts of Interest

The authors declare no conflicts of interest.


Copyright © 2024 by authors and Scientific Research Publishing Inc.

This work and the related PDF file are licensed under a Creative Commons Attribution 4.0 International License.