3D Localization and Tracking of Objects Using Miniature Microphones
Radu Ionescu, Riccardo Carotenuto, Fabio Urbani
DOI: 10.4236/wsn.2011.35017

Abstract

A system for the accurate localization and tracking of remote objects is introduced. It employs a reference frame of four coplanar ultrasound sources as transmitters and miniature microphones mounted on the remote objects as receivers. The transmitters are driven to emit pulses in the 17 - 40 kHz band. A central processing unit, knowing the positions of the transmitters and the time of flight of the ultrasound signals to the microphones, computes the positions of the microphones, identifying and discarding false signals due to echoes and environmental noise. Once the microphones are localized, the position of the object is computed by finding the placement of the geometrically reconstructed object that best fits the calculated microphone positions. The localization system operates on successive frames: the data are processed in parallel for all the microphones on the remote objects, yielding a high repetition rate of localization frames. In the proposed prototype, all computation, including signal filtering, time-of-flight detection, localization, and results display, runs about 25 times per second on a notebook PC.
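As a concrete illustration of the localization step described above, the sketch below shows one standard way to recover a microphone position from four times of flight measured against coplanar transmitters: linearized trilateration in the transmitter plane, followed by recovery of the out-of-plane coordinate. This is a minimal sketch under assumed values; the transmitter layout, the 343 m/s speed of sound, and the locate_microphone helper are illustrative assumptions and not taken from the paper, which does not publish its implementation.

```python
import numpy as np

SPEED_OF_SOUND = 343.0  # m/s at roughly 20 degC (assumed; temperature changes this)

# Hypothetical layout: four coplanar ultrasound transmitters at z = 0 (meters).
TX = np.array([
    [0.0, 0.0, 0.0],
    [0.5, 0.0, 0.0],
    [0.0, 0.5, 0.0],
    [0.5, 0.5, 0.0],
])

def locate_microphone(tof, tx=TX, c=SPEED_OF_SOUND):
    """Estimate a microphone's 3D position from four times of flight (seconds).

    Subtracting the first range equation from the others removes the quadratic
    term and leaves a linear system; because the transmitters are coplanar
    (z = 0), that system determines only (x, y), and z is recovered afterwards
    from a single range equation, assuming the microphone lies on the
    positive-z side of the transmitter frame.
    """
    d = c * np.asarray(tof)                      # ranges to each transmitter
    A = 2.0 * (tx[1:, :2] - tx[0, :2])           # linear system in (x, y) only
    b = (d[0]**2 - d[1:]**2
         + np.sum(tx[1:, :2]**2, axis=1)
         - np.sum(tx[0, :2]**2))
    xy, *_ = np.linalg.lstsq(A, b, rcond=None)   # least-squares (x, y)
    z2 = d[0]**2 - np.sum((xy - tx[0, :2])**2)   # remaining range budget for z
    z = np.sqrt(max(z2, 0.0))                    # choose the +z half-space
    return np.array([xy[0], xy[1], z])

# Example: a microphone at (0.2, 0.3, 1.0) m produces these times of flight.
true_p = np.array([0.2, 0.3, 1.0])
tof = np.linalg.norm(TX - true_p, axis=1) / SPEED_OF_SOUND
print(locate_microphone(tof))  # approximately [0.2, 0.3, 1.0]
```

In practice the ranges would come from the measured times of flight after the echo and noise rejection described above, and compensating the speed of sound for air temperature would further improve accuracy.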

Share and Cite:

R. Ionescu, R. Carotenuto and F. Urbani, "3D Localization and Tracking of Objects Using Miniature Microphones," Wireless Sensor Network, Vol. 3 No. 5, 2011, pp. 147-157. doi: 10.4236/wsn.2011.35017.

Conflicts of Interest

The authors declare no conflicts of interest.


Copyright © 2024 by authors and Scientific Research Publishing Inc.


This work and the related PDF file are licensed under a Creative Commons Attribution 4.0 International License.