Vol. 14 No. 1 (2018)

Published: June 30, 2018

Pages: 41-50

Original Article

Control of Robot Directions Based on Online Hand Gestures

Abstract

The evolution of wireless communication technology has increased human-machine interaction capabilities, especially in the control of robotic systems. This paper introduces an effective wireless system for controlling the directions of a wheeled robot based on online hand gestures. Hand gesture images are captured and processed so that they can be recognized and classified by a neural network (NN). The NN is trained on extracted features to distinguish five different gestures and accordingly produces five different signals, which are transmitted to control the directions of the robot. The main contribution of this paper is that the hand gesture recognition technique requires only two features; these features can be extracted in a very short time using a simple methodology, which makes the proposed technique well suited to online interaction. In this methodology, the preprocessed image is partitioned column-wise into two half segments, and one feature is extracted from each half: the ratio of white to black pixels in the segment histogram. The NN showed very high accuracy in recognizing all of the proposed gesture classes. Its output signals are transmitted wirelessly over Bluetooth to the robot microcontroller, which then guides the robot in the desired direction. The overall system showed high performance in controlling the robot's movement directions.
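
As a rough illustration of the described pipeline, the sketch below computes the two features (the white-to-black pixel ratio of each column-wise half of a preprocessed binary image) and maps a classified gesture to a direction command. It is a minimal Python sketch under stated assumptions, not the authors' implementation: the function names, the gesture-to-command byte mapping, and the fallback for an all-white half are illustrative assumptions.

    import numpy as np

    # Hypothetical mapping from the five NN gesture classes to one-byte direction
    # commands for the robot's microcontroller (labels assumed for illustration).
    DIRECTION_COMMANDS = {0: b'F',   # forward
                          1: b'B',   # backward
                          2: b'L',   # turn left
                          3: b'R',   # turn right
                          4: b'S'}   # stop

    def extract_features(binary_image):
        """Split the preprocessed binary image column-wise into two halves and
        return the white-to-black pixel ratio of each half (the two NN inputs)."""
        _, w = binary_image.shape
        halves = (binary_image[:, :w // 2], binary_image[:, w // 2:])
        features = []
        for half in halves:
            white = np.count_nonzero(half)   # hand (foreground) pixels
            black = half.size - white        # background pixels
            # Fallback for an all-white half is an assumption, not from the paper.
            features.append(white / black if black else float(half.size))
        return np.array(features, dtype=float)

    def gesture_to_command(class_index):
        """Map the NN's winning class to the byte sent over the Bluetooth
        serial link (e.g. an HC-05 module paired with the robot)."""
        return DIRECTION_COMMANDS[class_index]

    if __name__ == "__main__":
        # Toy 4x6 binary frame standing in for a captured, preprocessed gesture image.
        frame = np.array([[1, 1, 0, 0, 0, 0],
                          [1, 1, 1, 0, 0, 0],
                          [1, 1, 0, 0, 1, 0],
                          [1, 0, 0, 0, 0, 0]])
        print(extract_features(frame))   # left/right ratios: 2.0 and ~0.091
        # scores = trained_nn.predict(extract_features(frame))       # NN omitted here
        # bluetooth_port.write(gesture_to_command(int(np.argmax(scores))))

Since only two scalar features are computed per frame, feature extraction adds negligible latency beyond preprocessing, which is consistent with the paper's claim that the technique is suitable for online interaction.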

References

  1. Chun Z., and Weihua S., “Online hand gesture recognition using neural network based segmentation,” in IEEE/RSJ International Conference on Intelligent Robots and Systems (IROS), St. Louis, USA, pp. 2415-2420, 2009.
  2. Raheja J.L., Shyam R., Kumar U., and Prasad P. B., “Real-time robotic hand control using hand gestures,” in Second International Conference on Machine Learning and Computing (ICMLC), Bangalore, India, pp. 12-16, 2010.
  3. Faudzi A., Ali M., Azman M., and Ismail Z., “Real-time hand gestures system for mobile robots control,” in International Symposium on Robotics and Intelligent Sensors (IRIS), Procedia Engineering, vol. 41, Kuching, Sarawak, Malaysia, pp. 798-804, 2012.
  4. Chaudhary A., and Raheja J. L., “Bent fingers’ angle calculation using supervised ANN to control electro-mechanical robotic hand,” Comput Electr Eng, vol. 39, no. 2, pp. 560-570, 2013.
  5. Dominio F., Donadeo M., Marin G., Zanuttigh P., and Cortelazzo G. M., “Hand gesture recognition with depth data,” in Proceedings of the 4th ACM/IEEE Workshop on Analysis and Retrieval of Tracked Events and Motion in Imagery Stream (ARTEMIS ’13), Barcelona, Spain, pp. 9-16, 2013.
  6. Ren Z., Meng J., and Zhang Z., “Robust part-based hand gesture recognition using kinect sensor,” IEEE T MULTIMEDIA, vol. 15, no. 5, pp. 1110-1120, 2013.
  7. Yao Y., and Fu Y., “Contour model based hand gesture recognition using kinect sensor,” IEEE T CIRCUITS SYST VIDEO TECHNOL, pp. 1935-1944, 2014.
  8. Jiang Y., Tao J., Ye W., Wang W., and Ye Z., “An isolated sign language recognition system using RGB-D sensor with sparse coding,” in IEEE 17th International Conference on Computational Science and Engineering (CSE), Chengdu, China, pp. 21-26, 2014.
  9. Murugeswari M., and Veluchamy S., “Hand … application,” in International Conference on Advanced Communication Control and Computing Technologies (ICACCCT), Ramanathapuram, India, pp. 1220-1225, 2014.
  10. Ohn-Bar E., and Trivedi M. M., “Hand gesture recognition in real time for automotive interfaces: A multimodal vision-based approach and evaluations,” IEEE T INTELL TRANSP, vol. 15, no. 6, pp. 2368-2377, 2014.
  11. Molchanov P., Gupta S., Kim K., and Kautz J., “Hand gesture recognition with 3D convolutional neural networks,” in IEEE Conference on Computer Vision and Pattern Recognition Workshops (CVPRW), Boston, Massachusetts, USA, pp. 1-7, 2015.
  12. Hamid A. J., and Herman K. O., “Human computer interface using hand recognition based on neural network,” in IEEE 5th National Symposium on Information Technology: Towards New Smart World (NSITNSW), Riyadh, Saudi Arabia, pp. 1-6, 2015.
  13. Maleki B., and Ebrahimnezhad H., “Intelligent visual mouse system based on hand pose trajectory recognition in video sequences,” Multimedia Syst, vol. 21, no. 6, pp. 581-601, 2015.
  14. Devine S., Rafferty K., and Ferguson S., “Real time robotic arm control using hand …,” in UKACC 11th International Conference on Control (CONTROL), Belfast, United Kingdom, pp. 719-723, 2016.
  15. Lamb K., and Madhe S., “Hand gesture recognition based bed position control for disabled patients,” in IEEE Conference on Advances in Signal Processing (CASP), Pune, India, 2016.
  16. Liu Z., Zhang C., and Tian Y., “3D-based deep convolutional neural network for action recognition with depth sequences,” Image Vision Comput, vol. 55, part 2, pp. 93-100, 2016.
  17. Rios-Soria D. J., Schaeffer S. E., and Garza S. E., “Hand gesture recognition using computer-vision techniques,” in WSCG 21st International Conference on Computer Graphics, Visualization and Computer Vision, Czech Republic, pp. 1-8, 2013.
  18. Bluetooth SIG. Bluetooth specification Version 2.0 + EDR. Bluetooth®, 2004.
  19. Electronica 60 Norte. Datasheet Bluetooth to serial port module HC05. Yucatan, Mexico, 2017.
  20. Sparkfun Electronics. Rover 5 datasheet. Colorado, USA.
  21. Jameco Electronics. Motor controller, 4-channel, 4.5 A, 4.5-12 V, for Rover 5 chassis.