Design and Implementation of an Autonomous Vehicle to Collect Tennis Balls Using Artificial Vision

Caren Guerrero, José Luis Tinajero, David Moreno, Edgar Salazar

Abstract


The objective of this work was to design and implement an autonomous vehicle (robot) that collects tennis balls using digital image processing techniques. The robot is built around an Arduino Nano microcontroller; an NRF24L01 radio-frequency module receives data from the control stage and drives the locomotion system, which is made up of motors and an odometry system composed of wheel encoders and an MPU6050 gyroscope. The emitter module consists of an Arduino Uno and an antenna with the same characteristics. The prototype therefore comprises two subsystems: one that collects and processes information, and the ground vehicle itself. A Kinect camera captures images of a defined area, and a visual control algorithm detects the balls by color and shape segmentation, determines their locations in rectangular coordinates, and sends them to the robot through the data transmission system. The Ackermann-configuration mobile robot, equipped with the wireless communication system, receives the coordinates and carries out the corresponding movements, which are controlled by sensors located on the wheels; the robot holds a maximum of four balls. In full-system tests the robot collected balls with an accuracy of 96.9%. The tests were carried out in a real scenario at various times of the day and included several distractors intended to confuse the system.
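As a minimal, hypothetical sketch of the color-and-shape segmentation step described above, the following OpenCV (C++) fragment thresholds the Kinect RGB stream in HSV to isolate the yellow-green of a tennis ball, keeps only roughly circular contours, and reports each candidate's centre in image (rectangular) coordinates. The HSV range, minimum area, and circularity threshold are illustrative assumptions, not values reported in the paper.

// Hypothetical tennis-ball detector: color segmentation (HSV threshold)
// followed by shape segmentation (circularity test). Thresholds are assumed.
#include <opencv2/opencv.hpp>
#include <iostream>
#include <vector>

int main() {
    cv::VideoCapture cap(0);              // Kinect RGB stream exposed as a standard camera
    if (!cap.isOpened()) return 1;

    cv::Mat frame, hsv, mask;
    while (cap.read(frame)) {
        // Color segmentation: keep only yellow-green pixels.
        cv::cvtColor(frame, hsv, cv::COLOR_BGR2HSV);
        cv::inRange(hsv, cv::Scalar(25, 80, 80), cv::Scalar(45, 255, 255), mask);
        cv::morphologyEx(mask, mask, cv::MORPH_OPEN,
                         cv::getStructuringElement(cv::MORPH_ELLIPSE, cv::Size(5, 5)));

        // Shape segmentation: keep only roughly circular blobs.
        std::vector<std::vector<cv::Point>> contours;
        cv::findContours(mask, contours, cv::RETR_EXTERNAL, cv::CHAIN_APPROX_SIMPLE);
        for (const auto& c : contours) {
            double area  = cv::contourArea(c);
            double perim = cv::arcLength(c, true);
            if (area < 100.0 || perim <= 0.0) continue;
            double circularity = 4.0 * CV_PI * area / (perim * perim);  // 1.0 = perfect circle
            if (circularity < 0.7) continue;

            // Report the ball centre in image (rectangular) coordinates.
            cv::Moments m = cv::moments(c);
            std::cout << "ball at (" << m.m10 / m.m00 << ", " << m.m01 / m.m00 << ")\n";
        }
    }
    return 0;
}

The wireless link could be exercised with an equally minimal Arduino Uno transmitter sketch for the NRF24L01, assuming the widely used TMRh20 RF24 library; the CE/CSN pins, pipe address, and payload layout below are assumptions for illustration only.

// Hypothetical emitter sketch (Arduino Uno + NRF24L01, TMRh20 RF24 library).
#include <SPI.h>
#include <RF24.h>

RF24 radio(9, 10);                        // CE, CSN pins (assumed wiring)
const byte pipeAddress[6] = "BALL1";      // arbitrary 5-byte pipe address

struct Coordinate { int16_t x; int16_t y; };  // ball position sent to the robot

void setup() {
  radio.begin();
  radio.openWritingPipe(pipeAddress);
  radio.stopListening();                  // act as transmitter
}

void loop() {
  Coordinate c = {120, 340};              // placeholder for a coordinate from the vision stage
  radio.write(&c, sizeof(c));             // send to the Arduino Nano on the robot
  delay(100);
}

On the robot side, a matching receiver would call radio.openReadingPipe(), radio.startListening(), and radio.read() to recover the coordinate before commanding the Ackermann steering and drive motors.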

Keywords


Digital image processing; mobile robot; color segmentation; Kinect camera; wireless communication system.





DOI: http://dx.doi.org/10.18517/ijaseit.11.4.13666




Published by INSIGHT - Indonesian Society for Knowledge and Human Development