Projects > ELECTRONICS > 2019 > IEEE >
This paper proposes a secondary, reactive collision avoidance system for micro-class robots based on a novel approach, Furcated Luminance-Difference Processing (FLDP), inspired by the Lobula Giant Movement Detector, a wide-field visual neuron located in the lobula layer of the locust nervous system. The paper addresses several major collision avoidance challenges: obstacle proximity and direction estimation, and operation in GPS-denied environments with irregular lighting. The approach has also proven effective in detecting edges independently of background color, size, and contour. FLDP executes a series of image enhancement and edge detection algorithms to estimate the collision threat level, which in turn determines whether the robot’s field of view must be dissected; each section’s response is then compared against the others to generate a simple collision-free maneuver. Finally, the computational load and performance of the model are assessed against an eclectic set of offline and real-time, real-world collision scenarios. The results validate the model’s claimed ability to avoid obstacles from more than 670 mm before collision while moving at 1.2 m/s, with a 90% avoidance success rate, processing at 120 Hz on a simple single-core microcontroller, sufficient to conclude that the system is feasible for real-time, real-world applications that require a fail-safe collision avoidance system.
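The pipeline described above (enhance the frame, extract edges, estimate a threat level, and, if the threat is high, split the field of view and steer away from the more edge-rich section) can be sketched as follows. This is a minimal illustrative sketch, not the paper's actual FLDP implementation: the contrast stretch, the horizontal-difference edge measure, the two-way split, and the `threshold` value are all assumptions standing in for the unspecified details.

```python
def enhance(frame):
    """Simple contrast stretch as a stand-in for the image enhancement stage.
    `frame` is a grayscale image as a list of rows of 0-255 luminance values."""
    lo = min(min(row) for row in frame)
    hi = max(max(row) for row in frame)
    span = max(hi - lo, 1)
    return [[(p - lo) * 255 // span for p in row] for row in frame]

def edge_energy(frame):
    """Sum of absolute horizontal luminance differences: a crude edge detector
    standing in for the paper's edge detection algorithms."""
    return sum(abs(row[x + 1] - row[x])
               for row in frame for x in range(len(row) - 1))

def furcated_response(frame, threshold=50):
    """If total edge energy exceeds an assumed threat threshold, dissect the
    view into left/right halves and steer away from the more excited half."""
    frame = enhance(frame)
    w = len(frame[0]) // 2
    left = edge_energy([row[:w] for row in frame])
    right = edge_energy([row[w:] for row in frame])
    if left + right < threshold:
        return "straight"          # threat level too low to react
    return "turn_right" if left > right else "turn_left"

# Obstacle edges concentrated in the left half of the view -> steer right.
frame = [[0, 255, 0, 255, 10, 10, 10, 10],
         [255, 0, 255, 0, 10, 10, 10, 10]]
print(furcated_response(frame))  # -> turn_right
```

The two-way (left/right) split is the simplest furcation; the same comparison generalizes to more sections if finer steering resolution is needed.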
DCMD (Descending Contra-Lateral Movement Detector)
The proposed model is implemented on a custom-designed 3-DOF (degree-of-freedom) ground robot. It exhibits the agility needed for successful collision avoidance using a DC motor, a 9 g servo motor, an Arduino Nano development board, a motor shield, and a 2-megapixel CMOS sensor. The Atmel ATmega328, an 8-bit AVR microcontroller with a maximum operating frequency of 20 MHz built into the Arduino board, interfaces the robot with the proposed algorithm developed in MATLAB. Image frames captured by the CMOS sensor are transmitted over a USB cable to the ground control station (Intel Core i5-6500, 8 GB RAM), where the images are processed and motor control commands are generated. These commands are relayed through the microcontroller to the servo and motor shield to control the robot’s locomotion.
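On the ground-station side, each maneuver must be packed into a compact message for the microcontroller to relay to the servo and motor shield. The text does not specify the command format, so the two-byte servo-angle/motor-duty framing below, the maneuver names, and the angle/speed values are all illustrative assumptions:

```python
# Assumed maneuver table: (servo angle in degrees, motor PWM duty 0-255).
MANEUVERS = {
    "straight":   (90, 200),   # servo centered, near full speed
    "turn_left":  (60, 120),   # steer left, slow down for the maneuver
    "turn_right": (120, 120),  # steer right, slow down for the maneuver
}

def encode_command(maneuver):
    """Pack a maneuver into the two bytes sent over the USB/serial link:
    byte 0 = servo angle (0-180), byte 1 = motor PWM duty (0-255)."""
    angle, speed = MANEUVERS[maneuver]
    return bytes([angle, speed])

print(encode_command("turn_left"))  # two bytes: angle 60, duty 120
```

A fixed-length binary frame like this keeps parsing on the 8-bit ATmega328 trivial (two sequential reads, no string handling), which matters at a 120 Hz command rate.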
BLOCK DIAGRAM