An autonomous mobile robot project that combines ROS, Arduino-based motor control, and OpenCV for sensor-driven navigation and interactive human-robot behavior.
This project was developed as an integrated robotics system with two main capabilities:
- Autonomous navigation using light and IR beacon signals
- Interactive photo capture using hand, face, and smile detection
The system uses a two-layer architecture:
- Arduino handles low-level motor control, encoder feedback, PID speed control, and sensor publishing.
- ROS handles high-level navigation logic, behavior switching, computer vision, and photo capture.
- Differential-drive mobile robot control
- PID-based wheel speed regulation
- Encoder-based closed-loop motion control
- Light sensor and IR receiver integration
- Bumper-based obstacle response
- ROS state-machine navigation
- Hand-triggered interaction mode
- Face and smile detection using OpenCV
- Automatic photo capture and local image saving
+-------------------------------+
| High-Level Layer |
| ROS + OpenCV |
|-------------------------------|
| light_receive_data.cpp |
| - Navigation state machine |
| - Beacon search / approach |
| |
| smile_photo_node.cpp |
| - Hand detection |
| - Face / smile detection |
| - Auto photo capture |
+---------------+---------------+
|
| ROS topics / serial communication
v
+-------------------------------+
| Low-Level Layer |
| Arduino |
|-------------------------------|
| light_receive_data.ino |
| - Light / IR sensing |
| - PID motor control |
| |
| smile_photo_node.ino |
| - cmd_vel motor execution |
| - Encoder-based PID control |
+---------------+---------------+
|
v
+-------------------------------+
| Hardware |
|-------------------------------|
| L298N motor driver |
| DC motors + encoders |
| Light sensor |
| IR receiver |
| Bumper switches |
| USB camera |
+-------------------------------+
.
├── light_receive_data.cpp # ROS node for light / IR beacon navigation
├── smile_photo_node.cpp # ROS node for hand / face / smile interaction
├── light_receive_data.ino # Arduino firmware for navigation mode
└── smile_photo_node.ino # Arduino firmware for motion control in photo mode
ROS navigation node (`light_receive_data.cpp`) for autonomous movement.
Main responsibilities:
- subscribes to `light_data` and `ir_ratio`
- publishes `cmd_vel`
- reads bumper and GPIO input
- controls robot behavior using a state machine
- handles beacon search, approach, fallback, and obstacle recovery
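The behavior switching above can be sketched as a pure transition function. The state and input names here are illustrative placeholders, not the node's actual identifiers:

```cpp
// Hypothetical states mirroring the behaviors listed above.
enum class NavState { WaitStart, Forward, Recover, Search, Approach };

struct SensorInput {
    bool start_pressed;  // GPIO start input asserted
    bool bumper_hit;     // bumper switch closed
    bool beacon_seen;    // IR ratio above a detection threshold
    bool recover_done;   // back-up / turn maneuver finished
};

// One transition step; kept side-effect-free so it is easy to unit-test.
NavState nav_step(NavState s, const SensorInput& in) {
    switch (s) {
        case NavState::WaitStart:
            return in.start_pressed ? NavState::Forward : NavState::WaitStart;
        case NavState::Forward:
            if (in.bumper_hit)  return NavState::Recover;
            if (in.beacon_seen) return NavState::Approach;
            return NavState::Forward;
        case NavState::Recover:
            return in.recover_done ? NavState::Search : NavState::Recover;
        case NavState::Search:
            if (in.bumper_hit)  return NavState::Recover;
            return in.beacon_seen ? NavState::Approach : NavState::Search;
        case NavState::Approach:
            if (in.bumper_hit)  return NavState::Recover;
            // fallback: lost the beacon, go back to searching
            return in.beacon_seen ? NavState::Approach : NavState::Search;
    }
    return s;
}
```

Keeping the transition logic in one pure function like this makes the fallback and obstacle-recovery paths testable without hardware.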
Arduino firmware (`light_receive_data.ino`) for light / IR navigation mode.
Main responsibilities:
- reads analog light sensor data
- measures IR low-ratio signal
- publishes sensor data to ROS
- receives `/cmd_vel`
- controls wheel motors with PID and encoder feedback
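The wheel-speed loop can be sketched as a minimal discrete PID of the kind the PID_v1 library implements on the Arduino. The gains, units, and sample time below are illustrative, not the project's actual tuning:

```cpp
// Minimal discrete PID for one wheel. Setpoint and measurement are in
// encoder ticks per sample; the return value is a PWM-style drive output.
struct Pid {
    double kp, ki, kd;
    double integral, prev_err;

    Pid(double p, double i, double d)
        : kp(p), ki(i), kd(d), integral(0.0), prev_err(0.0) {}

    // dt is the loop period in seconds (e.g. 0.02 for a 50 Hz loop).
    double step(double setpoint, double measured, double dt) {
        double err = setpoint - measured;
        integral += err * dt;                    // accumulate integral term
        double deriv = (err - prev_err) / dt;    // finite-difference derivative
        prev_err = err;
        return kp * err + ki * integral + kd * deriv;
    }
};
```

A real firmware loop would also clamp the output to the PWM range and limit the integral term to avoid windup.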
ROS vision node (`smile_photo_node.cpp`) for interactive photo capture.
Main responsibilities:
- captures live video from camera
- detects hand gesture as an interaction trigger
- detects face and smile using OpenCV Haar cascades
- adjusts robot position for better framing
- saves captured photos automatically
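The position-adjustment step can be illustrated as a proportional turn toward the detected face's horizontal offset from the image center. The gain and deadband below are assumed values, not the node's actual parameters:

```cpp
// Angular command (z-axis of a geometry_msgs/Twist) derived from where the
// detected face sits horizontally in the frame.
struct TurnCmd { double angular_z; };

// face_cx: x coordinate of the face bounding-box center, in pixels.
TurnCmd framing_correction(int face_cx, int img_width,
                           double gain = 0.002, int deadband_px = 30) {
    int err = img_width / 2 - face_cx;   // positive: face is left of center
    if (err > -deadband_px && err < deadband_px)
        return {0.0};                    // well framed: hold position
    return {gain * err};                 // proportional turn toward the face
}
```

The deadband keeps the robot from oscillating when the face is already close enough to center.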
Arduino firmware (`smile_photo_node.ino`) for photo interaction mode.
Main responsibilities:
- receives `/cmd_vel` from ROS
- controls differential-drive motion
- uses encoder-based PID speed control
- stops safely if command updates are lost
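The motion side can be sketched as standard differential-drive kinematics plus a command watchdog for the safety stop. The wheel separation and timeout below are assumptions, not the robot's measured values:

```cpp
// Left/right wheel speed setpoints in m/s.
struct WheelSpeeds { double left, right; };

// Map an incoming Twist (linear.x, angular.z) to wheel speeds using
// standard differential-drive kinematics.
WheelSpeeds twist_to_wheels(double linear_x, double angular_z,
                            double wheel_sep = 0.20) {
    return { linear_x - angular_z * wheel_sep / 2.0,
             linear_x + angular_z * wheel_sep / 2.0 };
}

// Safety stop: command zero speed if no cmd_vel arrived within the timeout,
// so the robot halts when the serial link or the ROS node drops out.
WheelSpeeds apply_watchdog(WheelSpeeds w, unsigned long now_ms,
                           unsigned long last_cmd_ms,
                           unsigned long timeout_ms = 500) {
    if (now_ms - last_cmd_ms > timeout_ms) return {0.0, 0.0};
    return w;
}
```

On the Arduino, `now_ms` would come from `millis()` and `last_cmd_ms` would be stamped in the `/cmd_vel` callback.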
- Raspberry Pi or Linux host running ROS
- Arduino Uno
- L298N motor driver
- DC motors with encoders
- Differential-drive chassis
- Analog light sensor
- IR receiver module
- Bumper switches
- USB camera
- ROS 1
- C++
- Arduino
- OpenCV
- rosserial
- PID_v1 library
- wiringPi
In this mode, the robot moves autonomously while monitoring:
- light intensity
- IR beacon ratio
- bumper input
The robot uses a state machine to:
- wait for start
- move forward
- recover from collisions
- search for a beacon
- approach the beacon when detected
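The search/approach switch can be illustrated with a minimal command selector: rotate in place until the IR ratio indicates the beacon is ahead, then drive toward it. The threshold and speeds are assumed values, not the project's tuning:

```cpp
// Velocity command in Twist terms: forward speed and turn rate.
struct Cmd { double linear_x, angular_z; };

// ir_ratio: normalized IR beacon signal, assumed to rise as the robot
// faces the beacon. detect_threshold is an illustrative cutoff.
Cmd search_or_approach(double ir_ratio, double detect_threshold = 0.5) {
    if (ir_ratio < detect_threshold)
        return {0.0, 0.5};   // no beacon ahead: spin in place to search
    return {0.15, 0.0};      // beacon detected: drive straight toward it
}
```

The real node layers this behind the state machine, so bumper hits and fallback transitions can interrupt the approach at any time.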
In this mode, the robot behaves like an interactive photo robot.
Workflow:
- patrols by turning and pausing
- waits for a hand trigger
- switches to face and smile detection
- adjusts position if framing is poor
- captures a photo when a smile is detected consistently
- enters cooldown before restarting
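The "detected consistently" step can be sketched as a frame-count debounce followed by a cooldown. The frame counts below are assumed values:

```cpp
// Trigger a photo only after `need` consecutive smile frames, then ignore
// detections for `cooldown_frames` frames before rearming.
struct SmileGate {
    int streak, cooldown, need, cooldown_frames;

    SmileGate(int need_frames = 10, int cooldown_len = 90)
        : streak(0), cooldown(0),
          need(need_frames), cooldown_frames(cooldown_len) {}

    // Call once per video frame; returns true on the frame a photo
    // should be captured.
    bool update(bool smile_this_frame) {
        if (cooldown > 0) { --cooldown; return false; }  // still cooling down
        streak = smile_this_frame ? streak + 1 : 0;      // any miss resets
        if (streak >= need) {
            streak = 0;
            cooldown = cooldown_frames;
            return true;
        }
        return false;
    }
};
```

Requiring a streak filters out the single-frame false positives that Haar cascades are prone to, and the cooldown prevents burst-capturing the same smile.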
- `light_data` (`std_msgs/Int16`): sensor data from Arduino to ROS
- `ir_ratio` (`std_msgs/Float32`): sensor data from Arduino to ROS
- `cmd_vel` (`geometry_msgs/Twist`): motion commands from ROS to Arduino
Before running the project, make sure the following are installed:
- ROS 1 environment
- OpenCV with Haar cascade files
- rosserial
- wiringPi
- Arduino IDE or Arduino CLI
- PID_v1 Arduino library
Typical cascade files used in this project:
- `haarcascade_frontalface_default.xml`
- `haarcascade_smile.xml`
Choose the correct firmware depending on the demo mode:
- `light_receive_data.ino` for navigation mode
- `smile_photo_node.ino` for smile-photo mode
```bash
catkin_make
source devel/setup.bash
```

Navigation mode:

```bash
rosrun your_package light_receive_data
```

Smile photo mode:

```bash
rosrun your_package smile_photo_node
```

Replace `your_package` with your actual ROS package name.
- Autonomous mobile robot demo
- Interactive exhibition robot
- Robotics course project
- Embedded systems + ROS integration project
- Computer-vision-based human-robot interaction prototype
- Replace Haar cascades with deep learning-based detection
- Improve navigation robustness in complex environments
- Add SLAM or localization support
- Add remote monitoring or dashboard interface
- Support voice interaction
- Store metadata together with captured photos
This repository demonstrates the integration of:
- embedded motor control
- sensor feedback
- state-machine-based autonomy
- ROS communication
- vision-based interaction
It is suitable for academic presentation, portfolio use, and further robotics development.
This project is shared for academic and portfolio demonstration purposes.
