TB5-WaLI Blog: Jazzy TB4 on Raspberry Pi 5 - Ubuntu 24.04 #517
Replies: 26 comments 12 replies
-
Full Success getting Jazzy on the Raspberry Pi 5
My Turtlebot4 Jazzy on Raspberry Pi 5 install is here. Bot arrives Thursday, so more testing has to wait. (And creating modified URDF and SDF.)
-
WaLI's "ride" arrived today, but the battery was totally shot - not just shutdown ... cannot wake.
-
Boring Stuff: My "TurtleBot4 History"
This Incarnation: TB5-WaLI Jan 2025 - Create3 plus Raspberry Pi 5, RPLidar C1, Oak-D-Lite
Consolation Period: Implemented TurtleBot3 Cartographer, Navigation, and Gazebo Classic simulation for 24/7 "alive" GoPi5Go-Dave (GoPiGo3) Raspberry Pi 5 robot (in Docker)
Second Incarnation: Create3-WaLI Dec 2023 - Mar 2024 (Returned - Crashed running RTABmap)
First Incarnation: TurtleBot4 Nov 2022 (Returned Immediately - Damage in Shipment)
Consolation Period: Created ROS 2 Humble GoPiGo3 Raspberry Pi 4 image and "Humble-Dave"
Beta Incarnation: Create3 Simulation Beta Oct 2021
-
What is inside the iRobot Create3 battery?
DANGER - DO NOT ATTEMPT THIS - FOR CURIOSITY ONLY
The iRobot Create3 battery is a true engineering marvel. Really! The amount of design thinking that went into the battery really surprised me when I managed to open the "dead" battery without destroying it.
-
TB5-WaLI Is "Alive"!
Well, the "5-WaLI" part is alive; the turtlebot4.service did not start everything successfully. I stopped the "TB" (mainly to turn the republisher off), returned the Create3 to an empty namespace, and started my WaLI nodes (wali_node, say_server, odometer). The problem appears to be that the USB surge power required is greater than the Pi 5 will supply, so I am designing an aux-powered USB hub. WaLI gets off the dock when the battery is 99% charged (unless that happens during "sleep time", 10pm to 8am). TB5-WaLI is drawing 9.8W total: 4.3W for the Create3, and 5.5W for the RPi5 and speaker (LIDAR and Oak-D-Lite powered but not operating). His very first "playtime" lasted 2.6 hours. His first "after playtime" recharge took 2.7 hours, and ATM he is off the dock for his second "playtime":
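The undock decision described above (leave the dock at 99% charge, but never during the 10pm-8am sleep time) reduces to a bit of pure logic. A minimal sketch, with the function names, thresholds, and window all illustrative rather than WaLI's actual wali_node code:

```python
from datetime import time

SLEEP_START = time(22, 0)   # 10 PM local - assumption from this post
SLEEP_END = time(8, 0)      # 8 AM local
UNDOCK_AT = 0.99            # leave the dock at 99% charge

def in_sleep_time(now: time) -> bool:
    """The sleep window wraps midnight: 22:00 -> 08:00."""
    return now >= SLEEP_START or now < SLEEP_END

def should_undock(battery_fraction: float, now: time) -> bool:
    """Undock only when charged enough and not during sleep time."""
    return battery_fraction >= UNDOCK_AT and not in_sleep_time(now)
```

A node would poll the battery topic and the clock, and send the Undock action goal when `should_undock()` first turns true.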
-
TB5-WaLI Survives His First Night As A True TurtleBot4
TB5-WaLI with the turtlebot4 service running in Simple Discovery mode was republishing topics just fine, but actions and services were not being republished to the Create3, and the responses were not successfully being republished back. TB5-WaLI docked successfully when his battery dropped below 15%. When "sleep time" ended at 8:00AM local, WaLI was trickle charging at below 99%, so he did not immediately get off the dock.
-
Runaway Oakd (Camera) Container
Interesting issue with Oak-D-Lite camera operation: a /stop_camera service call when docked will eventually cause the camera driver to stop publishing /oakd/rgb/preview/image_raw topics and the oakd container to run away (100% of one CPU). I submitted an issue to Luxonis - need to compare launching the camera separately from the turtlebot4_bringup launch. Since I am trying to run pure "installed turtlebot4" code (/opt/ros/jazzy/share/turtlebot4_*), I am currently testing with a modified turtlebot4_bringup/config/turtlebot4.yaml that, in my current test, turns off the turtlebot4_node power_saver feature (which prevents stopping the camera when on the dock).
Next I'll figure out how to properly stop the TB4 oakd container to "run the camera driver via a launch file" and then test repeated stop_camera and start_camera service calls. (The camera is drawing 1W from a separate 5v power supply, with very good voltage regulation of +/-10mV.)
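For the repeated stop/start testing, the ROS 2 CLI is enough to exercise the driver by hand. A sketch, assuming the services live under the /oakd namespace and are of type std_srvs/srv/Trigger (the service names are from this post; the type is an assumption - verify it first):

```shell
# Trigger type assumed - confirm with: ros2 service type /oakd/stop_camera
ros2 service call /oakd/stop_camera std_srvs/srv/Trigger "{}"

# watch the oakd container CPU here (top, or docker stats), then restart:
ros2 service call /oakd/start_camera std_srvs/srv/Trigger "{}"

# confirm the image topic is publishing again
ros2 topic hz /oakd/rgb/preview/image_raw
```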
-
Robots make different sounds when they are in pain.
Last night as I was watching the late night weather, WaLI started screaming. He was obviously in pain, so I ran to see if I could help him. The screaming was coming from inside the Raspberry Pi 5 case, so I went to my computer to safely shutdown WaLI's "brain". WaLI spends the night on his dock, so I went to sleep, leaving WaLI's brain repair for the morning before his usual 8AM undocking. Turns out the Raspberry Pi 5 case fan had developed a bad bearing. Luckily I had a Pi5 Active Cooler handy to cool WaLI's pain. WaLI is back to himself, off his dock, quietly observing the world around him.
-
KLUDGE: How to re-modify TurtleBot4 code after update/upgrade
The proper way to customize ROS code is to build overlays and set up the path to find the overlay code rather than the stock code, but there are two issues with this:
So I am kludging my changes onto the updated turtlebot4 packages with three "install____.sh" scripts. Full instructions: NOTE: After running sudo apt update && sudo apt upgrade -y, run: Then wait till the oakd container is up with "camera ready", and then find the PID to kill the camera (this is the workaround till the Luxonis Jazzy depthai_ros_driver will /stop_camera without crashing): (if you need the camera: cmds/launch_camera.sh)
-
TB5-WaLI's Secret
The secret to the success of this WaLI incarnation is the Create3 Republisher node and a ROS 2 Discovery Server running on the Raspberry Pi, which isolate the Create3 from unrelated ROS 2 topics to prevent overloading the Create3 processor. Big shout out to @alsora for the development of the Create3 Republisher - the critical secret to iRobot Create3 platform success!
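A minimal sketch of the Discovery Server half of that setup, assuming the server runs on the Pi at the Fast DDS default port (the address here is a placeholder, and the fastdds CLI flags can vary between Fast DDS versions - the Create3 republisher docs cover the complete setup):

```shell
# On the Raspberry Pi: run a Fast DDS Discovery Server (server id 0, default port 11811)
fastdds discovery --server-id 0 --ip-address 127.0.0.1 --port 11811

# In every shell or service that should discover via the server instead of multicast:
export ROS_DISCOVERY_SERVER="127.0.0.1:11811"
```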
-
Logitech F710 Wireless USB Game Controller for TurtleBot4 TB5-WaLI
Added a mapping for the Logitech F710 buttons to correlate to the stock TB4 game controller: Logitech F710 Game Controller for TB5-WaLI
Added to turtlebot4_robot/turtlebot4_bringup/config/turtlebot4.yaml: Using the Undock, Dock buttons on the F710 Game Controller:
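I can't reproduce the exact fragment here, so this is a hypothetical sketch in the spirit of turtlebot4_bringup/config/turtlebot4.yaml: controller button names on the left, stock TB4 functions on the right. The key names and function strings are illustrative - check the stock turtlebot4.yaml for the real schema:

```yaml
# Hypothetical sketch only - the real keys/values live in the stock turtlebot4.yaml
turtlebot4_node:
  ros__parameters:
    controller:
      a: ["Select"]
      b: ["Back"]
      x: ["Dock"]      # map X to the Dock action
      y: ["Undock"]    # map Y to the Undock action
      l2: ["EStop"]
```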
-
Turtlebot3 vs Turtlebot4 Mapping - Interesting result
With another of my ROS 2 robots, I was able to create a great map of my house using turtlebot3_cartographer, while my efforts to create a map with slam_toolbox were stymied. With TB5-WaLI I ran the turtlebot4_navigation async SLAM and created a map, and for comparison created a "turtlebot4_cartographer" and built a map with it. With turtlebot4_cartographer:
The map with my "turtlebot4_cartographer" (Jazzy turtlebot3_cartographer with the name changed to tb4):
-
Safe to charge to 100%
It is suggested (for Electric Vehicles) that Li-Ion batteries be charged to no more than 80% and discharged to no lower than 20%, to maximize battery life (about 3000-4000 cycles over 10 years for EVs). Currently I have my "TB5-WaLI" (TurtleBot4 with a Raspberry Pi 5 - WallFollower Looking For Intelligence) remain charging on the dock till 99.5% charged and return to the dock at 20.5% charge. (Docked charge time is currently extended because I disabled power_saver due to the Luxonis DepthAI ROS driver issue that crashes the camera container.) ATM, charge time is about 3 hours, and "playtime" is 2 hours. I asked iRobot support: "Is this going to shorten the battery life? Is it realistic to expect 3000-4000 cycles (5 per day), making the battery last 2-3 years, if I dock at 20% and undock at 100% charge?" Received this response from iRobot support: Current stats for TB5-WaLI with "Charge to 100%, Play to 20%":
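The back-of-envelope behind the 2-3 year question, under the stated assumptions (3000-4000 cycle life, about 5 dock/undock cycles per day):

```python
def battery_life_years(cycle_life: int, cycles_per_day: float) -> float:
    """Total cycle budget divided by daily cycles, expressed in years."""
    return cycle_life / cycles_per_day / 365.0

# 3000-4000 cycles at ~5 cycles per day works out to roughly 1.6-2.2 years,
# the low end of the "2-3 years" hoped for in the question.
low = battery_life_years(3000, 5)
high = battery_life_years(4000, 5)
```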
-
Exploring Turtlebot4 Navigation
After working around an issue with several components of the localization and navigation stack: When returning to the docking area to face away from the dock, navigation likes to arrive close to the goal and then back up to be within the tolerance settings. The Create3 comes with default protection against backing up very far, since the cliff sensors are only on the front. Since my home does not have any "cliffs", and since navigation is pretty good at slowing near obstacles (so the bumpers have time to be effective), I disabled the safety features completely:
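Disabling those protections is a Create3 setting rather than a Nav2 one: the Create3's motion_control node exposes a documented safety_override parameter (values none, backup_only, full). With the Create3 in an empty namespace, as in my setup:

```shell
# "full" disables cliff-sensor safety and the backup distance limit - use with care
ros2 param set /motion_control safety_override full

# verify the change took effect
ros2 param get /motion_control safety_override
```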
-
Bed Time For WaLI
TB5-WaLI goes to bed around 10PM. Guess he likes a lot of "nite lights."
-
Second Time I've Caught TB5-WaLI Hugging Kilted-Dave
Not sure why WaLI keeps trying to hug robot Kilted-Dave. Second time I've had to rescue him so his eventual docking maneuver can succeed.
-
TB5-WaLI Wishing Everyone Happy New Year 2026
TurtleBot4 (with Raspberry Pi 5) WaLI is wishing everyone, robots and owners, "Happy New Year 2026" tonight.
Package Versions
turtlebot4/turtlebot4 dc744fe
turtlebot4-setup:
About: Ubuntu 24.04, ROS 2 Jazzy, Raspberry Pi 5 8GB
NTP Setup:
-
"TB5-WaLI, What Do You See?" (When you look in the mirror)
With all the talk of the progress of "local LLM" models for both text and vision understanding, I have been trying out a few different models for my Raspberry Pi5 TurtleBot4 clone TB5-WaLI. At first I was just testing whether textual large language models might enable TB5-WaLI to be more conversational.
tinyllama textual "large language model" (600MB, 1.1b parameters)
Hi there, I understand how important your shower is. Don’t hesitate to call me if you need any assistance or have any questions about AI-related topics or concerns. Enjoy the shower! ⌚ 🤖
p.s. The emojis are from the tinyllama model - not me.
Gemma (1.7GB, 2b parameters)
Strange response that I got from WaLI running the twice-larger Gemma model: I had better not let WaLI near Amazon; he’s going to start ordering “cleaning supplies”. Next, I started wondering what else WaLI could do with LLMs.
Vision Language Assistants for "WaLI Looking In The Mirror"
More recently, local vision language assistant models have been released, so I decided to try "What Do You See?"
Local MoonDream Vision Language Assistant 0.5b
Local MoonDream vLM 2b model
LOCAL MODELS SUMMARY
It should be noted that running any of these models caused CPU utilization to rise to 120% (all four cores screaming for cold air) and
REMOTE "Local" Vision Language Assistant
I have always resisted making my robots dependent on any external processing: no cloud speech reco, no realistic-sounding cloud text-to-speech, and no cloud image processing for object recognition. The recent releases of local vision language assistant models, too large to fit on my robots' Raspberry Pi processors but small enough to fit in the 32GB of my MacMini, had me wondering what TB5-WaLI could do with an off-board "local" vision language server. I let Google Gemini write the code to send images from the robot to a "local remote" Ollama server with the prompt: "What do you see?
[encoded image bytes]”
Here I sent an image of WaLI's "room mate" Kilted-Dave to the "local remote" Qwen3-vl:8b model on a 6-core 3.7GHz Intel i7 with 32GB memory (not GPU enabled).
Key Components:
Context & Setting:
Overall Impression: This robot blends functional robotics (wheels, electronics) with whimsical customization.
"LOCAL REMOTE" VISION LANGUAGE MODEL SUMMARY
There you have it - TB5-WaLI looked at "Kilted-Dave" and responded:
using 1MB of RAM and less than 1% of the processor (and waiting 4 minutes for the answer to return from the remote Mac Ollama server). Pretty impressive result. While the list of "trained objects" for these models is quite small, the models are able to recognize many, many more "object characteristics" - color, shape, size, textures, materials, position, relationships, lighting, surfaces, corners, transitions, and more - with proper inquisitive prompting.
-
Years Asking Humans - Google Gemini Solved ROS Exception
For years and years, my ROS robots have displayed an innocuous exception when I terminated a node with ctrl-c or killed all my robot processes. For years I have searched Google for a solution. I posted about the issue on robotics.stackexchange and discourse.ros.org and many other places. Last year I logged issues against the ROS tutorials in the May 2025 ROS 2 Kilted Test Party about this. No one could tell me how to fix it - ever.
More than a year ago, a buddy joined the "vibe coding" rage and reported using a large language model to write useful code. I stayed away from trying it, mostly due to distraction by other hobbies, but last week I gave in and successfully used Google Gemini to write a new ROS 2 node. It was a very basic node that would have taken me about 30 minutes to refresh my brain from existing nodes and then create. Telling Gemini what I wanted produced suggested code that did not work at first, but I knew how to fix the problem and added the required lines to make the Python3 node run. BUT! The Gemini-created node had the same ctrl-c exception issue that I have put up with for years. I decided to ask Gemini for a solution to the error traceback. The first modification to the code suggested by Gemini did not fix the problem, so I provided the traceback again to Gemini and asked for a new suggestion. At that point I had spent more time having Gemini write the code than it would have taken me to produce it, but my code would have had the same ctrl-c exception issue. Voila! The Gemini large language model agent produced a solution to an issue that has vexed me for more years than I can remember.
It may be that only the "if rclpy.ok():" stanza will fix the problem - need to continue testing.
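For reference, the shape of the fix as I understand it - a minimal sketch, not the exact Gemini-generated node: catch the KeyboardInterrupt that ctrl-c raises inside spin, and guard shutdown with rclpy.ok() so a context already torn down by the signal handler is not shut down a second time (the node name here is arbitrary):

```python
import rclpy
from rclpy.node import Node

def main():
    rclpy.init()
    node = Node("demo_node")            # placeholder node name
    try:
        rclpy.spin(node)                # ctrl-c raises KeyboardInterrupt here
    except KeyboardInterrupt:
        pass                            # swallow it instead of printing a traceback
    finally:
        node.destroy_node()
        if rclpy.ok():                  # the key stanza: only shut down a live context
            rclpy.shutdown()

if __name__ == "__main__":
    main()
```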
-
Google Thought It Knew The Secret For TurtleBot4 Donut Delivery
Back when I believed TurtleBot4 was going to set the ROS world alight with extraordinary features and capabilities, Clearpath produced a demo of TurtleBot4 navigation delivering donuts. The TurtleBot4 comes with LIDAR for 2D sensing of the environment, and a neural-net-processor-enhanced stereo-depth camera for 3D sensing. When my TurtleBot4 was delivered, I was quite disappointed that all the example TurtleBot4 software only used the RGB camera as a remote web camera, and only the LIDAR was used for turtlebot4_navigation. (I need to add here that I ended up with a "home-built TurtleBot4 lite" that runs the official TurtleBot4 lite software stack.) As I proceeded to get familiar with all the TurtleBot4 software provided, I first used turtlebot4_navigation (running on a Raspberry Pi4) to create a good map of my home, and rViz2 for remote visualization. Life was good, and I was optimistic I would soon be delivering a morning donut to my wife in the other room. When the Raspberry Pi5 came out, I simply had to have one to upgrade my TurtleBot4 lite (clone) robot - thus TB5-WaLI was "born". I began to familiarize myself with configuring turtlebot4_navigation to navigate the map my TurtleBot4 WaLI had created. TB5-WaLI could manage simple navigation goals, but he liked to plan too close to corners and developed an overpowering fascination with my bar stools. I couldn't get him to ignore the bar stools on his way to the front door. Additionally, his dock was in our formal dining room close to a pedestal table with six normal "four-legged" chairs. Also, there was a weird triangular wall corner next to WaLI's dock which would distract him, sometimes causing navigation goals to fail soon after starting to move. Oh, and navigation can't begin until WaLI is off the dock, because the default nav2 planner likes to spin first and then drive toward the goal.
It was apparent that I was going to have to tune the pages and pages of nav2 parameters. Try as I might, I could not get any reliability. It was also clear that the navigation nodes, plus the localization amcl node, plus my WaLI nodes were stressing the Raspberry Pi5 to the limit, with one-minute CPU loads of 8 and 9 and CPU% reaching 80-90%, causing navigation goal failures. Lately, I started asking Google Gemini for nav2 tuning advice and have finally achieved what appears to be reliable navigation, and I have also been asking Gemini for tuning advice to lower the total CPU load. One of the suggestions reduced the maximum 1-minute total processor load by 36 percentage points (193% to 157%) and brought CPU% down to 60% max while navigating. I asked for more suggestions and set out to test each of them separately to avoid introducing "slop code" into TB5-WaLI. Thinking about the difficulty I have been having, and recalling that the Donut Delivery navigation was using the stock TurtleBot4 Raspberry Pi 4, while TB5-WaLI has a Pi5 with three to four times the processing power of the stock TurtleBot4 Pi4, I asked Gemini. And oh-boy ... Gemini (thinks it) knows a secret that does not come standard in the TurtleBot4 software stack. The previous days, Gemini had been helping me fix a weird "out of map bounds" (in the Z dimension) error when WaLI navigated across tile-carpet transitions. It was this that helped me understand that the default TurtleBot4 navigation parameters configure 3D navigation, but only use the 2D LIDAR - stereo-depth point clouds from the camera are not used. And so, with the secrets Gemini thinks it has "revealed", I configured the local costmap for a 2D obstacle layer rather than using the voxel_layer.
WaLI is not delivering any donuts to Google! Changing to a 2D-layer local costmap did not appear to change the CPU usage, and it did not keep the local costmap aligned with the map as well; in narrow areas with many obstacles, the navigation frequently failed to reach the goal. Back to the basic changes:
Note: Load is a measure of system demand; CPU% is a measure of actual RPi 5 processing usage. Having demand higher than the processor can supply should be minimized to prevent starving critical processes. Where I'm at today (max total CPU%):
I have a much better handle on measuring total CPU usage, and it appears that navigation is reliable as long as the CPU usage stays below 70%. If I so much as issue a ros2 topic echo --once /battery_state while the bot is navigating, the CPU usage goes up to 80% and navigation fails. Terrible to be running at the exact edge of robustness.
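The Load vs CPU% distinction above is easy to compute from /proc: loadavg is demand (runnable tasks), and a load of 1.0 per core is the supply ceiling. A small helper sketch, with the parsing separated out so the arithmetic is testable on any loadavg line:

```python
import os

def parse_loadavg(text: str) -> tuple[float, float, float]:
    """Parse the 1-, 5-, and 15-minute load averages from a /proc/loadavg line."""
    one, five, fifteen = text.split()[:3]
    return float(one), float(five), float(fifteen)

def load_percent(load1: float, ncpus: int) -> float:
    """Normalize 1-minute load to percent of total CPU supply.
    Above 100% means demand exceeds what the cores can deliver."""
    return 100.0 * load1 / ncpus

def current_load_percent() -> float:
    """Demand on this machine right now, as percent of its cores."""
    with open("/proc/loadavg") as f:
        load1, _, _ = parse_loadavg(f.read())
    return load_percent(load1, os.cpu_count())
```

On the Pi5's four cores, the loads of 8 and 9 mentioned above are 200-225% demand: twice what the processor can supply, which is exactly when critical processes start starving.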
-
TurtleBot4 Navigation Tuning Results
I have spent the month of March tuning and testing (and chasing rabbits down deeper and deeper holes dug by Google Gemini) to optimize Nav2 navigation robustness and reliability for TB5-WaLI (TurtleBot4 clone on an 8GB Raspberry Pi5, using odometry and LIDAR). This is TB5-WaLI (TurtleBot4 on Raspberry Pi 5 - Wallfollower Looking for Intelligence). WaLI runs Jazzy over Ubuntu 24.04 Server with FastDDS, a Discovery Server, the wali_node that keeps him alive 24/7/365, turtlebot4_navigation, and a few convenience nodes. At the moment:
********** TB5-WaLI MONITOR NODE ******************************
Saturday 03/28/26 15:48:35 up 3 hours, 48 minutes
Total CPU Usage: 38.5%
Voltage: 14.36 Current: -0.96 Watts: -13.85
*** TB5-WaLI TOTAL LIFE STATISTICS ***
Total Awake: 10434.9 hrs
Last Docking: 2026-03-28 09:44 |wali_node.py| ** WaLI dock goal result - Docking: success at battery 20% after 1.7 hrs playtime **
Environment
This is the environment in which I have 6 designated goal positions, used to test each parameter change.
Results:
And do not trust Google Gemini - especially do not ask Gemini to fix an error that resulted from the prior Gemini suggestion! You will chase rabbits down holes, each one deeper than the prior. Do not believe Gemini if it says this will reduce CPU load. Do not believe Gemini if it says it read a file you gave it a URL to review. (It will confidently hallucinate about what it read at the URL it didn’t visit, and apologize with “You are right. I am wrong. This is embarrassing. I will try to do better.” - but it won’t.)
-
Adding Costmap_Filter Keep Out Zones
Need: Nav2 will plan the shortest path from current position to goal.
Solution:
REF: https://docs.nav2.org/tutorials/docs/navigation2_with_keepout_filter.html
(I should have followed this more closely, instead of trying it on my own with Claude.)
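For reference, the shape of the configuration that tutorial builds: a CostmapFilterInfo server, a map server publishing the filter mask, and a keepout_filter plugin added to the costmap. A sketch following the Nav2 tutorial, with topic names and the mask file path as placeholders:

```yaml
# Sketch after the Nav2 keepout tutorial - mask path and topic names are placeholders
costmap_filter_info_server:
  ros__parameters:
    type: 0                                 # 0 = keepout filter
    filter_info_topic: "/costmap_filter_info"
    mask_topic: "/keepout_filter_mask"
    base: 0.0
    multiplier: 1.0

filter_mask_server:
  ros__parameters:
    yaml_filename: "keepout_mask.yaml"      # edited copy of the nav map
    topic_name: "/keepout_filter_mask"

# added to the global (and/or local) costmap ros__parameters:
#   filters: ["keepout_filter"]
#   keepout_filter:
#     plugin: "nav2_costmap_2d::KeepoutFilter"
#     filter_info_topic: "/costmap_filter_info"
```

The mask is just a copy of the navigation map with the keep-out cells painted occupied, which is why following the tutorial's map-editing steps closely matters.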
-
Blog for TB5-WaLI (TurtleBot5 Wallfollower Looking for Intelligence)
This Incarnation: TB5-WaLI Jan 2025 - Create3 plus Raspberry Pi 5, RPLidar C1, Oak-D-Lite running Jazzy TB4 service
Second Incarnation: Create3-WaLI Dec 2023 - Mar 2024 (Returned Crashed running RTABmap)
First Incarnation: TurtleBot4 Nov 2022 (Returned Immediately - Damage in Shipment)
Beta Incarnation: Create3 Simulation Beta Oct 2021 - (Built Raspberry Pi 4 ROS 2 Galactic to test Create3 Simulation Beta)
Jump to latest post (2026-04-11): "Adding Costmap_Filter Keepout Zones"