DesignLabUCF/spot_ar_navigation
Evaluating the Effectiveness of Augmented Reality Interfaces for Quadrupedal Robot Motion Control

This repository is the official implementation of Evaluating the Effectiveness of Augmented Reality Interfaces for Quadrupedal Robot Motion Control.

Header image for the project.

This study examines how Augmented Reality (AR) interfaces affect navigation control of the Boston Dynamics Spot quadrupedal robot (UCF's "TapeMeasure"). Testing with 33 non-experts and two experts showed that AR provided smoother robot movement and similar usability and trust compared to tablet controls, despite longer task times. The findings highlight AR's potential to make advanced robotic systems more accessible and trusted by a wider range of users. This repository allows researchers to replicate our study environment and supports other studies that leverage quadrupedal robotic platforms.

John Sermarini, Crystal Maraj, Lori C. Walters, Mustapha Mouloua, and Joseph T. Kider Jr. 2025. Evaluating the Effectiveness of Augmented Reality Interfaces for Quadrupedal Robot Motion Control. In Proceedings of the 2025 ACM Symposium on Spatial User Interaction (SUI '25). Association for Computing Machinery, New York, NY, USA, Article 17, 1–12. https://doi.org/10.1145/3694907.3765931

BibTeX

@inproceedings{10.1145/3694907.3765931,
  author       = {Sermarini, John and Maraj, Crystal and Walters, Lori C. and Mouloua, Mustapha and Kider, Joseph T., Jr.},
  title        = {Evaluating the Effectiveness of Augmented Reality Interfaces for Quadrupedal Robot Motion Control},
  year         = {2025},
  isbn         = {9798400712593},
  publisher    = {Association for Computing Machinery},
  address      = {New York, NY, USA},
  url          = {https://doi.org/10.1145/3694907.3765931},
  doi          = {10.1145/3694907.3765931},
  booktitle    = {Proceedings of the 2025 ACM Symposium on Spatial User Interaction},
  articleno    = {17},
  numpages     = {12},
  keywords     = {Augmented Reality (AR), Robot Navigation, Quadrupedal Robots, Trust, Human-Robot Interaction (HRI)},
  series       = {SUI '25}
}

Spot-AR-main

The Spot-AR-main folder contains the main Unity projects used in this study and for general control of the Spot robot via the Microsoft HoloLens 2. These projects provide the interface to ROS2 and the Boston Dynamics SDK/API, and include all the code necessary for conducting the experiments. The Spot-AR-main README contains more detailed documentation for this project.

Spot Digital Twin

In this mode, a virtual Spot model is augmented directly over the real Spot robot when the HoloLens 2 is connected to Spot’s network Wi-Fi. This creates a synchronized digital twin that mirrors Spot’s movements in real time, allowing users to see both the physical robot and its digital counterpart aligned in the same space. The virtual twin provides enhanced situational awareness and helps visualize movement and orientation during operation.

*Screenshots: digital/physical Spot, network UI, Spot navigation blue boxes, motion view, hover view.*

With the digital twin view, Spot’s motion path can be plotted in Unity as real-time movement data is streamed back to a control computer. As the physical robot moves, the virtual Spot’s joints animate in sync, replicating its exact posture and gait. An orange path traces the movement of the root node in space, providing a continuous visual history of the robot’s trajectory for analysis and review.
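
The orange-path idea above can be sketched as a small trajectory tracer. This is a minimal illustration, not the study's Unity implementation: the class name, the `min_step` down-sampling threshold, and the metre units are all assumptions.

```python
import math

class TrajectoryTracer:
    """Accumulates streamed root-node positions and reports path statistics.

    A sketch of the orange-path tracing described above: each incoming pose
    sample is kept only if it moved at least `min_step` metres from the last
    kept point, so the polyline does not densify while the robot idles.
    """

    def __init__(self, min_step=0.05):
        self.min_step = min_step
        self.points = []          # kept (x, y, z) samples, in order

    def add_sample(self, x, y, z):
        if not self.points:
            self.points.append((x, y, z))
            return True
        if math.dist((x, y, z), self.points[-1]) >= self.min_step:
            self.points.append((x, y, z))
            return True
        return False              # sample dropped: robot barely moved

    def path_length(self):
        """Total length of the traced path, summed over kept segments."""
        return sum(math.dist(a, b)
                   for a, b in zip(self.points, self.points[1:]))
```

Keeping only samples that clear a minimum step keeps the rendered history compact while still preserving the robot's full trajectory for later review.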

*Screenshot: Spot's motion path plotted in Unity.*

Spot Control UI

ROS2 Interface

The ROS2 controls provide a standardized interface for sending commands to Spot and receiving sensor data. Through this interface, the HoloLens or control computer can manage robot movement, joint positions, and operational states, enabling precise and coordinated control during AR-assisted navigation and experiments.

To connect the HoloLens to the ROS2 interface, it must first be connected to the Spot robot's Wi-Fi network. The ROS interface menu displays the IP address and port required for the connection, and its status indicator changes from red to green once the link is established after the user presses the connect button. Four box indicators at the top of the menu show each stage of the connection process, making it easy to identify which specific step is causing networking issues.
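
The four-box indicator logic can be modelled as a simple staged check. This is an illustrative sketch only: the stage names below are hypothetical, not the labels used in the actual Unity menu.

```python
# Hypothetical stage names; the real Unity menu's labels may differ.
STAGES = ("wifi_joined", "ros_bridge_reachable", "handshake_ok",
          "telemetry_streaming")

def indicator_colors(completed):
    """Return the colour of each of the four box indicators.

    `completed` is how many stages have succeeded so far (0..4);
    finished stages show green, the rest stay red, which pinpoints
    the failing step for troubleshooting.
    """
    completed = max(0, min(completed, len(STAGES)))
    return ["green"] * completed + ["red"] * (len(STAGES) - completed)

def overall_status(completed):
    # The menu's main status light only turns green once every stage passes.
    return "green" if completed == len(STAGES) else "red"
```

Staging the check this way is what makes the menu useful for debugging: a red box immediately after a green one names the exact step where the connection failed.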

Spot HoloLens 2 UI Control

This study used a Microsoft HoloLens 2 to control Spot.

When the user raises their left hand, a set of buttons appears, allowing them to start, stop, or change the robot’s motion direction. Additionally, a settings button provides access to more advanced features, such as configuring the ROS2 controller and other system options.

*Screenshot: HoloLens control view.*

Users navigate Spot by using a ray cast and pinch gesture to indicate their desired target location. Once the robot begins moving, a blue arrow appears over Spot to show its current direction, while a red dot on the floor marks the target destination. This visual feedback helps the user monitor both the intended path and the robot’s orientation during navigation.
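
The geometry behind this target selection can be sketched as follows. This is not code from the project; the function name and the convention that yaw is the robot's facing angle in radians are assumptions for illustration.

```python
import math

def command_to_target(robot_xy, robot_yaw, target_xy):
    """Convert a pinch-selected floor point into a heading error and distance.

    `robot_yaw` is the robot's facing angle in radians; the returned heading
    error is how far Spot must turn before walking straight toward the
    red-dot target on the floor.
    """
    dx = target_xy[0] - robot_xy[0]
    dy = target_xy[1] - robot_xy[1]
    distance = math.hypot(dx, dy)
    heading_error = math.atan2(dy, dx) - robot_yaw
    # Wrap the angle so the robot takes the shorter turn direction.
    heading_error = (heading_error + math.pi) % (2 * math.pi) - math.pi
    return heading_error, distance
```

The blue arrow over Spot corresponds to the robot's current heading, and the heading error above is exactly the gap the operator watches close as Spot turns toward the red dot.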

*Screenshots: navigating Spot to a target.*

AR indicators are overlaid onto the physical environment by the HoloLens, providing real-time visual cues such as the ray cast pointer (white dashed line) and the directional target (red circle). This augmentation allows users to see exactly where Spot will move and interact with the environment, enhancing spatial awareness and precision during navigation.

*Screenshots: ray cast pointer and directional target overlays.*

GO and STOP Controls

The Stop-and-Go AR interface uses clear visual and audio cues to indicate Spot's operational state. In GO mode, a blue arrow is shown over Spot and a blue dot appears on the left-hand menu, accompanied by an audio prompt. In STOP mode, a red arrow is shown over Spot and a red dot is displayed on the user's hand, ensuring the human controller can quickly recognize the robot's current state. Additionally, the arrow and a text prompt show which direction Spot is moving, for example *Walking Forward*.

*Screenshots: GO UI indicator and STOP UI indicator.*
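
The Stop-and-Go cues form a small two-state machine, sketched below. The cue values are taken from the description above, but the class and method names are illustrative, not the study's Unity implementation.

```python
GO, STOP = "GO", "STOP"

class StopGoIndicator:
    """Tracks Spot's GO/STOP state and the cues the operator should see."""

    def __init__(self):
        self.state = STOP
        self.direction = None

    def go(self, direction="Forward"):
        self.state, self.direction = GO, direction

    def stop(self):
        self.state, self.direction = STOP, None

    def cues(self):
        """Visual and audio cues for the current state, per the text above."""
        if self.state == GO:
            return {"arrow": "blue", "dot": "left-hand menu",
                    "prompt": f"Walking {self.direction}", "audio": True}
        return {"arrow": "red", "dot": "user's hand",
                "prompt": "Stopped", "audio": True}
```

Starting in STOP is the conservative default: the robot never moves until the operator explicitly issues a GO.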

Spot Tablet UI Control

The Spot tablet UI from Boston Dynamics provides a touchscreen interface for controlling the robot. It includes two main joystick controllers: the left joystick controls Spot’s rotation, while the right joystick manages motion direction.

The tablet also offers more detailed controls, such as walk speed, operational modes, camera views, and additional options in the Boston Dynamics app, but these features were not utilized in the study.

*Screenshot: tablet control UI.*

HoloLens 2 Training

*Screenshot: HoloLens training view.*

The Spot-AR-main folder contains the Unity projects used for the AR control study, including the participant HoloLens 2 training module. In this module, participants see three virtual blue balls in the AR environment. Using a ray cast pointer, they must hover over each ball and perform a pinch gesture to make it disappear. This exercise allows participants to practice the core interaction mechanics (target selection and gesture input) before moving on to the main navigation tasks. The training also provides an opportunity to get comfortable with controlling Spot's movement in AR, ensuring participants are ready for the full study.

*Screenshots: training balls 1, 2, and 3.*

Tablet Training

*Screenshot: tablet training view.*

The Spot-TabletTraining folder contains a Unity project designed to help users understand and practice with the two major joystick controls for Spot:

  • Left joystick – Controls Spot’s rotation (turning left or right).
  • Right joystick – Controls Spot’s movement (forward, backward, and sideways).
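
The two-stick mapping above can be sketched as a function from joystick deflections to a body-frame velocity command. This is an illustrative sketch only: the axis conventions, speed limits, and function name are assumptions, not the tablet app's actual implementation.

```python
def joysticks_to_velocity(left_x, right_x, right_y,
                          max_yaw_rate=1.0, max_speed=1.0):
    """Map joystick deflections (each in [-1, 1]) to a velocity command.

    Assumed conventions: the left stick's horizontal axis sets the turn
    rate, and the right stick translates the robot (vertical axis is
    forward/backward, horizontal axis is sideways strafing).
    """
    def clamp(v):
        return max(-1.0, min(1.0, v))

    yaw_rate = clamp(left_x) * max_yaw_rate   # rad/s turn rate
    vx = clamp(right_y) * max_speed           # m/s forward/backward
    vy = clamp(right_x) * max_speed           # m/s sideways
    return vx, vy, yaw_rate
```

Clamping each axis before scaling keeps a noisy or saturated stick reading from ever commanding more than the configured speed limits.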

Training Objective

The goal of the training is to navigate to the four blue boxes displayed on screen as quickly and precisely as possible. This exercise develops both accuracy and speed in operating Spot’s controls.
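
A completion check for this task could look like the sketch below. The box coordinates and the reach tolerance are illustrative assumptions; the training project's own scoring may differ.

```python
import math

def boxes_remaining(visited_positions, box_centers, tolerance=0.5):
    """Return the target boxes not yet reached within `tolerance` metres.

    A box counts as reached once any sampled robot position comes within
    `tolerance` of its centre; the training is complete when the returned
    list is empty.
    """
    remaining = []
    for box in box_centers:
        if not any(math.dist(p, box) <= tolerance
                   for p in visited_positions):
            remaining.append(box)
    return remaining
```

Sampling the robot's position over time and re-running this check gives both a completion signal and, via the timestamps, the speed measure the training is meant to develop.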

Project Details

  • Unity Version: 2021.3.5 (also works after upgrading to Unity 6+)
  • Main Scene: SampleScene.unity

To launch the training, open SampleScene.unity and press Play. This project is designed for tablet touch-screen controls, but the left and right mouse buttons can be used to simulate the robot controls.

*Screenshots: left joystick and right joystick; navigation view with the target blue boxes.*

Study Environment

*Photo: the custom-built stage with staircases and ramp.*

For the experiment, we custom-built a wooden stage to provide a private and controlled setting for participants. The apparatus included two sets of staircases and a ramp, allowing users to test features relevant to typical built environments. This setup leveraged Spot’s unique ability to climb stairs and navigate elevated surfaces, demonstrating how it can reach task objectives in real-world conditions that many other robots cannot.

*Photos: additional views of the stage, staircases, and ramp.*

PDF

PDF of the paper (open access): https://dl.acm.org/doi/pdf/10.1145/3694907.3765931
