Turn your phone's GPS, IMU and cameras into ROS2 sensors. Connect an Arduino over USB to close the control loop. No custom app, just a browser.
Sensor-only use case (e.g. VSLAM, localization): mount the phone on a robot or handheld rig. It streams camera, IMU and GPS into ROS2 topics, ready for robot_localization, ORB-SLAM3, or any visual-inertial pipeline.
Full mobile robot use case: connect an Arduino or Teensy to the phone over USB. The phone relays commands from ROS2 down to the microcontroller (motor drivers, servos, ...) while simultaneously streaming sensors upward. Your robot only needs firmware for low-level actuation, the phone handles perception and communication.
The bridge relies on the mobile browser rather than a dedicated app. A webpage is served from the ROS2 node; opening it on the phone prompts for permissions and starts streaming over WebSockets.
This repository is inspired by phone-imu, a project I ran with students as a TA.
source /opt/ros/humble/setup.bash
rosdep install -i --from-path src --rosdistro humble -y --ignore-src
colcon build --packages-up-to phone_sensors_bridge
# Or build for development, but remember to clean and rebuild every time the JS files are changed
# colcon build --symlink-install --packages-up-to phone_sensors_bridge_examples --event-handlers console_direct+

Run the following in a different terminal than the one used for building; otherwise the server might start from inside the build folder, which does not contain the template and static folders for the webpage.
source install/setup.bash
# Ubuntu IP
EXTRA_IP=$(ip route get 8.8.8.8 | grep -oP 'src \K[^ ]+')
echo "My IP is $EXTRA_IP"
# Generate SSL certificates for local webserver
ros2 run phone_sensors_bridge generate_dev_certificates.sh $EXTRA_IP
# Start server
ros2 run phone_sensors_bridge server --ros-args -p camera1_video_width:=720 -p camera1_video_height:=720

Open the webpage from your mobile device. The URL contains the host IP of the machine where the node is running; it depends on your network, but is most likely https://<EXTRA_IP>:2000.
If you have multiple network interfaces, prefer the fastest one, e.g. Ethernet over Wi-Fi.
The page will prompt for permissions, then display the chosen camera.
| Topic | Type | Description |
|---|---|---|
| `time/device` | `sensor_msgs/TimeReference` | Device system clock versus ROS time |
| `time/gnss` | `sensor_msgs/TimeReference` | GNSS fix acquisition time versus device clock |
| `imu` | `sensor_msgs/Imu` | Orientation (ENU quaternion), angular velocity and linear acceleration in the device frame |
| `gnss` | `sensor_msgs/NavSatFix` | GPS fix: latitude, longitude, altitude with horizontal/vertical accuracy covariance |
| `gnss/odometry` | `nav_msgs/Odometry` | GPS-derived velocity in the ENU frame; pose covariance is 1e9 (fuse twist only) |
| `camera1/image_raw/compressed` | `sensor_msgs/CompressedImage` | Camera 1 JPEG stream |
| `camera2/image_raw/compressed` | `sensor_msgs/CompressedImage` | Camera 2 JPEG stream |
| `camera1/camera_info` | `sensor_msgs/CameraInfo` | Camera 1 calibration (only published when `camera1_calibration_file` is set) |
| `camera2/camera_info` | `sensor_msgs/CameraInfo` | Camera 2 calibration (only published when `camera2_calibration_file` is set) |
| `usb/rx` | `std_msgs/UInt8MultiArray` | Raw bytes received from the USB device (only active when `usb_enabled` is True) |
| Topic | Type | Description |
|---|---|---|
| `usb/tx` | `std_msgs/UInt8MultiArray` | Raw bytes to forward to the USB device (only active when `usb_enabled` is True) |
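The bridge forwards the bytes in `usb/tx` to the device unchanged, so the wire format is entirely up to your firmware. As an illustration only, assuming a made-up protocol (start byte, two little-endian int16 wheel speeds, XOR checksum), the payload for the `std_msgs/UInt8MultiArray` `data` field could be built like this:

```python
import struct

START = 0xAA  # hypothetical start-of-frame marker

def pack_wheel_command(left: int, right: int) -> bytes:
    """Frame two signed wheel speeds using a hypothetical protocol:
    start byte, little-endian int16 pair, XOR checksum of the payload."""
    payload = struct.pack("<hh", left, right)
    checksum = 0
    for b in payload:
        checksum ^= b
    return bytes([START]) + payload + bytes([checksum])

frame = pack_wheel_command(100, -100)
print(frame.hex())  # aa64009cff07
```

The resulting `bytes` object maps directly onto the `data` field of the message; the firmware side only has to mirror the same framing.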
See the server configuration guide.
The current server is tested with Firefox and Chrome on Android.
The webpage includes a "Measure latency" button that runs 100 sequential WebSocket ping-pong round trips and reports one-way latency and clock offset using the NTP formula:
RTT = t2_client - t1_client
latency = RTT / 2
clock_offset = t_server - (t1_client + t2_client) / 2

Note: browser timestamps use the Performance API for sub-millisecond resolution, but precision is still reduced by browsers (100 µs floor in Firefox, up to 100 ms with fingerprinting resistance).
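The same computation in Python, with hypothetical timestamps in milliseconds (`t1_client` when the ping is sent, `t_server` when the server stamps it, `t2_client` when the pong arrives):

```python
def ping_pong_offset(t1_client: float, t_server: float, t2_client: float):
    """One-way latency and clock offset from a single ping-pong,
    matching the formulas above (all times in milliseconds)."""
    rtt = t2_client - t1_client
    latency = rtt / 2.0
    clock_offset = t_server - (t1_client + t2_client) / 2.0
    return latency, clock_offset

# Hypothetical sample: ping sent at 1000.0 ms, server stamps 712.0 ms,
# pong received at 1010.0 ms (server clock behind the client clock)
latency, offset = ping_pong_offset(1000.0, 712.0, 1010.0)
print(latency, offset)  # 5.0 -293.0
```

This assumes the path is symmetric; asymmetric routes bias the offset by half the asymmetry, which is why the button averages 100 round trips.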
Example measurement over a local WiFi network when running two video feeds, IMU and GNSS:
Latency (N=100): mean=5.19 ms, min=1.50 ms, max=12.50 ms | Clock offset (ROS-client): mean=-293.10 ms, min=-300.00 ms, max=-285.00 ms

A negative clock offset means the ROS clock is behind the client (phone) clock, in this case by ~293 ms; a positive offset would mean the ROS clock is ahead. The offset is stable across samples (spread of ~15 ms), which indicates a consistent skew rather than jitter. If the parameter time_reference_frequency is strictly positive, the time/device TimeReference topic reports the clock difference including network latency.
We observe the same clock offset in PlotJuggler, expressed in seconds; in this case it was measured running only time_reference_frequency:=100.0 and no video. The offset here is the difference in clock time plus network latency.
- Works in both Firefox and Chrome via `watchPosition`. To obtain an `Odometry` message, you may need to move your device around.
- If your log shows a GNSS starting error on Chrome, waiting ~10 s sometimes fixes it; otherwise a page reload can solve some cases.
The published `NavSatFix` and `Odometry` messages use the GNSS time, which may already be ~50 ms old by the time the browser on the client side obtains the data (red line). The delay depends directly on the quality of GNSS reception and may be much higher indoors, far from a window. A `TimeReference` message on `/time/gnss` is always published to report this delay.
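A consumer can recover the fix age from the two clocks reported on `/time/gnss`. A minimal sketch with plain numbers (in practice they come from the `TimeReference` header stamp and `time_ref` fields):

```python
def gnss_fix_age(device_receive_time: float, gnss_fix_time: float) -> float:
    """Age of the GNSS fix when the browser obtained it, in seconds.
    device_receive_time: device clock when the fix was delivered
    gnss_fix_time: acquisition time reported by the GNSS receiver."""
    return device_receive_time - gnss_fix_time

# Hypothetical sample: fix acquired at t=100.000 s, delivered at t=100.050 s
age = gnss_fix_age(100.050, 100.000)
print(f"{age * 1000:.0f} ms")  # 50 ms
```

Subtracting this age from the message stamp gives the actual acquisition time, which matters when fusing GNSS with faster sensors such as the IMU.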
- Camera device labels differ between Firefox and Chrome; the `test_video_permissions` page (linked from the home page) shows the correct `camera1_device_label` and `camera2_device_label` for your browser.
- The Web Serial API is not supported in Firefox. On Android Chrome, version 145 appears to expose Web Serial, but it does not work for USB devices; it might work for Bluetooth devices, which may be added as a feature if an issue requests it. Use the test pages to verify the state of your browser.
- For connecting USB serial devices (microcontrollers, sensors), the project uses WebUSB instead of Web Serial. See the USB serial devices guide for compatible hardware, supported chip protocols, and example code.
To calibrate your camera, print the checkerboard at the maximum page size. Make sure the printer does not scale or resize the page.
# Calibrate camera using GUI
source install/setup.bash
ros2 launch phone_sensors_bridge_examples calibrate.launch.py
# Generate SSL certificates for local webserver
#ros2 run phone_sensors_bridge generate_dev_certificates.sh $(ip route get 8.8.8.8 | grep -oP 'src \K[^ ]+')
# Extract the calibration so that launch files will know where to look
tar -xvf /tmp/calibrationdata.tar.gz --exclude=*.png --directory src/phone_sensors_bridge_examples/config/
# Server, image rectification, RVIZ
ros2 launch phone_sensors_bridge_examples rviz.launch.py
# Start only the node
#ros2 run phone_sensors_bridge server --ros-args -p camera_calibration_file:=src/phone_sensors_bridge_examples/config/ost.yaml

- Browser-based, no app required. Works with Firefox and Chrome on Android
- Up to 2 simultaneous camera streams with configurable resolution, FPS and compression; camera calibration support via CameraInfo
- IMU: orientation (ENU quaternion), angular velocity, linear acceleration via Imu
- GNSS: position via NavSatFix + ENU velocity via Odometry when heading and speed are available
- Time synchronization between ROS, device and GNSS clocks via TimeReference
- Fully configurable via ROS2 parameters; launch file examples included
- WebUSB serial bridge: forward raw bytes between ROS2 (`usb/rx`, `usb/tx`) and a USB-connected microcontroller; supports CDC-ACM boards (Teensy, Arduino Leonardo/Micro/Zero) and CP2102 USB-UART converters; disabled by default (`usb_enabled:=False`)
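Since orientation arrives as an ENU quaternion, planar robots usually just need the yaw angle. A minimal pure-Python extraction (the argument order matches the `x`, `y`, `z`, `w` components of a `sensor_msgs/Imu` orientation):

```python
import math

def quaternion_to_yaw(x: float, y: float, z: float, w: float) -> float:
    """Yaw (rotation about +Z, radians) from an ENU quaternion,
    as found in sensor_msgs/Imu orientation."""
    return math.atan2(2.0 * (w * z + x * y), 1.0 - 2.0 * (y * y + z * z))

# Identity quaternion: no rotation
print(quaternion_to_yaw(0.0, 0.0, 0.0, 1.0))  # 0.0
# 90 degree rotation about +Z
s = math.sin(math.pi / 4)
print(quaternion_to_yaw(0.0, 0.0, s, math.cos(math.pi / 4)))  # ~1.5708
```

In ENU, yaw 0 points east and increases counter-clockwise, which is the convention `robot_localization` expects.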
- Document server-side architecture
- `robot_localization` example with visual-inertial odometry
- `robot_localization` example with control feedback replacing speed odometry
- Tests for `message_converters.py`
- Publish `sensor_msgs/CompressedImage` on `camera1/image_raw/compressed` following the image_transport convention: the JPEG bytes from the browser are placed directly into `CompressedImage.data`, skipping the decode → numpy → OpenCV → CvBridge chain entirely
- Switch the video channel to binary WebSocket frames to eliminate the ~33% base64 encoding overhead; pairs naturally with the `CompressedImage` change
- WebUSB serial bridge to microcontroller over USB (`usb/rx`, `usb/tx` as `std_msgs/UInt8MultiArray`)
- Bluetooth serial support
- Remove `debug` and `secret_key` parameters and hardcode safe defaults (`debug=False`, randomised secret); these are Flask internals that should not be exposed as ROS parameters
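The base64 overhead mentioned in the roadmap is easy to verify: encoding maps every 3 input bytes to 4 output characters, i.e. ~33% growth on the wire:

```python
import base64
import os

raw = os.urandom(30_000)  # stand-in for a JPEG frame
encoded = base64.b64encode(raw)
overhead = len(encoded) / len(raw) - 1.0
print(f"base64 overhead: {overhead:.1%}")  # base64 overhead: 33.3%
```

Binary WebSocket frames carry the JPEG bytes as-is, so switching removes this growth entirely.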

