# Setup
This guide walks you through setting up the JOSHUA robotics framework on your development machine, building the project, and running your first robot configuration. JOSHUA supports both native installation on Ubuntu and containerized development via Docker.
## Prerequisites
Before installing JOSHUA, ensure your system meets the following requirements. JOSHUA supports two Ubuntu LTS releases, each paired with its corresponding ROS2 distribution.
### Operating System
| Ubuntu Version | ROS2 Distribution | Python Version | Status |
|---|---|---|---|
| Ubuntu 22.04 LTS (Jammy) | ROS2 Humble Hawksbill | Python 3.10 | Fully supported |
| Ubuntu 24.04 LTS (Noble) | ROS2 Jazzy Jalisco | Python 3.12 | Fully supported |
### Required Software

- Bazel 8.3.1 — Installed automatically via Bazelisk, which manages the correct Bazel version based on the project's `.bazelversion` file.
- Python 3.10 (Ubuntu 22.04) or Python 3.12 (Ubuntu 24.04) — The system default Python for the respective Ubuntu release.
- Git — For cloning the repository and version control.
### Optional Software
- NVIDIA GPU with CUDA — Required for AI model training (SmolVLA, PPO reinforcement learning) and GPU-accelerated MuJoCo-XLA simulation. Not needed for teleoperation, basic simulation, or mock testing.
- Docker & Docker Compose — For containerized development and deployment. Recommended for CI/CD workflows and cross-platform builds (AMD64/ARM64).
> **Note:** All `bazel` commands in this guide assume you have Bazelisk installed and aliased as `bazel`. Bazelisk will automatically download and use Bazel 8.3.1 as specified in the project's `.bazelversion` file.
## Installation
JOSHUA provides two installation methods: native installation directly on Ubuntu, or Docker-based containerized development. Choose the method that best fits your workflow.
### Option A: Native Installation
Native installation gives you the fastest development experience with direct hardware access. The setup script auto-detects your Ubuntu version and installs all required dependencies.
#### 1. Clone the Repository

```shell
git clone https://github.com/Joshua-AI-Robotics/Joshua.git
cd Joshua
```
#### 2. Run the Setup Script
JOSHUA provides a setup script that handles all dependency installation. Choose the environment that matches your use case:
**Full Development Environment** (includes all development tools, linters, and testing frameworks):

```shell
scripts/setup.sh --env=dev
```
**Minimal Runtime Environment** (only essential runtime dependencies):

```shell
scripts/setup.sh --env=runtime
```

The setup script installs:

- ROS2 — Humble (Ubuntu 22.04) or Jazzy (Ubuntu 24.04) with core packages
- OpenCV — Computer vision library for camera processing
- Python dependencies — PyTorch, HuggingFace, JAX, and other AI/ML packages
- Docker & Docker Compose — Container runtime for deployment
- Bazelisk — Bazel version manager (installs as `bazel`)
- System libraries — Protobuf, Boost, and serial communication libraries
#### 3. Source the ROS2 Environment
After installation, source the ROS2 setup file. This must be done in every new terminal session:
For Ubuntu 22.04 (Humble):

```shell
source /opt/ros/humble/setup.bash
```

For Ubuntu 24.04 (Jazzy):

```shell
source /opt/ros/jazzy/setup.bash
```

> **Tip:** Add the `source` command to your `~/.bashrc` file so it runs automatically in every new terminal session.
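If you switch between both supported releases, you can select the setup file from the detected Ubuntu version. This is a convenience sketch, not part of the JOSHUA scripts; it falls back to Humble for unrecognized releases:

```shell
# Pick the ROS2 distro matching the installed Ubuntu release (sketch).
VERSION_ID="$(. /etc/os-release 2>/dev/null; echo "${VERSION_ID:-22.04}")"
case "$VERSION_ID" in
  24.04) ROS_DISTRO=jazzy  ;;
  *)     ROS_DISTRO=humble ;;   # default to Humble for anything else
esac
echo "source /opt/ros/${ROS_DISTRO}/setup.bash"
```

You would then run the printed `source` line (or add it to `~/.bashrc`).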
### Option B: Docker Installation
Docker provides a fully isolated, reproducible development environment. This is the recommended approach for CI/CD pipelines, cross-platform development, and quick evaluation of the framework.
#### 1. Prerequisites
Ensure Docker and Docker Compose are installed:
```shell
sudo apt-get update
sudo apt-get install docker.io docker-compose-v2
sudo usermod -aG docker $USER
```
Log out and back in for the group change to take effect.
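A quick way to confirm the group change took effect after re-login (a generic check, not a JOSHUA command):

```shell
# Prints a confirmation once the docker group is active for the current user.
if id -nG | tr ' ' '\n' | grep -qx docker; then
  DOCKER_READY=yes
  echo "docker group active"
else
  DOCKER_READY=no
  echo "not yet in the docker group; log out and back in"
fi
```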
#### 2. Clone and Start Containers

```shell
git clone https://github.com/Joshua-AI-Robotics/Joshua.git
cd Joshua
```
Start the container for your target platform:
```shell
# Ubuntu 22.04 (AMD64)
docker-compose up joshua-u22

# Ubuntu 24.04 (AMD64)
docker-compose up joshua-u24

# Ubuntu 22.04 (ARM64 - for Jetson Orin Nano)
docker-compose up joshua-u22-arm64

# Ubuntu 24.04 (ARM64)
docker-compose up joshua-u24-arm64

# Web UI only
docker-compose up joshua-ui
```
#### Available Docker Services
| Service | Platform | Description |
|---|---|---|
| `joshua-u22` | Ubuntu 22.04 / AMD64 | Full development environment with ROS2 Humble |
| `joshua-u24` | Ubuntu 24.04 / AMD64 | Full development environment with ROS2 Jazzy |
| `joshua-u22-arm64` | Ubuntu 22.04 / ARM64 | ARM64 environment for NVIDIA Jetson deployment |
| `joshua-u24-arm64` | Ubuntu 24.04 / ARM64 | ARM64 environment with ROS2 Jazzy |
| `joshua-ui` | Any | React/TypeScript web monitoring UI |
> **Note:** For GPU access inside a container, add `--gpus all` to your Docker run command or configure the `deploy` section in `docker-compose.yml`.
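As a sketch of the compose-file route (the service name comes from the table above; the `deploy.resources` form is standard Docker Compose syntax for GPU reservations):

```yaml
# docker-compose.yml — granting the joshua-u22 service access to all GPUs
services:
  joshua-u22:
    deploy:
      resources:
        reservations:
          devices:
            - driver: nvidia
              count: all
              capabilities: [gpu]
```

This requires the NVIDIA Container Toolkit to be installed on the host.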
## Building the Project
JOSHUA uses Bazel for hermetic, reproducible builds. The build system supports both Ubuntu 22.04 and 24.04 through platform-specific configurations.
### Full Build

Build all targets in the project:

```shell
bazel build //...
```
### Run All Tests

Execute the full test suite:

```shell
bazel test //...
```
### Platform-Specific Builds

The `.bazelrc` file provides platform-specific configurations. Use the `--config` flag to target a specific Ubuntu version:

```shell
# Build for Ubuntu 22.04
bazel build //... --config=u22

# Build for Ubuntu 24.04
bazel build //... --config=u24
```
### Building Specific Targets

You can build individual targets for faster iteration:

```shell
# Build only the main launcher
bazel build //launcher:joshua_main

# Build only robot drivers
bazel build //robot/...

# Build only AI inference modules
bazel build //ai/...

# Build only simulation targets
bazel build //simulation/...
```

> **Tip:** Use `bazel build --jobs=auto` to automatically use the optimal number of parallel build jobs for your machine.
## Running Your First Robot
The JOSHUA launcher (`joshua_main`) is the central entry point for all robot operations. It reads a Protocol Buffers text configuration file (`.pbtxt`) that defines the complete system: hardware, sensors, AI models, communication, and operation mode.
### The Mock Test Configuration
The simplest way to verify your installation is by running the mock test configuration. This preset uses mock (simulated) hardware drivers, so no physical robot hardware is required.
```shell
bazel run //launcher:joshua_main -- --config=config/config_preset/mock_py_test.pbtxt
```
The `mock_py_test.pbtxt` configuration does the following:
- Instantiates mock servo motors and sensors using the Python-based driver backend
- Sets up ROS2 nodes for sensor publishing and command subscription
- Runs the Node Generator orchestrator to manage the lifecycle of all child processes
- Validates the full configuration pipeline without requiring physical hardware
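For orientation, a `.pbtxt` preset is plain-text Protocol Buffers. The fragment below is a hypothetical sketch of that shape only; the real field names come from the schemas in `config/proto/` and will differ:

```
# Hypothetical .pbtxt sketch — illustrative structure, not the JOSHUA schema.
robot {
  actuator {
    type: MOCK_SERVO    # hypothetical enum value
    backend: PYTHON
  }
  sensor {
    type: MOCK_ENCODER  # hypothetical enum value
  }
}
operation_mode: TEST
```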
When you run `joshua_main`, it performs these steps in order:

1. Parses the `.pbtxt` configuration file into Protocol Buffer objects
2. Validates configuration integrity (checks for port conflicts, missing dependencies, and invalid parameters)
3. Determines the backend (C++ or Python) for each configured node
4. Spawns and manages child processes for each ROS2 node
5. Monitors node health and handles graceful shutdown on termination
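The spawn-and-shutdown behavior of the last two steps can be sketched in plain shell. This is a toy illustration only; the real launcher manages ROS2 nodes, health monitoring, and restart policies:

```shell
# Spawn two stand-in "nodes" as background processes, then shut them
# down gracefully — a toy version of the launcher's process management.
sleep 60 & NODE1=$!
sleep 60 & NODE2=$!

# Graceful shutdown: signal each child, then wait for it to exit.
kill "$NODE1" "$NODE2"
wait "$NODE1" 2>/dev/null || true
wait "$NODE2" 2>/dev/null || true
echo "all nodes stopped"
```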
### Running Other Configurations
The general pattern for running any configuration preset is:

```shell
bazel run //launcher:joshua_main -- --config=config/config_preset/<preset_name>.pbtxt
```

For example, to run the interactive MuJoCo simulation:

```shell
bazel run //launcher:joshua_main -- --config=config/config_preset/sim_interactive.pbtxt
```
## Configuration Presets
JOSHUA ships with 25+ configuration presets in `config/config_preset/`. Each preset is a complete, ready-to-run system definition. The tables below list the available presets organized by operation mode.
### Teleoperation Presets

| Preset Name | Mode | Description |
|---|---|---|
| `so100_teleoperate` | Teleoperation | Leader-follower teleoperation for the SO-100 robot arm. The leader arm drives the follower arm in real time. |
| `so100_teleoperate_data_collection` | Teleoperation + Data | Teleoperation with episode recording for imitation learning dataset collection. |
| `so100_keyboard_teleoperate` | Teleoperation | Keyboard-controlled teleoperation for the SO-100 arm. Useful for quick testing. |
| `so100_xbox_teleoperate` | Teleoperation | Xbox controller-based teleoperation for the SO-100 arm. |
### AI Inference Presets

| Preset Name | Mode | Description |
|---|---|---|
| `so100_smolvla` | AI Inference | SmolVLA vision-language-action model inference on SO-100 hardware. Takes camera input and natural language task descriptions. |
| `so100_smolvla_data_collection` | AI Inference + Data | SmolVLA inference with simultaneous data recording for evaluation. |
| `so100_random_noise` | AI Inference | Random noise action generation for testing the action pipeline without a trained model. |
| `so100_decision_transformer` | AI Inference | Decision Transformer model inference for sequential decision-making tasks. |
### Simulation Presets

| Preset Name | Mode | Description |
|---|---|---|
| `sim_interactive` | Simulation | Interactive MuJoCo simulation with GUI. Allows manual interaction with the simulated robot. |
| `sim_passive` | Simulation | Passive simulation playback of pre-recorded trajectories in the MuJoCo viewer. |
| `sim_mirror` | Simulation | Digital twin mode. Mirrors a physical robot's state in real time via ROS2 topics. |
| `sim_offscreen` | Simulation | Headless offscreen rendering for data generation and batch simulation. |
| `sim_mjx_training` | Training | GPU-accelerated MuJoCo-XLA parallel training with 2048 simultaneous environments. |
| `sim_ppo_training` | Training | PPO reinforcement learning training loop using JAX/Flax with MuJoCo simulation. |
| `sim_isaac` | Simulation | NVIDIA Isaac Sim integration for industrial-grade physics simulation with USD assets. |
### Calibration & Testing Presets

| Preset Name | Mode | Description |
|---|---|---|
| `so100_calibrate` | Calibration | Servo motor calibration for the SO-100 arm. Detects operational limits and publishes calibration data. |
| `mock_py_test` | Testing | Mock hardware with Python backend. Validates the full configuration pipeline without physical hardware. |
| `mock_cpp_test` | Testing | Mock hardware with C++ backend. Tests the C++ driver and node orchestration paths. |
| `mock_camera_test` | Testing | Mock camera sensor pipeline. Tests image capture, encoding, and ROS2 image topic publishing. |
| `mock_lidar_test` | Testing | Mock LiDAR sensor pipeline. Tests point cloud generation and ROS2 topic publishing. |
### Hardware-Specific Presets

| Preset Name | Mode | Description |
|---|---|---|
| `so100_single_arm` | Operation | Single SO-100 arm operation with full sensor suite (camera, encoders). |
| `so100_dual_arm` | Operation | Dual SO-100 arm configuration for bimanual manipulation tasks. |
| `so100_mobile_base` | Operation | SO-100 arm mounted on a mobile base with LiDAR navigation. |
| `pybricks_spike` | Operation | LEGO SPIKE Prime hub integration via Pybricks for educational robotics. |
| `jetson_orin_deploy` | Deploy | Optimized deployment configuration for NVIDIA Jetson Orin Nano edge computing. |
> **Tip:** Start with the `mock_py_test` preset. It exercises the full framework pipeline — configuration parsing, node orchestration, ROS2 communication — without needing any hardware. Once that works, try `sim_interactive` for a visual simulation experience.
## Project Structure
The JOSHUA repository is organized into clearly separated layers, each with its own Bazel BUILD files for independent builds and testing.
```
Joshua/
├── ai/                    # AI inference models and training pipelines
│   ├── model/             # Model implementations (SmolVLA, Decision Transformer)
│   ├── training/          # RL training loops (PPO, MJX)
│   └── registry/          # Model registry and factory
├── config/                # Configuration system
│   ├── config_preset/     # Ready-to-run .pbtxt configuration presets
│   └── proto/             # Protocol Buffers schema definitions
├── launcher/              # joshua_main entry point and node generator
├── robot/                 # Robot Hardware Abstraction Layer
│   ├── actuator/          # Servo motors (STS3215, Pybricks)
│   ├── sensor/            # Camera, LiDAR, encoder drivers
│   ├── communication/     # Serial, Bluetooth, USB protocols
│   └── mock/              # Mock drivers for testing
├── simulation/            # MuJoCo simulation engine
│   ├── mujoco/            # MuJoCo wrapper and modes
│   ├── mjx/               # MuJoCo-XLA GPU training
│   ├── isaac/             # NVIDIA Isaac Sim integration
│   └── models/            # MJCF robot model files
├── ui/                    # User interfaces
│   ├── qt/                # Qt6 C++ desktop control panel
│   └── web/               # React/TypeScript web monitoring UI
├── ros2/                  # ROS2 message definitions and utilities
│   └── msg/               # Custom message types
├── scripts/               # Setup, build, and deployment scripts
│   └── setup.sh           # Automated environment setup
├── docker/                # Dockerfiles and docker-compose.yml
├── third_party/           # Third-party dependencies (Bazel external)
├── .bazelrc               # Bazel configuration (platform configs)
├── .bazelversion          # Pinned Bazel version (8.3.1)
├── BUILD                  # Root BUILD file
├── MODULE.bazel           # Bazel module definition
└── WORKSPACE              # Bazel workspace configuration
```
Each directory contains its own `BUILD` file defining build targets, dependencies, and test rules. This enables fine-grained dependency tracking and fast incremental builds. You can build and test any subdirectory independently using Bazel target patterns like `bazel test //robot/actuator/...`.
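As an illustration of the pattern, a per-directory `BUILD` file might look like the following. The target and dependency names here are hypothetical, not copied from the repository:

```python
# robot/actuator/BUILD — hypothetical example of a per-directory BUILD file.
py_library(
    name = "servo_driver",                        # hypothetical target
    srcs = ["servo_driver.py"],
    deps = ["//robot/communication:serial"],      # hypothetical dependency
)

py_test(
    name = "servo_driver_test",
    srcs = ["servo_driver_test.py"],
    deps = [":servo_driver"],
)
```

With this layout, `bazel test //robot/actuator/...` runs only the tests in that directory and its subpackages.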
## Next Steps
Now that you have JOSHUA installed and running, explore the following documentation to deepen your understanding of the framework:
### Configuration System →
Learn the Protocol Buffers schema, understand every field in the .pbtxt files, and create your own custom configurations for new robot setups.
### Robot Hardware Layer →
Explore the hardware abstraction layer, supported actuators and sensors, factory pattern instantiation, and how to add drivers for new hardware.
### AI System →
Understand the model registry, SmolVLA integration, reinforcement learning pipelines, and how to plug in your own AI models for inference.
### Simulation Engine →
Dive into MuJoCo simulation modes, MJCF robot models, GPU-accelerated training with MuJoCo-XLA, and NVIDIA Isaac Sim integration.
### System Architecture →
Get the full picture of JOSHUA's layered architecture, data flow, node orchestration, and the dual-layer data type system.
### Deployment & CI/CD →
Learn about Docker containerization, ARM64 cross-compilation for Jetson, GitHub Actions CI pipelines, and production deployment workflows.