Launch Files

Overview

The sn4_bringup package contains ROS 2 launch files that orchestrate the startup of the entire FEB autonomous racing system. Launch files are organized hierarchically to enable modular testing and deployment of individual subsystems or the complete system.

Launch File Hierarchy

The launch system follows a hierarchical structure:

full_launch.py (Complete System)
├── microcontroller.py
├── perception.py
│   ├── camera.py
│   └── lidar.py
├── localization.py
└── pnc.py

Launch Files Reference

full_launch.py

Purpose: Master launch file that brings up the complete autonomous racing system.

Description:

Launches all subsystems required for autonomous operation:

  • Microcontroller communication

  • Perception (cameras and LiDAR)

  • Localization (SLAM)

  • Planning and Control (PnC)

Subsystems Included:

  Launch File           Description
  ------------------    ------------------------------
  microcontroller.py    Dual microcontroller interface
  perception.py         Complete perception system
  localization.py       GraphSLAM localization
  pnc.py                Planning and control stack

Usage:

ros2 launch sn4_bringup full_launch.py

When to Use:

  • Full system testing on the vehicle

  • Competition runs

  • Complete end-to-end validation

Source:

from launch import LaunchDescription
from launch.actions import IncludeLaunchDescription
from launch.launch_description_sources import PythonLaunchDescriptionSource
from ament_index_python.packages import get_package_share_directory
import os

def generate_launch_description():
    """
    Master launch file for the complete autonomous racing system.

    Includes the microcontroller, perception, localization, and PnC launch files.
    """
    launch_dir = os.path.join(
        get_package_share_directory('sn4_bringup'), 'launch'
    )

    microcontroller_launch = IncludeLaunchDescription(
        PythonLaunchDescriptionSource(os.path.join(launch_dir, 'microcontroller.py'))
    )

    perception_launch = IncludeLaunchDescription(
        PythonLaunchDescriptionSource(os.path.join(launch_dir, 'perception.py'))
    )

    localization_launch = IncludeLaunchDescription(
        PythonLaunchDescriptionSource(os.path.join(launch_dir, 'localization.py'))
    )

    pnc_launch = IncludeLaunchDescription(
        PythonLaunchDescriptionSource(os.path.join(launch_dir, 'pnc.py'))
    )

    return LaunchDescription([
        microcontroller_launch,
        perception_launch,
        localization_launch,
        pnc_launch
    ])

perception.py

Purpose: Master perception launch file that starts both camera and LiDAR perception.

Description:

Orchestrates the complete perception subsystem by including both camera and LiDAR launch files.

Subsystems Included:

  Launch File    Description
  -----------    ---------------------------
  camera.py      Dual RealSense camera setup
  lidar.py       RSLiDAR sensor

Usage:

ros2 launch sn4_bringup perception.py

When to Use:

  • Testing complete perception system

  • Sensor data collection

  • Perception algorithm development

Source:

from launch import LaunchDescription
from launch.actions import IncludeLaunchDescription
from launch.launch_description_sources import PythonLaunchDescriptionSource
from ament_index_python.packages import get_package_share_directory
import os

def generate_launch_description():
    """
    Master perception launch file.

    Launches both camera and LiDAR perception subsystems:
    - camera.py: Dual RealSense camera setup
    - lidar.py: RSLiDAR sensor
    """
    launch_dir = os.path.join(get_package_share_directory('sn4_bringup'), 'launch')

    camera_launch = IncludeLaunchDescription(
        PythonLaunchDescriptionSource(os.path.join(launch_dir, 'camera.py'))
    )

    lidar_launch = IncludeLaunchDescription(
        PythonLaunchDescriptionSource(os.path.join(launch_dir, 'lidar.py'))
    )

    return LaunchDescription([
        camera_launch,
        lidar_launch,
    ])

camera.py

Purpose: Launch dual Intel RealSense depth cameras for vision-based cone detection.

Description:

Starts two RealSense camera nodes with hardware-specific serial numbers and configurations:

  • D435i (Left Camera): Includes IMU, mounted upside down
    - Serial: 344422072170
    - Resolution: 1280x720 @ 30 FPS
    - Features: RGB + aligned depth + IMU

  • D435 (Right Camera): Standard mounting
    - Serial: 042222071145
    - Features: RGB + aligned depth

Published Topics:

  • /camera/realsense_imu/color/image_raw - Left camera RGB image

  • /camera/realsense_imu/aligned_depth_to_color/image_raw - Left camera aligned depth

  • /camera/realsense2/color/image_raw - Right camera RGB image

  • /camera/realsense2/aligned_depth_to_color/image_raw - Right camera aligned depth

Configuration:

Both cameras use:

  • Color enabled: Yes

  • Depth enabled: Yes

  • Depth alignment: Enabled (aligned to color frame)

  • Profile: 1280x720 @ 30 FPS

Usage:

ros2 launch sn4_bringup camera.py

When to Use:

  • Testing camera perception only

  • Camera calibration

  • Vision algorithm development
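
Source (illustrative sketch):

The actual camera.py is not reproduced here. The sketch below shows how the dual-camera setup described above could be written with the realsense2_camera driver; the parameter names (serial_no, align_depth.enable, rgb_camera.profile) vary between driver releases, so treat them as assumptions to check against the installed package.

from launch import LaunchDescription
from launch_ros.actions import Node

def generate_launch_description():
    # Left camera (D435i): IMU-equipped unit, mounted upside down.
    left_camera = Node(
        package='realsense2_camera',
        executable='realsense2_camera_node',
        namespace='camera',
        name='realsense_imu',
        parameters=[{
            'serial_no': '344422072170',          # left camera serial
            'enable_color': True,
            'enable_depth': True,
            'align_depth.enable': True,           # align depth to the color frame
            'rgb_camera.profile': '1280x720x30',  # assumed parameter name for 1280x720 @ 30 FPS
        }],
        output='screen',
    )

    # Right camera (D435): color plus aligned depth only.
    right_camera = Node(
        package='realsense2_camera',
        executable='realsense2_camera_node',
        namespace='camera',
        name='realsense2',
        parameters=[{
            'serial_no': '042222071145',
            'enable_color': True,
            'enable_depth': True,
            'align_depth.enable': True,
        }],
        output='screen',
    )

    return LaunchDescription([left_camera, right_camera])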

lidar.py

Purpose: Launch RSLiDAR sensor for point cloud acquisition.

Description:

Starts the RSLiDAR SDK node for LiDAR point cloud streaming. Includes optional RViz2 visualization (commented out by default).

Published Topics:

  • /rslidar_points - Raw LiDAR point cloud (PointCloud2)

Configuration:

  • Namespace: rslidar_sdk

  • Output: Screen (for debugging)

Usage:

ros2 launch sn4_bringup lidar.py

When to Use:

  • Testing LiDAR perception only

  • Ground plane calibration

  • Point cloud visualization
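
Source (illustrative sketch):

A minimal sketch of lidar.py, assuming the rslidar_sdk package exposes a node executable named rslidar_sdk_node; verify the exact name with ros2 pkg executables rslidar_sdk before relying on it.

from launch import LaunchDescription
from launch_ros.actions import Node

def generate_launch_description():
    # RSLiDAR driver: publishes /rslidar_points as sensor_msgs/PointCloud2.
    rslidar_node = Node(
        package='rslidar_sdk',
        executable='rslidar_sdk_node',   # assumed executable name
        namespace='rslidar_sdk',
        output='screen',                 # print driver output for debugging
    )

    # Optional RViz2 visualization, commented out by default as in the real launch file.
    # rviz_node = Node(package='rviz2', executable='rviz2', output='screen')

    return LaunchDescription([rslidar_node])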

localization.py

Purpose: Launch localization and perception fusion for SLAM.

Description:

Starts the GraphSLAM node for vehicle localization and map building, along with the dual camera perception node for early-stage sensor fusion.

Nodes Launched:

  Package              Executable          Purpose
  ------------------   -----------------   ---------------------
  graphslam_global     graphslam_global    SLAM and mapping
  early_stage_fusion   dual_realsense      Camera + LiDAR fusion

Published Topics:

  • /slam/state - Vehicle state estimate

  • /slam/map - Track map with cone boundaries

Usage:

ros2 launch sn4_bringup localization.py

When to Use:

  • SLAM algorithm testing

  • Map building

  • Localization debugging
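
Source (illustrative sketch):

The node table above maps directly onto a two-node launch file. This minimal sketch assumes both nodes run with default parameters; the real file may load configuration files.

from launch import LaunchDescription
from launch_ros.actions import Node

def generate_launch_description():
    # GraphSLAM: estimates vehicle state and builds the cone map
    # (publishes /slam/state and /slam/map).
    graphslam_node = Node(
        package='graphslam_global',
        executable='graphslam_global',
        output='screen',
    )

    # Early-stage camera + LiDAR fusion feeding cone observations to SLAM.
    fusion_node = Node(
        package='early_stage_fusion',
        executable='dual_realsense',
        output='screen',
    )

    return LaunchDescription([graphslam_node, fusion_node])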

pnc.py (Planning and Control)

Purpose: Launch the complete planning and control stack.

Description:

Starts all planning and control nodes required for autonomous navigation:

Nodes Launched:

  Package       Executable    Purpose
  -----------   -----------   ---------------------------
  global_path   global_path   Global racing line planning
  local_path    local_path    Local dynamic path planning
  mpc           mpc           Model Predictive Control

Typical Data Flow:

  1. Global path planner receives track map from SLAM

  2. Generates optimal racing line

  3. Local path planner adjusts for dynamic obstacles

  4. MPC controller tracks the reference trajectory

Usage:

ros2 launch sn4_bringup pnc.py

When to Use:

  • Planning algorithm development

  • Control tuning

  • Path planning testing without full system
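
Source (illustrative sketch):

A minimal sketch of pnc.py based on the node table above, assuming no launch arguments or parameter files are required:

from launch import LaunchDescription
from launch_ros.actions import Node

def generate_launch_description():
    return LaunchDescription([
        # Global racing line planning from the SLAM track map.
        Node(package='global_path', executable='global_path', output='screen'),
        # Local dynamic path planning around obstacles.
        Node(package='local_path', executable='local_path', output='screen'),
        # Model Predictive Control tracking the reference trajectory.
        Node(package='mpc', executable='mpc', output='screen'),
    ])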

microcontroller.py

Purpose: Launch dual microcontroller interface for vehicle control.

Description:

Starts the microcontroller communication node that interfaces with the vehicle’s low-level control systems.

Nodes Launched:

  Package           Executable   Purpose
  ---------------   ----------   ----------------------
  microcontroller   dual_mcu     Dual MCU communication

Published Topics:

  • Vehicle state feedback

  • Sensor data from MCU

Subscribed Topics:

  • Control commands (steering, throttle, brake)

Usage:

ros2 launch sn4_bringup microcontroller.py

When to Use:

  • Testing vehicle interface

  • Low-level control debugging

  • Hardware-in-the-loop testing
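
Source (illustrative sketch):

A minimal sketch of microcontroller.py, assuming the dual_mcu node needs no extra parameters (the real file may pass serial-port settings):

from launch import LaunchDescription
from launch_ros.actions import Node

def generate_launch_description():
    # Dual MCU bridge: publishes vehicle state feedback and subscribes to
    # steering, throttle, and brake commands.
    dual_mcu_node = Node(
        package='microcontroller',
        executable='dual_mcu',
        output='screen',
    )

    return LaunchDescription([dual_mcu_node])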

all_sensors.py

Purpose: Launch minimal sensor suite (microcontroller + perception).

Description:

Lightweight launch file for sensor data collection without planning/control. Useful for data logging and sensor validation.

Subsystems Included:

  Launch File          Description
  ------------------   -----------------
  microcontroller.py   Vehicle interface
  perception.py        Cameras and LiDAR

Usage:

ros2 launch sn4_bringup all_sensors.py

When to Use:

  • Sensor data collection

  • Bag recording sessions

  • Sensor calibration and validation
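
Source (illustrative sketch):

Because all_sensors.py only composes existing launch files, it follows the same include pattern as full_launch.py; this sketch assumes that structure:

from launch import LaunchDescription
from launch.actions import IncludeLaunchDescription
from launch.launch_description_sources import PythonLaunchDescriptionSource
from ament_index_python.packages import get_package_share_directory
import os

def generate_launch_description():
    launch_dir = os.path.join(get_package_share_directory('sn4_bringup'), 'launch')

    # Vehicle interface (dual MCU bridge).
    microcontroller_launch = IncludeLaunchDescription(
        PythonLaunchDescriptionSource(os.path.join(launch_dir, 'microcontroller.py'))
    )

    # Cameras and LiDAR.
    perception_launch = IncludeLaunchDescription(
        PythonLaunchDescriptionSource(os.path.join(launch_dir, 'perception.py'))
    )

    return LaunchDescription([microcontroller_launch, perception_launch])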

p131_sim.py (Simulation)

Purpose: Launch configuration for simulation testing.

Description:

Starts planning, control, and SLAM nodes for simulation environments. Excludes hardware drivers (cameras, LiDAR, microcontroller).

Nodes Launched:

  Package            Executable         Purpose
  ----------------   ----------------   ---------------------------
  mpc                mpc                Model Predictive Control
  global_path        global_path        Global path planning
  local_path         local_path         Local path planning
  graphslam_global   graphslam_global   SLAM (using simulated data)

Usage:

ros2 launch sn4_bringup p131_sim.py

When to Use:

  • Algorithm testing in simulation (Gazebo, EUFS sim)

  • Safe testing without hardware

  • Development without vehicle access
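
Source (illustrative sketch):

A minimal sketch of p131_sim.py based on the node table above; it deliberately starts no hardware drivers, assuming the simulator provides the sensor and vehicle-state topics:

from launch import LaunchDescription
from launch_ros.actions import Node

def generate_launch_description():
    return LaunchDescription([
        # SLAM consuming simulated sensor data.
        Node(package='graphslam_global', executable='graphslam_global', output='screen'),
        # Global and local path planning.
        Node(package='global_path', executable='global_path', output='screen'),
        Node(package='local_path', executable='local_path', output='screen'),
        # Model Predictive Control.
        Node(package='mpc', executable='mpc', output='screen'),
    ])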

Launch File Best Practices

Testing Individual Subsystems:

Test components individually before running the full system:

# Test cameras only
ros2 launch sn4_bringup camera.py

# Test LiDAR only
ros2 launch sn4_bringup lidar.py

# Test perception (cameras + LiDAR)
ros2 launch sn4_bringup perception.py

# Test full system
ros2 launch sn4_bringup full_launch.py

Common Launch Workflows:

  1. Hardware Bring-up:

    # Step 1: Verify sensors
    ros2 launch sn4_bringup all_sensors.py
    
    # Step 2: Add localization
    ros2 launch sn4_bringup localization.py
    
    # Step 3: Full system
    ros2 launch sn4_bringup full_launch.py
    
  2. Algorithm Development (Simulation):

    # Use simulation launch
    ros2 launch sn4_bringup p131_sim.py
    
  3. Sensor Calibration:

    # Cameras
    ros2 launch sn4_bringup camera.py
    
    # LiDAR ground plane
    ros2 launch sn4_bringup lidar.py
    

Troubleshooting

Launch File Not Found:

Ensure package is built and sourced:

cd ~/feb-system-integration
colcon build --packages-select sn4_bringup
source install/setup.bash

Camera Serial Number Issues:

If cameras fail to start, verify serial numbers:

rs-enumerate-devices | grep Serial

Update camera.py with correct serial numbers if hardware changed.
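
In a typical realsense2_camera launch description the serial number is a plain node parameter, so the edit is a one-line change. The snippet below is a hypothetical excerpt of the left-camera node, not the actual camera.py contents:

from launch_ros.actions import Node

# Hypothetical excerpt: update serial_no with the value reported by rs-enumerate-devices.
left_camera = Node(
    package='realsense2_camera',
    executable='realsense2_camera_node',
    namespace='camera',
    name='realsense_imu',
    parameters=[{'serial_no': '344422072170'}],  # replace with the new serial number
)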

Node Launch Failures:

Check individual node status:

ros2 node list
ros2 node info /node_name

Inspect the arguments a launch file accepts:

ros2 launch sn4_bringup <launch_file> --show-args

Per-node logs are written to ~/.ros/log by default; check the latest run directory there when a node fails silently.