Simulated Perception

Overview

The simulated perception module provides tools for testing perception algorithms in simulation environments. It simulates realistic sensor noise and delays for both camera and LiDAR perception systems.

Cone Noise Simulator

The Cone Noise Simulator (ConeNoiseSimulator) adds realistic noise and delays to ground truth cone positions to simulate camera and LiDAR perception in testing environments.

Key Features:

  • Subscribes to ground truth cone positions from the simulator

  • Adds configurable Gaussian noise to simulate sensor inaccuracies

  • Publishes separate camera and LiDAR cone detections

  • Simulates realistic sensor delays and publishing rates

  • Supports visualization in RViz

ROS Topics:

Subscribed Topics:

  • /ground_truth/cones (ConeArrayWithCovariance) - Ground truth cone positions from simulator

Published Topics:

  • /camera/cones (ConesCartesian) - Simulated camera cone detections with noise

  • /lidar/cones (ConesCartesian) - Simulated LiDAR cone detections with noise

  • /cones_noise_viz (PointCloud) - Visualization of noisy camera cones for RViz
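
For reference, a minimal sketch of a downstream node that subscribes to the simulated detections is shown below; the import path for ConesCartesian is an assumption and should be replaced with the message package used in your workspace.

    # Minimal subscriber sketch. The import path for ConesCartesian is an
    # assumption; replace it with the message package used in your workspace.
    import rclpy
    from rclpy.node import Node
    from perception_msgs.msg import ConesCartesian  # assumed import path

    class ConeConsumer(Node):
        def __init__(self):
            super().__init__('cone_consumer')
            self.create_subscription(ConesCartesian, '/camera/cones', self.on_camera, 10)
            self.create_subscription(ConesCartesian, '/lidar/cones', self.on_lidar, 10)

        def on_camera(self, msg):
            self.get_logger().info('received simulated camera cones')

        def on_lidar(self, msg):
            self.get_logger().info('received simulated LiDAR cones')

    def main(args=None):
        rclpy.init(args=args)
        rclpy.spin(ConeConsumer())
        rclpy.shutdown()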

Configuration Parameters:

The simulator uses settings from SimulatorPerceptionSettings:

  • camera_hz - Camera publishing rate (Hz)

  • lidar_hz - LiDAR publishing rate (Hz)

  • camera_noise_std - Standard deviation of Gaussian noise for camera (meters)

  • lidar_noise_std - Standard deviation of Gaussian noise for LiDAR (meters)

  • camera_delay_mean - Mean delay for camera publishing (seconds)

  • camera_delay_std - Standard deviation of camera delay (seconds)

  • lidar_delay_mean - Mean delay for LiDAR publishing (seconds)

  • lidar_delay_std - Standard deviation of LiDAR delay (seconds)
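
The sketch below shows one plausible shape for such a settings object, assuming a simple dataclass with the fields listed above; the default values are placeholders, not the project's actual defaults, and the real SimulatorPerceptionSettings may be structured differently.

    # Illustrative shape of the settings object; field names follow the list
    # above, but the defaults are placeholders rather than the project's values.
    from dataclasses import dataclass

    @dataclass
    class SimulatorPerceptionSettings:   # assumed structure, for illustration
        camera_hz: float = 30.0          # camera publishing rate (Hz)
        lidar_hz: float = 10.0           # LiDAR publishing rate (Hz)
        camera_noise_std: float = 0.10   # camera noise std dev (m)
        lidar_noise_std: float = 0.05    # LiDAR noise std dev (m)
        camera_delay_mean: float = 0.05  # mean camera delay (s)
        camera_delay_std: float = 0.01   # camera delay std dev (s)
        lidar_delay_mean: float = 0.05   # mean LiDAR delay (s)
        lidar_delay_std: float = 0.01    # LiDAR delay std dev (s)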

Usage:

The simulator is designed to run alongside the EUFS simulator or other ground truth sources. It processes ground truth cone data and outputs simulated sensor measurements that can be used to test perception fusion, SLAM, and planning algorithms under realistic sensor conditions.
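
As an illustration, the node could be started alongside the simulator with a launch file along the lines of the sketch below; the package and executable names are assumptions and must be replaced with the ones from your workspace.

    # Hypothetical launch sketch; the package and executable names below are
    # assumptions, not the project's actual names.
    from launch import LaunchDescription
    from launch_ros.actions import Node

    def generate_launch_description():
        return LaunchDescription([
            Node(
                package='perception_sim',     # assumed package name
                executable='cone_noise_sim',  # assumed executable name
                name='cone_noise_simulator',
                output='screen',
            ),
        ])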

Noise Model:

The noise is applied independently to x and y coordinates using a Gaussian distribution:

\[
\begin{aligned}
x_{\text{noisy}} &= x_{\text{true}} + \mathcal{N}(0, \sigma^2) \\
y_{\text{noisy}} &= y_{\text{true}} + \mathcal{N}(0, \sigma^2)
\end{aligned}
\]

where \(\sigma\) is the configured standard deviation for the respective sensor.
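
A minimal numpy sketch of this model, applied to arrays of cone coordinates, might look as follows (the function name and vectorized form are illustrative, not the module's actual implementation):

    # Sketch of the noise model: independent zero-mean Gaussian noise per axis.
    import numpy as np

    rng = np.random.default_rng()

    def perturb_cones(xs, ys, sigma):
        """Return noisy copies of cone coordinates (sigma in meters)."""
        xs = np.asarray(xs, dtype=float)
        ys = np.asarray(ys, dtype=float)
        return (xs + rng.normal(0.0, sigma, size=xs.shape),
                ys + rng.normal(0.0, sigma, size=ys.shape))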

Color Mapping:

  • Camera detections include color information:

      • Blue cones: color = 2

      • Yellow cones: color = 1

  • LiDAR detections do not include color: color = -1

API Reference

class perception.perception_sim.perception_sim.cone_noise_sim.ConeNoiseSimulator[source]

Bases: rclpy.node.Node

add_gaussian_noise(x, y, is_camera)[source]

Add Gaussian noise to x, y coordinates.
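
The snippet below is a free-function sketch of what this method plausibly does; in the actual class the standard deviations come from SimulatorPerceptionSettings rather than being passed as arguments.

    # Free-function sketch of the documented method; passing the standard
    # deviations as arguments here is an assumption made for self-containment.
    import random

    def add_gaussian_noise(x, y, is_camera, camera_noise_std=0.1, lidar_noise_std=0.05):
        sigma = camera_noise_std if is_camera else lidar_noise_std
        return x + random.gauss(0.0, sigma), y + random.gauss(0.0, sigma)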

cone_callback(msg)[source]

Process ground truth cone data and prepare noisy simulated measurements.

This callback receives ground truth cone positions from the simulator and generates separate camera and LiDAR measurements by adding Gaussian noise. The noisy measurements are stored internally but not published immediately - publishing is controlled by separate timers to simulate realistic sensor rates.

Parameters

msg (ConeArrayWithCovariance) – Ground truth cone positions from simulator, containing blue_cones and yellow_cones arrays.

Note

  • Camera measurements include color information (blue=2, yellow=1)

  • LiDAR measurements do not include color (color=-1)

  • Camera measurements only include cones with x > 0.1 (i.e. cones in front of the vehicle)

  • Different noise levels applied for camera vs LiDAR based on settings
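
Putting the behavior and the notes above together, a simplified sketch of the callback's core logic might look like this; message field names (blue_cones, yellow_cones, cone.point.x/y) follow the ground truth message layout and are assumptions where not stated here.

    # Simplified sketch of the callback logic described above.
    def simulate_detections(msg, add_noise, camera_min_x=0.1):
        BLUE, YELLOW, NO_COLOR = 2, 1, -1
        camera, lidar = [], []
        labelled = ([(c, BLUE) for c in msg.blue_cones]
                    + [(c, YELLOW) for c in msg.yellow_cones])
        for cone, color in labelled:
            x, y = cone.point.x, cone.point.y
            # LiDAR sees every cone but carries no color information.
            lidar.append((*add_noise(x, y, is_camera=False), NO_COLOR))
            # Camera only keeps cones in front of the vehicle.
            if x > camera_min_x:
                camera.append((*add_noise(x, y, is_camera=True), color))
        # The real node stores these lists and leaves publishing to the timers.
        return camera, lidar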

publish_camera()[source]

Publish simulated camera cone detections at the configured camera rate.

This method is called by a timer at the camera publishing frequency (camera_hz). It adds a realistic delay based on configured mean and standard deviation, then publishes both the ConesCartesian message and a PointCloud visualization.

The publishing delay simulates real camera processing latency with:

delay ~ max(0, N(camera_delay_mean, camera_delay_std^2))
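
A minimal sketch of this delay handling is shown below (the LiDAR publisher follows the same pattern); whether the node actually waits before publishing or accounts for the delay another way is an implementation detail, so the time.sleep call here is an assumption.

    # Sketch of the truncated-Gaussian publishing delay.
    import random
    import time

    def sample_delay(mean, std):
        return max(0.0, random.gauss(mean, std))

    def publish_with_delay(publisher, msg, delay_mean, delay_std):
        time.sleep(sample_delay(delay_mean, delay_std))  # emulate processing latency
        publisher.publish(msg)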

publish_lidar()[source]

Publish simulated LiDAR cone detections at the configured LiDAR rate.

This method is called by a timer at the LiDAR publishing frequency (lidar_hz). It adds a realistic delay based on configured parameters, then publishes the LiDAR cone detections without color information.

The publishing delay simulates real LiDAR processing latency with:

delay ~ max(0, N(lidar_delay_mean, lidar_delay_std^2))

Published Topics:

/lidar/cones (ConesCartesian): Noisy LiDAR cone detections without color

Note

Does nothing if no LiDAR message has been prepared yet (lidar_msg is None).

perception.perception_sim.perception_sim.cone_noise_sim.main(args=None)[source]