Simulated Perception
====================

Overview
--------

The simulated perception module provides tools for testing perception algorithms in simulation environments. It simulates realistic sensor noise and delays for both camera and LiDAR perception systems.

Cone Noise Simulator
--------------------

The Cone Noise Simulator (`ConeNoiseSimulator`) adds realistic noise and delays to ground truth cone positions to simulate camera and LiDAR perception in testing environments.

**Key Features:**

- Subscribes to ground truth cone positions from the simulator
- Adds configurable Gaussian noise to simulate sensor inaccuracies
- Publishes separate camera and LiDAR cone detections
- Simulates realistic sensor delays and publishing rates
- Supports visualization in RViz

**ROS Topics:**

*Subscribed Topics:*

- `/ground_truth/cones` (`ConeArrayWithCovariance`) - Ground truth cone positions from the simulator

*Published Topics:*

- `/camera/cones` (`ConesCartesian`) - Simulated camera cone detections with noise
- `/lidar/cones` (`ConesCartesian`) - Simulated LiDAR cone detections with noise
- `/cones_noise_viz` (`PointCloud`) - Visualization of noisy camera cones for RViz

**Configuration Parameters:**

The simulator uses settings from `SimulatorPerceptionSettings`:

- `camera_hz` - Camera publishing rate (Hz)
- `lidar_hz` - LiDAR publishing rate (Hz)
- `camera_noise_std` - Standard deviation of Gaussian noise for camera detections (meters)
- `lidar_noise_std` - Standard deviation of Gaussian noise for LiDAR detections (meters)
- `camera_delay_mean` - Mean delay for camera publishing (seconds)
- `camera_delay_std` - Standard deviation of camera delay (seconds)
- `lidar_delay_mean` - Mean delay for LiDAR publishing (seconds)
- `lidar_delay_std` - Standard deviation of LiDAR delay (seconds)

**Usage:**

The simulator is designed to run alongside the EUFS simulator or other ground truth sources. It processes ground truth cone data and outputs simulated sensor measurements that can be used to test perception fusion, SLAM, and planning algorithms under realistic sensor conditions. Illustrative sketches of the noise, color, and delay behaviour are given in the Examples section at the end of this page.

**Noise Model:**

Noise is applied independently to the x and y coordinates using a Gaussian distribution:

.. math::

   x_{noisy} = x_{true} + \mathcal{N}(0, \sigma^2)

   y_{noisy} = y_{true} + \mathcal{N}(0, \sigma^2)

where :math:`\sigma` is the configured standard deviation for the respective sensor.

**Color Mapping:**

- Camera detections include color information:

  - Blue cones: color = 2
  - Yellow cones: color = 1

- LiDAR detections do not include color: color = -1

API Reference
-------------

.. automodule:: perception.perception_sim.perception_sim.cone_noise_sim
   :members:
   :undoc-members:
   :show-inheritance:
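
Examples
--------

The snippets in this section are minimal sketches, not the module's actual implementation. Any names, field defaults, or helper functions that do not appear in the documentation above are illustrative assumptions. For instance, a settings object grouping the configuration parameters listed above might look like the following; the default values shown here are placeholders, not the project's real defaults:

.. code-block:: python

   from dataclasses import dataclass


   @dataclass
   class SimulatorPerceptionSettings:
       """Illustrative grouping of the documented parameters.

       The real settings class in the codebase may be structured
       differently; the field names simply mirror the documentation.
       """

       camera_hz: float = 10.0          # camera publishing rate (Hz)
       lidar_hz: float = 10.0           # LiDAR publishing rate (Hz)
       camera_noise_std: float = 0.10   # camera position noise std (m)
       lidar_noise_std: float = 0.05    # LiDAR position noise std (m)
       camera_delay_mean: float = 0.05  # mean camera delay (s)
       camera_delay_std: float = 0.01   # camera delay std (s)
       lidar_delay_mean: float = 0.02   # mean LiDAR delay (s)
       lidar_delay_std: float = 0.005   # LiDAR delay std (s)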
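
The noise model and color mapping described above can be reproduced with plain NumPy. This sketch assumes only what the documentation states (independent zero-mean Gaussian noise on x and y, color codes 2, 1, and -1); the function and variable names are hypothetical:

.. code-block:: python

   import numpy as np

   # Color codes from the mapping above.
   COLOR_BLUE = 2      # camera: blue cones
   COLOR_YELLOW = 1    # camera: yellow cones
   COLOR_UNKNOWN = -1  # LiDAR: no color information


   def add_gaussian_noise(cones_xy: np.ndarray, noise_std: float,
                          rng: np.random.Generator) -> np.ndarray:
       """Add independent zero-mean Gaussian noise to x and y (shape (N, 2))."""
       return cones_xy + rng.normal(0.0, noise_std, size=cones_xy.shape)


   rng = np.random.default_rng(42)
   blue_xy = np.array([[5.0, 1.5], [8.0, 1.6]])      # ground-truth blue cones
   yellow_xy = np.array([[5.0, -1.5], [8.0, -1.6]])  # ground-truth yellow cones
   all_xy = np.vstack([blue_xy, yellow_xy])

   # Simulated camera detections keep per-cone color codes.
   camera_xy = add_gaussian_noise(all_xy, noise_std=0.10, rng=rng)
   camera_colors = np.array([COLOR_BLUE] * len(blue_xy)
                            + [COLOR_YELLOW] * len(yellow_xy))

   # Simulated LiDAR detections carry no color information.
   lidar_xy = add_gaussian_noise(all_xy, noise_std=0.05, rng=rng)
   lidar_colors = np.full(len(lidar_xy), COLOR_UNKNOWN)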
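
The publishing-rate and delay parameters can be thought of as a timer period plus a Gaussian latency sample. The sketch below is one assumed interpretation of how those parameters interact and is not taken from the node itself:

.. code-block:: python

   import random


   def sample_publish_delay(delay_mean: float, delay_std: float) -> float:
       """Sample a non-negative publishing delay from a Gaussian distribution."""
       return max(0.0, random.gauss(delay_mean, delay_std))


   camera_period = 1.0 / 10.0  # seconds between frames, from camera_hz
   camera_delay = sample_publish_delay(delay_mean=0.05, delay_std=0.01)

   # A detection captured at time t would be published at roughly
   # t + camera_delay, with new frames produced every camera_period seconds.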