AGRARSENSE Sensors

Documentation

  • Getting Started
  • Using the Simulator
  • Using Vehicles
  • Using Walkers
  • Sensors
  • Setting up sensors
  • Creating own levels
  • Building from source on Linux
  • Building from source on Windows
  • Installing on Linux
  • Installing on Windows

General Information

Each sensor publishes its data, with its own data type, to its own ROS topic. The ROS topic name is the same as the sensor ID. You can list all active ROS topics from a terminal with the command below.

rostopic list
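
With ROS 2, the equivalent command is:

ros2 topic list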

All sensors in the AGRARSENSE simulator are implemented and controlled in C++.

Transform sensor

Transform sensor is created automatically for each Vehicle.

Transform sensor runs entirely on the CPU and adds minor CPU load.

Transform sensor publishes its Transform component (translation and rotation) data to its own ROS topic.
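
As a minimal sketch of inspecting a sensor's stream, assuming a transform sensor with ID "forwardertransform" and per-sensor topics living under /Agrarsense/Sensors/ like the combined lidar topic described below (both are assumptions, not confirmed on this page):

# Hypothetical sensor ID and namespace; check `rostopic list` for the real topic name
rostopic echo /Agrarsense/Sensors/forwardertransform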

Collision sensor

Collision sensor is created automatically for each Vehicle.

Collision sensor runs entirely on the CPU and adds minor CPU load.

Collision sensor publishes its collision data to its own ROS topic if ROS is connected.

Overlap sensor

Overlap sensor is created automatically for each Vehicle. Overlap sensor runs entirely on the CPU and adds minor CPU load.

This sensor can be used to detect when a walker or another vehicle enters or exits the vehicle's overlap bounds, i.e. the vehicle's safe-zone area. The sensor is configured so that only Walkers and other Vehicles are detected when they enter the overlap area. The output of the sensor is a simple string ROS message with the following format:

Overlap starts: "Overlap begin with: [actor name] ID: [actor ID]"

Overlap ends: "Overlap ends with: [actor name] ID: [actor ID]"

You can visualize the bounds of all overlap sensors with the following ROS command. The visualization is only visible with the Spectator camera. See Using the Simulator for how to use ROS commands.

# Visualize all overlap sensor bounds
rostopic pub /Agrarsense/Commands std_msgs/String "visoverlapbounds true" --once
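
Presumably the same command with false hides the bounds again (assumed from the true/false toggle above; not confirmed on this page):

# Hide overlap sensor bounds (assumed inverse of the command above)
rostopic pub /Agrarsense/Commands std_msgs/String "visoverlapbounds false" --once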

You can also change the size of the bounds and their relative offset with the following ROS commands, where "forwarderoverlap" is the ID of the sensor.

# Change overlap bounds
rostopic pub /Agrarsense/Commands std_msgs/String "changeoverlapbounds forwarderoverlap 500,500,500" --once
# Change overlap relative position
rostopic pub /Agrarsense/Commands std_msgs/String "changeoverlapposition forwarderoverlap 0,0,0" --once

Radar sensor

An Unreal linetrace-based Radar sensor.

Radar sensor runs entirely on the CPU and adds minor to medium CPU load.

Radar sensor parameters:

Parameter Type Default Description
Range float 100.0 Radar maximum range in meters.
HorizontalFOV float 30.0 Radar horizontal field of view in degrees.
VerticalFOV float 30.0 Radar vertical field of view in degrees.
PointsPerSecond int 1500 Number of points per second to send from the radar.
SendDataToRos bool true Indicates whether to send radar data to ROS.
VisualizeRadarHits bool false Indicates whether to visualize radar hit locations.
DebugLines bool false Indicates whether to show radar debug lines.

Lidar sensor

An Unreal linetrace-based Lidar sensor.

Lidar sensor runs entirely on the CPU and adds medium to high CPU load, plus minor GPU load if VisualizePointcloud is set to true.

Depending on the sensor parameters and the PC's CPU, the simulator should be able to handle 3+ Lidar sensors at the same time without significant performance degradation.

The Lidar point cloud visualization (an Unreal Niagara particle system) only shows up in the Spectator camera; it is hidden from all Camera sensors by default. If you don't need the point cloud visualization at all, turning it off can save roughly 1-2 ms of processing per frame, since data conversion and particle rendering can be skipped.

If ROS is set up correctly and active, each Lidar sensor creates its own ROS topic and publishes point cloud data to it at the Lidar's RotationFrequency.

The simulator also creates a secondary ROS topic, Agrarsense/Sensors/lidars_combined, where the combined point cloud from all Lidar sensors is published. A Lidar sensor can be excluded from this topic by setting SendDataToCombinedROSTopic to false.
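
For example, you can inspect the combined cloud's message headers without printing the full point arrays (--noarr is a standard rostopic echo option):

# Echo PointCloud2 messages from the combined topic, hiding the large data arrays
rostopic echo --noarr /Agrarsense/Sensors/lidars_combined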

UseLidarNoiseModel depends on the simulator's Weather settings. The noise model is only active when the following weather conditions are met:

  • Temperature is below zero.
  • Precipitation exceeds zero.

Lidar Sensor parameters:

Parameter Type Default Description
Semantic bool false Whether this lidar is Semantic. When this is true, each lidar hit is colored based on hit object. See labels and colors below.
Channels int32 32 Number of channels.
Range float 120.0f Lidar linetrace max range in meters.
PointsPerSecond int32 655360 Number of points per second.
RotationFrequency float 10.0f Lidar rotation frequency.
UpperFovLimit float 22.5f Upper laser angle in degrees, relative to the horizontal line. (Positive values mean above horizontal.)
LowerFovLimit float -22.5f Lower laser angle in degrees, relative to the horizontal line. (Negative values mean below horizontal.)
HorizontalFov float 360.0f Horizontal field of view in degrees.
UseTerrainSnowHitAdjustment bool true Indicates whether to use linetrace terrain snow hit adjustment in the Lidar processing (simulate if Lidar linetrace "hits" snow layer).
SendDataAtRotationFrequency bool true Indicates whether to send data to the ROS topic at the specified RotationFrequency (true) or every frame (false).
SendDataToCombinedROSTopic bool true Indicates whether to send this lidar data to a combined ROS topic.
SaveDataToDisk bool false Indicates whether to save this lidar data to disk in .ply format. The point cloud will be saved to the SIMULATION_ROOT/Data directory.
SendDataToROS bool true Indicates whether to send this lidar data to a ROS topic.
VisualizePointcloud bool true Indicates whether to visualize this Lidar sensor with a Niagara particle system.
UseLidarNoiseModel bool false Should the Lidar noise model be used. Simulation Weather parameters (Temperature, precipitation particle size) affect the lidar noise model formula.
SavePointcloudWithoutNoiseModel bool false Should the point cloud also be saved without the lidar noise model applied. Note: both UseLidarNoiseModel and SaveDataToDisk need to be true.

Saved pointcloud visualized in CloudCompare.

If the Semantic parameter is enabled, the linetrace checks the hit component and adjusts the point color based on the hit object. Lidar semantic colors are as follows:

Label Color
None 0, 0, 0
Other 255, 255, 255
Terrain 192, 96, 192
Prop 128, 64, 128
Pedestrian 220, 20, 60
Animal 255, 0, 255
Vehicle 0, 0, 142
Foliage 107, 142, 35
Birch 0, 255, 0
Pine 0, 128, 0
Spruce 0, 192, 0
Alder 0, 255, 128
Willow 0, 255, 255
Snowflake 255, 255, 0
Road 169, 169, 169
Building 0, 0, 255

Saved semantic pointcloud visualized in CloudCompare.

Camera sensors

Cameras are by far the most performance-heavy sensors in the simulator. Each Camera sensor adds significant GPU load and minor to medium CPU load.

Each Camera sensor creates its own custom Unreal window.

Camera sensor parameters:

Parameter Type Default Description
PostProcessingEffects bool true Enables or disables Post Process effects.
Enable16BitFormat bool true Enables 16-bit format.
FOV float 90.0 Camera Field of View in degrees.
TargetGamma float 2.2 Camera gamma value.
ShutterSpeed float 60.0 Camera shutter speed.
ISO float 100.0 Camera ISO value.
Width int32 800 Camera resolution width.
Height int32 600 Camera resolution height.
FocalDistance float 0.0 Distance in which the Depth of Field effect should be sharp, in centimeters.
DepthBlurAmount float 1.0 Depth blur amount for 50%.
DepthBlurRadius float 0.0 Depth blur radius in pixels at 1920x resolution.
DofMinFStop float 1.2 Defines the minimum opening of the camera lens to control the curvature of the diaphragm. Set it to 0 to get straight blades.
DofBladeCount int32 5 Defines the number of blades of the diaphragm within the lens (between 4 and 16).
FilmSlope float 0.88 Film slope.
FilmToe float 0.55 Film toe.
FilmShoulder float 0.26 Film shoulder.
FilmBlackClip float 0.0 Film black clip.
FilmWhiteClip float 0.04 Film white clip.
ExposureMinBrightness float -10.0 Auto-Exposure minimum adaptation. Eye Adaptation is disabled if Min = Max.
ExposureMaxBrightness float 20.0 Auto-Exposure maximum adaptation. Eye Adaptation is disabled if Min = Max.
ExposureSpeedUp float 3.0 Auto-Exposure speed-up in F-stops per second (should be greater than 0).
ExposureSpeedDown float 1.0 Auto-Exposure speed-down in F-stops per second (should be greater than 0).
MotionBlurIntensity float 0.5 Motion Blur intensity (0: off).
MotionBlurMax float 5.0 Maximum distortion caused by motion blur, in percent of the screen width (0: off).
MotionBlurMinObjSize float 0.0 Minimum projected screen radius for a primitive to be drawn in the velocity pass, as a percentage of the screen width (default: 0%).
LensFlareIntensity float 1.0 Brightness scale of image-caused lens flares (linear).
BloomIntensity float 0.675 Multiplier for all bloom contributions (0: off, >1: brighter).
WhiteTemp float 6500.0 White temperature.
WhiteTint float 0.0 White tint.
ChromAberrIntensity float 0.0 Scene chromatic aberration / color fringe intensity (in percentage).
ChromAberrOffset float 0.0 Normalized distance to the center of the framebuffer where the chromatic aberration effect takes place.
Aperture float 4.0 Defines the opening of the camera lens (aperture is 1/f-stop, larger numbers reduce the depth of field effect, default = 4.0).
SaveImageToDisk bool false Indicates whether to save sensor data to disk.
SendDataToROS bool true Indicates whether to send this camera data to a ROS topic.
TargetFrameRate float 0.0 Camera sensor target frame rate (0.0 means every simulation frame, 30.0 means 30 frames per second, etc.). Targeting does not guarantee the simulation runs at the specified frame rate.
UsePhysicLensDistortionEffect bool true Should the camera physics lens distortion effect be used.
UseIceLensEffect bool false Should the camera ice lens effect be used. Simulation Weather parameters do not affect this; IceLensEffectStrength and IceLensEffectAngle do.
IceLensEffectStrength float 0.3 Camera ice lens effect strength.
IceLensEffectAngle float 1.0 Camera ice lens effect angle.

Note: Camera sensors should not see Lidar sensor particles even if the Lidar sensor's VisualizePointcloud is set to true.

RGB camera

The RGB camera acts as a regular camera capturing images from the scene.

See the Camera sensor parameters above.

Depth camera

The Depth Camera provides raw data of the scene, encoding the distance from each pixel to the camera to create a depth map of the elements.

Depth Camera parameters:

Parameter Type Default Description
ConvertToGrayscale bool false Should image be converted to grayscale.
CameraParameters FCameraParameters Default Camera parameters Camera parameters.

Depth camera image where ConvertToGrayscale is set to false.

Depth camera image where ConvertToGrayscale is set to true.

DVS camera

A Dynamic Vision Sensor (DVS) or event camera is a sensor that works differently from a conventional camera. Instead of capturing images at a fixed rate, event cameras measure changes of intensity.

The DVS camera sends both a converted image to a ROS Image topic and the raw DVS camera data as a PointCloud2 message.

DVS Camera parameters:

Parameter Type Default Description
PositiveThreshold float 0.3 Positive threshold C associated with an increment in brightness change (0-1).
NegativeThreshold float 0.3 Negative threshold C associated with a decrement in brightness change (0-1).
SigmaPositiveThreshold float 0.0 White noise standard deviation for positive events (0-1).
SigmaNegativeThreshold float 0.0 White noise standard deviation for negative events (0-1).
RefractoryPeriodNs int32 0 Refractory period in nanoseconds. It limits the highest frequency of triggering events.
UseLog bool true Indicates whether to work in the logarithmic intensity scale.
LogEps float 0.001 Epsilon value used to convert images to the log scale.
SortByIncreasingTimestamp bool true Should DVS events be ordered by increasing timestamp. Setting this to false will improve performance.
ParallelImageConversion bool true Indicates whether to use ParallelFor for the DVS camera image grayscale conversion.
ParallelSimulation bool true Indicates whether to use ParallelFor for the DVS camera simulation.
VisualizeDVSCamera bool true Indicates whether to visualize DVS camera output in a separate window.
CameraParameters FCameraParameters Default Camera parameters Camera parameters.

Semantic segmentation camera

This camera classifies every object in the view by displaying it in a different color according to its stencil value.

Semantic camera colors are as follows:

Label Description Vector4 value
None Transparent or Black (0, 0, 0, 1.0)
Terrain Earthy Brown (0.4, 0.26, 0.13, 1.0)
Props Faded Red (0.5, 0.06, 0.06, 1.0)
Human Skin tone (0.96, 0.8, 0.69, 1.0)
Reindeer Bright Blue (0.0, 0.0, 1.0, 1.0)
Foliage Leaf Green (0.3, 0.69, 0.31, 1.0)
Birch Vibrant Yellow (1.0, 0.78, 0.0, 1.0)
Pine Dark Green (0.0, 0.39, 0.0, 1.0)
Spruce Deep Green (0.02, 0.49, 0.27, 1.0)
Alder Medium Green (0.4, 0.6, 0.4, 1.0)
Willow Yellow-Green (0.53, 0.75, 0.34, 1.0)
Rocks Vibrant Purple (0.31, 0.0, 1.0, 1.0)
Road Asphalt Grey (0.3, 0.3, 0.3, 1.0)
Building Brick Red (0.7, 0.3, 0.2, 1.0)
Sky Sky Blue (0.53, 0.81, 0.92, 1.0)
Water Water Blue (0.25, 0.46, 0.7, 1.0)
Drone Metallic Silver (0.75, 0.75, 0.75, 1.0)
Harvester Machinery Red (0.8, 0, 0, 1.0)
Forwarder Forest Green (0.13, 0.55, 0.13, 1.0)
Sensors Electric Blue (0.27, 0.39, 0.89, 1.0)
Snow Pure White (1, 1, 1, 1.0)
Leaves Autumn Orange (0.8, 0.4, 0.0, 1.0)

You can change the semantic colors by defining the colors in JSON format and passing the file path through ROS with the following command:

# ROS1
rostopic pub /Agrarsense/Commands std_msgs/String "changecolors C:/Agrarsense/Examples/ExampleJsonFiles/semantic_camera_colors.json" --once
# ROS2
ros2 topic pub /Agrarsense/Commands std_msgs/String "data: ChangeColors C:/Agrarsense/Examples/ExampleJsonFiles/semantic_camera_colors.json" --once
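
The JSON schema is not documented on this page; see the bundled semantic_camera_colors.json example above for the exact format. As a purely hypothetical sketch, such a file might map labels to RGBA values:

{
    "Terrain": [0.4, 0.26, 0.13, 1.0],
    "Road": [0.3, 0.3, 0.3, 1.0],
    "Snow": [1.0, 1.0, 1.0, 1.0]
}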

Thermal camera

The simulator offers a simplified thermal camera where animals such as reindeer are marked with warm colors and everything else with cold colors.

Under the hood this sensor works similarly to the semantic segmentation camera: it uses a post-processing material and the Unreal Engine stencil value to determine how to color objects, but with a different color table.

Thermal Camera parameters:

Parameter Type Default Description
WarmColor Vector4 1.0, 0.0, 0.0, 1.0 Warm color.
WarmColor2 Vector4 1.0, 0.55, 0.0, 1.0 Second warm color.
ColdColor Vector4 0.0, 0.07, 0.36, 1.0 Cold color.
ColdColor2 Vector4 0.0, 0.11, 1.0, 1.0 Second cold color.
AllowCustomNoiseResolution bool false Should a custom noise resolution be allowed.
WidthResolutionNoise int32 1280 Width noise resolution; only applies if AllowCustomNoiseResolution is true.
HeightResolutionNoise int32 720 Height noise resolution; only applies if AllowCustomNoiseResolution is true.
CameraParameters FCameraParameters Default Camera parameters Camera parameters.
