AGRARSENSE Sensors

General Information

Each sensor publishes its data to its own ROS topic, using its own message type. The ROS topic name is the same as the Sensor ID. You can query all active ROS topics from a terminal with the command below.

rostopic list
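
If you are using ROS2, the equivalent command is:

# ROS2
ros2 topic list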

All sensors in the AGRARSENSE simulator are implemented and controlled in C++.

Transform Sensor

The Transform Sensor is automatically created and attached to each Vehicle. It operates entirely on the CPU with minimal impact on overall CPU load.

This sensor remains invisible in all UIs, cannot be deleted, and will automatically be destroyed when the Vehicle is removed.

Note: This topic is for information only. The simulator does not subscribe to it. To control the Vehicle through ROS see Using Vehicles.

This sensor publishes two types of messages:

  1. Parent Actor Transform
    Publishes the Transform component data of the parent Actor, including Translation and Rotation information, using the geometry_msgs/Transform message format.

    # Transform topic naming
    # /Agrarsense/Sensors/ID/transform
    # For example
    /Agrarsense/Sensors/forwarder/transform
    /Agrarsense/Sensors/harvester/transform
    /Agrarsense/Sensors/drone/transform
    # If you echo the topic the data should look like this:
    translation:
      x: 85487.27554432102
      y: 70866.33801191082
      z: -2485.388920965257
    rotation:
      x: 0.03496819132314945
      y: -0.007906817838893326
      z: 0.0065712834606426775
      w: 0.9993355422784332
  2. GPS Coordinates
    Publishes the GPS data in std_msgs/String format, structured as a comma-separated string in the following order:

    Latitude, Longitude, Altitude

    Note: GPS coordinates are only published in maps that have been georeferenced using the Unreal Engine Georeferencing plugin (the Suomi and Vindeln maps).

    # GPS topic naming
    # /Agrarsense/Sensors/ID/gnss
    # For example
    /Agrarsense/Sensors/forwarder/gnss
    /Agrarsense/Sensors/harvester/gnss
    /Agrarsense/Sensors/drone/gnss
    # If you echo the topic the data should look like this:
    64.271063, 19.777301, 291.649881
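
To inspect these topics live, you can echo them from a terminal (the forwarder sensor ID from the examples above is assumed):

# ROS1
rostopic echo /Agrarsense/Sensors/forwarder/transform
rostopic echo /Agrarsense/Sensors/forwarder/gnss
# ROS2
ros2 topic echo /Agrarsense/Sensors/forwarder/gnss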

Collision sensor

The Collision sensor is created automatically for each Vehicle. It runs entirely on the CPU and adds minor CPU load.

The Collision sensor publishes its collision data to its own ROS topic if ROS is connected.


Overlap sensor

The Overlap sensor is created automatically for each Vehicle. It runs entirely on the CPU and adds minor CPU load.

This sensor remains invisible in all UIs, cannot be deleted, and will automatically be destroyed when the Vehicle is removed.

This sensor can be used to detect when a Walker or another Vehicle enters or exits the vehicle's overlap bounds, i.e. the vehicle's safe zone area. The sensor is configured so that only Walkers and other Vehicles are detected when they enter the overlap area.

The output of the sensor is a simple string ROS message with the following format:

# Overlap starts:
"Overlap begin with: [actor name] ID: [actor ID]"
# Overlap ends:
"Overlap ends with: [actor name] ID: [actor ID]"

You can visualize the bounds of all Overlap sensors with the following ROS command. This visualization is only visible with the Spectator camera. See Using the Simulator for how to use ROS commands.

# Visualize all overlap sensor bounds
rostopic pub /Agrarsense/Commands std_msgs/String "visoverlapbounds true" --once
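# The visualization can be turned off again by passing false (assumed boolean toggle)
rostopic pub /Agrarsense/Commands std_msgs/String "visoverlapbounds false" --once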

You can also change the bounds area and relative offset with the following ROS commands, where "forwarderoverlap" is the ID of the sensor.

# Change overlap bounds
rostopic pub /Agrarsense/Commands std_msgs/String "changeoverlapbounds forwarderoverlap 500,500,500" --once
# Change overlap relative position
rostopic pub /Agrarsense/Commands std_msgs/String "changeoverlapposition forwarderoverlap 0,0,0" --once

Radar sensor

An Unreal linetrace-based Radar sensor.

The Radar sensor runs entirely on the CPU and adds minor to medium CPU load.

Radar sensor parameters:

| Parameter | Type | Default | Description |
| --- | --- | --- | --- |
| Range | float | 100.0 | Radar maximum range in meters. |
| HorizontalFOV | float | 30.0 | Radar horizontal field of view in degrees. |
| VerticalFOV | float | 30.0 | Radar vertical field of view in degrees. |
| PointsPerSecond | int | 1500 | Number of points per second to send from the radar. |
| SendDataToRos | bool | true | Indicates whether to send radar data to ROS. |
| VisualizeRadarHits | bool | false | Indicates whether to visualize radar hit locations. |
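
By analogy with the RGB camera world-spawn example later in this document, a Radar sensor could be spawned with a JSON sketch like the one below. This is a minimal sketch only: the "Radar" model string and the lowerCamelCase parameter keys are assumptions, so check the example .json files in Examples/ExampleJsonFiles for the exact schema.

{
  "objects": [
    {
      "type": "sensor",
      "model": "Radar",
      "id": "radar",
      "spawnPoint":
      {
        "x": 0.0,
        "y": 0.0,
        "z": 200.0,
        "yaw": 0.0,
        "pitch": 0.0,
        "roll": 0.0
      },
      "parameters":
      {
        "range": 100.0,
        "horizontalFOV": 30.0,
        "verticalFOV": 30.0,
        "pointsPerSecond": 1500,
        "sendDataToRos": true,
        "visualizeRadarHits": false
      }
    }
  ]
}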

Lidar sensor

An Unreal linetrace-based Lidar sensor.

The Lidar sensor runs entirely on the CPU and adds medium to high CPU load, plus minor GPU load if VisualizePointcloud is set to true.

Depending on the sensor parameters and the PC's CPU, the simulator should be able to handle 3+ Lidar sensors at the same time without significant performance degradation.

Lidar sensor point cloud visualization (Unreal Niagara particle system) only shows up on Spectator camera.

For other cameras, this visualization is hidden by default. For Lidar sensors with a high PointsPerSecond value, it is recommended to set VisualizePointcloud to false to save performance. Also note that each point cloud visualization lasts for 1 second, so multiple point clouds may overlap.

If ROS is set up correctly and active, each Lidar sensor creates its own ROS topic and publishes its point cloud data to that topic.

The simulator also creates a secondary ROS topic, Agrarsense/Sensors/lidars_combined, where the combined point cloud from all Lidar sensors is sent. A Lidar sensor can be excluded from this topic by setting SendDataToCombinedROSTopic to false.
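
To inspect the combined topic (the leading slash is assumed, matching the other topic names in this document):

# ROS1
rostopic echo /Agrarsense/Sensors/lidars_combined
# ROS2
ros2 topic echo /Agrarsense/Sensors/lidars_combined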

UseLidarNoiseModel depends on the simulator Weather settings. The noise model is only active when the following Weather conditions are met:

  • Temperature is below zero.
  • Precipitation is higher than zero.

UseLidarNoiseModel refers to a specific noise model developed by LUAS. DistanceNoiseStdDev and LateralNoiseStdDev add additional noise to the linetrace results.

When SavePointcloudWithoutNoiseModel is set to true, a point cloud without noise is only saved when DistanceNoiseStdDev is higher than 0.0, LateralNoiseStdDev is higher than 0.0, or UseLidarNoiseModel is set to true and the simulation weather parameters allow the lidar noise model formula to be used.

When SendDataAtRotationFrequency is set to true, the sensor simulates a full point cloud at the specified RotationFrequency. For example, at 10 Hz the sensor generates 10 point clouds per simulation second; with the default PointsPerSecond of 655360, each full point cloud then contains 65536 points. If this is set to false, a part of the point cloud is sent/saved to ROS/disk every simulation frame. The sensor is also optimized for the case where SendDataAtRotationFrequency is true.

Note: SaveDataToDisk works asynchronously, so data is not guaranteed to be saved in the same frame.

Lidar Sensor parameters:

| Parameter | Type | Default | Description |
| --- | --- | --- | --- |
| Channels | int32 | 32 | Number of channels. |
| Range | float | 120.0f | Lidar max range in meters. |
| PointsPerSecond | int32 | 655360 | Number of points per second. |
| RotationFrequency | float | 10.0f | Sensor rotation frequency / frame rate. |
| UpperFovLimit | float | 22.5f | Angle in degrees of the highest laser. |
| LowerFovLimit | float | -22.5f | Angle in degrees of the lowest laser. |
| HorizontalFov | float | 360.0f | Horizontal field of view in degrees. |
| DistanceNoiseStdDev | float | 0.00f | Controls the amount of variability or "spread" in the distance measurements due to noise. A higher value means more noise. |
| LateralNoiseStdDev | float | 0.00f | Controls the amount of lateral noise applied perpendicular to the linetrace, randomly in either the horizontal or vertical direction. |
| Semantic | bool | false | Whether this lidar is semantic. When true, each lidar hit is colored based on the hit object. See labels and colors below. |
| UseComplexCollisionTrace | bool | true | Whether to use complex mesh collision traces. Setting this to false is more performant but makes linetraces use simple collision. |
| UseTerrainSnowHitAdjustment | bool | true | Whether to use linetrace terrain snow hit adjustment in the Lidar processing (simulate the Lidar linetrace "hitting" the snow layer). |
| SendDataAtRotationFrequency | bool | true | Whether to send/save data at the specified RotationFrequency rather than every simulation frame. |
| SendDataToCombinedROSTopic | bool | true | Whether to send this lidar's data to the combined ROS topic. |
| SaveDataToDisk | bool | false | Whether to save this lidar's data to disk in .ply format. Point clouds are saved to the SIMULATION_ROOT/Data directory. |
| SendDataToROS | bool | true | Whether to send this lidar's data to a ROS topic. |
| VisualizePointcloud | bool | true | Whether to visualize this Lidar sensor with a Niagara particle system. |
| UseLidarNoiseModel | bool | false | Whether to use the LUAS snowfall Lidar noise model. Weather parameters (temperature, precipitation particle size) affect the noise model formula. |
| SavePointcloudWithoutNoiseModel | bool | false | Whether to also save the point cloud without the lidar noise model applied. Note: both UseLidarNoiseModel and SaveDataToDisk need to be true. |

Saved point cloud visualized in the CloudCompare software.

Image

If the Semantic parameter is enabled, the linetrace checks the hit component and adjusts the point color based on the hit object. Lidar semantic colors are as follows:

| Label | Color (R, G, B) |
| --- | --- |
| None | 0, 0, 0 |
| Other | 255, 255, 255 |
| Terrain | 192, 96, 192 |
| Prop | 128, 64, 128 |
| Pedestrian | 220, 20, 60 |
| Animal | 255, 0, 255 |
| Vehicle | 0, 0, 142 |
| Foliage | 107, 142, 35 |
| Birch | 0, 255, 0 |
| Pine | 0, 128, 0 |
| Spruce | 0, 192, 0 |
| Alder | 0, 255, 128 |
| Willow | 0, 255, 255 |
| Snowflake | 255, 255, 0 |
| Road | 169, 169, 169 |
| Building | 0, 0, 255 |

Saved semantic point cloud visualized in the CloudCompare software.

Image

Camera sensors

Cameras are by far the most performance-heavy sensors in the simulator. Each Camera sensor adds significant GPU load and minor to medium CPU load.

Each Camera sensor creates its own custom Unreal window where the camera view is rendered.

If you are using the Lapland UAS fork of Unreal Engine, you can minimize the window and it will continue rendering the camera view. If you are using Unreal Engine from the Epic Games launcher, minimizing the window is disabled because the camera would stop rendering; stock Unreal Engine stops painting a window when it is minimized. You can also move and resize the window as you like. This does not affect the camera resolution or output when saving images to disk or sending image data to ROS; only the preview window image may stretch incorrectly. Closing the custom camera window destroys the sensor and its ROS topic.

All Camera sensors share the same base camera parameters, but each individual camera can also have parameters of its own. For example, the Depth camera has its own depth camera parameters in addition to the base camera parameters.

Base camera parameters:

| Parameter | Type | Default | Description |
| --- | --- | --- | --- |
| PostProcessingEffects | bool | true | Enables or disables post-process effects. |
| Enable16BitFormat | bool | true | Enables 16-bit format. |
| FOV | float | 90.0 | Camera field of view in degrees. |
| TargetGamma | float | 2.2 | Camera gamma value. |
| ShutterSpeed | float | 60.0 | Camera shutter speed. |
| ISO | float | 100.0 | Camera ISO value. |
| Width | int32 | 800 | Camera resolution width. |
| Height | int32 | 600 | Camera resolution height. |
| FocalDistance | float | 0.0 | Distance at which the Depth of Field effect should be sharp, in centimeters. |
| DepthBlurAmount | float | 1.0 | Depth blur amount for 50%. |
| DepthBlurRadius | float | 0.0 | Depth blur radius in pixels at 1920x resolution. |
| DofMinFStop | float | 1.2 | Defines the minimum opening of the camera lens to control the curvature of the diaphragm. Set it to 0 to get straight blades. |
| DofBladeCount | int32 | 5 | Defines the number of blades of the diaphragm within the lens (between 4 and 16). |
| FilmSlope | float | 0.88 | Film slope. |
| FilmToe | float | 0.55 | Film toe. |
| FilmShoulder | float | 0.26 | Film shoulder. |
| FilmBlackClip | float | 0.0 | Film black clip. |
| FilmWhiteClip | float | 0.04 | Film white clip. |
| ExposureMinBrightness | float | -10.0 | Auto-exposure minimum adaptation. Eye adaptation is disabled if Min = Max. |
| ExposureMaxBrightness | float | 20.0 | Auto-exposure maximum adaptation. Eye adaptation is disabled if Min = Max. |
| ExposureSpeedUp | float | 10.0 | Auto-exposure speed-up in F-stops per second (should be greater than 0). |
| ExposureSpeedDown | float | 1.0 | Auto-exposure speed-down in F-stops per second (should be greater than 0). |
| MotionBlurIntensity | float | 0.5 | Motion blur intensity (0: off). |
| MotionBlurMax | float | 5.0 | Maximum distortion caused by motion blur, in percent of the screen width (0: off). |
| MotionBlurMinObjSize | float | 0.0 | Minimum projected screen radius for a primitive to be drawn in the velocity pass, as a percentage of the screen width (default: 0%). |
| LensFlareIntensity | float | 1.0 | Brightness scale of image-caused lens flares (linear). |
| BloomIntensity | float | 0.675 | Multiplier for all bloom contributions (0: off, >1: brighter). |
| WhiteTemp | float | 6500.0 | White temperature. |
| WhiteTint | float | 0.0 | White tint. |
| ChromAberrIntensity | float | 0.0 | Scene chromatic aberration / color fringe intensity (in percent). |
| ChromAberrOffset | float | 0.0 | Normalized distance to the center of the framebuffer where the chromatic aberration effect takes place. |
| Aperture | float | 4.0 | Defines the opening of the camera lens (aperture is 1/f-stop; larger numbers reduce the depth of field effect; default = 4.0). |
| SaveImageToDisk | bool | false | Indicates whether to save sensor data to disk. |
| SendDataToROS | bool | true | Indicates whether to send this camera's data to a ROS topic. |
| TargetFrameRate | float | 0.0 | Camera sensor target frame rate. 0.0 means every simulation frame, 30.0 means 30 frames per second, etc. Targeting does not guarantee the simulation runs at the specified frame rate. |
| UsePhysicLensDistortionEffect | bool | true | Whether to use the camera physics lens distortion effect. |
| UseIceLensEffect | bool | false | Whether to use the camera ice lens effect. Simulation Weather parameters do not affect this; IceLensEffectStrength and IceLensEffectAngle do. |
| IceLensEffectStrength | float | 0.3 | Camera ice lens effect strength. |
| IceLensEffectAngle | float | 1.0 | Camera ice lens effect angle. |

Note 1: SaveImageToDisk works asynchronously, so images are not guaranteed to be saved in the same frame. When SaveImageToDisk is set to true, a separate camera_metadata.txt file is also created in the same save location. This text file contains each image's metadata in the following format: image_name, X location, Y location, Z location, yaw rotation, pitch rotation, roll rotation

Note 2. Camera sensors should not see Lidar sensor particles even if Lidar sensor VisualizePointcloud is set to true. This is intended behaviour.

RGB camera

The RGB camera acts as a regular camera, capturing images of the scene.

This camera only has the base camera parameters. See the table above for the parameter list.

Output image:

Image

Depth camera

The Depth Camera provides raw data of the scene, encoding the distance of each pixel to the camera, to create a depth map of the elements.

Depth Camera parameters:

| Parameter | Type | Default | Description |
| --- | --- | --- | --- |
| ConvertToGrayscale | bool | false | Whether the image should be converted to grayscale. |
| CameraParameters | FCameraParameters | Default camera parameters | Base camera parameters. |

Depth camera image where ConvertToGrayscale is set to false.

Output image:

Image

Depth camera image where ConvertToGrayscale is set to true.

Output image:

Image

DVS camera

A Dynamic Vision Sensor (DVS), or event camera, is a sensor that works differently from a conventional camera. Instead of capturing images at a fixed rate, an event camera measures changes in intensity.

The DVS camera sends both a converted image to a ROS Image topic and the raw DVS camera data as a PointCloud2 message.

Note: When VisualizeDVSCamera is set to false, the sensor creates a single camera window without DVS visualization. When set to true, it opens a separate window to display the DVS camera output. The need for two windows arises from how the camera sensor reads pixel output: reading and manipulating the pixels and visualizing the output in the same window causes flickering.

DVS Camera parameters:

| Parameter | Type | Default | Description |
| --- | --- | --- | --- |
| PositiveThreshold | float | 0.3 | Positive threshold C associated with an increment in brightness change (0-1). |
| NegativeThreshold | float | 0.3 | Negative threshold C associated with a decrement in brightness change (0-1). |
| SigmaPositiveThreshold | float | 0.0 | White noise standard deviation for positive events (0-1). |
| SigmaNegativeThreshold | float | 0.0 | White noise standard deviation for negative events (0-1). |
| RefractoryPeriodNs | int32 | 0 | Refractory period in nanoseconds. It limits the highest frequency of triggering events. |
| UseLog | bool | true | Indicates whether to work in the logarithmic intensity scale. |
| LogEps | float | 0.001 | Epsilon value used to convert images to the log scale. |
| SortByIncreasingTimestamp | bool | true | Whether DVS events should be ordered by increasing timestamp. Setting this to false will improve performance. |
| ParallelImageConversion | bool | true | Indicates whether to use ParallelFor for the DVS camera image grayscale conversion. |
| ParallelSimulation | bool | true | Indicates whether to use ParallelFor for the DVS camera simulation. |
| VisualizeDVSCamera | bool | true | Indicates whether to visualize DVS camera output in a separate window. |
| CameraParameters | FCameraParameters | Default camera parameters | Base camera parameters. |

Output image:

Image

Semantic segmentation camera

This camera classifies every object in view by displaying it in a different color according to its assigned stencil value.

Semantic camera colors are as follows:

| Label | Description | Vector4 value |
| --- | --- | --- |
| None | Transparent or Black | (0, 0, 0, 1.0) |
| Terrain | Earthy Brown | (0.4, 0.26, 0.13, 1.0) |
| Props | Faded Red | (0.5, 0.06, 0.06, 1.0) |
| Human | Skin tone | (0.96, 0.8, 0.69, 1.0) |
| Reindeer | Bright Blue | (0.0, 0.0, 1.0, 1.0) |
| Foliage | Leaf Green | (0.3, 0.69, 0.31, 1.0) |
| Birch | Vibrant Yellow | (1.0, 0.78, 0.0, 1.0) |
| Pine | Dark Green | (0.0, 0.39, 0.0, 1.0) |
| Spruce | Deep Green | (0.02, 0.49, 0.27, 1.0) |
| Alder | Medium Green | (0.4, 0.6, 0.4, 1.0) |
| Willow | Yellow-Green | (0.53, 0.75, 0.34, 1.0) |
| Rocks | Vibrant Purple | (0.31, 0.0, 1.0, 1.0) |
| Road | Asphalt Grey | (0.3, 0.3, 0.3, 1.0) |
| Building | Brick Red | (0.7, 0.3, 0.2, 1.0) |
| Sky | Sky Blue | (0.53, 0.81, 0.92, 1.0) |
| Water | Water Blue | (0.25, 0.46, 0.7, 1.0) |
| Drone | Metallic Silver | (0.75, 0.75, 0.75, 1.0) |
| Harvester | Machinery Red | (0.8, 0, 0, 1.0) |
| Forwarder | Forest Green | (0.13, 0.55, 0.13, 1.0) |
| Sensors | Electric Blue | (0.27, 0.39, 0.89, 1.0) |
| Snow | Pure White | (1, 1, 1, 1.0) |
| Leaves | Autumn Orange | (0.8, 0.4, 0.0, 1.0) |

Output image:

Image

You can change the semantic colors by defining colors in JSON format and passing the file through ROS with the following command:

# ROS1
rostopic pub /agrarsense/in/commands std_msgs/String "changecolors C:/Agrarsense/Examples/ExampleJsonFiles/semantic_camera_colors.json" --once
# ROS2
ros2 topic pub /agrarsense/in/commands std_msgs/String "data: ChangeColors C:/Agrarsense/Examples/ExampleJsonFiles/semantic_camera_colors.json" --once

Or by dragging and dropping the JSON file onto the simulation main window (drag and drop only works on the Windows platform).

Image

Thermal camera

The simulator offers a simplified thermal camera where animals such as reindeer are marked with warm colors and everything else with cold colors.

Under the hood this sensor works similarly to the semantic segmentation camera: it uses a post-processing material and the Unreal Engine stencil value to determine how to color objects, but with a different color table.

Thermal Camera parameters:

| Parameter | Type | Default | Description |
| --- | --- | --- | --- |
| WarmColor | Vector4 | 1.0, 0.0, 0.0, 1.0 | Warm color. |
| WarmColor2 | Vector4 | 1.0, 0.55, 0.0, 1.0 | Second warm color. |
| ColdColor | Vector4 | 0.0, 0.07, 0.36, 1.0 | Cold color. |
| ColdColor2 | Vector4 | 0.0, 0.11, 1.0, 1.0 | Second cold color. |
| AllowCustomNoiseResolution | bool | false | Whether to allow a custom noise resolution. |
| WidthResolutionNoise | int32 | 1280 | Width noise resolution; only applies if AllowCustomNoiseResolution is true. |
| HeightResolutionNoise | int32 | 720 | Height noise resolution; only applies if AllowCustomNoiseResolution is true. |
| CameraParameters | FCameraParameters | Default camera parameters | Base camera parameters. |

When AllowCustomNoiseResolution is set to true, you can modify WidthResolutionNoise and HeightResolutionNoise to create lower-quality output that better simulates different kinds of camera sensors.

Below is a GIF showing the thermal camera output, first with AllowCustomNoiseResolution set to false and then with AllowCustomNoiseResolution set to true and WidthResolutionNoise and HeightResolutionNoise set to 460 and 240.
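
For reference, that second configuration corresponds to a parameters sketch like the one below, assuming the same lowerCamelCase key convention used by the example JSON files elsewhere in this document (the exact key names are assumptions):

"parameters":
{
    "allowCustomNoiseResolution": true,
    "widthResolutionNoise": 460,
    "heightResolutionNoise": 240
}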

Output image:

Image

Spawning sensor

The AGRARSENSE Simulator offers multiple ways to spawn sensors.

Terminology:

  • World: Sensors are placed in a fixed position, unattached to any moving object.
  • Vehicle: Sensors are mounted to a vehicle, tracking its movement.
  • Spectator: Sensors are attached to the simulator's spectator actor, tracking its movement.
  • Garage: A dedicated environment where you can focus exclusively on configuring and attaching sensors to a vehicle.

Spawning Sensors to World or to Vehicles through Asset Placement

Open Asset Placement

Spawning Sensors to World through JSON

The simulator allows spawning sensors at a fixed position, unattached to any moving object.

You can spawn objects into the World through JSON in either of two ways:

A) Drag and drop JSON file(s) onto the Simulator window, like below.

Image

Note: Drag and drop only works on the Windows platform at the moment.

B) Spawn the JSON file through ROS.

Publish a message to the /agrarsense/in/commands topic in the following format: SpawnObjects FULL_PATH/YOUR_JSON_FILE.json

# ROS1
rostopic pub /agrarsense/in/commands std_msgs/String "SpawnObjects YOUR_FULL_PATH/YOUR_JSON_FILE.json" --once
# ROS2
ros2 topic pub /agrarsense/in/commands std_msgs/String "data: SpawnObjects YOUR_FULL_PATH/YOUR_JSON_FILE.json" --once
# For example
rostopic pub /agrarsense/in/commands std_msgs/String "SpawnObjects C:/Agrarsense/Examples/ExampleJsonFiles/Forwarder.json" --once
ros2 topic pub /agrarsense/in/commands std_msgs/String "data: SpawnObjects C:/Agrarsense/Examples/ExampleJsonFiles/Forwarder.json" --once

Example JSON file to spawn an RGB camera sensor. You can find more example .json files in the ROOT/Agrarsense/Examples/ExampleJsonFiles directory. All of these can be dragged and dropped onto the simulator window (Windows only) or spawned through ROS.

{
  "objects": [
    {
      "type": "sensor",
      "model": "RGBCamera",
      "id": "rgbcamera",
      "spawnPoint":
      {
        "x": 0.0,
        "y": 0.0,
        "z": 200.0,
        "yaw": 0.0,
        "pitch": 0.0,
        "roll": 0.0
      },
      "parameters":
      {
        "postProcessingEffects": true,
        "enable16BitFormat": true,
        "fov": 90.0,
        "targetGamma": 1.0,
        "shutterSpeed": 60.0,
        "iso": 100.0,
        "width": 1280,
        "height": 720,
        "focalDistance": 0.0,
        "depthBlurAmount": 1.0,
        "depthBlurRadius": 0.0,
        "dofMinFStop": 1.2,
        "dofBladeCount": 5,
        "filmSlope": 0.88,
        "filmToe": 0.55,
        "filmShoulder": 0.26,
        "filmBlackClip": 0.0,
        "filmWhiteClip": 0.04,
        "exposureMinBrightness": -10.0,
        "exposureMaxBrightness": 20.0,
        "exposureSpeedUp": 10.0,
        "exposureSpeedDown": 1.0,
        "motionBlurIntensity": 0.5,
        "motionBlurMax": 5.0,
        "motionBlurMinObjSize": 0.0,
        "lensFlareIntensity": 1.0,
        "bloomIntensity": 0.0,
        "whiteTemp": 6500.0,
        "whiteTint": 0.0,
        "chromAberrIntensity": 0.0,
        "chromAberrOffset": 0.0,
        "aperture": 4.0,
        "saveImageToDisk": false,
        "sendDataToROS": true,
        "targetFrameRate": 0.0,
        "usePhysicLensDistortionEffect": true,
        "useIceLensEffect": false,
        "iceLensEffectStrength": 0.3,
        "iceLensEffectAngle": 1.0
      }
    }
  ]
}
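
For example, assuming this file is saved as rgbcamera.json in the example directory, it could be spawned with:

# ROS1
rostopic pub /agrarsense/in/commands std_msgs/String "SpawnObjects C:/Agrarsense/Examples/ExampleJsonFiles/rgbcamera.json" --once
# ROS2
ros2 topic pub /agrarsense/in/commands std_msgs/String "data: SpawnObjects C:/Agrarsense/Examples/ExampleJsonFiles/rgbcamera.json" --once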

Spawning Sensors to Spectator camera through JSON

Spawning sensors to the spectator Actor can only be done through JSON.

The only difference compared to Spawning Sensors to World through JSON is the added "attachToSpectator": true field.

{
  "objects": [
    {
      "type": "sensor",
      "model": "RGBCamera",
      "id": "rgbcamera",
      "attachToSpectator": true,
      "parameters":
      {
        "postProcessingEffects": true,
        "enable16BitFormat": true,
        "fov": 90.0,
        "targetGamma": 1.0,
        "shutterSpeed": 60.0,
        "iso": 100.0,
        "width": 1280,
        "height": 720,
        "focalDistance": 0.0,
        "depthBlurAmount": 1.0,
        "depthBlurRadius": 0.0,
        "dofMinFStop": 1.2,
        "dofBladeCount": 5,
        "filmSlope": 0.88,
        "filmToe": 0.55,
        "filmShoulder": 0.26,
        "filmBlackClip": 0.0,
        "filmWhiteClip": 0.04,
        "exposureMinBrightness": -10.0,
        "exposureMaxBrightness": 20.0,
        "exposureSpeedUp": 10.0,
        "exposureSpeedDown": 1.0,
        "motionBlurIntensity": 0.5,
        "motionBlurMax": 5.0,
        "motionBlurMinObjSize": 0.0,
        "lensFlareIntensity": 1.0,
        "bloomIntensity": 0.0,
        "whiteTemp": 6500.0,
        "whiteTint": 0.0,
        "chromAberrIntensity": 0.0,
        "chromAberrOffset": 0.0,
        "aperture": 4.0,
        "saveImageToDisk": false,
        "sendDataToROS": true,
        "targetFrameRate": 0.0,
        "usePhysicLensDistortionEffect": true,
        "useIceLensEffect": false,
        "iceLensEffectStrength": 0.3,
        "iceLensEffectAngle": 1.0
      }
    }
  ]
}

Spawning Vehicle with sensors through JSON

To spawn a vehicle with sensor(s), you can use the following JSON structure.

{
  "objects": [
    {
      "type": "vehicle",
      "model": "Forwarder",
      "spawnPoint": {
        "x": 801.81213642853379,
        "y": -1998.7911836034978,
        "z": 4.6668885864991125,
        "roll": 0.0034698484636624624,
        "pitch": 1.2947014081437904,
        "yaw": -0.00095361222054179962
      },
      "sensors": [
        {
          "type": "sensor",
          "model": "Lidar",
          "name": "lidar",
          "spawnPoint": {
            "x": 428.24285827225685,
            "y": -174.3420693053049,
            "z": 386.51402282714844,
            "roll": -0,
            "pitch": 0,
            "yaw": 0
          },
          "parameters": {
            "channels": 32,
            "range": 120,
            "pointsPerSecond": 655360,
            "rotationFrequency": 10,
            "upperFovLimit": 22.5,
            "lowerFovLimit": -22.5,
            "horizontalFov": 360,
            "distanceNoiseStdDev": 0.0,
            "lateralNoiseStdDev": 0.0,
            "semantic": false,
            "useComplexCollisionTrace": false,
            "sendDataAtRotationFrequency": true,
            "sendDataToCombinedROSTopic": true,
            "saveDataToDisk": false,
            "sendDataToROS": true,
            "visualizePointcloud": true,
            "useLidarNoiseModel": false,
            "savePointcloudWithoutNoiseModel": false
          }
        }
      ]
    }
  ]
}

Spawning Sensors to Vehicle through Garage

To get a vehicle into the garage, you can either:

A) Right-click the vehicle, which opens a context menu. In the context menu, click the 'Garage' option. After you exit the garage, the vehicle is teleported back to its previous location.

B) Open the Vehicle Listing menu and press the Garage icon button next to the vehicle name.

Garage user interface

The vehicle's sensors are listed on the left side. Lidars, radars and cameras are in separate lists, which can be closed/opened by clicking on the list header. Below the sensor lists is the Presets-button, which opens the Presets-menu. Next to those is the guide for the controls; this guide can be closed by clicking on the arrow below it. On the right is the parameters menu, which opens when a sensor has been selected, either by clicking a sensor in the list or by adding a new one. This parameters menu can also be hidden with the arrow button below it. Below the parameters menu is the Leave-button, which leaves the garage and teleports the vehicle back to its previous location.

Garage user interface

Adding sensors in Garage

To add a sensor, click one of the ADD-buttons below the sensor lists. After this, right-click on the vehicle to spawn the sensor at that location. The newly added sensor appears in the list and the parameters menu will open.

Selecting sensor and changing parameters in Garage

To select a sensor, you can either: A) Left-click the sensor's 3D model, or B) Click the sensor's name in the sensor lists.

After selecting a sensor, the parameters menu on the right will be populated with the selected sensor's data. To change the parameters, or the sensor's name, modify the values and press the APPLY-button, which saves the input values to the sensor. If you don't want to save the new input values, press the REVERT-button. Revert only discards unsaved input values; it can't be used to revert saved values.

Removing sensors in Garage

To remove a sensor, select it from the list and press the DELETE-button. Deletion requires confirmation, so you have to press the button again once it becomes enabled again.

Sensor presets in Garage

Click the Presets-button to open the presets menu. Here you can save the current sensor configuration as a preset, which can later be loaded from the user interface. The menu consists of a list showing all existing presets; clicking a list item selects that preset.

Presets menu

Options:

  • Press the LOAD-button to replace the current sensor configuration with the selected preset.
  • Press the "SAVE AS NEW"-button to save the current sensor configuration as a new preset. A prompt will ask for the preset name.
  • Press the "SAVE OVERWRITE"-button to save the current sensor configuration to the selected preset, overwriting its sensor data.
  • Press the DELETE-button to delete the preset (deletion requires a confirmation click).
  • Press the EDIT-button to edit the preset's name. A prompt will ask for the new name.
  • Press the "EXPORT CURRENT"-button to export the current sensor configuration (not the selected preset) to a JSON file. A prompt will ask for the filename.

Exporting Sensors and vehicles with sensors

The simulator supports exporting various elements to JSON files, complete with their current parameters. To export sensors or vehicles equipped with sensors, follow the steps outlined below.

All exported .json files are saved to the SIMULATION_ROOT/Data/ExportedJsonFiles/ directory.

Export through Garage

In the garage user interface, press the "EXPORT CURRENT"-button to export to a JSON file. A prompt will ask for the filename.

Export through Developers Tools

You can toggle the Developer Tools interface with the Ctrl + F9 key combination at any point. From there you can click "Export Sensors" or "Export Vehicles" to export them.

Export through ROS command

# ROS1 exports
rostopic pub /agrarsense/in/commands std_msgs/String "ExportVehicles" --once
rostopic pub /agrarsense/in/commands std_msgs/String "ExportSensors" --once
rostopic pub /agrarsense/in/commands std_msgs/String "ExportAll" --once
# ROS2 exports
ros2 topic pub /agrarsense/in/commands std_msgs/String "data: ExportVehicles" --once
ros2 topic pub /agrarsense/in/commands std_msgs/String "data: ExportSensors" --once
ros2 topic pub /agrarsense/in/commands std_msgs/String "data: ExportAll" --once