Documentation
General Information
The simulator sends all sensor data to ROS, allowing you to handle data storage manually. Alternatively, the simulator can save sensor data directly to disk in an organized manner, simplifying data collection.
All saved data will be located in the following directory:
SIMULATOR_ROOT/Data/Run{NUM}/
The simulator supports multiple data collection methods:
- Static Sensor Placement: Place sensors anywhere on the map and save data to disk or via ROS (manual and time-consuming); a minimal example of this method follows this list.
- Vehicle-Mounted Sensors: Attach sensors to vehicles and collect data while manually controlling them (slow and time-consuming).
- Spectator-Mounted Sensors: Attach sensors to the Spectator actor and collect data while controlling it manually or teleporting it via ROS.
- Drone-Based Collection: Mount sensors on a drone and capture data while it flies along a predefined path, roams randomly around the map, or is flown manually.
- Automated Data Capture: Use the `dataCapture` feature to teleport sensors to specified locations, quickly gathering data directly to disk.
Using a drone with sensors for data collection
See the using vehicles documentation and example .json files here.
Example JSON structure for spawning a roaming drone with an RGB camera:
{
  "objects": [
    {
      "type": "vehicle",
      "model": "Drone",
      "id": "drone",
      "teleportSpectator": false,
      "followObject": false,
      "destroyOverlappingObjects": false,
      "hideVehicleMeshForCameras": false,
      "spawnPoint": {
        "x": 0,
        "y": 0,
        "z": 200,
        "roll": 0,
        "pitch": 0,
        "yaw": 0,
        "scaleX": 1,
        "scaleY": 1,
        "scaleZ": 1
      },
      "parameters": {
        "droneAction": "Roaming",
        "droneEndAction": "Stop",
        "collisionsEnabled": true,
        "overlapRadiusMeters": 25.0,
        "createInnerOverlapSensor": true,
        "innerOverlapRadiusMeters": 2.0,
        "visualizeOverlap": false,
        "showForwardArrow": false
      },
      "sensors": [
        {
          "type": "sensor",
          "model": "RGBCamera",
          "name": "rgbcamera",
          "id": "rgbcamera",
          "attachedToComponent": "SkeletalMesh",
          "attachedToBone": "base",
          "useGimbal": true,
          "spawnPoint": {
            "x": 6.4605301776318811,
            "y": 0.63622325574397109,
            "z": 20.747438007208984,
            "roll": -1.6560397853027093e-07,
            "pitch": -9.391856679030747e-07,
            "yaw": 10.000001907348519,
            "scaleX": 1,
            "scaleY": 1,
            "scaleZ": 1
          },
          "parameters": {
            "postProcessingEffects": true,
            "enable16BitFormat": true,
            "useHDR": false,
            "width": 1280,
            "height": 720,
            "maxViewDistanceInCmOverride": -1.0,
            "fOV": 90,
            "targetGamma": 1,
            "shutterSpeed": 60,
            "iSO": 100,
            "focalDistance": 0,
            "depthBlurAmount": 1,
            "depthBlurRadius": 0,
            "dofMinFStop": 1.2000000476837158,
            "dofBladeCount": 5,
            "filmSlope": 0.87999999523162842,
            "filmToe": 0.55000001192092896,
            "filmShoulder": 0.25999999046325684,
            "filmBlackClip": 0,
            "filmWhiteClip": 0.039999999105930328,
            "exposureMinBrightness": -2,
            "exposureMaxBrightness": 20,
            "exposureSpeedUp": 10,
            "exposureSpeedDown": 1,
            "motionBlurIntensity": 0,
            "motionBlurMax": 2,
            "motionBlurMinObjSize": 0,
            "lensFlareIntensity": 0,
            "bloomIntensity": 0,
            "whiteTemp": 6500,
            "whiteTint": 0,
            "chromAberrIntensity": 0,
            "chromAberrOffset": 0,
            "aperture": 4,
            "saveImageToDisk": false,
            "sendDataToROS": true,
            "targetFrameRate": 0,
            "usePhysicLensDistortionEffect": true,
            "useTemporalAA": true,
            "useIceLensEffect": false,
            "iceLensEffectStrength": 0.30000001192092896,
            "iceLensEffectAngle": 1
          }
        }
      ]
    }
  ]
}
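As configured, the drone's camera only streams to ROS (`saveImageToDisk` is false, `sendDataToROS` is true). To have frames written to disk under SIMULATOR_ROOT/Data/Run{NUM}/ instead, swap the two flags inside the camera's `parameters` block:
"saveImageToDisk": true,
"sendDataToROS": false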
DataCapture Usage
Drag and drop the .json file into the simulator window.

DataCapture JSON content
See example files in the data_capture*.json format here.
To use the `dataCapture` system:
- Define one or more camera sensors first. These should be included as objects of type `sensor`, specifying model, position, and parameters such as resolution, post-processing effects, and lens properties.
- Add a `dataCapture` object last. This defines the capture positions and any optional behaviors; a skeleton of the resulting structure follows this list.
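Reduced to a skeleton (spawn points and sensor parameters omitted for brevity), the required ordering from the full example below looks like this:
{
  "objects": [
    {
      "type": "sensor",
      "model": "RGBCamera",
      "id": "rgbcamera"
    },
    {
      "type": "dataCapture",
      "parameters": {
        "points": {}
      }
    }
  ]
}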
Key Parameters
- **points**: Define the positions and orientations where images should be captured. Each point includes `x`, `y`, `z`, `yaw`, `pitch`, and `roll`.
- **captureRotatedViews**: If set to `true`, the simulator captures four additional photos at each defined point, rotated in 90-degree increments around the yaw axis.
- **useGPSLocation**: When enabled, the `x`, `y`, and `z` coordinates are interpreted as real-world GPS-based positions. The simulator will automatically convert these into the local coordinate system.
- **zIsHeightAboveGround**: If enabled, the `z` value in `dataCapture` points represents the height in meters above the ground instead of a world Z position or GPS altitude. This allows specifying a fixed height relative to the ground level (see the combined sketch after this list).
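For instance, a hypothetical `dataCapture` object combining these flags might look as follows. The point values are placeholders, and the assumption that `x`/`y` hold latitude/longitude when `useGPSLocation` is enabled should be verified against your map's georeferencing:
{
  "type": "dataCapture",
  "parameters": {
    "useGPSLocation": true,
    "zIsHeightAboveGround": true,
    "captureRotatedViews": true,
    "points": {
      "point0": {
        "x": 61.4491,
        "y": 23.8573,
        "z": 50.0,
        "roll": 0.0,
        "pitch": 0.0,
        "yaw": 0.0
      }
    }
  }
}
With these settings, each point would yield five photos (one at the given yaw plus four rotated views), captured 50 meters above the ground.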
Example JSON file with both an RGB camera and a Lidar sensor:
{
  "objects": [
    {
      "type": "sensor",
      "model": "RGBCamera",
      "id": "rgbcamera",
      "teleportSpectator": false,
      "spawnPoint": {
        "x": 0.0,
        "y": 0.0,
        "z": 200.0,
        "yaw": 0.0,
        "pitch": 0.0,
        "roll": 0.0
      },
      "parameters": {
        "postProcessingEffects": true,
        "enable16BitFormat": true,
        "useHDR": true,
        "fOV": 90,
        "targetGamma": 1,
        "shutterSpeed": 60,
        "iSO": 100,
        "width": 1280,
        "height": 720,
        "focalDistance": 0,
        "depthBlurAmount": 1,
        "depthBlurRadius": 0,
        "dofMinFStop": 1.2,
        "dofBladeCount": 5,
        "filmSlope": 0.88,
        "filmToe": 0.55,
        "filmShoulder": 0.25,
        "filmBlackClip": 0,
        "filmWhiteClip": 0.04,
        "exposureMinBrightness": -2,
        "exposureMaxBrightness": 20,
        "exposureSpeedUp": 10.0,
        "exposureSpeedDown": 1,
        "motionBlurIntensity": 0.0,
        "motionBlurMax": 2,
        "motionBlurMinObjSize": 0,
        "lensFlareIntensity": 0.0,
        "bloomIntensity": 0,
        "whiteTemp": 6500,
        "whiteTint": 0,
        "chromAberrIntensity": 0,
        "chromAberrOffset": 0,
        "aperture": 4,
        "saveImageToDisk": false,
        "sendDataToROS": false,
        "targetFrameRate": 0,
        "usePhysicLensDistortionEffect": true,
        "useIceLensEffect": false,
        "iceLensEffectStrength": 0.3,
        "iceLensEffectAngle": 1.0
      }
    },
    {
      "type": "sensor",
      "model": "Lidar",
      "id": "lidar",
      "teleportSpectator": false,
      "followObject": false,
      "zIsHeightAboveGround": false,
      "spawnPoint": {
        "x": 0.0,
        "y": 0.0,
        "z": 200.0,
        "yaw": 0.0,
        "pitch": 0.0,
        "roll": 0.0
      },
      "parameters": {
        "channels": 32,
        "range": 120,
        "pointsPerSecond": 655360,
        "rotationFrequency": 10,
        "upperFovLimit": 22.5,
        "lowerFovLimit": -22.5,
        "horizontalFov": 360,
        "distanceNoiseStdDev": 0.0,
        "lateralNoiseStdDev": 0.0,
        "semantic": false,
"useComplexCollisionTrace ": true,
"sendDataAtRotationFrequency": true,
"sendDataToCombinedROSTopic": true,
"saveDataToDisk": false,
"sendDataToROS": false,
"visualizePointcloud": false,
"useLidarNoiseModel": false,
"savePointcloudWithoutNoiseModel": false
}
},
{
"type": "dataCapture",
"parameters": {
"useGPSLocation": false,
"captureRotatedViews": false,
"points": {
"point0": {
"x": 820.0,
"y": -630.0,
"z": 110.0,
"roll": 0.0,
"pitch": 0.0,
"yaw": 0.0
},
"point1": {
"x": 820.0,
"y": -80.0,
"z": 110.0,
"roll": 0.0,
"pitch": 0.0,
"yaw": 0.0
},
"point2": {
"x": 700.0,
"y": 530.0,
"z": 130.0,
"roll": 0.0,
"pitch": 0.0,
"yaw": 0.0
}
}
}
}
]
}
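As written, this file captures one RGB image and one point cloud at each of the three points. Enabling captureRotatedViews would add four yaw-rotated photos per point, for five photos per point in total, as described under Key Parameters.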
Unreal Editor Utility for Creating DataCapture Locations
The AGRARSENSE simulator does not include a built-in tool for creating `dataCapture` positions in the packaged build (.exe). However, within the Unreal Editor, you can use the `BP_DataCaptureLocation` blueprint to define these positions.
To use it:
- Drag and drop one or more `BP_DataCaptureLocation` actors into the level.
- Adjust their positions as needed.
- When you're satisfied, click the Export button in the Actor's Details panel.
This will export two JSON files: one with the dataCapture structure and one with the drone follow path structure. These files will be written to:
YOUR_LOCATION\Agrarsense\Data\Run{LAST_NUM}\ExportedJsonFiles\DataCaptureExport.json
YOUR_LOCATION\Agrarsense\Data\Run{LAST_NUM}\ExportedJsonFiles\DroneFollowPathExport.json
Note: These exported JSON files contain only the base JSON structure; you will need to add sensor definitions manually.
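As an illustration only (the actual exported content may differ), merging the earlier RGB camera definition into DataCaptureExport.json would produce a structure along these lines, with the sensor object placed before the dataCapture object and the exported points left in place:
{
  "objects": [
    {
      "type": "sensor",
      "model": "RGBCamera",
      "id": "rgbcamera",
      "spawnPoint": {
        "x": 0.0,
        "y": 0.0,
        "z": 200.0,
        "roll": 0.0,
        "pitch": 0.0,
        "yaw": 0.0
      }
    },
    {
      "type": "dataCapture",
      "parameters": {
        "useGPSLocation": false,
        "captureRotatedViews": false,
        "points": {}
      }
    }
  ]
}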
Attaching sensors to Spectator
You can attach any sensor to the Spectator when spawning a sensor through a JSON file. Simply add `"attachToSpectator": true` to the sensor definition.
Once the JSON file is loaded either by dragging and dropping it into the simulator window or by sending it through ROS, the sensor will automatically attach to the Spectator actor. This means that as you move the Spectator using the keyboard (WASD) and/or mouse, the attached sensor will follow.
You can also control the Spectator's movement via ROS using the /agrarsense/in/commands topic. For example:
rostopic pub /agrarsense/in/commands std_msgs/String "teleportspectator 100,150,20,0,0,0"
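The six comma-separated values are assumed here to follow the same x, y, z, roll, pitch, yaw order as the spawnPoint fields used elsewhere on this page.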
Example JSON file content:
{
  "objects": [
    {
      "type": "sensor",
      "model": "RGBCamera",
      "id": "rgbcamera",
      "attachToSpectator": true,
      "parameters": {
        "postProcessingEffects": true,
        "enable16BitFormat": true,
        "useHDR": true,
        "width": 1280,
        "height": 720,
        "maxViewDistanceInCmOverride": -1.0,
        "fOV": 90,
        "targetGamma": 1,
        "shutterSpeed": 60,
        "iSO": 100,
        "focalDistance": 0,
        "depthBlurAmount": 1,
        "depthBlurRadius": 0,
        "dofMinFStop": 1.2,
        "dofBladeCount": 5,
        "filmSlope": 0.88,
        "filmToe": 0.55,
        "filmShoulder": 0.25,
        "filmBlackClip": 0,
        "filmWhiteClip": 0.04,
        "exposureMinBrightness": -2,
        "exposureMaxBrightness": 20,
        "exposureSpeedUp": 10.0,
        "exposureSpeedDown": 1,
        "motionBlurIntensity": 0.0,
        "motionBlurMax": 2,
        "motionBlurMinObjSize": 0,
        "lensFlareIntensity": 0.0,
        "bloomIntensity": 0,
        "whiteTemp": 6500,
        "whiteTint": 0,
        "chromAberrIntensity": 0,
        "chromAberrOffset": 0,
        "aperture": 4,
        "saveImageToDisk": true,
        "sendDataToROS": true,
        "targetFrameRate": 0,
        "usePhysicLensDistortionEffect": true,
        "useIceLensEffect": false,
        "iceLensEffectStrength": 0.3,
        "iceLensEffectAngle": 1.0
      }
    }
  ]
}
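Because saveImageToDisk and sendDataToROS are both enabled in this example, captured frames are streamed to ROS and also written under SIMULATOR_ROOT/Data/Run{NUM}/ as you move the Spectator around.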