# Documentation

## General Information
The simulator sends all sensor data to ROS, allowing you to handle data storage manually. Alternatively, the simulator can save sensor data directly to disk in an organized manner, simplifying data collection.
All saved data will be located in the following directory:
`SIMULATOR_ROOT/Data`
The simulator supports multiple data collection methods:
- Static Sensor Placement: Place sensors anywhere on the map and save data to disk or send it via ROS (manual and time-consuming).
- Vehicle-Mounted Sensors: Attach sensors to vehicles and collect data while driving them manually (slow and time-consuming).
- Drone-Based Collection: Mount sensors on a drone and capture data while flying along a predefined path.
- Automated Data Capture: Use the `dataCapture` feature to teleport sensors around the map, quickly gathering data directly to disk.
## Using a Drone with Sensors for Data Collection
See using vehicles and the example .json files here.
## DataCapture Usage
Drag and drop a .json file into the simulator window.

### DataCapture json content
See example files in the data_capture*.json format here.

To use the `dataCapture` system:
- Define one or more camera sensors first. These should be included as objects of type `sensor`, specifying model, position, and parameters such as resolution, post-processing effects, and lens properties.
- Add a `dataCapture` object last. This defines the capture positions and any optional behaviors.
### Key Parameters
- **`points`**
  Defines the positions and orientations where images should be captured. Each point includes `x`, `y`, `z`, `yaw`, `pitch`, and `roll`.
- **`captureRotatedViews`**
  If set to `true`, the simulator captures four additional photos at each defined point, rotated in 90-degree increments around the yaw axis.
- **`useGPSLocation`**
  When enabled, the `x`, `y`, and `z` coordinates are interpreted as real-world GPS-based positions. The simulator automatically converts these into the local coordinate system.
- **`zIsHeightAboveGround`**
  If enabled, the `z` value in `dataCapture` points represents the height in meters above the ground instead of a world Z position or GPS altitude. This allows specifying a fixed height relative to ground level.
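For larger captures, the `points` dictionary can be generated programmatically instead of written by hand. A minimal Python sketch, following the `point0`, `point1`, … naming from the examples (the `make_points` helper and the grid values are illustrative, not part of the simulator):

```python
import json

def make_points(xs, ys, z, yaw=0.0, pitch=0.0, roll=0.0):
    """Build a dataCapture 'points' dictionary from a grid of x/y positions.

    Keys follow the point0, point1, ... naming used in the example files.
    """
    points = {}
    for i, (x, y) in enumerate((x, y) for x in xs for y in ys):
        points[f"point{i}"] = {
            "x": float(x), "y": float(y), "z": float(z),
            "roll": roll, "pitch": pitch, "yaw": yaw,
        }
    return points

# Assemble a dataCapture object covering a 2 x 3 grid at a fixed height.
capture = {
    "type": "dataCapture",
    "parameters": {
        "useGPSLocation": False,
        "captureRotatedViews": False,
        "points": make_points(xs=[700.0, 820.0], ys=[-630.0, -80.0, 530.0], z=110.0),
    },
}
print(json.dumps(capture, indent=2))
```

The resulting object can be appended as the last entry of the `objects` array, after the sensor definitions.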
Example .json file with both an RGB camera and a Lidar sensor:
```json
{
  "objects": [
    {
      "type": "sensor",
      "model": "RGBCamera",
      "id": "rgbcamera",
      "teleportSpectator": false,
      "spawnPoint": {
        "x": 0.0,
        "y": 0.0,
        "z": 200.0,
        "yaw": 0.0,
        "pitch": 0.0,
        "roll": 0.0
      },
      "parameters": {
        "postProcessingEffects": true,
        "enable16BitFormat": true,
        "useHDR": true,
        "fOV": 90,
        "targetGamma": 1,
        "shutterSpeed": 60,
        "iSO": 100,
        "width": 1280,
        "height": 720,
        "focalDistance": 0,
        "depthBlurAmount": 1,
        "depthBlurRadius": 0,
        "dofMinFStop": 1.2,
        "dofBladeCount": 5,
        "filmSlope": 0.88,
        "filmToe": 0.55,
        "filmShoulder": 0.25,
        "filmBlackClip": 0,
        "filmWhiteClip": 0.04,
        "exposureMinBrightness": -2,
        "exposureMaxBrightness": 20,
        "exposureSpeedUp": 10.0,
        "exposureSpeedDown": 1,
        "motionBlurIntensity": 0.0,
        "motionBlurMax": 2,
        "motionBlurMinObjSize": 0,
        "lensFlareIntensity": 0.0,
        "bloomIntensity": 0,
        "whiteTemp": 6500,
        "whiteTint": 0,
        "chromAberrIntensity": 0,
        "chromAberrOffset": 0,
        "aperture": 4,
        "saveImageToDisk": false,
        "sendDataToROS": false,
        "targetFrameRate": 0,
        "usePhysicLensDistortionEffect": true,
        "useIceLensEffect": false,
        "iceLensEffectStrength": 0.3,
        "iceLensEffectAngle": 1.0
      }
    },
    {
      "type": "sensor",
      "model": "Lidar",
      "id": "lidar",
      "teleportSpectator": false,
      "followObject": false,
      "zIsHeightAboveGround": false,
      "spawnPoint": {
        "x": 0.0,
        "y": 0.0,
        "z": 200.0,
        "yaw": 0.0,
        "pitch": 0.0,
        "roll": 0.0
      },
      "parameters": {
        "channels": 32,
        "range": 120,
        "pointsPerSecond": 655360,
        "rotationFrequency": 10,
        "upperFovLimit": 22.5,
        "lowerFovLimit": -22.5,
        "horizontalFov": 360,
        "distanceNoiseStdDev": 0.0,
        "lateralNoiseStdDev": 0.0,
        "semantic": false,
        "useComplexCollisionTrace": true,
        "sendDataAtRotationFrequency": true,
        "sendDataToCombinedROSTopic": true,
        "saveDataToDisk": false,
        "sendDataToROS": false,
        "visualizePointcloud": false,
        "useLidarNoiseModel": false,
        "savePointcloudWithoutNoiseModel": false
      }
    },
    {
      "type": "dataCapture",
      "parameters": {
        "useGPSLocation": false,
        "captureRotatedViews": false,
        "points": {
          "point0": {
            "x": 820.0,
            "y": -630.0,
            "z": 110.0,
            "roll": 0.0,
            "pitch": 0.0,
            "yaw": 0.0
          },
          "point1": {
            "x": 820.0,
            "y": -80.0,
            "z": 110.0,
            "roll": 0.0,
            "pitch": 0.0,
            "yaw": 0.0
          },
          "point2": {
            "x": 700.0,
            "y": 530.0,
            "z": 130.0,
            "roll": 0.0,
            "pitch": 0.0,
            "yaw": 0.0
          }
        }
      }
    }
  ]
}
```
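Before dropping a file like this into the simulator, it can be worth checking that the ordering rule described earlier holds: sensor objects first, the `dataCapture` object last. A small Python sketch (the function name is hypothetical, not a simulator API):

```python
def check_capture_config(config: dict) -> bool:
    """Sanity-check a dataCapture config: at least one sensor object,
    and the dataCapture object is the last entry in 'objects'."""
    types = [obj.get("type") for obj in config.get("objects", [])]
    return bool(types) and "sensor" in types and types[-1] == "dataCapture"

# Illustrative usage with a file on disk:
# import json
# with open("data_capture_example.json") as f:
#     print(check_capture_config(json.load(f)))
```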
## Unreal Editor Utility for Creating DataCapture Locations
The AGRARSENSE simulator does not include a built-in tool for creating `dataCapture` positions in the packaged build (`.exe`). However, within the Unreal Editor, you can use the `BP_DataCaptureLocation` blueprint to define these positions.
To use it:
- Drag and drop one or more `BP_DataCaptureLocation` actors into the level.
- Adjust their positions as needed.
- When you're satisfied, click the Export button in the Actor's Details panel.
This will export two .json files: one with the dataCapture structure and one with the drone follow path structure. The files are exported to:

`YOUR_LOCATION\Agrarsense\Data\Run{LAST_NUM}\ExportedJsonFiles\DataCaptureExport.json`
`YOUR_LOCATION\Agrarsense\Data\Run{LAST_NUM}\ExportedJsonFiles\DroneFollowPathExport.json`

Note: These exported .json files only contain the base json structure. You will need to manually add sensor definitions.
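Adding the sensor definitions to an exported base file can also be scripted. A minimal sketch in Python that inserts sensor objects ahead of the exported entries (the file paths and the `add_sensors` helper are illustrative):

```python
import json

def add_sensors(export_path: str, sensors: list, out_path: str) -> None:
    """Insert sensor objects before the existing objects in an exported file,
    so that sensors come first and the dataCapture object stays last."""
    with open(export_path) as f:
        config = json.load(f)
    config["objects"] = sensors + config.get("objects", [])
    with open(out_path, "w") as f:
        json.dump(config, f, indent=2)

# Illustrative usage with a minimal sensor definition:
# rgb = {"type": "sensor", "model": "RGBCamera", "id": "rgbcamera",
#        "spawnPoint": {"x": 0.0, "y": 0.0, "z": 200.0,
#                       "yaw": 0.0, "pitch": 0.0, "roll": 0.0},
#        "parameters": {"width": 1280, "height": 720}}
# add_sensors("DataCaptureExport.json", [rgb], "DataCaptureWithSensors.json")
```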