
Project File Format

The .oneai file is a JSON document that stores all configuration for a ONE AI project. It is created automatically when you set up a new project in ONE WARE Studio and updated as you modify settings through the UI.

File Structure

```json
{
  "type": "imageDetection",
  "guid": "xxxxxxxx-xxxx-xxxx-xxxx-xxxxxxxxxxxx",
  "version": "1.2.3.4",
  "data": { ... }
}
```

| Field | Type | Description |
|---|---|---|
| `type` | string | Project type identifier. Currently `"imageDetection"` |
| `guid` | string | Unique project identifier (UUID) |
| `version` | string | ONE AI version that last saved the file |
| `data` | object | All project configuration (see below) |
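As a quick sanity check, the top-level shape can be verified with a few lines of Python. The `load_oneai` helper and its checks are illustrative, not part of ONE AI:

```python
import json

# Top-level keys every .oneai document is expected to carry.
REQUIRED_KEYS = {"type", "guid", "version", "data"}

def load_oneai(text: str) -> dict:
    """Parse a .oneai JSON string and check its required top-level fields."""
    doc = json.loads(text)
    missing = REQUIRED_KEYS - doc.keys()
    if missing:
        raise ValueError(f"missing top-level keys: {sorted(missing)}")
    if doc["type"] != "imageDetection":
        raise ValueError(f"unsupported project type: {doc['type']!r}")
    return doc

sample = '{"type": "imageDetection", "guid": "xxxxxxxx", "version": "1.2.3.4", "data": {}}'
project = load_oneai(sample)
```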

Data Object

The data object contains the full project state:

Core Settings

| Key | Type | Description |
|---|---|---|
| `annotationMode` | string | `"classes"`, `"objects"`, or `"segmentation"` |
| `capabilityMode` | string | `"basic"` or `"advanced"` |
| `fusionType` | string | Image fusion mode: `"single"`, `"multi"`, `"difference"`, `"comparison"`, `"stereo"` |
| `multiImageMode` | string | Selected multi-image processing mode |
| `advancedMultiImageSettings` | object | Configuration for multi-image fusion |
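A script that consumes the `data` object can validate these enumerated values before doing anything else. The helper below is a sketch based on the table above, not an official schema:

```python
# Allowed values for the enumerated core settings.
ANNOTATION_MODES = {"classes", "objects", "segmentation"}
CAPABILITY_MODES = {"basic", "advanced"}
FUSION_TYPES = {"single", "multi", "difference", "comparison", "stereo"}

def check_core_settings(data: dict) -> list:
    """Return the names of core-setting keys with missing or invalid values."""
    problems = []
    if data.get("annotationMode") not in ANNOTATION_MODES:
        problems.append("annotationMode")
    if data.get("capabilityMode") not in CAPABILITY_MODES:
        problems.append("capabilityMode")
    if data.get("fusionType") not in FUSION_TYPES:
        problems.append("fusionType")
    return problems

ok = check_core_settings({"annotationMode": "objects",
                          "capabilityMode": "basic",
                          "fusionType": "single"})
```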

Labels and Groups

```json
"labels": [
  {
    "name": "Defect",
    "id": 0,
    "groupId": 0,
    "color": 4278190335,
    "excludeFromTraining": false
  }
],
"groups": [
  { "name": "Default", "id": 0 }
]
```
| Field | Type | Description |
|---|---|---|
| `name` | string | Display name |
| `id` | int | Unique numeric identifier |
| `groupId` | int | Parent group ID (labels only) |
| `color` | uint | ARGB color as unsigned 32-bit integer |
| `excludeFromTraining` | bool | Whether to exclude this label from training |
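Because the color is packed as `0xAARRGGBB`, the stored integer can be decoded with simple bit shifts (a small illustrative helper):

```python
def decode_argb(value: int) -> tuple:
    """Split a packed 0xAARRGGBB value into (alpha, red, green, blue)."""
    return ((value >> 24) & 0xFF,  # alpha
            (value >> 16) & 0xFF,  # red
            (value >> 8) & 0xFF,   # green
            value & 0xFF)          # blue

# 4278190335 from the example above is 0xFF0000FF: fully opaque blue.
a, r, g, b = decode_argb(4278190335)
```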

Hardware Settings

Defines the target deployment hardware. ONE AI optimizes model architecture for these constraints.

| Key | Type | Default | Description |
|---|---|---|---|
| `hardwareType` | string | `"FPGA"` | Target hardware: FPGA, FPGA SoC, CPU, GPU, TPU, MCU, Server, ASIC |
| `prioritizeSpeedOptimization` | bool | `false` | Prioritize inference speed over accuracy |
| `computeCapability` | number | `1` | Available compute capacity |
| `computeCapabilityUnit` | string | `"TOPS"` | Unit: TOPS, MOPS, KOPS |
| `dspBlocks` | number | `40` | Number of 8-bit multipliers (FPGA DSP blocks) |
| `dspGroups` | number | `1` | 8-bit multipliers with sum per DSP block |
| `prioritizeMemoryOptimization` | bool | `false` | Prioritize memory efficiency |
| `memoryLimit` | number | `1` | Available memory |
| `memoryLimitUnit` | string | `"GB"` | Unit: KB, MB, GB |
| `optimizeForParallelExecution` | bool | `false` | Enable parallel execution optimization |
| `quantizedCalculations` | bool | `false` | Use quantized arithmetic |
| `bitsPerValue` | number | `8` | Bit width for quantized values (2–32) |
| `fpgaClockSpeed` | number | `50` | FPGA clock speed in MHz |
| `maximumMemoryUsage` | number | `25` | Maximum memory usage (0–100%) |
| `maximumDspUsage` | number | `25` | Maximum multiplier usage (0–100%) |
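When generating a project file programmatically, the defaults above can be collected in one place and selectively overridden. The flat dict layout and the `hardware_settings` helper are assumptions for illustration; consult a file saved by ONE WARE Studio for the exact nesting:

```python
# Defaults from the table above, gathered for convenience (illustrative layout).
HARDWARE_DEFAULTS = {
    "hardwareType": "FPGA",
    "prioritizeSpeedOptimization": False,
    "computeCapability": 1,
    "computeCapabilityUnit": "TOPS",
    "dspBlocks": 40,
    "dspGroups": 1,
    "prioritizeMemoryOptimization": False,
    "memoryLimit": 1,
    "memoryLimitUnit": "GB",
    "optimizeForParallelExecution": False,
    "quantizedCalculations": False,
    "bitsPerValue": 8,
    "fpgaClockSpeed": 50,
    "maximumMemoryUsage": 25,
    "maximumDspUsage": 25,
}

def hardware_settings(**overrides) -> dict:
    """Merge user overrides onto the documented defaults, rejecting unknown keys."""
    unknown = set(overrides) - set(HARDWARE_DEFAULTS)
    if unknown:
        raise KeyError(f"unknown hardware keys: {sorted(unknown)}")
    return {**HARDWARE_DEFAULTS, **overrides}

hw = hardware_settings(fpgaClockSpeed=100, quantizedCalculations=True)
```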

Prefilters

Prefilters are stored as arrays corresponding to pipeline stages:

```json
"preFiltersBegin": [...],
"preFiltersBeforeAugmentation": [...],
"preFiltersAfterAugmentation": [...],
"preFiltersEnd": [...]
```

Each filter entry:

```json
{
  "id": "initialResize",
  "isEnabled": true,
  "settings": { ... }
}
```
| Field | Type | Description |
|---|---|---|
| `id` | string | Filter type identifier |
| `isEnabled` | bool | Whether the filter is active |
| `settings` | object | Filter-specific parameters |

See Prefilters for available filter types and their parameters.
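Because the same filter `id` can appear in several pipeline stages, toggling a filter programmatically means walking all four arrays. A minimal sketch (the function name is illustrative):

```python
# The four prefilter stage keys under "data".
PREFILTER_STAGES = ("preFiltersBegin", "preFiltersBeforeAugmentation",
                    "preFiltersAfterAugmentation", "preFiltersEnd")

def set_filter_enabled(data: dict, filter_id: str, enabled: bool) -> int:
    """Set isEnabled on every entry with the given id; return how many changed."""
    changed = 0
    for stage in PREFILTER_STAGES:
        for entry in data.get(stage, []):
            if entry.get("id") == filter_id:
                entry["isEnabled"] = enabled
                changed += 1
    return changed

data = {
    "preFiltersBegin": [{"id": "initialResize", "isEnabled": True, "settings": {}}],
    "preFiltersEnd": [{"id": "initialResize", "isEnabled": True, "settings": {}}],
}
count = set_filter_enabled(data, "initialResize", False)
```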


Augmentations

Augmentations are organized into three arrays:

```json
"augmentationsBegin": [...],
"augmentationsStatic": [...],
"augmentationsDynamic": [...]
```

Each augmentation entry uses the same schema as prefilters:

```json
{
  "id": "move",
  "isEnabled": true,
  "settings": { ... }
}
```

Default static augmentations: mosaic, move, rotate, flip, resize. Default dynamic augmentation: color.

See Augmentations for available augmentation types and their parameters.
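The defaults above can be expressed as entry stubs using the shared prefilter/augmentation schema. The empty `settings` objects are placeholders, since each augmentation type has its own parameters:

```python
# Default augmentation ids as listed above; settings intentionally left empty.
DEFAULT_STATIC = ["mosaic", "move", "rotate", "flip", "resize"]
DEFAULT_DYNAMIC = ["color"]

def default_augmentations() -> dict:
    """Build the three augmentation arrays with default entries enabled."""
    def make(ids):
        return [{"id": i, "isEnabled": True, "settings": {}} for i in ids]
    return {
        "augmentationsBegin": [],
        "augmentationsStatic": make(DEFAULT_STATIC),
        "augmentationsDynamic": make(DEFAULT_DYNAMIC),
    }

aug = default_augmentations()
```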


Model Output Settings

Controls what the model predicts. Available keys depend on annotationMode:

| Key | Modes | Type | Default | Description |
|---|---|---|---|---|
| `classificationType` | classes | string | `"allIndividualClasses"` | allIndividualClasses, oneClassPerImage, atLeastOneClass, regression |
| `predictionType` | objects | string | `"sizePositionClass"` | sizePositionClass, positionClass, allPresentClasses, largestArea, mostObjects, atLeastOneObject |
| `segmentationType` | segmentation | string | `"oneClassPerPixel"` | oneClassPerPixel, sizePositionClass, positionClass, allPresentClasses, largestArea, mostObjects, atLeastOneObject |
| `sizePredictionEffort` | objects | number | `25` | Effort allocated to size prediction (0–100%) |
| `positionPredictionResolution` | objects | number | | Resolution for coordinate prediction |
| `precisionRecallPrioritization` | all | number | | Balance between precision and recall |
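The mode-to-key relationship in the table can be captured in a small lookup for validation purposes. Treat this as an illustrative sketch rather than an exhaustive schema:

```python
# annotationMode -> (output-type key, allowed values), per the table above.
OUTPUT_TYPE_KEYS = {
    "classes": ("classificationType",
                {"allIndividualClasses", "oneClassPerImage",
                 "atLeastOneClass", "regression"}),
    "objects": ("predictionType",
                {"sizePositionClass", "positionClass", "allPresentClasses",
                 "largestArea", "mostObjects", "atLeastOneObject"}),
    "segmentation": ("segmentationType",
                     {"oneClassPerPixel", "sizePositionClass", "positionClass",
                      "allPresentClasses", "largestArea", "mostObjects",
                      "atLeastOneObject"}),
}

def output_type_is_valid(annotation_mode: str, value: str) -> bool:
    """Check an output-type value against the project's annotationMode."""
    _key, allowed = OUTPUT_TYPE_KEYS[annotation_mode]
    return value in allowed
```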

Model Input Settings (Advanced)

Fine-grained control over model architecture. These are automatically derived from Basic Mode settings unless overridden.

| Key | Type | Default | Description |
|---|---|---|---|
| `surroundingSizeMode` | string | `"relativeToObject"` | Context sizing: relativeToObject or relativeToImage |
| `minRelativeSurroundingSize` | number | `100` | Minimum surrounding context (%) |
| `maxRelativeSurroundingSize` | number | `100` | Maximum surrounding context (%) |
| `estimatedMinWidth` | number | `10` | Estimated minimum object width (% of image) |
| `estimatedMinHeight` | number | `10` | Estimated minimum object height (% of image) |
| `estimatedAvgWidth` | number | `50` | Estimated average object width (%) |
| `estimatedAvgHeight` | number | `50` | Estimated average object height (%) |
| `estimatedMaxWidth` | number | `90` | Estimated maximum object width (%) |
| `estimatedMaxHeight` | number | `90` | Estimated maximum object height (%) |
| `detectComplexity` | number | `50` | Feature detection complexity (0–100%) |
| `sameClassDifference` | number | `50` | Intra-class variance (0–100%) |
| `backgroundDifference` | number | `50` | Object-to-background contrast (0–100%) |
| `maxFeatures` | number | `10` | Maximum features for classification |
| `avgFeatures` | number | `2` | Average features for classification |

Model Input Settings (Basic Mode)

Simplified settings that auto-configure advanced model input parameters:

| Key | Type | Default | Description |
|---|---|---|---|
| `estimatedSize` | string | `"Small,Medium"` | Object size categories: Tiny, Small, Medium, Big (comma-separated) |
| `numberOfFeatures` | string | `"FewFeatures"` | AlwaysOne, FewFeatures, ManyFeatures |
| `typeOfEnvironment` | string | `"Controlled"` | Controlled, Limited, Natural |
| `typeOfFeatures` | string | `"Limited"` | Similar, Limited, Open |
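Note that `estimatedSize` is a single comma-separated string, not an array. An illustrative parser:

```python
# Allowed size categories for the comma-separated estimatedSize value.
SIZE_CATEGORIES = ("Tiny", "Small", "Medium", "Big")

def parse_estimated_size(value: str) -> list:
    """Split an estimatedSize string like "Small,Medium" into its categories."""
    parts = [p.strip() for p in value.split(",") if p.strip()]
    unknown = [p for p in parts if p not in SIZE_CATEGORIES]
    if unknown:
        raise ValueError(f"unknown size categories: {unknown}")
    return parts
```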

Validation and Test Settings

| Key | Type | Default | Description |
|---|---|---|---|
| `useValidationSplit` | bool | `true` | Enable automatic validation split |
| `validationSplit` | number | `20` | Percentage of training data used for validation (0–100) |
| `testImagePercentage` | number | `0` | Percentage of training images used for testing |
| `validationImagePercentage` | number | `100` | Percentage of validation images used for testing |
| `validationImageSplitPercentage` | number | `0` | Additional validation/test split |
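As a worked example of `validationSplit`: with the default of 20, a dataset of 100 training images yields 80 training and 20 validation images. A sketch of that arithmetic (the rounding behavior is an assumption):

```python
def split_counts(total_images: int, validation_split: float) -> tuple:
    """Given validationSplit as a percentage, return (train, validation) counts."""
    validation = round(total_images * validation_split / 100)
    return total_images - validation, validation
```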

Auto-Label Settings

| Key | Type | Description |
|---|---|---|
| `selectedAutoLabelModels` | array | Models used for auto-labeling: `[{ "name": "...", "minConfidence": 0.5 }]` |
| `autoLabelMergeThreshold` | number | Overlap threshold for merging auto-label predictions |
| `selectedRunModel` | string | Name of the selected inference model |
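Per-model `minConfidence` thresholds can be applied to predictions like this. The prediction dicts, including their `model` field, are hypothetical; they only illustrate how the thresholds from `selectedAutoLabelModels` would be looked up:

```python
def filter_by_model_confidence(models: list, predictions: list) -> list:
    """Keep only predictions at or above the minConfidence of their model."""
    thresholds = {m["name"]: m["minConfidence"] for m in models}
    return [p for p in predictions
            if p["confidence"] >= thresholds.get(p["model"], 0.0)]

models = [{"name": "ModelA", "minConfidence": 0.5}]
preds = [{"model": "ModelA", "confidence": 0.6},
         {"model": "ModelA", "confidence": 0.4}]
kept = filter_by_model_confidence(models, preds)
```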

Project Directory Structure

A .oneai project file sits alongside its associated data:

```
MyProject/
├── MyProject.oneai    ← project configuration
├── Dataset/           ← training images and annotations
├── Models/            ← trained model files
└── Export/            ← exported models (ONNX, VHDL, TFLite)
```
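Tooling that works with this layout can locate the project file by globbing the project root. A minimal sketch, recreating the example layout in a temporary directory:

```python
import pathlib
import tempfile

def find_project_file(project_dir):
    """Return the first .oneai file directly inside the project directory, or None."""
    matches = sorted(pathlib.Path(project_dir).glob("*.oneai"))
    return matches[0] if matches else None

# Recreate the example layout in a temporary directory.
root = pathlib.Path(tempfile.mkdtemp()) / "MyProject"
for sub in ("Dataset", "Models", "Export"):
    (root / sub).mkdir(parents=True, exist_ok=True)
(root / "MyProject.oneai").write_text("{}")

found = find_project_file(root)
```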