---
id: object_classification
title: Object Classification
---
Object classification allows you to train a custom MobileNetV2 classification model to run on tracked objects (persons, cars, animals, etc.) to identify a finer category or attribute for that object.
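Conceptually, the classifier runs on a crop of the tracked object's bounding box, resized to the model's input resolution. The sketch below illustrates that idea only; the 224×224 input size and the nearest-neighbor resize are assumptions for illustration, not Frigate's exact preprocessing.

```python
import numpy as np

def crop_for_classification(frame: np.ndarray, box: tuple[int, int, int, int],
                            size: int = 224) -> np.ndarray:
    """Crop a tracked object's box and resize the crop for a classifier.

    `box` is (x1, y1, x2, y2) in pixel coordinates. Illustrative only.
    """
    x1, y1, x2, y2 = box
    crop = frame[y1:y2, x1:x2]
    # Nearest-neighbor resize via index sampling (stand-in for a real resizer)
    ys = np.linspace(0, crop.shape[0] - 1, size).astype(int)
    xs = np.linspace(0, crop.shape[1] - 1, size).astype(int)
    return crop[np.ix_(ys, xs)]

frame = np.zeros((480, 640, 3), dtype=np.uint8)  # dummy BGR frame
model_input = crop_for_classification(frame, (100, 50, 300, 250))
print(model_input.shape)  # (224, 224, 3)
```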
## Minimum System Requirements
Object classification models are lightweight and run very fast on CPU. Inference should be usable on virtually any machine that can run Frigate.
Training the model does briefly use a high amount of system resources for about 1–3 minutes per training run. On lower-power devices, training may take longer.
When running the `-tensorrt` image, Nvidia GPUs will automatically be used to accelerate training.
## Classes
Classes are the categories your model will learn to distinguish between. Each class represents a distinct visual category that the model will predict.
For object classification:
- Define classes that represent different types or attributes of the detected object
- Examples: For `person` objects, classes might be `delivery_person`, `resident`, `stranger`
- Include a `none` class for objects that don't fit any specific category
- Keep classes visually distinct to improve accuracy
## Classification Type
- **Sub label**:
  - Applied to the object's `sub_label` field.
  - Ideal for a single, more specific identity or type.
  - Example: `cat` → `Leo`, `Charlie`, `None`.
- **Attribute**:
  - Added as metadata to the object (visible in `/events`): `<model_name>: <predicted_value>`.
  - Ideal when multiple attributes can coexist independently.
  - Example: Detecting if a `person` in a construction yard is wearing a helmet or not.
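The two types are selected per model via `classification_type`. As a hedged sketch (the model names `cat_id` and `helmet` are illustrative, not built in), one sub label model and one attribute model could be configured side by side:

```yaml
classification:
  custom:
    cat_id: # sub label: one identity written to the object's sub_label field
      threshold: 0.8
      object_config:
        objects: [cat]
        classification_type: sub_label
    helmet: # attribute: shown on the object as "helmet: <predicted_value>"
      threshold: 0.8
      object_config:
        objects: [person]
        classification_type: attribute
```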
## Example use cases

### Sub label
- Known pet vs unknown: For `dog` objects, set sub label to your pet's name (e.g., `buddy`) or `none` for others.
- Mail truck vs normal car: For `car`, classify as `mail_truck` vs `car` to filter important arrivals.
- Delivery vs non-delivery person: For `person`, classify `delivery` vs `visitor` based on uniform/props.
### Attributes
- Backpack: For `person`, add attribute `backpack: yes/no`.
- Helmet: For `person` (worksite), add `helmet: yes/no`.
- Leash: For `dog`, add `leash: yes/no` (useful for park or yard rules).
- Ladder rack: For `truck`, add `ladder_rack: yes/no` to flag service vehicles.
## Configuration
Object classification is configured as a custom classification model. Each model has its own name and settings. You must list which object labels should be classified.
```yaml
classification:
  custom:
    dog:
      threshold: 0.8
      object_config:
        objects: [dog] # object labels to classify
        classification_type: sub_label # or: attribute
```
## Training the model
Creating and training the model is done within the Frigate UI using the Classification page.
### Getting Started
When choosing which objects to classify, start with a small number of visually distinct classes and ensure your training samples match camera viewpoints and distances typical for those objects.
// TODO add this section once UI is implemented. Explain process of selecting objects and curating training examples.
## Improving the Model
- Problem framing: Keep classes visually distinct and relevant to the chosen object types.
- Data collection: Use the model's Recent Classification tab to gather balanced examples across times of day, weather, and distances.
- Preprocessing: Ensure examples reflect object crops similar to Frigate's boxes; keep the subject centered.
- Labels: Keep label names short and consistent; include a `none` class if you plan to ignore uncertain predictions for sub labels.
- Threshold: Tune `threshold` per model to reduce false assignments. Start at `0.8` and adjust based on validation.