Mirror of https://github.com/blakeblackshear/frigate.git, synced 2026-02-19 01:17:06 +03:00
Move hailo to first party detectors

This commit is contained in:
parent 704b92bdbf
commit 0481c47ea0
````diff
@@ -116,6 +116,30 @@ detectors:
     device: pci
 ```
 
+## Hailo-8l
+
+This detector is available for use with Hailo-8 AI Acceleration Module.
+
+See the [installation docs](../frigate/installation.md#hailo-8l) for information on configuring the hailo8.
+
+### Configuration
+
+```yaml
+detectors:
+  hailo8l:
+    type: hailo8l
+    device: PCIe
+
+model:
+  width: 300
+  height: 300
+  input_tensor: nhwc
+  input_pixel_format: bgr
+  model_type: ssd
+  path: /config/model_cache/h8l_cache/ssd_mobilenet_v1.hef
+```
+
 ## OpenVINO Detector
 
 The OpenVINO detector type runs an OpenVINO IR model on AMD and Intel CPUs, Intel GPUs and Intel VPU hardware. To configure an OpenVINO detector, set the `"type"` attribute to `"openvino"`.
````
````diff
@@ -624,26 +648,3 @@ $ cat /sys/kernel/debug/rknpu/load
 
 - All models are automatically downloaded and stored in the folder `config/model_cache/rknn_cache`. After upgrading Frigate, you should remove older models to free up space.
 - You can also provide your own `.rknn` model. You should not save your own models in the `rknn_cache` folder, store them directly in the `model_cache` folder or another subfolder. To convert a model to `.rknn` format see the `rknn-toolkit2` (requires a x86 machine). Note, that there is only post-processing for the supported models.
-
-## Hailo-8l
-
-This detector is available for use with Hailo-8 AI Acceleration Module.
-
-See the [installation docs](../frigate/installation.md#hailo-8l) for information on configuring the hailo8.
-
-### Configuration
-
-```yaml
-detectors:
-  hailo8l:
-    type: hailo8l
-    device: PCIe
-
-model:
-  width: 300
-  height: 300
-  input_tensor: nhwc
-  input_pixel_format: bgr
-  model_type: ssd
-  path: /config/model_cache/h8l_cache/ssd_mobilenet_v1.hef
-```
````
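For readers skimming the diff, here is an annotated copy of the YAML configuration this commit moves into the first-party detector docs. The comments are explanatory glosses added here, not part of the commit; they reflect the usual meaning of these Frigate config keys.

```yaml
detectors:
  hailo8l:
    type: hailo8l    # selects the Hailo-8L detector plugin
    device: PCIe     # the acceleration module is attached over PCIe

model:
  width: 300             # input resolution the model expects
  height: 300
  input_tensor: nhwc     # tensor layout: batch, height, width, channels
  input_pixel_format: bgr
  model_type: ssd        # SSD-style post-processing of detections
  path: /config/model_cache/h8l_cache/ssd_mobilenet_v1.hef  # compiled Hailo (.hef) model
```

Note that `model` is a top-level key alongside `detectors`, not nested under the detector entry.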