Mirror of https://github.com/blakeblackshear/frigate.git (synced 2026-05-06 05:27:44 +03:00)
Update detector docs
This commit is contained in: parent b24bb9c153, commit a6de759b01
@@ -344,6 +344,12 @@ Note that the labelmap uses a subset of the complete COCO label set that has onl
[D-FINE](https://github.com/Peterande/D-FINE) is a DETR-based model. The ONNX exported models are supported, but not included by default. See [the models section](#downloading-d-fine-model) for more information on downloading the D-FINE model for use in Frigate.
:::warning
Currently, D-FINE models only run on OpenVINO in CPU mode; GPUs fail to compile the model.
:::
After placing the downloaded ONNX model in your `config/model_cache` folder, you can use the following configuration:
```yaml
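# The hunk truncates the configuration block here; the entries below are a
# hedged sketch of a typical D-FINE detector config, not part of this commit.
# The model filename, dimensions, and labelmap path are assumptions.
detectors:
  onnx:
    type: onnx

model:
  model_type: dfine
  width: 640
  height: 640
  input_tensor: nchw
  path: /config/model_cache/dfine_m_obj2coco.onnx
```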
@@ -653,7 +659,7 @@ Note that the labelmap uses a subset of the complete COCO label set that has onl
After placing the downloaded ONNX model in your `config/model_cache` folder, you can use the following configuration:
```yaml
detectors:
  onnx:
    type: onnx
@@ -671,7 +677,7 @@ model:
[D-FINE](https://github.com/Peterande/D-FINE) is a DETR-based model. The ONNX exported models are supported, but not included by default. See [the models section](#downloading-d-fine-model) for more information on downloading the D-FINE model for use in Frigate.
After placing the downloaded ONNX model in your `config/model_cache` folder, you can use the following configuration:
```yaml
detectors:
@@ -898,7 +904,7 @@ Make sure you change the batch size to 1 before exporting.
To export as ONNX:
1. `pip3 install rfdetr`
2. `python3`
3. `from rfdetr import RFDETRBase`
4. `x = RFDETRBase()`
5. `x.export()`
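The numbered steps above can be collapsed into a single shell sketch. This assumes `pip3` and `python3` are on your PATH and that `x.export()` writes the ONNX file to the working directory; the one-liner form is a convenience, not part of the commit:

```shell
# Install RF-DETR, then load the base checkpoint and export it to ONNX.
pip3 install rfdetr
python3 -c "from rfdetr import RFDETRBase; RFDETRBase().export()"
```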