From 3756415de8103338c465b53f69207662e32bc3fc Mon Sep 17 00:00:00 2001
From: AmirHossein_Omidi <151873319+AmirHoseinOmidi@users.noreply.github.com>
Date: Tue, 7 Oct 2025 02:12:00 +0330
Subject: [PATCH] Clarify ONNX Runtime usage in Frigate documentation

Currently, the documentation mentions support for ONNX models but does not
explicitly mention ONNX Runtime. To avoid confusion, I suggest clarifying
that Frigate uses ONNX Runtime to run ONNX models. This has been confirmed
by checking the Frigate logs, where ONNX Runtime is loaded when an ONNX
model is used.
---
 docs/docs/configuration/object_detectors.md | 2 +-
 1 file changed, 1 insertion(+), 1 deletion(-)

diff --git a/docs/docs/configuration/object_detectors.md b/docs/docs/configuration/object_detectors.md
index 1e68d6ff4..14975780f 100644
--- a/docs/docs/configuration/object_detectors.md
+++ b/docs/docs/configuration/object_detectors.md
@@ -498,7 +498,7 @@ See [ONNX supported models](#supported-models) for supported models, there are s
 
 ## ONNX
 
-ONNX is an open format for building machine learning models, Frigate supports running ONNX models on CPU, OpenVINO, ROCm, and TensorRT. On startup Frigate will automatically try to use a GPU if one is available.
+ONNX is an open format for building machine learning models. Frigate uses ONNX Runtime to run ONNX models on CPU, OpenVINO, ROCm, and TensorRT. On startup Frigate will automatically try to use a GPU if one is available.
 
 :::info
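
For context, the section being edited in `object_detectors.md` describes a detector that is configured in Frigate's YAML config. A minimal sketch of enabling the ONNX detector might look like the following (the model path and filename here are hypothetical placeholders, not values from the patch):

```yaml
# Minimal sketch of a Frigate config using the ONNX detector.
# ONNX Runtime selects an execution provider at startup, preferring
# a GPU (e.g. TensorRT, ROCm, OpenVINO) when one is available.
detectors:
  onnx:
    type: onnx

model:
  path: /config/model_cache/model.onnx  # hypothetical path to an ONNX model
```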