From 2217b21b0e4767930d4a55cf701fc1bceeccbc05 Mon Sep 17 00:00:00 2001
From: Nick Mowen
Date: Fri, 30 Dec 2022 08:23:41 -0700
Subject: [PATCH] Add info about tensorrt detectors and link to docs

---
 docs/docs/frigate/hardware.md | 8 ++++++--
 1 file changed, 6 insertions(+), 2 deletions(-)

diff --git a/docs/docs/frigate/hardware.md b/docs/docs/frigate/hardware.md
index 348549bb1..057c3a905 100644
--- a/docs/docs/frigate/hardware.md
+++ b/docs/docs/frigate/hardware.md
@@ -68,9 +68,13 @@ Inference speeds vary greatly depending on the CPU, GPU, or VPU used, some known
 
 The TensortRT detector is able to run on x86 hosts that have an Nvidia GPU.
 
-Inference speeds will vary depending on the model, some known examples are below:
+Inference speeds will vary greatly depending on the GPU and the model used.
+`tiny` variants are faster than the equivalent non-tiny model, some known examples are below:
 
-TODO
+| Name     | Model           | Inference Speed |
+| -------- | --------------- | --------------- |
+| RTX 3050 | yolov4-tiny-416 | ~ 5 ms          |
+| RTX 3050 | yolov7-tiny-416 | ~ 6 ms          |
 
 ## What does Frigate use the CPU for and what does it use a detector for? (ELI5 Version)
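
For context on the doc change above: the TensorRT detector and the models in the speed table are set up through Frigate's YAML config. The snippet below is a minimal sketch only, assuming the schema described in the detector documentation this PR links to; the detector name, model path, and GPU index are placeholders, not values from this patch.

```yaml
# Illustrative sketch of a TensorRT detector setup.
# The model path and GPU index are placeholders; see the detector docs for the
# exact options and for how the .trt model files are generated.
detectors:
  tensorrt:
    type: tensorrt
    device: 0 # index of the Nvidia GPU to use

model:
  # A tiny variant such as yolov7-tiny-416 corresponds to the faster speeds listed above.
  path: /trt-models/yolov7-tiny-416.trt
  width: 416
  height: 416
```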