From d8e0f4d718e7156f4310d0c31ccc60820716f80d Mon Sep 17 00:00:00 2001
From: Nick Mowen
Date: Mon, 19 Dec 2022 14:29:22 -0700
Subject: [PATCH] Update with new observed range

---
 docs/docs/frigate/hardware.md | 8 ++++----
 1 file changed, 4 insertions(+), 4 deletions(-)

diff --git a/docs/docs/frigate/hardware.md b/docs/docs/frigate/hardware.md
index e8ebbb4ed..e78cc59a0 100644
--- a/docs/docs/frigate/hardware.md
+++ b/docs/docs/frigate/hardware.md
@@ -53,10 +53,10 @@ The OpenVINO detector type is able to run on x86 hosts that have an Intel CPU, G
 
 Inference speeds vary greatly depending on the CPU, GPU, or VPU used, some known examples are below:
 
-| Name                | Inference Speed | Notes                                 |
-| Intel Celeron J4105 | ~ 25 ms         | Inference speeds on CPU were ~ 150 ms |
-| Intel Celeron N4020 | 150 - 200 ms    | Inference speeds on CPU were ~ 800 ms |
-| Intel NCS2 VPU      | 60 - 65 ms      | May vary based on host device         |
+| Name                | Inference Speed | Notes                                                                 |
+| Intel Celeron J4105 | ~ 25 ms         | Inference speeds on CPU were ~ 150 ms                                 |
+| Intel Celeron N4020 | 50 - 200 ms     | Inference speeds on CPU were ~ 800 ms, greatly depends on other loads |
+| Intel NCS2 VPU      | 60 - 65 ms      | May vary based on host device                                         |
 
 ### TensorRT