Update with new observed range

Nick Mowen 2022-12-19 14:29:22 -07:00
parent 0a72515b21
commit d8e0f4d718


@@ -53,10 +53,10 @@ The OpenVINO detector type is able to run on x86 hosts that have an Intel CPU, G
 Inference speeds vary greatly depending on the CPU, GPU, or VPU used, some known examples are below:
 | Name | Inference Speed | Notes |
 | Intel Celeron J4105 | ~ 25 ms | Inference speeds on CPU were ~ 150 ms |
-| Intel Celeron N4020 | 150 - 200 ms | Inference speeds on CPU were ~ 800 ms |
+| Intel Celeron N4020 | 50 - 200 ms | Inference speeds on CPU were ~ 800 ms, greatly depends on other loads |
 | Intel NCS2 VPU | 60 - 65 ms | May vary based on host device |
 ### TensorRT