diff --git a/docs/docs/configuration/custom_classification/object_classification.md b/docs/docs/configuration/custom_classification/object_classification.md
index ac0b9387a..713dcf998 100644
--- a/docs/docs/configuration/custom_classification/object_classification.md
+++ b/docs/docs/configuration/custom_classification/object_classification.md
@@ -7,7 +7,7 @@ Object classification allows you to train a custom MobileNetV2 classification mo
 
 ## Minimum System Requirements
 
-Object classification models are lightweight and run very fast on CPU. Inference should be usable on virtually any machine that can run Frigate.
+Object classification models are lightweight and run very fast on CPU. Training the model briefly uses a high amount of system resources, typically for about 1–3 minutes per training run. On lower-power devices, training may take longer.
diff --git a/docs/docs/configuration/custom_classification/state_classification.md b/docs/docs/configuration/custom_classification/state_classification.md
index 1ffdf9011..53310e4c6 100644
--- a/docs/docs/configuration/custom_classification/state_classification.md
+++ b/docs/docs/configuration/custom_classification/state_classification.md
@@ -7,7 +7,7 @@ State classification allows you to train a custom MobileNetV2 classification mod
 
 ## Minimum System Requirements
 
-State classification models are lightweight and run very fast on CPU. Inference should be usable on virtually any machine that can run Frigate.
+State classification models are lightweight and run very fast on CPU. Training the model briefly uses a high amount of system resources, typically for about 1–3 minutes per training run. On lower-power devices, training may take longer.
diff --git a/docs/docs/configuration/face_recognition.md b/docs/docs/configuration/face_recognition.md
index 713671a16..c13a1047d 100644
--- a/docs/docs/configuration/face_recognition.md
+++ b/docs/docs/configuration/face_recognition.md
@@ -32,6 +32,8 @@ All of these features run locally on your system.
 
 ## Minimum System Requirements
 
+A CPU with AVX instructions is required to run Face Recognition.
+
 The `small` model is optimized for efficiency and runs on the CPU, most CPUs should run the model efficiently. The `large` model is optimized for accuracy, an integrated or discrete GPU / NPU is required. See the [Hardware Accelerated Enrichments](/configuration/hardware_acceleration_enrichments.md) documentation.
diff --git a/docs/docs/configuration/license_plate_recognition.md b/docs/docs/configuration/license_plate_recognition.md
index ac7942675..76837efcb 100644
--- a/docs/docs/configuration/license_plate_recognition.md
+++ b/docs/docs/configuration/license_plate_recognition.md
@@ -30,7 +30,7 @@ In the default mode, Frigate's LPR needs to first detect a `car` or `motorcycle`
 
 ## Minimum System Requirements
 
-License plate recognition works by running AI models locally on your system. The YOLOv9 plate detector model and the OCR models ([PaddleOCR](https://github.com/PaddlePaddle/PaddleOCR)) are relatively lightweight and can run on your CPU or GPU, depending on your configuration. At least 4GB of RAM is required.
+License plate recognition works by running AI models locally on your system. The YOLOv9 plate detector model and the OCR models ([PaddleOCR](https://github.com/PaddlePaddle/PaddleOCR)) are relatively lightweight and can run on your CPU or GPU, depending on your configuration. At least 4GB of RAM and a CPU with AVX instructions are required.
 
 ## Configuration
diff --git a/docs/docs/configuration/semantic_search.md b/docs/docs/configuration/semantic_search.md
index 91f435ff0..5946af139 100644
--- a/docs/docs/configuration/semantic_search.md
+++ b/docs/docs/configuration/semantic_search.md
@@ -13,7 +13,7 @@ Semantic Search is accessed via the _Explore_ view in the Frigate UI.
 
 Semantic Search works by running a large AI model locally on your system. Small or underpowered systems like a Raspberry Pi will not run Semantic Search reliably or at all.
 
-A minimum of 8GB of RAM is required to use Semantic Search. A GPU is not strictly required but will provide a significant performance increase over CPU-only systems.
+A minimum of 8GB of RAM and a CPU with AVX instructions are required to use Semantic Search. A GPU is not strictly required but will provide a significant performance increase over CPU-only systems.
 
 For best performance, 16GB or more of RAM and a dedicated GPU are recommended.
diff --git a/docs/docs/frigate/hardware.md b/docs/docs/frigate/hardware.md
index 8fd972aa7..9bb321ecf 100644
--- a/docs/docs/frigate/hardware.md
+++ b/docs/docs/frigate/hardware.md
@@ -26,7 +26,9 @@ I may earn a small commission for my endorsement, recommendation, testimonial, o
 
 ## Server
 
-My current favorite is the Beelink EQ13 because of the efficient N100 CPU and dual NICs that allow you to setup a dedicated private network for your cameras where they can be blocked from accessing the internet. There are many used workstation options on eBay that work very well. Anything with an Intel CPU and capable of running Debian should work fine. As a bonus, you may want to look for devices with a M.2 or PCIe express slot that is compatible with the Google Coral, Hailo, or other AI accelerators.
+My current favorite is the Beelink EQ13 because of the efficient N100 CPU and dual NICs that allow you to set up a dedicated private network for your cameras where they can be blocked from accessing the internet.
+
+There are many used workstation options on eBay that work very well. Anything with an Intel CPU (with AVX instructions) capable of running Debian should work fine. As a bonus, you may want to look for devices with an M.2 or PCIe slot that is compatible with the Google Coral, Hailo, or other AI accelerators. Note that many of these mini PCs come with Windows pre-installed, and you will need to install Linux according to the [getting started guide](../guides/getting_started.md).
diff --git a/docs/docs/frigate/planning_setup.md b/docs/docs/frigate/planning_setup.md
index cddd50265..28d78e670 100644
--- a/docs/docs/frigate/planning_setup.md
+++ b/docs/docs/frigate/planning_setup.md
@@ -38,6 +38,9 @@ There are many different hardware options for object detection depending on prio
 
 Storage is an important consideration when planning a new installation. To get a more precise estimate of your storage requirements, you can use an IP camera storage calculator. Websites like [IPConfigure Storage Calculator](https://calculator.ipconfigure.com/) can help you determine the necessary disk space based on your camera settings.
 
+### CPU
+
+Frigate requires a CPU with AVX instructions. Most modern CPUs (post-2011) support AVX, but it is generally absent from low-power or budget-oriented processors, particularly older Intel Pentium, Celeron, and Atom-based chips. Specifically, Intel Celeron and Pentium models prior to the 2020 Tiger Lake generation typically lack AVX.
 
 #### SSDs (Solid State Drives)
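The AVX requirement these changes introduce can be checked on Linux by looking at the `flags` lines of `/proc/cpuinfo`. The sketch below uses a hard-coded sample flags string so the output is deterministic for illustration; on a real host you would grep `/proc/cpuinfo` directly, as shown in the trailing comment.

```shell
# Sample "flags" line as it might appear in /proc/cpuinfo on an
# AVX-capable CPU (illustrative data, not read from the live system):
flags="fpu vme de pse tsc msr sse sse2 ssse3 sse4_1 sse4_2 avx avx2"

# grep -w matches "avx" only as a whole word, so "avx2" alone would not count.
if printf '%s\n' "$flags" | grep -qw avx; then
  echo "AVX supported"
else
  echo "AVX not supported"
fi
# prints "AVX supported"

# On a real Linux host, check the live CPU instead:
#   grep -qw avx /proc/cpuinfo && echo "AVX supported" || echo "AVX not supported"
```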