From 984d654c408aa716771208aecd4bbedd9e3442fe Mon Sep 17 00:00:00 2001
From: Meow
Date: Mon, 23 Feb 2026 14:45:49 +0100
Subject: [PATCH 01/38] Update line breaks in video_pipeline.md diagram (#21919)

Mermaid compatible newlines (<br />)
---
 docs/docs/frigate/video_pipeline.md | 16 ++++++++--------
 1 file changed, 8 insertions(+), 8 deletions(-)

diff --git a/docs/docs/frigate/video_pipeline.md b/docs/docs/frigate/video_pipeline.md
index ba9365650..74b804b16 100644
--- a/docs/docs/frigate/video_pipeline.md
+++ b/docs/docs/frigate/video_pipeline.md
@@ -37,18 +37,18 @@ The following diagram adds a lot more detail than the simple view explained befo
 %%{init: {"themeVariables": {"edgeLabelBackground": "transparent"}}}%%
 flowchart TD
-    RecStore[(Recording\nstore)]
-    SnapStore[(Snapshot\nstore)]
+    RecStore[(Recording<br />store)]
+    SnapStore[(Snapshot<br />store)]
     subgraph Acquisition
         Cam["Camera"] -->|FFmpeg supported| Stream
-        Cam -->|"Other streaming\nprotocols"| go2rtc
+        Cam -->|"Other streaming<br />protocols"| go2rtc
         go2rtc("go2rtc") --> Stream
-        Stream[Capture main and\nsub streams] --> |detect stream|Decode(Decode and\ndownscale)
+        Stream[Capture main and<br />sub streams] --> |detect stream|Decode(Decode and<br />downscale)
     end
     subgraph Motion
-        Decode --> MotionM(Apply\nmotion masks)
-        MotionM --> MotionD(Motion\ndetection)
+        Decode --> MotionM(Apply<br />motion masks)
+        MotionM --> MotionD(Motion<br />detection)
     end
     subgraph Detection
         MotionD --> |motion regions| ObjectD(Object detection)
@@ -60,8 +60,8 @@ flowchart TD
         MotionD --> |motion event|Birdseye
         ObjectZ --> |object event|Birdseye
-        MotionD --> |"video segments\n(retain motion)"|RecStore
+        MotionD --> |"video segments<br />(retain motion)"|RecStore
         ObjectZ --> |detection clip|RecStore
-        Stream -->|"video segments\n(retain all)"| RecStore
+        Stream -->|"video segments<br />(retain all)"| RecStore
         ObjectZ --> |detection snapshot|SnapStore
 ```

From dd8282ff3c0a1ef8e6418f5ec0cb3a88cba199dc Mon Sep 17 00:00:00 2001
From: Bart Nagel
Date: Tue, 24 Feb 2026 06:38:04 -0800
Subject: [PATCH 02/38] Docs: fix YOLOv9 onnx export (#22107)
MIME-Version: 1.0
Content-Type: text/plain; charset=UTF-8
Content-Transfer-Encoding: 8bit

* Docs: fix missing dependency in YOLOv9 build script

I had this command fail because it didn't have cmake available. This change fixes that problem.

* Docs: avoid failure in YOLOv9 build script

Pinning to 0.4.36 avoids this error:

```
10.58 Downloading onnx
12.87 Building onnxsim==0.5.0
1029.4 × Failed to download and build `onnxsim==0.5.0`
1029.4 ╰─▶ Package metadata version `0.4.36` does not match given version `0.5.0`
1029.4 help: `onnxsim` (v0.5.0) was included because `onnx-simplifier` (v0.5.0)
1029.4 depends on `onnxsim`
```

* Update Dockerfile instructions for object detectors

---------

Co-authored-by: Nicolas Mowen
---
 docs/docs/configuration/object_detectors.md | 6 +++---
 1 file changed, 3 insertions(+), 3 deletions(-)

diff --git a/docs/docs/configuration/object_detectors.md b/docs/docs/configuration/object_detectors.md
index 7016bf4b6..5db813d29 100644
--- a/docs/docs/configuration/object_detectors.md
+++ b/docs/docs/configuration/object_detectors.md
@@ -1057,12 +1057,12 @@ YOLOv9 model can be exported as ONNX using the command below. You can copy and p
 ```sh
 docker build . --build-arg MODEL_SIZE=t --build-arg IMG_SIZE=320 --output . -f- <<'EOF'
 FROM python:3.11 AS build
-RUN apt-get update && apt-get install --no-install-recommends -y libgl1 && rm -rf /var/lib/apt/lists/*
-COPY --from=ghcr.io/astral-sh/uv:0.8.0 /uv /bin/
+RUN apt-get update && apt-get install --no-install-recommends -y cmake libgl1 && rm -rf /var/lib/apt/lists/*
+COPY --from=ghcr.io/astral-sh/uv:0.10.4 /uv /bin/
 WORKDIR /yolov9
 ADD https://github.com/WongKinYiu/yolov9.git .
 RUN uv pip install --system -r requirements.txt
-RUN uv pip install --system onnx==1.18.0 onnxruntime onnx-simplifier>=0.4.1 onnxscript
+RUN uv pip install --system onnx==1.18.0 onnxruntime onnx-simplifier==0.4.* onnxscript
 ARG MODEL_SIZE
 ARG IMG_SIZE
 ADD https://github.com/WongKinYiu/yolov9/releases/download/v0.1/yolov9-${MODEL_SIZE}-converted.pt yolov9-${MODEL_SIZE}.pt

From 96c70eee4ccccef7492cf10a5af2aa5686a10d32 Mon Sep 17 00:00:00 2001
From: Josh Hawkins <32435876+hawkeye217@users.noreply.github.com>
Date: Fri, 27 Feb 2026 17:07:07 -0600
Subject: [PATCH 03/38] fix link to coral yolov9 plus models (#22164)

---
 docs/docs/configuration/object_detectors.md | 2 +-
 1 file changed, 1 insertion(+), 1 deletion(-)

diff --git a/docs/docs/configuration/object_detectors.md b/docs/docs/configuration/object_detectors.md
index bc2bee59a..0a80ea463 100644
--- a/docs/docs/configuration/object_detectors.md
+++ b/docs/docs/configuration/object_detectors.md
@@ -161,7 +161,7 @@ YOLOv9 models that are compiled for TensorFlow Lite and properly quantized are s
 :::tip
 
-**Frigate+ Users:** Follow the [instructions](../integrations/plus#use-models) to set a model ID in your config file.
+**Frigate+ Users:** Follow the [instructions](/integrations/plus#use-models) to set a model ID in your config file.
 :::

From e064024a312bc5e1b22dd0df121ba57a3ecdd6ce Mon Sep 17 00:00:00 2001
From: Josh Hawkins <32435876+hawkeye217@users.noreply.github.com>
Date: Fri, 27 Feb 2026 21:02:19 -0600
Subject: [PATCH 04/38] Fix go2rtc stream alias auth (#22097)

* Fix go2rtc stream alias authorization and live audio gating for main/sub stream names

* revert

* add tests
---
 frigate/api/auth.py                            |  66 +++++-
 frigate/api/camera.py                          |  23 +-
 .../test/http_api/test_http_camera_access.py   | 204 ++++++++++++++++++
 web/src/hooks/use-camera-live-mode.ts          |  46 ++--
 4 files changed, 317 insertions(+), 22 deletions(-)

diff --git a/frigate/api/auth.py b/frigate/api/auth.py
index e0a6ec924..7c3a231ed 100644
--- a/frigate/api/auth.py
+++ b/frigate/api/auth.py
@@ -986,7 +986,16 @@ async def require_camera_access(
     current_user = await get_current_user(request)
 
     if isinstance(current_user, JSONResponse):
-        return current_user
+        detail = "Authentication required"
+        try:
+            error_payload = json.loads(current_user.body)
+            detail = (
+                error_payload.get("message") or error_payload.get("detail") or detail
+            )
+        except Exception:
+            pass
+
+        raise HTTPException(status_code=current_user.status_code, detail=detail)
 
     role = current_user["role"]
     all_camera_names = set(request.app.frigate_config.cameras.keys())
@@ -1004,6 +1013,61 @@
     )
 
 
+def _get_stream_owner_cameras(request: Request, stream_name: str) -> set[str]:
+    owner_cameras: set[str] = set()
+
+    for camera_name, camera in request.app.frigate_config.cameras.items():
+        if stream_name == camera_name:
+            owner_cameras.add(camera_name)
+            continue
+
+        if stream_name in camera.live.streams.values():
+            owner_cameras.add(camera_name)
+
+    return owner_cameras
+
+
+async def require_go2rtc_stream_access(
+    stream_name: Optional[str] = None,
+    request: Request = None,
+):
+    """Dependency to enforce go2rtc stream access based on owning camera access."""
+    if stream_name is None:
+        return
+
+    current_user = await get_current_user(request)
+    if
isinstance(current_user, JSONResponse): + detail = "Authentication required" + try: + error_payload = json.loads(current_user.body) + detail = ( + error_payload.get("message") or error_payload.get("detail") or detail + ) + except Exception: + pass + + raise HTTPException(status_code=current_user.status_code, detail=detail) + + role = current_user["role"] + all_camera_names = set(request.app.frigate_config.cameras.keys()) + roles_dict = request.app.frigate_config.auth.roles + allowed_cameras = User.get_allowed_cameras(role, roles_dict, all_camera_names) + + # Admin or full access bypasses + if role == "admin" or not roles_dict.get(role): + return + + owner_cameras = _get_stream_owner_cameras(request, stream_name) + + if owner_cameras & set(allowed_cameras): + return + + raise HTTPException( + status_code=403, + detail=f"Access denied to camera '{stream_name}'. Allowed: {allowed_cameras}", + ) + + async def get_allowed_cameras_for_filter(request: Request): """Dependency to get allowed_cameras for filtering lists.""" current_user = await get_current_user(request) diff --git a/frigate/api/camera.py b/frigate/api/camera.py index 488ec1e1f..a94486d8c 100644 --- a/frigate/api/camera.py +++ b/frigate/api/camera.py @@ -17,7 +17,7 @@ from zeep.transports import AsyncTransport from frigate.api.auth import ( allow_any_authenticated, - require_camera_access, + require_go2rtc_stream_access, require_role, ) from frigate.api.defs.tags import Tags @@ -71,14 +71,27 @@ def go2rtc_streams(): @router.get( - "/go2rtc/streams/{camera_name}", dependencies=[Depends(require_camera_access)] + "/go2rtc/streams/{stream_name}", + dependencies=[Depends(require_go2rtc_stream_access)], ) -def go2rtc_camera_stream(request: Request, camera_name: str): +def go2rtc_camera_stream(request: Request, stream_name: str): r = requests.get( - f"http://127.0.0.1:1984/api/streams?src={camera_name}&video=all&audio=allµphone" + "http://127.0.0.1:1984/api/streams", + params={ + "src": stream_name, + "video": 
"all", + "audio": "all", + "microphone": "", + }, ) if not r.ok: - camera_config = request.app.frigate_config.cameras.get(camera_name) + camera_config = request.app.frigate_config.cameras.get(stream_name) + + if camera_config is None: + for camera_name, camera in request.app.frigate_config.cameras.items(): + if stream_name in camera.live.streams.values(): + camera_config = request.app.frigate_config.cameras.get(camera_name) + break if camera_config and camera_config.enabled: logger.error("Failed to fetch streams from go2rtc") diff --git a/frigate/test/http_api/test_http_camera_access.py b/frigate/test/http_api/test_http_camera_access.py index 5cd115417..211c84bb4 100644 --- a/frigate/test/http_api/test_http_camera_access.py +++ b/frigate/test/http_api/test_http_camera_access.py @@ -1,6 +1,7 @@ from unittest.mock import patch from fastapi import HTTPException, Request +from fastapi.testclient import TestClient from frigate.api.auth import ( get_allowed_cameras_for_filter, @@ -9,6 +10,33 @@ from frigate.api.auth import ( from frigate.models import Event, Recordings, ReviewSegment from frigate.test.http_api.base_http_test import AuthTestClient, BaseTestHttp +# Minimal multi-camera config used by go2rtc stream access tests. +# front_door has a stream alias "front_door_main"; back_door uses its own name. +# The "limited_user" role is restricted to front_door only. 
+_MULTI_CAMERA_CONFIG = { + "mqtt": {"host": "mqtt"}, + "auth": { + "roles": { + "limited_user": ["front_door"], + } + }, + "cameras": { + "front_door": { + "ffmpeg": { + "inputs": [{"path": "rtsp://10.0.0.1:554/video", "roles": ["detect"]}] + }, + "detect": {"height": 1080, "width": 1920, "fps": 5}, + "live": {"streams": {"default": "front_door_main"}}, + }, + "back_door": { + "ffmpeg": { + "inputs": [{"path": "rtsp://10.0.0.2:554/video", "roles": ["detect"]}] + }, + "detect": {"height": 1080, "width": 1920, "fps": 5}, + }, + }, +} + class TestCameraAccessEventReview(BaseTestHttp): def setUp(self): @@ -190,3 +218,179 @@ class TestCameraAccessEventReview(BaseTestHttp): resp = client.get("/events/summary") summary_list = resp.json() assert len(summary_list) == 2 + + +class TestGo2rtcStreamAccess(BaseTestHttp): + """Tests for require_go2rtc_stream_access — the auth dependency on + GET /go2rtc/streams/{stream_name}. + + go2rtc is not running in unit tests, so an authorized request returns + 500 (the proxy call fails), while an unauthorized request returns 401/403 + before the proxy is ever reached. + """ + + def _make_app(self, config_override: dict | None = None): + """Build a test app, optionally replacing self.minimal_config.""" + if config_override is not None: + self.minimal_config = config_override + app = super().create_app() + + # Allow tests to control the current user via request headers. 
+ async def mock_get_current_user(request: Request): + username = request.headers.get("remote-user") + role = request.headers.get("remote-role") + if not username or not role: + from fastapi.responses import JSONResponse + + return JSONResponse( + content={"message": "No authorization headers."}, + status_code=401, + ) + return {"username": username, "role": role} + + app.dependency_overrides[get_current_user] = mock_get_current_user + return app + + def setUp(self): + super().setUp([Event, ReviewSegment, Recordings]) + + def tearDown(self): + super().tearDown() + + # ------------------------------------------------------------------ + # Helpers + # ------------------------------------------------------------------ + + def _get_stream( + self, app, stream_name: str, role: str = "admin", user: str = "test" + ): + """Issue GET /go2rtc/streams/{stream_name} with the given role.""" + with AuthTestClient(app) as client: + return client.get( + f"/go2rtc/streams/{stream_name}", + headers={"remote-user": user, "remote-role": role}, + ) + + # ------------------------------------------------------------------ + # Tests + # ------------------------------------------------------------------ + + def test_admin_can_access_any_stream(self): + """Admin role bypasses camera restrictions.""" + app = self._make_app(_MULTI_CAMERA_CONFIG) + # front_door stream — go2rtc is not running so expect 500, not 401/403 + resp = self._get_stream(app, "front_door", role="admin") + assert resp.status_code not in (401, 403), ( + f"Admin should not be blocked; got {resp.status_code}" + ) + + # back_door stream + resp = self._get_stream(app, "back_door", role="admin") + assert resp.status_code not in (401, 403) + + def test_missing_auth_headers_returns_401(self): + """Requests without auth headers must be rejected with 401.""" + app = self._make_app(_MULTI_CAMERA_CONFIG) + # Use plain TestClient (not AuthTestClient) so no headers are injected. 
+ with TestClient(app, raise_server_exceptions=False) as client: + resp = client.get("/go2rtc/streams/front_door") + assert resp.status_code == 401, f"Expected 401, got {resp.status_code}" + + def test_unconfigured_role_can_access_any_stream(self): + """When no camera restrictions are configured for a role the user + should have access to all streams (no roles_dict entry ⇒ no restriction).""" + no_roles_config = { + "mqtt": {"host": "mqtt"}, + "cameras": { + "front_door": { + "ffmpeg": { + "inputs": [ + {"path": "rtsp://10.0.0.1:554/video", "roles": ["detect"]} + ] + }, + "detect": {"height": 1080, "width": 1920, "fps": 5}, + }, + "back_door": { + "ffmpeg": { + "inputs": [ + {"path": "rtsp://10.0.0.2:554/video", "roles": ["detect"]} + ] + }, + "detect": {"height": 1080, "width": 1920, "fps": 5}, + }, + }, + } + app = self._make_app(no_roles_config) + + # "myuser" role is not listed in roles_dict — should be allowed everywhere + for stream in ("front_door", "back_door"): + resp = self._get_stream(app, stream, role="myuser") + assert resp.status_code not in (401, 403), ( + f"Unconfigured role should not be blocked on '{stream}'; " + f"got {resp.status_code}" + ) + + def test_restricted_role_can_access_allowed_camera(self): + """limited_user role (restricted to front_door) can access front_door stream.""" + app = self._make_app(_MULTI_CAMERA_CONFIG) + resp = self._get_stream(app, "front_door", role="limited_user") + assert resp.status_code not in (401, 403), ( + f"limited_user should be allowed on front_door; got {resp.status_code}" + ) + + def test_restricted_role_blocked_from_disallowed_camera(self): + """limited_user role (restricted to front_door) cannot access back_door stream.""" + app = self._make_app(_MULTI_CAMERA_CONFIG) + resp = self._get_stream(app, "back_door", role="limited_user") + assert resp.status_code == 403, ( + f"limited_user should be denied on back_door; got {resp.status_code}" + ) + + def test_stream_alias_allowed_for_owning_camera(self): + 
"""Stream alias 'front_door_main' is owned by front_door; limited_user (who + is allowed front_door) should be permitted.""" + app = self._make_app(_MULTI_CAMERA_CONFIG) + # front_door_main is the alias defined in live.streams for front_door + resp = self._get_stream(app, "front_door_main", role="limited_user") + assert resp.status_code not in (401, 403), ( + f"limited_user should be allowed on alias front_door_main; " + f"got {resp.status_code}" + ) + + def test_stream_alias_blocked_when_owning_camera_disallowed(self): + """limited_user cannot access a stream alias that belongs to a camera they + are not allowed to see.""" + # Give back_door a stream alias and restrict limited_user to front_door only + config = { + "mqtt": {"host": "mqtt"}, + "auth": { + "roles": { + "limited_user": ["front_door"], + } + }, + "cameras": { + "front_door": { + "ffmpeg": { + "inputs": [ + {"path": "rtsp://10.0.0.1:554/video", "roles": ["detect"]} + ] + }, + "detect": {"height": 1080, "width": 1920, "fps": 5}, + }, + "back_door": { + "ffmpeg": { + "inputs": [ + {"path": "rtsp://10.0.0.2:554/video", "roles": ["detect"]} + ] + }, + "detect": {"height": 1080, "width": 1920, "fps": 5}, + "live": {"streams": {"default": "back_door_main"}}, + }, + }, + } + app = self._make_app(config) + resp = self._get_stream(app, "back_door_main", role="limited_user") + assert resp.status_code == 403, ( + f"limited_user should be denied on alias back_door_main; " + f"got {resp.status_code}" + ) diff --git a/web/src/hooks/use-camera-live-mode.ts b/web/src/hooks/use-camera-live-mode.ts index 288c0ea09..5264d1a34 100644 --- a/web/src/hooks/use-camera-live-mode.ts +++ b/web/src/hooks/use-camera-live-mode.ts @@ -18,18 +18,25 @@ export default function useCameraLiveMode( const streamNames = new Set(); cameras.forEach((camera) => { - const isRestreamed = Object.keys(config.go2rtc.streams || {}).includes( - Object.values(camera.live.streams)[0], - ); + if (activeStreams && activeStreams[camera.name]) { + const 
selectedStreamName = activeStreams[camera.name]; + const isRestreamed = Object.keys(config.go2rtc.streams || {}).includes( + selectedStreamName, + ); - if (isRestreamed) { - if (activeStreams && activeStreams[camera.name]) { - streamNames.add(activeStreams[camera.name]); - } else { - Object.values(camera.live.streams).forEach((streamName) => { - streamNames.add(streamName); - }); + if (isRestreamed) { + streamNames.add(selectedStreamName); } + } else { + Object.values(camera.live.streams).forEach((streamName) => { + const isRestreamed = Object.keys( + config.go2rtc.streams || {}, + ).includes(streamName); + + if (isRestreamed) { + streamNames.add(streamName); + } + }); } }); @@ -66,11 +73,11 @@ export default function useCameraLiveMode( } = {}; cameras.forEach((camera) => { + const selectedStreamName = + activeStreams?.[camera.name] ?? Object.values(camera.live.streams)[0]; const isRestreamed = config && - Object.keys(config.go2rtc.streams || {}).includes( - Object.values(camera.live.streams)[0], - ); + Object.keys(config.go2rtc.streams || {}).includes(selectedStreamName); newIsRestreamedStates[camera.name] = isRestreamed ?? false; @@ -101,14 +108,21 @@ export default function useCameraLiveMode( setPreferredLiveModes(newPreferredLiveModes); setIsRestreamedStates(newIsRestreamedStates); setSupportsAudioOutputStates(newSupportsAudioOutputStates); - }, [cameras, config, windowVisible, streamMetadata]); + }, [activeStreams, cameras, config, windowVisible, streamMetadata]); const resetPreferredLiveMode = useCallback( (cameraName: string) => { const mseSupported = "MediaSource" in window || "ManagedMediaSource" in window; + const cameraConfig = cameras.find((camera) => camera.name === cameraName); + const selectedStreamName = + activeStreams?.[cameraName] ?? + (cameraConfig + ? 
Object.values(cameraConfig.live.streams)[0] + : cameraName); const isRestreamed = - config && Object.keys(config.go2rtc.streams || {}).includes(cameraName); + config && + Object.keys(config.go2rtc.streams || {}).includes(selectedStreamName); setPreferredLiveModes((prevModes) => { const newModes = { ...prevModes }; @@ -122,7 +136,7 @@ export default function useCameraLiveMode( return newModes; }); }, - [config], + [activeStreams, cameras, config], ); return { From c687aa51195f3d89295d626bcf325c61c54595c0 Mon Sep 17 00:00:00 2001 From: Josh Hawkins <32435876+hawkeye217@users.noreply.github.com> Date: Fri, 27 Feb 2026 21:02:46 -0600 Subject: [PATCH 05/38] Birdseye fixes (#22166) * permit birdseye access if user has viewer role or a custom viewer role that has access to all cameras * bump version --- Makefile | 2 +- .../components/filter/CameraGroupSelector.tsx | 6 ++--- web/src/hooks/use-has-full-camera-access.ts | 26 +++++++++++++++++++ web/src/pages/Live.tsx | 14 +++++----- 4 files changed, 38 insertions(+), 10 deletions(-) create mode 100644 web/src/hooks/use-has-full-camera-access.ts diff --git a/Makefile b/Makefile index d1427b6df..51f12f972 100644 --- a/Makefile +++ b/Makefile @@ -1,7 +1,7 @@ default_target: local COMMIT_HASH := $(shell git log -1 --pretty=format:"%h"|tail -1) -VERSION = 0.17.0 +VERSION = 0.17.1 IMAGE_REPO ?= ghcr.io/blakeblackshear/frigate GITHUB_REF_NAME ?= $(shell git rev-parse --abbrev-ref HEAD) BOARDS= #Initialized empty diff --git a/web/src/components/filter/CameraGroupSelector.tsx b/web/src/components/filter/CameraGroupSelector.tsx index 14845fdb8..6bd7dfbbf 100644 --- a/web/src/components/filter/CameraGroupSelector.tsx +++ b/web/src/components/filter/CameraGroupSelector.tsx @@ -77,6 +77,7 @@ import { useStreamingSettings } from "@/context/streaming-settings-provider"; import { Trans, useTranslation } from "react-i18next"; import { CameraNameLabel } from "../camera/FriendlyNameLabel"; import { useAllowedCameras } from 
"@/hooks/use-allowed-cameras"; +import { useHasFullCameraAccess } from "@/hooks/use-has-full-camera-access"; import { useIsAdmin } from "@/hooks/use-is-admin"; import { useUserPersistedOverlayState } from "@/hooks/use-overlay-state"; @@ -677,7 +678,7 @@ export function CameraGroupEdit({ ); const allowedCameras = useAllowedCameras(); - const isAdmin = useIsAdmin(); + const hasFullCameraAccess = useHasFullCameraAccess(); const [openCamera, setOpenCamera] = useState(); @@ -866,8 +867,7 @@ export function CameraGroupEdit({ {t("group.cameras.desc")} {[ - ...(birdseyeConfig?.enabled && - (isAdmin || "birdseye" in allowedCameras) + ...(birdseyeConfig?.enabled && hasFullCameraAccess ? ["birdseye"] : []), ...Object.keys(config?.cameras ?? {}) diff --git a/web/src/hooks/use-has-full-camera-access.ts b/web/src/hooks/use-has-full-camera-access.ts new file mode 100644 index 000000000..8e7d74501 --- /dev/null +++ b/web/src/hooks/use-has-full-camera-access.ts @@ -0,0 +1,26 @@ +import { useAllowedCameras } from "@/hooks/use-allowed-cameras"; +import useSWR from "swr"; +import { FrigateConfig } from "@/types/frigateConfig"; + +/** + * Returns true if the current user has access to all cameras. + * This is used to determine birdseye access — users who can see + * all cameras should also be able to see the birdseye view. 
+ */ +export function useHasFullCameraAccess() { + const allowedCameras = useAllowedCameras(); + const { data: config } = useSWR("config", { + revalidateOnFocus: false, + }); + + if (!config?.cameras) return false; + + const enabledCameraNames = Object.entries(config.cameras) + .filter(([, cam]) => cam.enabled_in_config) + .map(([name]) => name); + + return ( + enabledCameraNames.length > 0 && + enabledCameraNames.every((name) => allowedCameras.includes(name)) + ); +} diff --git a/web/src/pages/Live.tsx b/web/src/pages/Live.tsx index 1b4bfb33a..e1a4f4868 100644 --- a/web/src/pages/Live.tsx +++ b/web/src/pages/Live.tsx @@ -11,12 +11,12 @@ import { useTranslation } from "react-i18next"; import { useEffect, useMemo, useRef } from "react"; import useSWR from "swr"; import { useAllowedCameras } from "@/hooks/use-allowed-cameras"; -import { useIsAdmin } from "@/hooks/use-is-admin"; +import { useHasFullCameraAccess } from "@/hooks/use-has-full-camera-access"; function Live() { const { t } = useTranslation(["views/live"]); const { data: config } = useSWR("config"); - const isAdmin = useIsAdmin(); + const hasFullCameraAccess = useHasFullCameraAccess(); // selection @@ -90,8 +90,8 @@ function Live() { const allowedCameras = useAllowedCameras(); const includesBirdseye = useMemo(() => { - // Restricted users should never have access to birdseye - if (!isAdmin) { + // Users without access to all cameras should not have access to birdseye + if (!hasFullCameraAccess) { return false; } @@ -106,7 +106,7 @@ function Live() { } else { return false; } - }, [config, cameraGroup, isAdmin]); + }, [config, cameraGroup, hasFullCameraAccess]); const cameras = useMemo(() => { if (!config) { @@ -151,7 +151,9 @@ function Live() { return (
- {selectedCameraName === "birdseye" ? ( + {selectedCameraName === "birdseye" && + hasFullCameraAccess && + config?.birdseye?.enabled ? ( Date: Sun, 1 Mar 2026 19:10:28 -0700 Subject: [PATCH 06/38] Fix genai (#22203) * fix genai leak * Add fix for value error in embedding * Cleanup --- .../post/object_descriptions.py | 25 ++++++++++++++----- frigate/embeddings/maintainer.py | 5 +++- 2 files changed, 23 insertions(+), 7 deletions(-) diff --git a/frigate/data_processing/post/object_descriptions.py b/frigate/data_processing/post/object_descriptions.py index cdb5f4fc3..266ede316 100644 --- a/frigate/data_processing/post/object_descriptions.py +++ b/frigate/data_processing/post/object_descriptions.py @@ -103,16 +103,19 @@ class ObjectDescriptionProcessor(PostProcessorApi): logger.debug(f"{camera} sending early request to GenAI") self.early_request_sent[data["id"]] = True + # Copy thumbnails to avoid holding references after cleanup + thumbnails_copy = [ + data["thumbnail"][:] if data.get("thumbnail") else None + for data in self.tracked_events[data["id"]] + if data.get("thumbnail") + ] threading.Thread( target=self._genai_embed_description, name=f"_genai_embed_description_{event.id}", daemon=True, args=( event, - [ - data["thumbnail"] - for data in self.tracked_events[data["id"]] - ], + thumbnails_copy, ), ).start() @@ -172,8 +175,13 @@ class ObjectDescriptionProcessor(PostProcessorApi): embed_image = ( [snapshot_image] if event.has_snapshot and source == "snapshot" + # Copy thumbnails to avoid holding references else ( - [data["thumbnail"] for data in self.tracked_events[event_id]] + [ + data["thumbnail"][:] if data.get("thumbnail") else None + for data in self.tracked_events[event_id] + if data.get("thumbnail") + ] if len(self.tracked_events.get(event_id, [])) > 0 else [thumbnail] ) @@ -276,8 +284,13 @@ class ObjectDescriptionProcessor(PostProcessorApi): embed_image = ( [snapshot_image] if event.has_snapshot and camera_config.objects.genai.use_snapshot + # Copy 
thumbnails to avoid holding references after cleanup else ( - [data["thumbnail"] for data in self.tracked_events[event.id]] + [ + data["thumbnail"][:] if data.get("thumbnail") else None + for data in self.tracked_events[event.id] + if data.get("thumbnail") + ] if num_thumbnails > 0 else [thumbnail] ) diff --git a/frigate/embeddings/maintainer.py b/frigate/embeddings/maintainer.py index bd707de15..b632951d9 100644 --- a/frigate/embeddings/maintainer.py +++ b/frigate/embeddings/maintainer.py @@ -679,4 +679,7 @@ class EmbeddingMaintainer(threading.Thread): if not self.config.semantic_search.enabled: return - self.embeddings.embed_thumbnail(event_id, thumbnail) + try: + self.embeddings.embed_thumbnail(event_id, thumbnail) + except ValueError: + logger.warning(f"Failed to embed thumbnail for event {event_id}") From 0dd1e94d60ed66daf312f08e07aaf7aefac9efad Mon Sep 17 00:00:00 2001 From: Josh Hawkins <32435876+hawkeye217@users.noreply.github.com> Date: Mon, 2 Mar 2026 21:18:33 -0600 Subject: [PATCH 07/38] update docs for avx cpu system requirements (#22222) --- .../custom_classification/object_classification.md | 2 +- .../custom_classification/state_classification.md | 2 +- docs/docs/configuration/face_recognition.md | 2 ++ docs/docs/configuration/license_plate_recognition.md | 2 +- docs/docs/configuration/semantic_search.md | 2 +- docs/docs/frigate/hardware.md | 2 +- docs/docs/frigate/planning_setup.md | 3 +++ 7 files changed, 10 insertions(+), 5 deletions(-) diff --git a/docs/docs/configuration/custom_classification/object_classification.md b/docs/docs/configuration/custom_classification/object_classification.md index ac0b9387a..713dcf998 100644 --- a/docs/docs/configuration/custom_classification/object_classification.md +++ b/docs/docs/configuration/custom_classification/object_classification.md @@ -7,7 +7,7 @@ Object classification allows you to train a custom MobileNetV2 classification mo ## Minimum System Requirements -Object classification models are lightweight 
and run very fast on CPU. Inference should be usable on virtually any machine that can run Frigate. +Object classification models are lightweight and run very fast on CPU. Training the model does briefly use a high amount of system resources for about 1–3 minutes per training run. On lower-power devices, training may take longer. diff --git a/docs/docs/configuration/custom_classification/state_classification.md b/docs/docs/configuration/custom_classification/state_classification.md index 1ffdf9011..53310e4c6 100644 --- a/docs/docs/configuration/custom_classification/state_classification.md +++ b/docs/docs/configuration/custom_classification/state_classification.md @@ -7,7 +7,7 @@ State classification allows you to train a custom MobileNetV2 classification mod ## Minimum System Requirements -State classification models are lightweight and run very fast on CPU. Inference should be usable on virtually any machine that can run Frigate. +State classification models are lightweight and run very fast on CPU. Training the model does briefly use a high amount of system resources for about 1–3 minutes per training run. On lower-power devices, training may take longer. diff --git a/docs/docs/configuration/face_recognition.md b/docs/docs/configuration/face_recognition.md index 713671a16..c13a1047d 100644 --- a/docs/docs/configuration/face_recognition.md +++ b/docs/docs/configuration/face_recognition.md @@ -32,6 +32,8 @@ All of these features run locally on your system. ## Minimum System Requirements + A CPU with AVX instructions is required to run Face Recognition. + The `small` model is optimized for efficiency and runs on the CPU, most CPUs should run the model efficiently. The `large` model is optimized for accuracy, an integrated or discrete GPU / NPU is required. See the [Hardware Accelerated Enrichments](/configuration/hardware_acceleration_enrichments.md) documentation. 
diff --git a/docs/docs/configuration/license_plate_recognition.md b/docs/docs/configuration/license_plate_recognition.md index ac7942675..76837efcb 100644 --- a/docs/docs/configuration/license_plate_recognition.md +++ b/docs/docs/configuration/license_plate_recognition.md @@ -30,7 +30,7 @@ In the default mode, Frigate's LPR needs to first detect a `car` or `motorcycle` ## Minimum System Requirements -License plate recognition works by running AI models locally on your system. The YOLOv9 plate detector model and the OCR models ([PaddleOCR](https://github.com/PaddlePaddle/PaddleOCR)) are relatively lightweight and can run on your CPU or GPU, depending on your configuration. At least 4GB of RAM is required. +License plate recognition works by running AI models locally on your system. The YOLOv9 plate detector model and the OCR models ([PaddleOCR](https://github.com/PaddlePaddle/PaddleOCR)) are relatively lightweight and can run on your CPU or GPU, depending on your configuration. At least 4GB of RAM and a CPU with AVX instructions is required. ## Configuration diff --git a/docs/docs/configuration/semantic_search.md b/docs/docs/configuration/semantic_search.md index 91f435ff0..5946af139 100644 --- a/docs/docs/configuration/semantic_search.md +++ b/docs/docs/configuration/semantic_search.md @@ -13,7 +13,7 @@ Semantic Search is accessed via the _Explore_ view in the Frigate UI. Semantic Search works by running a large AI model locally on your system. Small or underpowered systems like a Raspberry Pi will not run Semantic Search reliably or at all. -A minimum of 8GB of RAM is required to use Semantic Search. A GPU is not strictly required but will provide a significant performance increase over CPU-only systems. +A minimum of 8GB of RAM is required to use Semantic Search. A CPU with AVX instructions is required to run Semantic Search. A GPU is not strictly required but will provide a significant performance increase over CPU-only systems. 
For best performance, 16GB or more of RAM and a dedicated GPU are recommended.
diff --git a/docs/docs/frigate/hardware.md b/docs/docs/frigate/hardware.md
index 8fd972aa7..9bb321ecf 100644
--- a/docs/docs/frigate/hardware.md
+++ b/docs/docs/frigate/hardware.md
@@ -26,7 +26,7 @@ I may earn a small commission for my endorsement, recommendation, testimonial, o
 
 ## Server
 
-My current favorite is the Beelink EQ13 because of the efficient N100 CPU and dual NICs that allow you to setup a dedicated private network for your cameras where they can be blocked from accessing the internet. There are many used workstation options on eBay that work very well. Anything with an Intel CPU and capable of running Debian should work fine. As a bonus, you may want to look for devices with a M.2 or PCIe express slot that is compatible with the Google Coral, Hailo, or other AI accelerators.
+My current favorite is the Beelink EQ13 because of the efficient N100 CPU and dual NICs that allow you to setup a dedicated private network for your cameras where they can be blocked from accessing the internet. There are many used workstation options on eBay that work very well. Anything with an Intel CPU (with AVX instructions) and capable of running Debian should work fine. As a bonus, you may want to look for devices with a M.2 or PCIe express slot that is compatible with the Google Coral, Hailo, or other AI accelerators.
 
 Note that many of these mini PCs come with Windows pre-installed, and you will need to install Linux according to the [getting started guide](../guides/getting_started.md).
diff --git a/docs/docs/frigate/planning_setup.md b/docs/docs/frigate/planning_setup.md
index cddd50265..28d78e670 100644
--- a/docs/docs/frigate/planning_setup.md
+++ b/docs/docs/frigate/planning_setup.md
@@ -38,6 +38,9 @@ There are many different hardware options for object detection depending on prio
 
 Storage is an important consideration when planning a new installation. To get a more precise estimate of your storage requirements, you can use an IP camera storage calculator. Websites like [IPConfigure Storage Calculator](https://calculator.ipconfigure.com/) can help you determine the necessary disk space based on your camera settings.
 
+### CPU
+
+Frigate requires a CPU with AVX instructions. Most modern CPUs (post-2011) support AVX, but it is generally absent in low-power or budget-oriented processors, particularly older Intel Pentium, Celeron, and Atom-based chips. Specifically, Intel Celeron and Pentium models prior to the 2020 Tiger Lake generation typically lack AVX.
 
 #### SSDs (Solid State Drives)
 
From d311974949cf3acb04d61bce89bf6d72e608ea51 Mon Sep 17 00:00:00 2001
From: Josh Hawkins <32435876+hawkeye217@users.noreply.github.com>
Date: Wed, 4 Mar 2026 08:04:24 -0600
Subject: [PATCH 08/38] fix menu display conditions (#22237)

users without frigate+ enabled would not have the ability to create an object mask from the 3-dots menu in tracking details
---
 web/src/components/overlay/detail/TrackingDetails.tsx | 2 +-
 1 file changed, 1 insertion(+), 1 deletion(-)

diff --git a/web/src/components/overlay/detail/TrackingDetails.tsx b/web/src/components/overlay/detail/TrackingDetails.tsx
index 03b6dd6c4..844775d21 100644
--- a/web/src/components/overlay/detail/TrackingDetails.tsx
+++ b/web/src/components/overlay/detail/TrackingDetails.tsx
@@ -1010,7 +1010,7 @@ function LifecycleIconRow({
{formattedEventTimestamp}
- {isAdmin && config?.plus?.enabled && item.data.box && (
+ {isAdmin && (config?.plus?.enabled || item.data.box) && (
From 8c67704ffbf7ad244e8d03d87cf0fc4c6bfd2477 Mon Sep 17 00:00:00 2001 From: GuoQing Liu <842607283@qq.com> Date: Wed, 4 Mar 2026 22:27:31 +0800 Subject: [PATCH 09/38] docs: updated the guides detectors section (#22241) --- docs/docs/guides/getting_started.md | 43 ++++++++++++++++++++++++++++- 1 file changed, 42 insertions(+), 1 deletion(-) diff --git a/docs/docs/guides/getting_started.md b/docs/docs/guides/getting_started.md index f0f2f0f98..efc9edb42 100644 --- a/docs/docs/guides/getting_started.md +++ b/docs/docs/guides/getting_started.md @@ -174,8 +174,47 @@ cameras: ### Step 4: Configure detectors -By default, Frigate will use a single CPU detector. If you have a USB Coral, you will need to add a detectors section to your config. +By default, Frigate will use a single CPU detector. +In many cases, the integrated graphics on Intel CPUs provides sufficient performance for typical Frigate setups. If you have an Intel processor, you can follow the configuration below. + +
+ Use Intel OpenVINO detector + +You need to refer to **Configure hardware acceleration** above to enable the container to use the GPU. + +```yaml +mqtt: ... + +detectors: # <---- add detectors + ov: + type: openvino # <---- use openvino detector + device: GPU + +# We will use the default MobileNet_v2 model from OpenVINO. +model: + width: 300 + height: 300 + input_tensor: nhwc + input_pixel_format: bgr + path: /openvino-model/ssdlite_mobilenet_v2.xml + labelmap_path: /openvino-model/coco_91cl_bkgr.txt + +cameras: + name_of_your_camera: + ffmpeg: ... + detect: + enabled: True # <---- turn on detection + ... +``` + +
+ +If you have a USB Coral, you will need to add a detectors section to your config. + +
+ Use USB Coral detector + `docker-compose.yml` (after modifying, you will need to run `docker compose up -d` to apply changes) ```yaml @@ -204,6 +243,8 @@ cameras: ... ``` +
+ More details on available detectors can be found [here](../configuration/object_detectors.md). Restart Frigate and you should start seeing detections for `person`. If you want to track other objects, they will need to be added according to the [configuration file reference](../configuration/reference.md). From c338533c83ce55a4ceb36e70835e53083e51e53e Mon Sep 17 00:00:00 2001 From: Josh Hawkins <32435876+hawkeye217@users.noreply.github.com> Date: Wed, 4 Mar 2026 08:32:49 -0600 Subject: [PATCH 10/38] fix ordering of points in planning setup docs (#22251) --- docs/docs/frigate/planning_setup.md | 10 +++++----- 1 file changed, 5 insertions(+), 5 deletions(-) diff --git a/docs/docs/frigate/planning_setup.md b/docs/docs/frigate/planning_setup.md index 28d78e670..b7dbd604b 100644 --- a/docs/docs/frigate/planning_setup.md +++ b/docs/docs/frigate/planning_setup.md @@ -34,14 +34,14 @@ For commercial installations it is important to verify the number of supported c There are many different hardware options for object detection depending on priorities and available hardware. See [the recommended hardware page](./hardware.md#detectors) for more specifics on what hardware is recommended for object detection. -### Storage - -Storage is an important consideration when planning a new installation. To get a more precise estimate of your storage requirements, you can use an IP camera storage calculator. Websites like [IPConfigure Storage Calculator](https://calculator.ipconfigure.com/) can help you determine the necessary disk space based on your camera settings. - ### CPU Frigate requires a CPU with AVX instructions. Most modern CPUs (post-2011) support AVX, but it is generally absent in low-power or budget-oriented processors, particularly older Intel Pentium, Celeron, and Atom-based chips. Specifically, Intel Celeron and Pentium models prior to the 2020 Tiger Lake generation typically lack AVX. 
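The AVX requirement described above can be verified from a shell on the target machine before installing Frigate. This is a generic Linux check (reading `/proc/cpuinfo`), not a Frigate-specific command:

```shell
#!/bin/sh
# Check whether the CPU advertises the AVX instruction set on Linux.
# /proc/cpuinfo lists a "flags" line per core; AVX-capable CPUs include "avx".
if grep -q -m1 -w avx /proc/cpuinfo 2>/dev/null; then
  echo "AVX supported"
else
  echo "AVX not supported"
fi
```

On CPUs that lack AVX (for example, pre-Tiger Lake Celeron/Pentium chips mentioned above), this prints `AVX not supported` and Frigate's bundled libraries will not run.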
+### Storage
+
+Storage is an important consideration when planning a new installation. To get a more precise estimate of your storage requirements, you can use an IP camera storage calculator. Websites like [IPConfigure Storage Calculator](https://calculator.ipconfigure.com/) can help you determine the necessary disk space based on your camera settings.
+
 #### SSDs (Solid State Drives)
 
 SSDs are an excellent choice for Frigate, offering high speed and responsiveness. The older concern that SSDs would quickly "wear out" from constant video recording is largely no longer valid for modern consumer and enterprise-grade SSDs.
 
@@ -74,4 +74,4 @@ While supported, using network-attached storage (NAS) for recordings can introdu
 
 - **Basic Minimum: 4GB RAM**: This is generally sufficient for a very basic Frigate setup with a few cameras and a dedicated object detection accelerator, without running any enrichments. Performance might be tight, especially with higher resolution streams or numerous detections.
 - **Minimum for Enrichments: 8GB RAM**: If you plan to utilize Frigate's enrichment features (e.g., facial recognition, license plate recognition, or other AI models that run alongside standard object detection), 8GB of RAM should be considered the minimum. Enrichments require additional memory to load and process their respective models and data.
 - **Recommended: 16GB RAM**: For most users, especially those with many cameras (8+) or who plan to heavily leverage enrichments, 16GB of RAM is highly recommended. This provides ample headroom for smooth operation, reduces the likelihood of swapping to disk (which can impact performance), and allows for future expansion.
From ab3cef813cbdea8b0d1fb98df086618d0cd93920 Mon Sep 17 00:00:00 2001
From: Josh Hawkins <32435876+hawkeye217@users.noreply.github.com>
Date: Thu, 5 Mar 2026 17:20:02 -0600
Subject: [PATCH 11/38] adopt official HA language, change add-on to app (#22258)

---
 docs/docs/configuration/authentication.md     |  2 +-
 .../hardware_acceleration_video.md            |  7 ++---
 docs/docs/configuration/index.md              | 16 ++++++------
 docs/docs/configuration/live.md               | 21 ++++-----------
 docs/docs/development/contributing.md         |  6 ++---
 docs/docs/frigate/installation.md             | 26 +++++++++----------
 docs/docs/frigate/updating.md                 | 24 ++++++++---------
 docs/docs/guides/configuring_go2rtc.md        |  8 ++----
 docs/docs/guides/getting_started.md           | 14 +++++-----
 docs/docs/guides/ha_network_storage.md        |  6 ++---
 docs/docs/integrations/home-assistant.md      |  6 ++---
 docs/docs/integrations/plus.md                |  4 +--
 docs/docs/troubleshooting/edgetpu.md          |  2 +-
 13 files changed, 64 insertions(+), 78 deletions(-)

diff --git a/docs/docs/configuration/authentication.md b/docs/docs/configuration/authentication.md
index 70f756b68..a312b5944 100644
--- a/docs/docs/configuration/authentication.md
+++ b/docs/docs/configuration/authentication.md
@@ -86,7 +86,7 @@ Frigate looks for a JWT token secret in the following order:
 
 1. An environment variable named `FRIGATE_JWT_SECRET`
 2. A file named `FRIGATE_JWT_SECRET` in the directory specified by the `CREDENTIALS_DIRECTORY` environment variable (defaults to the Docker Secrets directory: `/run/secrets/`)
-3. A `jwt_secret` option from the Home Assistant Add-on options
+3. A `jwt_secret` option from the Home Assistant App options
 4. A `.jwt_secret` file in the config directory
 
 If no secret is found on startup, Frigate generates one and stores it in a `.jwt_secret` file in the config directory.
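Because an environment variable is checked first in the lookup order above, a compose-managed value overrides any generated `.jwt_secret` file. A minimal sketch of pinning the secret via the environment (the service layout and the placeholder secret value are illustrative, not from the patch):

```yaml
services:
  frigate:
    image: ghcr.io/blakeblackshear/frigate:stable
    environment:
      # Checked before Docker secrets, HA App options, and the .jwt_secret file
      FRIGATE_JWT_SECRET: "change-me-to-a-long-random-string"
```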
diff --git a/docs/docs/configuration/hardware_acceleration_video.md b/docs/docs/configuration/hardware_acceleration_video.md
index bbbf5a640..46be3bb50 100644
--- a/docs/docs/configuration/hardware_acceleration_video.md
+++ b/docs/docs/configuration/hardware_acceleration_video.md
@@ -10,6 +10,7 @@ import CommunityBadge from '@site/src/components/CommunityBadge';
 It is highly recommended to use an integrated or discrete GPU for hardware acceleration video decoding in Frigate. Some types of hardware acceleration are detected and used automatically, but you may need to update your configuration to enable hardware accelerated decoding in ffmpeg.
 
 To verify that hardware acceleration is working:
+
 - Check the logs: A message will either say that hardware acceleration was automatically detected, or there will be a warning that no hardware acceleration was automatically detected
 - If hardware acceleration is specified in the config, verification can be done by ensuring the logs are free from errors. There is no CPU fallback for hardware acceleration.
 
@@ -67,7 +68,7 @@ Frigate can utilize most Intel integrated GPUs and Arc GPUs to accelerate video
 
 :::note
 
-The default driver is `iHD`. You may need to change the driver to `i965` by adding the following environment variable `LIBVA_DRIVER_NAME=i965` to your docker-compose file or [in the `config.yml` for HA Add-on users](advanced.md#environment_vars).
+The default driver is `iHD`. You may need to change the driver to `i965` by adding the following environment variable `LIBVA_DRIVER_NAME=i965` to your docker-compose file or [in the `config.yml` for HA App users](advanced.md#environment_vars).
 
 See [The Intel Docs](https://www.intel.com/content/www/us/en/support/articles/000005505/processors.html) to figure out what generation your CPU is.
@@ -188,7 +189,7 @@ Frigate can utilize modern AMD integrated GPUs and AMD GPUs to accelerate video
 
 ### Configuring Radeon Driver
 
-You need to change the driver to `radeonsi` by adding the following environment variable `LIBVA_DRIVER_NAME=radeonsi` to your docker-compose file or [in the `config.yml` for HA Add-on users](advanced.md#environment_vars).
+You need to change the driver to `radeonsi` by adding the following environment variable `LIBVA_DRIVER_NAME=radeonsi` to your docker-compose file or [in the `config.yml` for HA App users](advanced.md#environment_vars).
 
 ### Via VAAPI
 
@@ -292,7 +293,7 @@ These instructions were originally based on the [Jellyfin documentation](https:/
 
 ## Raspberry Pi 3/4
 
 Ensure you increase the allocated RAM for your GPU to at least 128 (`raspi-config` > Performance Options > GPU Memory).
-If you are using the HA Add-on, you may need to use the full access variant and turn off _Protection mode_ for hardware acceleration.
+If you are using the HA App, you may need to use the full access variant and turn off _Protection mode_ for hardware acceleration.
 
 ```yaml
 # if you want to decode a h264 stream
diff --git a/docs/docs/configuration/index.md b/docs/docs/configuration/index.md
index 2144ef7ea..c1b0dc903 100644
--- a/docs/docs/configuration/index.md
+++ b/docs/docs/configuration/index.md
@@ -3,7 +3,7 @@ id: index
 title: Frigate Configuration
 ---
 
-For Home Assistant Add-on installations, the config file should be at `/addon_configs/<addon_directory>/config.yml`, where `<addon_directory>` is specific to the variant of the Frigate Add-on you are running. See the list of directories [here](#accessing-add-on-config-dir).
+For Home Assistant App installations, the config file should be at `/addon_configs/<addon_directory>/config.yml`, where `<addon_directory>` is specific to the variant of the Frigate App you are running. See the list of directories [here](#accessing-app-config-dir).
@@ -25,11 +25,11 @@ cameras: - detect ``` -## Accessing the Home Assistant Add-on configuration directory {#accessing-add-on-config-dir} +## Accessing the Home Assistant App configuration directory {#accessing-app-config-dir} -When running Frigate through the HA Add-on, the Frigate `/config` directory is mapped to `/addon_configs/` in the host, where `` is specific to the variant of the Frigate Add-on you are running. +When running Frigate through the HA App, the Frigate `/config` directory is mapped to `/addon_configs/` in the host, where `` is specific to the variant of the Frigate App you are running. -| Add-on Variant | Configuration directory | +| App Variant | Configuration directory | | -------------------------- | ----------------------------------------- | | Frigate | `/addon_configs/ccab4aaf_frigate` | | Frigate (Full Access) | `/addon_configs/ccab4aaf_frigate-fa` | @@ -38,11 +38,11 @@ When running Frigate through the HA Add-on, the Frigate `/config` directory is m **Whenever you see `/config` in the documentation, it refers to this directory.** -If for example you are running the standard Add-on variant and use the [VS Code Add-on](https://github.com/hassio-addons/addon-vscode) to browse your files, you can click _File_ > _Open folder..._ and navigate to `/addon_configs/ccab4aaf_frigate` to access the Frigate `/config` directory and edit the `config.yaml` file. You can also use the built-in file editor in the Frigate UI to edit the configuration file. +If for example you are running the standard App variant and use the [VS Code App](https://github.com/hassio-addons/addon-vscode) to browse your files, you can click _File_ > _Open folder..._ and navigate to `/addon_configs/ccab4aaf_frigate` to access the Frigate `/config` directory and edit the `config.yaml` file. You can also use the built-in file editor in the Frigate UI to edit the configuration file. 
 ## VS Code Configuration Schema
 
-VS Code supports JSON schemas for automatically validating configuration files. You can enable this feature by adding `# yaml-language-server: $schema=http://frigate_host:5000/api/config/schema.json` to the beginning of the configuration file. Replace `frigate_host` with the IP address or hostname of your Frigate server. If you're using both VS Code and Frigate as an Add-on, you should use `ccab4aaf-frigate` instead. Make sure to expose the internal unauthenticated port `5000` when accessing the config from VS Code on another machine.
+VS Code supports JSON schemas for automatically validating configuration files. You can enable this feature by adding `# yaml-language-server: $schema=http://frigate_host:5000/api/config/schema.json` to the beginning of the configuration file. Replace `frigate_host` with the IP address or hostname of your Frigate server. If you're using both VS Code and Frigate as an App, you should use `ccab4aaf-frigate` instead. Make sure to expose the internal unauthenticated port `5000` when accessing the config from VS Code on another machine.
 
 ## Environment Variable Substitution
 
@@ -82,10 +82,10 @@ genai:
 
 Here are some common starter configuration examples. Refer to the [reference config](./reference.md) for detailed information about all the config values.
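The environment variable substitution mentioned above can be sketched with a minimal config fragment. Frigate only substitutes variables prefixed with `FRIGATE_`; the host and user values below are placeholders:

```yaml
mqtt:
  host: mqtt.server.com
  user: mqtt_user
  # Replaced at runtime with the value of the FRIGATE_MQTT_PASSWORD
  # environment variable set on the container.
  password: "{FRIGATE_MQTT_PASSWORD}"
```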
-### Raspberry Pi Home Assistant Add-on with USB Coral +### Raspberry Pi Home Assistant App with USB Coral - Single camera with 720p, 5fps stream for detect -- MQTT connected to the Home Assistant Mosquitto Add-on +- MQTT connected to the Home Assistant Mosquitto App - Hardware acceleration for decoding video - USB Coral detector - Save all video with any detectable motion for 7 days regardless of whether any objects were detected or not diff --git a/docs/docs/configuration/live.md b/docs/docs/configuration/live.md index 910cb69f1..c55d29a59 100644 --- a/docs/docs/configuration/live.md +++ b/docs/docs/configuration/live.md @@ -15,7 +15,7 @@ The jsmpeg live view will use more browser and client GPU resources. Using go2rt | ------ | ------------------------------------- | ---------- | ---------------------------- | --------------- | ------------------------------------------------------------------------------------------------------------------------------------------------------------------- | | jsmpeg | same as `detect -> fps`, capped at 10 | 720p | no | no | Resolution is configurable, but go2rtc is recommended if you want higher resolutions and better frame rates. jsmpeg is Frigate's default without go2rtc configured. | | mse | native | native | yes (depends on audio codec) | yes | iPhone requires iOS 17.1+, Firefox is h.264 only. This is Frigate's default when go2rtc is configured. | -| webrtc | native | native | yes (depends on audio codec) | yes | Requires extra configuration. Frigate attempts to use WebRTC when MSE fails or when using a camera's two-way talk feature. | +| webrtc | native | native | yes (depends on audio codec) | yes | Requires extra configuration. Frigate attempts to use WebRTC when MSE fails or when using a camera's two-way talk feature. | ### Camera Settings Recommendations @@ -114,7 +114,7 @@ cameras: WebRTC works by creating a TCP or UDP connection on port `8555`. 
However, it requires additional configuration:
 
 - For external access, over the internet, setup your router to forward port `8555` to port `8555` on the Frigate device, for both TCP and UDP.
-- For internal/local access, unless you are running through the HA Add-on, you will also need to set the WebRTC candidates list in the go2rtc config. For example, if `192.168.1.10` is the local IP of the device running Frigate:
+- For internal/local access, unless you are running through the HA App, you will also need to set the WebRTC candidates list in the go2rtc config. For example, if `192.168.1.10` is the local IP of the device running Frigate:
 
 ```yaml title="config.yml"
 go2rtc:
@@ -128,13 +128,13 @@ WebRTC works by creating a TCP or UDP connection on port `8555`. However, it req
 
 - For access through Tailscale, the Frigate system's Tailscale IP must be added as a WebRTC candidate. Tailscale IPs all start with `100.`, and are reserved within the `100.64.0.0/10` CIDR block.
 
-- Note that some browsers may not support H.265 (HEVC). You can check your browser's current version for H.265 compatibility [here](https://github.com/AlexxIT/go2rtc?tab=readme-ov-file#codecs-madness).
+- Note that some browsers may not support H.265 (HEVC). You can check your browser's current version for H.265 compatibility [here](https://github.com/AlexxIT/go2rtc?tab=readme-ov-file#codecs-madness).
 
 :::tip
 
-This extra configuration may not be required if Frigate has been installed as a Home Assistant Add-on, as Frigate uses the Supervisor's API to generate a WebRTC candidate.
+This extra configuration may not be required if Frigate has been installed as a Home Assistant App, as Frigate uses the Supervisor's API to generate a WebRTC candidate.
 
-However, it is recommended if issues occur to define the candidates manually. You should do this if the Frigate Add-on fails to generate a valid candidate. If an error occurs you will see some warnings like the below in the Add-on logs page during the initialization:
+However, it is recommended if issues occur to define the candidates manually. You should do this if the Frigate App fails to generate a valid candidate. If an error occurs you will see some warnings like the below in the App logs page during the initialization:
 
 ```log
 [WARN] Failed to get IP address from supervisor
@@ -222,34 +222,28 @@ Note that disabling a camera through the config file (`enabled: False`) removes
 
 When your browser runs into problems playing back your camera streams, it will log short error messages to the browser console. They indicate playback, codec, or network issues on the client/browser side, not something server side with Frigate itself. Below are the common messages you may see and simple actions you can take to try to resolve them.
 
 - **startup**
-
   - What it means: The player failed to initialize or connect to the live stream (network or startup error).
   - What to try: Reload the Live view or click _Reset_. Verify `go2rtc` is running and the camera stream is reachable. Try switching to a different stream from the Live UI dropdown (if available) or use a different browser.
   - Possible console messages from the player code:
-
     - `Error opening MediaSource.`
     - `Browser reported a network error.`
     - `Max error count ${errorCount} exceeded.` (the numeric value will vary)
 - **mse-decode**
-
   - What it means: The browser reported a decoding error while trying to play the stream, which usually is a result of a codec incompatibility or corrupted frames.
   - What to try: Check the browser console for the supported and negotiated codecs. Ensure your camera/restream is using H.264 video and AAC audio (these are the most compatible). If your camera uses a non-standard audio codec, configure `go2rtc` to transcode the stream to AAC. Try another browser (some browsers have stricter MSE/codec support) and, for iPhone, ensure you're on iOS 17.1 or newer.
   - Possible console messages from the player code:
-
     - `Safari cannot open MediaSource.`
     - `Safari reported InvalidStateError.`
     - `Safari reported decoding errors.`
 - **stalled**
-
   - What it means: Playback has stalled because the player has fallen too far behind live (extended buffering or no data arriving).
   - What to try: This is usually indicative of the browser struggling to decode too many high-resolution streams at once. Try selecting a lower-bandwidth stream (substream), reduce the number of live streams open, improve the network connection, or lower the camera resolution. Also check your camera's keyframe (I-frame) interval — shorter intervals make playback start and recover faster. You can also try increasing the timeout value in the UI pane of Frigate's settings.
   - Possible console messages from the player code:
-
     - `Buffer time (10 seconds) exceeded, browser may not be playing media correctly.`
     - `Media playback has stalled after seconds due to insufficient buffering or a network interruption.` (the seconds value will vary)
@@ -270,21 +264,18 @@ When your browser runs into problems playing back your camera streams, it will l
 
 If you are using continuous streaming or you are loading more than a few high resolution streams at once on the dashboard, your browser may struggle to begin playback of your streams before the timeout. Frigate always prioritizes showing a live stream as quickly as possible, even if it is a lower quality jsmpeg stream. You can use the "Reset" link/button to try loading your high resolution stream again.
 
 Errors in stream playback (e.g., connection failures, codec issues, or buffering timeouts) that cause the fallback to low bandwidth mode (jsmpeg) are logged to the browser console for easier debugging. These errors may include:
-
 - Network issues (e.g., MSE or WebRTC network connection problems).
 - Unsupported codecs or stream formats (e.g., H.265 in WebRTC, which is not supported in some browsers).
 - Buffering timeouts or low bandwidth conditions causing fallback to jsmpeg.
 - Browser compatibility problems (e.g., iOS Safari limitations with MSE).
 
 To view browser console logs:
-
 1. Open the Frigate Live View in your browser.
 2. Open the browser's Developer Tools (F12 or right-click > Inspect > Console tab).
 3. Reproduce the error (e.g., load a problematic stream or simulate network issues).
 4. Look for messages prefixed with the camera name. These logs help identify if the issue is player-specific (MSE vs. WebRTC) or related to camera configuration (e.g., go2rtc streams, codecs).
 
 If you see frequent errors:
-
 - Verify your camera's H.264/AAC settings (see [Frigate's camera settings recommendations](#camera_settings_recommendations)).
 - Check go2rtc configuration for transcoding (e.g., audio to AAC/OPUS).
 - Test with a different stream via the UI dropdown (if `live -> streams` is configured).
@@ -324,9 +315,7 @@ When your browser runs into problems playing back your camera streams, it will l
 
 To prevent this, make the `detect` stream match the go2rtc live stream's aspect ratio (resolution does not need to match, just the aspect ratio). You can either adjust the camera's output resolution or set the `width` and `height` values in your config's `detect` section to a resolution with an aspect ratio that matches.
Example: Resolutions from two streams
-
 - Mismatched (may cause aspect ratio switching on the dashboard):
-
   - Live/go2rtc stream: 1920x1080 (16:9)
   - Detect stream: 640x352 (~1.82:1, not 16:9)
diff --git a/docs/docs/development/contributing.md b/docs/docs/development/contributing.md
index a123f70b8..0a3b76990 100644
--- a/docs/docs/development/contributing.md
+++ b/docs/docs/development/contributing.md
@@ -17,15 +17,15 @@ From here, follow the guides for:
 - [Web Interface](#web-interface)
 - [Documentation](#documentation)
 
-### Frigate Home Assistant Add-on
+### Frigate Home Assistant App
 
-This repository holds the Home Assistant Add-on, for use with Home Assistant OS and compatible installations. It is the piece that allows you to run Frigate from your Home Assistant Supervisor tab.
+This repository holds the Home Assistant App, for use with Home Assistant OS and compatible installations. It is the piece that allows you to run Frigate from your Home Assistant Supervisor tab.
 
 Fork [blakeblackshear/frigate-hass-addons](https://github.com/blakeblackshear/frigate-hass-addons) to your own Github profile, then clone the forked repo to your local machine.
 
 ### Frigate Home Assistant Integration
 
-This repository holds the custom integration that allows your Home Assistant installation to automatically create entities for your Frigate instance, whether you are running Frigate as a standalone Docker container or as a [Home Assistant Add-on](#frigate-home-assistant-add-on).
+This repository holds the custom integration that allows your Home Assistant installation to automatically create entities for your Frigate instance, whether you are running Frigate as a standalone Docker container or as a [Home Assistant App](#frigate-home-assistant-app).
 
 Fork [blakeblackshear/frigate-hass-integration](https://github.com/blakeblackshear/frigate-hass-integration) to your own GitHub profile, then clone the forked repo to your local machine.
diff --git a/docs/docs/frigate/installation.md b/docs/docs/frigate/installation.md index 96a283fe4..cef98f483 100644 --- a/docs/docs/frigate/installation.md +++ b/docs/docs/frigate/installation.md @@ -3,11 +3,11 @@ id: installation title: Installation --- -Frigate is a Docker container that can be run on any Docker host including as a [Home Assistant Add-on](https://www.home-assistant.io/addons/). Note that the Home Assistant Add-on is **not** the same thing as the integration. The [integration](/integrations/home-assistant) is required to integrate Frigate into Home Assistant, whether you are running Frigate as a standalone Docker container or as a Home Assistant Add-on. +Frigate is a Docker container that can be run on any Docker host including as a [Home Assistant App](https://www.home-assistant.io/addons/). Note that the Home Assistant App is **not** the same thing as the integration. The [integration](/integrations/home-assistant) is required to integrate Frigate into Home Assistant, whether you are running Frigate as a standalone Docker container or as a Home Assistant App. :::tip -If you already have Frigate installed as a Home Assistant Add-on, check out the [getting started guide](../guides/getting_started#configuring-frigate) to configure Frigate. +If you already have Frigate installed as a Home Assistant App, check out the [getting started guide](../guides/getting_started#configuring-frigate) to configure Frigate. ::: @@ -92,7 +92,7 @@ $ python -c 'print("{:.2f}MB".format(((1280 * 720 * 1.5 * 20 + 270480) / 1048576 253MB ``` -The shm size cannot be set per container for Home Assistant add-ons. However, this is probably not required since by default Home Assistant Supervisor allocates `/dev/shm` with half the size of your total memory. If your machine has 8GB of memory, chances are that Frigate will have access to up to 4GB without any additional configuration. +The shm size cannot be set per container for Home Assistant Apps. 
However, this is probably not required since by default Home Assistant Supervisor allocates `/dev/shm` with half the size of your total memory. If your machine has 8GB of memory, chances are that Frigate will have access to up to 4GB without any additional configuration. ## Extra Steps for Specific Hardware @@ -510,7 +510,7 @@ The community supported docker image tags for the current stable version are: - `stable-tensorrt-jp6` - Frigate build optimized for Nvidia Jetson devices running Jetpack 6 - `stable-rk` - Frigate build for SBCs with Rockchip SoC -## Home Assistant Add-on +## Home Assistant App :::warning @@ -521,7 +521,7 @@ There are important limitations in HA OS to be aware of: - Separate local storage for media is not yet supported by Home Assistant - AMD GPUs are not supported because HA OS does not include the mesa driver. - Intel NPUs are not supported because HA OS does not include the NPU firmware. -- Nvidia GPUs are not supported because addons do not support the Nvidia runtime. +- Nvidia GPUs are not supported because HA Apps do not support the Nvidia runtime. ::: @@ -531,27 +531,27 @@ See [the network storage guide](/guides/ha_network_storage.md) for instructions ::: -Home Assistant OS users can install via the Add-on repository. +Home Assistant OS users can install via the App repository. -1. In Home Assistant, navigate to _Settings_ > _Add-ons_ > _Add-on Store_ > _Repositories_ +1. In Home Assistant, navigate to _Settings_ > _Apps_ > _App Store_ > _Repositories_ 2. Add `https://github.com/blakeblackshear/frigate-hass-addons` -3. Install the desired variant of the Frigate Add-on (see below) +3. Install the desired variant of the Frigate App (see below) 4. Setup your network configuration in the `Configuration` tab -5. Start the Add-on +5. Start the App 6. 
Use the _Open Web UI_ button to access the Frigate UI, then click in the _cog icon_ > _Configuration editor_ and configure Frigate to your liking -There are several variants of the Add-on available: +There are several variants of the App available: -| Add-on Variant | Description | +| App Variant | Description | | -------------------------- | ---------------------------------------------------------- | | Frigate | Current release with protection mode on | | Frigate (Full Access) | Current release with the option to disable protection mode | | Frigate Beta | Beta release with protection mode on | | Frigate Beta (Full Access) | Beta release with the option to disable protection mode | -If you are using hardware acceleration for ffmpeg, you **may** need to use the _Full Access_ variant of the Add-on. This is because the Frigate Add-on runs in a container with limited access to the host system. The _Full Access_ variant allows you to disable _Protection mode_ and give Frigate full access to the host system. +If you are using hardware acceleration for ffmpeg, you **may** need to use the _Full Access_ variant of the App. This is because the Frigate App runs in a container with limited access to the host system. The _Full Access_ variant allows you to disable _Protection mode_ and give Frigate full access to the host system. -You can also edit the Frigate configuration file through the [VS Code Add-on](https://github.com/hassio-addons/addon-vscode) or similar. In that case, the configuration file will be at `/addon_configs//config.yml`, where `` is specific to the variant of the Frigate Add-on you are running. See the list of directories [here](../configuration/index.md#accessing-add-on-config-dir). +You can also edit the Frigate configuration file through the [VS Code App](https://github.com/hassio-addons/addon-vscode) or similar. 
In that case, the configuration file will be at `/addon_configs//config.yml`, where `` is specific to the variant of the Frigate App you are running. See the list of directories [here](../configuration/index.md#accessing-app-config-dir). ## Kubernetes diff --git a/docs/docs/frigate/updating.md b/docs/docs/frigate/updating.md index 589ab7ff1..841a3e2d5 100644 --- a/docs/docs/frigate/updating.md +++ b/docs/docs/frigate/updating.md @@ -7,7 +7,7 @@ title: Updating The current stable version of Frigate is **0.17.0**. The release notes and any breaking changes for this version can be found on the [Frigate GitHub releases page](https://github.com/blakeblackshear/frigate/releases/tag/v0.17.0). -Keeping Frigate up to date ensures you benefit from the latest features, performance improvements, and bug fixes. The update process varies slightly depending on your installation method (Docker, Home Assistant Addon, etc.). Below are instructions for the most common setups. +Keeping Frigate up to date ensures you benefit from the latest features, performance improvements, and bug fixes. The update process varies slightly depending on your installation method (Docker, Home Assistant App, etc.). Below are instructions for the most common setups. ## Before You Begin @@ -67,30 +67,30 @@ If you’re running Frigate via Docker (recommended method), follow these steps: - If you’ve customized other settings (e.g., `shm-size`), ensure they’re still appropriate after the update. - Docker will automatically use the updated image when you restart the container, as long as you pulled the correct version. -## Updating the Home Assistant Addon +## Updating the Home Assistant App (formerly Addon) -For users running Frigate as a Home Assistant Addon: +For users running Frigate as a Home Assistant App: 1. **Check for Updates**: - - Navigate to **Settings > Add-ons** in Home Assistant. - - Find your installed Frigate addon (e.g., "Frigate NVR" or "Frigate NVR (Full Access)"). 
+ - Navigate to **Settings > Apps** in Home Assistant. + - Find your installed Frigate app (e.g., "Frigate NVR" or "Frigate NVR (Full Access)"). - If an update is available, you’ll see an "Update" button. -2. **Update the Addon**: - - Click the "Update" button next to the Frigate addon. +2. **Update the App**: + - Click the "Update" button next to the Frigate app. - Wait for the process to complete. Home Assistant will handle downloading and installing the new version. -3. **Restart the Addon**: - - After updating, go to the addon’s page and click "Restart" to apply the changes. +3. **Restart the App**: + - After updating, go to the app’s page and click "Restart" to apply the changes. 4. **Verify the Update**: - - Check the addon logs (under the "Log" tab) to ensure Frigate starts without errors. + - Check the app logs (under the "Log" tab) to ensure Frigate starts without errors. - Access the Frigate Web UI to confirm the new version is running. ### Notes - Ensure your `/config/frigate.yml` is compatible with the new version by reviewing the [Release notes](https://github.com/blakeblackshear/frigate/releases). -- If using custom hardware (e.g., Coral or GPU), verify that configurations still work, as addon updates don’t modify your hardware settings. +- If using custom hardware (e.g., Coral or GPU), verify that configurations still work, as app updates don’t modify your hardware settings. ## Rolling Back @@ -101,7 +101,7 @@ If an update causes issues: 3. Revert to the previous image version: - For Docker: Specify an older tag (e.g., `ghcr.io/blakeblackshear/frigate:0.16.4`) in your `docker run` command. - For Docker Compose: Edit your `docker-compose.yml`, specify the older version tag (e.g., `ghcr.io/blakeblackshear/frigate:0.16.4`), and re-run `docker compose up -d`. - - For Home Assistant: Reinstall the previous addon version manually via the repository if needed and restart the addon. 
+ - For Home Assistant: Restore from the app/addon backup you took before you updated. 4. Verify the old version is running again. ## Troubleshooting diff --git a/docs/docs/guides/configuring_go2rtc.md b/docs/docs/guides/configuring_go2rtc.md index ca50a90d3..2dbbccada 100644 --- a/docs/docs/guides/configuring_go2rtc.md +++ b/docs/docs/guides/configuring_go2rtc.md @@ -33,19 +33,16 @@ After adding this to the config, restart Frigate and try to watch the live strea ### What if my video doesn't play? - Check Logs: - - Access the go2rtc logs in the Frigate UI under Logs in the sidebar. - If go2rtc is having difficulty connecting to your camera, you should see some error messages in the log. - Check go2rtc Web Interface: if you don't see any errors in the logs, try viewing the camera through go2rtc's web interface. - - Navigate to port 1984 in your browser to access go2rtc's web interface. - If using Frigate through Home Assistant, enable the web interface at port 1984. - If using Docker, forward port 1984 before accessing the web interface. - Click `stream` for the specific camera to see if the camera's stream is being received. - Check Video Codec: - - If the camera stream works in go2rtc but not in your browser, the video codec might be unsupported. - If using H265, switch to H264. Refer to [video codec compatibility](https://github.com/AlexxIT/go2rtc/tree/v1.9.10#codecs-madness) in go2rtc documentation. - If unable to switch from H265 to H264, or if the stream format is different (e.g., MJPEG), re-encode the video using [FFmpeg parameters](https://github.com/AlexxIT/go2rtc/tree/v1.9.10#source-ffmpeg). It supports rotating and resizing video feeds and hardware acceleration. Keep in mind that transcoding video from one format to another is a resource intensive task and you may be better off using the built-in jsmpeg view. 
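The FFmpeg re-encoding option described above can be sketched as a Frigate go2rtc stream entry. This is a hypothetical example: the stream name and camera address are made up, and the `#video=h264` suffix is go2rtc's FFmpeg-source option for transcoding the video track to H264:

```yaml
go2rtc:
  streams:
    # Hypothetical camera that only outputs H265; go2rtc re-encodes the
    # video track to H264 so browsers can play the stream.
    back_yard:
      - "ffmpeg:rtsp://user:password@192.168.1.10:554/h265_stream#video=h264"
```

Keep in mind that re-encoding runs on every frame, so only apply it to streams whose native codec your clients cannot play.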
@@ -58,7 +55,6 @@ After adding this to the config, restart Frigate and try to watch the live strea ``` - Switch to FFmpeg if needed: - - Some camera streams may need to use the ffmpeg module in go2rtc. This has the downside of slower startup times, but has compatibility with more stream types. ```yaml @@ -101,9 +97,9 @@ After adding this to the config, restart Frigate and try to watch the live strea :::warning -To access the go2rtc stream externally when utilizing the Frigate Add-On (for +To access the go2rtc stream externally when utilizing the Frigate App (for instance through VLC), you must first enable the RTSP Restream port. -You can do this by visiting the Frigate Add-On configuration page within Home +You can do this by visiting the Frigate App configuration page within Home Assistant and revealing the hidden options under the "Show disabled ports" section. diff --git a/docs/docs/guides/getting_started.md b/docs/docs/guides/getting_started.md index efc9edb42..1100a759b 100644 --- a/docs/docs/guides/getting_started.md +++ b/docs/docs/guides/getting_started.md @@ -9,7 +9,7 @@ title: Getting started If you already have an environment with Linux and Docker installed, you can continue to [Installing Frigate](#installing-frigate) below. -If you already have Frigate installed through Docker or through a Home Assistant Add-on, you can continue to [Configuring Frigate](#configuring-frigate) below. +If you already have Frigate installed through Docker or through a Home Assistant App, you can continue to [Configuring Frigate](#configuring-frigate) below. ::: @@ -81,7 +81,7 @@ Now you have a minimal Debian server that requires very little maintenance. ## Installing Frigate -This section shows how to create a minimal directory structure for a Docker installation on Debian. If you have installed Frigate as a Home Assistant Add-on or another way, you can continue to [Configuring Frigate](#configuring-frigate). 
+This section shows how to create a minimal directory structure for a Docker installation on Debian. If you have installed Frigate as a Home Assistant App or another way, you can continue to [Configuring Frigate](#configuring-frigate). ### Setup directories @@ -174,7 +174,7 @@ cameras: ### Step 4: Configure detectors -By default, Frigate will use a single CPU detector. +By default, Frigate will use a single CPU detector. In many cases, the integrated graphics on Intel CPUs provides sufficient performance for typical Frigate setups. If you have an Intel processor, you can follow the configuration below. @@ -187,12 +187,12 @@ You need to refer to **Configure hardware acceleration** above to enable the con mqtt: ... detectors: # <---- add detectors - ov: + ov: type: openvino # <---- use openvino detector device: GPU # We will use the default MobileNet_v2 model from OpenVINO. -model: +model: width: 300 height: 300 input_tensor: nhwc @@ -209,12 +209,12 @@ cameras: ``` - + If you have a USB Coral, you will need to add a detectors section to your config.
Use USB Coral detector - + `docker-compose.yml` (after modifying, you will need to run `docker compose up -d` to apply changes) ```yaml diff --git a/docs/docs/guides/ha_network_storage.md b/docs/docs/guides/ha_network_storage.md index 78cddddeb..134e1952c 100644 --- a/docs/docs/guides/ha_network_storage.md +++ b/docs/docs/guides/ha_network_storage.md @@ -3,7 +3,7 @@ id: ha_network_storage title: Home Assistant network storage --- -As of Home Assistant 2023.6, Network Mounted Storage is supported for Add-ons. +As of Home Assistant 2023.6, Network Mounted Storage is supported for Apps. ## Setting Up Remote Storage For Frigate @@ -14,7 +14,7 @@ As of Home Assistant 2023.6, Network Mounted Storage is supported for Add-ons. ### Initial Setup -1. Stop the Frigate Add-on +1. Stop the Frigate App ### Move current data @@ -37,4 +37,4 @@ Keeping the current data is optional, but the data will need to be moved regardl 4. Fill out the additional required info for your particular NAS 5. Connect 6. Move files from `/media/frigate_tmp` to `/media/frigate` if they were kept in previous step -7. Start the Frigate Add-on +7. Start the Frigate App diff --git a/docs/docs/integrations/home-assistant.md b/docs/docs/integrations/home-assistant.md index 1ba5bfca1..5b9c01437 100644 --- a/docs/docs/integrations/home-assistant.md +++ b/docs/docs/integrations/home-assistant.md @@ -99,11 +99,11 @@ services: ... ``` -### Home Assistant Add-on +### Home Assistant App -If you are using Home Assistant Add-on, the URL should be one of the following depending on which Add-on variant you are using. Note that if you are using the Proxy Add-on, you should NOT point the integration at the proxy URL. Just enter the same URL used to access Frigate directly from your network. +If you are using Home Assistant App, the URL should be one of the following depending on which App variant you are using. Note that if you are using the Proxy App, you should NOT point the integration at the proxy URL. 
Just enter the same URL used to access Frigate directly from your network. -| Add-on Variant | URL | +| App Variant | URL | | -------------------------- | -------------------------------------- | | Frigate | `http://ccab4aaf-frigate:5000` | | Frigate (Full Access) | `http://ccab4aaf-frigate-fa:5000` | diff --git a/docs/docs/integrations/plus.md b/docs/docs/integrations/plus.md index 36efa5e74..aa3d78df5 100644 --- a/docs/docs/integrations/plus.md +++ b/docs/docs/integrations/plus.md @@ -19,11 +19,11 @@ Once logged in, you can generate an API key for Frigate in Settings. ### Set your API key -In Frigate, you can use an environment variable or a docker secret named `PLUS_API_KEY` to enable the `Frigate+` buttons on the Explore page. Home Assistant Addon users can set it under Settings > Add-ons > Frigate > Configuration > Options (be sure to toggle the "Show unused optional configuration options" switch). +In Frigate, you can use an environment variable or a docker secret named `PLUS_API_KEY` to enable the `Frigate+` buttons on the Explore page. Home Assistant App users can set it under Settings > Apps > Frigate > Configuration > Options (be sure to toggle the "Show unused optional configuration options" switch). :::warning -You cannot use the `environment_vars` section of your Frigate configuration file to set this environment variable. It must be defined as an environment variable in the docker config or Home Assistant Add-on config. +You cannot use the `environment_vars` section of your Frigate configuration file to set this environment variable. It must be defined as an environment variable in the docker config or Home Assistant App config. 
::: diff --git a/docs/docs/troubleshooting/edgetpu.md b/docs/docs/troubleshooting/edgetpu.md index 97b2b0040..4ee25afd0 100644 --- a/docs/docs/troubleshooting/edgetpu.md +++ b/docs/docs/troubleshooting/edgetpu.md @@ -32,7 +32,7 @@ The USB coral can draw up to 900mA and this can be too much for some on-device U The USB coral has different IDs when it is uninitialized and initialized. - When running Frigate in a VM, Proxmox lxc, etc. you must ensure both device IDs are mapped. -- When running through the Home Assistant OS you may need to run the Full Access variant of the Frigate Add-on with the _Protection mode_ switch disabled so that the coral can be accessed. +- When running through the Home Assistant OS you may need to run the Full Access variant of the Frigate App with the _Protection mode_ switch disabled so that the coral can be accessed. ### Synology 716+II running DSM 7.2.1-69057 Update 5 From d1f3a807d3d6176a0a17a0f1d8bca83d35a1ed82 Mon Sep 17 00:00:00 2001 From: Josh Hawkins <32435876+hawkeye217@users.noreply.github.com> Date: Sat, 7 Mar 2026 07:18:17 -0600 Subject: [PATCH 12/38] call out avx2 requirement (#22305) --- .../custom_classification/object_classification.md | 3 +-- .../custom_classification/state_classification.md | 2 +- docs/docs/configuration/face_recognition.md | 5 +---- docs/docs/configuration/license_plate_recognition.md | 7 +------ docs/docs/configuration/semantic_search.md | 2 +- docs/docs/frigate/hardware.md | 2 +- docs/docs/frigate/planning_setup.md | 2 +- 7 files changed, 7 insertions(+), 16 deletions(-) diff --git a/docs/docs/configuration/custom_classification/object_classification.md b/docs/docs/configuration/custom_classification/object_classification.md index 713dcf998..fe1b9d0ea 100644 --- a/docs/docs/configuration/custom_classification/object_classification.md +++ b/docs/docs/configuration/custom_classification/object_classification.md @@ -11,7 +11,7 @@ Object classification models are lightweight and run very fast on CPU. 
Training the model does briefly use a high amount of system resources for about 1–3 minutes per training run. On lower-power devices, training may take longer. -A CPU with AVX instructions is required for training and inference. +A CPU with AVX + AVX2 instructions is required for training and inference. ## Classes @@ -27,7 +27,6 @@ For object classification: ### Classification Type - **Sub label**: - - Applied to the object’s `sub_label` field. - Ideal for a single, more specific identity or type. - Example: `cat` → `Leo`, `Charlie`, `None`. diff --git a/docs/docs/configuration/custom_classification/state_classification.md b/docs/docs/configuration/custom_classification/state_classification.md index 53310e4c6..ad6fb92fc 100644 --- a/docs/docs/configuration/custom_classification/state_classification.md +++ b/docs/docs/configuration/custom_classification/state_classification.md @@ -11,7 +11,7 @@ State classification models are lightweight and run very fast on CPU. Training the model does briefly use a high amount of system resources for about 1–3 minutes per training run. On lower-power devices, training may take longer. -A CPU with AVX instructions is required for training and inference. +A CPU with AVX + AVX2 instructions is required for training and inference. ## Classes diff --git a/docs/docs/configuration/face_recognition.md b/docs/docs/configuration/face_recognition.md index c13a1047d..c44f76dea 100644 --- a/docs/docs/configuration/face_recognition.md +++ b/docs/docs/configuration/face_recognition.md @@ -32,7 +32,7 @@ All of these features run locally on your system. ## Minimum System Requirements - A CPU with AVX instructions is required to run Face Recognition. +A CPU with AVX + AVX2 instructions is required to run Face Recognition. The `small` model is optimized for efficiency and runs on the CPU, most CPUs should run the model efficiently. @@ -145,17 +145,14 @@ Start with the [Usage](#usage) section and re-read the [Model Requirements](#mod 1. 
Ensure `person` is being _detected_. A `person` will automatically be scanned by Frigate for a face. Any detected faces will appear in the Recent Recognitions tab in the Frigate UI's Face Library. If you are using a Frigate+ or `face` detecting model: - - Watch the debug view (Settings --> Debug) to ensure that `face` is being detected along with `person`. - You may need to adjust the `min_score` for the `face` object if faces are not being detected. If you are **not** using a Frigate+ or `face` detecting model: - - Check your `detect` stream resolution and ensure it is sufficiently high enough to capture face details on `person` objects. - You may need to lower your `detection_threshold` if faces are not being detected. 2. Any detected faces will then be _recognized_. - - Make sure you have trained at least one face per the recommendations above. - Adjust `recognition_threshold` settings per the suggestions [above](#advanced-configuration). diff --git a/docs/docs/configuration/license_plate_recognition.md b/docs/docs/configuration/license_plate_recognition.md index 76837efcb..0450bcef2 100644 --- a/docs/docs/configuration/license_plate_recognition.md +++ b/docs/docs/configuration/license_plate_recognition.md @@ -30,7 +30,7 @@ In the default mode, Frigate's LPR needs to first detect a `car` or `motorcycle` ## Minimum System Requirements -License plate recognition works by running AI models locally on your system. The YOLOv9 plate detector model and the OCR models ([PaddleOCR](https://github.com/PaddlePaddle/PaddleOCR)) are relatively lightweight and can run on your CPU or GPU, depending on your configuration. At least 4GB of RAM and a CPU with AVX instructions is required. +License plate recognition works by running AI models locally on your system. The YOLOv9 plate detector model and the OCR models ([PaddleOCR](https://github.com/PaddlePaddle/PaddleOCR)) are relatively lightweight and can run on your CPU or GPU, depending on your configuration. 
At least 4GB of RAM and a CPU with AVX + AVX2 instructions is required. ## Configuration @@ -375,7 +375,6 @@ Use `match_distance` to allow small character mismatches. Alternatively, define Start with ["Why isn't my license plate being detected and recognized?"](#why-isnt-my-license-plate-being-detected-and-recognized). If you are still having issues, work through these steps. 1. Start with a simplified LPR config. - - Remove or comment out everything in your LPR config, including `min_area`, `min_plate_length`, `format`, `known_plates`, or `enhancement` values so that the only values left are `enabled` and `debug_save_plates`. This will run LPR with Frigate's default values. ```yaml @@ -386,7 +385,6 @@ Start with ["Why isn't my license plate being detected and recognized?"](#why-is ``` 2. Enable debug logs to see exactly what Frigate is doing. - - Enable debug logs for LPR by adding `frigate.data_processing.common.license_plate: debug` to your `logger` configuration. These logs are _very_ verbose, so only keep this enabled when necessary. Restart Frigate after this change. ```yaml @@ -399,18 +397,15 @@ Start with ["Why isn't my license plate being detected and recognized?"](#why-is 3. Ensure your plates are being _detected_. If you are using a Frigate+ or `license_plate` detecting model: - - Watch the debug view (Settings --> Debug) to ensure that `license_plate` is being detected. - View MQTT messages for `frigate/events` to verify detected plates. - You may need to adjust your `min_score` and/or `threshold` for the `license_plate` object if your plates are not being detected. If you are **not** using a Frigate+ or `license_plate` detecting model: - - Watch the debug logs for messages from the YOLOv9 plate detector. - You may need to adjust your `detection_threshold` if your plates are not being detected. 4. Ensure the characters on detected plates are being _recognized_. 
- - Enable `debug_save_plates` to save images of detected text on plates to the clips directory (`/media/frigate/clips/lpr`). Ensure these images are readable and the text is clear. - Watch the debug view to see plates recognized in real-time. For non-dedicated LPR cameras, the `car` or `motorcycle` label will change to the recognized plate when LPR is enabled and working. - Adjust `recognition_threshold` settings per the suggestions [above](#advanced-configuration). diff --git a/docs/docs/configuration/semantic_search.md b/docs/docs/configuration/semantic_search.md index 5946af139..19346454b 100644 --- a/docs/docs/configuration/semantic_search.md +++ b/docs/docs/configuration/semantic_search.md @@ -13,7 +13,7 @@ Semantic Search is accessed via the _Explore_ view in the Frigate UI. Semantic Search works by running a large AI model locally on your system. Small or underpowered systems like a Raspberry Pi will not run Semantic Search reliably or at all. -A minimum of 8GB of RAM is required to use Semantic Search. A CPU with AVX instructions is required to run Semantic Search. A GPU is not strictly required but will provide a significant performance increase over CPU-only systems. +A minimum of 8GB of RAM is required to use Semantic Search. A CPU with AVX + AVX2 instructions is required to run Semantic Search. A GPU is not strictly required but will provide a significant performance increase over CPU-only systems. For best performance, 16GB or more of RAM and a dedicated GPU are recommended. 
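On Linux, the AVX + AVX2 requirement above can be checked against the flags the kernel reports in `/proc/cpuinfo`. A minimal sketch (the helper name is ours; it takes an optional file argument so it can also be pointed at a saved copy of `/proc/cpuinfo`):

```shell
# Print which of the required instruction sets the CPU advertises.
# Reads /proc/cpuinfo by default; pass a path to inspect another file.
cpu_avx_flags() {
  grep -o -w -e avx -e avx2 "${1:-/proc/cpuinfo}" | sort -u
}

cpu_avx_flags || echo "no AVX/AVX2 flags found (or not a Linux /proc filesystem)"
```

Both `avx` and `avx2` should appear in the output; if either is missing, the features that require them will not run on that CPU.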
diff --git a/docs/docs/frigate/hardware.md b/docs/docs/frigate/hardware.md index 9bb321ecf..3d0730b70 100644 --- a/docs/docs/frigate/hardware.md +++ b/docs/docs/frigate/hardware.md @@ -26,7 +26,7 @@ I may earn a small commission for my endorsement, recommendation, testimonial, o ## Server -My current favorite is the Beelink EQ13 because of the efficient N100 CPU and dual NICs that allow you to setup a dedicated private network for your cameras where they can be blocked from accessing the internet. There are many used workstation options on eBay that work very well. Anything with an Intel CPU (with AVX instructions) and capable of running Debian should work fine. As a bonus, you may want to look for devices with a M.2 or PCIe express slot that is compatible with the Google Coral, Hailo, or other AI accelerators. +My current favorite is the Beelink EQ13 because of the efficient N100 CPU and dual NICs that allow you to setup a dedicated private network for your cameras where they can be blocked from accessing the internet. There are many used workstation options on eBay that work very well. Anything with an Intel CPU (with AVX + AVX2 instructions) and capable of running Debian should work fine. As a bonus, you may want to look for devices with a M.2 or PCIe express slot that is compatible with the Google Coral, Hailo, or other AI accelerators. Note that many of these mini PCs come with Windows pre-installed, and you will need to install Linux according to the [getting started guide](../guides/getting_started.md). diff --git a/docs/docs/frigate/planning_setup.md b/docs/docs/frigate/planning_setup.md index b7dbd604b..85c6eb648 100644 --- a/docs/docs/frigate/planning_setup.md +++ b/docs/docs/frigate/planning_setup.md @@ -36,7 +36,7 @@ There are many different hardware options for object detection depending on prio ### CPU -Frigate requires a CPU with AVX instructions. 
Most modern CPUs (post-2011) support AVX, but it is generally absent in low-power or budget-oriented processors, particularly older Intel Pentium, Celeron, and Atom-based chips. Specifically, Intel Celeron and Pentium models prior to the 2020 Tiger Lake generation typically lack AVX. +Frigate requires a CPU with AVX + AVX2 instructions. Most modern CPUs support AVX (post-2011) and AVX2 (post-2013), but they are generally absent in low-power or budget-oriented processors, particularly older Intel Pentium, Celeron, and Atom-based chips. Specifically, Intel Celeron and Pentium models prior to the 2020 Tiger Lake generation typically lack AVX. Older Intel Xeon models may have AVX, but may lack AVX2. ### Storage From f316244495a6acb92d400a330ab79702aa0f481a Mon Sep 17 00:00:00 2001 From: Josh Hawkins <32435876+hawkeye217@users.noreply.github.com> Date: Sat, 7 Mar 2026 07:36:08 -0600 Subject: [PATCH 13/38] Improve playback of videos in Tracking Details (#22301) * prevent short hls segments by extending clip backwards * clean up * snap to keyframe instead of arbitrarily subtracting time * formatting --- frigate/api/media.py | 29 ++++++++++++++++++++ frigate/util/media.py | 61 +++++++++++++++++++++++++++++++++++++++++++ 2 files changed, 90 insertions(+) create mode 100644 frigate/util/media.py diff --git a/frigate/api/media.py b/frigate/api/media.py index 971bfef83..0ea4c487e 100644 --- a/frigate/api/media.py +++ b/frigate/api/media.py @@ -50,10 +50,12 @@ from frigate.models import Event, Previews, Recordings, Regions, ReviewSegment from frigate.track.object_processing import TrackedObjectProcessor from frigate.util.file import get_event_thumbnail_bytes from frigate.util.image import get_image_from_recording +from frigate.util.media import get_keyframe_before from frigate.util.time import get_dst_transitions logger = logging.getLogger(__name__) + router = APIRouter(tags=[Tags.media]) @@ -900,6 +902,33 @@ async def vod_ts( if recording.end_time > end_ts: duration -=
int((recording.end_time - end_ts) * 1000) + # nginx-vod-module pushes clipFrom forward to the next keyframe, + # which can leave too few frames and produce an empty/unplayable + # segment. Snap clipFrom back to the preceding keyframe so the + # segment always starts with a decodable frame. + if "clipFrom" in clip: + keyframe_ms = get_keyframe_before(recording.path, clip["clipFrom"]) + if keyframe_ms is not None: + gained = clip["clipFrom"] - keyframe_ms + clip["clipFrom"] = keyframe_ms + duration += gained + logger.debug( + "VOD: snapped clipFrom to keyframe at %sms for %s, duration now %sms", + keyframe_ms, + recording.path, + duration, + ) + else: + # could not read keyframes, remove clipFrom to use full recording + logger.debug( + "VOD: no keyframe info for %s, removing clipFrom to use full recording", + recording.path, + ) + del clip["clipFrom"] + duration = int(recording.duration * 1000) + if recording.end_time > end_ts: + duration -= int((recording.end_time - end_ts) * 1000) + if duration < min_duration_ms: # skip if the clip has no valid duration (too short to contain frames) logger.debug( diff --git a/frigate/util/media.py b/frigate/util/media.py new file mode 100644 index 000000000..406d51cf3 --- /dev/null +++ b/frigate/util/media.py @@ -0,0 +1,61 @@ +"""Utilities for media file inspection.""" + +import subprocess as sp + +from frigate.const import DEFAULT_FFMPEG_VERSION + +FFPROBE_PATH = ( + f"/usr/lib/ffmpeg/{DEFAULT_FFMPEG_VERSION}/bin/ffprobe" + if DEFAULT_FFMPEG_VERSION + else "ffprobe" +) + + +def get_keyframe_before(path: str, offset_ms: int) -> int | None: + """Get the timestamp (ms) of the last keyframe at or before offset_ms. + + Uses ffprobe packet index to read keyframe positions from the mp4 file. + Returns None if ffprobe fails or no keyframe is found before the offset. 
+ """ + try: + result = sp.run( + [ + FFPROBE_PATH, + "-select_streams", + "v:0", + "-show_entries", + "packet=pts_time,flags", + "-of", + "csv=p=0", + "-loglevel", + "error", + path, + ], + capture_output=True, + timeout=5, + ) + except (sp.TimeoutExpired, FileNotFoundError): + return None + + if result.returncode != 0: + return None + + offset_s = offset_ms / 1000.0 + best_ms = None + for line in result.stdout.decode().strip().splitlines(): + parts = line.strip().split(",") + if len(parts) != 2: + continue + ts_str, flags = parts + if "K" not in flags: + continue + try: + ts = float(ts_str) + except ValueError: + continue + if ts <= offset_s: + best_ms = int(ts * 1000) + else: + break + + return best_ms From 537e723c302a6ef65b4fb20fde2a8546b8bb732a Mon Sep 17 00:00:00 2001 From: Roki Date: Sun, 8 Mar 2026 14:00:06 +0100 Subject: [PATCH 14/38] Fix/rknn arcface input format master (#22319) * "fix: correct ArcFace input format for RKNN runner" * ruff format --- frigate/detectors/detection_runners.py | 11 +++++++++++ 1 file changed, 11 insertions(+) diff --git a/frigate/detectors/detection_runners.py b/frigate/detectors/detection_runners.py index fcbb41e66..36bf24ce1 100644 --- a/frigate/detectors/detection_runners.py +++ b/frigate/detectors/detection_runners.py @@ -529,6 +529,17 @@ class RKNNModelRunner(BaseModelRunner): # Transpose from NCHW to NHWC pixel_data = np.transpose(pixel_data, (0, 2, 3, 1)) rknn_inputs.append(pixel_data) + elif name == "data": + # ArcFace: undo Python normalisation to uint8 [0,255] + # RKNN runtime applies mean=127.5/std=127.5 internally before first layer + face_data = inputs[name] + if len(face_data.shape) == 4 and face_data.shape[1] == 3: + # Transpose from NCHW to NHWC + face_data = np.transpose(face_data, (0, 2, 3, 1)) + face_data = ( + ((face_data + 1.0) * 127.5).clip(0, 255).astype(np.uint8) + ) + rknn_inputs.append(face_data) else: rknn_inputs.append(inputs[name]) From 4e71a835cb734430444d4ff6c4e58c8313abcfae Mon Sep 17 00:00:00 
2001 From: ARandomGitHubUser <7754708+ARandomGitHubUser@users.noreply.github.com> Date: Sun, 8 Mar 2026 10:00:21 -0400 Subject: [PATCH 15/38] Fix broken link to Home Assistant apps page (#22320) Co-authored-by: ARandomGitHubUser <7754708+ARandomGitHubUser@users.noreply.github.com> --- docs/docs/frigate/installation.md | 2 +- 1 file changed, 1 insertion(+), 1 deletion(-) diff --git a/docs/docs/frigate/installation.md b/docs/docs/frigate/installation.md index cef98f483..8bc5cb652 100644 --- a/docs/docs/frigate/installation.md +++ b/docs/docs/frigate/installation.md @@ -3,7 +3,7 @@ id: installation title: Installation --- -Frigate is a Docker container that can be run on any Docker host including as a [Home Assistant App](https://www.home-assistant.io/addons/). Note that the Home Assistant App is **not** the same thing as the integration. The [integration](/integrations/home-assistant) is required to integrate Frigate into Home Assistant, whether you are running Frigate as a standalone Docker container or as a Home Assistant App. +Frigate is a Docker container that can be run on any Docker host including as a [Home Assistant App](https://www.home-assistant.io/apps/). Note that the Home Assistant App is **not** the same thing as the integration. The [integration](/integrations/home-assistant) is required to integrate Frigate into Home Assistant, whether you are running Frigate as a standalone Docker container or as a Home Assistant App. 
:::tip From bde518e8617d428f965cbf5921541c1ad47e510a Mon Sep 17 00:00:00 2001 From: Josh Hawkins <32435876+hawkeye217@users.noreply.github.com> Date: Sun, 8 Mar 2026 15:14:44 -0500 Subject: [PATCH 16/38] Fix preview retrieval to handle missing previews gracefully (#22331) --- frigate/api/media.py | 36 ++++++++++++++++++------------------ 1 file changed, 18 insertions(+), 18 deletions(-) diff --git a/frigate/api/media.py b/frigate/api/media.py index 0ea4c487e..ad7a4cde5 100644 --- a/frigate/api/media.py +++ b/frigate/api/media.py @@ -1531,25 +1531,25 @@ def preview_gif( ): if datetime.fromtimestamp(start_ts) < datetime.now().replace(minute=0, second=0): # has preview mp4 - preview: Previews = ( - Previews.select( - Previews.camera, - Previews.path, - Previews.duration, - Previews.start_time, - Previews.end_time, + try: + preview: Previews = ( + Previews.select( + Previews.camera, + Previews.path, + Previews.duration, + Previews.start_time, + Previews.end_time, + ) + .where( + Previews.start_time.between(start_ts, end_ts) + | Previews.end_time.between(start_ts, end_ts) + | ((start_ts > Previews.start_time) & (end_ts < Previews.end_time)) + ) + .where(Previews.camera == camera_name) + .limit(1) + .get() ) - .where( - Previews.start_time.between(start_ts, end_ts) - | Previews.end_time.between(start_ts, end_ts) - | ((start_ts > Previews.start_time) & (end_ts < Previews.end_time)) - ) - .where(Previews.camera == camera_name) - .limit(1) - .get() - ) - - if not preview: + except DoesNotExist: return JSONResponse( content={"success": False, "message": "Preview not found"}, status_code=404, From b6f78bd1f291134560264e0f0c8562d8c8e379f3 Mon Sep 17 00:00:00 2001 From: Josh Hawkins <32435876+hawkeye217@users.noreply.github.com> Date: Sun, 8 Mar 2026 15:15:12 -0500 Subject: [PATCH 17/38] fix thumbnail encoding logic (#22329) --- frigate/api/media.py | 21 +++++++++++---------- 1 file changed, 11 insertions(+), 10 deletions(-) diff --git a/frigate/api/media.py 
b/frigate/api/media.py index ad7a4cde5..bfce8b10c 100644 --- a/frigate/api/media.py +++ b/frigate/api/media.py @@ -1186,11 +1186,12 @@ async def event_thumbnail( status_code=404, ) + img_as_np = np.frombuffer(thumbnail_bytes, dtype=np.uint8) + img = cv2.imdecode(img_as_np, flags=1) + # android notifications prefer a 2:1 ratio if format == "android": - img_as_np = np.frombuffer(thumbnail_bytes, dtype=np.uint8) - img = cv2.imdecode(img_as_np, flags=1) - thumbnail = cv2.copyMakeBorder( + img = cv2.copyMakeBorder( img, 0, 0, @@ -1200,14 +1201,14 @@ async def event_thumbnail( (0, 0, 0), ) - quality_params = None - if extension in (Extension.jpg, Extension.jpeg): - quality_params = [int(cv2.IMWRITE_JPEG_QUALITY), 70] - elif extension == Extension.webp: - quality_params = [int(cv2.IMWRITE_WEBP_QUALITY), 60] + quality_params = None + if extension in (Extension.jpg, Extension.jpeg): + quality_params = [int(cv2.IMWRITE_JPEG_QUALITY), 70] + elif extension == Extension.webp: + quality_params = [int(cv2.IMWRITE_WEBP_QUALITY), 60] - _, img = cv2.imencode(f".{extension.value}", thumbnail, quality_params) - thumbnail_bytes = img.tobytes() + _, encoded = cv2.imencode(f".{extension.value}", img, quality_params) + thumbnail_bytes = encoded.tobytes() return Response( thumbnail_bytes, From 41e290449e9efb35d250cf67c38eeffe201b3331 Mon Sep 17 00:00:00 2001 From: Josh Hawkins <32435876+hawkeye217@users.noreply.github.com> Date: Sun, 8 Mar 2026 16:08:40 -0500 Subject: [PATCH 18/38] Environment variable fixes (#22335) * fix environment_vars timing bug for EnvString substitution FRIGATE_ENV_VARS is populated at module import time but was never updated when environment_vars config section set new vars into os.environ. This meant HA OS users setting FRIGATE_* vars via environment_vars could not use them in EnvString fields. 
* add EnvString support to MQTT and ONVIF host fields * add tests * docs * update reference config --- docs/docs/configuration/advanced.md | 12 ++- docs/docs/configuration/index.md | 3 +- docs/docs/configuration/reference.md | 4 + frigate/config/camera/onvif.py | 2 +- frigate/config/env.py | 6 +- frigate/config/mqtt.py | 2 +- frigate/test/test_env.py | 105 +++++++++++++++++++++++++++ 7 files changed, 127 insertions(+), 7 deletions(-) create mode 100644 frigate/test/test_env.py diff --git a/docs/docs/configuration/advanced.md b/docs/docs/configuration/advanced.md index 17eb2053d..c04cec97c 100644 --- a/docs/docs/configuration/advanced.md +++ b/docs/docs/configuration/advanced.md @@ -44,13 +44,21 @@ go2rtc: ### `environment_vars` -This section can be used to set environment variables for those unable to modify the environment of the container, like within Home Assistant OS. +This section can be used to set environment variables for those unable to modify the environment of the container, like within Home Assistant OS. Docker users should set environment variables in their `docker run` command (`-e FRIGATE_MQTT_PASSWORD=secret`) or `docker-compose.yml` file (`environment:` section) instead. Note that values set here are stored in plain text in your config file, so if the goal is to keep credentials out of your configuration, use Docker environment variables or Docker secrets instead. + +Variables prefixed with `FRIGATE_` can be referenced in config fields that support environment variable substitution (such as MQTT host and credentials, camera stream URLs, and ONVIF host and credentials) using the `{FRIGATE_VARIABLE_NAME}` syntax. 
Example: ```yaml environment_vars: - VARIABLE_NAME: variable_value + FRIGATE_MQTT_USER: my_mqtt_user + FRIGATE_MQTT_PASSWORD: my_mqtt_password + +mqtt: + host: "{FRIGATE_MQTT_HOST}" + user: "{FRIGATE_MQTT_USER}" + password: "{FRIGATE_MQTT_PASSWORD}" ``` #### TensorFlow Thread Configuration diff --git a/docs/docs/configuration/index.md b/docs/docs/configuration/index.md index c1b0dc903..efa59246f 100644 --- a/docs/docs/configuration/index.md +++ b/docs/docs/configuration/index.md @@ -50,6 +50,7 @@ Frigate supports the use of environment variables starting with `FRIGATE_` **onl ```yaml mqtt: + host: "{FRIGATE_MQTT_HOST}" user: "{FRIGATE_MQTT_USER}" password: "{FRIGATE_MQTT_PASSWORD}" ``` @@ -60,7 +61,7 @@ mqtt: ```yaml onvif: - host: 10.0.10.10 + host: "{FRIGATE_ONVIF_HOST}" port: 8000 user: "{FRIGATE_RTSP_USER}" password: "{FRIGATE_RTSP_PASSWORD}" diff --git a/docs/docs/configuration/reference.md b/docs/docs/configuration/reference.md index 206d7012e..19a705192 100644 --- a/docs/docs/configuration/reference.md +++ b/docs/docs/configuration/reference.md @@ -16,6 +16,8 @@ mqtt: # Optional: Enable mqtt server (default: shown below) enabled: True # Required: host name + # NOTE: MQTT host can be specified with an environment variable or docker secrets that must begin with 'FRIGATE_'. + # e.g. host: '{FRIGATE_MQTT_HOST}' host: mqtt.server.com # Optional: port (default: shown below) port: 1883 @@ -906,6 +908,8 @@ cameras: onvif: # Required: host of the camera being connected to. # NOTE: HTTP is assumed by default; HTTPS is supported if you specify the scheme, ex: "https://0.0.0.0". + # NOTE: ONVIF host, user, and password can be specified with environment variables or docker secrets + # that must begin with 'FRIGATE_'. e.g. host: '{FRIGATE_ONVIF_HOST}' host: 0.0.0.0 # Optional: ONVIF port for device (default: shown below). 
port: 8000 diff --git a/frigate/config/camera/onvif.py b/frigate/config/camera/onvif.py index d4955799b..fd35fa537 100644 --- a/frigate/config/camera/onvif.py +++ b/frigate/config/camera/onvif.py @@ -72,7 +72,7 @@ class PtzAutotrackConfig(FrigateBaseModel): class OnvifConfig(FrigateBaseModel): - host: str = Field(default="", title="Onvif Host") + host: EnvString = Field(default="", title="Onvif Host") port: int = Field(default=8000, title="Onvif Port") user: Optional[EnvString] = Field(default=None, title="Onvif Username") password: Optional[EnvString] = Field(default=None, title="Onvif Password") diff --git a/frigate/config/env.py b/frigate/config/env.py index 6534ff411..db094a8af 100644 --- a/frigate/config/env.py +++ b/frigate/config/env.py @@ -24,8 +24,10 @@ EnvString = Annotated[str, AfterValidator(validate_env_string)] def validate_env_vars(v: dict[str, str], info: ValidationInfo) -> dict[str, str]: if isinstance(info.context, dict) and info.context.get("install", False): - for k, v in v.items(): - os.environ[k] = v + for k, val in v.items(): + os.environ[k] = val + if k.startswith("FRIGATE_"): + FRIGATE_ENV_VARS[k] = val return v diff --git a/frigate/config/mqtt.py b/frigate/config/mqtt.py index a760d0a1f..3e2f99294 100644 --- a/frigate/config/mqtt.py +++ b/frigate/config/mqtt.py @@ -13,7 +13,7 @@ __all__ = ["MqttConfig"] class MqttConfig(FrigateBaseModel): enabled: bool = Field(default=True, title="Enable MQTT Communication.") - host: str = Field(default="", title="MQTT Host") + host: EnvString = Field(default="", title="MQTT Host") port: int = Field(default=1883, title="MQTT Port") topic_prefix: str = Field(default="frigate", title="MQTT Topic Prefix") client_id: str = Field(default="frigate", title="MQTT Client ID") diff --git a/frigate/test/test_env.py b/frigate/test/test_env.py new file mode 100644 index 000000000..fe2ce8d3e --- /dev/null +++ b/frigate/test/test_env.py @@ -0,0 +1,105 @@ +"""Tests for environment variable handling.""" + +import os 
+import unittest + +from frigate.config.env import ( + FRIGATE_ENV_VARS, + validate_env_string, + validate_env_vars, +) + + +class TestEnvString(unittest.TestCase): + def setUp(self): + self._original_env_vars = dict(FRIGATE_ENV_VARS) + + def tearDown(self): + FRIGATE_ENV_VARS.clear() + FRIGATE_ENV_VARS.update(self._original_env_vars) + + def test_substitution(self): + """EnvString substitutes FRIGATE_ env vars.""" + FRIGATE_ENV_VARS["FRIGATE_TEST_HOST"] = "192.168.1.100" + result = validate_env_string("{FRIGATE_TEST_HOST}") + self.assertEqual(result, "192.168.1.100") + + def test_substitution_in_url(self): + """EnvString substitutes vars embedded in a URL.""" + FRIGATE_ENV_VARS["FRIGATE_CAM_USER"] = "admin" + FRIGATE_ENV_VARS["FRIGATE_CAM_PASS"] = "secret" + result = validate_env_string( + "rtsp://{FRIGATE_CAM_USER}:{FRIGATE_CAM_PASS}@10.0.0.1/stream" + ) + self.assertEqual(result, "rtsp://admin:secret@10.0.0.1/stream") + + def test_no_placeholder(self): + """Plain strings pass through unchanged.""" + result = validate_env_string("192.168.1.1") + self.assertEqual(result, "192.168.1.1") + + def test_unknown_var_raises(self): + """Referencing an unknown var raises KeyError.""" + with self.assertRaises(KeyError): + validate_env_string("{FRIGATE_NONEXISTENT_VAR}") + + +class TestEnvVars(unittest.TestCase): + def setUp(self): + self._original_env_vars = dict(FRIGATE_ENV_VARS) + self._original_environ = os.environ.copy() + + def tearDown(self): + FRIGATE_ENV_VARS.clear() + FRIGATE_ENV_VARS.update(self._original_env_vars) + # Clean up any env vars we set + for key in list(os.environ.keys()): + if key not in self._original_environ: + del os.environ[key] + + def _make_context(self, install: bool): + """Create a mock ValidationInfo with the given install flag.""" + + class MockContext: + def __init__(self, ctx): + self.context = ctx + + mock = MockContext({"install": install}) + return mock + + def test_install_sets_os_environ(self): + """validate_env_vars with install=True 
sets os.environ.""" + ctx = self._make_context(install=True) + validate_env_vars({"MY_CUSTOM_VAR": "value123"}, ctx) + self.assertEqual(os.environ.get("MY_CUSTOM_VAR"), "value123") + + def test_install_updates_frigate_env_vars(self): + """validate_env_vars with install=True updates FRIGATE_ENV_VARS for FRIGATE_ keys.""" + ctx = self._make_context(install=True) + validate_env_vars({"FRIGATE_MQTT_PASS": "secret"}, ctx) + self.assertEqual(FRIGATE_ENV_VARS["FRIGATE_MQTT_PASS"], "secret") + + def test_install_skips_non_frigate_in_env_vars_dict(self): + """Non-FRIGATE_ keys are set in os.environ but not in FRIGATE_ENV_VARS.""" + ctx = self._make_context(install=True) + validate_env_vars({"OTHER_VAR": "value"}, ctx) + self.assertEqual(os.environ.get("OTHER_VAR"), "value") + self.assertNotIn("OTHER_VAR", FRIGATE_ENV_VARS) + + def test_no_install_does_not_set(self): + """validate_env_vars without install=True does not modify state.""" + ctx = self._make_context(install=False) + validate_env_vars({"FRIGATE_SKIP": "nope"}, ctx) + self.assertNotIn("FRIGATE_SKIP", FRIGATE_ENV_VARS) + self.assertNotIn("FRIGATE_SKIP", os.environ) + + def test_env_vars_available_for_env_string(self): + """Vars set via validate_env_vars are usable in validate_env_string.""" + ctx = self._make_context(install=True) + validate_env_vars({"FRIGATE_BROKER": "mqtt.local"}, ctx) + result = validate_env_string("{FRIGATE_BROKER}") + self.assertEqual(result, "mqtt.local") + + +if __name__ == "__main__": + unittest.main() From c4a5ac0e774bcf731583d69a05748a6edf9ef312 Mon Sep 17 00:00:00 2001 From: Josh Hawkins <32435876+hawkeye217@users.noreply.github.com> Date: Mon, 9 Mar 2026 10:33:14 -0500 Subject: [PATCH 19/38] fix go2rtc homekit handling (#22346) The file needs to be blank if not using HomeKit, not {}; it seems go2rtc does not parse {} as YAML. --- .../rootfs/etc/s6-overlay/s6-rc.d/go2rtc/run | 32 ++++++++++--------- 1 file changed, 17 insertions(+), 15 deletions(-) diff --git
a/docker/main/rootfs/etc/s6-overlay/s6-rc.d/go2rtc/run b/docker/main/rootfs/etc/s6-overlay/s6-rc.d/go2rtc/run index 7df29f8f5..599ab887e 100755 --- a/docker/main/rootfs/etc/s6-overlay/s6-rc.d/go2rtc/run +++ b/docker/main/rootfs/etc/s6-overlay/s6-rc.d/go2rtc/run @@ -55,7 +55,7 @@ function setup_homekit_config() { if [[ ! -f "${config_path}" ]]; then echo "[INFO] Creating empty config file for HomeKit..." - echo '{}' > "${config_path}" + : > "${config_path}" fi # Convert YAML to JSON for jq processing @@ -65,23 +65,25 @@ function setup_homekit_config() { return 0 } - # Use jq to filter and keep only the homekit section - local cleaned_json="/tmp/cache/homekit_cleaned.json" - jq ' - # Keep only the homekit section if it exists, otherwise empty object - if has("homekit") then {homekit: .homekit} else {} end - ' "${temp_json}" > "${cleaned_json}" 2>/dev/null || { - echo '{}' > "${cleaned_json}" - } + # Use jq to extract the homekit section, if it exists + local homekit_json + homekit_json=$(jq ' + if has("homekit") then {homekit: .homekit} else null end + ' "${temp_json}" 2>/dev/null) || homekit_json="null" - # Convert back to YAML and write to the config file - yq eval -P "${cleaned_json}" > "${config_path}" 2>/dev/null || { - echo "[WARNING] Failed to convert cleaned config to YAML, creating minimal config" - echo '{}' > "${config_path}" - } + # If no homekit section, write an empty config file + if [[ "${homekit_json}" == "null" ]]; then + : > "${config_path}" + else + # Convert homekit JSON back to YAML and write to the config file + echo "${homekit_json}" | yq eval -P - > "${config_path}" 2>/dev/null || { + echo "[WARNING] Failed to convert cleaned config to YAML, creating minimal config" + : > "${config_path}" + } + fi # Clean up temp files - rm -f "${temp_json}" "${cleaned_json}" + rm -f "${temp_json}" } set_libva_version From 1188d87588d5e9aef35c4311c70e08542942edf7 Mon Sep 17 00:00:00 2001 From: Josh Hawkins <32435876+hawkeye217@users.noreply.github.com> Date: 
Mon, 9 Mar 2026 16:50:46 -0500 Subject: [PATCH 20/38] Save detect dimensions to config on add camera wizard save (#22349) * add util for optimal detect resolution * add detect to type * save optimal detect resolution to config on wizard save * use const --- .../settings/CameraWizardDialog.tsx | 24 +++++++++- web/src/types/cameraWizard.ts | 4 ++ web/src/utils/cameraUtil.ts | 45 +++++++++++++++++++ 3 files changed, 72 insertions(+), 1 deletion(-) diff --git a/web/src/components/settings/CameraWizardDialog.tsx b/web/src/components/settings/CameraWizardDialog.tsx index 5846fd9a2..74969290a 100644 --- a/web/src/components/settings/CameraWizardDialog.tsx +++ b/web/src/components/settings/CameraWizardDialog.tsx @@ -20,7 +20,10 @@ import type { CameraConfigData, ConfigSetBody, } from "@/types/cameraWizard"; -import { processCameraName } from "@/utils/cameraUtil"; +import { + processCameraName, + calculateDetectDimensions, +} from "@/utils/cameraUtil"; import { cn } from "@/lib/utils"; type WizardState = { @@ -203,6 +206,25 @@ export default function CameraWizardDialog({ }, }; + // Calculate detect dimensions from the detect stream's probed resolution + const detectStream = wizardData.streams.find((stream) => + stream.roles.includes("detect"), + ); + if (detectStream?.testResult?.resolution) { + const [streamWidth, streamHeight] = detectStream.testResult.resolution + .split("x") + .map(Number); + if (streamWidth > 0 && streamHeight > 0) { + const detectDimensions = calculateDetectDimensions( + streamWidth, + streamHeight, + ); + if (detectDimensions) { + configData.cameras[finalCameraName].detect = detectDimensions; + } + } + } + // Add live.streams configuration for go2rtc streams if (wizardData.streams && wizardData.streams.length > 0) { configData.cameras[finalCameraName].live = { diff --git a/web/src/types/cameraWizard.ts b/web/src/types/cameraWizard.ts index 4048303cb..20e843635 100644 --- a/web/src/types/cameraWizard.ts +++ b/web/src/types/cameraWizard.ts @@ -162,6 
+162,10 @@ export type CameraConfigData = { input_args?: string; }[]; }; + detect?: { + width: number; + height: number; + }; live?: { streams: Record<string, string>; }; diff --git a/web/src/utils/cameraUtil.ts b/web/src/utils/cameraUtil.ts index 543605ad0..07295d73c 100644 --- a/web/src/utils/cameraUtil.ts +++ b/web/src/utils/cameraUtil.ts @@ -115,6 +115,51 @@ export type CameraAudioFeatures = { * @param requireSecureContext - If true, two-way audio requires secure context (default: true) * @returns CameraAudioFeatures object with detected capabilities */ +/** + * Calculates optimal detect dimensions from stream resolution. + * + * Scales dimensions to an efficient size for object detection while + * preserving the stream's aspect ratio. Does not upscale. + * + * @param streamWidth - Native stream width in pixels + * @param streamHeight - Native stream height in pixels + * @returns Detect dimensions with even values, or null if inputs are invalid + */ + +// Target size for the smaller dimension (width or height) for detect streams +export const DETECT_TARGET_PX = 720; + +export function calculateDetectDimensions( + streamWidth: number, + streamHeight: number, +): { width: number; height: number } | null { + if ( + !Number.isFinite(streamWidth) || + !Number.isFinite(streamHeight) || + streamWidth <= 0 || + streamHeight <= 0 + ) { + return null; + } + + const smallerDim = Math.min(streamWidth, streamHeight); + const target = Math.min(DETECT_TARGET_PX, smallerDim); + const scale = target / smallerDim; + + let width = Math.round(streamWidth * scale); + let height = Math.round(streamHeight * scale); + + // Round down to even numbers (required for video processing) + width = width - (width % 2); + height = height - (height % 2); + + if (width < 2 || height < 2) { + return null; + } + + return { width, height }; +} + export function detectCameraAudioFeatures( metadata: LiveStreamMetadata | null | undefined, requireSecureContext: boolean = true, From
19480867fbfb9b757fc7612ceb582d0dc9a4b60f Mon Sep 17 00:00:00 2001 From: GuoQing Liu <842607283@qq.com> Date: Tue, 10 Mar 2026 21:18:02 +0800 Subject: [PATCH 21/38] docs: add highlight magic comments (#22367) --- docs/docs/configuration/authentication.md | 2 +- docs/docs/configuration/birdseye.md | 9 ++++--- docs/docs/configuration/camera_specific.md | 6 +++-- docs/docs/configuration/cameras.md | 2 +- .../object_classification.md | 1 + .../state_classification.md | 1 + docs/docs/configuration/genai/config.md | 4 +-- .../configuration/genai/review_summaries.md | 5 ++-- .../hardware_acceleration_video.md | 25 ++++++++++--------- .../license_plate_recognition.md | 3 ++- docs/docs/configuration/live.md | 6 ++--- docs/docs/configuration/object_detectors.md | 4 +-- docs/docs/configuration/record.md | 2 +- docs/docs/configuration/restream.md | 4 ++- docs/docs/configuration/review.md | 2 +- docs/docs/configuration/tls.md | 6 ++--- docs/docs/configuration/zones.md | 6 ++++- docs/docs/guides/getting_started.md | 13 +++++----- docs/docusaurus.config.ts | 11 ++++++++ docs/src/css/custom.css | 8 ++++++ 20 files changed, 78 insertions(+), 42 deletions(-) diff --git a/docs/docs/configuration/authentication.md b/docs/docs/configuration/authentication.md index a312b5944..694c4bada 100644 --- a/docs/docs/configuration/authentication.md +++ b/docs/docs/configuration/authentication.md @@ -232,7 +232,7 @@ The viewer role provides read-only access to all cameras in the UI and API. Cust ### Role Configuration Example -```yaml +```yaml {11-16} cameras: front_door: # ... 
camera config diff --git a/docs/docs/configuration/birdseye.md b/docs/docs/configuration/birdseye.md index d4bd1a15e..f48299aec 100644 --- a/docs/docs/configuration/birdseye.md +++ b/docs/docs/configuration/birdseye.md @@ -24,7 +24,7 @@ A custom icon can be added to the birdseye background by providing a 180x180 ima If you want to include a camera in Birdseye view only for specific circumstances, or just don't include it at all, the Birdseye setting can be set at the camera level. -```yaml +```yaml {8-10,12-14} # Include all cameras by default in Birdseye view birdseye: enabled: True @@ -48,6 +48,7 @@ By default birdseye shows all cameras that have had the configured activity in t ```yaml birdseye: enabled: True + # highlight-next-line inactivity_threshold: 15 ``` @@ -78,9 +79,11 @@ birdseye: cameras: front: birdseye: + # highlight-next-line order: 1 back: birdseye: + # highlight-next-line order: 2 ``` @@ -92,7 +95,7 @@ It is possible to limit the number of cameras shown on birdseye at one time. Whe For example, this can be configured to only show the most recently active camera. -```yaml +```yaml {3-4} birdseye: enabled: True layout: @@ -103,7 +106,7 @@ birdseye: By default birdseye tries to fit 2 cameras in each row and then double in size until a suitable layout is found. The scaling can be configured with a value between 1.0 and 5.0 depending on use case. 
-```yaml +```yaml {3-4} birdseye: enabled: True layout: diff --git a/docs/docs/configuration/camera_specific.md b/docs/docs/configuration/camera_specific.md index 50d5c52aa..c18b87f2e 100644 --- a/docs/docs/configuration/camera_specific.md +++ b/docs/docs/configuration/camera_specific.md @@ -23,6 +23,7 @@ Some cameras support h265 with different formats, but Safari only supports the a cameras: h265_cam: # <------ Doesn't matter what the camera is called ffmpeg: + # highlight-next-line apple_compatibility: true # <- Adds compatibility with MacOS and iPhone ``` @@ -30,7 +31,7 @@ cameras: Note that mjpeg cameras require encoding the video into h264 for recording, and restream roles. This will use significantly more CPU than if the cameras supported h264 feeds directly. It is recommended to use the restream role to create an h264 restream and then use that as the source for ffmpeg. -```yaml +```yaml {3,10} go2rtc: streams: mjpeg_cam: "ffmpeg:http://your_mjpeg_stream_url#video=h264#hardware" # <- use hardware acceleration to create an h264 stream usable for other components. @@ -96,6 +97,7 @@ This camera is H.265 only. 
To be able to play clips on some devices (like MacOs cameras: annkec800: # <------ Name the camera ffmpeg: + # highlight-next-line apple_compatibility: true # <- Adds compatibility with MacOS and iPhone output_args: record: preset-record-generic-audio-aac @@ -274,7 +276,7 @@ To use a USB camera (webcam) with Frigate, the recommendation is to use go2rtc's - In your Frigate Configuration File, add the go2rtc stream and roles as appropriate: -``` +```yaml {4,11-12} go2rtc: streams: usb_camera: diff --git a/docs/docs/configuration/cameras.md b/docs/docs/configuration/cameras.md index 47efa5bba..eed430b52 100644 --- a/docs/docs/configuration/cameras.md +++ b/docs/docs/configuration/cameras.md @@ -66,7 +66,7 @@ Not every PTZ supports ONVIF, which is the standard protocol Frigate uses to com Add the onvif section to your camera in your configuration file: -```yaml +```yaml {4-8} cameras: back: ffmpeg: ... diff --git a/docs/docs/configuration/custom_classification/object_classification.md b/docs/docs/configuration/custom_classification/object_classification.md index fe1b9d0ea..caf05d8f3 100644 --- a/docs/docs/configuration/custom_classification/object_classification.md +++ b/docs/docs/configuration/custom_classification/object_classification.md @@ -118,6 +118,7 @@ Enable debug logs for classification models by adding `frigate.data_processing.r logger: default: info logs: + # highlight-next-line frigate.data_processing.real_time.custom_classification: debug ``` diff --git a/docs/docs/configuration/custom_classification/state_classification.md b/docs/docs/configuration/custom_classification/state_classification.md index ad6fb92fc..c41d05439 100644 --- a/docs/docs/configuration/custom_classification/state_classification.md +++ b/docs/docs/configuration/custom_classification/state_classification.md @@ -85,6 +85,7 @@ Enable debug logs for classification models by adding `frigate.data_processing.r logger: default: info logs: + # highlight-next-line 
frigate.data_processing.real_time.custom_classification: debug ``` diff --git a/docs/docs/configuration/genai/config.md b/docs/docs/configuration/genai/config.md index e1f79b744..cde503e8b 100644 --- a/docs/docs/configuration/genai/config.md +++ b/docs/docs/configuration/genai/config.md @@ -109,7 +109,7 @@ genai: To use a different Gemini-compatible API endpoint, set the `provider_options` with the `base_url` key to your provider's API URL. For example: -``` +```yaml {4,5} genai: provider: gemini ... @@ -152,7 +152,7 @@ To use a different OpenAI-compatible API endpoint, set the `OPENAI_BASE_URL` env For OpenAI-compatible servers (such as llama.cpp) that don't expose the configured context size in the API response, you can manually specify the context size in `provider_options`: -```yaml +```yaml {5,6} genai: provider: openai base_url: http://your-llama-server diff --git a/docs/docs/configuration/genai/review_summaries.md b/docs/docs/configuration/genai/review_summaries.md index df287446c..c0d677a01 100644 --- a/docs/docs/configuration/genai/review_summaries.md +++ b/docs/docs/configuration/genai/review_summaries.md @@ -80,6 +80,7 @@ By default, review summaries use preview images (cached preview frames) which ha review: genai: enabled: true + # highlight-next-line image_source: recordings # Options: "preview" (default) or "recordings" ``` @@ -104,7 +105,7 @@ If recordings are not available for a given time period, the system will automat Along with the concern of suspicious activity or immediate threat, you may have concerns such as animals in your garden or a gate being left open. These concerns can be configured so that the review summaries will make note of them if the activity requires additional review. For example: -```yaml +```yaml {4,5} review: genai: enabled: true @@ -116,7 +117,7 @@ review: By default, review summaries are generated in English. 
You can configure Frigate to generate summaries in your preferred language by setting the `preferred_language` option: -```yaml +```yaml {4} review: genai: enabled: true diff --git a/docs/docs/configuration/hardware_acceleration_video.md b/docs/docs/configuration/hardware_acceleration_video.md index 46be3bb50..318e1b23e 100644 --- a/docs/docs/configuration/hardware_acceleration_video.md +++ b/docs/docs/configuration/hardware_acceleration_video.md @@ -117,12 +117,13 @@ services: frigate: ... image: ghcr.io/blakeblackshear/frigate:stable + # highlight-next-line privileged: true ``` ##### Docker Run CLI - Privileged -```bash +```bash {4} docker run -d \ --name frigate \ ... @@ -136,7 +137,7 @@ Only recent versions of Docker support the `CAP_PERFMON` capability. You can tes ##### Docker Compose - CAP_PERFMON -```yaml +```yaml {5,6} services: frigate: ... @@ -147,7 +148,7 @@ services: ##### Docker Run CLI - CAP_PERFMON -```bash +```bash {4} docker run -d \ --name frigate \ ... @@ -214,7 +215,7 @@ Additional configuration is needed for the Docker container to be able to access #### Docker Compose - Nvidia GPU -```yaml +```yaml {5-12} services: frigate: ... @@ -231,7 +232,7 @@ services: #### Docker Run CLI - Nvidia GPU -```bash +```bash {4} docker run -d \ --name frigate \ ... @@ -310,7 +311,7 @@ ffmpeg: If running Frigate through Docker, you either need to run in privileged mode or map the `/dev/video*` devices to Frigate. With Docker Compose add: -```yaml +```yaml {4-5} services: frigate: ... @@ -320,7 +321,7 @@ services: Or with `docker run`: -```bash +```bash {4} docker run -d \ --name frigate \ ... @@ -352,7 +353,7 @@ You will need to use the image with the nvidia container runtime: ### Docker Run CLI - Jetson -```bash +```bash {3} docker run -d \ ... --runtime nvidia @@ -361,7 +362,7 @@ docker run -d \ ### Docker Compose - Jetson -```yaml +```yaml {5} services: frigate: ... @@ -452,14 +453,14 @@ Restarting ffmpeg... you should try to uprade to FFmpeg 7. 
This can be done using this config option: -``` +```yaml ffmpeg: path: "7.0" ``` You can set this option globally to use FFmpeg 7 for all cameras or on camera level to use it only for specific cameras. Do not confuse this option with: -``` +```yaml cameras: name: ffmpeg: @@ -481,7 +482,7 @@ Make sure to follow the [Synaptics specific installation instructions](/frigate/ Add one of the following FFmpeg presets to your `config.yml` to enable hardware video processing: -```yaml +```yaml {2} ffmpeg: hwaccel_args: -c:v h264_v4l2m2m input_args: preset-rtsp-restream diff --git a/docs/docs/configuration/license_plate_recognition.md b/docs/docs/configuration/license_plate_recognition.md index 0450bcef2..a44006b63 100644 --- a/docs/docs/configuration/license_plate_recognition.md +++ b/docs/docs/configuration/license_plate_recognition.md @@ -43,7 +43,7 @@ lpr: Like other enrichments in Frigate, LPR **must be enabled globally** to use the feature. You should disable it for specific cameras at the camera level if you don't want to run LPR on cars on those cameras: -```yaml +```yaml {4,5} cameras: garage: ... @@ -391,6 +391,7 @@ Start with ["Why isn't my license plate being detected and recognized?"](#why-is logger: default: info logs: + # highlight-next-line frigate.data_processing.common.license_plate: debug ``` diff --git a/docs/docs/configuration/live.md b/docs/docs/configuration/live.md index c55d29a59..8e7eff163 100644 --- a/docs/docs/configuration/live.md +++ b/docs/docs/configuration/live.md @@ -77,7 +77,7 @@ Configure the `streams` option with a "friendly name" for your stream followed b Using Frigate's internal version of go2rtc is required to use this feature. You cannot specify paths in the `streams` configuration, only go2rtc stream names. -```yaml +```yaml {3,6,8,25-29} go2rtc: streams: test_cam: @@ -116,7 +116,7 @@ WebRTC works by creating a TCP or UDP connection on port `8555`. 
However, it req - For external access, over the internet, setup your router to forward port `8555` to port `8555` on the Frigate device, for both TCP and UDP. - For internal/local access, unless you are running through the HA App, you will also need to set the WebRTC candidates list in the go2rtc config. For example, if `192.168.1.10` is the local IP of the device running Frigate: - ```yaml title="config.yml" + ```yaml title="config.yml" {4-7} go2rtc: streams: test_cam: ... @@ -154,7 +154,7 @@ If not running in host mode, port 8555 will need to be mapped for the container: docker-compose.yml -```yaml +```yaml {4-6} services: frigate: ... diff --git a/docs/docs/configuration/object_detectors.md b/docs/docs/configuration/object_detectors.md index 0a80ea463..9bdacfb28 100644 --- a/docs/docs/configuration/object_detectors.md +++ b/docs/docs/configuration/object_detectors.md @@ -572,7 +572,7 @@ $ docker run --device=/dev/kfd --device=/dev/dri \ When using Docker Compose: -```yaml +```yaml {4-6} services: frigate: ... @@ -603,7 +603,7 @@ $ docker run -e HSA_OVERRIDE_GFX_VERSION=10.0.0 \ When using Docker Compose: -```yaml +```yaml {4-5} services: frigate: ... diff --git a/docs/docs/configuration/record.md b/docs/docs/configuration/record.md index 4dfd8b77c..4d696dad0 100644 --- a/docs/docs/configuration/record.md +++ b/docs/docs/configuration/record.md @@ -130,7 +130,7 @@ When exporting a time-lapse the default speed-up is 25x with 30 FPS. This means To configure the speed-up factor, the frame rate and further custom settings, the configuration parameter `timelapse_args` can be used. 
The below configuration example would change the time-lapse speed to 60x (for fitting 1 hour of recording into 1 minute of time-lapse) with 25 FPS: -```yaml +```yaml {3-4} record: enabled: True export: diff --git a/docs/docs/configuration/restream.md b/docs/docs/configuration/restream.md index ebd506294..875d9a292 100644 --- a/docs/docs/configuration/restream.md +++ b/docs/docs/configuration/restream.md @@ -34,7 +34,7 @@ To improve connection speed when using Birdseye via restream you can enable a sm The go2rtc restream can be secured with RTSP based username / password authentication. Ex: -```yaml +```yaml {2-4} go2rtc: rtsp: username: "admin" @@ -147,6 +147,7 @@ For example: ```yaml go2rtc: streams: + # highlight-error-line my_camera: rtsp://username:$@foo%@192.168.1.100 ``` @@ -155,6 +156,7 @@ becomes ```yaml go2rtc: streams: + # highlight-next-line my_camera: rtsp://username:$%40foo%25@192.168.1.100 ``` diff --git a/docs/docs/configuration/review.md b/docs/docs/configuration/review.md index 752c496a3..d8769749b 100644 --- a/docs/docs/configuration/review.md +++ b/docs/docs/configuration/review.md @@ -71,7 +71,7 @@ To exclude a specific camera from alerts or detections, simply provide an empty For example, to exclude objects on the camera _gatecamera_ from any detections, include this in your config: -```yaml +```yaml {3-5} cameras: gatecamera: review: diff --git a/docs/docs/configuration/tls.md b/docs/docs/configuration/tls.md index 5c3867ea6..b4bfc1842 100644 --- a/docs/docs/configuration/tls.md +++ b/docs/docs/configuration/tls.md @@ -20,7 +20,7 @@ tls: TLS certificates can be mounted at `/etc/letsencrypt/live/frigate` using a bind mount or docker volume. -```yaml +```yaml {3-4} frigate: ... 
volumes: @@ -32,7 +32,7 @@ Within the folder, the private key is expected to be named `privkey.pem` and the Note that certbot uses symlinks, and those can't be followed by the container unless it has access to the targets as well, so if using certbot you'll also have to mount the `archive` folder for your domain, e.g.: -```yaml +```yaml {3-5} frigate: ... volumes: @@ -46,7 +46,7 @@ Frigate automatically compares the fingerprint of the certificate at `/etc/letse If you issue Frigate valid certificates you will likely want to configure it to run on port 443 so you can access it without a port number like `https://your-frigate-domain.com` by mapping 8971 to 443. -```yaml +```yaml {3-4} frigate: ... ports: diff --git a/docs/docs/configuration/zones.md b/docs/docs/configuration/zones.md index c0a11d4f6..856fe9b48 100644 --- a/docs/docs/configuration/zones.md +++ b/docs/docs/configuration/zones.md @@ -18,7 +18,7 @@ To create a zone, follow [the steps for a "Motion mask"](masks.md), but use the Often you will only want alerts to be created when an object enters areas of interest. This is done using zones along with setting required_zones. Let's say you only want to have an alert created when an object enters your entire_yard zone, the config would be: -```yaml +```yaml {6,8} cameras: name_of_your_camera: review: @@ -104,6 +104,7 @@ cameras: name_of_your_camera: zones: sidewalk: + # highlight-next-line loitering_time: 4 # unit is in seconds objects: - person @@ -118,6 +119,7 @@ cameras: name_of_your_camera: zones: front_yard: + # highlight-next-line inertia: 3 objects: - person @@ -130,6 +132,7 @@ cameras: name_of_your_camera: zones: driveway_entrance: + # highlight-next-line inertia: 1 objects: - car @@ -192,5 +195,6 @@ cameras: coordinates: ... distances: ... 
inertia: 1 + # highlight-next-line speed_threshold: 20 # unit is in kph or mph, depending on how unit_system is set (see above) ``` diff --git a/docs/docs/guides/getting_started.md b/docs/docs/guides/getting_started.md index 1100a759b..7bdf3d162 100644 --- a/docs/docs/guides/getting_started.md +++ b/docs/docs/guides/getting_started.md @@ -150,7 +150,7 @@ Here is an example configuration with hardware acceleration configured to work w `docker-compose.yml` (after modifying, you will need to run `docker compose up -d` to apply changes) -```yaml +```yaml {4,5} services: frigate: ... @@ -168,6 +168,7 @@ cameras: name_of_your_camera: ffmpeg: inputs: ... + # highlight-next-line hwaccel_args: preset-vaapi detect: ... ``` @@ -183,7 +184,7 @@ In many cases, the integrated graphics on Intel CPUs provides sufficient perform You need to refer to **Configure hardware acceleration** above to enable the container to use the GPU. -```yaml +```yaml {3-6,9-15,20-21} mqtt: ... detectors: # <---- add detectors @@ -217,7 +218,7 @@ If you have a USB Coral, you will need to add a detectors section to your config `docker-compose.yml` (after modifying, you will need to run `docker compose up -d` to apply changes) -```yaml +```yaml {4-6} services: frigate: ... @@ -227,7 +228,7 @@ services: ... ``` -```yaml +```yaml {3-6,11-12} mqtt: ... detectors: # <---- add detectors @@ -263,7 +264,7 @@ Note that motion masks should not be used to mark out areas where you do not wan Your configuration should look similar to this now. -```yaml +```yaml {16-18} mqtt: enabled: False @@ -290,7 +291,7 @@ In order to review activity in the Frigate UI, recordings need to be enabled. To enable recording video, add the `record` role to a stream and enable it in the config. If record is disabled in the config, it won't be possible to enable it in the UI. -```yaml +```yaml {16-17} mqtt: ... detectors: ... 
diff --git a/docs/docusaurus.config.ts b/docs/docusaurus.config.ts index dca948953..e11cdd555 100644 --- a/docs/docusaurus.config.ts +++ b/docs/docusaurus.config.ts @@ -83,6 +83,17 @@ const config: Config = { }, }, prism: { + magicComments:[ + { + className: 'theme-code-block-highlighted-line', + line: 'highlight-next-line', + block: {start: 'highlight-start', end: 'highlight-end'}, + }, + { + className: 'code-block-error-line', + line: 'highlight-error-line', + }, + ], additionalLanguages: ["bash", "json"], }, languageTabs: [ diff --git a/docs/src/css/custom.css b/docs/src/css/custom.css index 9a572ec1f..5d8fc5055 100644 --- a/docs/src/css/custom.css +++ b/docs/src/css/custom.css @@ -234,3 +234,11 @@ content: "schema"; color: var(--ifm-color-secondary-contrast-foreground); } + +.code-block-error-line { + background-color: #ff000020; + display: block; + margin: 0 calc(-1 * var(--ifm-pre-padding)); + padding: 0 var(--ifm-pre-padding); + border-left: 3px solid #ff000080; +} \ No newline at end of file From 59fc8449edd09aefe758624e1663e6dcb305ccfc Mon Sep 17 00:00:00 2001 From: Nicolas Mowen Date: Tue, 10 Mar 2026 13:26:45 -0600 Subject: [PATCH 22/38] Various Fixes (#22376) * Correctly send topic with role value * Fix missing previews * Catch other one --- frigate/data_processing/post/review_descriptions.py | 7 +++++++ frigate/video.py | 4 ++-- 2 files changed, 9 insertions(+), 2 deletions(-) diff --git a/frigate/data_processing/post/review_descriptions.py b/frigate/data_processing/post/review_descriptions.py index 0a2754468..9949d766c 100644 --- a/frigate/data_processing/post/review_descriptions.py +++ b/frigate/data_processing/post/review_descriptions.py @@ -463,6 +463,13 @@ class ReviewDescriptionProcessor(PostProcessorApi): thumbs = [] for idx, thumb_path in enumerate(frame_paths): thumb_data = cv2.imread(thumb_path) + + if thumb_data is None: + logger.warning( + "Could not read preview frame at %s, skipping", thumb_path + ) + continue + ret, jpg = cv2.imencode( 
".jpg", thumb_data, [int(cv2.IMWRITE_JPEG_QUALITY), 100] ) diff --git a/frigate/video.py b/frigate/video.py index 112844543..38a397404 100755 --- a/frigate/video.py +++ b/frigate/video.py @@ -436,7 +436,7 @@ class CameraWatchdog(threading.Thread): for role in p["roles"]: self.requestor.send_data( - f"{self.config.name}/status/{role}", "offline" + f"{self.config.name}/status/{role.value}", "offline" ) continue @@ -451,7 +451,7 @@ class CameraWatchdog(threading.Thread): for role in p["roles"]: self.requestor.send_data( - f"{self.config.name}/status/{role}", "offline" + f"{self.config.name}/status/{role.value}", "offline" ) p["logpipe"].dump() From 104e623923b017fac5ec080846fea5de2e13ba9b Mon Sep 17 00:00:00 2001 From: Josh Hawkins <32435876+hawkeye217@users.noreply.github.com> Date: Wed, 11 Mar 2026 09:26:09 -0500 Subject: [PATCH 23/38] Filter push notifications by user role camera access (#22385) * filter push notifications by user camera access with cached role resolution * simplify --- frigate/api/auth.py | 3 ++ frigate/comms/webpush.py | 70 +++++++++++++++++++++++++++++++++------- 2 files changed, 62 insertions(+), 11 deletions(-) diff --git a/frigate/api/auth.py b/frigate/api/auth.py index 7c3a231ed..d774b3697 100644 --- a/frigate/api/auth.py +++ b/frigate/api/auth.py @@ -837,6 +837,7 @@ def create_user( User.notification_tokens: [], } ).execute() + request.app.config_publisher.publisher.publish("config/auth", None) return JSONResponse(content={"username": body.username}) @@ -854,6 +855,7 @@ def delete_user(request: Request, username: str): ) User.delete_by_id(username) + request.app.config_publisher.publisher.publish("config/auth", None) return JSONResponse(content={"success": True}) @@ -973,6 +975,7 @@ async def update_role( ) User.set_by_id(username, {User.role: body.role}) + request.app.config_publisher.publisher.publish("config/auth", None) return JSONResponse(content={"success": True}) diff --git a/frigate/comms/webpush.py b/frigate/comms/webpush.py index 
62cc12c9a..30de43a68 100644 --- a/frigate/comms/webpush.py +++ b/frigate/comms/webpush.py @@ -17,6 +17,7 @@ from titlecase import titlecase from frigate.comms.base_communicator import Communicator from frigate.comms.config_updater import ConfigSubscriber from frigate.config import FrigateConfig +from frigate.config.auth import AuthConfig from frigate.config.camera.updater import ( CameraConfigUpdateEnum, CameraConfigUpdateSubscriber, @@ -58,6 +59,7 @@ class WebPushClient(Communicator): for c in self.config.cameras.values() } self.last_notification_time: float = 0 + self.user_cameras: dict[str, set[str]] = {} self.notification_queue: queue.Queue[PushNotification] = queue.Queue() self.notification_thread = threading.Thread( target=self._process_notifications, daemon=True @@ -78,13 +80,12 @@ class WebPushClient(Communicator): for sub in user["notification_tokens"]: self.web_pushers[user["username"]].append(WebPusher(sub)) - # notification config updater - self.global_config_subscriber = ConfigSubscriber( - "config/notifications", exact=True - ) + # notification and auth config updater + self.global_config_subscriber = ConfigSubscriber("config/") self.config_subscriber = CameraConfigUpdateSubscriber( self.config, self.config.cameras, [CameraConfigUpdateEnum.notifications] ) + self._refresh_user_cameras() def subscribe(self, receiver: Callable) -> None: """Wrapper for allowing dispatcher to subscribe.""" @@ -164,13 +165,19 @@ class WebPushClient(Communicator): def publish(self, topic: str, payload: Any, retain: bool = False) -> None: """Wrapper for publishing when client is in valid state.""" - # check for updated notification config - _, updated_notification_config = ( - self.global_config_subscriber.check_for_update() - ) - - if updated_notification_config: - self.config.notifications = updated_notification_config + # check for updated global config (notifications, auth) + while True: + config_topic, config_payload = ( + 
self.global_config_subscriber.check_for_update() + ) + if config_topic is None: + break + if config_topic == "config/notifications" and config_payload: + self.config.notifications = config_payload + elif config_topic == "config/auth": + if isinstance(config_payload, AuthConfig): + self.config.auth = config_payload + self._refresh_user_cameras() updates = self.config_subscriber.check_for_updates() @@ -291,6 +298,31 @@ class WebPushClient(Communicator): except Exception as e: logger.error(f"Error processing notification: {str(e)}") + def _refresh_user_cameras(self) -> None: + """Rebuild the user-to-cameras access cache from the database.""" + all_camera_names = set(self.config.cameras.keys()) + roles_dict = self.config.auth.roles + updated: dict[str, set[str]] = {} + for user in User.select(User.username, User.role).dicts().iterator(): + allowed = User.get_allowed_cameras( + user["role"], roles_dict, all_camera_names + ) + updated[user["username"]] = set(allowed) + logger.debug( + "User %s has access to cameras: %s", + user["username"], + ", ".join(allowed), + ) + self.user_cameras = updated + + def _user_has_camera_access(self, username: str, camera: str) -> bool: + """Check if a user has access to a specific camera based on cached roles.""" + allowed = self.user_cameras.get(username) + if allowed is None: + logger.debug(f"No camera access information found for user {username}") + return False + return camera in allowed + def _within_cooldown(self, camera: str) -> bool: now = datetime.datetime.now().timestamp() if now - self.last_notification_time < self.config.notifications.cooldown: @@ -418,6 +450,14 @@ class WebPushClient(Communicator): logger.debug(f"Sending push notification for {camera}, review ID {reviewId}") for user in self.web_pushers: + if not self._user_has_camera_access(user, camera): + logger.debug( + "Skipping notification for user %s - no access to camera %s", + user, + camera, + ) + continue + self.send_push_notification( user=user, payload=payload, 
@@ -465,6 +505,14 @@ class WebPushClient(Communicator): ) for user in self.web_pushers: + if not self._user_has_camera_access(user, camera): + logger.debug( + "Skipping notification for user %s - no access to camera %s", + user, + camera, + ) + continue + self.send_push_notification( user=user, payload=payload, From 544d3c6139cce1c1c3f07da228aefdb8bfd738fc Mon Sep 17 00:00:00 2001 From: Josh Hawkins <32435876+hawkeye217@users.noreply.github.com> Date: Wed, 11 Mar 2026 09:27:10 -0500 Subject: [PATCH 24/38] keep nav buttons visible (#22384) nav buttons would be hidden when closing and reopening dialog after selecting the tracking details pane --- web/src/components/overlay/detail/SearchDetailDialog.tsx | 9 +++++++++ 1 file changed, 9 insertions(+) diff --git a/web/src/components/overlay/detail/SearchDetailDialog.tsx b/web/src/components/overlay/detail/SearchDetailDialog.tsx index 01e211eec..85f237f66 100644 --- a/web/src/components/overlay/detail/SearchDetailDialog.tsx +++ b/web/src/components/overlay/detail/SearchDetailDialog.tsx @@ -495,6 +495,15 @@ export default function SearchDetailDialog({ } }, [search]); + useEffect(() => { + if (!isDesktop || !onPrevious || !onNext) { + setShowNavigationButtons(false); + return; + } + + setShowNavigationButtons(isOpen); + }, [isOpen, onNext, onPrevious]); + // show/hide annotation settings is handled inside TabsWithActions const searchTabs = useMemo(() => { From f29ee53fb4276da06fd42aefb67704abaab2f22c Mon Sep 17 00:00:00 2001 From: Nicolas Mowen Date: Fri, 13 Mar 2026 07:02:42 -0600 Subject: [PATCH 25/38] Add handler for license plate which is not expected to be stationary (#22416) --- frigate/track/stationary_classifier.py | 11 +++++++++++ 1 file changed, 11 insertions(+) diff --git a/frigate/track/stationary_classifier.py b/frigate/track/stationary_classifier.py index 832df5d31..bea37f641 100644 --- a/frigate/track/stationary_classifier.py +++ b/frigate/track/stationary_classifier.py @@ -55,6 +55,14 @@ 
DYNAMIC_OBJECT_THRESHOLDS = StationaryThresholds( motion_classifier_enabled=True, ) +# Thresholds for objects that are not expected to be stationary +NON_STATIONARY_OBJECT_THRESHOLDS = StationaryThresholds( + objects=["license_plate"], + known_active_iou=0.9, + stationary_check_iou=0.9, + max_stationary_history=4, +) + def get_stationary_threshold(label: str) -> StationaryThresholds: """Get the stationary thresholds for a given object label.""" @@ -65,6 +73,9 @@ def get_stationary_threshold(label: str) -> StationaryThresholds: if label in DYNAMIC_OBJECT_THRESHOLDS.objects: return DYNAMIC_OBJECT_THRESHOLDS + if label in NON_STATIONARY_OBJECT_THRESHOLDS.objects: + return NON_STATIONARY_OBJECT_THRESHOLDS + return StationaryThresholds() From 614a6b39d4d1dc844f98748671965079ba8e3e78 Mon Sep 17 00:00:00 2001 From: Josh Hawkins <32435876+hawkeye217@users.noreply.github.com> Date: Fri, 13 Mar 2026 08:05:56 -0500 Subject: [PATCH 26/38] consistently sort class names (#22415) keep None at the bottom --- .../overlay/ClassificationSelectionDialog.tsx | 28 ++-- .../classification/ModelTrainingView.tsx | 122 +++++++++--------- 2 files changed, 81 insertions(+), 69 deletions(-) diff --git a/web/src/components/overlay/ClassificationSelectionDialog.tsx b/web/src/components/overlay/ClassificationSelectionDialog.tsx index 6398348a4..f99011423 100644 --- a/web/src/components/overlay/ClassificationSelectionDialog.tsx +++ b/web/src/components/overlay/ClassificationSelectionDialog.tsx @@ -125,17 +125,23 @@ export default function ClassificationSelectionDialog({ isMobile && "gap-2 pb-4", )} > - {classes.sort().map((category) => ( - onCategorizeImage(category)} - > - {category === "none" - ? t("details.none") - : category.replaceAll("_", " ")} - - ))} + {classes + .sort((a, b) => { + if (a === "none") return 1; + if (b === "none") return -1; + return a.localeCompare(b); + }) + .map((category) => ( + onCategorizeImage(category)} + > + {category === "none" + ? 
t("details.none") + : category.replaceAll("_", " ")} + + ))} )} - {Object.keys(dataset).map((id) => ( - -
setPageToggle(id)} + {Object.keys(dataset) + .sort((a, b) => { + if (a === "none") return 1; + if (b === "none") return -1; + return a.localeCompare(b); + }) + .map((id) => ( + - {id === "none" ? t("details.none") : id.replaceAll("_", " ")} - - ({dataset?.[id].length}) - -
- {id != "none" && ( -
- - - - - - - {t("button.renameCategory")} - - - - - - - - - - {t("button.deleteCategory")} - - - +
setPageToggle(id)} + > + {id === "none" ? t("details.none") : id.replaceAll("_", " ")} + + ({dataset?.[id].length}) +
- )} - - ))} + {id != "none" && ( +
+ + + + + + + {t("button.renameCategory")} + + + + + + + + + + {t("button.deleteCategory")} + + + +
+ )} + + ))} From d2b2faa2d721beb758e19044ff64e02f400f382a Mon Sep 17 00:00:00 2001 From: leccelecce <24962424+leccelecce@users.noreply.github.com> Date: Fri, 13 Mar 2026 14:16:10 +0000 Subject: [PATCH 27/38] Update dev contrib docs with Python checks (#22419) --- docs/docs/development/contributing.md | 30 +++++++++++++++++++++++++++ 1 file changed, 30 insertions(+) diff --git a/docs/docs/development/contributing.md b/docs/docs/development/contributing.md index 0a3b76990..14c39e248 100644 --- a/docs/docs/development/contributing.md +++ b/docs/docs/development/contributing.md @@ -89,6 +89,14 @@ After closing VS Code, you may still have containers running. To close everythin ### Testing +#### Unit Tests + +GitHub will execute unit tests on new PRs. You must ensure that all tests pass. + +```shell +python3 -u -m unittest +``` + #### FFMPEG Hardware Acceleration The following commands are used inside the container to ensure hardware acceleration is working properly. @@ -125,6 +133,28 @@ ffmpeg -hwaccel vaapi -hwaccel_device /dev/dri/renderD128 -hwaccel_output_format ffmpeg -c:v h264_qsv -re -stream_loop -1 -i https://streams.videolan.org/ffmpeg/incoming/720p60.mp4 -f rawvideo -pix_fmt yuv420p pipe: > /dev/null ``` +### Submitting a pull request + +Code must be formatted, linted, and type-checked. GitHub will run these checks on pull requests, so it is advised to run them yourself prior to opening one.
+ +**Formatting** + +```shell +ruff format frigate migrations docker *.py +``` + +**Linting** + +```shell +ruff check frigate migrations docker *.py +``` + +**MyPy Static Typing** + +```shell +python3 -u -m mypy --config-file frigate/mypy.ini frigate +``` + ## Web Interface ### Prerequisites From b147b535224805d835f418827215e1f26e138daa Mon Sep 17 00:00:00 2001 From: Josh Hawkins <32435876+hawkeye217@users.noreply.github.com> Date: Fri, 13 Mar 2026 09:43:07 -0500 Subject: [PATCH 28/38] add padding to dropdown text (#22420) --- web/src/pages/FaceLibrary.tsx | 6 +++--- web/src/views/classification/ModelTrainingView.tsx | 6 +++--- 2 files changed, 6 insertions(+), 6 deletions(-) diff --git a/web/src/pages/FaceLibrary.tsx b/web/src/pages/FaceLibrary.tsx index 7595b3cd9..666057110 100644 --- a/web/src/pages/FaceLibrary.tsx +++ b/web/src/pages/FaceLibrary.tsx @@ -598,18 +598,18 @@ function LibrarySelector({ {Object.values(faces).map((face) => (
setPageToggle(face)} > {face} - + ({faceData?.[face].length})
-
+