diff --git a/.github/copilot-instructions.md b/.github/copilot-instructions.md
new file mode 100644
index 000000000..4fc44529d
--- /dev/null
+++ b/.github/copilot-instructions.md
@@ -0,0 +1,207 @@
+# Frigate AI Agent Instructions
+
+## Project Overview
+
+Frigate is a **local NVR with realtime AI object detection** for IP cameras. It's a multiprocess Python service that combines:
+
+- Video capture and processing from IP cameras
+- Realtime object detection (TensorFlow Lite, ONNX, and other model formats on CPU/GPU/AI accelerators)
+- Recording with retention policies
+- Event tracking and review UI
+- REST API + MQTT integration
+
+**Key architectural philosophy**: Minimize resource use by running expensive detection only where and when motion is detected. Heavy use of multiprocessing to sustain frame rates and scale across cameras.
+
+## Architecture Patterns
+
+### Multiprocess Communication
+
+Frigate uses **three primary IPC mechanisms** (plus shared memory for raw frames; no shared mutable state):
+
+1. **ZMQ pub/sub** (for one-way broadcasts):
+
+ - Config changes: `frigate/comms/config_updater.py` (PUB via `ipc:///tmp/cache/config`)
+ - Object detection signals: `frigate/comms/object_detector_signaler.py`
+ - Detection/event updates: detector signaler, event publishers
+
+2. **ZMQ req/rep** (request-reply):
+
+   - `frigate/comms/inter_process.py` - processes send requests and receive responses via `ipc:///tmp/cache/comms`
+ - Used by `InterProcessRequestor` for sync queries
+
+3. **Multiprocessing Queues** (frame data):
+ - `detection_queue`: Camera frames → object detectors
+ - `tracked_objects_queue`: Detected objects → event processor
+ - `timeline_queue`: Events → timeline storage
+
+**Key pattern**: Each service publishes to ZMQ topics, others subscribe. Config changes fan out via ZMQ pub/sub to all processes without central coordination.
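
That fan-out can be sketched with plain pyzmq (an illustrative sketch, not Frigate's actual classes: `ConfigPublisher`/`ConfigSubscriber` wrap sockets like these and bind `ipc:///tmp/cache/config`; `inproc://` is used here so the example is self-contained):

```python
import time

import zmq

ctx = zmq.Context.instance()

# Publisher side (in Frigate, the main app process).
pub = ctx.socket(zmq.PUB)
pub.bind("inproc://config")

# Subscriber side (in Frigate, any worker process interested in the topic).
sub = ctx.socket(zmq.SUB)
sub.connect("inproc://config")
sub.setsockopt_string(zmq.SUBSCRIBE, "cameras")  # prefix-matched topic filter

time.sleep(0.1)  # let the subscription propagate (ZMQ "slow joiner")

# Multipart message: topic frame, then payload frame.
pub.send_string("cameras", flags=zmq.SNDMORE)
pub.send_string('{"front_door": {"detect": {"enabled": true}}}')

topic = sub.recv_string()
payload = sub.recv_string()
```

Every subscriber connected to the same endpoint receives its own copy of the message; the publisher never tracks who is listening.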
+
+### Core Services (in `frigate/app.py` FrigateApp)
+
+- **CameraMaintainer** (thread): Spawns camera capture/processing subprocess per camera
+- **ObjectDetectProcess** (subprocess): Runs ML inference on queued frames
+- **TrackedObjectProcessor** (thread): Receives detections, correlates into tracked objects, publishes events
+- **EventProcessor** (thread): Manages event lifecycle, DB updates
+- **RecordProcess** (subprocess): Manages video recording/retention
+- **OutputProcess** (subprocess): Encodes/streams video
+- **ReviewProcess** (subprocess): Processes review segments
+- **EmbeddingProcess** (subprocess): Runs embeddings for semantic search/face/LPR
+
+**Logging pattern**: Central `log.py` uses a QueueListener to drain a single queue that all processes log into, avoiding interleaved multiprocess log writes.
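
A stdlib-only sketch of that pattern (Frigate's `log.py` builds its own setup around the same `QueueHandler`/`QueueListener` pair; the collector handler below is a stand-in for real console/file handlers):

```python
import logging
import logging.handlers
import multiprocessing as mp

log_queue: mp.Queue = mp.Queue()

# Worker processes attach only a QueueHandler; records travel over the queue.
worker_logger = logging.getLogger("frigate.camera.front_door")
worker_logger.setLevel(logging.INFO)
worker_logger.addHandler(logging.handlers.QueueHandler(log_queue))

records: list[str] = []


class CollectHandler(logging.Handler):
    def emit(self, record: logging.LogRecord) -> None:
        records.append(record.getMessage())


# The main process runs one QueueListener that drains every worker's records.
listener = logging.handlers.QueueListener(log_queue, CollectHandler())
listener.start()
worker_logger.info("camera process started")
listener.stop()  # processes anything still queued before returning
```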
+
+## Config System
+
+Configuration is **Pydantic BaseModel** hierarchy:
+
+- **Parsing**: YAML → Pydantic models with validators in `frigate/config/config.py`
+- **Types**: `frigate/types.py` has shared enums (EventType, ObjectType, etc.)
+- **Validation pattern**: Use `@field_validator` with `mode='before'` to transform/validate before assignment
+- **Runtime values**: `RuntimeMotionConfig` applies frame shape transforms to masks
+- **Key files**:
+ - `config.py` - main FrigateConfig entry point
+ - `camera/` - per-camera sub-configs (detect, record, snapshots, etc.)
+ - `classification.py` - face/LPR/audio/semantic search configs
+
+**When adding config**: Create Pydantic model → add to parent config → update migrations if DB schema changes.
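
A minimal illustration of the `mode='before'` pattern (Pydantic v2; `SnapshotsDemo` and its fields are made up for the example, not Frigate's schema):

```python
from typing import Any

from pydantic import BaseModel, field_validator


class SnapshotsDemo(BaseModel):
    enabled: bool = False
    retain_days: int = 10

    @field_validator("retain_days", mode="before")
    @classmethod
    def strip_suffix(cls, v: Any) -> Any:
        # Runs before type coercion, so raw YAML strings like "7 days"
        # can be normalized into something int() understands.
        if isinstance(v, str):
            return v.split()[0]
        return v


cfg = SnapshotsDemo(enabled=True, retain_days="7 days")
```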
+
+## Data Model & Persistence
+
+**Database**: SQLite with custom `SqliteVecQueueDatabase` (vector support for embeddings)
+
+- **Models** in `frigate/models.py`: Event, Timeline, Recordings, User, etc. (Peewee ORM)
+- **Key tables**:
+ - `events` - detected objects (car, person, etc.) with retention policies
+ - `timeline` - events feed (entered_zone, audio, etc.)
+ - `recordings` - video segments with metadata
+ - `review_segments` - flagged clips for review
+
+**Event lifecycle**:
+
+1. TrackedObject detected → Event created with `false_positive=False`
+2. EventProcessor updates Event (score, zones, clips, snapshots)
+3. On object lost, Event gets `end_time` and is finalized
+
+**Migrations**: Use Peewee migrations in `migrations/` - run via `peewee_migrate.Router`.
+
+## Key Workflows
+
+### Adding a New Detector Type
+
+1. Create detector class in `frigate/detectors/plugins/` inheriting `DetectionApi`
+2. Add config class in `frigate/detectors/detector_config.py`
+3. Register in detector factory in `frigate/detectors/__init__.py`
+4. Update `DEFAULT_DETECTORS` constant if it's the default
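
A skeleton of those steps with placeholder names (`DetectionApiSketch`, `DETECTOR_REGISTRY`, and the dummy plugin are stand-ins; check the real `DetectionApi` interface in `frigate/detectors/` before copying anything):

```python
from abc import ABC, abstractmethod

import numpy as np


# Step 1: plugins implement a small inference interface.
class DetectionApiSketch(ABC):
    type_key: str

    @abstractmethod
    def detect_raw(self, tensor_input: np.ndarray) -> np.ndarray: ...


class DummyDetector(DetectionApiSketch):
    type_key = "dummy"  # referenced from config as detectors.<name>.type

    def detect_raw(self, tensor_input: np.ndarray) -> np.ndarray:
        # Frigate detectors return 20 rows of
        # [class_id, score, y_min, x_min, y_max, x_max].
        return np.zeros((20, 6), dtype=np.float32)


# Step 3: the factory resolves the config "type" string to a plugin class.
DETECTOR_REGISTRY = {DummyDetector.type_key: DummyDetector}

detector = DETECTOR_REGISTRY["dummy"]()
out = detector.detect_raw(np.zeros((320, 320, 3), dtype=np.uint8))
```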
+
+### Object Detection Pipeline
+
+```
+Camera subprocess → capture frames → motion detect →
+ queue frame to detection_queue →
+ ObjectDetectProcess (inference) →
+ TrackedObjectProcessor (correlate detections) →
+ Event + tracking + DB updates
+```
+
+### Recording Flow
+
+- **24/7 recording**: Video is written continuously as short, fixed-duration segments
+- **Retention**: Deleted if no events + retention time elapsed
+- **Cleanup**: `RecordingCleanup` task deletes old segments based on retention config
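
The retention decision reduces to a check like this (a simplified assumption: the real cleanup also honors per-object retention modes and event-linked retention from the config):

```python
import time
from dataclasses import dataclass


@dataclass
class Segment:
    path: str
    end_time: float
    has_events: bool  # linked to at least one retained event?


def expired(seg: Segment, retain_days: float, now: float) -> bool:
    """A segment is deletable once past retention and not tied to any event."""
    age_days = (now - seg.end_time) / 86400
    return age_days > retain_days and not seg.has_events


now = time.time()
quiet = Segment("/media/frigate/recordings/a.mp4", now - 3 * 86400, False)
eventful = Segment("/media/frigate/recordings/b.mp4", now - 3 * 86400, True)
```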
+
+### Frontend Translation Pattern
+
+- **Rule**: NEVER hardcode strings in `.ts/.tsx` files
+- **Pattern**: Store strings in `web/src/locales/en.json` → import locale function → use in code
+- **See**: `.cursor/rules/frontend-always-use-translation-files.mdc`
+
+## Common Code Patterns
+
+### Inter-process Config Updates
+
+```python
+# In detector/processor:
+self.config_subscriber = ConfigSubscriber(config, [ConfigUpdateEnum.cameras])
+
+# In main app (FrigateApp):
+publisher = ConfigPublisher()
+publisher.publish("cameras", new_config) # All subscribers notified
+```
+
+### Event Publishing
+
+```python
+from frigate.comms.events_updater import EventUpdatePublisher
+
+publisher = EventUpdatePublisher()
+publisher.publish({"camera": "cam1", "label": "person"})  # plus the event's remaining fields
+```
+
+### Shared Memory Frames
+
+```python
+from frigate.util.image import SharedMemoryFrameManager, UntrackedSharedMemory
+
+frame_manager = SharedMemoryFrameManager()
+shm = frame_manager.get(frame_id) # returns np.ndarray view
+```
+
+## Testing & Debugging
+
+**Test structure**: `frigate/test/test_*.py` using Python `unittest`
+
+- Run tests: `make run_tests` (builds Docker, runs in container)
+- Key tests: config parsing, detector inference, frame processing
+
+**Build targets** (Makefile):
+
+- `make local` - builds Docker image locally with version
+- `make debug` - builds with debug logging enabled
+- `make run` - runs container with config volume mounted
+
+**Debugging multiprocess issues**:
+
+- Check `log_queue` output in `frigate/log.py`
+- Enable `DEBUG` logging for specific modules in config
+- Use `faulthandler.enable()` (already enabled in processes) for segfaults
+
+## Important Conventions
+
+- **Imports**: Run Ruff with isort (`extend-select = ["I"]`) - enforces import sorting
+- **GPU/Acceleration**: Hardware detection in `frigate/util/services.py` (NVIDIA, Intel VAAPI, AMD, etc.)
+- **Model paths**: Stored in `/config/model_cache/` (symlinked or volume mounted)
+- **Recording paths**: `/media/frigate/recordings/` (clips in `clips/`, exports in `exports/`)
+- **Process naming**: Use `setproctitle()` so each process is identifiable via `ps` when debugging
+
+## Files to Know
+
+| File | Purpose |
+| ------------------------------------ | ---------------------------------------- |
+| `frigate/app.py` | Main app startup, service orchestration |
+| `frigate/camera/` | Camera subprocess, frame capture, motion |
+| `frigate/track/object_processing.py` | Detection correlation, event publishing |
+| `frigate/events/maintainer.py` | Event lifecycle management |
+| `frigate/config/config.py` | Config parsing & validation |
+| `frigate/comms/` | IPC (ZMQ pub/sub, req/rep) |
+| `frigate/api/fastapi_app.py` | REST API setup |
+| `frigate/models.py` | Database ORM models |
+| `frigate/const.py` | Global constants (paths, defaults) |
+
+## Gotchas & Common Mistakes
+
+1. **Pickle compatibility**: Objects sent over multiprocess queues must be pickleable. Avoid lambdas, file handles.
+2. **Config validators**: Remember `mode='before'` when a validator must transform raw input; Pydantic runs these before type coercion.
+3. **Event state confusion**: Events have transient state in `TrackedObjectProcessor` AND persistent state in DB—don't mix them.
+4. **Motion masks**: Frame shape must be applied before creating `RuntimeMotionConfig`—validate in tests.
+5. **ZMQ timing**: Subscriptions must be established BEFORE the publisher sends (the ZMQ "slow joiner" problem); add a short sleep after connecting if a race is suspected.
+6. **Frontend strings**: Forgetting to use locale files breaks translations and fails linting.
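
Gotcha 1 in executable form; the failure surfaces at pickling time (i.e. at `queue.put()`), often far from where the lambda was created:

```python
import pickle

# Plain data round-trips through pickle, so it is safe for mp queues.
plain = {"camera": "front_door", "label": "person", "score": 0.92}
data = pickle.dumps(plain)

# A lambda buried anywhere in the payload blows up the whole put().
try:
    pickle.dumps({"camera": "front_door", "on_done": lambda label: label})
    lambda_pickled = True
except (pickle.PicklingError, AttributeError):
    lambda_pickled = False
```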
+
+## External Integration
+
+- **MQTT**: Via `MqttClient` in `frigate/comms/mqtt.py` - publishes detections, accepts commands
+- **Home Assistant**: Native integration via custom component (separate repo)
+- **Frigate+**: Paid cloud sync service - `frigate/plus.py` handles API calls
+- **Webhooks**: Event-triggered POST requests configured per-camera
+
+---
+
+**Last updated**: Branch `review-stream-tweaks` | For architecture deep-dives, start with `frigate/app.py::FrigateApp.__init__()` to see all service wiring.
diff --git a/web/src/components/overlay/detail/AnnotationOffsetSlider.tsx b/web/src/components/overlay/detail/AnnotationOffsetSlider.tsx
index 9f6b6efbd..4af982da5 100644
--- a/web/src/components/overlay/detail/AnnotationOffsetSlider.tsx
+++ b/web/src/components/overlay/detail/AnnotationOffsetSlider.tsx
@@ -91,8 +91,8 @@ export default function AnnotationOffsetSlider({ className }: Props) {
diff --git a/web/src/components/overlay/detail/TrackingDetails.tsx.bak b/web/src/components/overlay/detail/TrackingDetails.tsx.bak
new file mode 100644
index 000000000..06d0f5017
--- /dev/null
+++ b/web/src/components/overlay/detail/TrackingDetails.tsx.bak
@@ -0,0 +1,1023 @@
+import useSWR from "swr";
+import { useCallback, useEffect, useMemo, useRef, useState } from "react";
+import { Event } from "@/types/event";
+import ActivityIndicator from "@/components/indicators/activity-indicator";
+import { Button } from "@/components/ui/button";
+import { TrackingDetailsSequence } from "@/types/timeline";
+import Heading from "@/components/ui/heading";
+import { FrigateConfig } from "@/types/frigateConfig";
+import { formatUnixTimestampToDateTime } from "@/utils/dateUtil";
+import { getIconForLabel } from "@/utils/iconUtil";
+import {
+ LuCircle,
+ LuCircleDot,
+ LuEar,
+ LuPlay,
+ LuSettings,
+ LuTruck,
+} from "react-icons/lu";
+import { IoMdExit } from "react-icons/io";
+import {
+ MdFaceUnlock,
+ MdOutlineLocationOn,
+ MdOutlinePictureInPictureAlt,
+} from "react-icons/md";
+import { cn } from "@/lib/utils";
+import {
+ Tooltip,
+ TooltipContent,
+ TooltipTrigger,
+} from "@/components/ui/tooltip";
+import { AnnotationSettingsPane } from "./AnnotationSettingsPane";
+import { TooltipPortal } from "@radix-ui/react-tooltip";
+import HlsVideoPlayer from "@/components/player/HlsVideoPlayer";
+import { baseUrl } from "@/api/baseUrl";
+import { REVIEW_PADDING } from "@/types/review";
+import { ASPECT_VERTICAL_LAYOUT, ASPECT_WIDE_LAYOUT } from "@/types/record";
+import {
+ DropdownMenu,
+ DropdownMenuTrigger,
+ DropdownMenuContent,
+ DropdownMenuItem,
+ DropdownMenuPortal,
+} from "@/components/ui/dropdown-menu";
+import { Link, useNavigate } from "react-router-dom";
+import { getLifecycleItemDescription } from "@/utils/lifecycleUtil";
+import { IoPlayCircleOutline } from "react-icons/io5";
+import { useTranslation } from "react-i18next";
+import { getTranslatedLabel } from "@/utils/i18n";
+import { Badge } from "@/components/ui/badge";
+import { HiDotsHorizontal } from "react-icons/hi";
+import axios from "axios";
+import { toast } from "sonner";
+import { DetailStreamProvider, useDetailStream } from "@/context/detail-stream-context";
+
+type TrackingDetailsProps = {
+ className?: string;
+ event: Event;
+ fullscreen?: boolean;
+ showImage?: boolean;
+ showLifecycle?: boolean;
+ timeIndex?: number;
+ setTimeIndex?: (index: number) => void;
+};
+
+// Wrapper component that provides DetailStreamContext
+export default function TrackingDetails(props: TrackingDetailsProps) {
+ const [currentTime, setCurrentTime] = useState(props.event.start_time ?? 0);
+
+ return (
+
+
+
+ );
+}
+
+// Inner component with access to DetailStreamContext
+function TrackingDetailsInner({
+ className,
+ event,
+ showImage = true,
+ showLifecycle = false,
+ timeIndex: propTimeIndex,
+ setTimeIndex: propSetTimeIndex,
+ onTimeUpdate,
+}: TrackingDetailsProps & { onTimeUpdate: (time: number) => void }) {
+ const { t } = useTranslation(["views/explore"]);
+ const { setSelectedObjectIds, annotationOffset: contextAnnotationOffset, setAnnotationOffset: setContextAnnotationOffset } = useDetailStream();
+
+  const { data: eventSequence } = useSWR<TrackingDetailsSequence[]>([
+ "timeline",
+ {
+ source_id: event.id,
+ },
+ ]);
+
+  const { data: config } = useSWR<FrigateConfig>("config");
+
+  const videoRef = useRef<HTMLVideoElement | null>(null);
+  const containerRef = useRef<HTMLDivElement | null>(null);
+ const [selectedZone, setSelectedZone] = useState("");
+  const [lifecycleZones, setLifecycleZones] = useState<string[]>([]);
+ const [showControls, setShowControls] = useState(false);
+ const [showZones, setShowZones] = useState(true);
+
+ const aspectRatio = useMemo(() => {
+ if (!config) {
+ return 16 / 9;
+ }
+
+ return (
+ config.cameras[event.camera].detect.width /
+ config.cameras[event.camera].detect.height
+ );
+ }, [config, event]);
+
+ const label = event.sub_label
+ ? event.sub_label
+ : getTranslatedLabel(event.label);
+
+ const getZoneColor = useCallback(
+ (zoneName: string) => {
+ const zoneColor =
+ config?.cameras?.[event.camera]?.zones?.[zoneName]?.color;
+ if (zoneColor) {
+ const reversed = [...zoneColor].reverse();
+ return reversed;
+ }
+ },
+ [config, event],
+ );
+
+ const getObjectColor = useCallback(
+ (label: string) => {
+ const objectColor = config?.model?.colormap[label];
+ if (objectColor) {
+ const reversed = [...objectColor].reverse();
+ return reversed;
+ }
+ },
+ [config],
+ );
+
+ const getZonePolygon = useCallback(
+ (zoneName: string) => {
+ if (!videoRef.current || !config) {
+ return;
+ }
+ const zonePoints =
+ config?.cameras[event.camera].zones[zoneName].coordinates;
+ const videoElement = videoRef.current;
+ const videoWidth = videoElement.videoWidth;
+ const videoHeight = videoElement.videoHeight;
+
+ if (!videoWidth || !videoHeight) {
+ return;
+ }
+
+ return zonePoints
+ .split(",")
+ .map(Number.parseFloat)
+ .reduce((acc, value, index) => {
+ const isXCoordinate = index % 2 === 0;
+ const coordinate = isXCoordinate
+ ? value * videoWidth
+ : value * videoHeight;
+ acc.push(coordinate);
+ return acc;
+ }, [] as number[])
+ .join(",");
+ },
+ [config, event],
+ );
+
+  const [boxStyle, setBoxStyle] = useState(null);
+  const [attributeBoxStyle, setAttributeBoxStyle] =
+    useState(null);
+  const [displayWidth, setDisplayWidth] = useState(0);
+  const [displayHeight, setDisplayHeight] = useState(0);
+
+ const configAnnotationOffset = useMemo(() => {
+ if (!config) {
+ return 0;
+ }
+
+ return config.cameras[event.camera]?.detect?.annotation_offset || 0;
+ }, [config, event]);
+
+ const [annotationOffset, setAnnotationOffset] = useState(
+ configAnnotationOffset,
+ );
+
+ const savedPathPoints = useMemo(() => {
+ return (
+ event.data.path_data?.map(([coords, timestamp]: [number[], number]) => ({
+ x: coords[0],
+ y: coords[1],
+ timestamp,
+ lifecycle_item: undefined,
+ })) || []
+ );
+ }, [event.data.path_data]);
+
+ const eventSequencePoints = useMemo(() => {
+ return (
+ eventSequence
+ ?.filter((event) => event.data.box !== undefined)
+ .map((event) => {
+ const [left, top, width, height] = event.data.box!;
+
+ return {
+ x: left + width / 2, // Center x-coordinate
+ y: top + height, // Bottom y-coordinate
+ timestamp: event.timestamp,
+ lifecycle_item: event,
+ };
+ }) || []
+ );
+ }, [eventSequence]);
+
+ // final object path with timeline points included
+ const pathPoints = useMemo(() => {
+ // don't display a path if we don't have any saved path points
+ if (
+ savedPathPoints.length === 0 ||
+ config?.cameras[event.camera]?.onvif.autotracking.enabled_in_config
+ )
+ return [];
+ return [...savedPathPoints, ...eventSequencePoints].sort(
+ (a, b) => a.timestamp - b.timestamp,
+ );
+ }, [savedPathPoints, eventSequencePoints, config, event]);
+
+ const [localTimeIndex, setLocalTimeIndex] = useState(0);
+
+ const timeIndex =
+ propTimeIndex !== undefined ? propTimeIndex : localTimeIndex;
+ const setTimeIndex = propSetTimeIndex || setLocalTimeIndex;
+
+ const handleSetBox = useCallback(
+ (box: number[], attrBox: number[] | undefined) => {
+ if (videoRef.current && Array.isArray(box) && box.length === 4) {
+ const videoElement = videoRef.current;
+ const videoWidth = videoElement.videoWidth;
+ const videoHeight = videoElement.videoHeight;
+
+ if (!videoWidth || !videoHeight || !displayWidth || !displayHeight) {
+ return;
+ }
+
+ const style = {
+ left: `${box[0] * displayWidth}px`,
+ top: `${box[1] * displayHeight}px`,
+ width: `${box[2] * displayWidth}px`,
+ height: `${box[3] * displayHeight}px`,
+ borderColor: `rgb(${getObjectColor(event.label)?.join(",")})`,
+ };
+
+ if (attrBox) {
+ const attrStyle = {
+ left: `${attrBox[0] * displayWidth}px`,
+ top: `${attrBox[1] * displayHeight}px`,
+ width: `${attrBox[2] * displayWidth}px`,
+ height: `${attrBox[3] * displayHeight}px`,
+ borderColor: `rgb(${getObjectColor(event.label)?.join(",")})`,
+ };
+ setAttributeBoxStyle(attrStyle);
+ } else {
+ setAttributeBoxStyle(null);
+ }
+
+ setBoxStyle(style);
+ }
+ },
+ [event, getObjectColor, displayWidth, displayHeight],
+ );
+
+ // carousels
+
+ // Selected lifecycle item index; -1 when viewing a path-only point
+
+ const handlePathPointClick = useCallback(
+ (index: number) => {
+ if (!eventSequence) return;
+ const sequenceIndex = eventSequence.findIndex(
+ (item) => item.timestamp === pathPoints[index].timestamp,
+ );
+ if (sequenceIndex !== -1) {
+ setTimeIndex(eventSequence[sequenceIndex].timestamp);
+ handleSetBox(
+ eventSequence[sequenceIndex]?.data.box ?? [],
+ eventSequence[sequenceIndex]?.data?.attribute_box,
+ );
+ setLifecycleZones(eventSequence[sequenceIndex]?.data.zones);
+ } else {
+ // click on a normal path point, not a lifecycle point
+ setTimeIndex(pathPoints[index].timestamp);
+ setBoxStyle(null);
+ setLifecycleZones([]);
+ }
+ },
+ [eventSequence, pathPoints, handleSetBox, setTimeIndex],
+ );
+
+ const formattedStart = config
+ ? formatUnixTimestampToDateTime(event.start_time ?? 0, {
+ timezone: config.ui.timezone,
+ date_format:
+ config.ui.time_format == "24hour"
+ ? t("time.formattedTimestamp.24hour", {
+ ns: "common",
+ })
+ : t("time.formattedTimestamp.12hour", {
+ ns: "common",
+ }),
+ time_style: "medium",
+ date_style: "medium",
+ })
+ : "";
+
+ const formattedEnd = config
+ ? formatUnixTimestampToDateTime(event.end_time ?? 0, {
+ timezone: config.ui.timezone,
+ date_format:
+ config.ui.time_format == "24hour"
+ ? t("time.formattedTimestamp.24hour", {
+ ns: "common",
+ })
+ : t("time.formattedTimestamp.12hour", {
+ ns: "common",
+ }),
+ time_style: "medium",
+ date_style: "medium",
+ })
+ : "";
+
+ useEffect(() => {
+ if (!eventSequence || eventSequence.length === 0) return;
+ // If timeIndex hasn't been set to a non-zero value, prefer the first lifecycle timestamp
+ if (timeIndex == null || timeIndex === 0) {
+ setTimeIndex(eventSequence[0].timestamp);
+ handleSetBox(
+ eventSequence[0]?.data.box ?? [],
+ eventSequence[0]?.data?.attribute_box,
+ );
+ setLifecycleZones(eventSequence[0]?.data.zones);
+ }
+ }, [eventSequence, timeIndex, handleSetBox, setTimeIndex]);
+
+ // When timeIndex changes, sync the box/zones to matching lifecycle, else clear
+ useEffect(() => {
+ if (!eventSequence || propTimeIndex == null) return;
+ const idx = eventSequence.findIndex((i) => i.timestamp === propTimeIndex);
+ if (idx !== -1) {
+ handleSetBox(
+ eventSequence[idx]?.data.box ?? [],
+ eventSequence[idx]?.data?.attribute_box,
+ );
+ setLifecycleZones(eventSequence[idx]?.data.zones);
+ } else {
+ // Non-lifecycle point (e.g., saved path point)
+ setBoxStyle(null);
+ setLifecycleZones([]);
+ }
+ }, [propTimeIndex, eventSequence, handleSetBox]);
+
+ const selectedIndex = useMemo(() => {
+ if (!eventSequence || eventSequence.length === 0) return 0;
+ const idx = eventSequence.findIndex((i) => i.timestamp === timeIndex);
+ return idx === -1 ? 0 : idx;
+ }, [eventSequence, timeIndex]);
+
+ // Calculate how far down the blue line should extend based on timeIndex
+ const calculateLineHeight = () => {
+ if (!eventSequence || eventSequence.length === 0) return 0;
+
+ const currentTime = timeIndex ?? 0;
+
+ // Find which events have been passed
+ let lastPassedIndex = -1;
+ for (let i = 0; i < eventSequence.length; i++) {
+ if (currentTime >= (eventSequence[i].timestamp ?? 0)) {
+ lastPassedIndex = i;
+ } else {
+ break;
+ }
+ }
+
+ // No events passed yet
+ if (lastPassedIndex < 0) return 0;
+
+ // All events passed
+ if (lastPassedIndex >= eventSequence.length - 1) return 100;
+
+ // Calculate percentage based on item position, not time
+ // Each item occupies an equal visual space regardless of time gaps
+ const itemPercentage = 100 / (eventSequence.length - 1);
+
+ // Find progress between current and next event for smooth transition
+ const currentEvent = eventSequence[lastPassedIndex];
+ const nextEvent = eventSequence[lastPassedIndex + 1];
+ const currentTimestamp = currentEvent.timestamp ?? 0;
+ const nextTimestamp = nextEvent.timestamp ?? 0;
+
+ // Calculate interpolation between the two events
+ const timeBetween = nextTimestamp - currentTimestamp;
+ const timeElapsed = currentTime - currentTimestamp;
+ const interpolation = timeBetween > 0 ? timeElapsed / timeBetween : 0;
+
+ // Base position plus interpolated progress to next item
+ return Math.min(
+ 100,
+ lastPassedIndex * itemPercentage + interpolation * itemPercentage,
+ );
+ };
+
+ const blueLineHeight = calculateLineHeight();
+
+ const videoSource = useMemo(() => {
+ const startTime = event.start_time - REVIEW_PADDING;
+ const endTime = (event.end_time ?? Date.now() / 1000) + REVIEW_PADDING;
+ return `${baseUrl}vod/${event.camera}/start/${startTime}/end/${endTime}/index.m3u8`;
+ }, [event]);
+
+ // Determine camera aspect ratio category
+ const cameraAspect = useMemo(() => {
+ if (!aspectRatio) {
+ return "normal";
+ } else if (aspectRatio > ASPECT_WIDE_LAYOUT) {
+ return "wide";
+ } else if (aspectRatio < ASPECT_VERTICAL_LAYOUT) {
+ return "tall";
+ } else {
+ return "normal";
+ }
+ }, [aspectRatio]);
+
+ // Apply appropriate classes based on aspect ratio
+ const containerClasses = useMemo(() => {
+ if (cameraAspect === "wide") {
+ return "w-full aspect-wide overflow-hidden";
+ } else if (cameraAspect === "tall") {
+ return "h-full aspect-tall overflow-hidden";
+ } else {
+ return "w-full aspect-video overflow-hidden";
+ }
+ }, [cameraAspect]);
+
+ // Check if video metadata has loaded
+ const [videoReady, setVideoReady] = useState(false);
+
+ useEffect(() => {
+ const checkVideoReady = () => {
+ const isReady =
+ videoRef.current?.readyState !== undefined &&
+ videoRef.current.readyState >= HTMLMediaElement.HAVE_METADATA;
+ setVideoReady(isReady);
+
+ if (videoRef.current) {
+ console.log(
+ "Video dimensions:",
+ videoRef.current.clientWidth,
+ videoRef.current.clientHeight,
+ );
+ setDisplayWidth(videoRef.current.clientWidth);
+ setDisplayHeight(videoRef.current.clientHeight);
+ }
+ };
+
+ const video = videoRef.current;
+ if (video) {
+ // Check immediately
+ checkVideoReady();
+
+ // Listen for metadata load
+ video.addEventListener("loadedmetadata", checkVideoReady);
+
+ return () => {
+ video.removeEventListener("loadedmetadata", checkVideoReady);
+ };
+ }
+ }, [videoRef.current]);
+
+ // Seek video to specific time (relative to event start)
+ useEffect(() => {
+ if (videoRef.current && propTimeIndex !== undefined) {
+ const relativeTime =
+ propTimeIndex -
+ event.start_time +
+ REVIEW_PADDING +
+ annotationOffset / 1000;
+ if (relativeTime >= 0) {
+ videoRef.current.currentTime = relativeTime;
+ }
+ }
+ }, [propTimeIndex, event.start_time, annotationOffset]);
+
+ if (!config) {
+    return <ActivityIndicator />;
+ }
+
+ return (
+
+
+
+ {showImage && (
+
+
+
+ {showZones &&
+ videoRef.current &&
+ videoReady &&
+ lifecycleZones?.map((zone) => {
+ return (
+
+ );
+ })}
+
+ {boxStyle && videoReady && (
+
+ )}
+
+ {/* Attribute box overlay */}
+ {attributeBoxStyle && videoReady && (
+
+ )}
+
+ {/* Path overlay */}
+ {videoRef.current &&
+ videoReady &&
+ pathPoints &&
+ pathPoints.length > 0 && (
+
+
+
+ )}
+
+
+
+ )}
+
+ {showLifecycle && (
+ <>
+
+
{t("trackingDetails.title")}
+
+
+
+
+
+
+
+
+ {t("trackingDetails.adjustAnnotationSettings")}
+
+
+
+
+
+
+
+
+ {t("trackingDetails.scrollViewTips")}
+
+
+ {t("trackingDetails.count", {
+ first: selectedIndex + 1,
+ second: eventSequence?.length ?? 0,
+ })}
+
+
+ {config?.cameras[event.camera]?.onvif.autotracking
+ .enabled_in_config && (
+
+ {t("trackingDetails.autoTrackingTips")}
+
+ )}
+ {showControls && (
+
+ )}
+
+
+
+
+
{
+ e.stopPropagation();
+ setTimeIndex(event.start_time ?? 0);
+ }}
+ role="button"
+ >
+
+ {getIconForLabel(
+ event.sub_label ? event.label + "-verified" : event.label,
+ "size-4 text-white",
+ )}
+
+
+
{label}
+
+ {formattedStart ?? ""} - {formattedEnd ?? ""}
+
+ {event.data?.recognized_license_plate && (
+ <>
+
·
+
+
+ {event.data.recognized_license_plate}
+
+
+ >
+ )}
+
+
+
+
+
+ {!eventSequence ? (
+
+ ) : eventSequence.length === 0 ? (
+
+ {t("detail.noObjectDetailData", { ns: "views/events" })}
+
+ ) : (
+
+
+
+
+ {eventSequence.map((item, idx) => {
+ const isActive =
+ Math.abs(
+ (propTimeIndex ?? 0) - (item.timestamp ?? 0),
+ ) <= 0.5;
+ const formattedEventTimestamp = config
+ ? formatUnixTimestampToDateTime(item.timestamp ?? 0, {
+ timezone: config.ui.timezone,
+ date_format:
+ config.ui.time_format == "24hour"
+ ? t(
+ "time.formattedTimestampHourMinuteSecond.24hour",
+ { ns: "common" },
+ )
+ : t(
+ "time.formattedTimestampHourMinuteSecond.12hour",
+ { ns: "common" },
+ ),
+ time_style: "medium",
+ date_style: "medium",
+ })
+ : "";
+
+ const ratio =
+ Array.isArray(item.data.box) &&
+ item.data.box.length >= 4
+ ? (
+ aspectRatio *
+ (item.data.box[2] / item.data.box[3])
+ ).toFixed(2)
+ : "N/A";
+ const areaPx =
+ Array.isArray(item.data.box) &&
+ item.data.box.length >= 4
+ ? Math.round(
+ (config.cameras[event.camera]?.detect?.width ??
+ 0) *
+ (config.cameras[event.camera]?.detect
+ ?.height ?? 0) *
+ (item.data.box[2] * item.data.box[3]),
+ )
+ : undefined;
+ const areaPct =
+ Array.isArray(item.data.box) &&
+ item.data.box.length >= 4
+ ? (item.data.box[2] * item.data.box[3]).toFixed(4)
+ : undefined;
+
+ return (
+ {
+ setTimeIndex(item.timestamp ?? 0);
+ handleSetBox(
+ item.data.box ?? [],
+ item.data.attribute_box,
+ );
+ setLifecycleZones(item.data.zones);
+ setSelectedZone("");
+ }}
+ setSelectedZone={setSelectedZone}
+ getZoneColor={getZoneColor}
+ />
+ );
+ })}
+
+
+ )}
+
+
+
+ >
+ )}
+
+ );
+}
+
+type GetTimelineIconParams = {
+ lifecycleItem: TrackingDetailsSequence;
+ className?: string;
+};
+
+export function LifecycleIcon({
+ lifecycleItem,
+ className,
+}: GetTimelineIconParams) {
+  switch (lifecycleItem.class_type) {
+    case "visible":
+      return <LuPlay className={className} />;
+    case "gone":
+      return <IoMdExit className={className} />;
+    case "active":
+      return <LuCircleDot className={className} />;
+    case "stationary":
+      return <LuCircle className={className} />;
+    case "entered_zone":
+      return <MdOutlineLocationOn className={className} />;
+    case "attribute":
+      switch (lifecycleItem.data?.attribute) {
+        case "face":
+          return <MdFaceUnlock className={className} />;
+        case "license_plate":
+          return <MdOutlinePictureInPictureAlt className={className} />;
+        default:
+          return <LuTruck className={className} />;
+      }
+    case "heard":
+      return <LuEar className={className} />;
+    case "external":
+      return <MdOutlinePictureInPictureAlt className={className} />;
+    default:
+      return null;
+  }
+}
+
+type LifecycleIconRowProps = {
+ item: TrackingDetailsSequence;
+ isActive?: boolean;
+ formattedEventTimestamp: string;
+ ratio: string;
+ areaPx?: number;
+ areaPct?: string;
+ onClick: () => void;
+ setSelectedZone: (z: string) => void;
+ getZoneColor: (zoneName: string) => number[] | undefined;
+};
+
+function LifecycleIconRow({
+ item,
+ isActive,
+ formattedEventTimestamp,
+ ratio,
+ areaPx,
+ areaPct,
+ onClick,
+ setSelectedZone,
+ getZoneColor,
+}: LifecycleIconRowProps) {
+ const { t } = useTranslation(["views/explore", "components/player"]);
+  const { data: config } = useSWR<FrigateConfig>("config");
+ const [isOpen, setIsOpen] = useState(false);
+
+ const navigate = useNavigate();
+
+ return (
+
+
+
+
+
+
+
+
+
+ {getLifecycleItemDescription(item)}
+
+
+
+
+ {t("trackingDetails.lifecycleItemDesc.header.ratio")}
+
+ {ratio}
+
+
+
+ {t("trackingDetails.lifecycleItemDesc.header.area")}
+
+ {areaPx !== undefined && areaPct !== undefined ? (
+
+ {t("information.pixels", { ns: "common", area: areaPx })} ·{" "}
+ {areaPct}%
+
+ ) : (
+ N/A
+ )}
+
+
+ {item.data?.zones && item.data.zones.length > 0 && (
+
+ {item.data.zones.map((zone, zidx) => {
+ const color = getZoneColor(zone)?.join(",") ?? "0,0,0";
+ return (
+ {
+ e.stopPropagation();
+ setSelectedZone(zone);
+ }}
+ style={{
+ borderColor: `rgba(${color}, 0.6)`,
+ background: `rgba(${color}, 0.08)`,
+ }}
+ >
+
+
+ {zone.replaceAll("_", " ")}
+
+
+ );
+ })}
+
+ )}
+
+
+
+
+
+
{formattedEventTimestamp}
+ {(config?.plus?.enabled || item.data.box) && (
+
+
+
+
+
+
+
+
+ {config?.plus?.enabled && (
+ {
+ const resp = await axios.post(
+ `/${item.camera}/plus/${item.timestamp}`,
+ );
+
+ if (resp && resp.status == 200) {
+ toast.success(
+ t("toast.success.submittedFrigatePlus", {
+ ns: "components/player",
+ }),
+ {
+ position: "top-center",
+ },
+ );
+ } else {
+                  toast.error(
+ t("toast.error.submitFrigatePlusFailed", {
+ ns: "components/player",
+ }),
+ {
+ position: "top-center",
+ },
+ );
+ }
+ }}
+ >
+ {t("itemMenu.submitToPlus.label")}
+
+ )}
+ {item.data.box && (
+ {
+ setIsOpen(false);
+ setTimeout(() => {
+ navigate(
+ `/settings?page=masksAndZones&camera=${item.camera}&object_mask=${item.data.box}`,
+ );
+ }, 0);
+ }}
+ >
+ {t("trackingDetails.createObjectMask")}
+
+ )}
+
+
+
+ )}
+
+
+
+
+ );
+}