Merge branch 'dev' into fastapi-poc

Rui Alves 2024-09-12 13:48:58 +01:00
commit b04f6938b9
46 changed files with 1230 additions and 1509 deletions

View File

@@ -41,7 +41,7 @@ environment_vars:
### `database`
-Event and recording information is managed in a sqlite database at `/config/frigate.db`. If that database is deleted, recordings will be orphaned and will need to be cleaned up manually. They also won't show up in the Media Browser within Home Assistant.
+Tracked object and recording information is managed in a sqlite database at `/config/frigate.db`. If that database is deleted, recordings will be orphaned and will need to be cleaned up manually. They also won't show up in the Media Browser within Home Assistant.
If you are storing your database on a network share (SMB, NFS, etc), you may get a `database is locked` error message on startup. You can customize the location of the database in the config if necessary.
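If needed, the database location can be overridden in the config. A minimal sketch (the path shown is illustrative; pick any local, non-network path):

```yaml
database:
  # Illustrative local path; keep the database off SMB/NFS shares
  # to avoid "database is locked" errors
  path: /db/frigate.db
```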

View File

@@ -187,4 +187,4 @@ ffmpeg:
### TP-Link VIGI Cameras
-TP-Link VIGI cameras need some adjustments to the main stream settings on the camera itself to avoid issues. The stream needs to be configured as `H264` with `Smart Coding` set to `off`. Without these settings you may have problems when trying to watch recorded events. For example Firefox will stop playback after a few seconds and show the following error message: `The media playback was aborted due to a corruption problem or because the media used features your browser did not support.`.
+TP-Link VIGI cameras need some adjustments to the main stream settings on the camera itself to avoid issues. The stream needs to be configured as `H264` with `Smart Coding` set to `off`. Without these settings you may have problems when trying to watch recorded footage. For example Firefox will stop playback after a few seconds and show the following error message: `The media playback was aborted due to a corruption problem or because the media used features your browser did not support.`.

View File

@@ -7,7 +7,7 @@ title: Camera Configuration
Several inputs can be configured for each camera and the role of each input can be mixed and matched based on your needs. This allows you to use a lower resolution stream for object detection, but create recordings from a higher resolution stream, or vice versa.
-A camera is enabled by default but can be temporarily disabled by using `enabled: False`. Existing events and recordings can still be accessed. Live streams, recording and detecting are not working. Camera specific configurations will be used.
+A camera is enabled by default but can be temporarily disabled by using `enabled: False`. Existing tracked objects and recordings can still be accessed. Live streams, recording, and detecting will not work while disabled. Camera-specific configurations will still be used.
Each role can only be assigned to one input per camera. The options for roles are as follows:
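For illustration, a two-input camera sketch (the camera name and RTSP paths are hypothetical):

```yaml
cameras:
  side_yard: # hypothetical camera name
    enabled: True # set False to temporarily disable this camera
    ffmpeg:
      inputs:
        - path: rtsp://192.168.1.10:554/main # higher resolution stream
          roles:
            - record
        - path: rtsp://192.168.1.10:554/sub # lower resolution stream
          roles:
            - detect
```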

View File

@@ -5,6 +5,8 @@ title: Generative AI
Generative AI can be used to automatically generate descriptions based on the thumbnails of your tracked objects. This helps with [Semantic Search](/configuration/semantic_search) in Frigate by providing detailed text descriptions as a basis of the search query.
+Semantic Search must be enabled to use Generative AI. Descriptions are accessed via the _Explore_ view in the Frigate UI by clicking on a tracked object's thumbnail.
## Configuration
Generative AI can be enabled for all cameras or only for specific cameras. There are currently 3 providers available to integrate with Frigate.
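As one example, enabling a provider globally might look like the following sketch (the provider, endpoint URL, and model name are assumptions; consult the provider docs for exact options):

```yaml
genai:
  enabled: True
  provider: ollama # one of the available providers
  base_url: http://localhost:11434 # hypothetical local Ollama endpoint
  model: llava
```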

View File

@@ -72,7 +72,7 @@ Here are some common starter configuration examples. Refer to the [reference con
- Hardware acceleration for decoding video
- USB Coral detector
- Save all video with any detectable motion for 7 days regardless of whether any objects were detected or not
-- Continue to keep all video if it was during any event for 30 days
+- Continue to keep all video if it qualified as an alert or detection for 30 days
- Save snapshots for 30 days
- Motion mask for the camera timestamp
@@ -95,10 +95,12 @@ record:
  retain:
    days: 7
    mode: motion
-  events:
-    retain:
-      default: 30
-      mode: motion
+  alerts:
+    retain:
+      days: 30
+  detections:
+    retain:
+      days: 30
snapshots:
  enabled: True
@@ -128,7 +130,7 @@ cameras:
- VAAPI hardware acceleration for decoding video
- USB Coral detector
- Save all video with any detectable motion for 7 days regardless of whether any objects were detected or not
-- Continue to keep all video if it was during any event for 30 days
+- Continue to keep all video if it qualified as an alert or detection for 30 days
- Save snapshots for 30 days
- Motion mask for the camera timestamp
@@ -149,10 +151,12 @@ record:
  retain:
    days: 7
    mode: motion
-  events:
-    retain:
-      default: 30
-      mode: motion
+  alerts:
+    retain:
+      days: 30
+  detections:
+    retain:
+      days: 30
snapshots:
  enabled: True
@@ -182,7 +186,7 @@ cameras:
- VAAPI hardware acceleration for decoding video
- OpenVino detector
- Save all video with any detectable motion for 7 days regardless of whether any objects were detected or not
-- Continue to keep all video if it was during any event for 30 days
+- Continue to keep all video if it qualified as an alert or detection for 30 days
- Save snapshots for 30 days
- Motion mask for the camera timestamp
@@ -214,10 +218,12 @@ record:
  retain:
    days: 7
    mode: motion
-  events:
-    retain:
-      default: 30
-      mode: motion
+  alerts:
+    retain:
+      days: 30
+  detections:
+    retain:
+      days: 30
snapshots:
  enabled: True

View File

@@ -13,11 +13,11 @@ Once motion is detected, it tries to group up nearby areas of motion together in
The default motion settings should work well for the majority of cameras, however there are cases where tuning motion detection can lead to better and more optimal results. Each camera has its own environment with different variables that affect motion, this means that the same motion settings will not fit all of your cameras.
-Before tuning motion it is important to understand the goal. In an optimal configuration, motion from people and cars would be detected, but not grass moving, lighting changes, timestamps, etc. If your motion detection is too sensitive, you will experience higher CPU loads and greater false positives from the increased rate of object detection. If it is not sensitive enough, you will miss events.
+Before tuning motion it is important to understand the goal. In an optimal configuration, motion from people and cars would be detected, but not grass moving, lighting changes, timestamps, etc. If your motion detection is too sensitive, you will experience higher CPU loads and greater false positives from the increased rate of object detection. If it is not sensitive enough, you will miss objects that you want to track.
## Create Motion Masks
-First, mask areas with regular motion not caused by the objects you want to detect. The best way to find candidates for motion masks is by watching the debug stream with motion boxes enabled. Good use cases for motion masks are timestamps or tree limbs and large bushes that regularly move due to wind. When possible, avoid creating motion masks that would block motion detection for objects you want to track **even if they are in locations where you don't want events**. Motion masks should not be used to avoid detecting objects in specific areas. More details can be found [in the masks docs.](/configuration/masks.md).
+First, mask areas with regular motion not caused by the objects you want to detect. The best way to find candidates for motion masks is by watching the debug stream with motion boxes enabled. Good use cases for motion masks are timestamps or tree limbs and large bushes that regularly move due to wind. When possible, avoid creating motion masks that would block motion detection for objects you want to track **even if they are in locations where you don't want alerts or detections**. Motion masks should not be used to avoid detecting objects in specific areas. More details can be found [in the masks docs](/configuration/masks.md).
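As a sketch, a motion mask covering a camera's timestamp overlay might look like the following (the camera name and pixel coordinates are hypothetical; coordinates are x,y pairs at the detect resolution):

```yaml
cameras:
  front_door: # hypothetical camera name
    motion:
      # Hypothetical polygon masking a timestamp overlay in the top-left corner
      mask:
        - 0,0,450,0,450,40,0,40
```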
## Prepare For Testing
@@ -29,7 +29,7 @@ Now that things are set up, find a time to tune that represents normal circumsta
:::note
-Remember that motion detection is just used to determine when object detection should be used. You should aim to have motion detection sensitive enough that you won't miss events from objects you want to detect with object detection. The goal is to prevent object detection from running constantly for every small pixel change in the image. Windy days are still going to result in lots of motion being detected.
+Remember that motion detection is just used to determine when object detection should be used. You should aim to have motion detection sensitive enough that you won't miss objects you want to detect with object detection. The goal is to prevent object detection from running constantly for every small pixel change in the image. Windy days are still going to result in lots of motion being detected.
:::
@@ -94,7 +94,7 @@ motion:
:::tip
-Some cameras like doorbell cameras may have missed detections when someone walks directly in front of the camera and the lightning_threshold causes motion detection to be re-calibrated. In this case, it may be desirable to increase the `lightning_threshold` to ensure these events are not missed.
+Some cameras like doorbell cameras may have missed detections when someone walks directly in front of the camera and the `lightning_threshold` causes motion detection to be re-calibrated. In this case, it may be desirable to increase the `lightning_threshold` to ensure these objects are not missed.
:::

View File

@@ -20,15 +20,13 @@ For object filters in your configuration, any single detection below `min_score`
In frame 2, the score is below the `min_score` value, so Frigate ignores it and it becomes a 0.0. The computed score is the median of the score history (padding to at least 3 values), and only when that computed score crosses the `threshold` is the object marked as a true positive. That happens in frame 4 in the example.
-show image of snapshot vs event with differing scores
### Minimum Score
-Any detection below `min_score` will be immediately thrown out and never tracked because it is considered a false positive. If `min_score` is too low then false positives may be detected and tracked which can confuse the object tracker and may lead to wasted resources. If `min_score` is too high then lower scoring true positives like objects that are further away or partially occluded may be thrown out which can also confuse the tracker and cause valid events to be lost or disjointed.
+Any detection below `min_score` will be immediately thrown out and never tracked because it is considered a false positive. If `min_score` is too low then false positives may be detected and tracked which can confuse the object tracker and may lead to wasted resources. If `min_score` is too high then lower scoring true positives like objects that are further away or partially occluded may be thrown out which can also confuse the tracker and cause valid tracked objects to be lost or disjointed.
### Threshold
-`threshold` is used to determine that the object is a true positive. Once an object is detected with a score >= `threshold` object is considered a true positive. If `threshold` is too low then some higher scoring false positives may create an event. If `threshold` is too high then true positive events may be missed due to the object never scoring high enough.
+`threshold` is used to determine that the object is a true positive. Once an object is detected with a score >= `threshold`, the object is considered a true positive. If `threshold` is too low then some higher scoring false positives may create a tracked object. If `threshold` is too high then true positive tracked objects may be missed due to the object never scoring high enough.
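A minimal sketch of per-object score filters (the values are illustrative, not recommendations):

```yaml
objects:
  track:
    - person
  filters:
    person:
      min_score: 0.5 # single detections below this are discarded outright
      threshold: 0.7 # median score must reach this to be a true positive
```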
## Object Shape
@@ -52,7 +50,7 @@ Conceptually, a ratio of 1 is a square, 0.5 is a "tall skinny" box, and 2 is a "
### Zones
-[Required zones](/configuration/zones.md) can be a great tool to reduce false positives that may be detected in the sky or other areas that are not of interest. The required zones will only create events for objects that enter the zone.
+[Required zones](/configuration/zones.md) can be a great tool to reduce false positives that may be detected in the sky or other areas that are not of interest. The required zones will only create tracked objects for objects that enter the zone.
### Object Masks

View File

@@ -3,7 +3,7 @@ id: record
title: Recording
---
-Recordings can be enabled and are stored at `/media/frigate/recordings`. The folder structure for the recordings is `YYYY-MM-DD/HH/<camera_name>/MM.SS.mp4` in **UTC time**. These recordings are written directly from your camera stream without re-encoding. Each camera supports a configurable retention policy in the config. Frigate chooses the largest matching retention value between the recording retention and the event retention when determining if a recording should be removed.
+Recordings can be enabled and are stored at `/media/frigate/recordings`. The folder structure for the recordings is `YYYY-MM-DD/HH/<camera_name>/MM.SS.mp4` in **UTC time**. These recordings are written directly from your camera stream without re-encoding. Each camera supports a configurable retention policy in the config. Frigate chooses the largest matching retention value between the recording retention and the tracked object retention when determining if a recording should be removed.
New recording segments are written from the camera stream to cache, they are only moved to disk if they match the setup recording retention policy.
@@ -53,7 +53,7 @@ record:
### Minimum: Alerts only
-If you only want to retain video that occurs during an event, this config will discard video unless an alert is ongoing.
+If you only want to retain video associated with a tracked object, this config will discard video unless an alert is ongoing.
```yaml
record:
@@ -72,7 +72,7 @@ As of Frigate 0.12 if there is less than an hour left of storage, the oldest 2 h
## Configuring Recording Retention
-Frigate supports both continuous and event based recordings with separate retention modes and retention periods.
+Frigate supports both continuous and tracked object based recordings with separate retention modes and retention periods.
:::tip
@@ -95,7 +95,7 @@ Continuous recording supports different retention modes [which are described bel
### Object Recording
-The number of days to record review items can be specified for review items classified as alerts as well as events.
+The number of days to record review items can be specified for review items classified as alerts as well as tracked objects.
```yaml
record:
@@ -108,13 +108,13 @@ record:
      days: 10 # <- number of days to keep detections recordings
```
-This configuration will retain recording segments that overlap with alerts and detections for 10 days. Because multiple events can reference the same recording segments, this avoids storing duplicate footage for overlapping events and reduces overall storage needs.
+This configuration will retain recording segments that overlap with alerts and detections for 10 days. Because multiple tracked objects can reference the same recording segments, this avoids storing duplicate footage for overlapping tracked objects and reduces overall storage needs.
**WARNING**: Recordings still must be enabled in the config. If a camera has recordings disabled in the config, enabling via the methods listed above will have no effect.
## What do the different retain modes mean?
-Frigate saves from the stream with the `record` role in 10 second segments. These options determine which recording segments are kept for continuous recording (but can also affect events).
+Frigate saves from the stream with the `record` role in 10 second segments. These options determine which recording segments are kept for continuous recording (but can also affect tracked objects).
Let's say you have Frigate configured so that your doorbell camera would retain the last **2** days of continuous recording.
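Putting continuous and tracked object retention together, a config sketch for that doorbell example might look like this (the day values are illustrative):

```yaml
record:
  enabled: True
  retain:
    days: 2 # continuous recording, kept regardless of activity
    mode: all
  alerts:
    retain:
      days: 10 # segments overlapping alerts
  detections:
    retain:
      days: 10 # segments overlapping detections
```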

View File

@@ -271,13 +271,13 @@ detect:
  # especially when using separate streams for detect and record.
  # Use this setting to make the timeline bounding boxes more closely align
  # with the recording. The value can be positive or negative.
-  # TIP: Imagine there is an event clip with a person walking from left to right.
-  # If the event timeline bounding box is consistently to the left of the person
+  # TIP: Imagine there is a tracked object clip with a person walking from left to right.
+  # If the tracked object lifecycle bounding box is consistently to the left of the person
  # then the value should be decreased. Similarly, if a person is walking from
  # left to right and the bounding box is consistently ahead of the person
  # then the value should be increased.
  # TIP: This offset is dynamic so you can change the value and it will update existing
-  # events, this makes it easy to tune.
+  # tracked objects, this makes it easy to tune.
  # WARNING: Fast moving objects will likely not have the bounding box align.
  annotation_offset: 0
@@ -394,9 +394,9 @@ record:
  sync_recordings: False
  # Optional: Retention settings for recording
  retain:
-    # Optional: Number of days to retain recordings regardless of events (default: shown below)
-    # NOTE: This should be set to 0 and retention should be defined in events section below
-    # if you only want to retain recordings of events.
+    # Optional: Number of days to retain recordings regardless of tracked objects (default: shown below)
+    # NOTE: This should be set to 0 and retention should be defined in the alerts and detections sections below
+    # if you only want to retain recordings of alerts and detections.
    days: 0
    # Optional: Mode for retention. Available options are: all, motion, and active_objects
    # all - save all recording segments regardless of activity
@@ -460,7 +460,7 @@ record:
      # never stored, so setting the mode to "all" here won't bring them back.
      mode: motion
-# Optional: Configuration for the jpg snapshots written to the clips directory for each event
+# Optional: Configuration for the jpg snapshots written to the clips directory for each tracked object
# NOTE: Can be overridden at the camera level
snapshots:
  # Optional: Enable writing jpg snapshot to /media/frigate/clips (default: shown below)
@@ -491,10 +491,10 @@ snapshots:
semantic_search:
  # Optional: Enable semantic search (default: shown below)
  enabled: False
-  # Optional: Re-index embeddings database from historical events (default: shown below)
+  # Optional: Re-index embeddings database from historical tracked objects (default: shown below)
  reindex: False
-# Optional: Configuration for AI generated event descriptions
+# Optional: Configuration for AI generated tracked object descriptions
# NOTE: Semantic Search must be enabled for this to do anything.
# WARNING: Depending on the provider, this will send thumbnails over the internet
# to Google or OpenAI's LLMs to generate descriptions. It can be overridden at

View File

@@ -21,7 +21,7 @@ Birdseye RTSP restream can be accessed at `rtsp://<frigate_host>:8554/birdseye`.
```yaml
birdseye:
-  restream: true
+  restream: True
```
### Securing Restream With Authentication

View File

@@ -7,13 +7,13 @@ The Review page of the Frigate UI is for quickly reviewing historical footage of
Review items are filterable by date, object type, and camera.
-### Review items vs. events
+### Review items vs. tracked objects (formerly "events")
In Frigate 0.13 and earlier versions, the UI presented "events". An event was synonymous with a tracked or detected object. In Frigate 0.14 and later, a review item is a time period where any number of tracked objects were active.
For example, consider a situation where two people walked past your house. One was walking a dog. At the same time, a car drove by on the street behind them.
-In this scenario, Frigate 0.13 and earlier would show 4 events in the UI - one for each person, another for the dog, and yet another for the car. You would have had 4 separate videos to watch even though they would have all overlapped.
+In this scenario, Frigate 0.13 and earlier would show 4 "events" in the UI - one for each person, another for the dog, and yet another for the car. You would have had 4 separate videos to watch even though they would have all overlapped.
In 0.14 and later, all of that is bundled into a single review item which starts and ends to capture all of that activity. Reviews for a single camera cannot overlap. Once you have watched that time period on that camera, it is marked as reviewed.

View File

@@ -3,13 +3,15 @@ id: semantic_search
title: Using Semantic Search
---
-The Search feature in Frigate allows you to find tracked objects within your review items using either the image itself, a user-defined text description, or an automatically generated one. This semantic search functionality works by creating _embeddings_ — numerical vector representations — for both the images and text descriptions of your tracked objects. By comparing these embeddings, Frigate assesses their similarities to deliver relevant search results.
+Semantic Search in Frigate allows you to find tracked objects within your review items using either the image itself, a user-defined text description, or an automatically generated one. This feature works by creating _embeddings_ — numerical vector representations — for both the images and text descriptions of your tracked objects. By comparing these embeddings, Frigate assesses their similarities to deliver relevant search results.
Frigate has support for two models to create embeddings, both of which run locally: [OpenAI CLIP](https://openai.com/research/clip) and [all-MiniLM-L6-v2](https://huggingface.co/sentence-transformers/all-MiniLM-L6-v2). Embeddings are then saved to a local instance of [ChromaDB](https://trychroma.com).
+Semantic Search is accessed via the _Explore_ view in the Frigate UI.
## Configuration
-Semantic Search is a global configuration setting. Semantic search is disabled by default, and must be enabled in your config file before it can be used.
+Semantic Search is a global configuration setting.
```yaml
semantic_search:
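A minimal enabling sketch (`reindex` is normally left at False and only set to True once to rebuild embeddings from historical tracked objects):

```yaml
semantic_search:
  enabled: True
  reindex: False
```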
@@ -31,7 +33,7 @@ This model is able to embed both images and text into the same vector space, whi
### all-MiniLM-L6-v2
-This is a sentence embedding model that has been fine tuned on over 1 billion sentence pairs. This model is used to embed tracked object descriptions and perform searches against them. Descriptions can be created, viewed, and modified on the Search page when clicking on the gray tracked object chip at the top left of each review item. See [the Generative AI docs](/configuration/genai.md) for more information on how to automatically generate event descriptions.
+This is a sentence embedding model that has been fine tuned on over 1 billion sentence pairs. This model is used to embed tracked object descriptions and perform searches against them. Descriptions can be created, viewed, and modified on the Search page when clicking on the gray tracked object chip at the top left of each review item. See [the Generative AI docs](/configuration/genai.md) for more information on how to automatically generate tracked object descriptions.
## Usage

View File

@@ -64,7 +64,7 @@ cameras:
### Restricting zones to specific objects
-Sometimes you want to limit a zone to specific object types to have more granular control of when events/snapshots are saved. The following example will limit one zone to person objects and the other to cars.
+Sometimes you want to limit a zone to specific object types to have more granular control of when alerts, detections, and snapshots are saved. The following example will limit one zone to person objects and the other to cars.
```yaml
cameras:
@@ -80,7 +80,7 @@ cameras:
        - car
```
-Only car objects can trigger the `front_yard_street` zone and only person can trigger the `entire_yard`. You will get events for person objects that enter anywhere in the yard, and events for cars only if they enter the street.
+Only car objects can trigger the `front_yard_street` zone and only person can trigger the `entire_yard`. Objects will be tracked for any `person` that enters anywhere in the yard, and for cars only if they enter the street.
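A sketch of such a zone setup (the camera name and polygon coordinates are hypothetical; coordinates are x,y pairs outlining each zone):

```yaml
cameras:
  front_yard_cam: # hypothetical camera name
    zones:
      entire_yard:
        coordinates: 0,396,1920,358,1920,1080,0,1080 # hypothetical polygon
        objects:
          - person
      front_yard_street:
        coordinates: 0,352,1920,310,1920,420,0,462 # hypothetical polygon
        objects:
          - car
```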
### Zone Loitering ### Zone Loitering

View File

@@ -16,10 +16,6 @@ A box returned from the object detection model that outlines an object in the fr
- A gray thin line indicates that object is detected as being stationary
- A thick line indicates that object is the subject of autotracking (when enabled).
-## Event
-The time period starting when a tracked object entered the frame and ending when it left the frame, including any time that the object remained still. Events are saved when it is considered a [true positive](#threshold) and meets the requirements for a snapshot or recording to be saved.
## False Positive
An incorrect detection of an object type. For example a dog being detected as a person, a chair being detected as a dog, etc. A person being detected in an area you want to ignore is not a false positive.
@ -64,6 +60,10 @@ The threshold is the median score that an object must reach in order to be consi
The top score for an object is the highest median score for an object. The top score for an object is the highest median score for an object.
## Tracked Object ("event" in previous versions)
The time period starting when a tracked object entered the frame and ending when it left the frame, including any time that the object remained still. Tracked objects are saved when it is considered a [true positive](#threshold) and meets the requirements for a snapshot or recording to be saved.
## Zone ## Zone
Zones are areas of interest; they can be used for notifications and for limiting the areas where Frigate will create an [event](#event). [See the zone docs for more info](/configuration/zones) Zones are areas of interest; they can be used for notifications and for limiting the areas where Frigate will create an [event](#event). [See the zone docs for more info](/configuration/zones)


@ -238,7 +238,7 @@ Now that you know where you need to mask, use the "Mask & Zone creator" in the o
:::warning :::warning
Note that motion masks should not be used to mark out areas where you do not want objects to be detected or to reduce false positives. They do not alter the image sent to object detection, so you can still get events and detections in areas with motion masks. These only prevent motion in these areas from initiating object detection. Note that motion masks should not be used to mark out areas where you do not want objects to be detected or to reduce false positives. They do not alter the image sent to object detection, so you can still get tracked objects, alerts, and detections in areas with motion masks. These only prevent motion in these areas from initiating object detection.
::: :::
@ -302,7 +302,7 @@ If you only plan to use Frigate for recording, it is still recommended to define
::: :::
By default, Frigate will retain video of all events for 10 days. The full set of options for recording can be found [here](../configuration/reference.md). By default, Frigate will retain video of all tracked objects for 10 days. The full set of options for recording can be found [here](../configuration/reference.md).
### Step 7: Complete config ### Step 7: Complete config


@ -7,11 +7,11 @@ The best way to get started with notifications for Frigate is to use the [Bluepr
It is generally recommended to trigger notifications based on the `frigate/reviews` mqtt topic. This provides the event_id(s) needed to fetch [thumbnails/snapshots/clips](../integrations/home-assistant.md#notification-api) and other useful information to customize when and where you want to receive alerts. The data is published in the form of a change feed, which means you can reference the "previous state" of the object in the `before` section and the "current state" of the object in the `after` section. You can see an example [here](../integrations/mqtt.md#frigateevents). It is generally recommended to trigger notifications based on the `frigate/reviews` mqtt topic. This provides the event_id(s) needed to fetch [thumbnails/snapshots/clips](../integrations/home-assistant.md#notification-api) and other useful information to customize when and where you want to receive alerts. The data is published in the form of a change feed, which means you can reference the "previous state" of the object in the `before` section and the "current state" of the object in the `after` section. You can see an example [here](../integrations/mqtt.md#frigateevents).
Here is a simple example of a notification automation of events which will update the existing notification for each change. This means the image you see in the notification will update as Frigate finds a "better" image. Here is a simple example of a notification automation of tracked objects which will update the existing notification for each change. This means the image you see in the notification will update as Frigate finds a "better" image.
```yaml ```yaml
automation: automation:
- alias: Notify of events - alias: Notify of tracked object
trigger: trigger:
platform: mqtt platform: mqtt
topic: frigate/events topic: frigate/events


@ -189,15 +189,15 @@ Example parameters:
### `GET /api/<camera_name>/<label>/thumbnail.jpg` ### `GET /api/<camera_name>/<label>/thumbnail.jpg`
Returns the thumbnail from the latest event for the given camera and label combo. Using `any` as the label will return the latest thumbnail regardless of type. Returns the thumbnail from the latest tracked object for the given camera and label combo. Using `any` as the label will return the latest thumbnail regardless of type.
### `GET /api/<camera_name>/<label>/clip.mp4` ### `GET /api/<camera_name>/<label>/clip.mp4`
Returns the clip from the latest event for the given camera and label combo. Using `any` as the label will return the latest clip regardless of type. Returns the clip from the latest tracked object for the given camera and label combo. Using `any` as the label will return the latest clip regardless of type.
### `GET /api/<camera_name>/<label>/snapshot.jpg` ### `GET /api/<camera_name>/<label>/snapshot.jpg`
Returns the snapshot image from the latest event for the given camera and label combo. Using `any` as the label will return the latest snapshot regardless of type. Returns the snapshot image from the latest tracked object for the given camera and label combo. Using `any` as the label will return the latest snapshot regardless of type.
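These latest-media endpoints all share one URL shape, so a client can build them with a small helper. A minimal sketch; the base URL and helper name are assumptions, not part of Frigate's API:

```python
def media_url(
    base: str, camera: str, label: str = "any", media: str = "thumbnail.jpg"
) -> str:
    """Build the URL for a camera/label combo's latest tracked object media.

    `media` can be "thumbnail.jpg", "clip.mp4", or "snapshot.jpg";
    `label` of "any" matches the latest object regardless of type.
    """
    return f"{base.rstrip('/')}/api/{camera}/{label}/{media}"


# e.g. media_url("http://localhost:5000", "front_door", "person")
#      -> "http://localhost:5000/api/front_door/person/thumbnail.jpg"
```

The returned URL can then be fetched with any HTTP client (e.g. `urllib.request.urlopen`).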
### `GET /api/<camera_name>/grid.jpg` ### `GET /api/<camera_name>/grid.jpg`
@ -386,7 +386,7 @@ Specific preview frame from preview cache.
Looping image made from preview video / frames during this time range. Looping image made from preview video / frames during this time range.
| param | Type | Description | | param | Type | Description |
| --------- | ---- | -------------------------------- | | -------- | ---- | -------------------------------- |
| `format` | str | Format of preview [`gif`, `mp4`] | | `format` | str | Format of preview [`gif`, `mp4`] |
## Recordings ## Recordings


@ -149,7 +149,7 @@ Home Assistant > Configuration > Integrations > Frigate > Options
## Entities Provided ## Entities Provided
| Platform | Description | | Platform | Description |
| --------------- | --------------------------------------------------------------------------------- | | --------------- | ------------------------------------------------------------------------------- |
| `camera` | Live camera stream (requires RTSP). | | `camera` | Live camera stream (requires RTSP). |
| `image` | Image of the latest detected object for each camera. | | `image` | Image of the latest detected object for each camera. |
| `sensor` | States to monitor Frigate performance, object counts for all zones and cameras. | | `sensor` | States to monitor Frigate performance, object counts for all zones and cameras. |
@ -160,7 +160,7 @@ Home Assistant > Configuration > Integrations > Frigate > Options
The integration provides: The integration provides:
- Browsing event recordings with thumbnails - Browsing tracked object recordings with thumbnails
- Browsing snapshots - Browsing snapshots
- Browsing recordings by month, day, camera, time - Browsing recordings by month, day, camera, time
@ -183,19 +183,19 @@ For clips to be castable to media devices, audio is required and may need to be
Many people do not want to expose Frigate to the web, so the integration creates some public API endpoints that can be used for notifications. Many people do not want to expose Frigate to the web, so the integration creates some public API endpoints that can be used for notifications.
To load a thumbnail for an event: To load a thumbnail for a tracked object:
``` ```
https://HA_URL/api/frigate/notifications/<event-id>/thumbnail.jpg https://HA_URL/api/frigate/notifications/<event-id>/thumbnail.jpg
``` ```
To load a snapshot for an event: To load a snapshot for a tracked object:
``` ```
https://HA_URL/api/frigate/notifications/<event-id>/snapshot.jpg https://HA_URL/api/frigate/notifications/<event-id>/snapshot.jpg
``` ```
To load a video clip of an event: To load a video clip of a tracked object:
``` ```
https://HA_URL/api/frigate/notifications/<event-id>/clip.mp4 https://HA_URL/api/frigate/notifications/<event-id>/clip.mp4


@ -19,7 +19,7 @@ Causes Frigate to exit. Docker should be configured to automatically restart the
### `frigate/events` ### `frigate/events`
Message published for each changed event. The first message is published when the tracked object is no longer marked as a false_positive. When Frigate finds a better snapshot of the tracked object or when a zone change occurs, it will publish a message with the same id. When the event ends, a final message is published with `end_time` set. Message published for each changed tracked object. The first message is published when the tracked object is no longer marked as a false_positive. When Frigate finds a better snapshot of the tracked object or when a zone change occurs, it will publish a message with the same id. When the tracked object ends, a final message is published with `end_time` set.
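Because messages form a change feed, a consumer typically inspects the `after` state. For example, detecting that a tracked object has ended can be done with a small parser; the payload below is a trimmed, hypothetical example (real messages carry many more fields):

```python
import json


def is_ended(payload: str) -> bool:
    """True when a frigate/events message marks the end of a tracked object."""
    msg = json.loads(payload)
    # The final message for a tracked object carries a non-null end_time.
    return msg.get("after", {}).get("end_time") is not None


# Trimmed example payload for illustration only
sample = json.dumps({"before": {"end_time": None}, "after": {"end_time": 1718987200.0}})
```

Feeding `sample` to `is_ended` returns `True`; an in-progress update with a null `end_time` returns `False`.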
```json ```json
{ {
@ -109,15 +109,13 @@ Message published for each changed review item. The first message is published w
"severity": "detection", "severity": "detection",
"thumb_path": "/media/frigate/clips/review/thumb-front_cam-1718987129.308396-fqk5ka.webp", "thumb_path": "/media/frigate/clips/review/thumb-front_cam-1718987129.308396-fqk5ka.webp",
"data": { "data": {
"detections": [ // list of event IDs "detections": [
// list of event IDs
"1718987128.947436-g92ztx", "1718987128.947436-g92ztx",
"1718987148.879516-d7oq7r", "1718987148.879516-d7oq7r",
"1718987126.934663-q5ywpt" "1718987126.934663-q5ywpt"
], ],
"objects": [ "objects": ["person", "car"],
"person",
"car"
],
"sub_labels": [], "sub_labels": [],
"zones": [], "zones": [],
"audio": [] "audio": []
@ -136,14 +134,9 @@ Message published for each changed review item. The first message is published w
"1718987148.879516-d7oq7r", "1718987148.879516-d7oq7r",
"1718987126.934663-q5ywpt" "1718987126.934663-q5ywpt"
], ],
"objects": [ "objects": ["person", "car"],
"person",
"car"
],
"sub_labels": ["Bob"], "sub_labels": ["Bob"],
"zones": [ "zones": ["front_yard"],
"front_yard"
],
"audio": [] "audio": []
} }
} }
@ -175,13 +168,11 @@ Publishes the count of active objects for the camera for use as a sensor in Home
Assistant. `all` can be used as the object_name for the count of all active objects Assistant. `all` can be used as the object_name for the count of all active objects
for the camera. for the camera.
### `frigate/<zone_name>/<object_name>` ### `frigate/<zone_name>/<object_name>`
Publishes the count of objects for the zone for use as a sensor in Home Assistant. Publishes the count of objects for the zone for use as a sensor in Home Assistant.
`all` can be used as the object_name for the count of all objects for the zone. `all` can be used as the object_name for the count of all objects for the zone.
### `frigate/<zone_name>/<object_name>/active` ### `frigate/<zone_name>/<object_name>/active`
Publishes the count of active objects for the zone for use as a sensor in Home Publishes the count of active objects for the zone for use as a sensor in Home

View File

@ -19,7 +19,7 @@ Once logged in, you can generate an API key for Frigate in Settings.
### Set your API key ### Set your API key
In Frigate, you can use an environment variable or a docker secret named `PLUS_API_KEY` to enable the `SEND TO FRIGATE+` buttons on the events page. Home Assistant Addon users can set it under Settings > Addons > Frigate NVR > Configuration > Options (be sure to toggle the "Show unused optional configuration options" switch). In Frigate, you can use an environment variable or a docker secret named `PLUS_API_KEY` to enable the `Frigate+` buttons on the Explore page. Home Assistant Addon users can set it under Settings > Addons > Frigate NVR > Configuration > Options (be sure to toggle the "Show unused optional configuration options" switch).
:::warning :::warning
@ -29,7 +29,7 @@ You cannot use the `environment_vars` section of your configuration file to set
## Submit examples ## Submit examples
Once your API key is configured, you can submit examples directly from the events page in Frigate using the `SEND TO FRIGATE+` button. Once your API key is configured, you can submit examples directly from the Explore page in Frigate using the `Frigate+` button.
:::note :::note


@ -33,7 +33,7 @@ Frigate+ models support a more relevant set of objects for security cameras. Cur
### Label attributes ### Label attributes
Frigate has special handling for some labels when using Frigate+ models. `face`, `license_plate`, `amazon`, `ups`, and `fedex` are considered attribute labels which are not tracked like regular objects and do not generate events. In addition, the `threshold` filter will have no effect on these labels. You should adjust the `min_score` and other filter values as needed. Frigate has special handling for some labels when using Frigate+ models. `face`, `license_plate`, `amazon`, `ups`, and `fedex` are considered attribute labels which are not tracked like regular objects and do not generate review items directly. In addition, the `threshold` filter will have no effect on these labels. You should adjust the `min_score` and other filter values as needed.
In order to have Frigate start using these attribute labels, you will need to add them to the list of objects to track: In order to have Frigate start using these attribute labels, you will need to add them to the list of objects to track:


@ -17,7 +17,7 @@ ffmpeg:
record: preset-record-generic-audio-aac record: preset-record-generic-audio-aac
``` ```
### I can't view events or recordings in the Web UI. ### I can't view recordings in the Web UI.
Ensure your cameras send h264 encoded video, or [transcode them](/configuration/restream.md). Ensure your cameras send h264 encoded video, or [transcode them](/configuration/restream.md).


@ -251,6 +251,61 @@ def events():
return jsonify(list(events)) return jsonify(list(events))
@EventBp.route("/events/explore")
def events_explore():
    limit = request.args.get("limit", 10, type=int)

    # Rank each label's events by recency and count how many events each label has.
    subquery = Event.select(
        Event.id,
        Event.camera,
        Event.label,
        Event.zones,
        Event.start_time,
        Event.end_time,
        Event.has_clip,
        Event.has_snapshot,
        Event.plus_id,
        Event.retain_indefinitely,
        Event.sub_label,
        Event.top_score,
        Event.false_positive,
        Event.box,
        Event.data,
        fn.rank()
        .over(partition_by=[Event.label], order_by=[Event.start_time.desc()])
        .alias("rank"),
        fn.COUNT(Event.id).over(partition_by=[Event.label]).alias("event_count"),
    ).alias("subquery")

    # Keep only the `limit` most recent events per label, most common labels first.
    query = (
        Event.select(
            subquery.c.id,
            subquery.c.camera,
            subquery.c.label,
            subquery.c.zones,
            subquery.c.start_time,
            subquery.c.end_time,
            subquery.c.has_clip,
            subquery.c.has_snapshot,
            subquery.c.plus_id,
            subquery.c.retain_indefinitely,
            subquery.c.sub_label,
            subquery.c.top_score,
            subquery.c.false_positive,
            subquery.c.box,
            subquery.c.data,
            subquery.c.event_count,
        )
        .from_(subquery)
        .where(subquery.c.rank <= limit)
        .order_by(subquery.c.event_count.desc(), subquery.c.start_time.desc())
        .dicts()
    )

    events = query.iterator()

    return jsonify(list(events))
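The window-function pattern used by this endpoint can be exercised standalone: `RANK() OVER (PARTITION BY label ORDER BY start_time DESC)` keeps the N most recent rows per label, and the per-label `COUNT(...) OVER` drives the final ordering. A self-contained sketch against an in-memory SQLite table (hypothetical mini-schema, not Frigate's; requires SQLite >= 3.25 for window functions):

```python
import sqlite3

conn = sqlite3.connect(":memory:")
conn.execute("CREATE TABLE event (id TEXT, label TEXT, start_time REAL)")
conn.executemany(
    "INSERT INTO event VALUES (?, ?, ?)",
    [("a", "person", 3.0), ("b", "person", 2.0), ("c", "person", 1.0), ("d", "car", 5.0)],
)

rows = conn.execute(
    """
    SELECT id, label FROM (
        SELECT id, label, start_time,
               RANK() OVER (PARTITION BY label ORDER BY start_time DESC) AS rnk,
               COUNT(id) OVER (PARTITION BY label) AS event_count
        FROM event
    )
    WHERE rnk <= 2                                 -- at most 2 rows per label
    ORDER BY event_count DESC, start_time DESC     -- most common labels first
    """
).fetchall()
# rows -> [("a", "person"), ("b", "person"), ("d", "car")]
```

The oldest `person` row ("c") is dropped by the rank filter, and `person` sorts before `car` because it has more events overall.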
@EventBp.route("/event_ids") @EventBp.route("/event_ids")
def event_ids(): def event_ids():
idString = request.args.get("ids") idString = request.args.get("ids")
@ -317,7 +372,10 @@ def events_search():
Event.zones, Event.zones,
Event.start_time, Event.start_time,
Event.end_time, Event.end_time,
Event.has_clip,
Event.has_snapshot,
Event.data, Event.data,
Event.plus_id,
ReviewSegment.thumb_path, ReviewSegment.thumb_path,
] ]


@ -192,7 +192,10 @@ def migrate_015_0(config: dict[str, dict[str, any]]) -> dict[str, dict[str, any]
"default" "default"
] ]
else: else:
detections_retention["retain"]["days"] = 0 continuous_days = config.get("record", {}).get("retain", {}).get("days")
detections_retention["retain"]["days"] = (
continuous_days if continuous_days else 1
)
new_config["record"]["alerts"] = alerts_retention new_config["record"]["alerts"] = alerts_retention
new_config["record"]["detections"] = detections_retention new_config["record"]["detections"] = detections_retention
@ -232,7 +235,12 @@ def migrate_015_0(config: dict[str, dict[str, any]]) -> dict[str, dict[str, any]
"default" "default"
] ]
else: else:
detections_retention["retain"]["days"] = 0 continuous_days = (
camera_config.get("record", {}).get("retain", {}).get("days")
)
detections_retention["retain"]["days"] = (
continuous_days if continuous_days else 1
)
camera_config["record"]["alerts"] = alerts_retention camera_config["record"]["alerts"] = alerts_retention
camera_config["record"]["detections"] = detections_retention camera_config["record"]["detections"] = detections_retention
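The fallback logic in this migration can be isolated into a small helper: detections inherit the continuous-recording retention when one is set, and default to one day otherwise. A sketch with a hypothetical helper name:

```python
def detection_retain_days(record_cfg: dict) -> int:
    """Migrated retention for detections: inherit continuous recording days,
    falling back to 1 day when they are unset or 0."""
    continuous_days = record_cfg.get("retain", {}).get("days")
    return continuous_days if continuous_days else 1
```

This mirrors the change above: before the fix, a missing continuous retention migrated detections to 0 days, silently disabling them.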


@ -13,9 +13,8 @@ import { isPWA } from "./utils/isPWA";
const Live = lazy(() => import("@/pages/Live")); const Live = lazy(() => import("@/pages/Live"));
const Events = lazy(() => import("@/pages/Events")); const Events = lazy(() => import("@/pages/Events"));
const Search = lazy(() => import("@/pages/Search")); const Explore = lazy(() => import("@/pages/Explore"));
const Exports = lazy(() => import("@/pages/Exports")); const Exports = lazy(() => import("@/pages/Exports"));
const SubmitPlus = lazy(() => import("@/pages/SubmitPlus"));
const ConfigEditor = lazy(() => import("@/pages/ConfigEditor")); const ConfigEditor = lazy(() => import("@/pages/ConfigEditor"));
const System = lazy(() => import("@/pages/System")); const System = lazy(() => import("@/pages/System"));
const Settings = lazy(() => import("@/pages/Settings")); const Settings = lazy(() => import("@/pages/Settings"));
@ -45,9 +44,8 @@ function App() {
<Route index element={<Live />} /> <Route index element={<Live />} />
<Route path="/events" element={<Redirect to="/review" />} /> <Route path="/events" element={<Redirect to="/review" />} />
<Route path="/review" element={<Events />} /> <Route path="/review" element={<Events />} />
<Route path="/search" element={<Search />} /> <Route path="/explore" element={<Explore />} />
<Route path="/export" element={<Exports />} /> <Route path="/export" element={<Exports />} />
<Route path="/plus" element={<SubmitPlus />} />
<Route path="/system" element={<System />} /> <Route path="/system" element={<System />} />
<Route path="/settings" element={<Settings />} /> <Route path="/settings" element={<Settings />} />
<Route path="/config" element={<ConfigEditor />} /> <Route path="/config" element={<ConfigEditor />} />


@ -131,6 +131,11 @@ export function AnimatedEventCard({
<div <div
className="size-full cursor-pointer overflow-hidden rounded md:rounded-lg" className="size-full cursor-pointer overflow-hidden rounded md:rounded-lg"
onClick={onOpenReview} onClick={onOpenReview}
onAuxClick={() =>
window
.open(`${baseUrl}review?id=${event.id}`, "_blank")
?.focus()
}
> >
{!alertVideos ? ( {!alertVideos ? (
<img <img


@ -0,0 +1,125 @@
import { useCallback } from "react";
import { useApiHost } from "@/api";
import { getIconForLabel } from "@/utils/iconUtil";
import TimeAgo from "../dynamic/TimeAgo";
import useSWR from "swr";
import { FrigateConfig } from "@/types/frigateConfig";
import { isIOS, isSafari } from "react-device-detect";
import Chip from "@/components/indicators/Chip";
import { useFormattedTimestamp } from "@/hooks/use-date-utils";
import useImageLoaded from "@/hooks/use-image-loaded";
import { Tooltip, TooltipContent, TooltipTrigger } from "../ui/tooltip";
import ImageLoadingIndicator from "../indicators/ImageLoadingIndicator";
import ActivityIndicator from "../indicators/activity-indicator";
import { capitalizeFirstLetter } from "@/utils/stringUtil";
import { SearchResult } from "@/types/search";
import useContextMenu from "@/hooks/use-contextmenu";
import { cn } from "@/lib/utils";
type SearchThumbnailProps = {
searchResult: SearchResult;
findSimilar: () => void;
onClick: (searchResult: SearchResult) => void;
};
export default function SearchThumbnail({
searchResult,
findSimilar,
onClick,
}: SearchThumbnailProps) {
const apiHost = useApiHost();
const { data: config } = useSWR<FrigateConfig>("config");
const [imgRef, imgLoaded, onImgLoad] = useImageLoaded();
useContextMenu(imgRef, findSimilar);
const handleOnClick = useCallback(() => {
onClick(searchResult);
}, [searchResult, onClick]);
// date
const formattedDate = useFormattedTimestamp(
searchResult.start_time,
config?.ui.time_format == "24hour" ? "%b %-d, %H:%M" : "%b %-d, %I:%M %p",
);
return (
<div className="relative size-full cursor-pointer" onClick={handleOnClick}>
<ImageLoadingIndicator
className="absolute inset-0"
imgLoaded={imgLoaded}
/>
<div className={`${imgLoaded ? "visible" : "invisible"}`}>
<img
ref={imgRef}
className={cn(
"size-full select-none opacity-100 transition-opacity",
searchResult.search_source == "thumbnail" && "object-contain",
)}
style={
isIOS
? {
WebkitUserSelect: "none",
WebkitTouchCallout: "none",
}
: undefined
}
draggable={false}
src={`${apiHost}api/events/${searchResult.id}/thumbnail.jpg`}
loading={isSafari ? "eager" : "lazy"}
onLoad={() => {
onImgLoad();
}}
/>
<div className="absolute left-0 top-2 z-40">
<Tooltip>
<div className="flex">
<TooltipTrigger asChild>
<div className="mx-3 pb-1 text-sm text-white">
{
<>
<Chip
className={`z-0 flex items-start justify-between space-x-1 bg-gray-500 bg-gradient-to-br from-gray-400 to-gray-500`}
onClick={() => onClick(searchResult)}
>
{getIconForLabel(
searchResult.label,
"size-3 text-white",
)}
</Chip>
</>
}
</div>
</TooltipTrigger>
</div>
<TooltipContent className="capitalize">
{[...new Set([searchResult.label])]
.filter(
(item) => item !== undefined && !item.includes("-verified"),
)
.map((text) => capitalizeFirstLetter(text))
.sort()
.join(", ")
.replaceAll("-verified", "")}
</TooltipContent>
</Tooltip>
</div>
<div className="rounded-t-l pointer-events-none absolute inset-x-0 top-0 z-10 h-[30%] w-full bg-gradient-to-b from-black/60 to-transparent"></div>
<div className="rounded-b-l pointer-events-none absolute inset-x-0 bottom-0 z-10 h-[20%] w-full bg-gradient-to-t from-black/60 to-transparent">
<div className="mx-3 flex h-full items-end justify-between pb-1 text-sm text-white">
{searchResult.end_time ? (
<TimeAgo time={searchResult.start_time * 1000} dense />
) : (
<div>
<ActivityIndicator size={24} />
</div>
)}
{formattedDate}
</div>
</div>
</div>
</div>
);
}


@ -110,7 +110,7 @@ export function CalendarRangeFilterButton({
className={`${range == undefined ? "text-secondary-foreground" : "text-selected-foreground"}`} className={`${range == undefined ? "text-secondary-foreground" : "text-selected-foreground"}`}
/> />
<div <div
className={`hidden md:block ${range == undefined ? "text-primary" : "text-selected-foreground"}`} className={`${range == undefined ? "text-primary" : "text-selected-foreground"}`}
> >
{range == undefined ? defaultText : selectedDate} {range == undefined ? defaultText : selectedDate}
</div> </div>


@ -1,6 +1,6 @@
import { Button } from "../ui/button"; import { Button } from "../ui/button";
import { CameraGroupConfig } from "@/types/frigateConfig"; import { CameraGroupConfig } from "@/types/frigateConfig";
import { useState } from "react"; import { useMemo, useState } from "react";
import { import {
DropdownMenu, DropdownMenu,
DropdownMenuContent, DropdownMenuContent,
@ -17,12 +17,14 @@ type CameraFilterButtonProps = {
allCameras: string[]; allCameras: string[];
groups: [string, CameraGroupConfig][]; groups: [string, CameraGroupConfig][];
selectedCameras: string[] | undefined; selectedCameras: string[] | undefined;
hideText?: boolean;
updateCameraFilter: (cameras: string[] | undefined) => void; updateCameraFilter: (cameras: string[] | undefined) => void;
}; };
export function CamerasFilterButton({ export function CamerasFilterButton({
allCameras, allCameras,
groups, groups,
selectedCameras, selectedCameras,
hideText = isMobile,
updateCameraFilter, updateCameraFilter,
}: CameraFilterButtonProps) { }: CameraFilterButtonProps) {
const [open, setOpen] = useState(false); const [open, setOpen] = useState(false);
@ -30,6 +32,18 @@ export function CamerasFilterButton({
selectedCameras, selectedCameras,
); );
const buttonText = useMemo(() => {
if (isMobile) {
return "Cameras";
}
if (!selectedCameras || selectedCameras.length == 0) {
return "All Cameras";
}
return `${selectedCameras.includes("birdseye") ? selectedCameras.length - 1 : selectedCameras.length} Camera${selectedCameras.length !== 1 ? "s" : ""}`;
}, [selectedCameras]);
const trigger = ( const trigger = (
<Button <Button
className="flex items-center gap-2 capitalize" className="flex items-center gap-2 capitalize"
@ -40,11 +54,9 @@ export function CamerasFilterButton({
className={`${(selectedCameras?.length ?? 0) >= 1 ? "text-selected-foreground" : "text-secondary-foreground"}`} className={`${(selectedCameras?.length ?? 0) >= 1 ? "text-selected-foreground" : "text-secondary-foreground"}`}
/> />
<div <div
className={`hidden md:block ${selectedCameras?.length ? "text-selected-foreground" : "text-primary"}`} className={`${hideText ? "hidden" : ""} ${selectedCameras?.length ? "text-selected-foreground" : "text-primary"}`}
> >
{selectedCameras == undefined {buttonText}
? "All Cameras"
: `${selectedCameras.includes("birdseye") ? selectedCameras.length - 1 : selectedCameras.length} Camera${selectedCameras.length !== 1 ? "s" : ""}`}
</div> </div>
</Button> </Button>
); );


@ -5,7 +5,6 @@ import { FrigateConfig } from "@/types/frigateConfig";
import { useCallback, useMemo, useState } from "react"; import { useCallback, useMemo, useState } from "react";
import { DropdownMenuSeparator } from "../ui/dropdown-menu"; import { DropdownMenuSeparator } from "../ui/dropdown-menu";
import { getEndOfDayTimestamp } from "@/utils/dateUtil"; import { getEndOfDayTimestamp } from "@/utils/dateUtil";
import { FaFilter } from "react-icons/fa";
import { isMobile } from "react-device-detect"; import { isMobile } from "react-device-detect";
import { Drawer, DrawerContent, DrawerTrigger } from "../ui/drawer"; import { Drawer, DrawerContent, DrawerTrigger } from "../ui/drawer";
import { Switch } from "../ui/switch"; import { Switch } from "../ui/switch";
@ -19,6 +18,8 @@ import { DateRange } from "react-day-picker";
import { cn } from "@/lib/utils"; import { cn } from "@/lib/utils";
import SubFilterIcon from "../icons/SubFilterIcon"; import SubFilterIcon from "../icons/SubFilterIcon";
import { FaLocationDot } from "react-icons/fa6"; import { FaLocationDot } from "react-icons/fa6";
import { MdLabel } from "react-icons/md";
import SearchSourceIcon from "../icons/SearchSourceIcon";
const SEARCH_FILTERS = [ const SEARCH_FILTERS = [
"cameras", "cameras",
@ -42,14 +43,15 @@ type SearchFilterGroupProps = {
className: string; className: string;
filters?: SearchFilters[]; filters?: SearchFilters[];
filter?: SearchFilter; filter?: SearchFilter;
searchTerm: string;
filterList?: FilterList; filterList?: FilterList;
onUpdateFilter: (filter: SearchFilter) => void; onUpdateFilter: (filter: SearchFilter) => void;
}; };
export default function SearchFilterGroup({ export default function SearchFilterGroup({
className, className,
filters = DEFAULT_REVIEW_FILTERS, filters = DEFAULT_REVIEW_FILTERS,
filter, filter,
searchTerm,
filterList, filterList,
onUpdateFilter, onUpdateFilter,
}: SearchFilterGroupProps) { }: SearchFilterGroupProps) {
@ -154,12 +156,18 @@ export default function SearchFilterGroup({
); );
return ( return (
<div className={cn("flex justify-center gap-2", className)}> <div
className={cn(
"scrollbar-container flex justify-center gap-2 overflow-x-auto",
className,
)}
>
{filters.includes("cameras") && ( {filters.includes("cameras") && (
<CamerasFilterButton <CamerasFilterButton
allCameras={filterValues.cameras} allCameras={filterValues.cameras}
groups={groups} groups={groups}
selectedCameras={filter?.cameras} selectedCameras={filter?.cameras}
hideText={false}
updateCameraFilter={(newCameras) => { updateCameraFilter={(newCameras) => {
onUpdateFilter({ ...filter, cameras: newCameras }); onUpdateFilter({ ...filter, cameras: newCameras });
}} }}
@ -175,19 +183,10 @@ export default function SearchFilterGroup({
to: new Date(filter.before * 1000), to: new Date(filter.before * 1000),
} }
} }
defaultText="All Dates" defaultText={isMobile ? "Dates" : "All Dates"}
updateSelectedRange={onUpdateSelectedRange} updateSelectedRange={onUpdateSelectedRange}
/> />
)} )}
{filters.includes("general") && (
<GeneralFilterButton
allLabels={filterValues.labels}
selectedLabels={filter?.labels}
updateLabelFilter={(newLabels) => {
onUpdateFilter({ ...filter, labels: newLabels });
}}
/>
)}
{filters.includes("zone") && allZones.length > 0 && ( {filters.includes("zone") && allZones.length > 0 && (
<ZoneFilterButton <ZoneFilterButton
allZones={filterValues.zones} allZones={filterValues.zones}
@ -197,6 +196,15 @@ export default function SearchFilterGroup({
} }
/> />
)} )}
{filters.includes("general") && (
<GeneralFilterButton
allLabels={filterValues.labels}
selectedLabels={filter?.labels}
updateLabelFilter={(newLabels) => {
onUpdateFilter({ ...filter, labels: newLabels });
}}
/>
)}
{filters.includes("sub") && ( {filters.includes("sub") && (
<SubFilterButton <SubFilterButton
allSubLabels={allSubLabels} allSubLabels={allSubLabels}
@ -206,7 +214,9 @@ export default function SearchFilterGroup({
} }
/> />
)} )}
{config?.semantic_search?.enabled && filters.includes("source") && ( {config?.semantic_search?.enabled &&
filters.includes("source") &&
!searchTerm.includes("similarity:") && (
<SearchTypeButton <SearchTypeButton
selectedSearchSources={ selectedSearchSources={
filter?.search_type ?? ["thumbnail", "description"] filter?.search_type ?? ["thumbnail", "description"]
@ -235,19 +245,35 @@ function GeneralFilterButton({
selectedLabels, selectedLabels,
); );
const buttonText = useMemo(() => {
if (isMobile) {
return "Labels";
}
if (!selectedLabels || selectedLabels.length == 0) {
return "All Labels";
}
if (selectedLabels.length == 1) {
return selectedLabels[0];
}
return `${selectedLabels.length} Labels`;
}, [selectedLabels]);
const trigger = ( const trigger = (
<Button <Button
size="sm" size="sm"
variant={selectedLabels?.length ? "select" : "default"} variant={selectedLabels?.length ? "select" : "default"}
className="flex items-center gap-2 capitalize" className="flex items-center gap-2 capitalize"
> >
<FaFilter <MdLabel
className={`${selectedLabels?.length ? "text-selected-foreground" : "text-secondary-foreground"}`} className={`${selectedLabels?.length ? "text-selected-foreground" : "text-secondary-foreground"}`}
/> />
<div <div
className={`hidden md:block ${selectedLabels?.length ? "text-selected-foreground" : "text-primary"}`} className={`${selectedLabels?.length ? "text-selected-foreground" : "text-primary"}`}
> >
Filter {buttonText}
</div> </div>
</Button> </Button>
); );
@ -405,6 +431,22 @@ function ZoneFilterButton({
selectedZones, selectedZones,
); );
const buttonText = useMemo(() => {
if (isMobile) {
return "Zones";
}
if (!selectedZones || selectedZones.length == 0) {
return "All Zones";
}
if (selectedZones.length == 1) {
return selectedZones[0];
}
return `${selectedZones.length} Zones`;
}, [selectedZones]);
const trigger = ( const trigger = (
<Button <Button
size="sm" size="sm"
@ -415,11 +457,9 @@ function ZoneFilterButton({
className={`${selectedZones?.length ? "text-selected-foreground" : "text-secondary-foreground"}`} className={`${selectedZones?.length ? "text-selected-foreground" : "text-secondary-foreground"}`}
/> />
<div <div
className={`hidden md:block ${selectedZones?.length ? "text-selected-foreground" : "text-primary"}`} className={`${selectedZones?.length ? "text-selected-foreground" : "text-primary"}`}
> >
{selectedZones?.length {buttonText}
? `${selectedZones.length} Zone${selectedZones.length > 1 ? "s" : ""}`
: "All Zones"}
</div> </div>
</Button> </Button>
); );
@ -585,6 +625,22 @@ function SubFilterButton({
string[] | undefined string[] | undefined
>(selectedSubLabels); >(selectedSubLabels);
const buttonText = useMemo(() => {
if (isMobile) {
return "Sub Labels";
}
if (!selectedSubLabels || selectedSubLabels.length == 0) {
return "All Sub Labels";
}
if (selectedSubLabels.length == 1) {
return selectedSubLabels[0];
}
return `${selectedSubLabels.length} Sub Labels`;
}, [selectedSubLabels]);
const trigger = ( const trigger = (
<Button <Button
size="sm" size="sm"
@@ -595,11 +651,9 @@ function SubFilterButton({
        className={`${selectedSubLabels?.length || selectedSubLabels?.length ? "text-selected-foreground" : "text-secondary-foreground"}`}
      />
      <div
-        className={`hidden md:block ${selectedSubLabels?.length ? "text-selected-foreground" : "text-primary"}`}
+        className={`${selectedSubLabels?.length ? "text-selected-foreground" : "text-primary"}`}
      >
-        {selectedSubLabels?.length
-          ? `${selectedSubLabels.length} Sub Labels`
-          : "All Sub Labels"}
+        {buttonText}
      </div>
    </Button>
  );
@@ -745,17 +799,34 @@ export function SubFilterContent({
 }

 type SearchTypeButtonProps = {
-  selectedSearchSources: SearchSource[];
-  updateSearchSourceFilter: (sources: SearchSource[]) => void;
+  selectedSearchSources: SearchSource[] | undefined;
+  updateSearchSourceFilter: (sources: SearchSource[] | undefined) => void;
 };

 function SearchTypeButton({
   selectedSearchSources,
   updateSearchSourceFilter,
 }: SearchTypeButtonProps) {
   const [open, setOpen] = useState(false);
-  const [currentSearchSources, setCurrentSearchSources] = useState<
-    SearchSource[]
-  >(selectedSearchSources);
+
+  const buttonText = useMemo(() => {
+    if (isMobile) {
+      return "Sources";
+    }
+
+    if (
+      !selectedSearchSources ||
+      selectedSearchSources.length == 0 ||
+      selectedSearchSources.length == 2
+    ) {
+      return "All Search Sources";
+    }
+
+    if (selectedSearchSources.length == 1) {
+      return selectedSearchSources[0];
+    }
+
+    return `${selectedSearchSources.length} Search Sources`;
+  }, [selectedSearchSources]);

   const trigger = (
     <Button
@@ -763,23 +834,19 @@ function SearchTypeButton({
       variant={selectedSearchSources?.length != 2 ? "select" : "default"}
       className="flex items-center gap-2 capitalize"
     >
-      <FaFilter
+      <SearchSourceIcon
        className={`${selectedSearchSources?.length != 2 ? "text-selected-foreground" : "text-secondary-foreground"}`}
      />
      <div
-        className={`hidden md:block ${selectedSearchSources?.length != 2 ? "text-selected-foreground" : "text-primary"}`}
+        className={`${selectedSearchSources?.length != 2 ? "text-selected-foreground" : "text-primary"}`}
      >
-        {selectedSearchSources?.length != 2
-          ? `${selectedSearchSources[0]}`
-          : "All Search Sources"}
+        {buttonText}
      </div>
    </Button>
  );

  const content = (
    <SearchTypeContent
      selectedSearchSources={selectedSearchSources}
-      currentSearchSources={currentSearchSources}
-      setCurrentSearchSources={setCurrentSearchSources}
      updateSearchSourceFilter={updateSearchSourceFilter}
      onClose={() => setOpen(false)}
    />
@@ -790,10 +857,6 @@ function SearchTypeButton({
    <Drawer
      open={open}
      onOpenChange={(open) => {
-        if (!open) {
-          setCurrentSearchSources(selectedSearchSources);
-        }
-
        setOpen(open);
      }}
    >
@@ -809,10 +872,6 @@ function SearchTypeButton({
    <Popover
      open={open}
      onOpenChange={(open) => {
-        if (!open) {
-          setCurrentSearchSources(selectedSearchSources);
-        }
-
        setOpen(open);
      }}
    >
@@ -823,26 +882,26 @@ function SearchTypeButton({
 }

 type SearchTypeContentProps = {
-  selectedSearchSources: SearchSource[];
-  currentSearchSources: SearchSource[];
-  setCurrentSearchSources: (sources: SearchSource[]) => void;
-  updateSearchSourceFilter: (sources: SearchSource[]) => void;
+  selectedSearchSources: SearchSource[] | undefined;
+  updateSearchSourceFilter: (sources: SearchSource[] | undefined) => void;
   onClose: () => void;
 };

 export function SearchTypeContent({
   selectedSearchSources,
-  currentSearchSources,
-  setCurrentSearchSources,
   updateSearchSourceFilter,
   onClose,
 }: SearchTypeContentProps) {
+  const [currentSearchSources, setCurrentSearchSources] = useState<
+    SearchSource[] | undefined
+  >(selectedSearchSources);
+
   return (
     <>
       <div className="scrollbar-container h-auto max-h-[80dvh] overflow-y-auto overflow-x-hidden">
         <div className="my-2.5 flex flex-col gap-2.5">
           <FilterSwitch
             label="Thumbnail Image"
-            isChecked={currentSearchSources?.includes("thumbnail") ?? false}
+            isChecked={selectedSearchSources?.includes("thumbnail") ?? false}
             onCheckedChange={(isChecked) => {
               const updatedSources = currentSearchSources
                 ? [...currentSearchSources]
@@ -897,10 +956,8 @@ export function SearchTypeContent({
          </Button>
          <Button
            onClick={() => {
-              setCurrentSearchSources([
-                "thumbnail",
-                "description",
-              ] as SearchSource[]);
+              updateSearchSourceFilter(undefined);
+              setCurrentSearchSources(["thumbnail", "description"]);
            }}
          >
            Reset

View File

@@ -0,0 +1,26 @@
import { forwardRef } from "react";
import { cn } from "@/lib/utils";
import { FaImage } from "react-icons/fa";
import { LuText } from "react-icons/lu";
type SearchSourceIconProps = {
className?: string;
onClick?: () => void;
};
const SearchSourceIcon = forwardRef<HTMLDivElement, SearchSourceIconProps>(
({ className, onClick }, ref) => {
return (
<div
ref={ref}
className={cn("relative flex items-center", className)}
onClick={onClick}
>
<LuText className="absolute size-3 translate-x-3 translate-y-3/4" />
<FaImage className="size-5" />
</div>
);
},
);
export default SearchSourceIcon;

View File

@@ -1,6 +1,7 @@
 import { forwardRef } from "react";
 import { cn } from "@/lib/utils";
-import { FaCog, FaFilter } from "react-icons/fa";
+import { FaCog } from "react-icons/fa";
+import { MdLabelOutline } from "react-icons/md";

 type SubFilterIconProps = {
   className?: string;
@@ -15,8 +16,8 @@ const SubFilterIcon = forwardRef<HTMLDivElement, SubFilterIconProps>(
        className={cn("relative flex items-center", className)}
        onClick={onClick}
      >
-        <FaFilter className="size-full" />
-        <FaCog className="absolute size-3 translate-x-3 translate-y-3/4" />
+        <FaCog className="absolute size-3 translate-x-3 translate-y-[62%]" />
+        <MdLabelOutline className="size-5" />
      </div>
    );
  },

View File

@@ -338,12 +338,7 @@ function EventItem({
          <Chip
            className="cursor-pointer rounded-md bg-gray-500 bg-gradient-to-br from-gray-400 to-gray-500"
            onClick={() => {
-              const similaritySearchParams = new URLSearchParams({
-                search_type: "similarity",
-                event_id: event.id,
-              }).toString();
-              navigate(`/search?${similaritySearchParams}`);
+              navigate(`/explore?similarity_search_id=${event.id}`);
            }}
          >
            <FaImages className="size-4 text-white" />

View File

@@ -1,11 +1,4 @@
-import { isDesktop, isIOS } from "react-device-detect";
-import {
-  Sheet,
-  SheetContent,
-  SheetDescription,
-  SheetHeader,
-  SheetTitle,
-} from "../../ui/sheet";
+import { isDesktop, isIOS, isMobile } from "react-device-detect";
 import {
   Drawer,
   DrawerContent,
@@ -20,10 +13,32 @@ import { useFormattedTimestamp } from "@/hooks/use-date-utils";
 import { getIconForLabel } from "@/utils/iconUtil";
 import { useApiHost } from "@/api";
 import { Button } from "../../ui/button";
-import { useCallback, useEffect, useMemo, useState } from "react";
+import { useCallback, useEffect, useMemo, useRef, useState } from "react";
 import axios from "axios";
 import { toast } from "sonner";
 import { Textarea } from "../../ui/textarea";
+import { ScrollArea, ScrollBar } from "@/components/ui/scroll-area";
+import { ToggleGroup, ToggleGroupItem } from "@/components/ui/toggle-group";
+import useOptimisticState from "@/hooks/use-optimistic-state";
+import {
+  Dialog,
+  DialogContent,
+  DialogDescription,
+  DialogHeader,
+  DialogTitle,
+} from "@/components/ui/dialog";
+import { FrigatePlusDialog } from "../dialog/FrigatePlusDialog";
+import { Event } from "@/types/event";
+import HlsVideoPlayer from "@/components/player/HlsVideoPlayer";
+import { baseUrl } from "@/api/baseUrl";
+import { cn } from "@/lib/utils";
+import ActivityIndicator from "@/components/indicators/activity-indicator";
+import { ASPECT_VERTICAL_LAYOUT, ASPECT_WIDE_LAYOUT } from "@/types/record";
+import { FaRegListAlt, FaVideo } from "react-icons/fa";
+import FrigatePlusIcon from "@/components/icons/FrigatePlusIcon";
+
+const SEARCH_TABS = ["details", "frigate+", "video"] as const;
+type SearchTab = (typeof SEARCH_TABS)[number];

 type SearchDetailDialogProps = {
   search?: SearchResult;
@@ -39,6 +54,133 @@ export default function SearchDetailDialog({
    revalidateOnFocus: false,
  });
// tabs
const [page, setPage] = useState<SearchTab>("details");
const [pageToggle, setPageToggle] = useOptimisticState(page, setPage, 100);
const searchTabs = useMemo(() => {
if (!config || !search) {
return [];
}
const views = [...SEARCH_TABS];
if (!config.plus.enabled || !search.has_snapshot) {
const index = views.indexOf("frigate+");
views.splice(index, 1);
}
// TODO implement
//if (!config.semantic_search.enabled) {
// const index = views.indexOf("similar-calendar");
// views.splice(index, 1);
// }
return views;
}, [config, search]);
if (!search) {
return;
}
// content
const Overlay = isDesktop ? Dialog : Drawer;
const Content = isDesktop ? DialogContent : DrawerContent;
const Header = isDesktop ? DialogHeader : DrawerHeader;
const Title = isDesktop ? DialogTitle : DrawerTitle;
const Description = isDesktop ? DialogDescription : DrawerDescription;
return (
<Overlay
open={search != undefined}
onOpenChange={(open) => {
if (!open) {
setSearch(undefined);
}
}}
>
<Content
className={
isDesktop
? "sm:max-w-xl md:max-w-3xl lg:max-w-4xl xl:max-w-7xl"
: "max-h-[75dvh] overflow-hidden px-2 pb-4"
}
>
<Header className="sr-only">
<Title>Tracked Object Details</Title>
<Description>Tracked object details</Description>
</Header>
<ScrollArea
className={cn("w-full whitespace-nowrap", isMobile && "my-2")}
>
<div className="flex flex-row">
<ToggleGroup
className="*:rounded-md *:px-3 *:py-4"
type="single"
size="sm"
value={pageToggle}
onValueChange={(value: SearchTab) => {
if (value) {
setPageToggle(value);
}
}}
>
{Object.values(searchTabs).map((item) => (
<ToggleGroupItem
key={item}
className={`flex scroll-mx-10 items-center justify-between gap-2 ${page == "details" ? "last:mr-20" : ""} ${pageToggle == item ? "" : "*:text-muted-foreground"}`}
value={item}
data-nav-item={item}
aria-label={`Select ${item}`}
>
{item == "details" && <FaRegListAlt className="size-4" />}
{item == "frigate+" && <FrigatePlusIcon className="size-4" />}
{item == "video" && <FaVideo className="size-4" />}
<div className="capitalize">{item}</div>
</ToggleGroupItem>
))}
</ToggleGroup>
<ScrollBar orientation="horizontal" className="h-0" />
</div>
</ScrollArea>
{page == "details" && (
<ObjectDetailsTab
search={search}
config={config}
setSearch={setSearch}
setSimilarity={setSimilarity}
/>
)}
{page == "frigate+" && (
<FrigatePlusDialog
upload={search as unknown as Event}
dialog={false}
onClose={() => {}}
onEventUploaded={() => {
search.plus_id = "new_upload";
}}
/>
)}
{page == "video" && <VideoTab search={search} config={config} />}
</Content>
</Overlay>
);
}
type ObjectDetailsTabProps = {
search: SearchResult;
config?: FrigateConfig;
setSearch: (search: SearchResult | undefined) => void;
setSimilarity?: () => void;
};
function ObjectDetailsTab({
search,
config,
setSearch,
setSimilarity,
}: ObjectDetailsTabProps) {
  const apiHost = useApiHost();

  // data

@@ -77,8 +219,6 @@ export default function SearchDetailDialog({
      }
    }
  }, [search]);

-  // api
-
  const updateDescription = useCallback(() => {
    if (!search) {
      return;
@@ -101,33 +241,7 @@ export default function SearchDetailDialog({
    });
  }, [desc, search]);
-  // content
-
-  const Overlay = isDesktop ? Sheet : Drawer;
-  const Content = isDesktop ? SheetContent : DrawerContent;
-  const Header = isDesktop ? SheetHeader : DrawerHeader;
-  const Title = isDesktop ? SheetTitle : DrawerTitle;
-  const Description = isDesktop ? SheetDescription : DrawerDescription;
-
   return (
-    <Overlay
-      open={search != undefined}
-      onOpenChange={(open) => {
-        if (!open) {
-          setSearch(undefined);
-        }
-      }}
-    >
-      <Content
-        className={
-          isDesktop ? "sm:max-w-xl" : "max-h-[75dvh] overflow-hidden p-2 pb-4"
-        }
-      >
-        <Header className="sr-only">
-          <Title>Tracked Object Details</Title>
-          <Description>Tracked object details</Description>
-        </Header>
-        {search && (
     <div className="mt-3 flex size-full flex-col gap-5 md:mt-0">
       <div className="flex w-full flex-row">
         <div className="flex w-full flex-col gap-3">
@@ -198,8 +312,84 @@ export default function SearchDetailDialog({
         </div>
       </div>
     </div>
-      )}
-    </Content>
-  </Overlay>
+  );
+}
type VideoTabProps = {
search: SearchResult;
config?: FrigateConfig;
};
function VideoTab({ search, config }: VideoTabProps) {
const [isLoading, setIsLoading] = useState(true);
const videoRef = useRef<HTMLVideoElement | null>(null);
const endTime = useMemo(() => search.end_time ?? Date.now() / 1000, [search]);
const mainCameraAspect = useMemo(() => {
const camera = config?.cameras?.[search.camera];
if (!camera) {
return "normal";
}
const aspectRatio = camera.detect.width / camera.detect.height;
if (!aspectRatio) {
return "normal";
} else if (aspectRatio > ASPECT_WIDE_LAYOUT) {
return "wide";
} else if (aspectRatio < ASPECT_VERTICAL_LAYOUT) {
return "tall";
} else {
return "normal";
}
}, [config, search]);
const containerClassName = useMemo(() => {
if (mainCameraAspect == "wide") {
return "flex justify-center items-center";
} else if (mainCameraAspect == "tall") {
if (isDesktop) {
return "size-full flex flex-col justify-center items-center";
} else {
return "size-full";
}
} else {
return "";
}
}, [mainCameraAspect]);
const videoClassName = useMemo(() => {
if (mainCameraAspect == "wide") {
return "w-full aspect-wide";
} else if (mainCameraAspect == "tall") {
if (isDesktop) {
return "w-[50%] aspect-tall flex justify-center";
} else {
return "size-full";
}
} else {
return "w-full aspect-video";
}
}, [mainCameraAspect]);
return (
<div className={`aspect-video ${containerClassName}`}>
{isLoading && (
<ActivityIndicator className="absolute left-1/2 top-1/2 -translate-x-1/2 -translate-y-1/2" />
)}
<div className={videoClassName}>
<HlsVideoPlayer
videoRef={videoRef}
currentSource={`${baseUrl}vod/${search.camera}/start/${search.start_time}/end/${endTime}/index.m3u8`}
hotKeys
visible
frigateControls={false}
fullscreen={false}
supportsFullscreen={false}
onPlaying={() => setIsLoading(false)}
/>
</div>
</div>
  );
}
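The new `VideoTab` above picks its container and player classes from the camera's detect aspect ratio alone. A self-contained sketch of that classification follows; the threshold constants here are assumptions for illustration, since the real `ASPECT_WIDE_LAYOUT` / `ASPECT_VERTICAL_LAYOUT` values are exported from `@/types/record`:

```typescript
// Assumed threshold values; the actual constants live in "@/types/record".
const ASPECT_WIDE_LAYOUT = 32 / 9;
const ASPECT_VERTICAL_LAYOUT = 9 / 16;

type CameraAspect = "wide" | "tall" | "normal";

// Mirrors the mainCameraAspect useMemo: classify a camera by the ratio of its
// detect resolution, falling back to "normal" for degenerate values.
function classifyAspect(width: number, height: number): CameraAspect {
  const aspectRatio = width / height;

  if (!aspectRatio) {
    return "normal";
  } else if (aspectRatio > ASPECT_WIDE_LAYOUT) {
    return "wide";
  } else if (aspectRatio < ASPECT_VERTICAL_LAYOUT) {
    return "tall";
  }

  return "normal";
}
```

In the component, "wide" cameras get a full-width `aspect-wide` player, "tall" ones a half-width `aspect-tall` player on desktop, and everything else the default `aspect-video` box.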

View File

@@ -1,4 +1,5 @@
 import { baseUrl } from "@/api/baseUrl";
+import ActivityIndicator from "@/components/indicators/activity-indicator";
 import { Button } from "@/components/ui/button";
 import {
   Dialog,
@@ -11,17 +12,21 @@
 import { Event } from "@/types/event";
 import { FrigateConfig } from "@/types/frigateConfig";
 import axios from "axios";
-import { useCallback, useMemo } from "react";
+import { useCallback, useMemo, useState } from "react";
 import { TransformWrapper, TransformComponent } from "react-zoom-pan-pinch";
 import useSWR from "swr";

+type SubmissionState = "reviewing" | "uploading" | "submitted";
+
 type FrigatePlusDialogProps = {
   upload?: Event;
+  dialog?: boolean;
   onClose: () => void;
   onEventUploaded: () => void;
 };

 export function FrigatePlusDialog({
   upload,
+  dialog = true,
   onClose,
   onEventUploaded,
 }: FrigatePlusDialogProps) {
@@ -49,6 +54,10 @@ export function FrigatePlusDialog({

  // upload

+  const [state, setState] = useState<SubmissionState>(
+    upload?.plus_id ? "submitted" : "reviewing",
+  );
+
  const onSubmitToPlus = useCallback(
    async (falsePositive: boolean) => {
      if (!upload) {
@@ -61,18 +70,14 @@
        include_annotation: 1,
      });

+      setState("submitted");
      onEventUploaded();
      onClose();
    },
    [upload, onClose, onEventUploaded],
  );
-  return (
-    <Dialog
-      open={upload != undefined}
-      onOpenChange={(open) => (!open ? onClose() : null)}
-    >
-      <DialogContent className="md:max-w-3xl lg:max-w-4xl xl:max-w-7xl">
+  const content = (
     <TransformWrapper minScale={1.0} wheel={{ smoothStep: 0.005 }}>
       <DialogHeader>
         <DialogTitle>Submit To Frigate+</DialogTitle>
@@ -100,24 +105,49 @@
           />
         )}
       </TransformComponent>
       <DialogFooter>
-        <Button onClick={onClose}>Cancel</Button>
-        <Button
-          className="bg-success"
-          onClick={() => onSubmitToPlus(false)}
-        >
-          This is a {upload?.label}
-        </Button>
-        <Button
-          className="text-white"
-          variant="destructive"
-          onClick={() => onSubmitToPlus(true)}
-        >
-          This is not a {upload?.label}
-        </Button>
+        {state == "reviewing" && (
+          <>
+            {dialog && <Button onClick={onClose}>Cancel</Button>}
+            <Button
+              className="bg-success"
+              onClick={() => {
+                setState("uploading");
+                onSubmitToPlus(false);
+              }}
+            >
+              This is a {upload?.label}
+            </Button>
+            <Button
+              className="text-white"
+              variant="destructive"
+              onClick={() => {
+                setState("uploading");
+                onSubmitToPlus(true);
+              }}
+            >
+              This is not a {upload?.label}
+            </Button>
+          </>
+        )}
+        {state == "uploading" && <ActivityIndicator />}
       </DialogFooter>
     </TransformWrapper>
+  );
+  if (dialog) {
+    return (
+      <Dialog
+        open={upload != undefined}
+        onOpenChange={(open) => (!open ? onClose() : null)}
+      >
+        <DialogContent className="md:max-w-3xl lg:max-w-4xl xl:max-w-7xl">
+          {content}
         </DialogContent>
       </Dialog>
     );
   }
+
+  return content;
+}

View File

@@ -35,6 +35,7 @@ type HlsVideoPlayerProps = {
  hotKeys: boolean;
  supportsFullscreen: boolean;
  fullscreen: boolean;
+  frigateControls?: boolean;
  onClipEnded?: () => void;
  onPlayerLoaded?: () => void;
  onTimeUpdate?: (time: number) => void;
@@ -52,6 +53,7 @@ export default function HlsVideoPlayer({
  hotKeys,
  supportsFullscreen,
  fullscreen,
+  frigateControls = true,
  onClipEnded,
  onPlayerLoaded,
  onTimeUpdate,
@@ -167,6 +169,7 @@ export default function HlsVideoPlayer({
  return (
    <TransformWrapper minScale={1.0} wheel={{ smoothStep: 0.005 }}>
+      {frigateControls && (
      <VideoControls
        className={cn(
          "absolute left-1/2 z-50 -translate-x-1/2",
@@ -234,6 +237,7 @@
        toggleFullscreen={toggleFullscreen}
        containerRef={containerRef}
      />
+      )}
      <TransformComponent
        wrapperStyle={{
          display: visible ? undefined : "none",
@@ -253,7 +257,7 @@
          className={`size-full rounded-lg bg-black md:rounded-2xl ${loadedMetadata ? "" : "invisible"}`}
          preload="auto"
          autoPlay
-          controls={false}
+          controls={!frigateControls}
          playsInline
          muted={muted}
          onVolumeChange={() =>

View File

@@ -19,6 +19,7 @@ import { capitalizeFirstLetter } from "@/utils/stringUtil";
 import { cn } from "@/lib/utils";
 import { TbExclamationCircle } from "react-icons/tb";
 import { TooltipPortal } from "@radix-ui/react-tooltip";
+import { baseUrl } from "@/api/baseUrl";

 type LivePlayerProps = {
   cameraRef?: (ref: HTMLDivElement | null) => void;
@@ -224,6 +225,9 @@ export default function LivePlayer({
        className,
      )}
      onClick={onClick}
+      onAuxClick={() =>
+        window.open(`${baseUrl}#${cameraConfig.name}`, "_blank")?.focus()
+      }
    >
      {((showStillWithoutActivity && !liveReady) || liveReady) && (
        <>

View File

@@ -20,6 +20,7 @@ import { capitalizeFirstLetter } from "@/utils/stringUtil";
 import { cn } from "@/lib/utils";
 import { InProgressPreview, VideoPreview } from "../preview/ScrubbablePreview";
 import { Preview } from "@/types/preview";
+import { baseUrl } from "@/api/baseUrl";

 type PreviewPlayerProps = {
   review: ReviewSegment;
@@ -175,6 +176,9 @@ export default function PreviewThumbnailPlayer({
      onMouseOver={isMobile ? undefined : () => setIsHovered(true)}
      onMouseLeave={isMobile ? undefined : () => setIsHovered(false)}
      onClick={handleOnClick}
+      onAuxClick={() =>
+        window.open(`${baseUrl}review?id=${review.id}`, "_blank")?.focus()
+      }
      {...swipeHandlers}
    >
      {playingBack && (

View File

@@ -1,308 +0,0 @@
import React, { useCallback, useEffect, useMemo, useState } from "react";
import { useApiHost } from "@/api";
import { isCurrentHour } from "@/utils/dateUtil";
import { getIconForLabel } from "@/utils/iconUtil";
import TimeAgo from "../dynamic/TimeAgo";
import useSWR from "swr";
import { FrigateConfig } from "@/types/frigateConfig";
import { isIOS, isMobile, isSafari } from "react-device-detect";
import Chip from "@/components/indicators/Chip";
import { useFormattedTimestamp } from "@/hooks/use-date-utils";
import useImageLoaded from "@/hooks/use-image-loaded";
import { useSwipeable } from "react-swipeable";
import { Tooltip, TooltipContent, TooltipTrigger } from "../ui/tooltip";
import ImageLoadingIndicator from "../indicators/ImageLoadingIndicator";
import ActivityIndicator from "../indicators/activity-indicator";
import { capitalizeFirstLetter } from "@/utils/stringUtil";
import { InProgressPreview, VideoPreview } from "../preview/ScrubbablePreview";
import { Preview } from "@/types/preview";
import { SearchResult } from "@/types/search";
import useContextMenu from "@/hooks/use-contextmenu";
import { cn } from "@/lib/utils";
type SearchPlayerProps = {
searchResult: SearchResult;
allPreviews?: Preview[];
scrollLock?: boolean;
onClick: (searchResult: SearchResult, detail: boolean) => void;
};
export default function SearchThumbnailPlayer({
searchResult,
allPreviews,
scrollLock = false,
onClick,
}: SearchPlayerProps) {
const apiHost = useApiHost();
const { data: config } = useSWR<FrigateConfig>("config");
const [imgRef, imgLoaded, onImgLoad] = useImageLoaded();
// interaction
const [ignoreClick, setIgnoreClick] = useState(false);
const handleOnClick = useCallback(
(e: React.MouseEvent<HTMLDivElement>) => {
if (!ignoreClick) {
onClick(searchResult, e.metaKey);
}
},
[ignoreClick, searchResult, onClick],
);
const swipeHandlers = useSwipeable({
onSwipedLeft: () => setPlayback(false),
onSwipedRight: () => setPlayback(true),
preventScrollOnSwipe: true,
});
useContextMenu(imgRef, () => {
onClick(searchResult, true);
});
// playback
const relevantPreview = useMemo(() => {
if (!allPreviews) {
return undefined;
}
let multiHour = false;
const firstIndex = Object.values(allPreviews).findIndex((preview) => {
if (
preview.camera != searchResult.camera ||
preview.end < searchResult.start_time
) {
return false;
}
if ((searchResult.end_time ?? Date.now() / 1000) > preview.end) {
multiHour = true;
}
return true;
});
if (firstIndex == -1) {
return undefined;
}
if (!multiHour) {
return allPreviews[firstIndex];
}
const firstPrev = allPreviews[firstIndex];
const firstDuration = firstPrev.end - searchResult.start_time;
const secondDuration =
(searchResult.end_time ?? Date.now() / 1000) - firstPrev.end;
if (firstDuration > secondDuration) {
// the first preview is longer than the second, return the first
return firstPrev;
} else {
// the second preview is longer, return the second if it exists
if (firstIndex < allPreviews.length - 1) {
return allPreviews.find(
(preview, idx) =>
idx > firstIndex && preview.camera == searchResult.camera,
);
}
return undefined;
}
}, [allPreviews, searchResult]);
// Hover Playback
const [hoverTimeout, setHoverTimeout] = useState<NodeJS.Timeout | null>();
const [playback, setPlayback] = useState(false);
const [tooltipHovering, setTooltipHovering] = useState(false);
const playingBack = useMemo(
() => playback && !tooltipHovering,
[playback, tooltipHovering],
);
const [isHovered, setIsHovered] = useState(false);
useEffect(() => {
if (isHovered && scrollLock) {
return;
}
if (isHovered && !tooltipHovering) {
setHoverTimeout(
setTimeout(() => {
setPlayback(true);
setHoverTimeout(null);
}, 500),
);
} else {
if (hoverTimeout) {
clearTimeout(hoverTimeout);
}
setPlayback(false);
}
// we know that these deps are correct
// eslint-disable-next-line react-hooks/exhaustive-deps
}, [isHovered, scrollLock, tooltipHovering]);
// date
const formattedDate = useFormattedTimestamp(
searchResult.start_time,
config?.ui.time_format == "24hour" ? "%b %-d, %H:%M" : "%b %-d, %I:%M %p",
);
return (
<div
className="relative size-full cursor-pointer"
onMouseOver={isMobile ? undefined : () => setIsHovered(true)}
onMouseLeave={isMobile ? undefined : () => setIsHovered(false)}
onClick={handleOnClick}
{...swipeHandlers}
>
{playingBack && (
<div className="absolute inset-0 animate-in fade-in">
<PreviewContent
searchResult={searchResult}
relevantPreview={relevantPreview}
setIgnoreClick={setIgnoreClick}
isPlayingBack={setPlayback}
/>
</div>
)}
<ImageLoadingIndicator
className="absolute inset-0"
imgLoaded={imgLoaded}
/>
<div className={`${imgLoaded ? "visible" : "invisible"}`}>
<img
ref={imgRef}
className={cn(
"size-full select-none transition-opacity",
playingBack ? "opacity-0" : "opacity-100",
searchResult.search_source == "thumbnail" && "object-contain",
)}
style={
isIOS
? {
WebkitUserSelect: "none",
WebkitTouchCallout: "none",
}
: undefined
}
draggable={false}
src={`${apiHost}api/events/${searchResult.id}/thumbnail.jpg`}
loading={isSafari ? "eager" : "lazy"}
onLoad={() => {
onImgLoad();
}}
/>
<div className="absolute left-0 top-2 z-40">
<Tooltip>
<div
className="flex"
onMouseEnter={() => setTooltipHovering(true)}
onMouseLeave={() => setTooltipHovering(false)}
>
<TooltipTrigger asChild>
<div className="mx-3 pb-1 text-sm text-white">
{
<>
<Chip
className={`flex items-start justify-between space-x-1 ${playingBack ? "hidden" : ""} "bg-gray-500 z-0 bg-gradient-to-br from-gray-400 to-gray-500`}
onClick={() => onClick(searchResult, true)}
>
{getIconForLabel(
searchResult.label,
"size-3 text-white",
)}
</Chip>
</>
}
</div>
</TooltipTrigger>
</div>
<TooltipContent className="capitalize">
{[...new Set([searchResult.label])]
.filter(
(item) => item !== undefined && !item.includes("-verified"),
)
.map((text) => capitalizeFirstLetter(text))
.sort()
.join(", ")
.replaceAll("-verified", "")}
</TooltipContent>
</Tooltip>
</div>
{!playingBack && (
<>
<div className="rounded-t-l pointer-events-none absolute inset-x-0 top-0 z-10 h-[30%] w-full bg-gradient-to-b from-black/60 to-transparent"></div>
<div className="rounded-b-l pointer-events-none absolute inset-x-0 bottom-0 z-10 h-[20%] w-full bg-gradient-to-t from-black/60 to-transparent">
<div className="mx-3 flex h-full items-end justify-between pb-1 text-sm text-white">
{searchResult.end_time ? (
<TimeAgo time={searchResult.start_time * 1000} dense />
) : (
<div>
<ActivityIndicator size={24} />
</div>
)}
{formattedDate}
</div>
</div>
</>
)}
</div>
</div>
);
}
type PreviewContentProps = {
searchResult: SearchResult;
relevantPreview: Preview | undefined;
setIgnoreClick: (ignore: boolean) => void;
isPlayingBack: (ended: boolean) => void;
onTimeUpdate?: (time: number | undefined) => void;
};
function PreviewContent({
searchResult,
relevantPreview,
setIgnoreClick,
isPlayingBack,
onTimeUpdate,
}: PreviewContentProps) {
// preview
const now = useMemo(() => Date.now() / 1000, []);
if (relevantPreview) {
return (
<VideoPreview
relevantPreview={relevantPreview}
startTime={searchResult.start_time}
endTime={searchResult.end_time}
setIgnoreClick={setIgnoreClick}
isPlayingBack={isPlayingBack}
onTimeUpdate={onTimeUpdate}
windowVisible={true}
setReviewed={() => {}}
/>
);
} else if (isCurrentHour(searchResult.start_time)) {
return (
<InProgressPreview
camera={searchResult.camera}
startTime={searchResult.start_time}
endTime={searchResult.end_time}
timeRange={{
before: now,
after: searchResult.start_time,
}}
setIgnoreClick={setIgnoreClick}
isPlayingBack={isPlayingBack}
onTimeUpdate={onTimeUpdate}
windowVisible={true}
setReviewed={() => {}}
/>
);
}
}
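The `relevantPreview` computation in the file deleted above (the same heuristic survives in `PreviewThumbnailPlayer`) reduces to: find the first preview for the camera that ends after the event starts, and if the event spans two hourly preview files, keep whichever covers more of it. A self-contained sketch, using a simplified stand-in for the `Preview` type (`PreviewSlice` and `pickPreview` are illustrative names, not identifiers from the codebase):

```typescript
// Simplified stand-in for the Preview type; the real one also carries a src
// and a start time.
type PreviewSlice = { camera: string; end: number };

// Pick the preview covering the larger share of [startTime, endTime] when the
// event spans two preview files, mirroring the relevantPreview useMemo above.
function pickPreview(
  previews: PreviewSlice[],
  camera: string,
  startTime: number,
  endTime: number,
): PreviewSlice | undefined {
  let multiHour = false;
  const firstIndex = previews.findIndex((preview) => {
    if (preview.camera != camera || preview.end < startTime) {
      return false;
    }

    if (endTime > preview.end) {
      multiHour = true;
    }

    return true;
  });

  if (firstIndex == -1) {
    return undefined;
  }

  if (!multiHour) {
    return previews[firstIndex];
  }

  const firstDuration = previews[firstIndex].end - startTime;
  const secondDuration = endTime - previews[firstIndex].end;

  if (firstDuration > secondDuration) {
    // the first preview covers more of the event
    return previews[firstIndex];
  }

  // otherwise fall through to the next preview for the camera, if any
  return previews.find(
    (preview, idx) => idx > firstIndex && preview.camera == camera,
  );
}
```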

View File

@@ -1,28 +1,20 @@
-import Logo from "@/components/Logo";
 import { ENV } from "@/env";
-import { FrigateConfig } from "@/types/frigateConfig";
 import { NavData } from "@/types/navigation";
 import { useMemo } from "react";
 import { FaCompactDisc, FaVideo } from "react-icons/fa";
 import { IoSearch } from "react-icons/io5";
 import { LuConstruction } from "react-icons/lu";
 import { MdVideoLibrary } from "react-icons/md";
-import useSWR from "swr";

 export const ID_LIVE = 1;
 export const ID_REVIEW = 2;
-export const ID_SEARCH = 3;
+export const ID_EXPLORE = 3;
 export const ID_EXPORT = 4;
-export const ID_PLUS = 5;
-export const ID_PLAYGROUND = 6;
+export const ID_PLAYGROUND = 5;

 export default function useNavigation(
   variant: "primary" | "secondary" = "primary",
 ) {
-  const { data: config } = useSWR<FrigateConfig>("config", {
-    revalidateOnFocus: false,
-  });
-
   return useMemo(
     () =>
       [
@ -41,11 +33,11 @@ export default function useNavigation(
url: "/review", url: "/review",
}, },
{ {
id: ID_SEARCH, id: ID_EXPLORE,
variant, variant,
icon: IoSearch, icon: IoSearch,
title: "Search", title: "Explore",
url: "/search", url: "/explore",
}, },
{ {
id: ID_EXPORT, id: ID_EXPORT,
@ -54,14 +46,6 @@ export default function useNavigation(
title: "Export", title: "Export",
url: "/export", url: "/export",
}, },
{
id: ID_PLUS,
variant,
icon: Logo,
title: "Frigate+",
url: "/plus",
enabled: config?.plus?.enabled == true,
},
{ {
id: ID_PLAYGROUND, id: ID_PLAYGROUND,
variant, variant,
@ -71,6 +55,6 @@ export default function useNavigation(
enabled: ENV !== "production", enabled: ENV !== "production",
}, },
] as NavData[], ] as NavData[],
[config?.plus.enabled, variant], [variant],
); );
} }

View File

@@ -1,20 +1,16 @@
 import { useApiFilterArgs } from "@/hooks/use-api-filter";
 import { useCameraPreviews } from "@/hooks/use-camera-previews";
-import { useOverlayState } from "@/hooks/use-overlay-state";
+import { useOverlayState, useSearchEffect } from "@/hooks/use-overlay-state";
 import { FrigateConfig } from "@/types/frigateConfig";
 import { RecordingStartingPoint } from "@/types/record";
-import {
-  PartialSearchResult,
-  SearchFilter,
-  SearchResult,
-} from "@/types/search";
+import { SearchFilter, SearchResult } from "@/types/search";
 import { TimeRange } from "@/types/timeline";
 import { RecordingView } from "@/views/recording/RecordingView";
 import SearchView from "@/views/search/SearchView";
 import { useCallback, useEffect, useMemo, useState } from "react";
 import useSWR from "swr";
 
-export default function Search() {
+export default function Explore() {
   const { data: config } = useSWR<FrigateConfig>("config", {
     revalidateOnFocus: false,
   });
@@ -30,45 +26,26 @@ export default function Search() {
   // search filter
 
+  const similaritySearch = useMemo(() => {
+    if (!searchTerm.includes("similarity:")) {
+      return undefined;
+    }
+
+    return searchTerm.split(":")[1];
+  }, [searchTerm]);
+
   const [searchFilter, setSearchFilter, searchSearchParams] =
     useApiFilterArgs<SearchFilter>();
 
-  const onUpdateFilter = useCallback(
-    (newFilter: SearchFilter) => {
-      setSearchFilter(newFilter);
-    },
-    [setSearchFilter],
-  );
-
   // search api
 
-  const [similaritySearch, setSimilaritySearch] =
-    useState<PartialSearchResult>();
-
-  useEffect(() => {
-    if (
-      config?.semantic_search.enabled &&
-      searchSearchParams["search_type"] == "similarity" &&
-      searchSearchParams["event_id"]?.length != 0 &&
-      searchFilter
-    ) {
-      setSimilaritySearch({
-        id: searchSearchParams["event_id"],
-      });
-      // remove event id from url params
-      const { event_id: _event_id, ...newFilter } = searchFilter;
-      setSearchFilter(newFilter);
-    }
-    // only run similarity search with event_id in the url when coming from review
-    // eslint-disable-next-line react-hooks/exhaustive-deps
-  }, []);
+  useSearchEffect("similarity_search_id", (similarityId) => {
+    setSearch(`similarity:${similarityId}`);
+    // @ts-expect-error we want to clear this
+    setSearchFilter({ ...searchFilter, similarity_search_id: undefined });
+  });
 
   useEffect(() => {
-    if (similaritySearch) {
-      setSimilaritySearch(undefined);
-    }
-
     if (searchTimeout) {
       clearTimeout(searchTimeout);
     }
@@ -88,7 +65,7 @@ export default function Search() {
       return [
         "events/search",
         {
-          query: similaritySearch.id,
+          query: similaritySearch,
           cameras: searchSearchParams["cameras"],
           labels: searchSearchParams["labels"],
           sub_labels: searchSearchParams["subLabels"],
@@ -118,6 +95,7 @@ export default function Search() {
       ];
     }
 
+    if (searchSearchParams && Object.keys(searchSearchParams).length !== 0) {
       return [
         "events",
         {
@@ -133,6 +111,9 @@ export default function Search() {
           include_thumbnails: 0,
         },
       ];
+    }
+
+    return null;
   }, [searchTerm, searchSearchParams, similaritySearch]);
 
   const { data: searchResults, isLoading } =
@@ -219,7 +200,7 @@ export default function Search() {
         allCameras={selectedReviewData.allCameras}
         allPreviews={allPreviews}
         timeRange={selectedTimeRange}
-        updateFilter={onUpdateFilter}
+        updateFilter={setSearchFilter}
       />
     );
   }
@@ -230,12 +211,10 @@ export default function Search() {
       searchTerm={searchTerm}
      searchFilter={searchFilter}
       searchResults={searchResults}
-      allPreviews={allPreviews}
       isLoading={isLoading}
       setSearch={setSearch}
-      similaritySearch={similaritySearch}
-      setSimilaritySearch={setSimilaritySearch}
-      onUpdateFilter={onUpdateFilter}
+      setSimilaritySearch={(search) => setSearch(`similarity:${search.id}`)}
+      onUpdateFilter={setSearchFilter}
       onOpenSearch={onOpenSearch}
     />
   );

View File

@@ -1,636 +0,0 @@
import { baseUrl } from "@/api/baseUrl";
import { CamerasFilterButton } from "@/components/filter/CamerasFilterButton";
import { GeneralFilterContent } from "@/components/filter/ReviewFilterGroup";
import Chip from "@/components/indicators/Chip";
import ActivityIndicator from "@/components/indicators/activity-indicator";
import { FrigatePlusDialog } from "@/components/overlay/dialog/FrigatePlusDialog";
import { Button } from "@/components/ui/button";
import { Drawer, DrawerContent, DrawerTrigger } from "@/components/ui/drawer";
import {
DropdownMenu,
DropdownMenuContent,
DropdownMenuSeparator,
DropdownMenuTrigger,
} from "@/components/ui/dropdown-menu";
import { Input } from "@/components/ui/input";
import { Label } from "@/components/ui/label";
import { RadioGroup, RadioGroupItem } from "@/components/ui/radio-group";
import { DualThumbSlider } from "@/components/ui/slider";
import {
Tooltip,
TooltipContent,
TooltipTrigger,
} from "@/components/ui/tooltip";
import { Event } from "@/types/event";
import { ATTRIBUTE_LABELS, FrigateConfig } from "@/types/frigateConfig";
import { getIconForLabel } from "@/utils/iconUtil";
import { capitalizeFirstLetter } from "@/utils/stringUtil";
import axios from "axios";
import { useCallback, useEffect, useMemo, useRef, useState } from "react";
import { isMobile } from "react-device-detect";
import {
FaList,
FaSort,
FaSortAmountDown,
FaSortAmountUp,
} from "react-icons/fa";
import { LuFolderX } from "react-icons/lu";
import { PiSlidersHorizontalFill } from "react-icons/pi";
import useSWR from "swr";
import useSWRInfinite from "swr/infinite";
const API_LIMIT = 100;
export default function SubmitPlus() {
// title
useEffect(() => {
document.title = "Plus - Frigate";
}, []);
// filters
const [selectedCameras, setSelectedCameras] = useState<string[]>();
const [selectedLabels, setSelectedLabels] = useState<string[]>();
const [scoreRange, setScoreRange] = useState<number[]>();
// sort
const [sort, setSort] = useState<string>();
// data
const eventFetcher = useCallback((key: string) => {
const [path, params] = Array.isArray(key) ? key : [key, undefined];
return axios.get(path, { params }).then((res) => res.data);
}, []);
const getKey = useCallback(
(index: number, prevData: Event[]) => {
if (index > 0) {
const lastDate = prevData[prevData.length - 1].start_time;
return [
"events",
{
limit: API_LIMIT,
in_progress: 0,
is_submitted: 0,
has_snapshot: 1,
cameras: selectedCameras ? selectedCameras.join(",") : null,
labels: selectedLabels ? selectedLabels.join(",") : null,
min_score: scoreRange ? scoreRange[0] : null,
max_score: scoreRange ? scoreRange[1] : null,
sort: sort ? sort : null,
before: lastDate,
},
];
}
return [
"events",
{
limit: 100,
in_progress: 0,
is_submitted: 0,
has_snapshot: 1,
cameras: selectedCameras ? selectedCameras.join(",") : null,
labels: selectedLabels ? selectedLabels.join(",") : null,
min_score: scoreRange ? scoreRange[0] : null,
max_score: scoreRange ? scoreRange[1] : null,
sort: sort ? sort : null,
},
];
},
[scoreRange, selectedCameras, selectedLabels, sort],
);
const {
data: eventPages,
mutate: refresh,
size,
setSize,
isValidating,
} = useSWRInfinite<Event[]>(getKey, eventFetcher, {
revalidateOnFocus: false,
});
const events = useMemo(
() => (eventPages ? eventPages.flat() : []),
[eventPages],
);
const [upload, setUpload] = useState<Event>();
// paging
const isDone = useMemo(
() => (eventPages?.at(-1)?.length ?? 0) < API_LIMIT,
[eventPages],
);
const pagingObserver = useRef<IntersectionObserver | null>();
const lastEventRef = useCallback(
(node: HTMLElement | null) => {
if (isValidating) return;
if (pagingObserver.current) pagingObserver.current.disconnect();
try {
pagingObserver.current = new IntersectionObserver((entries) => {
if (entries[0].isIntersecting && !isDone) {
setSize(size + 1);
}
});
if (node) pagingObserver.current.observe(node);
} catch (e) {
// no op
}
},
[isValidating, isDone, size, setSize],
);
return (
<div className="flex size-full flex-col">
<div className="scrollbar-container flex h-16 w-full items-center justify-between overflow-x-auto px-2">
<PlusFilterGroup
selectedCameras={selectedCameras}
selectedLabels={selectedLabels}
selectedScoreRange={scoreRange}
setSelectedCameras={setSelectedCameras}
setSelectedLabels={setSelectedLabels}
setSelectedScoreRange={setScoreRange}
/>
<PlusSortSelector selectedSort={sort} setSelectedSort={setSort} />
</div>
<div className="no-scrollbar flex size-full flex-1 flex-wrap content-start gap-2 overflow-y-auto md:gap-4">
{!events?.length ? (
<>
{isValidating ? (
<ActivityIndicator className="absolute left-1/2 top-1/2 -translate-x-1/2 -translate-y-1/2" />
) : (
<div className="absolute left-1/2 top-1/2 flex -translate-x-1/2 -translate-y-1/2 flex-col items-center justify-center text-center">
<LuFolderX className="size-16" />
No snapshots found
</div>
)}
</>
) : (
<>
<div className="grid w-full gap-2 p-2 sm:grid-cols-2 md:grid-cols-3 lg:grid-cols-4 xl:grid-cols-5">
<FrigatePlusDialog
upload={upload}
onClose={() => setUpload(undefined)}
onEventUploaded={() => {
refresh(
(data: Event[][] | undefined) => {
if (!data || !upload) {
return data;
}
let pageIndex = -1;
let index = -1;
data.forEach((page, pIdx) => {
const search = page.findIndex((e) => e.id == upload.id);
if (search != -1) {
pageIndex = pIdx;
index = search;
}
});
if (index == -1) {
return data;
}
return [
...data.slice(0, pageIndex),
[
...data[pageIndex].slice(0, index),
{ ...data[pageIndex][index], plus_id: "new_upload" },
...data[pageIndex].slice(index + 1),
],
...data.slice(pageIndex + 1),
];
},
{ revalidate: false, populateCache: true },
);
}}
/>
{events?.map((event) => {
if (event.data.type != "object" || event.plus_id) {
return;
}
return (
<div
key={event.id}
className="relative flex aspect-video w-full cursor-pointer items-center justify-center rounded-lg bg-black md:rounded-2xl"
onClick={() => setUpload(event)}
>
<div className="absolute left-0 top-2 z-40">
<Tooltip>
<div className="flex">
<TooltipTrigger asChild>
<div className="mx-3 pb-1 text-sm text-white">
<Chip
className={`z-0 flex items-center justify-between space-x-1 bg-gray-500 bg-gradient-to-br from-gray-400 to-gray-500`}
>
{[event.label].map((object) => {
return getIconForLabel(
object,
"size-3 text-white",
);
})}
<div className="text-xs">
{Math.round(event.data.score * 100)}%
</div>
</Chip>
</div>
</TooltipTrigger>
</div>
<TooltipContent className="capitalize">
{[event.label]
.map((text) => capitalizeFirstLetter(text))
.sort()
.join(", ")
.replaceAll("-verified", "")}
</TooltipContent>
</Tooltip>
</div>
<img
className="aspect-video h-full rounded-lg object-contain md:rounded-2xl"
src={`${baseUrl}api/events/${event.id}/snapshot.jpg`}
loading="lazy"
/>
</div>
);
})}
</div>
{!isDone && isValidating ? (
<div className="flex w-full items-center justify-center">
<ActivityIndicator />
</div>
) : (
<div ref={lastEventRef} />
)}
</>
)}
</div>
</div>
);
}
type PlusFilterGroupProps = {
selectedCameras: string[] | undefined;
selectedLabels: string[] | undefined;
selectedScoreRange: number[] | undefined;
setSelectedCameras: (cameras: string[] | undefined) => void;
setSelectedLabels: (cameras: string[] | undefined) => void;
setSelectedScoreRange: (range: number[] | undefined) => void;
};
function PlusFilterGroup({
selectedCameras,
selectedLabels,
selectedScoreRange,
setSelectedCameras,
setSelectedLabels,
setSelectedScoreRange,
}: PlusFilterGroupProps) {
const { data: config } = useSWR<FrigateConfig>("config");
const allCameras = useMemo(() => {
if (!config) {
return [];
}
return Object.keys(config.cameras);
}, [config]);
const allLabels = useMemo<string[]>(() => {
if (!config) {
return [];
}
const labels = new Set<string>();
const cameras = selectedCameras || Object.keys(config.cameras);
cameras.forEach((camera) => {
const cameraConfig = config.cameras[camera];
cameraConfig.objects.track.forEach((label) => {
if (!ATTRIBUTE_LABELS.includes(label)) {
labels.add(label);
}
});
});
return [...labels].sort();
}, [config, selectedCameras]);
const [open, setOpen] = useState<"none" | "camera" | "label" | "score">(
"none",
);
const [currentLabels, setCurrentLabels] = useState<string[] | undefined>(
undefined,
);
const [currentScoreRange, setCurrentScoreRange] = useState<
number[] | undefined
>(undefined);
const Menu = isMobile ? Drawer : DropdownMenu;
const Trigger = isMobile ? DrawerTrigger : DropdownMenuTrigger;
const Content = isMobile ? DrawerContent : DropdownMenuContent;
return (
<div className="flex h-full items-center justify-start gap-2">
<CamerasFilterButton
allCameras={allCameras}
groups={[]}
selectedCameras={selectedCameras}
updateCameraFilter={setSelectedCameras}
/>
<Menu
open={open == "label"}
onOpenChange={(open) => {
if (!open) {
setCurrentLabels(selectedLabels);
}
setOpen(open ? "label" : "none");
}}
>
<Trigger asChild>
<Button
className="flex items-center gap-2 capitalize"
size="sm"
variant={selectedLabels == undefined ? "default" : "select"}
>
<FaList
className={`${selectedLabels == undefined ? "text-secondary-foreground" : "text-selected-foreground"}`}
/>
<div className="hidden text-primary md:block">
{selectedLabels == undefined
? "All Labels"
: `${selectedLabels.length} Labels`}
</div>
</Button>
</Trigger>
<Content className={isMobile ? "max-h-[75dvh]" : ""}>
<GeneralFilterContent
allLabels={allLabels}
selectedLabels={selectedLabels}
currentLabels={currentLabels}
setCurrentLabels={setCurrentLabels}
updateLabelFilter={setSelectedLabels}
onClose={() => setOpen("none")}
/>
</Content>
</Menu>
<Menu
open={open == "score"}
onOpenChange={(open) => {
setOpen(open ? "score" : "none");
}}
>
<Trigger asChild>
<Button
className="flex items-center gap-2 capitalize"
size="sm"
variant={selectedScoreRange == undefined ? "default" : "select"}
>
<PiSlidersHorizontalFill
className={`${selectedScoreRange == undefined ? "text-secondary-foreground" : "text-selected-foreground"}`}
/>
<div className="hidden text-primary md:block">
{selectedScoreRange == undefined
? "Score Range"
: `${Math.round(selectedScoreRange[0] * 100)}% - ${Math.round(selectedScoreRange[1] * 100)}%`}
</div>
</Button>
</Trigger>
<Content
className={`flex min-w-80 flex-col justify-center p-2 ${isMobile ? "gap-2 *:max-h-[75dvh]" : ""}`}
>
<div className="flex items-center gap-1">
<Input
className="w-12"
inputMode="numeric"
value={Math.round((currentScoreRange?.at(0) ?? 0.5) * 100)}
onChange={(e) => {
const value = e.target.value;
if (value) {
setCurrentScoreRange([
parseInt(value) / 100.0,
currentScoreRange?.at(1) ?? 1.0,
]);
}
}}
/>
<DualThumbSlider
className="w-full"
min={0.5}
max={1.0}
step={0.01}
value={currentScoreRange ?? [0.5, 1.0]}
onValueChange={setCurrentScoreRange}
/>
<Input
className="w-12"
inputMode="numeric"
value={Math.round((currentScoreRange?.at(1) ?? 1.0) * 100)}
onChange={(e) => {
const value = e.target.value;
if (value) {
setCurrentScoreRange([
currentScoreRange?.at(0) ?? 0.5,
parseInt(value) / 100.0,
]);
}
}}
/>
</div>
<DropdownMenuSeparator />
<div className="flex items-center justify-evenly p-2">
<Button
variant="select"
onClick={() => {
setSelectedScoreRange(currentScoreRange);
setOpen("none");
}}
>
Apply
</Button>
<Button
onClick={() => {
setCurrentScoreRange(undefined);
setSelectedScoreRange(undefined);
}}
>
Reset
</Button>
</div>
</Content>
</Menu>
</div>
);
}
type PlusSortSelectorProps = {
selectedSort?: string;
setSelectedSort: (sort: string | undefined) => void;
};
function PlusSortSelector({
selectedSort,
setSelectedSort,
}: PlusSortSelectorProps) {
// menu state
const [open, setOpen] = useState(false);
// sort
const [currentSort, setCurrentSort] = useState<string>();
const [currentDir, setCurrentDir] = useState<string>("desc");
// components
const Sort = selectedSort
? selectedSort.split("_")[1] == "desc"
? FaSortAmountDown
: FaSortAmountUp
: FaSort;
const Menu = isMobile ? Drawer : DropdownMenu;
const Trigger = isMobile ? DrawerTrigger : DropdownMenuTrigger;
const Content = isMobile ? DrawerContent : DropdownMenuContent;
return (
<div className="flex h-full items-center justify-start gap-2">
<Menu
open={open}
onOpenChange={(open) => {
setOpen(open);
if (!open) {
const parts = selectedSort?.split("_");
if (parts?.length == 2) {
setCurrentSort(parts[0]);
setCurrentDir(parts[1]);
}
}
}}
>
<Trigger asChild>
<Button
className="flex items-center gap-2 capitalize"
size="sm"
variant={selectedSort == undefined ? "default" : "select"}
>
<Sort
className={`${selectedSort == undefined ? "text-secondary-foreground" : "text-selected-foreground"}`}
/>
<div className="hidden text-primary md:block">
{selectedSort == undefined ? "Sort" : selectedSort.split("_")[0]}
</div>
</Button>
</Trigger>
<Content
className={`flex flex-col justify-center gap-2 p-2 ${isMobile ? "max-h-[75dvh]" : ""}`}
>
<RadioGroup
className={`flex flex-col gap-4 ${isMobile ? "mt-4" : ""}`}
onValueChange={(value) => setCurrentSort(value)}
>
<div className="flex w-full items-center gap-2">
<RadioGroupItem
className={
currentSort == "date"
? "bg-selected from-selected/50 to-selected/90 text-selected"
: "bg-secondary from-secondary/50 to-secondary/90 text-secondary"
}
id="date"
value="date"
/>
<Label
className="w-full cursor-pointer capitalize"
htmlFor="date"
>
Date
</Label>
{currentSort == "date" ? (
currentDir == "desc" ? (
<FaSortAmountDown
className="size-5 cursor-pointer"
onClick={() => setCurrentDir("asc")}
/>
) : (
<FaSortAmountUp
className="size-5 cursor-pointer"
onClick={() => setCurrentDir("desc")}
/>
)
) : (
<div className="size-5" />
)}
</div>
<div className="flex w-full items-center gap-2">
<RadioGroupItem
className={
currentSort == "score"
? "bg-selected from-selected/50 to-selected/90 text-selected"
: "bg-secondary from-secondary/50 to-secondary/90 text-secondary"
}
id="score"
value="score"
/>
<Label
className="w-full cursor-pointer capitalize"
htmlFor="score"
>
Score
</Label>
{currentSort == "score" ? (
currentDir == "desc" ? (
<FaSortAmountDown
className="size-5 cursor-pointer"
onClick={() => setCurrentDir("asc")}
/>
) : (
<FaSortAmountUp
className="size-5 cursor-pointer"
onClick={() => setCurrentDir("desc")}
/>
)
) : (
<div className="size-5" />
)}
</div>
</RadioGroup>
<DropdownMenuSeparator />
<div className="flex items-center justify-evenly p-2">
<Button
variant="select"
disabled={!currentSort}
onClick={() => {
if (currentSort) {
setSelectedSort(`${currentSort}_${currentDir}`);
setOpen(false);
}
}}
>
Apply
</Button>
<Button
onClick={() => {
setCurrentSort(undefined);
setCurrentDir("desc");
setSelectedSort(undefined);
}}
>
Reset
</Button>
</div>
</Content>
</Menu>
</div>
);
}

View File

@@ -10,6 +10,9 @@ export type SearchResult = {
   label: string;
   sub_label?: string;
   thumb_path?: string;
+  plus_id?: string;
+  has_snapshot: boolean;
+  has_clip: boolean;
   zones: string[];
   search_source: SearchSource;
   search_distance: number;
@@ -25,9 +28,6 @@
   };
 };
 
-export type PartialSearchResult = Partial<SearchResult> & { id: string };
-
 export type SearchFilter = {
   cameras?: string[];
   labels?: string[];

View File

@@ -0,0 +1,195 @@
import { useEffect, useMemo } from "react";
import { isIOS, isMobileOnly, isSafari } from "react-device-detect";
import useSWR from "swr";
import { useApiHost } from "@/api";
import { cn } from "@/lib/utils";
import { LuArrowRightCircle } from "react-icons/lu";
import { useNavigate } from "react-router-dom";
import {
Tooltip,
TooltipContent,
TooltipTrigger,
} from "@/components/ui/tooltip";
import { TooltipPortal } from "@radix-ui/react-tooltip";
import { SearchResult } from "@/types/search";
import ImageLoadingIndicator from "@/components/indicators/ImageLoadingIndicator";
import useImageLoaded from "@/hooks/use-image-loaded";
import ActivityIndicator from "@/components/indicators/activity-indicator";
type ExploreViewProps = {
onSelectSearch: (searchResult: SearchResult) => void;
};
export default function ExploreView({ onSelectSearch }: ExploreViewProps) {
// title
useEffect(() => {
document.title = "Explore - Frigate";
}, []);
// data
const { data: events } = useSWR<SearchResult[]>(
[
"events/explore",
{
limit: isMobileOnly ? 5 : 10,
},
],
{
revalidateOnFocus: false,
},
);
const eventsByLabel = useMemo(() => {
if (!events) return {};
return events.reduce<Record<string, SearchResult[]>>((acc, event) => {
const label = event.label || "Unknown";
if (!acc[label]) {
acc[label] = [];
}
acc[label].push(event);
return acc;
}, {});
}, [events]);
if (!events) {
return (
<ActivityIndicator className="absolute left-1/2 top-1/2 -translate-x-1/2 -translate-y-1/2" />
);
}
return (
<div className="scrollbar-container mx-2 space-y-4 overflow-x-hidden">
{Object.entries(eventsByLabel).map(([label, filteredEvents]) => (
<ThumbnailRow
key={label}
searchResults={filteredEvents}
objectType={label}
onSelectSearch={onSelectSearch}
/>
))}
</div>
);
}
type ThumbnailRowType = {
objectType: string;
searchResults?: SearchResult[];
onSelectSearch: (searchResult: SearchResult) => void;
};
function ThumbnailRow({
objectType,
searchResults,
onSelectSearch,
}: ThumbnailRowType) {
const navigate = useNavigate();
const handleSearch = (label: string) => {
const similaritySearchParams = new URLSearchParams({
labels: label,
}).toString();
navigate(`/explore?${similaritySearchParams}`);
};
return (
<div className="rounded-lg bg-background_alt p-2 md:p-4">
<div className="text-lg capitalize">
{objectType.replaceAll("_", " ")}
{searchResults && (
<span className="ml-3 text-sm text-secondary-foreground">
(
{
// @ts-expect-error we know this is correct
searchResults[0].event_count
}{" "}
tracked objects){" "}
</span>
)}
</div>
<div className="flex flex-row items-center space-x-2 py-2">
{searchResults?.map((event) => (
<div
key={event.id}
className="relative aspect-square h-auto max-w-[20%] flex-grow md:max-w-[10%]"
>
<ExploreThumbnailImage
event={event}
onSelectSearch={onSelectSearch}
/>
</div>
))}
<div
className="flex cursor-pointer items-center justify-center"
onClick={() => handleSearch(objectType)}
>
<Tooltip>
<TooltipTrigger>
<LuArrowRightCircle
className="ml-2 text-secondary-foreground transition-all duration-300 hover:text-primary"
size={24}
/>
</TooltipTrigger>
<TooltipPortal>
<TooltipContent className="capitalize">
<ExploreMoreLink objectType={objectType} />
</TooltipContent>
</TooltipPortal>
</Tooltip>
</div>
</div>
</div>
);
}
type ExploreThumbnailImageProps = {
event: SearchResult;
onSelectSearch: (searchResult: SearchResult) => void;
};
function ExploreThumbnailImage({
event,
onSelectSearch,
}: ExploreThumbnailImageProps) {
const apiHost = useApiHost();
const [imgRef, imgLoaded, onImgLoad] = useImageLoaded();
return (
<>
<ImageLoadingIndicator
className="absolute inset-0"
imgLoaded={imgLoaded}
/>
<img
ref={imgRef}
className={cn(
"absolute h-full w-full cursor-pointer rounded-lg object-cover transition-all duration-300 ease-in-out md:rounded-2xl",
)}
style={
isIOS
? {
WebkitUserSelect: "none",
WebkitTouchCallout: "none",
}
: undefined
}
loading={isSafari ? "eager" : "lazy"}
draggable={false}
src={`${apiHost}api/events/${event.id}/thumbnail.jpg`}
onClick={() => onSelectSearch(event)}
onLoad={() => {
onImgLoad();
}}
/>
</>
);
}
function ExploreMoreLink({ objectType }: { objectType: string }) {
const formattedType = objectType.replaceAll("_", " ");
const label = formattedType.endsWith("s")
? `${formattedType}es`
: `${formattedType}s`;
return <div>Explore More {label}</div>;
}

View File

@@ -1,8 +1,8 @@
+import SearchThumbnail from "@/components/card/SearchThumbnail";
 import SearchFilterGroup from "@/components/filter/SearchFilterGroup";
 import ActivityIndicator from "@/components/indicators/activity-indicator";
 import Chip from "@/components/indicators/Chip";
 import SearchDetailDialog from "@/components/overlay/detail/SearchDetailDialog";
-import SearchThumbnailPlayer from "@/components/player/SearchThumbnailPlayer";
 import { Input } from "@/components/ui/input";
 import { Toaster } from "@/components/ui/sonner";
 import {
@@ -12,25 +12,19 @@ import {
 } from "@/components/ui/tooltip";
 import { cn } from "@/lib/utils";
 import { FrigateConfig } from "@/types/frigateConfig";
-import { Preview } from "@/types/preview";
-import {
-  PartialSearchResult,
-  SearchFilter,
-  SearchResult,
-} from "@/types/search";
+import { SearchFilter, SearchResult } from "@/types/search";
 import { useCallback, useMemo, useState } from "react";
 import { isMobileOnly } from "react-device-detect";
 import { LuImage, LuSearchX, LuText, LuXCircle } from "react-icons/lu";
 import useSWR from "swr";
+import ExploreView from "../explore/ExploreView";
 
 type SearchViewProps = {
   search: string;
   searchTerm: string;
   searchFilter?: SearchFilter;
   searchResults?: SearchResult[];
-  allPreviews?: Preview[];
   isLoading: boolean;
-  similaritySearch?: PartialSearchResult;
   setSearch: (search: string) => void;
   setSimilaritySearch: (search: SearchResult) => void;
   onUpdateFilter: (filter: SearchFilter) => void;
@@ -41,13 +35,10 @@ export default function SearchView({
   searchTerm,
   searchFilter,
   searchResults,
-  allPreviews,
   isLoading,
-  similaritySearch,
   setSearch,
   setSimilaritySearch,
   onUpdateFilter,
-  onOpenSearch,
 }: SearchViewProps) {
   const { data: config } = useSWR<FrigateConfig>("config", {
     revalidateOnFocus: false,
@@ -68,16 +59,9 @@ export default function SearchView({
   // search interaction
 
-  const onSelectSearch = useCallback(
-    (item: SearchResult, detail: boolean) => {
-      if (detail) {
-        setSearchDetail(item);
-      } else {
-        onOpenSearch(item);
-      }
-    },
-    [onOpenSearch],
-  );
+  const onSelectSearch = useCallback((item: SearchResult) => {
+    setSearchDetail(item);
+  }, []);
 
   // confidence score - probably needs tweaking
@@ -116,26 +100,24 @@ export default function SearchView({
       <div
         className={cn(
-          "relative mb-2 flex h-11 items-center pl-2 pr-2 md:pl-3",
+          "flex flex-col items-start space-y-2 pl-2 pr-2 md:mb-2 md:pl-3 lg:h-10 lg:flex-row lg:items-center lg:space-y-0",
           config?.semantic_search?.enabled
             ? "justify-between"
            : "justify-center",
-          isMobileOnly && "h-[88px] flex-wrap gap-2",
+          isMobileOnly && "mb-2 h-auto flex-wrap gap-2 space-y-0",
         )}
       >
         {config?.semantic_search?.enabled && (
           <div
             className={cn(
               "relative w-full",
-              hasExistingSearch ? "mr-3 md:w-1/3" : "md:ml-[25%] md:w-1/2",
+              hasExistingSearch ? "lg:mr-3 lg:w-1/3" : "lg:ml-[25%] lg:w-1/2",
             )}
           >
             <Input
               className="text-md w-full bg-muted pr-10"
-              placeholder={
-                isMobileOnly ? "Search" : "Search for a detected object..."
-              }
-              value={similaritySearch ? "" : search}
+              placeholder={"Search for a tracked object..."}
+              value={search}
               onChange={(e) => setSearch(e.target.value)}
             />
             {search && (
@@ -149,8 +131,11 @@ export default function SearchView({
         {hasExistingSearch && (
           <SearchFilterGroup
-            className={cn("", isMobileOnly && "w-full justify-between")}
+            className={cn(
+              "w-full justify-between md:justify-start lg:justify-end",
+            )}
             filter={searchFilter}
+            searchTerm={searchTerm}
             onUpdateFilter={onUpdateFilter}
           />
         )}
@@ -160,7 +145,7 @@ export default function SearchView({
       {searchTerm.length > 0 && searchResults?.length == 0 && (
         <div className="absolute left-1/2 top-1/2 flex -translate-x-1/2 -translate-y-1/2 flex-col items-center justify-center text-center">
           <LuSearchX className="size-16" />
-          No Detected Objects Found
+          No Tracked Objects Found
         </div>
       )}
@@ -168,7 +153,8 @@ export default function SearchView({
           <ActivityIndicator className="absolute left-1/2 top-1/2 -translate-x-1/2 -translate-y-1/2" />
         )}
 
-        <div className="grid w-full gap-2 px-1 sm:grid-cols-2 md:mx-2 md:grid-cols-4 md:gap-4 3xl:grid-cols-6">
+        {uniqueResults && (
+          <div className="mt-2 grid w-full gap-2 px-1 sm:grid-cols-2 md:mx-2 md:grid-cols-4 md:gap-4 3xl:grid-cols-6">
           {uniqueResults &&
             uniqueResults.map((value) => {
               const selected = false;
@@ -184,13 +170,12 @@ export default function SearchView({
                     "aspect-square size-full overflow-hidden rounded-lg",
                   )}
                 >
-                  <SearchThumbnailPlayer
+                  <SearchThumbnail
                     searchResult={value}
-                    allPreviews={allPreviews}
-                    scrollLock={false}
-                    onClick={onSelectSearch}
+                    findSimilar={() => setSimilaritySearch(value)}
+                    onClick={() => onSelectSearch(value)}
                   />
-                  {(searchTerm || similaritySearch) && (
+                  {searchTerm && (
                     <div className={cn("absolute right-2 top-2 z-40")}>
                       <Tooltip>
                         <TooltipTrigger>
@@ -228,6 +213,12 @@ export default function SearchView({
               );
             })}
           </div>
+        )}
+        {!uniqueResults && !isLoading && (
+          <div className="flex size-full flex-col">
+            <ExploreView onSelectSearch={onSelectSearch} />
+          </div>
+        )}
       </div>
     </div>
   );

View File

@@ -34,7 +34,7 @@ export default function ObjectSettingsView({
     {
       param: "bbox",
       title: "Bounding boxes",
-      description: "Show bounding boxes around detected objects",
+      description: "Show bounding boxes around tracked objects",
     },
     {
       param: "timestamp",
@@ -130,7 +130,7 @@ export default function ObjectSettingsView({
           to detect objects in your camera's video stream.
         </p>
         <p>
-          Debugging view shows a real-time view of detected objects and their
+          Debugging view shows a real-time view of tracked objects and their
           statistics. The object list shows a time-delayed summary of detected
           objects.
         </p>