Merge branch 'blakeblackshear:dev' into dev

commit 8dfd61d9a5
Sergey Krashevich, 2023-05-25 17:20:33 +03:00, committed by GitHub
GPG signature: no known key found in database (GPG Key ID: 4AEE18F83AFDEB23)
53 changed files with 1630 additions and 386 deletions


@@ -27,7 +27,7 @@ RUN --mount=type=tmpfs,target=/tmp --mount=type=tmpfs,target=/var/cache/apt \
 FROM wget AS go2rtc
 ARG TARGETARCH
 WORKDIR /rootfs/usr/local/go2rtc/bin
-RUN wget -qO go2rtc "https://github.com/AlexxIT/go2rtc/releases/download/v1.2.0/go2rtc_linux_${TARGETARCH}" \
+RUN wget -qO go2rtc "https://github.com/AlexxIT/go2rtc/releases/download/v1.5.0/go2rtc_linux_${TARGETARCH}" \
     && chmod +x go2rtc


@@ -18,8 +18,9 @@ apt-get -qq install --no-install-recommends -y \
 mkdir -p -m 600 /root/.gnupg
 
 # add coral repo
-wget --quiet -O /usr/share/keyrings/google-edgetpu.gpg https://packages.cloud.google.com/apt/doc/apt-key.gpg
-echo "deb [signed-by=/usr/share/keyrings/google-edgetpu.gpg] https://packages.cloud.google.com/apt coral-edgetpu-stable main" | tee /etc/apt/sources.list.d/coral-edgetpu.list
+curl -fsSLo - https://packages.cloud.google.com/apt/doc/apt-key.gpg | \
+    gpg --dearmor -o /etc/apt/trusted.gpg.d/google-cloud-packages-archive-keyring.gpg
+echo "deb https://packages.cloud.google.com/apt coral-edgetpu-stable main" | tee /etc/apt/sources.list.d/coral-edgetpu.list
 echo "libedgetpu1-max libedgetpu/accepted-eula select true" | debconf-set-selections
 
 # enable non-free repo


@@ -107,3 +107,14 @@ To do this:
 3. Restart Frigate and the custom version will be used if the mapping was done correctly.
 
 NOTE: The folder that is mapped from the host needs to be the folder that contains `/bin`. So if the full structure is `/home/appdata/frigate/custom-ffmpeg/bin/ffmpeg` then `/home/appdata/frigate/custom-ffmpeg` needs to be mapped to `/usr/lib/btbn-ffmpeg`.
+
+## Custom go2rtc version
+
+Frigate currently includes go2rtc v1.5.0; there may be certain cases where you want to run a different version of go2rtc.
+
+To do this:
+
+1. Download the go2rtc build to the /config folder.
+2. Rename the build to `go2rtc`.
+3. Give `go2rtc` execute permission.
+4. Restart Frigate and the custom version will be used; you can verify by checking the go2rtc logs.
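As a rough sketch of those steps from a shell on the host (the config path and `amd64` architecture are assumptions to adjust for your install; the release URL follows the same pattern as the Dockerfile hunk above):

```bash
# Assumed host path that is mapped to Frigate's /config
CONFIG_DIR=/path/to/frigate/config

# Steps 1-2: download a go2rtc build and name it `go2rtc` (arch assumed amd64)
wget -qO "${CONFIG_DIR}/go2rtc" \
  "https://github.com/AlexxIT/go2rtc/releases/download/v1.5.0/go2rtc_linux_amd64"

# Step 3: give it execute permission, then restart Frigate (step 4)
chmod +x "${CONFIG_DIR}/go2rtc"
```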


@@ -141,7 +141,7 @@ go2rtc:
       - rtspx://192.168.1.1:7441/abcdefghijk
 ```
 
-[See the go2rtc docs for more information](https://github.com/AlexxIT/go2rtc/tree/v1.2.0#source-rtsp)
+[See the go2rtc docs for more information](https://github.com/AlexxIT/go2rtc/tree/v1.5.0#source-rtsp)
 
 In the Unifi 2.0 update, Unifi Protect cameras had a change in audio sample rate which causes issues for ffmpeg. The input rate needs to be set for record and rtmp if used directly with Unifi Protect.


@@ -198,7 +198,7 @@ To generate model files, create a new folder to save the models, download the sc
 ```bash
 mkdir trt-models
-wget https://raw.githubusercontent.com/blakeblackshear/frigate/docker/tensorrt_models.sh
+wget https://github.com/blakeblackshear/frigate/raw/master/docker/tensorrt_models.sh
 chmod +x tensorrt_models.sh
 docker run --gpus=all --rm -it -v `pwd`/trt-models:/tensorrt_models -v `pwd`/tensorrt_models.sh:/tensorrt_models.sh nvcr.io/nvidia/tensorrt:22.07-py3 /tensorrt_models.sh
 ```


@@ -15,7 +15,23 @@ ffmpeg:
   hwaccel_args: preset-rpi-64-h264
 ```
 
-### Intel-based CPUs (<10th Generation) via VAAPI
+:::note
+
+If running Frigate in Docker, you either need to run in privileged mode or be sure to map the /dev/video1x devices to Frigate:
+
+```bash
+docker run -d \
+  --name frigate \
+  ...
+  --device /dev/video10 \
+  ghcr.io/blakeblackshear/frigate:stable
+```
+
+:::
+
+### Intel-based CPUs
+
+#### Via VAAPI
 
 VAAPI supports automatic profile selection so it will work automatically with both H.264 and H.265 streams. VAAPI is recommended for all generations of Intel-based CPUs if QSV does not work.
@@ -26,24 +26,89 @@ ffmpeg:
 
 **NOTICE**: With some of the processors, like the J4125, the default driver `iHD` doesn't seem to work correctly for hardware acceleration. You may need to change the driver to `i965` by adding the following environment variable `LIBVA_DRIVER_NAME=i965` to your docker-compose file or [in the `frigate.yaml` for HA OS users](advanced.md#environment_vars).
 
-### Intel-based CPUs (>=10th Generation) via Quicksync
+#### Via Quicksync (>=10th Generation only)
 
 QSV must be set specifically based on the video encoding of the stream.
 
-#### H.264 streams
+##### H.264 streams
 
 ```yaml
 ffmpeg:
   hwaccel_args: preset-intel-qsv-h264
 ```
 
-#### H.265 streams
+##### H.265 streams
 
 ```yaml
 ffmpeg:
   hwaccel_args: preset-intel-qsv-h265
 ```
+
+#### Configuring Intel GPU Stats in Docker
+
+Additional configuration is needed for the Docker container to be able to access the `intel_gpu_top` command for GPU stats. Three possible changes can be made:
+
+1. Run the container as privileged.
+2. Add the `CAP_PERFMON` capability.
+3. Set `perf_event_paranoid` low enough to allow access to the performance event system.
+
+##### Run as privileged
+
+This method works, but it gives more permissions to the container than are actually needed.
+
+###### Docker Compose - Privileged
+
+```yaml
+services:
+  frigate:
+    ...
+    image: ghcr.io/blakeblackshear/frigate:stable
+    privileged: true
+```
+
+###### Docker Run CLI - Privileged
+
+```bash
+docker run -d \
+  --name frigate \
+  ...
+  --privileged \
+  ghcr.io/blakeblackshear/frigate:stable
+```
+
+##### CAP_PERFMON
+
+Only recent versions of Docker support the `CAP_PERFMON` capability. You can test to see if yours supports it by running: `docker run --cap-add=CAP_PERFMON hello-world`
+
+###### Docker Compose - CAP_PERFMON
+
+```yaml
+services:
+  frigate:
+    ...
+    image: ghcr.io/blakeblackshear/frigate:stable
+    cap_add:
+      - CAP_PERFMON
+```
+
+###### Docker Run CLI - CAP_PERFMON
+
+```bash
+docker run -d \
+  --name frigate \
+  ...
+  --cap-add=CAP_PERFMON \
+  ghcr.io/blakeblackshear/frigate:stable
+```
+
+##### perf_event_paranoid
+
+_Note: This setting must be changed for the entire system._
+
+For more information on the various values across different distributions, see https://askubuntu.com/questions/1400874/what-does-perf-paranoia-level-four-do.
+
+Depending on your OS and kernel configuration, you may need to change the `/proc/sys/kernel/perf_event_paranoid` kernel tunable. You can test the change by running `sudo sh -c 'echo 2 >/proc/sys/kernel/perf_event_paranoid'`, which will persist until a reboot. Make it permanent by running `sudo sh -c 'echo kernel.perf_event_paranoid=1 >> /etc/sysctl.d/local.conf'`.
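Taken together, a minimal sketch of that sequence (the values mirror the surrounding text; pick the paranoid level appropriate for your distribution):

```bash
# Check the current value
cat /proc/sys/kernel/perf_event_paranoid

# Try a lower value; this only persists until the next reboot
sudo sh -c 'echo 2 >/proc/sys/kernel/perf_event_paranoid'

# Make a value permanent across reboots
sudo sh -c 'echo kernel.perf_event_paranoid=1 >> /etc/sysctl.d/local.conf'
```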
 ### AMD/ATI GPUs (Radeon HD 2000 and newer GPUs) via libva-mesa-driver
 
 VAAPI supports automatic profile selection so it will work automatically with both H.264 and H.265 streams.
@@ -59,15 +59,15 @@ ffmpeg:
 
 While older GPUs may work, it is recommended to use modern, supported GPUs. NVIDIA provides a [matrix of supported GPUs and features](https://developer.nvidia.com/video-encode-and-decode-gpu-support-matrix-new). If your card is on the list and supports CUVID/NVDEC, it will most likely work with Frigate for decoding. However, you must also use [a driver version that will work with FFmpeg](https://github.com/FFmpeg/nv-codec-headers/blob/master/README). Older driver versions may be missing symbols and fail to work, and older cards are not supported by newer driver versions. The only way around this is to [provide your own FFmpeg](/configuration/advanced#custom-ffmpeg-build) that will work with your driver version, but this is unsupported and may not work well if at all.
 
-A more complete list of cards and ther compatible drivers is available in the [driver release readme](https://download.nvidia.com/XFree86/Linux-x86_64/525.85.05/README/supportedchips.html).
+A more complete list of cards and their compatible drivers is available in the [driver release readme](https://download.nvidia.com/XFree86/Linux-x86_64/525.85.05/README/supportedchips.html).
 
 If your distribution does not offer NVIDIA driver packages, you can [download them here](https://www.nvidia.com/en-us/drivers/unix/).
 
-#### Docker Configuration
+#### Configuring Nvidia GPUs in Docker
 
 Additional configuration is needed for the Docker container to be able to access the NVIDIA GPU. The supported method for this is to install the [NVIDIA Container Toolkit](https://docs.nvidia.com/datacenter/cloud-native/container-toolkit/install-guide.html#docker) and specify the GPU to Docker. How you do this depends on how Docker is being run:
 
-##### Docker Compose
+##### Docker Compose - Nvidia GPU
 
 ```yaml
 services:
@@ -84,7 +84,7 @@ services:
           capabilities: [gpu]
 ```
 
-##### Docker Run CLI
+##### Docker Run CLI - Nvidia GPU
 
 ```bash
 docker run -d \


@@ -377,7 +377,7 @@ rtmp:
   enabled: False
 
 # Optional: Restream configuration
-# Uses https://github.com/AlexxIT/go2rtc (v1.2.0)
+# Uses https://github.com/AlexxIT/go2rtc (v1.5.0)
 go2rtc:
 
 # Optional: jsmpeg stream configuration for WebUI


@@ -79,6 +79,8 @@ WebRTC works by creating a TCP or UDP connection on port `8555`. However, it req
       - stun:8555
 ```
 
+- For access through Tailscale, the Frigate system's Tailscale IP must be added as a WebRTC candidate. Tailscale IPs all start with `100.`, and are reserved within the `100.0.0.0/8` CIDR block.
+
 :::tip
 
 This extra configuration may not be required if Frigate has been installed as a Home Assistant add-on, as Frigate uses the Supervisor's API to generate a WebRTC candidate.
@@ -97,8 +99,20 @@ However, it is recommended if issues occur to define the candidates manually. Yo
 
 If you are having difficulties getting WebRTC to work and you are running Frigate with docker, you may want to try changing the container network mode:
 
 - `network: host`, in this mode you don't need to forward any ports. The services inside of the Frigate container will have full access to the network interfaces of your host machine as if they were running natively and not in a container. Any port conflicts will need to be resolved. This network mode is recommended by go2rtc, but we recommend you only use it if necessary.
-- `network: bridge` creates a virtual network interface for the container, and the container will have full access to it. You also don't need to forward any ports, however, the IP for accessing Frigate locally will differ from the IP of the host machine. Your router will see Frigate as if it was a new device connected in the network.
+- `network: bridge` is the default network driver; a bridge network is a Link Layer device that forwards traffic between network segments. You need to forward any ports that you want to be accessible from the host IP.
+
+If not running in host mode, port 8555 will need to be mapped for the container:
+
+docker-compose.yml
+
+```yaml
+services:
+  frigate:
+    ...
+    ports:
+      - "8555:8555/tcp" # WebRTC over tcp
+      - "8555:8555/udp" # WebRTC over udp
+```
 
 :::
 
-See [go2rtc WebRTC docs](https://github.com/AlexxIT/go2rtc/tree/v1.2.0#module-webrtc) for more information about this.
+See [go2rtc WebRTC docs](https://github.com/AlexxIT/go2rtc/tree/v1.5.0#module-webrtc) for more information about this.
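As a companion to the Tailscale note in the hunk above, the IP to list as a candidate can be read from the Tailscale CLI on the Frigate host (a sketch; assumes the `tailscale` client is installed there):

```bash
# Print this machine's Tailscale IPv4 address (a 100.x.y.z address);
# it would then be listed as a candidate such as <tailscale_ip>:8555
tailscale ip -4
```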


@@ -7,7 +7,7 @@ title: Restream
 
 Frigate can restream your video feed as an RTSP feed for other applications such as Home Assistant to utilize it at `rtsp://<frigate_host>:8554/<camera_name>`. Port 8554 must be open. [This allows you to use a video feed for detection in Frigate and Home Assistant live view at the same time without having to make two separate connections to the camera](#reduce-connections-to-camera). The video feed is copied from the original video feed directly to avoid re-encoding. This feed does not include any annotation by Frigate.
 
-Frigate uses [go2rtc](https://github.com/AlexxIT/go2rtc/tree/v1.2.0) to provide its restream and MSE/WebRTC capabilities. The go2rtc config is hosted at the `go2rtc` in the config, see [go2rtc docs](https://github.com/AlexxIT/go2rtc/tree/v1.2.0#configuration) for more advanced configurations and features.
+Frigate uses [go2rtc](https://github.com/AlexxIT/go2rtc/tree/v1.5.0) to provide its restream and MSE/WebRTC capabilities. The go2rtc config is hosted under the `go2rtc` key in the config; see the [go2rtc docs](https://github.com/AlexxIT/go2rtc/tree/v1.5.0#configuration) for more advanced configurations and features.
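A quick way to sanity-check the restream is to probe it; the hostname and camera name below are placeholders for your own values:

```bash
# Inspect the streams Frigate is restreaming for a camera named "back_yard"
ffprobe -v error -show_streams "rtsp://frigate.local:8554/back_yard"
```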
 :::note
@@ -130,7 +130,7 @@ cameras:
 
 ## Advanced Restream Configurations
 
-The [exec](https://github.com/AlexxIT/go2rtc/tree/v1.2.0#source-exec) source in go2rtc can be used for custom ffmpeg commands. An example is below:
+The [exec](https://github.com/AlexxIT/go2rtc/tree/v1.5.0#source-exec) source in go2rtc can be used for custom ffmpeg commands. An example is below:
 
 NOTE: The output will need to be passed with two curly braces `{{output}}`


@@ -4,3 +4,5 @@ title: Snapshots
 ---
 
 Frigate can save a snapshot image to `/media/frigate/clips` for each event named as `<camera>-<id>.jpg`.
+
+Snapshots sent via MQTT are configured in the [config file](https://docs.frigate.video/configuration/) under `cameras -> your_camera -> mqtt`.
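As a hypothetical illustration, a snapshot published over MQTT can be captured with `mosquitto_sub`; Frigate publishes snapshots to `frigate/<camera_name>/<object_name>/snapshot`, and the broker and camera names below are placeholders:

```bash
# Save the next person snapshot from a camera named front_door to a file.
# -C 1 exits after one message; -N suppresses the trailing newline.
mosquitto_sub -h mqtt.local -t 'frigate/front_door/person/snapshot' -C 1 -N > snapshot.jpg
```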


@@ -36,7 +36,13 @@ Fork [blakeblackshear/frigate-hass-integration](https://github.com/blakeblackshe
 - [Frigate source code](#frigate-core-web-and-docs)
 - GNU make
 - Docker
-- Extra Coral device (optional, but very helpful to simulate real world performance)
+- An extra detector (Coral, OpenVINO, etc.) is optional but recommended to simulate real world performance.
+
+:::note
+
+A Coral device can only be used by a single process at a time, so an extra Coral device is recommended if using a Coral for development purposes.
+
+:::
 
 ### Setup
@@ -79,7 +85,7 @@ Create and place these files in a `debug` folder in the root of the repo. This i
 
 VSCode will start the docker compose file for you and open a terminal window connected to `frigate-dev`.
 
 - Run `python3 -m frigate` to start the backend.
-- In a separate terminal window inside VS Code, change into the `web` directory and run `npm install && npm start` to start the frontend.
+- In a separate terminal window inside VS Code, change into the `web` directory and run `npm install && npm run dev` to start the frontend.
 
 #### 5. Teardown
#### 5. Teardown #### 5. Teardown


@@ -211,3 +211,109 @@ It is recommended to run Frigate in LXC for maximum performance. See [this discu
 
 ## ESX
 
 For details on running Frigate under ESX, see details [here](https://github.com/blakeblackshear/frigate/issues/305).
+
+## Synology NAS on DSM 7
+
+These settings were tested on DSM 7.1.1-42962 Update 4.
+
+**General:**
+
+The `Execute container using high privilege` option needs to be enabled in order to give the Frigate container the elevated privileges it may need.
+
+The `Enable auto-restart` option can be enabled if you want the container to automatically restart whenever it improperly shuts down due to an error.
+
+![image](https://user-images.githubusercontent.com/4516296/232586790-0b659a82-561d-4bc5-899b-0f5b39c6b11d.png)
+
+**Advanced Settings:**
+
+If you want to use the password template feature, you should add the "FRIGATE_RTSP_PASSWORD" environment variable and set it to your preferred password under advanced settings. The rest of the environment variables should be left at their defaults for now.
+
+![image](https://user-images.githubusercontent.com/4516296/232587163-0eb662d4-5e28-4914-852f-9db1ec4b9c3d.png)
+
+**Port Settings:**
+
+The network mode should be set to `bridge`. You need to map the default Frigate container ports to the local Synology NAS ports that you want to use to access Frigate.
+
+There may be other services running on your NAS that are using the same ports that Frigate uses. In that case, you can set the ports to auto or to a specific port.
+
+![image](https://user-images.githubusercontent.com/4516296/232582642-773c0e37-7ef5-4373-8ce3-41401b1626e6.png)
+
+**Volume Settings:**
+
+You need to configure 2 paths:
+
+- The location of your config file in yaml format: this needs to be a file, and you need to browse to where your config.yml is located. This will differ depending on your NAS folder structure, e.g. `/docker/frigate/config/config.yml` will mount to `/config/config.yml` within the container.
+- The location on your NAS where the recordings will be saved: this needs to be a folder, e.g. `/docker/volumes/frigate-0-media`.
+
+![image](https://user-images.githubusercontent.com/4516296/232585872-44431d15-55e0-4004-b78b-1e512702b911.png)
+
+## QNAP NAS
+
+These instructions were tested on a QNAP with an Intel J3455 CPU and 16G RAM, running QTS 4.5.4.2117.
+
+QNAP has a graphic tool named Container Station to install and manage docker containers. However, there are two limitations with Container Station that make it unsuitable for installing Frigate:
+
+1. Container Station does not incorporate the GitHub Container Registry (ghcr), which hosts the Frigate docker image version 0.12.0 and above.
+2. Container Station uses a default 64 MB shared memory size (shm-size) and does not have a mechanism to adjust it. Frigate requires a larger shm-size to be able to work properly with more than two high resolution cameras.
+
+Because of the above limitations, the installation has to be done from the command line. Here are the steps:
+
+**Preparation**
+
+1. Install Container Station from the QNAP App Center if it is not installed.
+2. Enable ssh on your QNAP (please do an Internet search on how to do this).
+3. Prepare the Frigate config file and name it `config.yml`.
+4. Calculate the shared memory size according to the [documentation](https://docs.frigate.video/frigate/installation).
+5. Find your time zone value from https://en.wikipedia.org/wiki/List_of_tz_database_time_zones.
+6. ssh to the QNAP.
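As a sketch of the shared memory calculation in preparation step 4 (the formula is taken from the linked installation docs; the 1280x720 resolution is only an example):

```bash
# Per-camera shm need in MB: width * height * 1.5 * 9 + 270480 bytes, rounded up
WIDTH=1280 HEIGHT=720
echo "$(( (WIDTH * HEIGHT * 3 / 2 * 9 + 270480) / 1048576 + 1 )) MB for one ${WIDTH}x${HEIGHT} camera"
```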
+**Installation**
+
+Run the following commands to install Frigate (using the `stable` version as an example):
+
+```bash
+# Download the Frigate image
+docker pull ghcr.io/blakeblackshear/frigate:stable
+# Create a directory to host the Frigate config file on the QNAP file system.
+# E.g., you can choose to create it under /share/Container.
+mkdir -p /share/Container/frigate/config
+# Copy the config file prepared in preparation step 3 into the newly created config directory.
+cp path/to/your/config/file /share/Container/frigate/config
+# Create a directory to host Frigate media files on the QNAP file system.
+# (If you have a surveillance disk, create the media directory on the surveillance disk.
+# The example command assumes share_vol2 is the surveillance drive.)
+mkdir -p /share/share_vol2/frigate/media
+# Create the Frigate docker container. Replace the shm-size value with the value from preparation step 4.
+# Also replace the time zone value for 'TZ' in the sample command.
+# The example command will create a docker container that uses at most 2 CPUs and 4G RAM.
+# You may need to add "--env=LIBVA_DRIVER_NAME=i965 \" to the following docker run command if you
+# have certain CPUs (e.g., the J4125). See https://docs.frigate.video/configuration/hardware_acceleration.
+docker run \
+  --name=frigate \
+  --shm-size=256m \
+  --restart=unless-stopped \
+  --env=TZ=America/New_York \
+  --volume=/share/Container/frigate/config:/config:rw \
+  --volume=/share/share_vol2/frigate/media:/media/frigate:rw \
+  --network=bridge \
+  --privileged \
+  --workdir=/opt/frigate \
+  -p 1935:1935 \
+  -p 5000:5000 \
+  -p 8554:8554 \
+  -p 8555:8555 \
+  -p 8555:8555/udp \
+  --label='com.qnap.qcs.network.mode=nat' \
+  --label='com.qnap.qcs.gpu=False' \
+  --memory="4g" \
+  --cpus="2" \
+  --detach=true \
+  -t \
+  ghcr.io/blakeblackshear/frigate:stable
+```
+
+Log into the QNAP and open Container Station. The Frigate docker container should be listed under 'Overview' and running. Visit the Frigate Web UI by clicking the Frigate docker container, and then clicking the URL shown at the top of the detail page.


@@ -10,7 +10,7 @@ Use of the bundled go2rtc is optional. You can still configure FFmpeg to connect
 
 # Setup a go2rtc stream
 
-First, you will want to configure go2rtc to connect to your camera stream by adding the stream you want to use for live view in your Frigate config file. If you set the stream name under go2rtc to match the name of your camera, it will automatically be mapped and you will get additional live view options for the camera. Avoid changing any other parts of your config at this step. Note that go2rtc supports [many different stream types](https://github.com/AlexxIT/go2rtc/tree/v1.2.0#module-streams), not just rtsp.
+First, you will want to configure go2rtc to connect to your camera stream by adding the stream you want to use for live view in your Frigate config file. If you set the stream name under go2rtc to match the name of your camera, it will automatically be mapped and you will get additional live view options for the camera. Avoid changing any other parts of your config at this step. Note that go2rtc supports [many different stream types](https://github.com/AlexxIT/go2rtc/tree/v1.5.0#module-streams), not just rtsp.
 
 ```yaml
 go2rtc:
@@ -23,7 +23,7 @@ The easiest live view to get working is MSE. After adding this to the config, re
 
 ### What if my video doesn't play?
 
-If you are unable to see your video feed, first check the go2rtc logs in the Frigate UI under Logs in the sidebar. If go2rtc is having difficulty connecting to your camera, you should see some error messages in the log. If you do not see any errors, then the video codec of the stream may not be supported in your browser. If your camera stream is set to H265, try switching to H264. You can see more information about [video codec compatibility](https://github.com/AlexxIT/go2rtc/tree/v1.2.0#codecs-madness) in the go2rtc documentation. If you are not able to switch your camera settings from H265 to H264 or your stream is a different format such as MJPEG, you can use go2rtc to re-encode the video using the [FFmpeg parameters](https://github.com/AlexxIT/go2rtc/tree/v1.2.0#source-ffmpeg). It supports rotating and resizing video feeds and hardware acceleration. Keep in mind that transcoding video from one format to another is a resource intensive task and you may be better off using the built-in jsmpeg view. Here is an example of a config that will re-encode the stream to H264 without hardware acceleration:
+If you are unable to see your video feed, first check the go2rtc logs in the Frigate UI under Logs in the sidebar. If go2rtc is having difficulty connecting to your camera, you should see some error messages in the log. If you do not see any errors, then the video codec of the stream may not be supported in your browser. If your camera stream is set to H265, try switching to H264. You can see more information about [video codec compatibility](https://github.com/AlexxIT/go2rtc/tree/v1.5.0#codecs-madness) in the go2rtc documentation. If you are not able to switch your camera settings from H265 to H264 or your stream is a different format such as MJPEG, you can use go2rtc to re-encode the video using the [FFmpeg parameters](https://github.com/AlexxIT/go2rtc/tree/v1.5.0#source-ffmpeg). It supports rotating and resizing video feeds and hardware acceleration. Keep in mind that transcoding video from one format to another is a resource intensive task and you may be better off using the built-in jsmpeg view. Here is an example of a config that will re-encode the stream to H264 without hardware acceleration:
 
 ```yaml
 go2rtc:
@@ -71,6 +71,12 @@ go2rtc:
       - "ffmpeg:rtsp://user:password@10.0.10.10:554/cam/realmonitor?channel=1&subtype=2#video=copy#audio=copy#audio=aac"
 ```
 
+:::caution
+
+To access the go2rtc stream externally when utilizing the Frigate Add-On (for instance through VLC), you must first enable the RTSP Restream port. You can do this by visiting the Frigate Add-On configuration page within Home Assistant and revealing the hidden options under the "Show disabled ports" section.
+
+:::
+
 ## Next steps
 
 1. If the stream you added to go2rtc is also used by Frigate for the `record` or `detect` role, you can migrate your config to pull from the RTSP restream to reduce the number of connections to your camera as shown [here](/configuration/restream#reduce-connections-to-camera).


@@ -14,7 +14,7 @@ mqtt:
   enabled: False
 
 cameras:
-  camera_1: # <------ Name the camera
+  name_of_your_camera: # <------ Name the camera
     ffmpeg:
       inputs:
         - path: rtsp://10.0.10.10:554/rtsp # <----- The stream you want to use for detection
@@ -44,7 +44,7 @@ Here is an example configuration with hardware acceleration configured for Intel
 mqtt: ...
 
 cameras:
-  camera_1:
+  name_of_your_camera:
     ffmpeg:
      inputs: ...
      hwaccel_args: preset-vaapi
@@ -64,7 +64,7 @@ detectors: # <---- add detectors
     device: usb
 
 cameras:
-  camera_1:
+  name_of_your_camera:
     ffmpeg: ...
     detect:
       enabled: True # <---- turn on detection
@@ -99,7 +99,7 @@ detectors:
     device: usb
 
 cameras:
-  camera_1:
+  name_of_your_camera:
     ffmpeg:
       inputs:
        - path: rtsp://10.0.10.10:554/rtsp
@@ -127,7 +127,7 @@ mqtt: ...
 detectors: ...
 
 cameras:
-  camera_1:
+  name_of_your_camera:
     ffmpeg:
       inputs:
        - path: rtsp://10.0.10.10:554/rtsp
@@ -156,7 +156,7 @@ mqtt: ...
 detectors: ...
 
 cameras:
-  camera_1: ...
+  name_of_your_camera: ...
     detect: ...
     record: ...
     snapshots: # <----- Enable snapshots


@@ -3,7 +3,7 @@ id: ha_notifications
 title: Home Assistant notifications
 ---
 
-The best way to get started with notifications for Frigate is to use the [Blueprint](https://community.home-assistant.io/t/frigate-mobile-app-notifications/311091). You can use the yaml generated from the Blueprint as a starting point and customize from there.
+The best way to get started with notifications for Frigate is to use the [Blueprint](https://community.home-assistant.io/t/frigate-mobile-app-notifications-2-0/559732). You can use the yaml generated from the Blueprint as a starting point and customize from there.
 
 It is generally recommended to trigger notifications based on the `frigate/events` mqtt topic. This provides the event_id needed to fetch [thumbnails/snapshots/clips](../integrations/home-assistant.md#notification-api) and other useful information to customize when and where you want to receive alerts. The data is published in the form of a change feed, which means you can reference the "previous state" of the object in the `before` section and the "current state" of the object in the `after` section. You can see an example [here](../integrations/mqtt.md#frigateevents).
@@ -45,7 +45,7 @@ automation:
           https://your.public.hass.address.com/api/frigate/notifications/{{trigger.payload_json["after"]["id"]}}/thumbnail.jpg
         tag: '{{trigger.payload_json["after"]["id"]}}'
         when: '{{trigger.payload_json["after"]["start_time"]|int}}'
-        entity_id: camera.{{trigger.payload_json["after"]["camera"]}}
+        entity_id: camera.{{trigger.payload_json["after"]["camera"] | replace("-","_") | lower}}
       mode: single
 ```


@@ -84,3 +84,61 @@ There are many ways to authenticate a website but a straightforward approach is
     </Location>
 </VirtualHost>
 ```
+
+## Nginx Reverse Proxy
+
+This method shows a working example for a subdomain-type reverse proxy with SSL enabled.
+
+### Setup server and port to reverse proxy
+
+These are set in `$server` and `$port` and should match the ports you have exposed from your docker container. Optionally, listen on port `443` and enable SSL.
+
+```
+# ------------------------------------------------------------
+# frigate.domain.com
+# ------------------------------------------------------------
+server {
+  set $forward_scheme http;
+  set $server "192.168.100.2"; # FRIGATE SERVER LOCATION
+  set $port 5000;
+
+  listen 80;
+  listen 443 ssl http2;
+
+  server_name frigate.domain.com;
+}
+```
+
+### Setup SSL (optional)
+
+This section points to your SSL files; the example below shows the locations of a default Let's Encrypt SSL certificate.
+
+```
+# Let's Encrypt SSL
+include conf.d/include/letsencrypt-acme-challenge.conf;
+include conf.d/include/ssl-ciphers.conf;
+ssl_certificate /etc/letsencrypt/live/npm-1/fullchain.pem;
+ssl_certificate_key /etc/letsencrypt/live/npm-1/privkey.pem;
+```
+
+### Setup reverse proxy settings
+
+The settings below enable connection upgrade, set up optional logging, and proxy everything from the `/` context to the docker host and port specified earlier in the configuration.
+
+```
+proxy_set_header Upgrade $http_upgrade;
+proxy_set_header Connection $http_connection;
+proxy_http_version 1.1;
+
+access_log /data/logs/proxy-host-40_access.log proxy;
+error_log /data/logs/proxy-host-40_error.log warn;
+
+location / {
+  proxy_set_header Upgrade $http_upgrade;
+  proxy_set_header Connection $http_connection;
+  proxy_http_version 1.1;
+  # forward requests to the server/port variables set above
+  # (assumed; the original snippet omits the proxy_pass directive)
+  proxy_pass $forward_scheme://$server:$port;
+}
+```


@@ -3,7 +3,7 @@ id: stationary_objects
 title: Avoiding stationary objects
 ---
 
-Many people use Frigate to detect cars entering their driveway, and they often run into an issue with repeated events of a parked car being repeatedly detected over the course of multiple days (for example if the car is lost at night and detected again the following morning.
+Many people use Frigate to detect cars entering their driveway, and they often run into an issue with repeated notifications or events of a parked car being detected over the course of multiple days (for example if the car is lost at night and detected again the following morning).
 
 You can use zones to restrict events and notifications to objects that have entered specific areas.
 
@@ -15,6 +15,12 @@ Frigate is designed to track objects as they move and over-masking can prevent i
 
 :::
 
+:::info
+
+Once a vehicle crosses the entrance into the parking area, that event will stay `In Progress` until it is no longer seen in the frame. Frigate is designed to have an event last as long as an object is visible in the frame; an event being `In Progress` does not mean the event is being constantly recorded. You can define the recording behavior by adjusting the [recording retention settings](../configuration/record.md).
+
+:::
+
 To only be notified of cars that enter your driveway from the street, you could create multiple zones that cover your driveway. For cars, you would only notify if `entered_zones` from the events MQTT topic has more than 1 zone.
 
 See [this example](../configuration/zones.md#restricting-zones-to-specific-objects) from the Zones documentation to see how to restrict zones to certain object types.


@@ -295,3 +295,41 @@ Get ffprobe output for camera feed paths.
 
 ### `GET /api/<camera_name>/ptz/info`
 
 Get PTZ info for the camera.
+
+### `POST /api/events/<camera_name>/<label>/create`
+
+Create a manual event with a given `label` (ex: doorbell press) to capture a specific event besides an object being detected.
+
+**Optional Body:**
+
+```json
+{
+  "subLabel": "some_string", // add sub label to event
+  "duration": 30, // predetermined length of event (default: 30 seconds) or can be set to null for an indeterminate length event
+  "include_recording": true, // whether the event should save recordings along with the snapshot that is taken
+  "draw": {
+    // optional annotations that will be drawn on the snapshot
+    "boxes": [
+      {
+        "box": [0.5, 0.5, 0.25, 0.25], // box consists of x, y, width, height which are on a scale between 0 - 1
+        "color": [255, 0, 0], // color of the box, default is red
+        "score": 100 // optional score associated with the box
+      }
+    ]
+  }
+}
+```
+
+**Success Response:**
+
+```json
+{
+  "event_id": "1682970645.13116-1ug7ns",
+  "message": "Successfully created event.",
+  "success": true
+}
+```
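A hypothetical invocation with curl (host, port, camera, and label are placeholders):

```bash
# Create a 30 second manual event on camera "front_door" labeled "doorbell_press"
curl -X POST "http://frigate.local:5000/api/events/front_door/doorbell_press/create" \
  -H "Content-Type: application/json" \
  -d '{"subLabel": "visitor", "duration": 30, "include_recording": true}'
```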
+### `PUT /api/events/<event_id>/end`
+
+End a specific manual event without a predetermined length.
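And a matching sketch for closing an indeterminate event, reusing the `event_id` returned by the create call above (values assumed):

```bash
# End a manual event that was created with "duration": null
curl -X PUT "http://frigate.local:5000/api/events/1682970645.13116-1ug7ns/end"
```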


@@ -0,0 +1,19 @@
---
id: third_party_extensions
title: Third Party Extensions
---

Because Frigate is open source, others can modify and extend the rich functionality it already offers.
This page is meant to be an overview of additions one can make to a home NVR setup. The list is not exhaustive and can be extended via PR to the Frigate docs.

:::caution
This page does not recommend or rate the presented projects.
Please use your own knowledge to assess and vet them before you install anything on your system.
:::

## [Double Take](https://github.com/jakowenko/double-take)

[Double Take](https://github.com/jakowenko/double-take) provides a unified UI and API for processing and training images for facial recognition.
It supports automatically setting the sub labels in Frigate for person objects that are detected and recognized.


@@ -13,7 +13,7 @@ module.exports = {
   themeConfig: {
     algolia: {
       appId: 'WIURGBNBPY',
-      apiKey: '81ec882db78f7fed05c51daf973f0362',
+      apiKey: 'd02cc0a6a61178b25da550212925226b',
       indexName: 'frigate',
     },
     docs: {


@@ -37,6 +37,7 @@ module.exports = {
       "integrations/home-assistant",
       "integrations/api",
       "integrations/mqtt",
+      "integrations/third_party_extensions",
     ],
     Troubleshooting: [
       "troubleshooting/faqs",


@@ -28,7 +28,9 @@ from frigate.const import (
     RECORD_DIR,
 )
 from frigate.object_detection import ObjectDetectProcess
-from frigate.events import EventCleanup, EventProcessor
+from frigate.events.cleanup import EventCleanup
+from frigate.events.external import ExternalEventProcessor
+from frigate.events.maintainer import EventProcessor
 from frigate.http import create_app
 from frigate.log import log_process, root_configurer
 from frigate.models import Event, Recordings, Timeline
@@ -204,6 +206,11 @@ class FrigateApp:
             self.config, self.camera_metrics, self.detectors, self.processes
         )
 
+    def init_external_event_processor(self) -> None:
+        self.external_event_processor = ExternalEventProcessor(
+            self.config, self.event_queue
+        )
+
     def init_web_server(self) -> None:
         self.flask_app = create_app(
             self.config,
@@ -212,6 +219,7 @@ class FrigateApp:
             self.detected_frames_processor,
             self.storage_maintainer,
             self.onvif_controller,
+            self.external_event_processor,
             self.plus_api,
         )
@@ -436,6 +444,7 @@ class FrigateApp:
         self.start_camera_capture_processes()
         self.start_storage_maintainer()
         self.init_stats()
+        self.init_external_event_processor()
         self.init_web_server()
         self.start_timeline_processor()
         self.start_event_processor()


@@ -194,7 +194,13 @@ class Dispatcher:
         record_settings = self.config.cameras[camera_name].record
 
         if payload == "ON":
-            if not self.record_metrics[camera_name]["record_enabled"].value:
+            if not self.config.cameras[camera_name].record.enabled_in_config:
+                logger.error(
+                    f"Recordings must be enabled in the config to be turned on via MQTT."
+                )
+                return
+
+            if not record_settings.enabled:
                 logger.info(f"Turning on recordings for {camera_name}")
                 record_settings.enabled = True
                 self.record_metrics[camera_name]["record_enabled"].value = True


@@ -179,6 +179,9 @@ class RecordConfig(FrigateBaseModel):
     events: EventsConfig = Field(
         default_factory=EventsConfig, title="Event specific settings."
     )
+    enabled_in_config: Optional[bool] = Field(
+        title="Keep track of original state of recording."
+    )
 
 
 class MotionConfig(FrigateBaseModel):
@@ -961,6 +964,8 @@ class FrigateConfig(FrigateBaseModel):
             camera_config.onvif.password = camera_config.onvif.password.format(
                 **FRIGATE_ENV_VARS
             )
+            # set config recording value
+            camera_config.record.enabled_in_config = camera_config.record.enabled
 
             # Add default filters
             object_keys = camera_config.objects.track


@@ -118,6 +118,9 @@ class ModelConfig(BaseModel):
         }
 
     def compute_model_hash(self) -> None:
+        if not self.path or not os.path.exists(self.path):
+            self._model_hash = hashlib.md5(b"unknown").hexdigest()
+        else:
             with open(self.path, "rb") as f:
                 file_hash = hashlib.md5()
                 while chunk := f.read(8192):


@@ -3,7 +3,7 @@ import numpy as np
 
 from frigate.detectors.detection_api import DetectionApi
 from frigate.detectors.detector_config import BaseDetectorConfig
-from typing import Literal
+from typing_extensions import Literal
 from pydantic import Extra, Field
 
 try:


@@ -5,7 +5,7 @@ import io
 
 from frigate.detectors.detection_api import DetectionApi
 from frigate.detectors.detector_config import BaseDetectorConfig
-from typing import Literal
+from typing_extensions import Literal
 from pydantic import Extra, Field
 from PIL import Image
@@ -50,7 +50,10 @@ class DeepStack(DetectionApi):
         image_bytes = output.getvalue()
         data = {"api_key": self.api_key}
         response = requests.post(
-            self.api_url, files={"image": image_bytes}, timeout=self.api_timeout
+            self.api_url,
+            data=data,
+            files={"image": image_bytes},
+            timeout=self.api_timeout,
         )
         response_json = response.json()
         detections = np.zeros((20, 6), np.float32)


@@ -3,7 +3,7 @@ import numpy as np
 
 from frigate.detectors.detection_api import DetectionApi
 from frigate.detectors.detector_config import BaseDetectorConfig
-from typing import Literal
+from typing_extensions import Literal
 from pydantic import Extra, Field
 
 try:


@@ -4,7 +4,7 @@ import openvino.runtime as ov
 
 from frigate.detectors.detection_api import DetectionApi
 from frigate.detectors.detector_config import BaseDetectorConfig, ModelTypeEnum
-from typing import Literal
+from typing_extensions import Literal
 from pydantic import Extra, Field


@@ -13,7 +13,7 @@ except ModuleNotFoundError as e:
 
 from frigate.detectors.detection_api import DetectionApi
 from frigate.detectors.detector_config import BaseDetectorConfig
-from typing import Literal
+from typing_extensions import Literal
 from pydantic import Field
 
 logger = logging.getLogger(__name__)


frigate/events/cleanup.py (new file, 176 lines)

@@ -0,0 +1,176 @@
"""Cleanup events based on configured retention."""

import datetime
import logging
import os
import threading
from pathlib import Path

from peewee import fn

from frigate.config import FrigateConfig
from frigate.const import CLIPS_DIR
from frigate.models import Event

from multiprocessing.synchronize import Event as MpEvent

logger = logging.getLogger(__name__)


class EventCleanup(threading.Thread):
    def __init__(self, config: FrigateConfig, stop_event: MpEvent):
        threading.Thread.__init__(self)
        self.name = "event_cleanup"
        self.config = config
        self.stop_event = stop_event
        self.camera_keys = list(self.config.cameras.keys())

    def expire(self, media_type: str) -> None:
        # TODO: Refactor media_type to enum
        ## Expire events from unlisted cameras based on the global config
        if media_type == "clips":
            retain_config = self.config.record.events.retain
            file_extension = "mp4"
            update_params = {"has_clip": False}
        else:
            retain_config = self.config.snapshots.retain
            file_extension = "jpg"
            update_params = {"has_snapshot": False}

        distinct_labels = (
            Event.select(Event.label)
            .where(Event.camera.not_in(self.camera_keys))
            .distinct()
        )

        # loop over object types in db
        for l in distinct_labels:
            # get expiration time for this label
            expire_days = retain_config.objects.get(l.label, retain_config.default)
            expire_after = (
                datetime.datetime.now() - datetime.timedelta(days=expire_days)
            ).timestamp()
            # grab all events after specific time
            expired_events = Event.select().where(
                Event.camera.not_in(self.camera_keys),
                Event.start_time < expire_after,
                Event.label == l.label,
                Event.retain_indefinitely == False,
            )
            # delete the media from disk
            for event in expired_events:
                media_name = f"{event.camera}-{event.id}"
                media_path = Path(
                    f"{os.path.join(CLIPS_DIR, media_name)}.{file_extension}"
                )
                media_path.unlink(missing_ok=True)
                if file_extension == "jpg":
                    media_path = Path(
                        f"{os.path.join(CLIPS_DIR, media_name)}-clean.png"
                    )
                    media_path.unlink(missing_ok=True)

            # update the clips attribute for the db entry
            update_query = Event.update(update_params).where(
                Event.camera.not_in(self.camera_keys),
                Event.start_time < expire_after,
                Event.label == l.label,
                Event.retain_indefinitely == False,
            )
            update_query.execute()

        ## Expire events from cameras based on the camera config
        for name, camera in self.config.cameras.items():
            if media_type == "clips":
                retain_config = camera.record.events.retain
            else:
                retain_config = camera.snapshots.retain
            # get distinct objects in database for this camera
            distinct_labels = (
                Event.select(Event.label).where(Event.camera == name).distinct()
            )

            # loop over object types in db
            for l in distinct_labels:
                # get expiration time for this label
                expire_days = retain_config.objects.get(l.label, retain_config.default)
                expire_after = (
                    datetime.datetime.now() - datetime.timedelta(days=expire_days)
                ).timestamp()
                # grab all events after specific time
                expired_events = Event.select().where(
                    Event.camera == name,
                    Event.start_time < expire_after,
                    Event.label == l.label,
                    Event.retain_indefinitely == False,
                )
                # delete the grabbed clips from disk
                for event in expired_events:
                    media_name = f"{event.camera}-{event.id}"
                    media_path = Path(
                        f"{os.path.join(CLIPS_DIR, media_name)}.{file_extension}"
                    )
                    media_path.unlink(missing_ok=True)
                    if file_extension == "jpg":
                        media_path = Path(
                            f"{os.path.join(CLIPS_DIR, media_name)}-clean.png"
                        )
                        media_path.unlink(missing_ok=True)
                # update the clips attribute for the db entry
                update_query = Event.update(update_params).where(
                    Event.camera == name,
                    Event.start_time < expire_after,
                    Event.label == l.label,
                    Event.retain_indefinitely == False,
                )
                update_query.execute()

    def purge_duplicates(self) -> None:
        duplicate_query = """with grouped_events as (
          select id,
            label,
            camera,
            has_snapshot,
            has_clip,
            row_number() over (
              partition by label, camera, round(start_time/5,0)*5
              order by end_time-start_time desc
            ) as copy_number
          from event
        )

        select distinct id, camera, has_snapshot, has_clip from grouped_events
        where copy_number > 1;"""

        duplicate_events = Event.raw(duplicate_query)
        for event in duplicate_events:
            logger.debug(f"Removing duplicate: {event.id}")
            media_name = f"{event.camera}-{event.id}"
            media_path = Path(f"{os.path.join(CLIPS_DIR, media_name)}.jpg")
            media_path.unlink(missing_ok=True)
            media_path = Path(f"{os.path.join(CLIPS_DIR, media_name)}-clean.png")
            media_path.unlink(missing_ok=True)
            media_path = Path(f"{os.path.join(CLIPS_DIR, media_name)}.mp4")
            media_path.unlink(missing_ok=True)

        (
            Event.delete()
            .where(Event.id << [event.id for event in duplicate_events])
            .execute()
        )

    def run(self) -> None:
        # only expire events every 5 minutes
        while not self.stop_event.wait(300):
            self.expire("clips")
            self.expire("snapshots")
            self.purge_duplicates()

            # drop events from db where has_clip and has_snapshot are false
            delete_query = Event.delete().where(
                Event.has_clip == False, Event.has_snapshot == False
            )
            delete_query.execute()

        logger.info(f"Exiting event cleanup...")

frigate/events/external.py (new file, 132 lines)

@@ -0,0 +1,132 @@
"""Handle external events created by the user."""

import base64
import cv2
import datetime
import glob
import logging
import os
import random
import string

from typing import Optional
from multiprocessing.queues import Queue

from frigate.config import CameraConfig, FrigateConfig
from frigate.const import CLIPS_DIR
from frigate.events.maintainer import EventTypeEnum
from frigate.util import draw_box_with_label

logger = logging.getLogger(__name__)


class ExternalEventProcessor:
    def __init__(self, config: FrigateConfig, queue: Queue) -> None:
        self.config = config
        self.queue = queue
        self.default_thumbnail = None

    def create_manual_event(
        self,
        camera: str,
        label: str,
        sub_label: Optional[str],
        duration: Optional[int],
        include_recording: bool,
        draw: dict[str, any],
        snapshot_frame: any,
    ) -> str:
        now = datetime.datetime.now().timestamp()
        camera_config = self.config.cameras.get(camera)

        # create event id and start frame time
        rand_id = "".join(random.choices(string.ascii_lowercase + string.digits, k=6))
        event_id = f"{now}-{rand_id}"

        thumbnail = self._write_images(
            camera_config, label, event_id, draw, snapshot_frame
        )

        self.queue.put(
            (
                EventTypeEnum.api,
                "new",
                camera_config,
                {
                    "id": event_id,
                    "label": label,
                    "sub_label": sub_label,
                    "camera": camera,
                    "start_time": now,
                    "end_time": now + duration if duration is not None else None,
                    "thumbnail": thumbnail,
                    "has_clip": camera_config.record.enabled and include_recording,
                    "has_snapshot": True,
                },
            )
        )

        return event_id

    def finish_manual_event(self, event_id: str) -> None:
        """Finish external event with indeterminate duration."""
        now = datetime.datetime.now().timestamp()
        self.queue.put(
            (EventTypeEnum.api, "end", None, {"id": event_id, "end_time": now})
        )

    def _write_images(
        self,
        camera_config: CameraConfig,
        label: str,
        event_id: str,
        draw: dict[str, any],
        img_frame: any,
    ) -> str:
        # write clean snapshot if enabled
        if camera_config.snapshots.clean_copy:
            ret, png = cv2.imencode(".png", img_frame)

            if ret:
                with open(
                    os.path.join(
                        CLIPS_DIR,
                        f"{camera_config.name}-{event_id}-clean.png",
                    ),
                    "wb",
                ) as p:
                    p.write(png.tobytes())

        # write jpg snapshot with optional annotations
        if draw.get("boxes") and isinstance(draw.get("boxes"), list):
            for box in draw.get("boxes"):
                x = box["box"][0] * camera_config.detect.width
                y = box["box"][1] * camera_config.detect.height
                width = box["box"][2] * camera_config.detect.width
                height = box["box"][3] * camera_config.detect.height

                draw_box_with_label(
                    img_frame,
                    x,
                    y,
                    x + width,
                    y + height,
                    label,
                    f"{box.get('score', '-')}% {int(width * height)}",
                    thickness=2,
                    color=box.get("color", (255, 0, 0)),
                )

        ret, jpg = cv2.imencode(".jpg", img_frame)
        with open(
            os.path.join(CLIPS_DIR, f"{camera_config.name}-{event_id}.jpg"),
            "wb",
        ) as j:
            j.write(jpg.tobytes())

        # create thumbnail with max height of 175 and save
        width = int(175 * img_frame.shape[1] / img_frame.shape[0])
        thumb = cv2.resize(img_frame, dsize=(width, 175), interpolation=cv2.INTER_AREA)

        ret, jpg = cv2.imencode(".jpg", thumb)
        return base64.b64encode(jpg.tobytes()).decode("utf-8")


@ -1,16 +1,13 @@
 import datetime
 import logging
-import os
 import queue
 import threading
 from enum import Enum
-from pathlib import Path

 from peewee import fn

 from frigate.config import EventsConfig, FrigateConfig
-from frigate.const import CLIPS_DIR
 from frigate.models import Event
 from frigate.types import CameraMetricsTypes
 from frigate.util import to_relative_box
@ -23,7 +20,7 @@ logger = logging.getLogger(__name__)

 class EventTypeEnum(str, Enum):
-    # api = "api"
+    api = "api"
     # audio = "audio"
     tracked_object = "tracked_object"
@ -97,6 +94,8 @@ class EventProcessor(threading.Thread):
                     continue

                 self.handle_object_detection(event_type, camera, event_data)
+            elif source_type == EventTypeEnum.api:
+                self.handle_external_detection(event_type, event_data)

         # set an end_time on events without an end_time before exiting
         Event.update(end_time=datetime.datetime.now().timestamp()).where(
@ -197,160 +196,35 @@ class EventProcessor(threading.Thread):
         del self.events_in_process[event_data["id"]]
         self.event_processed_queue.put((event_data["id"], camera))

+    def handle_external_detection(self, type: str, event_data: Event):
+        if type == "new":
+            event = {
+                Event.id: event_data["id"],
+                Event.label: event_data["label"],
+                Event.sub_label: event_data["sub_label"],
+                Event.camera: event_data["camera"],
+                Event.start_time: event_data["start_time"],
+                Event.end_time: event_data["end_time"],
+                Event.thumbnail: event_data["thumbnail"],
+                Event.has_clip: event_data["has_clip"],
+                Event.has_snapshot: event_data["has_snapshot"],
+                Event.zones: [],
+                Event.data: {},
+            }
+        elif type == "end":
+            event = {
+                Event.id: event_data["id"],
+                Event.end_time: event_data["end_time"],
+            }
+
+        try:
+            (
+                Event.insert(event)
+                .on_conflict(
+                    conflict_target=[Event.id],
+                    update=event,
+                )
+                .execute()
+            )
+        except Exception:
+            logger.warning(f"Failed to update manual event: {event_data['id']}")

-class EventCleanup(threading.Thread):
-    def __init__(self, config: FrigateConfig, stop_event: MpEvent):
-        threading.Thread.__init__(self)
-        self.name = "event_cleanup"
-        self.config = config
-        self.stop_event = stop_event
-        self.camera_keys = list(self.config.cameras.keys())
-
-    def expire(self, media_type: str) -> None:
-        # TODO: Refactor media_type to enum
-        ## Expire events from unlisted cameras based on the global config
-        if media_type == "clips":
-            retain_config = self.config.record.events.retain
-            file_extension = "mp4"
-            update_params = {"has_clip": False}
-        else:
-            retain_config = self.config.snapshots.retain
-            file_extension = "jpg"
-            update_params = {"has_snapshot": False}
-
-        distinct_labels = (
-            Event.select(Event.label)
-            .where(Event.camera.not_in(self.camera_keys))
-            .distinct()
-        )
-
-        # loop over object types in db
-        for l in distinct_labels:
-            # get expiration time for this label
-            expire_days = retain_config.objects.get(l.label, retain_config.default)
-            expire_after = (
-                datetime.datetime.now() - datetime.timedelta(days=expire_days)
-            ).timestamp()
-            # grab all events after specific time
-            expired_events = Event.select().where(
-                Event.camera.not_in(self.camera_keys),
-                Event.start_time < expire_after,
-                Event.label == l.label,
-                Event.retain_indefinitely == False,
-            )
-            # delete the media from disk
-            for event in expired_events:
-                media_name = f"{event.camera}-{event.id}"
-                media_path = Path(
-                    f"{os.path.join(CLIPS_DIR, media_name)}.{file_extension}"
-                )
-                media_path.unlink(missing_ok=True)
-                if file_extension == "jpg":
-                    media_path = Path(
-                        f"{os.path.join(CLIPS_DIR, media_name)}-clean.png"
-                    )
-                    media_path.unlink(missing_ok=True)
-            # update the clips attribute for the db entry
-            update_query = Event.update(update_params).where(
-                Event.camera.not_in(self.camera_keys),
-                Event.start_time < expire_after,
-                Event.label == l.label,
-                Event.retain_indefinitely == False,
-            )
-            update_query.execute()
-
-        ## Expire events from cameras based on the camera config
-        for name, camera in self.config.cameras.items():
-            if media_type == "clips":
-                retain_config = camera.record.events.retain
-            else:
-                retain_config = camera.snapshots.retain
-            # get distinct objects in database for this camera
-            distinct_labels = (
-                Event.select(Event.label).where(Event.camera == name).distinct()
-            )
-
-            # loop over object types in db
-            for l in distinct_labels:
-                # get expiration time for this label
-                expire_days = retain_config.objects.get(l.label, retain_config.default)
-                expire_after = (
-                    datetime.datetime.now() - datetime.timedelta(days=expire_days)
-                ).timestamp()
-                # grab all events after specific time
-                expired_events = Event.select().where(
-                    Event.camera == name,
-                    Event.start_time < expire_after,
-                    Event.label == l.label,
-                    Event.retain_indefinitely == False,
-                )
-                # delete the grabbed clips from disk
-                for event in expired_events:
-                    media_name = f"{event.camera}-{event.id}"
-                    media_path = Path(
-                        f"{os.path.join(CLIPS_DIR, media_name)}.{file_extension}"
-                    )
-                    media_path.unlink(missing_ok=True)
-                    if file_extension == "jpg":
-                        media_path = Path(
-                            f"{os.path.join(CLIPS_DIR, media_name)}-clean.png"
-                        )
-                        media_path.unlink(missing_ok=True)
-                # update the clips attribute for the db entry
-                update_query = Event.update(update_params).where(
-                    Event.camera == name,
-                    Event.start_time < expire_after,
-                    Event.label == l.label,
-                    Event.retain_indefinitely == False,
-                )
-                update_query.execute()
-
-    def purge_duplicates(self) -> None:
-        duplicate_query = """with grouped_events as (
-          select id,
-            label,
-            camera,
-            has_snapshot,
-            has_clip,
-            row_number() over (
-              partition by label, camera, round(start_time/5,0)*5
-              order by end_time-start_time desc
-            ) as copy_number
-          from event
-        )
-
-        select distinct id, camera, has_snapshot, has_clip from grouped_events
-        where copy_number > 1;"""
-
-        duplicate_events = Event.raw(duplicate_query)
-
-        for event in duplicate_events:
-            logger.debug(f"Removing duplicate: {event.id}")
-            media_name = f"{event.camera}-{event.id}"
-            media_path = Path(f"{os.path.join(CLIPS_DIR, media_name)}.jpg")
-            media_path.unlink(missing_ok=True)
-            media_path = Path(f"{os.path.join(CLIPS_DIR, media_name)}-clean.png")
-            media_path.unlink(missing_ok=True)
-            media_path = Path(f"{os.path.join(CLIPS_DIR, media_name)}.mp4")
-            media_path.unlink(missing_ok=True)
-
-        (
-            Event.delete()
-            .where(Event.id << [event.id for event in duplicate_events])
-            .execute()
-        )
-
-    def run(self) -> None:
-        # only expire events every 5 minutes
-        while not self.stop_event.wait(300):
-            self.expire("clips")
-            self.expire("snapshots")
-            self.purge_duplicates()
-
-            # drop events from db where has_clip and has_snapshot are false
-            delete_query = Event.delete().where(
-                Event.has_clip == False, Event.has_snapshot == False
-            )
-            delete_query.execute()
-
-        logger.info(f"Exiting event cleanup...")
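
Note that both the "new" and "end" payloads above funnel into the same insert-or-update: a message for an id that already exists updates the existing row rather than tripping the primary-key constraint, so repeated or late messages for the same manual event are harmless.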

View File

@ -53,8 +53,8 @@ _user_agent_args = [
 ]

 PRESETS_HW_ACCEL_DECODE = {
-    "preset-rpi-32-h264": ["-c:v", "h264_v4l2m2m"],
-    "preset-rpi-64-h264": ["-c:v", "h264_v4l2m2m"],
+    "preset-rpi-32-h264": ["-c:v:1", "h264_v4l2m2m"],
+    "preset-rpi-64-h264": ["-c:v:1", "h264_v4l2m2m"],
     "preset-vaapi": [
         "-hwaccel_flags",
         "allow_profile_mismatch",
@ -320,7 +320,7 @@ def parse_preset_input(arg: Any, detect_fps: int) -> list[str]:
     if arg == "preset-http-jpeg-generic":
         input = PRESETS_INPUT[arg].copy()
-        input[1] = str(detect_fps)
+        input[len(_user_agent_args) + 1] = str(detect_fps)
         return input

     return PRESETS_INPUT.get(arg, None)

View File

@ -1,8 +1,8 @@
 import base64
 from datetime import datetime, timedelta, timezone
 import copy
-import glob
 import logging
+import glob
 import json
 import os
 import subprocess as sp
@ -34,6 +34,7 @@ from playhouse.shortcuts import model_to_dict
 from frigate.config import FrigateConfig
 from frigate.const import CLIPS_DIR, MAX_SEGMENT_DURATION, RECORD_DIR
 from frigate.models import Event, Recordings, Timeline
+from frigate.events.external import ExternalEventProcessor
 from frigate.object_processing import TrackedObject
 from frigate.plus import PlusApi
 from frigate.ptz import OnvifController
@ -60,6 +61,7 @@ def create_app(
     detected_frames_processor,
     storage_maintainer: StorageMaintainer,
     onvif: OnvifController,
+    external_processor: ExternalEventProcessor,
     plus_api: PlusApi,
 ):
     app = Flask(__name__)
@ -79,6 +81,7 @@ def create_app(
     app.detected_frames_processor = detected_frames_processor
     app.storage_maintainer = storage_maintainer
     app.onvif = onvif
+    app.external_processor = external_processor
     app.plus_api = plus_api
     app.camera_error_image = None
     app.hwaccel_errors = []
@ -195,7 +198,7 @@ def send_to_plus(id):
         return make_response(jsonify({"success": False, "message": message}), 404)

     # events from before the conversion to relative dimensions cant include annotations
-    if any(d > 1 for d in event.data["box"]):
+    if event.data.get("box") is None:
         include_annotation = None

     if event.end_time is None:
@ -251,7 +254,6 @@ def send_to_plus(id):
         event.save()

     if not include_annotation is None:
-        region = event.data["region"]
         box = event.data["box"]

         try:
@ -293,7 +295,7 @@ def false_positive(id):
         return make_response(jsonify({"success": False, "message": message}), 404)

     # events from before the conversion to relative dimensions cant include annotations
-    if any(d > 1 for d in event.data["box"]):
+    if event.data.get("box") is None:
         message = f"Events prior to 0.13 cannot be submitted as false positives"
         logger.error(message)
         return make_response(jsonify({"success": False, "message": message}), 400)
@ -848,6 +850,58 @@ def events():
     return jsonify([model_to_dict(e, exclude=excluded_fields) for e in events])

+
+@bp.route("/events/<camera_name>/<label>/create", methods=["POST"])
+def create_event(camera_name, label):
+    if not camera_name or not current_app.frigate_config.cameras.get(camera_name):
+        return jsonify(
+            {"success": False, "message": f"{camera_name} is not a valid camera."}, 404
+        )
+
+    if not label:
+        return jsonify({"success": False, "message": f"{label} must be set."}, 404)
+
+    json: dict[str, any] = request.get_json(silent=True) or {}
+
+    try:
+        frame = current_app.detected_frames_processor.get_current_frame(camera_name)
+
+        event_id = current_app.external_processor.create_manual_event(
+            camera_name,
+            label,
+            json.get("sub_label", None),
+            json.get("duration", 30),
+            json.get("include_recording", True),
+            json.get("draw", {}),
+            frame,
+        )
+    except Exception as e:
+        logger.error(f"The error is {e}")
+        return jsonify(
+            {"success": False, "message": f"An unknown error occurred: {e}"}, 404
+        )
+
+    return jsonify(
+        {
+            "success": True,
+            "message": "Successfully created event.",
+            "event_id": event_id,
+        },
+        200,
+    )
+
+
+@bp.route("/events/<event_id>/end", methods=["PUT"])
+def end_event(event_id):
+    try:
+        current_app.external_processor.finish_manual_event(event_id)
+    except:
+        return jsonify(
+            {"success": False, "message": f"{event_id} must be set and valid."}, 404
+        )
+
+    return jsonify({"success": True, "message": f"Event successfully ended."}, 200)
@bp.route("/config") @bp.route("/config")
def config(): def config():
config = current_app.frigate_config.dict() config = current_app.frigate_config.dict()
@ -906,6 +960,7 @@ def config_save():
# Validate the config schema # Validate the config schema
try: try:
new_yaml = FrigateConfig.parse_raw(new_config) new_yaml = FrigateConfig.parse_raw(new_config)
check_runtime = new_yaml.runtime_config
except Exception as e: except Exception as e:
return make_response( return make_response(
jsonify( jsonify(
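
Taken together, the two routes above give a complete manual-event lifecycle. A minimal sketch of driving them from the browser, assuming the blueprint is mounted under /api as in the rest of the Frigate HTTP API, with a hypothetical front_door camera and person label:

// Create a manual event; all body fields are optional and fall back to the
// defaults shown in create_event() above.
const resp = await fetch('/api/events/front_door/person/create', {
  method: 'POST',
  headers: { 'Content-Type': 'application/json' },
  body: JSON.stringify({
    sub_label: 'delivery',    // defaults to null
    duration: 30,             // seconds; defaults to 30
    include_recording: true,  // defaults to true
    draw: {},                 // optional snapshot annotations (see external.py)
  }),
});
console.log(await resp.json()); // carries the generated event_id on success

// Later, close the event by id (the id below is hypothetical):
await fetch('/api/events/1685027433.123456-ab12cd/end', { method: 'PUT' });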

View File

@ -21,7 +21,7 @@ from frigate.config import (
     FrigateConfig,
 )
 from frigate.const import CLIPS_DIR
-from frigate.events import EventTypeEnum
+from frigate.events.maintainer import EventTypeEnum
 from frigate.util import (
     SharedMemoryFrameManager,
     calculate_region,

View File


@ -52,7 +52,7 @@ class TestFfmpegPresets(unittest.TestCase):
         assert "preset-rpi-64-h264" not in (
             " ".join(frigate_config.cameras["back"].ffmpeg_cmds[0]["cmd"])
         )
-        assert "-c:v h264_v4l2m2m" in (
+        assert "-c:v:1 h264_v4l2m2m" in (
             " ".join(frigate_config.cameras["back"].ffmpeg_cmds[0]["cmd"])
         )

View File

@ -120,6 +120,7 @@ class TestHttp(unittest.TestCase):
             None,
             None,
             None,
+            None,
             PlusApi(),
         )
         id = "123456.random"
@ -155,6 +156,7 @@
             None,
             None,
             None,
+            None,
             PlusApi(),
         )
         id = "123456.random"
@ -175,6 +177,7 @@
             None,
             None,
             None,
+            None,
             PlusApi(),
         )
         id = "123456.random"
@ -194,6 +197,7 @@
             None,
             None,
             None,
+            None,
             PlusApi(),
         )
         id = "123456.random"
@ -215,6 +219,7 @@
             None,
             None,
             None,
+            None,
             PlusApi(),
         )
         id = "123456.random"
@ -240,6 +245,7 @@
             None,
             None,
             None,
+            None,
             PlusApi(),
         )
         id = "123456.random"
@ -274,6 +280,7 @@
             None,
             None,
             None,
+            None,
             PlusApi(),
         )
         id = "123456.random"
@ -298,6 +305,7 @@
             None,
             None,
             None,
+            None,
             PlusApi(),
         )
@ -314,6 +322,7 @@
             None,
             None,
             None,
+            None,
             PlusApi(),
         )
         id = "123456.random"
@ -333,6 +342,7 @@
             None,
             None,
             None,
+            None,
             PlusApi(),
         )
         mock_stats.return_value = self.test_stats

View File

@ -5,7 +5,7 @@ import threading
 import queue

 from frigate.config import FrigateConfig
-from frigate.events import EventTypeEnum
+from frigate.events.maintainer import EventTypeEnum
 from frigate.models import Timeline

 from multiprocessing.queues import Queue

View File

@ -864,10 +864,12 @@ def get_bandwidth_stats() -> dict[str, dict]:
     for line in lines:
         stats = list(filter(lambda a: a != "", line.strip().split("\t")))
         try:
-            if re.search("^ffmpeg/([0-9]+)/", stats[0]):
+            if re.search(
+                "(^ffmpeg|\/go2rtc|frigate\.detector\.[a-z]+)/([0-9]+)/", stats[0]
+            ):
                 process = stats[0].split("/")
-                usages[process[1]] = {
-                    "bandwidth": round(float(stats[2]), 1),
+                usages[process[len(process) - 2]] = {
+                    "bandwidth": round(float(stats[1]) + float(stats[2]), 1),
                 }
         except:
             continue

View File

@ -16,7 +16,7 @@ export const handlers = [
       front: {
         name: 'front',
         objects: { track: ['taco', 'cat', 'dog'] },
-        record: { enabled: true },
+        record: { enabled: true, enabled_in_config: true },
         detect: { width: 1280, height: 720 },
         snapshots: {},
         restream: { enabled: true, jsmpeg: { height: 720 } },
@ -25,7 +25,7 @@ export const handlers = [
       side: {
         name: 'side',
         objects: { track: ['taco', 'cat', 'dog'] },
-        record: { enabled: false },
+        record: { enabled: false, enabled_in_config: true },
         detect: { width: 1280, height: 720 },
         snapshots: {},
         restream: { enabled: true, jsmpeg: { height: 720 } },

View File

@ -0,0 +1,640 @@
class VideoRTC extends HTMLElement {
constructor() {
super();
this.DISCONNECT_TIMEOUT = 5000;
this.RECONNECT_TIMEOUT = 30000;
this.CODECS = [
'avc1.640029', // H.264 high 4.1 (Chromecast 1st and 2nd Gen)
'avc1.64002A', // H.264 high 4.2 (Chromecast 3rd Gen)
'avc1.640033', // H.264 high 5.1 (Chromecast with Google TV)
'hvc1.1.6.L153.B0', // H.265 main 5.1 (Chromecast Ultra)
'mp4a.40.2', // AAC LC
'mp4a.40.5', // AAC HE
'flac', // FLAC (PCM compatible)
'opus', // OPUS Chrome, Firefox
];
/**
* [config] Supported modes (webrtc, mse, mp4, mjpeg).
* @type {string}
*/
this.mode = 'webrtc,mse,mp4,mjpeg';
/**
* [config] Run stream when not displayed on the screen. Default `false`.
* @type {boolean}
*/
this.background = false;
/**
* [config] Run stream only when player in the viewport. Stop when user scroll out player.
* Value is percentage of visibility from `0` (not visible) to `1` (full visible).
* Default `0` - disable;
* @type {number}
*/
this.visibilityThreshold = 0;
/**
* [config] Run stream only when browser page on the screen. Stop when user change browser
* tab or minimise browser windows.
* @type {boolean}
*/
this.visibilityCheck = true;
/**
* [config] WebRTC configuration
* @type {RTCConfiguration}
*/
this.pcConfig = {
iceServers: [{ urls: 'stun:stun.l.google.com:19302' }],
sdpSemantics: 'unified-plan', // important for Chromecast 1
};
/**
* [info] WebSocket connection state. Values: CONNECTING, OPEN, CLOSED
* @type {number}
*/
this.wsState = WebSocket.CLOSED;
/**
* [info] WebRTC connection state.
* @type {number}
*/
this.pcState = WebSocket.CLOSED;
/**
* @type {HTMLVideoElement}
*/
this.video = null;
/**
* @type {WebSocket}
*/
this.ws = null;
/**
* @type {string|URL}
*/
this.wsURL = '';
/**
* @type {RTCPeerConnection}
*/
this.pc = null;
/**
* @type {number}
*/
this.connectTS = 0;
/**
* @type {string}
*/
this.mseCodecs = '';
/**
* [internal] Disconnect TimeoutID.
* @type {number}
*/
this.disconnectTID = 0;
/**
* [internal] Reconnect TimeoutID.
* @type {number}
*/
this.reconnectTID = 0;
/**
* [internal] Handler for receiving Binary from WebSocket.
* @type {Function}
*/
this.ondata = null;
/**
* [internal] Handlers list for receiving JSON from WebSocket
* @type {Object.<string,Function>}}
*/
this.onmessage = null;
}
/**
* Set video source (WebSocket URL). Support relative path.
* @param {string|URL} value
*/
set src(value) {
if (typeof value !== 'string') value = value.toString();
if (value.startsWith('http')) {
value = `ws${value.substring(4)}`;
} else if (value.startsWith('/')) {
value = `ws${location.origin.substring(4)}${value}`;
}
this.wsURL = value;
this.onconnect();
}
/**
* Play video. Support automute when autoplay blocked.
* https://developer.chrome.com/blog/autoplay/
*/
play() {
this.video.play().catch((er) => {
if (er.name === 'NotAllowedError' && !this.video.muted) {
this.video.muted = true;
this.video.play().catch(() => { });
}
});
}
/**
* Send message to server via WebSocket
* @param {Object} value
*/
send(value) {
if (this.ws) this.ws.send(JSON.stringify(value));
}
codecs(type) {
const test =
type === 'mse'
? (codec) => MediaSource.isTypeSupported(`video/mp4; codecs="${codec}"`)
: (codec) => this.video.canPlayType(`video/mp4; codecs="${codec}"`);
return this.CODECS.filter(test).join();
}
/**
* `CustomElement`. Invoked each time the custom element is appended into a
* document-connected element.
*/
connectedCallback() {
if (this.disconnectTID) {
clearTimeout(this.disconnectTID);
this.disconnectTID = 0;
}
// because video autopause on disconnected from DOM
if (this.video) {
const seek = this.video.seekable;
if (seek.length > 0) {
this.video.currentTime = seek.end(seek.length - 1);
}
this.play();
} else {
this.oninit();
}
this.onconnect();
}
/**
* `CustomElement`. Invoked each time the custom element is disconnected from the
* document's DOM.
*/
disconnectedCallback() {
if (this.background || this.disconnectTID) return;
if (this.wsState === WebSocket.CLOSED && this.pcState === WebSocket.CLOSED) return;
this.disconnectTID = setTimeout(() => {
if (this.reconnectTID) {
clearTimeout(this.reconnectTID);
this.reconnectTID = 0;
}
this.disconnectTID = 0;
this.ondisconnect();
}, this.DISCONNECT_TIMEOUT);
}
/**
* Creates child DOM elements. Called automatically once on `connectedCallback`.
*/
oninit() {
this.video = document.createElement('video');
this.video.controls = true;
this.video.playsInline = true;
this.video.preload = 'auto';
this.video.style.display = 'block'; // fix bottom margin 4px
this.video.style.width = '100%';
this.video.style.height = '100%';
this.appendChild(this.video);
if (this.background) return;
if ('hidden' in document && this.visibilityCheck) {
document.addEventListener('visibilitychange', () => {
if (document.hidden) {
this.disconnectedCallback();
} else if (this.isConnected) {
this.connectedCallback();
}
});
}
if ('IntersectionObserver' in window && this.visibilityThreshold) {
const observer = new IntersectionObserver(
(entries) => {
entries.forEach((entry) => {
if (!entry.isIntersecting) {
this.disconnectedCallback();
} else if (this.isConnected) {
this.connectedCallback();
}
});
},
{ threshold: this.visibilityThreshold }
);
observer.observe(this);
}
}
/**
* Connect to WebSocket. Called automatically on `connectedCallback`.
* @return {boolean} true if the connection has started.
*/
onconnect() {
if (!this.isConnected || !this.wsURL || this.ws || this.pc) return false;
// CLOSED or CONNECTING => CONNECTING
this.wsState = WebSocket.CONNECTING;
this.connectTS = Date.now();
this.ws = new WebSocket(this.wsURL);
this.ws.binaryType = 'arraybuffer';
this.ws.addEventListener('open', (ev) => this.onopen(ev));
this.ws.addEventListener('close', (ev) => this.onclose(ev));
return true;
}
ondisconnect() {
this.wsState = WebSocket.CLOSED;
if (this.ws) {
this.ws.close();
this.ws = null;
}
this.pcState = WebSocket.CLOSED;
if (this.pc) {
this.pc.close();
this.pc = null;
}
}
/**
* @returns {Array.<string>} of modes (mse, webrtc, etc.)
*/
onopen() {
// CONNECTING => OPEN
this.wsState = WebSocket.OPEN;
this.ws.addEventListener('message', (ev) => {
if (typeof ev.data === 'string') {
const msg = JSON.parse(ev.data);
for (const mode in this.onmessage) {
this.onmessage[mode](msg);
}
} else {
this.ondata(ev.data);
}
});
this.ondata = null;
this.onmessage = {};
const modes = [];
if (this.mode.indexOf('mse') >= 0 && 'MediaSource' in window) {
// iPhone
modes.push('mse');
this.onmse();
} else if (this.mode.indexOf('mp4') >= 0) {
modes.push('mp4');
this.onmp4();
}
if (this.mode.indexOf('webrtc') >= 0 && 'RTCPeerConnection' in window) {
// macOS Desktop app
modes.push('webrtc');
this.onwebrtc();
}
if (this.mode.indexOf('mjpeg') >= 0) {
if (modes.length) {
this.onmessage['mjpeg'] = (msg) => {
if (msg.type !== 'error' || msg.value.indexOf(modes[0]) !== 0) return;
this.onmjpeg();
};
} else {
modes.push('mjpeg');
this.onmjpeg();
}
}
return modes;
}
/**
* @return {boolean} true if reconnection has started.
*/
onclose() {
if (this.wsState === WebSocket.CLOSED) return false;
// CONNECTING, OPEN => CONNECTING
this.wsState = WebSocket.CONNECTING;
this.ws = null;
// reconnect no more than once every X seconds
const delay = Math.max(this.RECONNECT_TIMEOUT - (Date.now() - this.connectTS), 0);
this.reconnectTID = setTimeout(() => {
this.reconnectTID = 0;
this.onconnect();
}, delay);
return true;
}
onmse() {
const ms = new MediaSource();
ms.addEventListener(
'sourceopen',
() => {
URL.revokeObjectURL(this.video.src);
this.send({ type: 'mse', value: this.codecs('mse') });
},
{ once: true }
);
this.video.src = URL.createObjectURL(ms);
this.video.srcObject = null;
this.play();
this.mseCodecs = '';
this.onmessage['mse'] = (msg) => {
if (msg.type !== 'mse') return;
this.mseCodecs = msg.value;
const sb = ms.addSourceBuffer(msg.value);
sb.mode = 'segments'; // segments or sequence
sb.addEventListener('updateend', () => {
if (sb.updating) return;
try {
if (bufLen > 0) {
const data = buf.slice(0, bufLen);
bufLen = 0;
sb.appendBuffer(data);
} else if (sb.buffered && sb.buffered.length) {
const end = sb.buffered.end(sb.buffered.length - 1) - 15;
const start = sb.buffered.start(0);
if (end > start) {
sb.remove(start, end);
ms.setLiveSeekableRange(end, end + 15);
}
// console.debug("VideoRTC.buffered", start, end);
}
} catch (e) {
// console.debug(e);
}
});
const buf = new Uint8Array(2 * 1024 * 1024);
let bufLen = 0;
this.ondata = (data) => {
if (sb.updating || bufLen > 0) {
const b = new Uint8Array(data);
buf.set(b, bufLen);
bufLen += b.byteLength;
// console.debug("VideoRTC.buffer", b.byteLength, bufLen);
} else {
try {
sb.appendBuffer(data);
} catch (e) {
// console.debug(e);
}
}
};
};
}
onwebrtc() {
const pc = new RTCPeerConnection(this.pcConfig);
/** @type {HTMLVideoElement} */
const video2 = document.createElement('video');
video2.addEventListener('loadeddata', (ev) => this.onpcvideo(ev), { once: true });
pc.addEventListener('icecandidate', (ev) => {
const candidate = ev.candidate ? ev.candidate.toJSON().candidate : '';
this.send({ type: 'webrtc/candidate', value: candidate });
});
pc.addEventListener('track', (ev) => {
// when stream already init
if (video2.srcObject !== null) return;
// when audio track not exist in Chrome
if (ev.streams.length === 0) return;
// when audio track not exist in Firefox
if (ev.streams[0].id[0] === '{') return;
video2.srcObject = ev.streams[0];
});
pc.addEventListener('connectionstatechange', () => {
if (pc.connectionState === 'failed' || pc.connectionState === 'disconnected') {
pc.close(); // stop next events
this.pcState = WebSocket.CLOSED;
this.pc = null;
this.onconnect();
}
});
this.onmessage['webrtc'] = (msg) => {
switch (msg.type) {
case 'webrtc/candidate':
pc.addIceCandidate({
candidate: msg.value,
sdpMid: '0',
}).catch(() => { });
break;
case 'webrtc/answer':
pc.setRemoteDescription({
type: 'answer',
sdp: msg.value,
}).catch(() => { });
break;
case 'error':
if (msg.value.indexOf('webrtc/offer') < 0) return;
pc.close();
}
};
// Safari doesn't support "offerToReceiveVideo"
pc.addTransceiver('video', { direction: 'recvonly' });
pc.addTransceiver('audio', { direction: 'recvonly' });
pc.createOffer().then((offer) => {
pc.setLocalDescription(offer).then(() => {
this.send({ type: 'webrtc/offer', value: offer.sdp });
});
});
this.pcState = WebSocket.CONNECTING;
this.pc = pc;
}
/**
* @param ev {Event}
*/
onpcvideo(ev) {
if (!this.pc) return;
/** @type {HTMLVideoElement} */
const video2 = ev.target;
const state = this.pc.connectionState;
// Firefox doesn't support pc.connectionState
if (state === 'connected' || state === 'connecting' || !state) {
// Video+Audio > Video, H265 > H264, Video > Audio, WebRTC > MSE
let rtcPriority = 0,
msePriority = 0;
/** @type {MediaStream} */
const ms = video2.srcObject;
if (ms.getVideoTracks().length > 0) rtcPriority += 0x220;
if (ms.getAudioTracks().length > 0) rtcPriority += 0x102;
if (this.mseCodecs.indexOf('hvc1.') >= 0) msePriority += 0x230;
if (this.mseCodecs.indexOf('avc1.') >= 0) msePriority += 0x210;
if (this.mseCodecs.indexOf('mp4a.') >= 0) msePriority += 0x101;
if (rtcPriority >= msePriority) {
this.video.srcObject = ms;
this.play();
this.pcState = WebSocket.OPEN;
this.wsState = WebSocket.CLOSED;
this.ws.close();
this.ws = null;
} else {
this.pcState = WebSocket.CLOSED;
this.pc.close();
this.pc = null;
}
}
video2.srcObject = null;
}
onmjpeg() {
this.ondata = (data) => {
this.video.controls = false;
this.video.poster = `data:image/jpeg;base64,${VideoRTC.btoa(data)}`;
};
this.send({ type: 'mjpeg' });
}
onmp4() {
/** @type {HTMLCanvasElement} **/
const canvas = document.createElement('canvas');
/** @type {CanvasRenderingContext2D} */
let context;
/** @type {HTMLVideoElement} */
const video2 = document.createElement('video');
video2.autoplay = true;
video2.playsInline = true;
video2.muted = true;
video2.addEventListener('loadeddata', (_) => {
if (!context) {
canvas.width = video2.videoWidth;
canvas.height = video2.videoHeight;
context = canvas.getContext('2d');
}
context.drawImage(video2, 0, 0, canvas.width, canvas.height);
this.video.controls = false;
this.video.poster = canvas.toDataURL('image/jpeg');
});
this.ondata = (data) => {
video2.src = `data:video/mp4;base64,${VideoRTC.btoa(data)}`;
};
this.send({ type: 'mp4', value: this.codecs('mp4') });
}
static btoa(buffer) {
const bytes = new Uint8Array(buffer);
const len = bytes.byteLength;
let binary = '';
for (let i = 0; i < len; i++) {
binary += String.fromCharCode(bytes[i]);
}
return window.btoa(binary);
}
}
class VideoStream extends VideoRTC {
/**
* Custom GUI
*/
oninit() {
super.oninit();
const info = this.querySelector('.info');
this.insertBefore(this.video, info);
}
onconnect() {
const result = super.onconnect();
if (result) this.divMode = 'loading';
return result;
}
ondisconnect() {
super.ondisconnect();
}
onopen() {
const result = super.onopen();
this.onmessage['stream'] = (_) => {
};
return result;
}
onclose() {
return super.onclose();
}
onpcvideo(ev) {
super.onpcvideo(ev);
if (this.pcState !== WebSocket.CLOSED) {
this.divMode = 'RTC';
}
}
}
customElements.define('video-stream', VideoStream);
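
For reference, a sketch of using the element registered above, mirroring how the Camera and Birdseye pages consume it later in this diff; front_door is a hypothetical go2rtc stream name:

// Create a <video-stream> player and point it at a go2rtc WebSocket endpoint.
const player = document.createElement('video-stream');
player.mode = 'mse'; // any subset of 'webrtc,mse,mp4,mjpeg'
player.src = '/live/webrtc/api/ws?src=front_door'; // the setter upgrades relative paths to ws://
document.body.appendChild(player); // connectedCallback() builds the <video> and connects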

View File

@ -1,79 +0,0 @@
import { h } from 'preact';
import { baseUrl } from '../api/baseUrl';
import { useEffect } from 'preact/hooks';
export default function MsePlayer({ camera, width, height }) {
const url = `${baseUrl.replace(/^http/, 'ws')}live/mse/api/ws?src=${camera}`;
useEffect(() => {
const video = document.querySelector('#video');
// support api_path
const ws = new WebSocket(url);
ws.binaryType = 'arraybuffer';
let mediaSource,
sourceBuffer,
queueBuffer = [];
ws.onopen = () => {
mediaSource = new MediaSource();
video.src = URL.createObjectURL(mediaSource);
mediaSource.onsourceopen = () => {
mediaSource.onsourceopen = null;
URL.revokeObjectURL(video.src);
ws.send(JSON.stringify({ type: 'mse' }));
};
};
ws.onmessage = (ev) => {
if (typeof ev.data === 'string') {
const data = JSON.parse(ev.data);
if (data.type === 'mse') {
sourceBuffer = mediaSource.addSourceBuffer(data.value);
sourceBuffer.mode = 'segments'; // segments or sequence
sourceBuffer.onupdateend = () => {
if (!sourceBuffer.updating && queueBuffer.length > 0) {
try {
sourceBuffer.appendBuffer(queueBuffer.shift());
} catch (e) {
// console.warn(e);
}
}
};
}
} else if (sourceBuffer.updating || queueBuffer.length > 0) {
queueBuffer.push(ev.data);
} else {
try {
sourceBuffer.appendBuffer(ev.data);
} catch (e) {
// console.warn(e);
}
}
if (video.seekable.length > 0) {
const delay = video.seekable.end(video.seekable.length - 1) - video.currentTime;
if (delay < 1) {
video.playbackRate = 1;
} else if (delay > 10) {
video.playbackRate = 10;
} else if (delay > 2) {
video.playbackRate = Math.floor(delay);
}
}
};
return () => {
const video = document.getElementById('video');
video.srcObject = null;
ws.close();
};
}, [url]);
return (
<div>
<video id="video" autoplay playsinline controls muted width={width} height={height} />
</div>
);
}

View File

@ -7,30 +7,34 @@ import Button from './Button';
 import CameraIcon from '../icons/Camera';

 export default function MultiSelect({ className, title, options, selection, onToggle, onShowAll, onSelectSingle }) {
   const popupRef = useRef(null);

   const [state, setState] = useState({
     showMenu: false,
   });

-  const isOptionSelected = (item) => { return selection == "all" || selection.split(',').indexOf(item) > -1; }
+  const isOptionSelected = (item) => {
+    return selection == 'all' || selection.split(',').indexOf(item) > -1;
+  };

   const menuHeight = Math.round(window.innerHeight * 0.55);

   return (
     <div className={`${className} p-2`} ref={popupRef}>
-      <div
-        className="flex justify-between min-w-[120px]"
-        onClick={() => setState({ showMenu: true })}
-      >
+      <div className="flex justify-between min-w-[120px]" onClick={() => setState({ showMenu: true })}>
         <label>{title}</label>
         <ArrowDropdown className="w-6" />
       </div>
       {state.showMenu ? (
-        <Menu className={`max-h-[${menuHeight}px] overflow-scroll`} relativeTo={popupRef} onDismiss={() => setState({ showMenu: false })}>
+        <Menu
+          className={`max-h-[${menuHeight}px] overflow-auto`}
+          relativeTo={popupRef}
+          onDismiss={() => setState({ showMenu: false })}
+        >
           <div className="flex flex-wrap justify-between items-center">
-            <Heading className="p-4 justify-center" size="md">{title}</Heading>
+            <Heading className="p-4 justify-center" size="md">
+              {title}
+            </Heading>
             <Button tabindex="false" className="mx-4" onClick={() => onShowAll()}>
               Show All
             </Button>
@ -38,16 +42,23 @@ export default function MultiSelect({ className, title, options, selection, onToggle, onShowAll, onSelectSingle }) {
           {options.map((item) => (
             <div className="flex flex-grow" key={item}>
               <label
-                className={`flex flex-shrink space-x-2 p-1 my-1 min-w-[176px] hover:bg-gray-200 dark:hover:bg-gray-800 dark:hover:text-white cursor-pointer capitalize text-sm`}>
+                className={`flex flex-shrink space-x-2 p-1 my-1 min-w-[176px] hover:bg-gray-200 dark:hover:bg-gray-800 dark:hover:text-white cursor-pointer capitalize text-sm`}
+              >
                 <input
                   className="mx-4 m-0 align-middle"
                   type="checkbox"
                   checked={isOptionSelected(item)}
-                  onChange={() => onToggle(item)} />
-                {item.replaceAll("_", " ")}
+                  onChange={() => onToggle(item)}
+                />
+                {item.replaceAll('_', ' ')}
               </label>
               <div className="justify-right">
-                <Button color={isOptionSelected(item) ? "blue" : "black"} type="text" className="max-h-[35px] mx-2" onClick={() => onSelectSingle(item)}>
+                <Button
+                  color={isOptionSelected(item) ? 'blue' : 'black'}
+                  type="text"
+                  className="max-h-[35px] mx-2"
+                  onClick={() => onSelectSingle(item)}
+                >
                   <CameraIcon />
                 </Button>
               </div>

View File

@ -57,7 +57,7 @@ export default function RelativeModal({
         x: relativeToX,
         y: relativeToY,
         width: relativeToWidth,
-        // height: relativeToHeight,
+        height: relativeToHeight,
       } = relativeTo.current.getBoundingClientRect();

       const _width = widthRelative ? relativeToWidth : menuWidth;
@ -78,10 +78,13 @@ export default function RelativeModal({
         newLeft = windowWidth - width - WINDOW_PADDING;
       }

-      // too close to bottom
-      if (top + menuHeight > windowHeight - WINDOW_PADDING + window.scrollY) {
-        // If the pop-up modal would extend beyond the bottom of the visible window,
-        // reposition the modal to appear above the clicked icon instead
+      // This condition checks if the menu overflows the bottom of the page and
+      // if there's enough space to position the menu above the clicked icon.
+      // If both conditions are met, the menu will be positioned above the clicked icon
+      if (
+        top + menuHeight > windowHeight - WINDOW_PADDING + window.scrollY &&
+        top - menuHeight - relativeToHeight >= WINDOW_PADDING
+      ) {
         newTop = top - menuHeight;
       }

@ -89,7 +92,13 @@ export default function RelativeModal({
         newTop = WINDOW_PADDING;
       }

-      const maxHeight = windowHeight - WINDOW_PADDING * 2 > menuHeight ? null : windowHeight - WINDOW_PADDING * 2;
+      // This calculation checks if there's enough space below the clicked icon for the menu to fit.
+      // If there is, it sets the maxHeight to null (meaning no height constraint). If not, it
+      // calculates the maxHeight based on the remaining space in the window
+      const maxHeight =
+        windowHeight - WINDOW_PADDING * 2 - top > menuHeight
+          ? null
+          : windowHeight - WINDOW_PADDING * 2 - top + window.scrollY;
       const newPosition = { left: newLeft, top: newTop, maxHeight };
       if (widthRelative) {
         newPosition.width = relativeToWidth;
@ -115,7 +124,7 @@ export default function RelativeModal({
           <div data-testid="scrim" key="scrim" className="fixed inset-0 z-10" onClick={handleDismiss} />
           <div
             key="menu"
-            className={`z-10 bg-white dark:bg-gray-700 dark:text-white absolute shadow-lg rounded w-auto h-auto transition-transform transition-opacity duration-75 transform scale-90 opacity-0 overflow-x-hidden overflow-y-auto ${
+            className={`z-10 bg-white dark:bg-gray-700 dark:text-white absolute shadow-lg rounded w-auto h-auto transition-transform duration-75 transform scale-90 opacity-0 overflow-x-hidden overflow-y-auto ${
               show ? 'scale-100 opacity-100' : ''
             } ${className}`}
             onKeyDown={handleKeydown}

View File

@ -1,68 +1,95 @@
 import { h } from 'preact';
 import { baseUrl } from '../api/baseUrl';
-import { useEffect } from 'preact/hooks';
+import { useCallback, useEffect } from 'preact/hooks';

 export default function WebRtcPlayer({ camera, width, height }) {
   const url = `${baseUrl.replace(/^http/, 'ws')}live/webrtc/api/ws?src=${camera}`;

-  useEffect(() => {
-    const ws = new WebSocket(url);
-    ws.onopen = () => {
-      pc.createOffer().then((offer) => {
-        pc.setLocalDescription(offer).then(() => {
-          const msg = { type: 'webrtc/offer', value: pc.localDescription.sdp };
-          ws.send(JSON.stringify(msg));
-        });
-      });
-    };
-    ws.onmessage = (ev) => {
-      const msg = JSON.parse(ev.data);
-      if (msg.type === 'webrtc/candidate') {
-        pc.addIceCandidate({ candidate: msg.value, sdpMid: '0' });
-      } else if (msg.type === 'webrtc/answer') {
-        pc.setRemoteDescription({ type: 'answer', sdp: msg.value });
-      }
-    };
-    const pc = new RTCPeerConnection({
-      iceServers: [{ urls: 'stun:stun.l.google.com:19302' }],
-    });
-    pc.onicecandidate = (ev) => {
-      if (ev.candidate !== null) {
-        ws.send(
-          JSON.stringify({
-            type: 'webrtc/candidate',
-            value: ev.candidate.toJSON().candidate,
-          })
-        );
-      }
-    };
-    pc.ontrack = (ev) => {
-      const video = document.getElementById('video');
-      // when audio track not exist in Chrome
-      if (ev.streams.length === 0) return;
-      // when audio track not exist in Firefox
-      if (ev.streams[0].id[0] === '{') return;
-      // when stream already init
-      if (video.srcObject !== null) return;
-      video.srcObject = ev.streams[0];
-    };
-    // Safari don't support "offerToReceiveVideo"
-    // so need to create transeivers manually
-    pc.addTransceiver('video', { direction: 'recvonly' });
-    pc.addTransceiver('audio', { direction: 'recvonly' });
-    return () => {
-      const video = document.getElementById('video');
-      video.srcObject = null;
-      pc.close();
-      ws.close();
-    };
-  }, [url]);
+  const PeerConnection = useCallback(async (media) => {
+    const pc = new RTCPeerConnection({
+      iceServers: [{ urls: 'stun:stun.l.google.com:19302' }],
+    });
+
+    const localTracks = [];
+
+    if (/camera|microphone/.test(media)) {
+      const tracks = await getMediaTracks('user', {
+        video: media.indexOf('camera') >= 0,
+        audio: media.indexOf('microphone') >= 0,
+      });
+      tracks.forEach((track) => {
+        pc.addTransceiver(track, { direction: 'sendonly' });
+        if (track.kind === 'video') localTracks.push(track);
+      });
+    }
+
+    if (media.indexOf('display') >= 0) {
+      const tracks = await getMediaTracks('display', {
+        video: true,
+        audio: media.indexOf('speaker') >= 0,
+      });
+      tracks.forEach((track) => {
+        pc.addTransceiver(track, { direction: 'sendonly' });
+        if (track.kind === 'video') localTracks.push(track);
+      });
+    }
+
+    if (/video|audio/.test(media)) {
+      const tracks = ['video', 'audio']
+        .filter((kind) => media.indexOf(kind) >= 0)
+        .map((kind) => pc.addTransceiver(kind, { direction: 'recvonly' }).receiver.track);
+      localTracks.push(...tracks);
+    }
+
+    document.getElementById('video').srcObject = new MediaStream(localTracks);
+    return pc;
+  }, []);
+
+  async function getMediaTracks(media, constraints) {
+    try {
+      const stream =
+        media === 'user'
+          ? await navigator.mediaDevices.getUserMedia(constraints)
+          : await navigator.mediaDevices.getDisplayMedia(constraints);
+      return stream.getTracks();
+    } catch (e) {
+      return [];
+    }
+  }
+
+  const connect = useCallback(async () => {
+    const pc = await PeerConnection('video+audio');
+    const ws = new WebSocket(url);
+
+    ws.addEventListener('open', () => {
+      pc.addEventListener('icecandidate', (ev) => {
+        if (!ev.candidate) return;
+        const msg = { type: 'webrtc/candidate', value: ev.candidate.candidate };
+        ws.send(JSON.stringify(msg));
+      });
+
+      pc.createOffer()
+        .then((offer) => pc.setLocalDescription(offer))
+        .then(() => {
+          const msg = { type: 'webrtc/offer', value: pc.localDescription.sdp };
+          ws.send(JSON.stringify(msg));
+        });
+    });
+
+    ws.addEventListener('message', (ev) => {
+      const msg = JSON.parse(ev.data);
+      if (msg.type === 'webrtc/candidate') {
+        pc.addIceCandidate({ candidate: msg.value, sdpMid: '0' });
+      } else if (msg.type === 'webrtc/answer') {
+        pc.setRemoteDescription({ type: 'answer', sdp: msg.value });
+      }
+    });
+  }, [PeerConnection, url]);
+
+  useEffect(() => {
+    connect();
+  }, [connect]);

   return (
     <div>

View File

@ -4,18 +4,16 @@ import ActivityIndicator from '../components/ActivityIndicator';
 import JSMpegPlayer from '../components/JSMpegPlayer';
 import Heading from '../components/Heading';
 import WebRtcPlayer from '../components/WebRtcPlayer';
-import MsePlayer from '../components/MsePlayer';
+import '../components/MsePlayer';
 import useSWR from 'swr';
 import { useMemo } from 'preact/hooks';
 import CameraControlPanel from '../components/CameraControlPanel';
+import { baseUrl } from '../api/baseUrl';

 export default function Birdseye() {
   const { data: config } = useSWR('config');
-  const [viewSource, setViewSource, sourceIsLoaded] = usePersistence(
-    'birdseye-source',
-    getDefaultLiveMode(config)
-  );
+  const [viewSource, setViewSource, sourceIsLoaded] = usePersistence('birdseye-source', getDefaultLiveMode(config));
   const sourceValues = ['mse', 'webrtc', 'jsmpeg'];

   const ptzCameras = useMemo(() => {
@ -38,7 +36,10 @@ export default function Birdseye() {
       player = (
         <Fragment>
           <div className={ptzCameras.length ? 'max-w-5xl xl:w-1/2' : 'max-w-5xl'}>
-            <MsePlayer camera="birdseye" />
+            <video-stream
+              mode="mse"
+              src={new URL(`${baseUrl.replace(/^http/, 'ws')}live/webrtc/api/ws?src=birdseye`)}
+            />
           </div>
         </Fragment>
       );
@ -110,7 +111,6 @@ export default function Birdseye() {
   );
 }

-
 function getDefaultLiveMode(config) {
   if (config) {
     if (config.birdseye.restream) {

View File

@ -14,8 +14,9 @@ import { useCallback, useMemo, useState } from 'preact/hooks';
 import { useApiHost } from '../api';
 import useSWR from 'swr';
 import WebRtcPlayer from '../components/WebRtcPlayer';
-import MsePlayer from '../components/MsePlayer';
+import '../components/MsePlayer';
 import CameraControlPanel from '../components/CameraControlPanel';
+import { baseUrl } from '../api/baseUrl';

 const emptyObject = Object.freeze({});

@ -118,7 +119,10 @@ export default function Camera({ camera }) {
       player = (
         <Fragment>
           <div className="max-w-5xl">
-            <MsePlayer camera={cameraConfig.live.stream_name} />
+            <video-stream
+              mode="mse"
+              src={new URL(`${baseUrl.replace(/^http/, 'ws')}live/webrtc/api/ws?src=${camera}`)}
+            />
           </div>
         </Fragment>
       );

View File

@ -16,12 +16,12 @@ export default function Cameras() {
     <ActivityIndicator />
   ) : (
     <div className="grid grid-cols-1 3xl:grid-cols-3 md:grid-cols-2 gap-4 p-2 px-4">
-      <SortedCameras unsortedCameras={config.cameras} />
+      <SortedCameras config={config} unsortedCameras={config.cameras} />
     </div>
   );
 }

-function SortedCameras({ unsortedCameras }) {
+function SortedCameras({ config, unsortedCameras }) {
   const sortedCameras = useMemo(
     () =>
       Object.entries(unsortedCameras)
@ -33,13 +33,13 @@ function SortedCameras({ unsortedCameras }) {
   return (
     <Fragment>
       {sortedCameras.map(([camera, conf]) => (
-        <Camera key={camera} name={camera} conf={conf} />
+        <Camera key={camera} name={camera} config={config.cameras[camera]} conf={conf} />
       ))}
     </Fragment>
   );
 }

-function Camera({ name }) {
+function Camera({ name, config }) {
   const { payload: detectValue, send: sendDetect } = useDetectState(name);
   const { payload: recordValue, send: sendRecordings } = useRecordingsState(name);
   const { payload: snapshotValue, send: sendSnapshots } = useSnapshotsState(name);
@ -65,11 +65,13 @@ function Camera({ name }) {
         },
       },
       {
-        name: `Toggle recordings ${recordValue === 'ON' ? 'off' : 'on'}`,
+        name: config.record.enabled_in_config ? `Toggle recordings ${recordValue === 'ON' ? 'off' : 'on'}` : 'Recordings must be enabled in the config to be turned on in the UI.',
         icon: ClipIcon,
-        color: recordValue === 'ON' ? 'blue' : 'gray',
+        color: config.record.enabled_in_config ? (recordValue === 'ON' ? 'blue' : 'gray') : 'red',
         onClick: () => {
+          if (config.record.enabled_in_config) {
            sendRecordings(recordValue === 'ON' ? 'OFF' : 'ON', true);
+          }
         },
       },
       {
@ -81,7 +83,7 @@ function Camera({ name }) {
         },
       },
     ],
-    [detectValue, sendDetect, recordValue, sendRecordings, snapshotValue, sendSnapshots]
+    [config, detectValue, sendDetect, recordValue, sendRecordings, snapshotValue, sendSnapshots]
   );

   return (

View File

@ -599,7 +599,9 @@ export default function Events({ path, ...props }) {
                     {event.sub_label
                       ? `${event.label.replaceAll('_', ' ')}: ${event.sub_label.replaceAll('_', ' ')}`
                       : event.label.replaceAll('_', ' ')}
-                    ({((event?.data?.top_score || event.top_score) * 100).toFixed(0)}%)
+                    {(event?.data?.top_score || event.top_score || 0) == 0
+                      ? null
+                      : ` (${((event?.data?.top_score || event.top_score) * 100).toFixed(0)}%)`}
                   </div>
                   <div className="text-sm flex">
                     <Clock className="h-5 w-5 mr-2 inline" />

View File

@ -119,7 +119,7 @@ export default function System() {

       {state.showFfprobe && (
         <Dialog>
-          <div className="p-4 mb-2 max-h-96 whitespace-pre-line overflow-scroll">
+          <div className="p-4 mb-2 max-h-96 whitespace-pre-line overflow-auto">
             <Heading size="lg">Ffprobe Output</Heading>
             {state.ffprobe != '' ? (
               <div>
@ -183,7 +183,7 @@ export default function System() {

       {state.showVainfo && (
         <Dialog>
-          <div className="p-4 overflow-scroll whitespace-pre-line">
+          <div className="p-4 overflow-auto whitespace-pre-line">
             <Heading size="lg">Vainfo Output</Heading>
             {state.vainfo != '' ? (
               <div className="mb-2 max-h-96 whitespace-pre-line">
@ -239,6 +239,7 @@ export default function System() {
                       <Th>Inference Speed</Th>
                       <Th>CPU %</Th>
                       <Th>Memory %</Th>
+                      <Th>Network Bandwidth</Th>
                     </Tr>
                   </Thead>
                   <Tbody>
@ -247,6 +248,7 @@ export default function System() {
                       <Td>{detectors[detector]['inference_speed']} ms</Td>
                       <Td>{cpu_usages[detectors[detector]['pid']]?.['cpu'] || '- '}%</Td>
                       <Td>{cpu_usages[detectors[detector]['pid']]?.['mem'] || '- '}%</Td>
+                      <Td>{bandwidth_usages[detectors[detector]['pid']]?.['bandwidth'] || '- '}KB/s</Td>
                     </Tr>
                   </Tbody>
                 </Table>
@ -428,6 +430,7 @@ export default function System() {
                       <Th>CPU %</Th>
                       <Th>Avg CPU %</Th>
                       <Th>Memory %</Th>
+                      <Th>Network Bandwidth</Th>
                     </Tr>
                   </Thead>
                   <Tbody>
@ -436,6 +439,7 @@ export default function System() {
                       <Td>{cpu_usages[processes[process]['pid']]?.['cpu'] || '- '}%</Td>
                       <Td>{cpu_usages[processes[process]['pid']]?.['cpu_average'] || '- '}%</Td>
                       <Td>{cpu_usages[processes[process]['pid']]?.['mem'] || '- '}%</Td>
+                      <Td>{bandwidth_usages[processes[process]['pid']]?.['bandwidth'] || '- '}KB/s</Td>
                     </Tr>
                   </Tbody>
                 </Table>
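
The new Network Bandwidth columns read from a bandwidth_usages map keyed by PID, matching the shape built by get_bandwidth_stats() earlier in this diff (sent and received KB/s summed per process). A sketch with hypothetical values:

// Hypothetical fragment of the stats payload consumed by the new columns.
const bandwidth_usages = {
  '1234': { bandwidth: 42.7 }, // e.g. an ffmpeg process
  '5678': { bandwidth: 3.1 },  // e.g. go2rtc or a detector process
};
const cell = bandwidth_usages['1234']?.['bandwidth'] || '- '; // rendered as "42.7KB/s"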