Moved pipeline to the Guides section

cat101 2023-09-08 20:37:34 -03:00
parent 6ee2ae28f6
commit bc509d9000
3 changed files with 57 additions and 30 deletions

@@ -25,36 +25,6 @@ cameras:
VSCode (and the VSCode addon) supports JSON schemas, which will automatically validate the config. Enable this by adding `# yaml-language-server: $schema=http://frigate_host:5000/api/config/schema.json` to the top of the config file, where `frigate_host` is the IP address of Frigate, or `ccab4aaf-frigate` if running in the addon.
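For example, the first lines of a config file with schema validation enabled might look like the following sketch; the host address and the settings below the comment are placeholders:
```yaml
# yaml-language-server: $schema=http://frigate_host:5000/api/config/schema.json

# Placeholder settings; the schema comment above is the only line needed
# for the editor to start validating the file.
mqtt:
  host: 192.168.1.10
```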
### Overview of the video pipeline
The following diagram shows the different processing stages for a video source. Each stage shows the key elements and how they relate to each other.
```mermaid
%%{init: {"themeVariables": {"edgeLabelBackground": "transparent"}}}%%
flowchart TD
ClipStore[(Clip\nstore)]
SnapStore[(Snapshot\nstore)]
subgraph Camera
Stream[Video\nstreams] --> |detect stream|Decode
Decode --> Downscale
end
subgraph Motion
Downscale --> MotionM(Apply\nmotion masks)
MotionM --> MotionD(Motion\ndetection)
end
subgraph Detection
MotionD --> |motion regions| ObjectD(Object\ndetection)
Downscale --> ObjectD
ObjectD --> ObjectZ(Track objects and apply zones)
end
MotionD --> |motion clips|ClipStore
ObjectZ --> |detection clip|ClipStore
Stream -->|continuous record| ClipStore
ObjectZ --> |detection snapshot|SnapStore
```
### Full configuration reference
:::caution

@@ -0,0 +1,56 @@
---
id: video_pipeline
title: The Video Pipeline
---
Frigate uses a sophisticated video pipeline that starts with the camera feeds and progressively applies transformations to them (e.g. decoding, motion detection, and object detection).
This guide provides an overview of the pipeline to help users understand how the key Frigate concepts fit together.
### High level view of the video pipeline
```mermaid
%%{init: {"themeVariables": {"edgeLabelBackground": "transparent"}}}%%
flowchart LR
Feed(Feed\nProcessing) --> Decode(Video\ndecoding)
Decode --> Motion(Motion\nDetection)
Motion --> Object(Object\nDetection)
Feed --> Recording(Recording\n&\nVisualization)
Motion --> Recording
Object --> Recording
```
### Detailed view of the video pipeline
```mermaid
%%{init: {"themeVariables": {"edgeLabelBackground": "transparent"}}}%%
flowchart TD
ClipStore[(Clip\nstore)]
SnapStore[(Snapshot\nstore)]
subgraph Camera
MainS[\Main Stream/] --> Go2RTC
SubS[\Sub Stream/] -.-> Go2RTC
Go2RTC("Go2RTC\n(optional)") --> Stream
Stream[Video\nstreams] --> |detect stream|Decode(Decode & Downscale)
end
subgraph Motion
Decode --> MotionM(Apply\nmotion masks)
MotionM --> MotionD(Motion\ndetection)
end
subgraph Detection
MotionD --> |motion regions| ObjectD(Object\ndetection)
Decode --> ObjectD
ObjectD --> ObjectZ(Track objects and apply zones)
end
MotionD --> |motion snapshots|BirdsEye
ObjectZ --> |detection snapshot|BirdsEye
MotionD --> |motion clips|ClipStore
ObjectZ --> |detection clip|ClipStore
Stream -->|continuous record| ClipStore
ObjectZ --> |detection snapshot|SnapStore
```
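The stages above surface in the camera configuration. Below is a minimal sketch, not a full reference: the camera name, stream URLs, and polygon coordinates are placeholders, and the optional Go2RTC restreaming layer from the diagram is omitted.
```yaml
cameras:
  front_door:                              # placeholder camera name
    ffmpeg:
      inputs:
        - path: rtsp://camera.local/main   # main stream: continuous recording
          roles:
            - record
        - path: rtsp://camera.local/sub    # sub stream: decoded & downscaled for detection
          roles:
            - detect
    motion:
      mask:
        - 0,0,1000,0,1000,200,0,200        # placeholder motion mask polygon
    zones:
      porch:                               # placeholder zone
        coordinates: 0,461,3,0,1919,0,1919,843
    record:
      enabled: true                        # recordings and event clips go to the clip store
    snapshots:
      enabled: true                        # detection snapshots go to the snapshot store
```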

@@ -14,6 +14,7 @@ module.exports = {
"guides/ha_network_storage",
"guides/stationary_objects",
"guides/reverse_proxy",
"guides/video_pipeline",
],
Configuration: {
"Configuration Files": [