frigate/frigate/genai
Nicolas Mowen eeefbf2bb5
Add support for multiple GenAI Providers (#22144)
* GenAI client manager

* Add config migration

* Convert to roles list

* Support getting client via manager

* Cleanup

* Fix import issues

* Set model in llama.cpp config

* Cleanup

* Use config update

* Cleanup

* Add new title and desc
2026-02-27 08:35:33 -07:00
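The commit bullets above outline a manager that owns one client per configured GenAI provider and resolves a client by role. A minimal sketch of that pattern, assuming hypothetical names (`GenAIClientManager`, `GenAIProviderConfig`, `get_client`, and the example provider classes are illustrations, not Frigate's actual API):

```python
from dataclasses import dataclass, field


class GenAIClient:
    """Base class for provider-specific clients (hypothetical)."""

    def __init__(self, model: str):
        self.model = model


class OpenAIClient(GenAIClient):
    pass


class OllamaClient(GenAIClient):
    pass


@dataclass
class GenAIProviderConfig:
    provider: str
    model: str
    # "Convert to roles list": each provider serves one or more roles.
    roles: list[str] = field(default_factory=list)


class GenAIClientManager:
    """Builds one client per configured provider and resolves clients by role."""

    PROVIDERS = {"openai": OpenAIClient, "ollama": OllamaClient}

    def __init__(self, configs: list[GenAIProviderConfig]):
        self._by_role: dict[str, GenAIClient] = {}
        for cfg in configs:
            client = self.PROVIDERS[cfg.provider](cfg.model)
            for role in cfg.roles:
                self._by_role[role] = client

    def get_client(self, role: str) -> "GenAIClient | None":
        # "Support getting client via manager": callers ask for a role,
        # not a concrete provider.
        return self._by_role.get(role)
```

For example, a config assigning Ollama to the `description` role would make `manager.get_client("description")` return that `OllamaClient`, while an unconfigured role returns `None`.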
__init__.py Add support for multiple GenAI Providers (#22144) 2026-02-27 08:35:33 -07:00
azure-openai.py Implement LLM Chat API with tool calling support (#21731) 2026-02-26 21:27:56 -07:00
gemini.py Adapt to new Gemini format 2026-02-26 21:27:56 -07:00
llama_cpp.py Add support for multiple GenAI Providers (#22144) 2026-02-27 08:35:33 -07:00
manager.py Add support for multiple GenAI Providers (#22144) 2026-02-27 08:35:33 -07:00
ollama.py Implement LLM Chat API with tool calling support (#21731) 2026-02-26 21:27:56 -07:00
openai.py Implement LLM Chat API with tool calling support (#21731) 2026-02-26 21:27:56 -07:00