From c85463dc8c642fe0bbc5269e214c34a3797ad825 Mon Sep 17 00:00:00 2001
From: Nicolas Mowen
Date: Sun, 18 Jan 2026 06:18:19 -0700
Subject: [PATCH] Update links

---
 docs/docs/configuration/genai/config.md | 4 ++--
 1 file changed, 2 insertions(+), 2 deletions(-)

diff --git a/docs/docs/configuration/genai/config.md b/docs/docs/configuration/genai/config.md
index a61a98649..94eb02323 100644
--- a/docs/docs/configuration/genai/config.md
+++ b/docs/docs/configuration/genai/config.md
@@ -69,7 +69,7 @@ genai:
 
 ## llama.cpp
 
-[llama.cpp](https://github.com/ggerganov/llama.cpp) is a C++ implementation of LLaMA that provides a high-performance inference server. Using llama.cpp directly gives you access to all native llama.cpp options and parameters.
+[llama.cpp](https://github.com/ggml-org/llama.cpp) is a C++ implementation of LLaMA that provides a high-performance inference server. Using llama.cpp directly gives you access to all native llama.cpp options and parameters.
 
 :::warning
 
@@ -99,7 +99,7 @@ genai:
   seed: -1
 ```
 
-All llama.cpp native options can be passed through `provider_options`, including `temperature`, `top_k`, `top_p`, `min_p`, `repeat_penalty`, `repeat_last_n`, `seed`, `grammar`, and more. See the [llama.cpp server documentation](https://github.com/ggerganov/llama.cpp/blob/master/examples/server/README.md) for a complete list of available parameters.
+All llama.cpp native options can be passed through `provider_options`, including `temperature`, `top_k`, `top_p`, `min_p`, `repeat_penalty`, `repeat_last_n`, `seed`, `grammar`, and more. See the [llama.cpp server documentation](https://github.com/ggml-org/llama.cpp/blob/master/tools/server/README.md) for a complete list of available parameters.
 
 ## Google Gemini