Improve streaming response and add system prompt support
- Add configurable initial streaming message
- Support system prompt in API requests
- Fix config key typo (open-webui -> open_webui)
- Add validation for required config values
- Improve error handling for network and API errors
- Set proper timeout for API requests (900s)
- Better logging for rate limit errors
@@ -9,7 +9,7 @@ allow_dms: false # set to true if you want the bot to answer private DM

 # ─────────── Open‑WebUI Settings ───────────
 open_webui_url: "http://your_open-webui_ip_or_domain:port"
-open-webui_api_key: "user_api_key_from_open_webui"
+open_webui_api_key: "user_api_key_from_open_webui"
 model_name: "model_id_from_open-webui"
 knowledge_base: "knowledge_base_id_from_open-webui"
||||
@@ -20,5 +20,7 @@ tools:

 use_streaming: true # Allows to stream the answer to feel more interactive.

+streaming_initial_message: "Bitte warte kurz, die Informationen werden gesammelt..."
+
+# optional system prompt (you can leave it empty to use the default one or the systemprompt given in open-webui for the specific model)
+system_prompt: ""
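The commit message describes how the new `system_prompt` and 900 s timeout are used in API requests. A minimal sketch of what that client code might look like, assuming Open WebUI's OpenAI-compatible `/api/chat/completions` endpoint; the function names (`build_payload`, `ask`) and error-handling shape are illustrative, not taken from this repository:

```python
import requests


def build_payload(model: str, user_message: str, system_prompt: str = "") -> dict:
    """Build an OpenAI-style messages payload; include a system prompt only if set."""
    messages = []
    if system_prompt:  # empty string -> fall back to the model's default prompt in Open WebUI
        messages.append({"role": "system", "content": system_prompt})
    messages.append({"role": "user", "content": user_message})
    return {"model": model, "messages": messages, "stream": True}


def ask(base_url: str, api_key: str, payload: dict) -> requests.Response:
    """Send the request with a long timeout and basic network/API error handling."""
    try:
        resp = requests.post(
            f"{base_url}/api/chat/completions",
            headers={"Authorization": f"Bearer {api_key}"},
            json=payload,
            stream=True,
            timeout=900,  # generous timeout for slow model generations, per the commit
        )
        resp.raise_for_status()  # surface API errors (e.g. 429 rate limits) for logging
        return resp
    except requests.exceptions.RequestException as exc:
        raise RuntimeError(f"Open WebUI request failed: {exc}") from exc
```

Leaving `system_prompt` empty simply omits the system message, so the model's own prompt configured in Open WebUI applies, matching the comment in the config diff.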