SC-Discord-Bot

This project provides a sophisticated Discord bot focused on the game Star Citizen. It leverages a Large Language Model (LLM) hosted via an Open-WebUI backend, enhanced with Retrieval-Augmented Generation (RAG) and custom tools to provide accurate, up-to-date information.

Features

  • Discord Integration: A simple and effective Discord bot that responds to user queries in whitelisted channels.
  • LLM-Powered: Connects to any OpenAI-compatible API, allowing you to use a variety of powerful language models.
  • Retrieval-Augmented Generation (RAG): The bot's knowledge is supplemented by a collection of JSON files in llm_rag_knowledge/ containing detailed information about in-game events, crafting, and vehicle data.
  • Custom Tools: The LLM can invoke a suite of Python tools located in llm_tools/ to fetch dynamic data:
    • Ship Information: Get detailed ship specifications and compare vessels using data from starcitizen.tools.
    • Price Lookups: Query real-time prices for commodities and items from a local database synced with uexcorp.space.
    • Fleet Ownership: Check who in your organization owns which ship using data from fleetyards.net.
  • Data Persistence: Utilizes SQLite databases in databases/ to store and quickly access commodity, item, and fleet information.
  • Containerized: Comes with a Dockerfile and docker-compose.yaml for easy and consistent deployment.

Architecture

The system is composed of several key parts:

  1. Discord Connector: The open-webui_to_discord.py script is the frontend, listening for user messages on Discord.
  2. Open-WebUI & LLM: The bot forwards queries to an Open-WebUI instance, which in turn uses an LLM (e.g., Llama3, Mixtral) for inference.
  3. Knowledge Base & Tools: The LLM's responses are enriched by the RAG documents in llm_rag_knowledge/ and by the Python tools in llm_tools/, which fetch ship, price, and fleet data on demand.
  4. Data Sync Scripts: The scripts get_commodities.py, get_items.py, and fleetyard.py run independently to keep the local SQLite databases current.
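
To make the flow concrete, the sketch below shows roughly how the connector can forward a Discord message to the Open-WebUI backend and relay the answer. It is illustrative only: the real logic lives in open-webui_to_discord.py, and the endpoint path, payload shape, and config key names are assumptions based on Open-WebUI's OpenAI-compatible API.

    # Rough sketch of the Discord -> Open-WebUI flow (illustrative, not the actual code).
    import asyncio
    import discord
    import requests
    import yaml

    with open("config.yml") as f:
        config = yaml.safe_load(f)  # discord_token, open_webui_url, ... (see Setup below)

    intents = discord.Intents.default()
    intents.message_content = True
    client = discord.Client(intents=intents)

    def ask_llm(prompt: str) -> str:
        # Assumes Open-WebUI exposes an OpenAI-compatible chat completions endpoint.
        resp = requests.post(
            f"{config['open_webui_url']}/api/chat/completions",
            headers={"Authorization": f"Bearer {config['open_webui_api_key']}"},
            json={"model": config["model_name"],
                  "messages": [{"role": "user", "content": prompt}]},
            timeout=120,
        )
        resp.raise_for_status()
        return resp.json()["choices"][0]["message"]["content"]

    @client.event
    async def on_message(message: discord.Message):
        # Only answer humans in whitelisted channels.
        if message.author.bot or message.channel.id not in config["whitelist_channels"]:
            return
        reply = await asyncio.to_thread(ask_llm, message.content)  # keep the event loop responsive
        await message.channel.send(reply[:2000])  # Discord caps messages at 2000 characters

    client.run(config["discord_token"])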

Getting Started

Prerequisites

  • Docker and Docker Compose
  • A running Open-WebUI instance with a loaded model.
  • A Discord Bot Token.

Setup

  1. Clone the repository:

    git clone https://gitea.zephyre.one/Pakobbix/SC-Discord-Bot.git
    cd SC-Discord-Bot
    
  2. Configure the bot:

    • Navigate to the discord_connector directory.
    • Copy example.config.yml to config.yml.
    • Edit config.yml with your details:
      • discord_token
      • whitelist_channels
      • open_webui_url and open_webui_api_key
      • model_name and any tools you have configured in Open-WebUI.
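    • Optionally, sanity-check the finished config.yml before starting the bot. The snippet below is only a sketch; the key names mirror the list above, so adjust them if example.config.yml uses different ones.

      # Verify that config.yml contains the keys the bot expects.
      import yaml

      with open("config.yml") as f:
          cfg = yaml.safe_load(f)

      for key in ("discord_token", "whitelist_channels",
                  "open_webui_url", "open_webui_api_key", "model_name"):
          print(f"{key:>20}: {'OK' if cfg.get(key) else 'MISSING'}")
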
  3. Populate Databases: Before running the bot, you may need to run the data sync scripts to populate the databases. These scripts are designed to be run within the bot's environment or a similar one with the required dependencies.

    # Run the data sync scripts
    python llm_tools/get_commodities.py
    python llm_tools/get_items.py
    python llm_tools/fleetyard.py
    

    Note: You may need to adjust paths or run these within the Docker container for them to access the correct database location.
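
    To confirm the sync worked, you can list which tables ended up in the local databases. This is only a sketch and assumes the files in databases/ use a .db extension; adjust the glob if the scripts write different filenames.

    # List every table the sync scripts created (quick sanity check).
    import sqlite3
    from pathlib import Path

    for db_file in sorted(Path("databases").glob("*.db")):
        con = sqlite3.connect(db_file)
        tables = [row[0] for row in con.execute(
            "SELECT name FROM sqlite_master WHERE type='table'")]
        print(f"{db_file.name}: {tables or 'no tables yet'}")
        con.close()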

  4. Run with Docker Compose: From the discord_connector directory, run:

    docker-compose up --build -d
    

License

This project is licensed under the Apache License 2.0. See the LICENSE file for details.
