# SC-Discord-Bot

This project provides a sophisticated Discord bot focused on the game *Star Citizen*. It leverages a Large Language Model (LLM) hosted via an [Open-WebUI](https://github.com/open-webui/open-webui) backend, enhanced with Retrieval-Augmented Generation (RAG) and custom tools to provide accurate, up-to-date information.

## Features

- **Discord Integration**: A simple and effective Discord bot that responds to user queries in whitelisted channels.
- **LLM-Powered**: Connects to any OpenAI-compatible API, allowing you to use a variety of powerful language models.
- **Retrieval-Augmented Generation (RAG)**: The bot's knowledge is supplemented by a collection of JSON files in [`llm_rag_knowledge/`](llm_rag_knowledge/) containing detailed information about in-game events, crafting, and vehicle data.
- **Custom Tools**: The LLM can invoke a suite of Python tools located in [`llm_tools/`](llm_tools/) to fetch dynamic data (a sketch of such a tool follows this list):
  - **Ship Information**: Get detailed ship specifications and compare vessels using data from `starcitizen.tools`.
  - **Price Lookups**: Query real-time prices for commodities and items from a local database synced with `uexcorp.space`.
  - **Fleet Ownership**: Check who in your organization owns which ship using data from `fleetyards.net`.
- **Data Persistence**: Utilizes SQLite databases in [`databases/`](databases/) to store and quickly access commodity, item, and fleet information.
- **Containerized**: Comes with a [`Dockerfile`](discord_connector/Dockerfile) and [`docker-compose.yaml`](discord_connector/docker-compose.yaml) for easy and consistent deployment.
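
The tools in [`llm_tools/`](llm_tools/) are Python functions that Open-WebUI makes available to the model. As a rough, hypothetical sketch of what a price-lookup tool might look like (the real scripts, database paths, and schema in this repository will differ), something along these lines queries the local SQLite database that `get_commodities.py` keeps in sync:

```python
# Hypothetical sketch of a price-lookup tool, in the spirit of llm_tools/.
# NOTE: the database path, table name, and column names are illustrative
# assumptions, not the project's real schema.
import sqlite3


def get_commodity_price(commodity_name: str, db_path: str = "databases/commodities.db") -> str:
    """Return a short, human-readable price summary for a commodity."""
    conn = sqlite3.connect(db_path)
    try:
        row = conn.execute(
            "SELECT name, buy_price, sell_price FROM commodities WHERE name LIKE ?",
            (f"%{commodity_name}%",),
        ).fetchone()
    finally:
        conn.close()

    if row is None:
        return f"No price data found for '{commodity_name}'."
    name, buy, sell = row
    return f"{name}: buy {buy} aUEC, sell {sell} aUEC (synced from uexcorp.space)."
```

In the actual project, lookups like this are registered as tools in Open-WebUI (see the Setup section), so the model can call them by name while answering a question.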
## Architecture

The system is composed of several key parts:

1. **Discord Connector**: The [`open-webui_to_discord.py`](discord_connector/open-webui_to_discord.py) script is the frontend, listening for user messages on Discord (a simplified sketch of this flow follows the list).
2. **Open-WebUI & LLM**: The bot forwards queries to an Open-WebUI instance, which in turn uses an LLM (e.g., Llama3, Mixtral) for inference.
3. **Knowledge Base & Tools**: The LLM's responses are enriched by:
   - The static JSON files in [`llm_rag_knowledge/`](llm_rag_knowledge/).
   - The dynamic Python scripts in [`llm_tools/`](llm_tools/).
4. **Data Sync Scripts**: The scripts [`get_commodities.py`](llm_tools/get_commodities.py), [`get_items.py`](llm_tools/get_items.py), and [`fleetyard.py`](llm_tools/fleetyard.py) run independently to keep the local SQLite databases current.
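
At its core, the round trip handled by the connector is: receive a message in a whitelisted channel, forward it to Open-WebUI's OpenAI-compatible chat endpoint, and post the answer back. The snippet below is a simplified sketch of that loop, not the actual [`open-webui_to_discord.py`](discord_connector/open-webui_to_discord.py); the endpoint path, payload shape, and all placeholder values are assumptions, and the real connector reads its settings from `config.yml`.

```python
# Simplified sketch of the Discord -> Open-WebUI round trip.
# NOTE: token, URL, API key, model name, and endpoint path are placeholders;
# the real connector loads them from config.yml and may use a different route.
import aiohttp
import discord

DISCORD_TOKEN = "your-discord-token"
OPEN_WEBUI_URL = "http://localhost:3000"           # assumed Open-WebUI base URL
OPEN_WEBUI_API_KEY = "your-open-webui-api-key"
MODEL_NAME = "llama3"                              # any model loaded in Open-WebUI
WHITELIST_CHANNELS = {123456789012345678}          # channel IDs the bot listens to

intents = discord.Intents.default()
intents.message_content = True
client = discord.Client(intents=intents)


@client.event
async def on_message(message: discord.Message) -> None:
    # Ignore the bot's own messages and non-whitelisted channels.
    if message.author == client.user or message.channel.id not in WHITELIST_CHANNELS:
        return

    payload = {
        "model": MODEL_NAME,
        "messages": [{"role": "user", "content": message.content}],
    }
    headers = {"Authorization": f"Bearer {OPEN_WEBUI_API_KEY}"}

    # Open-WebUI exposes an OpenAI-compatible chat endpoint; the exact path can
    # vary between versions, so treat this as an assumption.
    async with aiohttp.ClientSession() as session:
        async with session.post(
            f"{OPEN_WEBUI_URL}/api/chat/completions", json=payload, headers=headers
        ) as resp:
            data = await resp.json()

    reply = data["choices"][0]["message"]["content"]
    await message.channel.send(reply[:2000])  # Discord caps messages at 2000 chars


client.run(DISCORD_TOKEN)
```

The RAG knowledge and tool calls happen on the Open-WebUI side, so they do not appear in this sketch.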
## Getting Started

### Prerequisites

- Docker and Docker Compose
- A running Open-WebUI instance with a loaded model
- A Discord bot token
### Setup

1. **Clone the repository:**

   ```sh
   git clone https://gitea.zephyre.one/Pakobbix/SC-Discord-Bot.git
   cd SC-Discord-Bot
   ```

2. **Configure the bot:**
   - Navigate to the `discord_connector` directory.
   - Copy [`example.config.yml`](discord_connector/example.config.yml) to `config.yml`.
   - Edit `config.yml` with your details (a small validation sketch follows these steps):
     - `discord_token`
     - `whitelist_channels`
     - `open_webui_url` and `open-webui_api_key`
     - `model_name` and any `tools` you have configured in Open-WebUI.
3. **Populate Databases:**

   Before running the bot, you may need to run the data sync scripts to populate the databases. These scripts are designed to be run inside the bot's environment, or a similar one that has the required dependencies installed.

   ```sh
   # Run each sync script once to populate its database
   python llm_tools/get_commodities.py
   python llm_tools/get_items.py
   python llm_tools/fleetyard.py
   ```

   *Note: You may need to adjust paths or run these inside the Docker container so that they can access the correct database location.*

4. **Run with Docker Compose:**

   From the `discord_connector` directory, run:

   ```sh
   docker-compose up --build -d
   ```
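
After editing `config.yml` (step 2), a few lines of Python can confirm that the file parses and contains the expected keys. This is only an illustrative check, assuming PyYAML is available and that the keys are stored flat as listed above; the connector's own loader may differ.

```python
# Illustrative sanity check for config.yml, using the keys listed in step 2.
# NOTE: this is not the connector's own loader; key layout is an assumption.
import yaml

REQUIRED_KEYS = [
    "discord_token",
    "whitelist_channels",
    "open_webui_url",
    "open-webui_api_key",
    "model_name",
]

with open("discord_connector/config.yml", "r", encoding="utf-8") as fh:
    config = yaml.safe_load(fh)

missing = [key for key in REQUIRED_KEYS if key not in config]
if missing:
    raise SystemExit(f"config.yml is missing keys: {', '.join(missing)}")

print("config.yml looks complete.")
```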
## License

This project is licensed under the Apache License 2.0. See the [LICENSE](LICENSE) file for details.