# AI Collections Repository

> **A curated collection of prompts, tools, models, and configurations for AI projects.**

---

## Table of Contents

1. [Overview](#overview)
2. [Repository Structure](#repository-structure)
3. [Key Tools](#key-tools)
4. [How to Use](#how-to-use)
5. [Setup & Dependencies](#setup--dependencies)
6. [Contributing](#contributing)
7. [License](#license)
8. [Acknowledgements](#acknowledgements)

---

## Overview

This repository gathers a wide variety of assets that can help you build, experiment with, and deploy AI models, especially large language models (LLMs) and vision models. It is intentionally modular, so you can drop in only the parts you need:

- **Model configuration files** (YAML) for popular open‑source LLMs such as Qwen, Mistral, GLM, GPT‑OSS, etc.
- **Python tools** that interact with APIs, fetch data, and perform common tasks (e.g., transcript extraction, GPU monitoring, workflow switching).
- **Prompt libraries** for use with LangChain or any framework that supports prompt templates.
- **System prompts** that help define the behaviour of a virtual assistant.

Feel free to copy, adapt, or extend these resources for your own projects.

## Repository Structure

```
ai_collections/
├── Alibaba/
│   └── Qwen3/
│       └── 30B-A3B/
│           ├── Coder.yml
│           ├── Instruct-2507.yml
│           └── Thinking-2507.yml
├── ByteDance/
│   └── Seed-OSS/
├── images/
├── Knowledge/
├── Mistral/
│   ├── Magistral-Small/
│   │   └── 1_2_2509.yml
│   └── Mistral-Small/
│       └── 3_2_2506.yml
├── OpenAI/
│   └── GPT-OSS/
│       └── 20B.yml
├── self_created/
│   ├── System Prompts/
│   │   ├── ARIA.md
│   │   └── star_citizen_answer_bot.md
│   └── Tools/
│       ├── article_summarizer.py
│       ├── comfyUI_Workflow_switch.py
│       ├── gitea_management.py
│       ├── memory.py
│       ├── nvidia_gpu_information.py
│       ├── proxmox_management.py
│       ├── star_citizen_informations.py
│       ├── tautulli_informations.py
│       ├── weather_forecast.py
│       └── youtube_summarizer.py
├── Z_AI/
│   └── GLM/
│       ├── GLM-414-32B.yml
│       └── GLM-Z1-414-32B.yml
├── LICENSE
└── README.md
```

> **Note:** The YAML files are configuration snippets that can be dropped into a LangChain `Config` or used with any LLM framework that accepts a YAML config.

## Key Tools

Below are the most frequently used tools in the *Tools* folder. Each tool follows a consistent interface: a class named `Tools` with an async method that emits status events. They are designed to work inside the LangChain framework, but you can adapt them for other ecosystems.

| Tool | Purpose | Key Features |
|------|---------|--------------|
| `article_summarizer.py` | Summarise web articles or PDFs. | Uses `BeautifulSoup` + an LLM for summarisation. |
| `comfyUI_Workflow_switch.py` | Dynamically switch ComfyUI workflows. | Works with the local ComfyUI API. |
| `gitea_management.py` | Create and manage Gitea repositories. | Supports repo creation, issue management, etc. |
| `memory.py` | Persistent memory store for LangChain. | Simple key‑value store backed by SQLite. |
| `nvidia_gpu_information.py` | Fetch GPU usage statistics. | Utilises `pynvml` to report memory & utilisation. |
| `proxmox_management.py` | Control Proxmox VMs via API. | Start/stop, snapshot, etc. |
| `star_citizen_informations.py` | Gather Star Citizen game data. | Retrieves server status and player counts. |
| `tautulli_informations.py` | Monitor Plex via the Tautulli API. | Returns user activity and media stats. |
| `weather_forecast.py` | Simple weather look‑ups via OpenWeatherMap. | Returns current temp, humidity, etc. |
| `youtube_summarizer.py` | Retrieve transcript and title of a YouTube video. | Uses `langchain_community.document_loaders.YoutubeLoader`. |
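
The shared interface described above can be sketched as follows. This is an illustrative example only, not code from the repository: the method name `run` and the `__event_emitter__` callback parameter are assumptions about how a status-emitting tool might look.

```python
import asyncio


class Tools:
    """Illustrative sketch of the shared tool interface; names are hypothetical."""

    async def run(self, query: str, __event_emitter__=None) -> str:
        # Report progress to whoever is listening (a UI, a logger, ...).
        if __event_emitter__:
            await __event_emitter__(
                {"type": "status", "data": {"description": "Fetching...", "done": False}}
            )
        result = f"Result for: {query}"  # a real tool would call an external API here
        if __event_emitter__:
            await __event_emitter__(
                {"type": "status", "data": {"description": "Done", "done": True}}
            )
        return result
```

Because the method is async, callers run it inside an event loop (e.g. `asyncio.run(...)`) and can pass any coroutine as the emitter to capture the status events.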
## How to Use

1. **Clone the repository**

   ```bash
   git clone https://github.com/your-username/ai_collections.git
   ```

2. **Create a Python virtual environment** (recommended)

   ```bash
   python -m venv venv
   source venv/bin/activate      # Linux/macOS
   .\venv\Scripts\activate       # Windows
   ```

3. **Install dependencies**

   ```bash
   pip install -r requirements.txt
   ```

   *If a `requirements.txt` does not exist, install the needed packages manually, e.g.*

   ```bash
   pip install langchain langchain-community beautifulsoup4 requests pynvml
   ```

4. **Import a tool** in your project:

   ```python
   from ai_collections.self_created.Tools.youtube_summarizer import Tools

   tool = Tools()
   # Use the async method inside an async context
   ```

5. **Drop a YAML config** into your LangChain setup:

   ```yaml
   # Example: loading a Qwen3 configuration
   model_name: qwen3-30B-A3B
   config_file: Alibaba/Qwen3/30B-A3B/Coder.yml
   ```
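
In Python, snippets like this are normally read with PyYAML's `yaml.safe_load`. As a dependency-free illustration (the flat `key: value` shape is an assumption based on the example above, not a fixed schema), a minimal parser looks like this:

```python
def parse_flat_yaml(text: str) -> dict:
    """Parse flat ``key: value`` YAML snippets like the example above.

    Sketch for illustration only; real configs should use PyYAML's
    yaml.safe_load(), which handles nesting, quoting, and types.
    """
    cfg = {}
    for line in text.splitlines():
        line = line.split("#", 1)[0].strip()  # drop comments and whitespace
        if not line:
            continue
        key, _, value = line.partition(":")  # split on the first colon only
        cfg[key.strip()] = value.strip()
    return cfg


snippet = """
# Example: loading a Qwen3 configuration
model_name: qwen3-30B-A3B
config_file: Alibaba/Qwen3/30B-A3B/Coder.yml
"""
config = parse_flat_yaml(snippet)
```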

> **Tip:** Most tools emit a status event dictionary. You can hook into these events to update a UI, log progress, or trigger downstream actions.

## Setup & Dependencies

The repository is intentionally lightweight. Core dependencies are:

- `langchain` and `langchain-community`
- `beautifulsoup4`
- `requests`
- `pynvml` (for GPU stats)

Create a `requirements.txt` with:

```
langchain
langchain-community
beautifulsoup4
requests
pynvml
```


## Contributing

Feel free to submit pull requests! Please follow these guidelines:

1. Add tests for any new functionality.
2. Keep the code style consistent with the rest of the repository.
3. Document any new tool or model configuration.

## License

This repository is licensed under the [MIT License](LICENSE).

## Acknowledgements

- **LangChain** – for the modular prompt framework.
- **OpenAI** – for the GPT‑OSS models.
- **Alibaba** – for the Qwen open‑source models.
- **Mistral AI** – for the small‑size Mistral variants.
- **Z_AI** – for the GLM configurations.

---

> **Contact** – If you have questions, open an issue or email the repository maintainer.