# AI Collections Repository
A curated collection of prompts, tools, models, and configurations for AI projects.
## Table of Contents
- Overview
- Repository Structure
- Key Tools
- How to Use
- Setup & Dependencies
- Contributing
- License
- Acknowledgements
## Overview
This repository gathers a wide variety of assets that can help you build, experiment with, and deploy AI models, especially large language models (LLMs) and vision models. It is intentionally modular so you can drop in only the parts you need:
- Model configuration files (YAML) for popular open‑source LLMs such as Qwen, Mistral, GLM, GPT‑OSS, etc.
- Python tools that interact with APIs, fetch data, and perform common tasks (e.g., transcript extraction, GPU monitoring, workflow switching).
- Prompt libraries for use with LangChain or any framework that supports prompt templates.
- System prompts that help define the behaviour of a virtual assistant.
Feel free to copy, adapt, or extend these resources for your own projects.
## Repository Structure

```
ai_collections/
├── Alibaba/
│   └── Qwen3/
│       └── 30B-A3B/
│           ├── Coder.yml
│           ├── Instruct-2507.yml
│           └── Thinking-2507.yml
├── ByteDance/
│   └── Seed-OSS/
├── images/
├── Knowledge/
├── Mistral/
│   ├── Magistral-Small/
│   │   └── 1_2_2509.yml
│   └── Mistral-Small/
│       └── 3_2_2506.yml
├── OpenAI/
│   └── GPT-OSS/
│       └── 20B.yml
├── self_created/
│   ├── System Prompts/
│   │   ├── ARIA.md
│   │   └── star_citizen_answer_bot.md
│   └── Tools/
│       ├── article_summarizer.py
│       ├── comfyUI_Workflow_switch.py
│       ├── gitea_management.py
│       ├── memory.py
│       ├── nvidia_gpu_information.py
│       ├── proxmox_management.py
│       ├── star_citizen_informations.py
│       ├── tautulli_informations.py
│       ├── weather_forecast.py
│       └── youtube_summarizer.py
├── Z_AI/
│   └── GLM/
│       ├── GLM-414-32B.yml
│       └── GLM-Z1-414-32B.yml
├── LICENSE
└── README.md
```
Note: The YAML files are configuration snippets that can be dropped into a LangChain config or used with any LLM framework that accepts a YAML config.
## Key Tools

Below are the most frequently used tools in the `Tools` folder. Each tool follows a consistent interface: a class named `Tools` with an async method that emits status events. They are designed to work inside the LangChain framework, but you can adapt them for other ecosystems.
| Tool | Purpose | Key Features |
|---|---|---|
| `article_summarizer.py` | Summarise web articles or PDFs. | Uses BeautifulSoup + an LLM for summarisation. |
| `comfyUI_Workflow_switch.py` | Dynamically switch ComfyUI workflows. | Works with the local ComfyUI API. |
| `gitea_management.py` | Create and manage Gitea repositories. | Supports repo creation, issue management, etc. |
| `memory.py` | Persistent memory store for LangChain. | Simple key‑value store backed by SQLite. |
| `nvidia_gpu_information.py` | Fetch GPU usage statistics. | Uses pynvml to report memory & utilisation. |
| `proxmox_management.py` | Control Proxmox VMs via API. | Start/stop, snapshot, etc. |
| `star_citizen_informations.py` | Gather Star Citizen game data. | Retrieves server status and player counts. |
| `tautulli_informations.py` | Monitor Plex via the Tautulli API. | Returns user activity and media stats. |
| `weather_forecast.py` | Simple weather look‑ups via OpenWeatherMap. | Returns current temperature, humidity, etc. |
| `youtube_summarizer.py` | Retrieve the transcript and title of a YouTube video. | Uses `langchain_community.document_loaders.YoutubeLoader`. |
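The shared interface described above (a `Tools` class with an async method that emits status events) can be sketched as follows. The method name, parameters, and event shape here are illustrative assumptions, not the repository's actual code:

```python
import asyncio
from typing import Awaitable, Callable

class Tools:
    """Skeleton of the shared tool interface (illustrative sketch)."""

    async def run(
        self,
        query: str,
        event_emitter: Callable[[dict], Awaitable[None]],
    ) -> str:
        # Tools report progress by emitting status-event dictionaries.
        await event_emitter({"type": "status", "data": {"description": "working", "done": False}})
        result = f"summary of {query}"  # placeholder for the tool's real work
        await event_emitter({"type": "status", "data": {"description": "finished", "done": True}})
        return result

events: list[dict] = []

async def collect(event: dict) -> None:
    """Minimal event sink; a UI could update a progress bar here instead."""
    events.append(event)

result = asyncio.run(Tools().run("an article", collect))
print(result)       # → summary of an article
print(len(events))  # → 2
```

Because the emitter is just an awaitable callable, the same tool can feed a logger, a progress bar, or a test harness without any changes to the tool itself.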
## How to Use

1. **Clone the repository**

   ```shell
   git clone https://github.com/your-username/ai_collections.git
   ```

2. **Create a Python virtual environment (recommended)**

   ```shell
   python -m venv venv
   .\venv\Scripts\activate
   ```

3. **Install dependencies**

   ```shell
   pip install -r requirements.txt
   ```

   If a `requirements.txt` does not exist, install the needed packages manually, e.g.:

   ```shell
   pip install langchain langchain-community beautifulsoup4 requests pynvml
   ```

4. **Import a tool in your project**

   ```python
   from ai_collections.self_created.Tools.youtube_summarizer import Tools

   tool = Tools()
   # Use the async method inside an async context
   ```

5. **Drop a YAML config into your LangChain setup**

   ```yaml
   # Example: loading a Qwen3 configuration
   model_name: qwen3-30B-A3B
   config_file: Alibaba/Qwen3/30B-A3B/Coder.yml
   ```
Tip: Most tools emit a status event dictionary. You can hook into these events to update a UI, log progress, or trigger downstream actions.
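As an illustration of such a hook (the event keys used below are assumptions, since the repository does not specify an exact event schema), a coroutine that logs status events could look like this:

```python
import asyncio
import logging

logging.basicConfig(level=logging.INFO, format="%(levelname)s %(message)s")
log = logging.getLogger("tools")

def format_status(event: dict) -> str:
    """Turn a status-event dictionary into a one-line log message."""
    data = event.get("data", {})
    return f"{data.get('description')} (done={data.get('done')})"

async def on_event(event: dict) -> None:
    """Pass this coroutine to a tool as its event emitter to log progress."""
    if event.get("type") == "status":
        log.info(format_status(event))

# Example: feed a sample event through the hook.
asyncio.run(on_event({"type": "status", "data": {"description": "working", "done": False}}))
```

The same pattern works for any downstream action: swap the `log.info` call for a websocket push, a database write, or a callback into your UI framework.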
## Setup & Dependencies

The repository is intentionally lightweight. Core dependencies are:

- `langchain` and `langchain-community`
- `beautifulsoup4`
- `requests`
- `pynvml` (for GPU stats)

Create a `requirements.txt` listing these packages, then install them:

```shell
pip install -r requirements.txt
```
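Based only on the packages listed above, a minimal `requirements.txt` might look like this (unpinned versions; pin them if your project needs reproducible builds):

```text
langchain
langchain-community
beautifulsoup4
requests
pynvml
```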
## Contributing
Feel free to submit pull requests! Please follow these guidelines:
- Add tests for any new functionality.
- Keep the code style consistent with the rest of the repository.
- Document any new tool or model configuration.
## License
This repository is licensed under the MIT License.
## Acknowledgements
- LangChain – for the modular prompt framework.
- OpenAI – for the GPT‑OSS models.
- Alibaba – for Qwen open‑source models.
- Mistral AI – for the small‑size Mistral variants.
- Z_AI – for the GLM configurations.
**Contact** – If you have questions, open an issue or email the repository maintainer.