AI Collections Repository

A curated collection of prompts, tools, models, and configurations for AI projects.


Table of Contents

  1. Overview
  2. Repository Structure
  3. Key Tools
  4. How to Use
  5. Setup & Dependencies
  6. Contributing
  7. License
  8. Acknowledgements

Overview

This repository gathers a wide variety of assets that can help you build, experiment with, and deploy AI models, especially large language models (LLMs) and vision models. It is intentionally modular so you can drop in only the parts you need:

  • Model configuration files (YAML) for popular open-source LLMs such as Qwen, Mistral, GLM, GPT-OSS, etc.
  • Python tools that interact with APIs, fetch data, and perform common tasks (e.g., transcript extraction, GPU monitoring, workflow switching).
  • Prompt libraries for use with LangChain or any framework that supports prompt templates.
  • System prompts that help define the behaviour of a virtual assistant.

Feel free to copy, adapt, or extend these resources for your own projects.

Repository Structure

ai_collections/
├── Alibaba/
│   └── Qwen3/
│       └── 30B-A3B/
│           ├── Coder.yml
│           ├── Instruct-2507.yml
│           └── Thinking-2507.yml
├── ByteDance/
│   └── Seed-OSS/
├── images/
├── Knowledge/
├── Mistral/
│   ├── Magistral-Small/
│   │   └── 1_2_2509.yml
│   └── Mistral-Small/
│       └── 3_2_2506.yml
├── OpenAI/
│   └── GPT-OSS/
│       └── 20B.yml
├── self_created/
│   ├── System Prompts/
│   │   ├── ARIA.md
│   │   └── star_citizen_answer_bot.md
│   └── Tools/
│       ├── article_summarizer.py
│       ├── comfyUI_Workflow_switch.py
│       ├── gitea_management.py
│       ├── memory.py
│       ├── nvidia_gpu_information.py
│       ├── proxmox_management.py
│       ├── star_citizen_informations.py
│       ├── tautulli_informations.py
│       ├── weather_forecast.py
│       └── youtube_summarizer.py
├── Z_AI/
│   └── GLM/
│       ├── GLM-414-32B.yml
│       └── GLM-Z1-414-32B.yml
├── LICENSE
└── README.md

Note: The YAML files are configuration snippets that can be dropped into a LangChain setup or used with any LLM framework that accepts YAML configuration.
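For illustration, a snippet in the style of these YAML files can be loaded with PyYAML and handed to whatever framework you use. The keys below are hypothetical stand-ins, not copied from the actual config files:

```python
import yaml

# Hypothetical config in the style of the .yml files in this repo;
# the real files may use different keys.
raw = """
model_name: qwen3-30B-A3B
parameters:
  temperature: 0.7
  max_tokens: 4096
"""

config = yaml.safe_load(raw)
print(config["model_name"])                  # qwen3-30B-A3B
print(config["parameters"]["temperature"])   # 0.7
```

From here, `config` is an ordinary Python dict you can pass to your model-loading code.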

Key Tools

Below are the most frequently used tools in the Tools folder. Each tool follows a consistent interface: a class named Tools with an async method that emits status events. They are designed to work inside the LangChain framework, but you can adapt them for other ecosystems.
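The shared interface described above can be sketched roughly as follows. The method name, event fields, and placeholder work are illustrative assumptions for this sketch, not the exact signatures used by the tools in this repository:

```python
import asyncio
from typing import AsyncIterator


class Tools:
    """Sketch of the common tool shape: one class named Tools with an
    async method that emits status events while the work progresses."""

    async def run(self, url: str) -> AsyncIterator[dict]:
        # Emit a status event before starting the (placeholder) work.
        yield {"type": "status", "data": {"description": f"Fetching {url}", "done": False}}
        await asyncio.sleep(0)  # stand-in for a real network or API call
        # Emit a final event signalling completion.
        yield {"type": "status", "data": {"description": "Finished", "done": True}}


async def main() -> list[dict]:
    # Collect every status event the tool emits.
    events = []
    async for event in Tools().run("https://example.com"):
        events.append(event)
    return events
```

Run it with `asyncio.run(main())`; each emitted dictionary can be forwarded to a UI or logger.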

  • article_summarizer.py: Summarises web articles or PDFs using BeautifulSoup + an LLM.
  • comfyUI_Workflow_switch.py: Dynamically switches ComfyUI workflows via the local ComfyUI API.
  • gitea_management.py: Creates and manages Gitea repositories; supports repo creation, issue management, and more.
  • memory.py: Persistent memory store for LangChain; a simple key-value store backed by SQLite.
  • nvidia_gpu_information.py: Fetches GPU usage statistics; uses pynvml to report memory and utilisation.
  • proxmox_management.py: Controls Proxmox VMs via the API; start/stop, snapshots, and more.
  • star_citizen_informations.py: Gathers Star Citizen game data such as server status and player counts.
  • tautulli_informations.py: Monitors Plex via the Tautulli API; returns user activity and media stats.
  • weather_forecast.py: Simple weather lookups via OpenWeatherMap; returns current temperature, humidity, etc.
  • youtube_summarizer.py: Retrieves the transcript and title of a YouTube video using langchain_community.document_loaders.YoutubeLoader.

How to Use

  1. Clone the repository
    git clone https://github.com/your-username/ai_collections.git
    
  2. Create a Python virtual environment (recommended)
    python -m venv venv
    source venv/bin/activate    (Linux/macOS)
    .\venv\Scripts\activate     (Windows)
    
  3. Install dependencies
    pip install -r requirements.txt
    
    If a requirements.txt does not exist, install the needed packages manually, e.g.
    pip install langchain langchain-community beautifulsoup4 requests pynvml
    
  4. Import a tool in your project:
    from ai_collections.self_created.Tools.youtube_summarizer import Tools
    tool = Tools()
    # Use the async method inside an async context
    
  5. Drop a YAML config into your LangChain setup:
    # Example: loading a Qwen3 configuration
    model_name: qwen3-30B-A3B
    config_file: Alibaba/Qwen3/30B-A3B/Coder.yml
    

Tip: Most tools emit a status event dictionary. You can hook into these events to update a UI, log progress, or trigger downstream actions.
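Hooking into those status events might look like the sketch below. The event shape and the callback-based wiring are assumptions for illustration; the real tools may expose their events differently:

```python
import asyncio

log: list[str] = []


async def emit(event: dict) -> None:
    # Hook point: update a UI, write to a logger, or trigger follow-up actions.
    log.append(event["data"]["description"])


# A stand-in tool: the real tools emit similar status dictionaries.
async def fake_tool(emit):
    await emit({"type": "status", "data": {"description": "working", "done": False}})
    await emit({"type": "status", "data": {"description": "done", "done": True}})


asyncio.run(fake_tool(emit))
print(log)  # ['working', 'done']
```

Swapping `fake_tool` for a real tool keeps the hook unchanged, since only the emitted dictionaries matter.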

Setup & Dependencies

The repository is intentionally lightweight. Core dependencies are:

  • langchain and langchain-community
  • beautifulsoup4
  • requests
  • pynvml (for GPU stats)

Create a requirements.txt with:

langchain
langchain-community
beautifulsoup4
requests
pynvml

then install it with pip install -r requirements.txt.

Contributing

Feel free to submit pull requests! Please follow these guidelines:

  1. Add tests for any new functionality.
  2. Keep the code style consistent with the rest of the repository.
  3. Document any new tool or model configuration.

License

This repository is licensed under the MIT License.

Acknowledgements

  • LangChain for the modular prompt framework.
  • OpenAI for the GPT-OSS models.
  • Alibaba for the open-source Qwen models.
  • Mistral AI for the small-sized Mistral variants.
  • Z_AI for the GLM configurations.

Contact

If you have questions, open an issue or email the repository maintainer.
