LLM inference in C/C++
An ecosystem of Rust libraries for working with large language models
Stable Diffusion in pure C/C++
Replace OpenAI GPT with another LLM in your app by changing a single line...
Calculate token/s & GPU memory requirement for any LLM. Supports llama....
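The memory estimate such a tool produces follows from a simple rule: weight memory is roughly parameter count times bits per parameter. A minimal sketch (not taken from the listed project; the function name and the 7B example are illustrative assumptions):

```python
# Hedged sketch: estimate GPU memory needed just to hold an LLM's weights,
# given parameter count and quantization bit width. Excludes KV cache,
# activations, and runtime overhead, so real usage will be higher.
def weight_memory_gb(num_params: float, bits_per_param: int) -> float:
    """Approximate weight memory in GiB."""
    bytes_total = num_params * bits_per_param / 8
    return bytes_total / 1024**3

# A hypothetical 7B-parameter model at different precisions:
print(weight_memory_gb(7e9, 16))  # fp16
print(weight_memory_gb(7e9, 4))   # 4-bit quantized
```

This is why 4-bit quantization (as in several projects above) lets a model that needs a data-center GPU at fp16 fit on consumer hardware: the weight footprint shrinks by a factor of four.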
Run inference on MPT-30B using CPU
Port of MiniGPT4 in C++ (4-bit, 5-bit, 6-bit, 8-bit, 16-bit CPU inference with...
Self-evaluating interview for AI coders
Large Language Models for All, 🦙 Cult and More, Stay in touch!
Build LLM apps safely and securely 🛡️
WIP library for text-to-speech from Suno AI's Bark in C/C++ for fast inference
General AI library for Dart & Flutter
Chat with your data privately using MPT-30B