✨ Kubectl plugin to create manifests with LLMs
Like grep but for natural language questions. Based on Mistral 7B or Mixtral 8x7B.
🏗️ Fine-tune, build, and deploy open-source LLMs easily!
KVQuant: Towards 10 Million Context Length LLM Inference with KV Cache Quantization
AubAI brings you on-device gen-AI capabilities, including offline text generation and more, directly within your app.
Social and customizable AI writing assistant! ✍️
Use your open-source local model from the terminal.
Run GGUF LLM models in the latest version of TextGen-webui.
Local AI search assistant (web or CLI) for Ollama and llama.cpp. Lightweight and easy to run, providing a Perplexity-like experience.
Thunderbird extension that summarizes received emails via a locally run LLM. Early development.