✨ Kubectl plugin to create manifests with LLMs
Updated Dec 16, 2024 - Go
The easiest way to use the Ollama API in .NET
🏗️ Fine-tune, build, and deploy open-source LLMs easily!
Like grep but for natural language questions. Based on Mistral 7B or Mixtral 8x7B.
[NeurIPS 2024] KVQuant: Towards 10 Million Context Length LLM Inference with KV Cache Quantization
AubAI brings you on-device gen-AI capabilities, including offline text generation and more, directly within your app.
Social and customizable AI writing assistant! ✍️
LLM RAG application with cross-encoder re-ranking for YouTube videos 🎥
Full featured demo application for OllamaSharp
Use your local open-source model from the terminal
Run GGUF LLM models in the latest version of TextGen-webui
Copilot hack for running a local Copilot without auth, via proxying
Thunderbird mail client extension that summarizes received emails via a locally run LLM. Early development.
Local AI search assistant (web or CLI) for Ollama and llama.cpp. Lightweight and easy to run, providing a Perplexity-like experience.