68 packages tagged with “LLama”
The easiest way to use the Ollama API in .NET
LLamaSharp is a cross-platform library to run 🦙LLaMA/Mtmd models (and others) on your local device. Based on [llama.cpp](https://github.com/ggerganov/llama.cpp), inference with LLamaSharp is efficient on both CPU and GPU. With its higher-level APIs and RAG support, it is convenient to deploy LLMs (Large Language Models) in your application with LLamaSharp.
LLamaSharp.Backend.Cpu is a backend for LLamaSharp that uses the CPU only.
LLamaSharp.Backend.Cuda12 is a backend for LLamaSharp to use with Cuda12.
Provide access to OpenAI LLM models in Kernel Memory to generate text
LLamaSharp Chat model provider.
LLamaSharp.Backend.Cuda12.Windows contains the Windows binaries for LLamaSharp with Cuda12 support.
The integration of LLamaSharp and Microsoft semantic-kernel.
LLamaSharp.Backend.Cuda12.Linux contains the Linux binaries for LLamaSharp with Cuda12 support.
Cuda12 backend for LM-Kit.NET (Windows)
The integration of LLamaSharp and Microsoft kernel-memory. It makes it easy to add document search to LLamaSharp model inference.
Cuda12 backend for LM-Kit.NET (Linux)
LLamaSharp.Backend.Cuda11 is a backend for LLamaSharp to use with Cuda11.
Auto-generated .NET bindings for Llama.cpp.
Cloud-based exception logging through www.llamalogger.com
LLamaSharp.Backend.Cuda11.Windows contains the Windows binaries for LLamaSharp with Cuda11 support.
LLamaSharp.Backend.Cuda11.Linux contains the Linux binaries for LLamaSharp with Cuda11 support.
Enterprise-Grade .NET SDK for Integrating Generative AI Capabilities.
LLamaSharp.Backend.Vulkan is a backend for LLamaSharp to use with Vulkan.
LLamaSharp.Backend.Vulkan.Linux contains the Linux binaries for LLamaSharp with Vulkan support.
LLamaSharp.Backend.Vulkan.Windows contains the Windows binaries for LLamaSharp with Vulkan support.
Universe.LLaMa.API: an API for LLaMA models
Cuda12 backend dependencies for LM-Kit.NET (Windows)
Serilog integration for cloud-based exception logging through www.llamalogger.com
Cuda12 backend for LM-Kit.NET on Linux ARM64
Use tools from model context protocol (MCP) servers with Ollama
Microsoft.Extensions.Logging integration for cloud-based exception logging through www.llamalogger.com
GroqSharp is a C# client library that makes it easy to interact with GroqCloud. It's designed to provide a simple and flexible interface, allowing you to seamlessly integrate the Groq service into your C# applications.
LLamaSharp.Backend.MacMetal is a backend for LLamaSharp for macOS with GPU (Metal) support.
Groq provider for NovaCore.AgentKit (OpenAI-compatible)
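The LLamaSharp entry above (local inference over a llama.cpp backend) can be sketched roughly as follows. This is a minimal, non-authoritative sketch: it assumes a local GGUF model file at a hypothetical path, the `LLamaSharp` package plus one backend package (e.g. `LLamaSharp.Backend.Cpu`) are installed, and uses API names (`ModelParams`, `LLamaWeights`, `InteractiveExecutor`, `ChatSession`) as they appear in recent LLamaSharp versions — check the package documentation for the exact signatures in your version.

```csharp
using LLama;
using LLama.Common;

// Hypothetical path to a locally downloaded GGUF model file.
var parameters = new ModelParams("models/llama-3-8b.gguf")
{
    ContextSize = 2048
};

// Load the weights once, then create an inference context from them.
using var weights = LLamaWeights.LoadFromFile(parameters);
using var context = weights.CreateContext(parameters);

// A chat session wraps an executor and keeps the conversation history.
var session = new ChatSession(new InteractiveExecutor(context));

// Stream the model's reply token by token.
await foreach (var token in session.ChatAsync(
    new ChatHistory.Message(AuthorRole.User, "Hello!"),
    new InferenceParams { MaxTokens = 64 }))
{
    Console.Write(token);
}
```

The backend split in the listing (Cpu, Cuda11/Cuda12, Vulkan, MacMetal) only changes which native llama.cpp binaries are pulled in; the managed code above stays the same regardless of backend.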