661 packages tagged with “llm”
.NET SDK for the Model Context Protocol (MCP) with hosting and dependency injection extensions.
Core .NET SDK for the Model Context Protocol (MCP)
ASP.NET Core extensions for the C# Model Context Protocol (MCP) SDK.
Open source LLM application framework for building scalable, flexible, and robust AI systems.
The easiest way to use the Ollama API in .NET
LLamaSharp is a cross-platform library to run 🦙LLaMA/Mtmd models (and others) on your local device. Based on [llama.cpp](https://github.com/ggerganov/llama.cpp), inference with LLamaSharp is efficient on both CPU and GPU. With its higher-level APIs and RAG support, LLamaSharp makes it convenient to deploy LLMs (Large Language Models) in your application.
Fast and memory-efficient WordPiece tokenizer, as used by BERT and other models. Tokenizes text for further processing with NLP/language models.
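The WordPiece scheme named in the entry above can be sketched as greedy longest-match-first lookup against a vocabulary; this is a minimal illustration of the algorithm, not the package's actual API, and the tiny vocabulary here is hypothetical:

```csharp
using System;
using System.Collections.Generic;

class WordPieceSketch
{
    // Hypothetical toy vocabulary; "##" marks a continuation subword.
    static readonly HashSet<string> Vocab = new HashSet<string>
    {
        "un", "##aff", "##able", "play", "##ing"
    };

    public static List<string> Tokenize(string word)
    {
        var tokens = new List<string>();
        int start = 0;
        while (start < word.Length)
        {
            int end = word.Length;
            string match = null;
            // Try the longest substring first, shrinking until a vocab hit.
            while (start < end)
            {
                string sub = word.Substring(start, end - start);
                if (start > 0) sub = "##" + sub; // continuation prefix
                if (Vocab.Contains(sub)) { match = sub; break; }
                end--;
            }
            if (match == null) return new List<string> { "[UNK]" };
            tokens.Add(match);
            start = end;
        }
        return tokens;
    }

    static void Main()
    {
        // "unaffable" splits into un ##aff ##able
        Console.WriteLine(string.Join(" ", Tokenize("unaffable")));
    }
}
```

Greedy longest-match keeps common stems whole while still covering rare words via subword pieces, which is why BERT-style models use it.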
LLamaSharp.Backend.Cpu is a backend for LLamaSharp for CPU-only use.
LLamaSharp.Backend.Cuda12 is a backend for LLamaSharp to use with Cuda12.
ChatGPT.Net is a C# library that allows developers to access ChatGPT, a chat-based large language model, through the official OpenAI API. With this API, developers can send queries to ChatGPT and receive responses in real time, making it easy to integrate ChatGPT into their own applications.
FoundationaLLM.Common is a .NET library that the FoundationaLLM.Client.Core and FoundationaLLM.Client.Management client libraries share as a common dependency.
LLamaSharp.Backend.Cuda12.Windows contains the Windows binaries for LLamaSharp with Cuda12 support.
LLamaSharp.Backend.Cuda12.Linux contains the Linux binaries for LLamaSharp with Cuda12 support.
The integration of LLamaSharp and Microsoft semantic-kernel.
FoundationaLLM.Configuration is a .NET library for the Configuration resource provider. The resource provider provides a consistent way to manage and access configuration settings across a FoundationaLLM instance.
The integration of LLamaSharp and Microsoft kernel-memory. It makes it easy to add document search to LLamaSharp model inference.
FoundationaLLM.Client.Management is a .NET library that simplifies integrating the FoundationaLLM Management API into your projects and data workflows.
Cuda12 backend for LM-Kit.NET (Windows)
FoundationaLLM.Client.Core is a .NET library that simplifies integrating the FoundationaLLM Core API into your projects and data workflows.
Cuda12 backend for LM-Kit.NET (Linux)
A library for slicing text data into smaller chunks while attempting to preserve context.
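The chunking described in the entry above can be sketched as fixed-size windows with overlap, so adjacent chunks share context; this is a minimal illustration of the technique under assumed parameters (word-based windows, names invented here), not the package's actual API:

```csharp
using System;
using System.Collections.Generic;

class ChunkSketch
{
    // Split text on whitespace and emit overlapping windows of words.
    // chunkWords and overlapWords are illustrative parameter names.
    public static List<string> Chunk(string text, int chunkWords, int overlapWords)
    {
        var words = text.Split((char[])null, StringSplitOptions.RemoveEmptyEntries);
        var chunks = new List<string>();
        int step = chunkWords - overlapWords; // advance leaves `overlapWords` shared
        for (int i = 0; i < words.Length; i += step)
        {
            int count = Math.Min(chunkWords, words.Length - i);
            chunks.Add(string.Join(" ", words, i, count));
            if (i + count >= words.Length) break; // last window reached the end
        }
        return chunks;
    }

    static void Main()
    {
        foreach (var c in Chunk("a b c d e f g", 4, 1))
            Console.WriteLine(c);
    }
}
```

The overlap is what "attempting to preserve context" refers to: a sentence cut at a chunk boundary still appears, at least partially, in both neighbors.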
FoundationaLLM.DataPipelinePlugin contains the standard Data Pipeline plugins provided by FoundationaLLM.
LLamaSharp.Backend.Cuda11 is a backend for LLamaSharp to use with Cuda11.
.NET library for the Model Context Protocol (MCP)
FoundationaLLM.DataSource is a .NET library for the DataSource resource provider. The resource provider manages data sources across a FoundationaLLM instance.
FoundationaLLM.Plugin is a .NET library for the Plugin resource provider. The resource provider manages plugins across a FoundationaLLM instance.
FoundationaLLM.Prompt is a .NET library for the Prompt resource provider. The resource provider manages prompts across a FoundationaLLM instance.
A powerful list view with swipe, pull-to-refresh, and load-more (supports grouped lists).
A simple Semantic Kernel library for .NET applications
Auto-generated .NET bindings for Llama.cpp.