17 packages tagged with “LLamaSharp”
LLamaSharp.Backend.Cpu is the CPU-only backend for LLamaSharp.
LLamaSharp.Backend.Cuda12 is the CUDA 12 backend for LLamaSharp.
LLamaSharp Chat model provider.
LLamaSharp.Backend.Cuda12.Windows contains the Windows binaries for LLamaSharp with CUDA 12 support.
LLamaSharp.Backend.Cuda12.Linux contains the Linux binaries for LLamaSharp with CUDA 12 support.
LLamaSharp.Backend.Cuda11 is the CUDA 11 backend for LLamaSharp.
LLamaSharp.Backend.Cuda11.Windows contains the Windows binaries for LLamaSharp with CUDA 11 support.
LLamaSharp.Backend.Cuda11.Linux contains the Linux binaries for LLamaSharp with CUDA 11 support.
LLamaSharp.Backend.Vulkan is the Vulkan backend for LLamaSharp.
LLamaSharp.Backend.Vulkan.Windows contains the Windows binaries for LLamaSharp with Vulkan support.
LLamaSharp.Backend.Vulkan.Linux contains the Linux binaries for LLamaSharp with Vulkan support.
LLamaSharp.Backend.MacMetal is the Metal backend for LLamaSharp, providing GPU support on macOS.
LLamaSharp.Backend.OpenCL is the OpenCL backend for LLamaSharp.
LLamaSharp Integration Library for .NET Core: enhance your .NET Core applications with seamless integration of LLamaSharp models and contexts. The library provides .NET Core services for interfacing with LLamaSharp models, letting you use their features across a wide range of .NET Core applications.
LlamaSharp connector
LLamaSharp.Backend.Cpu.Android is the CPU-only backend for LLamaSharp on Android.
VeloraAI is an open-source library powered by LLamaSharp, designed for integrating local AI capabilities into commercial .NET applications. It focuses on efficiency and accessibility, using highly optimized, small-sized models and a strategic downloader that prioritizes speed to ensure fast and seamless access to VeloraAI-supported models.
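The backend packages listed above are alternative runtime providers: a project typically references the core LLamaSharp package plus one backend package matching the target hardware. A minimal install sketch using the .NET CLI, with package names taken from the list above (the CPU backend is shown here as just one choice; substitute a CUDA, Vulkan, Metal, or OpenCL backend as appropriate):

```shell
# Add the core library to the current .NET project.
dotnet add package LLamaSharp

# Add exactly one backend for the machine this project will run on.
# CPU-only example; swap for e.g. LLamaSharp.Backend.Cuda12 on a CUDA 12 GPU box.
dotnet add package LLamaSharp.Backend.Cpu
```

Platform-specific variants such as LLamaSharp.Backend.Cuda12.Windows or LLamaSharp.Backend.Vulkan.Linux can be referenced instead when you want only the binaries for a single OS rather than the full cross-platform backend package.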