Enable SemanticKernelPooling to pool OpenAI and Azure OpenAI kernels
Published: May 10, 2025
$ dotnet add package SemanticKernelPooling.Connectors.OpenAI
SemanticKernelPooling is a .NET library designed to facilitate seamless integration with multiple AI service providers, such as OpenAI, Azure OpenAI, HuggingFace, Google, Mistral AI, and others. It uses a kernel pooling approach to manage resources efficiently and provide robust AI capabilities in your .NET applications.
It integrates with Microsoft.Extensions.Logging for detailed logging and diagnostics.
Dependencies
Microsoft.Extensions.DependencyInjection
Microsoft.Extensions.Logging
Microsoft.SemanticKernel
Polly for advanced retry logic
Installation
To install SemanticKernelPooling, use the NuGet package manager:
dotnet add package SemanticKernelPooling
Configure Services
Start by configuring the services in your Program.cs or Startup.cs file:
using Microsoft.Extensions.DependencyInjection;
using Microsoft.Extensions.Logging;
using SemanticKernelPooling;
using SemanticKernelPooling.Connectors.OpenAI;
var services = new ServiceCollection();
services.AddLogging(configure => configure.AddConsole());
services.UseSemanticKernelPooling(); // Core service pooling registration
services.UseOpenAIKernelPool(); // Register OpenAI kernel pool
services.UseAzureOpenAIKernelPool(); // Register Azure OpenAI kernel pool
var serviceProvider = services.BuildServiceProvider();
Configure Providers
You need to set up configuration settings for each AI service provider you intend to use. These settings can be defined in an appsettings.json file or in any configuration source supported by .NET:
{
"AIServiceProviderConfigurations": [
{
"UniqueName": "OpenAI",
"ServiceType": "OpenAI",
"ApiKey": "YOUR_OPENAI_API_KEY",
"ModelId": "YOUR_MODEL_ID"
},
{
"UniqueName": "AzureOpenAI",
"ServiceType": "AzureOpenAI",
"DeploymentName": "YOUR_DEPLOYMENT_NAME",
"ApiKey": "YOUR_AZURE_API_KEY",
"Endpoint": "YOUR_ENDPOINT",
"ModelId": "YOUR_MODEL_ID",
"ServiceId": "YOUR_SERVICE_ID"
}
// Add more providers as needed
]
}
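As an illustrative sketch, the JSON above can be loaded with the standard Microsoft.Extensions.Configuration APIs and exposed to the DI container; note that the exact binding mechanism the library expects (e.g. whether it reads the "AIServiceProviderConfigurations" section automatically) is an assumption here, not confirmed by this README:

```csharp
using Microsoft.Extensions.Configuration;
using Microsoft.Extensions.DependencyInjection;

// Build configuration from appsettings.json (standard .NET configuration APIs).
var configuration = new ConfigurationBuilder()
    .SetBasePath(AppContext.BaseDirectory)
    .AddJsonFile("appsettings.json", optional: false, reloadOnChange: true)
    .Build();

// Make IConfiguration available via DI so the pooling services
// can resolve the provider configurations.
var services = new ServiceCollection();
services.AddSingleton<IConfiguration>(configuration);
```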
Retrieve a Kernel and Execute Commands
Once the service providers are configured and registered, you can retrieve a kernel from the pool and execute commands:
var kernelPoolManager = serviceProvider.GetRequiredService<IKernelPoolManager>();
// Example: Getting a kernel for OpenAI
using var kernelWrapper = await kernelPoolManager.GetKernelAsync(AIServiceProviderType.OpenAI);
// Use the kernel to perform AI operations
var response = await kernelWrapper.Kernel.InvokePromptAsync("What is Semantic Kernel?");
Console.WriteLine(response);
// Disposing the wrapper (via `using`) returns the kernel to the pool.
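To illustrate the benefit of pooling, here is a hedged sketch of concurrent usage, assuming GetKernelAsync is safe to call from multiple tasks (which a pool implies) and using the same IKernelPoolManager shown above:

```csharp
using System.Linq;
using System.Threading.Tasks;

// Each task borrows a kernel from the pool, runs one prompt, and
// returns the kernel when its wrapper is disposed.
var prompts = new[] { "Summarize kernel pooling.", "What is Semantic Kernel?" };

var tasks = prompts.Select(async prompt =>
{
    using var wrapper = await kernelPoolManager.GetKernelAsync(AIServiceProviderType.OpenAI);
    return await wrapper.Kernel.InvokePromptAsync(prompt);
});

var responses = await Task.WhenAll(tasks);
foreach (var r in responses)
    Console.WriteLine(r);
```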
Using Retry Policies
To handle API rate limits and transient errors, use Polly to define retry policies:
AsyncPolicy httpTimeoutAndRetryPolicy = Policy
.Handle<Exception>(ex => ex.IsTransientError())
.WaitAndRetryAsync(
retryCount: 6,
sleepDurationProvider: retryAttempt => TimeSpan.FromSeconds(Math.Pow(2, retryAttempt)) + TimeSpan.FromMilliseconds(new Random().Next(0, 3000)),
onRetry: (exception, timespan, retryCount, context) =>
{
logger.LogError($"Retry {retryCount} after {timespan.TotalSeconds} seconds due to: {exception.Message}");
});
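The policy can then wrap pooled kernel calls so that transient failures such as rate limits are retried with exponential backoff and jitter. This sketch reuses the kernel-wrapper pattern shown earlier and Polly's ExecuteAsync:

```csharp
// Run the whole acquire-and-invoke operation under the retry policy:
// each retry borrows a (possibly different) kernel from the pool.
var answer = await httpTimeoutAndRetryPolicy.ExecuteAsync(async () =>
{
    using var wrapper = await kernelPoolManager.GetKernelAsync(AIServiceProviderType.OpenAI);
    return await wrapper.Kernel.InvokePromptAsync("What is Semantic Kernel?");
});
Console.WriteLine(answer);
```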
Adding New AI Providers
To add support for a new AI provider, follow these steps:
Create a Configuration Class: Define a new configuration class inheriting from AIServiceProviderConfiguration.
Implement a Kernel Pool Class: Create a new kernel pool class inheriting from AIServicePool<T>.
Register the New Provider: Add the registration method in the ServiceExtension class to register your new provider with the DI container.
For example, to add a new "CustomAI" provider:
public record CustomAIConfiguration : AIServiceProviderConfiguration
{
public required string ModelId { get; init; }
public required string ApiKey { get; init; }
// Additional settings...
}
class CustomAIKernelPool(
CustomAIConfiguration config,
ILoggerFactory loggerFactory)
: AIServicePool<CustomAIConfiguration>(config)
{
protected override void RegisterChatCompletionService(IKernelBuilder kernelBuilder, CustomAIConfiguration config, HttpClient? httpClient)
{
// Register service logic...
}
protected override ILogger Logger { get; } = loggerFactory.CreateLogger<CustomAIKernelPool>();
}
public static class ServiceExtension
{
public static void UseCustomAIKernelPool(this IServiceProvider serviceProvider)
{
var registrar = serviceProvider.GetRequiredService<IKernelPoolFactoryRegistrar>();
registrar.RegisterKernelPoolFactory(
AIServiceProviderType.CustomAI,
(aiServiceProviderConfiguration, loggerFactory) =>
new CustomAIKernelPool((CustomAIConfiguration)aiServiceProviderConfiguration, loggerFactory));
}
}
Built-in Providers
OpenAIConfiguration and OpenAIKernelPool to interact with OpenAI services.
AzureOpenAIConfiguration and AzureOpenAIKernelPool for Azure OpenAI.
HuggingFaceConfiguration and HuggingFaceKernelPool to integrate with HuggingFace models.
GoogleConfiguration and GoogleKernelPool for Google AI services.
MistralAIConfiguration and MistralAIKernelPool to leverage Mistral AI services.
Contributing
Contributions are welcome! Please fork the repository, make your changes, and submit a pull request. Ensure your code adheres to the project's coding standards and includes appropriate tests.
License
This project is licensed under the MIT License. See the LICENSE file for more details.