Text generation with local models
```shell
dotnet add package SmartAICompendium.TextGeneration
```

This package allows easy, strongly typed text generation with a local model and an embedded Python. Compatible with Windows, Linux, and macOS.
| Name | Description |
|---|---|
| prompt | Prompt to guide the text generation. |
| maxNewTokens | Override the default value of maxNewTokens. The maximum number of new tokens to generate. |
| badWords | Override the default value of badWords. A list of sequences to stop generation when encountered. |
| topK | Override the default value of topK. |
| topP | Override the default value of topP. |
| temperature | Override the default value of temperature. |
| repetitionPenalty | Override the default value of repetitionPenalty. |
| lastNTokens | Override the default value of lastNTokens. The number of last tokens to use for repetition penalty. |
| seed | Override the default value of seed. With a specific seed, the same prompt will return the same result. The default of -1 returns a randomized result. |
| reset | Override the default value of reset. Whether to reset the model state before generating text. |
| batchSize | Override the default value of batchSize. The batch size to use for evaluating tokens. |
\*Valid parameters depend on the model.
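As the `seed` parameter above suggests, fixing the seed makes generation reproducible. A minimal sketch of this behavior, assuming the `Generate` signature shown below (the prompt and parameter values are illustrative only):

```csharp
var model = new TextInference("aihub-app/zyte-1B");

// With a fixed seed, the same prompt yields the same completion on every call.
var first = model.Generate("Instruct: Name three colors\nOutput:", seed: 42, temperature: 0.8f);
var second = model.Generate("Instruct: Name three colors\nOutput:", seed: 42, temperature: 0.8f);
// first and second are identical.

// seed: -1 (the default) randomizes, so repeated calls can differ.
var random = model.Generate("Instruct: Name three colors\nOutput:", seed: -1, temperature: 0.8f);
```

This is useful for debugging prompts or writing regression tests, where a deterministic output is easier to compare.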
The model can be a Hugging Face repoId (`{user}/{model}`) or a local folder containing the model. The model is downloaded if not already present (see SmartAICompendium.Common).
```csharp
var model = new TextInference("aihub-app/zyte-1B");
var text = model.Generate("Instruct: Give the list of french speaking countries\nOutput:",
    maxNewTokens: 256, badWords: null, topK: 40, topP: 0.95f, temperature: 0.8f,
    repetitionPenalty: 1.1f, lastNTokens: 64, seed: -1, reset: true, batchSize: 8);
Console.WriteLine(text);
```
```csharp
var model = new TextInference("C:\\TinyLlama-1.1B-Chat-v1.0");
var message = model.GenerateChat(new ChatMessage[]{
    new(){
        Role = "system",
        Content = "You are an adventurer who always responds in the style of a pirate."
    },
    new(){
        Role = "user",
        Content = "Tell me more about your last adventure."
    }
},
maxNewTokens: 256, temperature: 0.7f, topK: 50, topP: 0.95f);
Console.WriteLine(message.Content);
```
If not already present, the required packages are installed at class initialization. To update them manually:

```csharp
TextInference.Update();
```