A .NET inference-only implementation of adaptive text classification concepts inspired by the Adaptive Classifier project. Provides production-ready inference with dual prediction mechanisms.

```bash
dotnet add package AdaptiveClassifier.NET
```
This library allows you to run Adaptive Classifier–style models in .NET using ONNX Runtime, combining prototype-based memory with a neural adaptive head for robust few-shot classification.
This project is NOT an original work.
It is a .NET inference implementation designed to consume models trained with the original Adaptive Classifier project.
All research, algorithms, training logic, and model design belong to the original authors.
If you are looking to train models, you must use the original Python project.
Adaptive Classifier models are powerful but Python-centric. This library enables running them natively in .NET, with no Python runtime required. The prediction pipeline looks like this:
```
Input Text
     ↓
ONNX Embedding Model
     ↓
┌─────────────────┬─────────────────┐
│    Prototype    │  Adaptive Head  │
│     Memory      │  (Neural Net)   │
│  (Similarity)   │                 │
└─────────────────┴─────────────────┘
        ↓                 ↓
 Similarity Scores   Neural Scores
        └────────┬────────┘
                 ↓
       Weighted Combination
                 ↓
         Final Prediction
```
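As a rough illustration of the final step, the two score vectors can be blended per label with the configured weights. This is a minimal Python sketch; the function name and the 0.7/0.3 split are illustrative (they mirror the configuration defaults shown later), not the library's exact internals:

```python
def combine_scores(proto_scores, head_scores, proto_weight=0.7, head_weight=0.3):
    """Blend prototype-similarity scores with adaptive-head scores per label."""
    labels = set(proto_scores) | set(head_scores)
    return {
        label: proto_weight * proto_scores.get(label, 0.0)
               + head_weight * head_scores.get(label, 0.0)
        for label in labels
    }

# Example: prototype memory and neural head agree on direction but not magnitude
proto = {"positive": 0.9, "negative": 0.1}
head = {"positive": 0.6, "negative": 0.4}
combined = combine_scores(proto, head)
prediction = max(combined, key=combined.get)  # "positive"
```

The weighting lets prototype similarity dominate in few-shot regimes while the neural head contributes learned structure.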
This library requires models exported from the original Adaptive Classifier project.
You MUST export all components listed below.
Partial exports will not work.
| File | Purpose |
|---|---|
| `model.onnx` | Embedding model |
| `tokenizer.json` | Hugging Face tokenizer |
| `examples.json` | Prototype memory |
| `weights.json` | Adaptive head weights |
Models must be trained and exported using the original project.
Below is a real export example based directly on the Adaptive Classifier training workflow.
```python
import json
import torch
from adaptive_classifier import AdaptiveClassifier

# Create the classifier
classifier = AdaptiveClassifier(
    "distilbert/distilbert-base-multilingual-cased",
    device="cuda"
)

# Train on labeled examples
texts = ["I love this product", "This is terrible"]
labels = ["positive", "negative"]
classifier.add_examples(texts, labels)

# Export, including the quantized ONNX embedding model
classifier.save(
    "./adaptive_classifier_model",
    include_onnx=True,
    quantize_onnx=True
)

def export_adaptive_head(classifier, file_path):
    """Serialize the adaptive head's linear layers and label map to JSON."""
    head_data = {
        "id_to_label": classifier.id_to_label,
        "layers": []
    }
    for layer in classifier.adaptive_head.modules():
        if isinstance(layer, torch.nn.Linear):
            head_data["layers"].append({
                "weights": layer.weight.detach().cpu().numpy().tolist(),
                "bias": layer.bias.detach().cpu().numpy().tolist()
            })
    with open(file_path, "w") as f:
        json.dump(head_data, f)

export_adaptive_head(
    classifier,
    "./adaptive_classifier_model/output/weights.json"
)
```
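To sanity-check an exported `weights.json`, the head's linear layers can be replayed outside PyTorch. This is a minimal pure-Python sketch assuming the layer format written above; note that any activation functions between layers are not stored in the file, so only the linear parts are replayed:

```python
def head_forward(layers, x):
    """Apply each exported linear layer (y = Wx + b) in order.

    Activations between layers are not captured in weights.json,
    so this replays only the linear transforms.
    """
    for layer in layers:
        W, b = layer["weights"], layer["bias"]
        x = [sum(w_ij * x_j for w_ij, x_j in zip(row, x)) + b_i
             for row, b_i in zip(W, b)]
    return x

# Tiny hand-made example in the same shape as the exported file
head_data = {
    "id_to_label": {"0": "negative", "1": "positive"},
    "layers": [{"weights": [[1.0, 0.0], [0.0, 1.0]], "bias": [0.1, -0.1]}],
}
scores = head_forward(head_data["layers"], [0.2, 0.8])
best = max(range(len(scores)), key=scores.__getitem__)
print(head_data["id_to_label"][str(best)])  # prints "positive"
```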
```
Models/Classifier/
├── model.onnx
├── tokenizer.json
├── examples.json
└── weights.json
```
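Because partial exports will not work, it can be worth verifying the folder before constructing the classifier. A hypothetical pre-flight helper (not part of the library's API):

```python
from pathlib import Path

REQUIRED_FILES = ["model.onnx", "tokenizer.json", "examples.json", "weights.json"]

def validate_export(folder: str) -> list[str]:
    """Return the required export files missing from the folder."""
    root = Path(folder)
    return [name for name in REQUIRED_FILES if not (root / name).is_file()]

missing = validate_export("Models/Classifier")
print("Missing files:", missing)
```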
```csharp
using AdaptiveClassifier.NET;
using AdaptiveClassifier.NET.Configuration;

var config = new ClassifierConfiguration
{
    HuggingFaceModelId = "your-org/your-model",
    ModelFolder = "Models/MyClassifier",
    UseCUDA = true,
    ProtoWeight = 0.7,
    AdaptiveHeadWeight = 0.3
};

using var classifier = new Classifier(config);
await classifier.InitializeAsync(CancellationToken.None);

var predictions = classifier.Predict("Your text here", topK: 5);
```
`InitializeAsync()` must be called once before making predictions. GPU execution (`UseCUDA = true`) additionally requires the `Microsoft.ML.OnnxRuntime.Gpu` package.

This project exists solely to enable inference in .NET. For training, fine-tuning, or any model development, refer to the original Adaptive Classifier project.
This project is released under the MIT License.
It is an independent .NET inference implementation inspired by the research and reference implementation from the Adaptive Classifier project.
The original Adaptive Classifier project is licensed under the Apache License 2.0.
This repository contains only an independent inference implementation; all rights to the original research, algorithms, and training implementation remain with the original authors.
```bibtex
@software{adaptive-classifier,
  title = {Adaptive Classifier: Dynamic Text Classification with Continuous Learning},
  author = {Asankhaya Sharma},
  year = {2025},
  publisher = {GitHub},
  url = {https://github.com/codelion/adaptive-classifier}
}
```