A lightweight .NET library for capturing RTSP video frames using GStreamer with direct P/Invoke bindings. Zero NuGet dependencies, supports H.265/HEVC and H.264, hardware acceleration on ARM64 devices (Raspberry Pi, Khadas VIM3, Jetson). Provides precise stream timestamps (PTS) and BGR output ready for OpenCV.
dotnet add package RtspGStreamerLib
This library requires GStreamer to be installed on your system.
After installing GStreamer (MSVC 64-bit build), add its bin directory to your PATH, e.g. C:\Program Files\gstreamer\1.0\msvc_x86_64\bin. Verify the installation:
gst-launch-1.0 --version
# Debian/Ubuntu (x86_64)
sudo apt update
sudo apt install -y libgstreamer1.0-0 libgstreamer1.0-dev \
gstreamer1.0-plugins-base gstreamer1.0-plugins-good \
gstreamer1.0-plugins-bad gstreamer1.0-libav \
gstreamer1.0-tools
# On ARM64 devices (Raspberry Pi, Khadas VIM3, etc.), also install the OMX plugin
sudo apt update
sudo apt install -y libgstreamer1.0-0 libgstreamer1.0-dev \
gstreamer1.0-plugins-base gstreamer1.0-plugins-good \
gstreamer1.0-plugins-bad gstreamer1.0-libav \
gstreamer1.0-tools gstreamer1.0-omx
using RtspGStreamerLib;
// Initialize GStreamer (call once at application startup)
RtspFrameCapture.Initialize();
// Create capture instance
using var capture = new RtspFrameCapture();
// Set up frame callback
capture.OnFrameReceived += (frame) =>
{
Console.WriteLine($"Frame: {frame.Width}x{frame.Height}, Format: {frame.Format}");
Console.WriteLine($"Stream Timestamp: {frame.StreamTimestamp}");
Console.WriteLine($"Data size: {frame.Data.Length} bytes");
// Process frame.Data (BGR pixel array)
// Your processing code here...
};
// Set up error callback
capture.OnError += (error) =>
{
Console.WriteLine($"Error: {error}");
};
// Start capturing
// useHardwareAccel: true for ARM64 devices with HW decoder support
bool useHardwareAccel = false; // Set to true on ARM64
if (capture.Start("rtsp://user:password@192.168.1.100:554/stream", useHardwareAccel))
{
Console.WriteLine("Capture started!");
// Keep running...
Console.ReadKey();
// Stop capture
capture.Stop();
}
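Building on the quick start, a simple supervision loop can be layered on the documented `IsRunning`/`Start` members. This is a sketch, not library functionality; it assumes `capture` is configured as above:

```csharp
using System;
using System.Threading;

// Sketch: restart the capture whenever it is found stopped.
// `capture` is the RtspFrameCapture instance from the quick start.
while (true)
{
    if (!capture.IsRunning)
    {
        Console.WriteLine("Capture not running, attempting restart...");
        capture.Start("rtsp://user:password@192.168.1.100:554/stream", useHardwareAccel: false);
    }
    Thread.Sleep(5000); // poll every 5 seconds
}
```

Polling keeps the restart logic off the library's callback threads, which avoids re-entering `Start`/`Stop` from inside an event handler.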
The main class for capturing RTSP video frames.
| Method | Description |
|---|---|
| `static void Initialize()` | Initializes GStreamer. Call once at application startup. |
| `bool Start(string rtspUrl, bool useHardwareAccel = false)` | Starts capturing frames from the RTSP stream. |
| `void Stop()` | Stops the capture. |
| `void Dispose()` | Releases all resources. |
| Event | Description |
|---|---|
| `OnFrameReceived` | Fired when a new frame is available. Provides a `VideoFrame` object. |
| `OnError` | Fired when an error occurs. Provides an error message string. |
| Property | Type | Description |
|---|---|---|
| `IsRunning` | `bool` | Indicates whether capture is currently active. |
Represents a captured video frame.
| Property | Type | Description |
|---|---|---|
| `Data` | `byte[]` | Raw pixel data in BGR format (3 bytes per pixel). |
| `Width` | `int` | Frame width in pixels. |
| `Height` | `int` | Frame height in pixels. |
| `Format` | `string` | Pixel format (typically `"BGR"`). |
| `StreamTimestamp` | `TimeSpan` | Presentation timestamp from the RTSP stream. |
| `StreamTimestampNanoseconds` | `ulong` | Raw PTS value in nanoseconds. |
| `ReceivedAt` | `DateTime` | Time when the frame was received in C# (includes network latency). |
Frames are delivered in BGR format (3 bytes per pixel), compatible with OpenCV and most computer vision libraries.
Memory layout: [B0, G0, R0, B1, G1, R1, B2, G2, R2, ...]
Total size: Width x Height x 3 bytes
Accessing a pixel at position (x, y):
int index = (y * frame.Width + x) * 3;
byte blue = frame.Data[index];
byte green = frame.Data[index + 1];
byte red = frame.Data[index + 2];
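As a self-contained illustration of this layout, the helper below (hypothetical, not part of the library) computes the average brightness of a BGR buffer using BT.601 luma weights:

```csharp
// Illustrative helper: mean luma of a BGR pixel buffer (BT.601 weights)
static double AverageBrightness(byte[] data, int width, int height)
{
    double sum = 0;
    for (int i = 0; i < width * height * 3; i += 3)
    {
        // BGR order: blue first, then green, then red
        sum += 0.114 * data[i] + 0.587 * data[i + 1] + 0.299 * data[i + 2];
    }
    return sum / (width * height);
}
```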
The library provides two types of timestamps:
StreamTimestamp: the PTS (Presentation Timestamp) from the RTSP stream. This timestamp is assigned by the camera and represents the actual capture time with nanosecond precision.
// Calculate time difference between frames
double deltaSeconds = (currentFrame.StreamTimestamp - previousFrame.StreamTimestamp).TotalSeconds;
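For example, a running frame-rate estimate can be derived from consecutive PTS values (an illustrative sketch, not library functionality):

```csharp
// Sketch: estimate instantaneous FPS from consecutive stream timestamps
TimeSpan? lastPts = null;
capture.OnFrameReceived += (frame) =>
{
    if (lastPts is TimeSpan prev)
    {
        double delta = (frame.StreamTimestamp - prev).TotalSeconds;
        if (delta > 0)
            Console.WriteLine($"Instantaneous FPS: {1.0 / delta:F1}");
    }
    lastPts = frame.StreamTimestamp;
};
```

Because this uses the camera's PTS rather than `ReceivedAt`, the estimate is unaffected by network jitter.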
ReceivedAt: the DateTime when the frame was received in your C# code. This includes network latency, decoding time, and thread scheduling delays. Use it only for logging or debugging.
On ARM64 devices with hardware video decoding support, enable hardware acceleration for significantly lower CPU usage:
// ARM64 with v4l2 hardware decoder
capture.Start(rtspUrl, useHardwareAccel: true);
| Platform | Decoder | Notes |
|---|---|---|
| ARM64 (HW) | v4l2h265dec | Uses hardware decoder, ~10-15% CPU |
| All platforms | avdec_h265 | Software decoder, ~25-30% CPU |
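One way to choose the decoder at runtime is a platform heuristic. The check below is an assumption on the caller's part (the v4l2 decoder is only meaningful on Linux/ARM64); the library does not do this for you:

```csharp
using System.Runtime.InteropServices;

// Heuristic: request hardware decoding only on Linux/ARM64
bool useHardwareAccel =
    RuntimeInformation.IsOSPlatform(OSPlatform.Linux) &&
    RuntimeInformation.ProcessArchitecture == Architecture.Arm64;

capture.Start(rtspUrl, useHardwareAccel);
```

Note that an ARM64 Linux box is not guaranteed to expose a v4l2 decoder; see the troubleshooting section for verifying it with `gst-inspect-1.0`.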
Hardware decoding (v4l2h265dec, ARM64):

| Resolution | FPS | CPU Usage | RAM | Latency |
|---|---|---|---|---|
| 2560x1440 | 30 | ~10-15% | ~60 MB | ~200ms |
| 1920x1080 | 30 | ~8-12% | ~40 MB | ~150ms |
Software decoding (avdec_h265):

| Resolution | FPS | CPU Usage | RAM | Latency |
|---|---|---|---|---|
| 2560x1440 | 30 | ~25-30% | ~100 MB | ~250ms |
| 1920x1080 | 30 | ~15-20% | ~70 MB | ~200ms |
By default, the library is configured for H.265/HEVC streams. For H.264 streams, you'll need to modify the pipeline in RtspFrameCapture.cs:
// For H.264 streams, change the pipeline to:
string pipeline =
$"rtspsrc location=\"{rtspUrl}\" protocols=tcp latency=200 ! " +
"rtph264depay ! h264parse ! " +
$"{(useHardwareAccel ? "v4l2h264dec" : "avdec_h264")} ! " +
"videoconvert ! video/x-raw,format=BGR ! " +
"appsink name=sink emit-signals=false max-buffers=1 drop=true";
To tune buffering, modify the latency parameter in the pipeline: lower values (e.g. latency=0) reduce delay at the risk of jitter, higher values smooth out unstable networks.
To reduce CPU usage, add a frame rate limiter to the pipeline:
"videoconvert ! videorate ! video/x-raw,framerate=10/1,format=BGR ! "
To change the output pixel format, adjust the caps after videoconvert:
// RGBA (4 bytes per pixel)
"videoconvert ! video/x-raw,format=RGBA ! "
// Grayscale (1 byte per pixel)
"videoconvert ! video/x-raw,format=GRAY8 ! "
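If you switch formats, the per-pixel index math from the BGR example changes with the byte count per pixel (illustrative; field names as in `VideoFrame`):

```csharp
// RGBA: 4 bytes per pixel, order R, G, B, A
int rgbaIndex = (y * frame.Width + x) * 4;
byte r = frame.Data[rgbaIndex];
byte alpha = frame.Data[rgbaIndex + 3];

// GRAY8: 1 byte per pixel
int grayIndex = y * frame.Width + x;
byte gray = frame.Data[grayIndex];
```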
Windows:
# Verify GStreamer is in PATH
where gst-launch-1.0
Linux:
# Verify library is installed
ldconfig -p | grep gstreamer
# Should show: libgstreamer-1.0.so.0
Test the stream directly with GStreamer:
gst-launch-1.0 rtspsrc location="rtsp://your-url" ! fakesink
Verify the codec matches (H.264 vs H.265)
Try switching protocols (TCP vs UDP):
// In pipeline: protocols=tcp or protocols=udp
Verify hardware decoder is available:
gst-inspect-1.0 v4l2h265dec
Ensure you're enabling hardware acceleration:
capture.Start(rtspUrl, useHardwareAccel: true);
Always call Dispose() (or wrap the instance in a using statement) to release GStreamer resources.

Project structure:

RtspGStreamerLib/
├── RtspGStreamerLib/ # Main library (NuGet package)
│ ├── GStreamerNative.cs # P/Invoke bindings to GStreamer
│ ├── RtspFrameCapture.cs # Main capture class and VideoFrame
│ └── RtspGStreamerLib.csproj
│
├── RtspGStreamerExample/ # Usage example with library reference
│ ├── Program.cs # Basic example
│ ├── ProgramWithImage.cs # Example with SkiaSharp image saving
│ └── ImageHelper.cs # Image conversion utilities
│
├── RtspGStreamerInterop/ # Standalone example (no library reference)
│ ├── GStreamerNative.cs
│ ├── RtspFrameCapture.cs
│ └── Program.cs
│
└── docs/ # Documentation
├── USAGE_GUIDE.md # Complete usage guide
├── TIMESTAMPS.md # Understanding RTSP timestamps
└── COMPARISON.md # Library comparison
# Clone the repository
git clone https://github.com/clovisjr/RtspGStreamerLib.git
cd RtspGStreamerLib
# Build the solution
dotnet build RtspGStreamerLib.slnx
# Build release and create NuGet package
cd RtspGStreamerLib
dotnet pack -c Release
Contributions are welcome! Please feel free to submit a Pull Request.
This project is licensed under the MIT License - see the LICENSE.txt file for details.