AWS IoT-Data enables secure, bi-directional communication between Internet-connected things (such as sensors, actuators, embedded devices, or smart appliances) and the AWS cloud. It implements a broker for applications and things to publish messages over HTTP (Publish) and retrieve, update, and delete thing shadows. A thing shadow is a persistent representation of your things and their state in the AWS cloud.
A C# implementation of Concise Binary Object Representation (CBOR), a general-purpose binary data format defined in RFC 8949.
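A minimal sketch of round-tripping a value through CBOR, assuming the PeterO.Cbor package (a widely used C# CBOR implementation whose API is shown here; names may differ in other implementations):

```csharp
using System;
using PeterO.Cbor; // NuGet: PeterO.Cbor (assumed package)

class CborDemo
{
    static void Main()
    {
        // Build a CBOR map and encode it to its compact binary form.
        var map = CBORObject.NewMap()
            .Add("name", "sensor-1")
            .Add("reading", 23.5);
        byte[] encoded = map.EncodeToBytes();

        // Decode the bytes back into a CBOR object tree.
        var decoded = CBORObject.DecodeFromBytes(encoded);
        Console.WriteLine(decoded["name"].AsString());
    }
}
```

Because CBOR is self-describing (like JSON, but binary), the decoder needs no schema to reconstruct the map.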
This library enables end users to create a lightweight representation of the google.visualization.DataTable object directly in Microsoft .NET, and to generate the JSON needed by the Google Chart Tools JavaScript library.
MARS LIFE is a modelling framework for agent-based simulations. It provides the following features:
* Agent definitions
* Layer definitions
* Integration of GIS spatial data such as raster files (*.asc, *.geotiff) and vector formats (*.shp, *.geojson, *.kml, *.gml)
* Representations for temporal data with optional spatial reference (spatiotemporal)
* Spatial data structures and agent environments for movement and exploration
* Methods and algorithms for everyday numerical computations
* Result output pipeline and simulation result persistence
For more details on how to use MARS, please see the documentation: https://www.mars-group.org/docs/tutorial/intro
Fast, lightweight data access library for .NET Core (micro-ORM): simple API for CRUD operations, dynamic queries, SQL command generation (command builder), abstract queries with a simple string representation + parser, schema-less data access, flexible mapping of query results to annotated POCO models, app-level data views, and a RecordSet structure (replacement for DataTable). Try out NReco.Data if you're looking for a Dapper alternative with abstract queries and automated SQL generation.
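A sketch of the abstract-query idea, assuming NReco.Data's documented `Query`/`QField`/`QConst` types and `DbCommandBuilder` (names taken from the package docs; details may vary by version):

```csharp
using System;
using NReco.Data; // NuGet: NReco.Data (assumed API)

class QueryDemo
{
    static void Main()
    {
        // An abstract query: table name plus a condition built from
        // typed field/constant nodes rather than raw SQL strings.
        var q = new Query("Employees",
            (QField)"BirthYear" > (QConst)1960);

        // The command builder renders the abstract query as SQL for
        // the configured ADO.NET provider.
        var dbFactory = new DbFactory(
            System.Data.SqlClient.SqlClientFactory.Instance);
        var cmdBuilder = new DbCommandBuilder(dbFactory);
        var selectCmd = cmdBuilder.GetSelectCommand(q);
        Console.WriteLine(selectCmd.CommandText);
    }
}
```

The same abstract query can be rendered for different databases by swapping the provider factory, which is what distinguishes this approach from hand-written SQL in Dapper.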
This is the support library for working with the POCO representation of HL7's FHIR model data.
The overall aim of this project is to create a term rewriting system that could be useful in everyday programming, and to represent data in a way that roughly corresponds to the definition of a term in formal logic. Terms should be familiar to any programmer because they are basically constants, variables, and function symbols.
Protocol Buffers is a binary serialization format and technology, released to the open source community by Google in 2008. Its primary use is to produce small, fast binary representations of a 'message' or object for serialization or transportation. There are various implementations of Protocol Buffers in .NET; this project is a fairly close port of the Google Java implementation. There are two main parts:
tools/protoc.exe, which takes the textual representation of a protocol buffer and turns it into a binary representation for use with ProtoGen.exe.
tools/ProtoGen.exe, which takes binary representations of protocol buffer descriptors (as generated by the "stock" protoc binary supplied by Google) and creates C# source code. This is only required at build time.
lib/*/Google.ProtocolBuffers.dll, a supporting library. This is required at execution time.
lib/*/Google.ProtocolBuffers.Serialization.dll, a supplementary library that provides extensions for reading and writing protocol buffers to XML, JSON, and other formats.
LINKS:
Project Home - http://code.google.com/p/protobuf-csharp-port
Online Help - http://help.protobuffers.net
Developer Guide - http://code.google.com/apis/protocolbuffers/docs/overview.html
Language Guide - http://code.google.com/apis/protocolbuffers/docs/proto.html
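For illustration, a minimal proto2-era message definition of the kind the toolchain above consumes (the message and field names are invented for this example):

```proto
// person.proto — protoc.exe compiles this textual form into a binary
// descriptor, which ProtoGen.exe then turns into C# source code.
message Person {
  required string name  = 1;  // field numbers identify fields on the wire
  optional int32  id    = 2;
  repeated string email = 3;  // zero or more values
}
```

Because the wire format carries only field numbers and values, not names, the encoded messages stay small while remaining forward- and backward-compatible as the schema evolves.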
This is the support library for working with the POCO representation of HL7's FHIR model data, made specifically for DSTU2.
TauCode graph representation library
TauCode library for text data representation
Reinforced.Lattice is the ultimate remote data representation framework for ASP.NET, with extended templating capabilities.
This package is part of the Semiodesk Trinity API. It allows for the creation of C# representations of ontologies as well as the mapping of RDF classes to C# classes.
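A sketch of the RDF-to-C# mapping idea, assuming Trinity's documented attribute-based style (`RdfClass`/`RdfProperty` attributes and a `Resource` base class; exact names may differ between versions, and the FOAF mapping here is illustrative):

```csharp
using System;
using Semiodesk.Trinity; // NuGet: Semiodesk.Trinity (assumed API)

// Instances of this C# class are stored as RDF resources of
// type foaf:Person.
[RdfClass("foaf:Person")]
public class Person : Resource
{
    public Person(Uri uri) : base(uri) { }

    // Each C# property maps onto an RDF property of the resource.
    [RdfProperty("foaf:name")]
    public string Name { get; set; }
}
```

With such mappings in place, ordinary C# object graphs can be persisted to and queried from an RDF triple store without hand-writing triples.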
Data Standardizer provides implementations of various internationally recognised standards in data processing, covering topics ranging from languages to currencies and geographical entities. With strongly-typed enumerations for each standard (where applicable) or other targeted data types, you can represent these elements in your code such that errors with invalid values are minimised. Supports use of ISO 4217, "Codes for the representation of currencies and funds".
Data Standardizer provides implementations of various internationally recognised standards in data processing, covering topics ranging from languages to currencies and geographical entities. With strongly-typed enumerations for each standard (where applicable) or other targeted data types, you can represent these elements in your code such that errors with invalid values are minimised. Supports use of ISO 3166, "Codes for the representation of names of countries and their subdivisions" parts 1 & 2.
Data Standardizer provides implementations of various internationally recognised standards in data processing, covering topics ranging from languages to currencies and geographical entities. With strongly-typed enumerations for each standard (where applicable) or other targeted data types, you can represent these elements in your code such that errors with invalid values are minimised. Supports use of ISO 639, "Codes for the representation of names of languages" parts 1, 2, 3 & 5.
Data Standardizer provides implementations of various internationally recognised standards in data processing, covering topics ranging from languages to currencies and geographical entities. With strongly-typed enumerations for each standard (where applicable) or other targeted data types, you can represent these elements in your code such that errors with invalid values are minimised. Supports use of ISO 15924, "Codes for the representation of names of scripts".
Data Standardizer provides implementations of various internationally recognised standards in data processing, covering topics ranging from languages to currencies and geographical entities. With strongly-typed enumerations for each standard (where applicable) or other targeted data types, you can represent these elements in your code such that errors with invalid values are minimised. Includes:
- Support for ISO 4217, "Codes for the representation of currencies and funds": Table A.1 (current currency & funds code list) and Table A.3 (codes for historic denominations of currencies & funds).
- A Money data type as described in "Patterns of Enterprise Application Architecture" by Martin Fowler, combining an amount with an ISO 4217 currency code to manage a monetary value.
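A minimal sketch of Fowler's Money pattern as described above: an amount paired with an ISO 4217 currency code. The type and member names here are illustrative, not the package's actual API:

```csharp
using System;

// Money pattern: a decimal amount tagged with an ISO 4217 code so that
// values in different currencies can never be combined by accident.
public readonly struct Money
{
    public decimal Amount { get; }
    public string CurrencyCode { get; }  // e.g. "EUR", "USD" (ISO 4217)

    public Money(decimal amount, string currencyCode)
    {
        Amount = amount;
        CurrencyCode = currencyCode;
    }

    public Money Add(Money other)
    {
        // Refusing to mix currencies is the core safety the pattern buys.
        if (other.CurrencyCode != CurrencyCode)
            throw new InvalidOperationException("Currency mismatch");
        return new Money(Amount + other.Amount, CurrencyCode);
    }

    public override string ToString() => $"{Amount} {CurrencyCode}";
}

class MoneyDemo
{
    static void Main()
    {
        var a = new Money(19.99m, "EUR");
        var b = new Money(5.01m, "EUR");
        Console.WriteLine(a.Add(b));  // 25.00 EUR
    }
}
```

Backing the currency code with a strongly-typed ISO 4217 enumeration, as the package does, catches invalid codes at compile time rather than at runtime.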
Data Standardizer provides implementations of various internationally recognised standards in data processing, covering topics ranging from languages to currencies and geographical entities. With strongly-typed enumerations for each standard (where applicable) or other targeted data types, you can represent these elements in your code such that errors with invalid values are minimised. Includes:
- Support for ISO 639, "Codes for the representation of names of languages", parts 1, 2, 3 & 5.
- Support for ISO 15924, "Codes for the representation of names of scripts".
Data Standardizer provides implementations of various internationally recognised standards in data processing, covering topics ranging from languages to currencies and geographical entities. With strongly-typed enumerations for each standard (where applicable) or other targeted data types, you can represent these elements in your code such that errors with invalid values are minimised. Includes:
- Support for ISO 3166, "Codes for the representation of names of countries and their subdivisions", parts 1 & 2.
- Support for UN M49, the "Standard Country or Area Codes for Statistical Use (Series M, No. 49)".