Found 127 packages
Create and simulate ABB, KUKA, UR, and Staubli robot programs.
Robots.txt middleware with fluent interface.
Provides an admin interface in Optimizely CMS 12 for managing robots.txt files. A controller responds to the /robots.txt path and returns the content specific to the requesting site.
Robots.txt parsing and querying utility
An Umbraco plugin to help you easily set up a robots.txt for your website
Dynamic sitemap and robots.txt generator with gadget and admin plugins for EPiServer
Assemblies for the Robots.txt package
Package Description
robots.txt middleware for .NET Core
A robots.txt parser for .NET. Supports Allow directives, Crawl-delay directives, Sitemap declarations, and the * and $ wildcards. See https://bitbucket.org/cagdas/robotstxt for usage examples.
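For reference, a small hypothetical robots.txt (the domain and paths are placeholders) that exercises all four of the features listed above:

```text
User-agent: *
Disallow: /private/        # block a directory
Allow: /private/open/      # carve out an exception
Disallow: /*.pdf$          # * and $ wildcards: block every PDF
Crawl-delay: 10            # seconds between requests (non-standard, widely honored)

Sitemap: https://example.com/sitemap.xml
```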
This client library enables working with robots.txt. Key features: parses robots.txt into a typed object, looks up Allow/Disallow/Crawl-delay rules by User-Agent, and traverses the sitemaps declared in robots.txt for URLs. For more info see: https://github.com/nicholasbergesen/robotsSharp/master/README.md
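As a generic illustration of the allow/disallow and crawl-delay lookups these .NET parsers expose, here is a minimal sketch using Python's stdlib `urllib.robotparser` (which, unlike the packages above, does not implement the * and $ wildcard extensions); the user agent and rules are invented for the example:

```python
import urllib.robotparser

# Hypothetical rules; the more specific Allow is placed first because the
# stdlib parser returns the first matching rule, not the longest match.
ROBOTS = """\
User-agent: *
Allow: /private/open/
Disallow: /private/
Crawl-delay: 10

Sitemap: https://example.com/sitemap.xml
"""

rp = urllib.robotparser.RobotFileParser()
rp.modified()  # mark the data as fetched; can_fetch()/crawl_delay()
               # treat an unfetched robots.txt as unanswerable
rp.parse(ROBOTS.splitlines())

print(rp.can_fetch("MyBot", "/private/secret.html"))  # False
print(rp.can_fetch("MyBot", "/private/open/a.html"))  # True
print(rp.crawl_delay("MyBot"))                        # 10
print(rp.site_maps())  # sitemap URLs as a list (Python 3.8+)
```

The same three lookups (is this URL fetchable for this agent, what delay applies, which sitemaps are declared) are what the typed-object APIs described above provide in .NET.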
Serilog enricher for assembly information
Allows serving a robots instruction file.
Dynamic route for /robots.txt
SEO plugin for ASP.NET Core applications.
An Umbraco package to edit your robots.txt file
Implements communication with Universal Robots industrial robots (RTDE, Primary interface, Dashboard Server, REST API, SSH, SFTP, XML-RPC, sockets, Interpreter Mode). A fully managed commercial .NET DLL with no dependencies. Includes offline tools such as forward and inverse kinematics and pose conversion.
Create and simulate ABB, KUKA, UR, and Staubli robot programs. This package is for development of Rhino and Grasshopper plug-ins.
A friendly tool for creating dynamic robots.txt files in Umbraco
The Robots Handler package lets editors dynamically change the contents of a site's robots file. Instead of storing the contents of a robots file on the file system, an editor can specify its contents in an Umbraco content page. The property content is then served via an HTTP handler for the current site. This package works for multi-site Umbraco installations, meaning it will serve the correct contents for a requested domain's robots file.
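The per-domain mechanism described above can be sketched generically, outside Umbraco, as an HTTP handler that chooses robots content by the request's Host header; the host names, contents, and the `robots_for_host` helper are all hypothetical, not this package's API:

```python
from http.server import BaseHTTPRequestHandler, HTTPServer

# Hypothetical per-site robots content keyed by domain; in the package
# this content lives in Umbraco content pages, not a dict.
ROBOTS_BY_HOST = {
    "site-a.example": "User-agent: *\nDisallow: /\n",
    "site-b.example": "User-agent: *\nAllow: /\n",
}

def robots_for_host(host):
    # Strip any port and fall back to a permissive default.
    return ROBOTS_BY_HOST.get(host.split(":")[0], "User-agent: *\nAllow: /\n")

class RobotsHandler(BaseHTTPRequestHandler):
    def do_GET(self):
        if self.path != "/robots.txt":
            self.send_error(404)
            return
        body = robots_for_host(self.headers.get("Host", "")).encode()
        self.send_response(200)
        self.send_header("Content-Type", "text/plain")
        self.send_header("Content-Length", str(len(body)))
        self.end_headers()
        self.wfile.write(body)

# To serve on port 8000:
# HTTPServer(("", 8000), RobotsHandler).serve_forever()
```

Two requests for /robots.txt that differ only in their Host header thus receive different files, which is the essence of multi-site support.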