34 packages tagged with “Robots”
robots.txt middleware for .NET Core
Provides an Admin interface in Optimizely CMS 12 for managing robots.txt files. A controller responds to the /robots.txt path and returns the appropriate content specific to the site.
Robots.txt parsing and querying utility
SEO plugin for ASP.NET Core applications.
Sitemap and robots dynamic generator with gadget and admin plugins for EPiServer
SimpleSitemap is a lightweight library that helps you create web sitemaps for collections or lists of items, following the Sitemap Protocol. Both sitemapindex and urlset links are generated, based on the size of the data collection and the configured 'page size'. Examples include a list of your users, products, or questions.
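The sitemapindex-versus-urlset split described above can be sketched as follows. This is a minimal illustration of the Sitemap Protocol itself, not SimpleSitemap's API; the base URL and page size are invented for the example.

```python
# Minimal sketch of the Sitemap Protocol's two document types:
# a <urlset> per page of items, plus a <sitemapindex> pointing at
# each page once the collection exceeds one page. Illustrative only.
from xml.sax.saxutils import escape

SITEMAP_NS = "http://www.sitemaps.org/schemas/sitemap/0.9"

def build_urlset(urls):
    entries = "".join(f"<url><loc>{escape(u)}</loc></url>" for u in urls)
    return f'<urlset xmlns="{SITEMAP_NS}">{entries}</urlset>'

def build_sitemaps(urls, page_size, base="https://example.com/sitemap"):
    # Split the collection into pages of `page_size` URLs each.
    pages = [urls[i:i + page_size] for i in range(0, len(urls), page_size)]
    urlsets = [build_urlset(p) for p in pages]
    if len(urlsets) == 1:
        return urlsets[0], []  # small collection: a single urlset suffices
    # Large collection: a sitemapindex referencing one urlset file per page.
    locs = "".join(
        f"<sitemap><loc>{escape(f'{base}-{n}.xml')}</loc></sitemap>"
        for n in range(1, len(urlsets) + 1)
    )
    index = f'<sitemapindex xmlns="{SITEAP_NS if False else SITEMAP_NS}">{locs}</sitemapindex>'
    return index, urlsets
```

With five items and a page size of two, this yields a sitemapindex referencing three urlset files; with a single page of items, it yields one urlset and no index.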
Create and simulate ABB, KUKA, UR, and Staubli robot programs.
Configurable robots.txt handler supporting web.config based configuration. Icon supplied by Freepik (http://www.flaticon.com/authors/freepik)
Dynamic route for /robots.txt
This client library enables working with robots.txt. Key features: parse robots.txt into a typed object; look up Allowed/Disallowed/Crawl-delay rules based on User-Agent; traverse the sitemaps referenced in robots.txt for URLs. For more info see: https://github.com/nicholasbergesen/robotsSharp/master/README.md
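The Allowed/Disallowed/Crawl-delay lookups such a parser performs can be illustrated with Python's standard `urllib.robotparser`, used here as a conceptual stand-in for this kind of library (the robots.txt content is made up):

```python
# Illustrates robots.txt parsing and per-User-Agent lookups with the
# Python standard library; a conceptual stand-in, not this package's API.
import urllib.robotparser

robots_txt = """\
User-agent: *
Disallow: /private/
Crawl-delay: 10

User-agent: Googlebot
Disallow:
"""

rp = urllib.robotparser.RobotFileParser()
rp.parse(robots_txt.splitlines())

print(rp.can_fetch("*", "https://example.com/private/page"))         # False
print(rp.can_fetch("Googlebot", "https://example.com/private/page"))  # True
print(rp.crawl_delay("*"))                                            # 10
```

The empty `Disallow:` under Googlebot grants that agent access to everything, while all other agents are kept out of /private/ and asked to wait ten seconds between requests.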
Allows serving a robots instruction file.
A friendly tool for creating dynamic robots.txt files in Umbraco
A simple reflection-based middleware designed to support search engine optimization (SEO) by dynamically generating a sitemap.xml and robots.txt. The package is simple for simple use cases but supports custom disallowed routes, user agents, and custom node endpoints via optional parameters on the middleware extension methods. The sitemap middleware parses your existing controller structure and automatically includes all GET endpoints in the dynamic sitemap.xml while ignoring POST, PUT, and DELETE endpoints; add the [NoSiteMap] attribute to any class or method you want excluded, and the [Priority] attribute to set a custom priority per route. You can also provide detailed routing information for a dynamic sitemap.xml of items, e.g. a link for each product in a products database. The robots middleware lets you add any number of RobotRules defining your User-Agent and Disallowed routes.
The Robots Handler package lets editors dynamically change the contents of a site's robots file. Instead of storing the robots file on the file system, an editor specifies its contents in an Umbraco content page; the property content is then served via an HTTP handler for the current site. The package works for multi-site Umbraco installations, serving the correct contents for each requested domain's robots file.
Sitecore multisite HTTP module; allows multisite handling of 404 and 500 errors, along with custom robots.txt generation.
Stateful programmatic web browsing, based on Python-Mechanize, which is based on Andy Lester’s Perl module WWW::Mechanize.
Create and simulate ABB, KUKA, UR, and Staubli robot programs. This package is for development of Rhino and Grasshopper plug-ins.
Automatic generation of sitemap.xml and robots.txt based on ASP.NET routes. Extensible engine allows custom URL providers to get URLs from any source. Includes exclude from sitemap feature and video sitemap support.
IDeliverable.Seo is a module for the Orchard CMS that gives site owners robust, granular control over SEO. The module lets site owners control page titles, meta keywords and descriptions, robots.txt, and sitemap.xml. Additionally, it allows 301/302 redirects to be configured for changed URLs.
https://github.com/nicholasbergesen/RobotsParser/blob/master/README.md
Algo trading library (Account emulator and auxiliary classes). More on www.rapiddev.org.
robots.txt implementation for EPiServer
Parsers for the Robots Exclusion Standard (aka robots.txt), the Robots Meta Tag, and the X-Robots-Tag header. Visit the project site for documentation.
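The three mechanisms named above carry the same crawler directives over different channels: robots.txt rules per path, a `<meta name="robots">` tag per page, and the X-Robots-Tag HTTP header per response. The meta-tag and header forms share one comma-separated directive syntax, which can be sketched generically (the directive values here are invented examples, not this package's API):

```python
# Sketch: parse a robots meta-tag or X-Robots-Tag value such as
# "noindex, nofollow" into a set of normalized directives.
# Conceptual illustration only, not the listed package's API.
def parse_robots_directives(value):
    """Split a comma-separated robots directive string into a set."""
    return {d.strip().lower() for d in value.split(",") if d.strip()}

print(parse_robots_directives("noindex, NOFOLLOW"))
```

A crawler would check the resulting set (e.g. for "noindex") before adding the page to its index, regardless of which channel delivered the directives.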
The RoboDK API allows you to interact with industrial robots and the RoboDK software. RoboDK allows you to simulate and program industrial robots for any manufacturing operation, such as robot machining, inspection, or pick-and-place applications. You can easily program and debug any industrial robot arm under the same development environment.
Implements communication with Universal Robots industrial robots (RTDE, Primary interfaces, Dashboard Server, REST API, SSH, SFTP, XML-RPC, Sockets, Interpreter Mode). A fully managed commercial .NET DLL without dependencies, including offline tools such as forward and inverse kinematics and pose conversion.
Sitemap and robots dynamic generator core classes
A package for creating robots.txt files in ASP.NET Core applications.
AngleSharp IHtmlParser Extension for Mechanize.NET