sjdirect / nrobots
The Robots Exclusion Protocol, or robots.txt protocol, is a convention that lets cooperating web spiders and other web robots be kept out of all or part of a website that is otherwise publicly viewable. This project provides an easy-to-use class, implemented in C#, for working with robots.txt files (see the sketch below).
☆15 · Updated 7 years ago
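The repository listing does not show nrobots' actual API, so as a hedged illustration only, here is a minimal sketch of what parsing a robots.txt file and checking a path against its `Disallow` rules can look like in C#. The `SimpleRobotsParser` class and its members are hypothetical names invented for this example, not nrobots' real types.

```csharp
// Minimal sketch of robots.txt handling. Class and method names are
// hypothetical illustrations, NOT nrobots' actual API.
using System;
using System.Collections.Generic;
using System.Linq;

public class SimpleRobotsParser
{
    private readonly List<string> _disallowed = new List<string>();

    // Collect Disallow rules from the group that applies to all
    // user agents ("User-agent: *").
    public SimpleRobotsParser(string robotsTxt)
    {
        bool inWildcardGroup = false;
        foreach (var rawLine in robotsTxt.Split('\n'))
        {
            var line = rawLine.Split('#')[0].Trim(); // strip comments
            if (line.Length == 0) continue;

            var parts = line.Split(new[] { ':' }, 2);
            if (parts.Length != 2) continue;
            var field = parts[0].Trim().ToLowerInvariant();
            var value = parts[1].Trim();

            if (field == "user-agent")
                inWildcardGroup = value == "*";
            else if (field == "disallow" && inWildcardGroup && value.Length > 0)
                _disallowed.Add(value);
        }
    }

    // A path is allowed unless it starts with a disallowed prefix.
    public bool IsAllowed(string path) =>
        !_disallowed.Any(p => path.StartsWith(p, StringComparison.Ordinal));
}

public static class Program
{
    public static void Main()
    {
        var robots = new SimpleRobotsParser("User-agent: *\nDisallow: /private/\n");
        Console.WriteLine(robots.IsAllowed("/private/data.html")); // False
        Console.WriteLine(robots.IsAllowed("/public/index.html")); // True
    }
}
```

A real implementation would also handle `Allow` directives, per-agent groups, and wildcard patterns, which is exactly the bookkeeping a library like nrobots is meant to take off your hands.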
Alternatives and similar repositories for nrobots
Users interested in nrobots are comparing it to the libraries listed below.
- An asynchronous web scraper / web crawler using async/await and Reactive Extensions ☆59 · Updated 8 years ago
- A cross-platform client for JabbR ☆56 · Updated 11 years ago
- A distributed network storage system with a hybrid data model