spiderbar
Parse and Test Robots Exclusion Protocol Files and Rules
The 'Robots Exclusion Protocol' (https://www.robotstxt.org/orig.html) documents a set of standards for allowing or excluding robot/spider crawling of different areas of site content. Tools are provided which wrap the 'rep-cpp' (https://github.com/seomoz/rep-cpp) C++ library for processing these 'robots.txt' files.
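spiderbar exposes this functionality in R (parsing a robots.txt file and testing whether a given path may be crawled by a given user agent). As a language-neutral sketch of the same protocol semantics, here is an illustration using Python's standard-library `urllib.robotparser`; the robots.txt content below is hypothetical and the Python module is not part of spiderbar.

```python
from urllib.robotparser import RobotFileParser

# Hypothetical robots.txt content, for illustration only
robots_txt = """\
User-agent: *
Disallow: /private/
Allow: /
Crawl-delay: 10
"""

rp = RobotFileParser()
rp.parse(robots_txt.splitlines())

# Ask whether the wildcard user agent may fetch each path;
# rules are matched in order, so /private/ paths hit the Disallow first.
print(rp.can_fetch("*", "/index.html"))    # allowed by "Allow: /"
print(rp.can_fetch("*", "/private/data"))  # blocked by "Disallow: /private/"
print(rp.crawl_delay("*"))                 # the declared crawl delay
```

The same questions (fetchability of a path, crawl delay, sitemap discovery) are what a Robots Exclusion Protocol parser such as rep-cpp answers for a crawler before it requests a URL.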
- Version: 0.2.5
- R version: unknown
- License: MIT + file LICENSE
- Needs compilation? Yes
- Last release: 02/11/2023
Team
- Bob Rudis (Author)
- SEOmoz, Inc.
Insights
- Daily download counts for the last 30 and 365 days (data provided by CRAN).
Dependencies
- Imports: 1 package
- Suggests: 3 packages
- Linking To: 1 package
- Reverse Imports: 1 package