spiderbar
Parse and Test Robots Exclusion Protocol Files and Rules
The Robots Exclusion Protocol (https://www.robotstxt.org/orig.html) documents a set of standards for allowing or excluding robot/spider crawling of different areas of site content. Tools are provided which wrap the 'rep-cpp' (https://github.com/seomoz/rep-cpp) C++ library for processing these 'robots.txt' files.
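A minimal sketch of the intended workflow, using the package's exported `robxp()` and `can_fetch()` helpers (the robots.txt content below is illustrative, not from a real site):

```r
library(spiderbar)

# Parse robots.txt content (a character vector of lines,
# e.g. as returned by readLines()) into a robots-exclusion object
rt <- robxp(c(
  "User-agent: *",
  "Disallow: /private/",
  "Crawl-delay: 5",
  "Sitemap: https://example.com/sitemap.xml"
))

# Test whether specific paths may be crawled by a given user agent
can_fetch(rt, "/index.html")  # TRUE
can_fetch(rt, "/private/x")   # FALSE

# Inspect crawl-delay directives and sitemap entries
crawl_delays(rt)
sitemaps(rt)
```

In a real crawler you would fetch the target site's `/robots.txt` first and call `can_fetch()` before each request.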
- Version: 0.2.5
- R version: unknown
- License: MIT + file LICENSE
- Needs compilation?: Yes
- Last release: 02/11/2023
Documentation
Team
Bob Rudis
SEOmoz, Inc
Roles: Author
Insights
Last 30 days
This package has been downloaded 1,727 times in the last 30 days; yesterday it was downloaded 88 times.
Last 365 days
This package has been downloaded 20,655 times in the last 365 days. The day with the most downloads was Oct 30, 2024, with 266 downloads.
Data provided by CRAN
Binaries
Dependencies
- Imports: 1 package
- Suggests: 3 packages
- Linking To: 1 package
- Reverse Imports: 1 package