robotstxt
A 'robots.txt' Parser and 'Webbot'/'Spider'/'Crawler' Permissions Checker
Provides functions to download and parse 'robots.txt' files. Ultimately, the package makes it easy to check whether bots (spiders, crawlers, scrapers, ...) are allowed to access specific resources on a domain.
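A minimal sketch of typical usage follows, assuming the package is installed from CRAN; the domain and paths are illustrative examples only.

```r
library(robotstxt)

# Check whether specific paths may be crawled on a domain.
# The default bot is "*" (all user agents).
paths_allowed(
  paths  = c("/images/", "/search"),
  domain = "example.com"
)
# Returns a logical vector, one element per path.

# Alternatively, download and parse the robots.txt once,
# then query the resulting object repeatedly.
rt <- robotstxt(domain = "example.com")
rt$check(paths = "/images/", bot = "*")
```

Both calls fetch the domain's robots.txt over the network, so results depend on the live file at the time of the request.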
- https://docs.ropensci.org/robotstxt/
- Version: 0.7.15
- R version: ≥ 3.0.0
- License: MIT + file LICENSE
- Needs compilation: No
- Last release: 08/29/2024
Documentation
Team
- Pedro Baltazar
- Peter Meissner (Author)
- Kun Ren (Author, Copyright holder)
- Oliver Keys (Contributor)
- Rich Fitz John (Contributor)
Insights
Download counts per day for the last 30 and 365 days; data provided by CRAN.
Dependencies
- Depends: 1 package
- Imports: 6 packages
- Suggests: 6 packages
- Reverse Imports: 2 packages
- Reverse Suggests: 4 packages