robotstxt
A 'robots.txt' Parser and 'Webbot'/'Spider'/'Crawler' Permissions Checker
Provides functions to download and parse 'robots.txt' files. Ultimately, the package makes it easy to check whether bots (spiders, crawlers, scrapers, ...) are allowed to access specific resources on a domain.
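A minimal sketch of typical usage. `robotstxt()`, `$check()`, and `paths_allowed()` are the package's documented entry points; the domain and paths below are just illustrative, and the calls require network access:

```r
library(robotstxt)

# Download and parse a domain's robots.txt once ...
rt <- robotstxt(domain = "en.wikipedia.org")

# ... then check whether specific paths may be crawled
rt$check(paths = c("/wiki/Main_Page", "/w/index.php"))

# Or use the convenience function for one-off checks
paths_allowed(paths = "/wiki/Main_Page", domain = "en.wikipedia.org")
```

Parsing once via `robotstxt()` and reusing the object avoids re-downloading the file for every permission check.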
- Version: 0.7.15
- R version: ≥ 3.0.0
- License: MIT + file LICENSE
- Needs compilation? No
- Last release: 08/29/2024
Documentation
Team
- Pedro Baltazar
- Peter Meissner (Author)
- Kun Ren (Author, Copyright holder)
- Oliver Keys (Contributor)
- Rich Fitz John (Contributor)
Insights
- Downloads per day, last 30 days (line graph; data provided by CRAN)
- Downloads per day, last 365 days (line graph; data provided by CRAN)
Binaries
Dependencies
- Imports5 packages
- Suggests6 packages
- Reverse Imports2 packages
- Reverse Suggests4 packages