robotstxt

A 'robots.txt' Parser and 'Webbot'/'Spider'/'Crawler' Permissions Checker

Provides functions to download and parse 'robots.txt' files. Ultimately, the package makes it easy to check whether bots (spiders, crawlers, scrapers, ...) are allowed to access specific resources on a domain.
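A minimal sketch of typical usage, assuming network access and the CRAN version of the package (the domain and paths below are illustrative examples only):

    library(robotstxt)

    # Download and parse a domain's robots.txt into an R object
    rtxt <- robotstxt(domain = "wikipedia.org")

    # Check whether the default bot ("*") may access specific paths
    rtxt$check(paths = c("/", "/w/"), bot = "*")

    # Convenience wrapper: download, parse, and check in one call
    paths_allowed(paths = "/api/", domain = "wikipedia.org", bot = "*")

Both approaches return logical values, one per path, so permission checks can be vectorized over many resources at once.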

Versions across snapshots

Version   Snapshot     Repository              File                   Size
0.7.15    2026-04-09   windows/windows R-4.5   robotstxt_0.7.15.zip   208.7 KiB

Dependencies (latest)

Imports

Suggests