
spiderbar

Parse and Test Robots Exclusion Protocol Files and Rules

The 'Robots Exclusion Protocol' <https://www.robotstxt.org/orig.html> documents a set of standards for allowing or excluding robot/spider crawling of different areas of site content. Tools are provided which wrap the 'rep-cpp' <https://github.com/seomoz/rep-cpp> C++ library for processing these 'robots.txt' files.
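As a minimal sketch of the package's intended workflow, the snippet below parses a small, made-up robots.txt body and tests paths against it. It uses `robxp()`, `can_fetch()`, and `crawl_delays()` from spiderbar; the sample rules are invented for illustration, not taken from any real site.

```r
library(spiderbar)

# Parse robots.txt content (a character vector of lines) into a 'robxp' object
rt <- robxp(c(
  "User-agent: *",
  "Crawl-delay: 5",
  "Disallow: /private/",
  "Allow: /"
))

# Test individual paths against the parsed rules (default user agent "*")
can_fetch(rt, "/index.html")          # allowed by the 'Allow: /' rule
can_fetch(rt, "/private/data.html")   # blocked by the 'Disallow: /private/' rule

# Inspect any declared crawl delays as a data frame
crawl_delays(rt)
```

In practice the character vector passed to `robxp()` would typically come from fetching a site's `/robots.txt` (e.g. via `readLines()` on the URL) rather than being written inline.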

Versions across snapshots

Version | Snapshot   | Repository    | File                | Size
0.2.5   | 2026-04-09 | windows R-4.5 | spiderbar_0.2.5.zip | 459.3 KiB

Dependencies (latest)

Imports

LinkingTo

Suggests