Avyiel
2y

Google just open-sourced their robots.txt parser:

https://opensource.googleblog.com/2...

Comments
  • 4
    Quote:

    "On the other hand, for crawler and tool developers, it also brought uncertainty; for example, how should they deal with robots.txt files that are hundreds of megabytes large?"

    What the fucking frigging fuck?

    I mean... 100 MB?! And more?!
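    For context on why crawlers need a cap at all: RFC 9309 (the Robots Exclusion Protocol) says parsers must support at least 500 KiB of robots.txt, and Google documents that its parser simply ignores content past that limit. A minimal sketch of that defensive truncation (the function name and structure are my own, not from Google's code):

    ```python
    # Minimum parsing limit required by RFC 9309; Google uses the same figure.
    MAX_ROBOTS_SIZE = 500 * 1024  # 500 KiB

    def truncate_robots(data: bytes) -> str:
        """Keep only the first MAX_ROBOTS_SIZE bytes of a robots.txt body.

        A 100+ MB file thus costs the crawler at most 500 KiB of parsing,
        no matter how large the server's response is.
        """
        return data[:MAX_ROBOTS_SIZE].decode("utf-8", errors="replace")

    print(len(truncate_robots(b"x" * (600 * 1024))))  # capped at 512000
    ```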
  • 7
    @IntrusionCM it’s not illegal.

    Some time ago I put various DROP TABLE statements in my robots.txt, just on the off chance I could wreck some robot's database.
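    For what it's worth, a spec-compliant parser just skips lines it doesn't recognize, so the payload is harmless to the parser itself; whether a sloppy scraper pipes raw lines into a database unescaped is another matter. A quick sketch with Python's stdlib `urllib.robotparser` (the file contents below are a made-up example):

    ```python
    from urllib import robotparser

    # A hypothetical robots.txt carrying an SQL-injection "payload".
    # The payload line has no "field: value" shape, so the parser drops it.
    content = """User-agent: *
    Disallow: /private/
    '; DROP TABLE robots; --
    """

    rp = robotparser.RobotFileParser()
    rp.parse(content.splitlines())

    print(rp.can_fetch("*", "http://example.com/private/page"))  # False
    print(rp.can_fetch("*", "http://example.com/index.html"))    # True
    ```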