What is it?
A robots.txt file works like a “No Trespassing” sign: it tells robots whether they are allowed to crawl the website. With the rule served here, we disallow all robots from crawling the site, which also keeps it from being indexed by search engines.
sb install sandbox-traefik_robotstxt
When you request the robots.txt file, you get a response like this:
```
HTTP/1.1 200 OK
Content-Length: 26

User-agent: *
Disallow: /
```
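To confirm what this rule means to a crawler, the response body above can be fed to Python's standard-library robots.txt parser; this is just an illustrative sketch, not part of the plugin itself:

```python
from urllib.robotparser import RobotFileParser

# The robots.txt body served by the plugin, as shown in the response above.
robots_body = "User-agent: *\nDisallow: /\n"

parser = RobotFileParser()
parser.parse(robots_body.splitlines())

# "Disallow: /" under "User-agent: *" blocks every path for every robot.
print(parser.can_fetch("Googlebot", "/"))          # False
print(parser.can_fetch("AnyBot", "/some/page"))    # False
```

Any well-behaved crawler reading these two lines will skip the entire site.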