Rogue Robots and robots.txt
Most people who have had business websites for even a short time know that the robots.txt file contains statements to tell search engine robots which directories and/or pages not to access.
So-called legitimate search engine robots follow the directives in the robots.txt file.
Rogue robots might use the robots.txt file to determine where your secret pages are. Those pages must contain tasty information, the rogue robots' handlers may reason, because you're trying to hide them.
So, what does a person do?
One solution is to list a directory in the robots.txt file. Legitimate robots will stay away. Expect rogue robots to enter.
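As a sketch, a robots.txt entry that disallows a directory might look like the following (the directory name /private/ is illustrative):

```
User-agent: *
Disallow: /private/
```

Legitimate crawlers that honor the Robots Exclusion Protocol will skip /private/; nothing in the protocol itself enforces this, which is exactly why rogue robots can be expected to enter.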
In that directory, put an index.html file with any non-confidential content, or no content, so long as the file is there. That will prevent rogue robots from getting a directory listing.
Now, in that directory, you can create secret directories with names unlikely to be guessed. The secret directory names are seemingly random combinations of characters. The Password Generator could be used to come up with good/strong secret directory names.
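If you'd rather not use a password generator, a short script can produce similar hard-to-guess names. Below is a minimal sketch in Python; the function name and the 16-character length are illustrative choices, not anything required by the technique.

```python
import secrets
import string

def secret_dir_name(length=16):
    # Lowercase letters and digits only, so the name is safe
    # in URLs and on case-insensitive file systems.
    alphabet = string.ascii_lowercase + string.digits
    # secrets.choice draws from a cryptographically strong source,
    # making the name impractical to guess.
    return "".join(secrets.choice(alphabet) for _ in range(length))

print(secret_dir_name())
```

Each run prints a different random name; use the output as the secret directory's name.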
This protection by obfuscation can be very effective against rogue robots and even against snooping humans.
Content can be put in those secret directories without fear that it will be discovered by rogue robots or their handlers.
Will Bontrager