robots.txt
Hello, how can I exploit a site's robots.txt file? For example, at www.randomwebsite.com/robots.txt all I see is User-agent: * followed by some Disallow lines. How can I use this to my advantage?
Robots.txt is just information gathering. It lists the directories the site is telling Google (and other crawlers) not to index when crawling the web. Those paths often aren't linked anywhere on the site and aren't always protected by anything else, so you can sometimes browse them directly.
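Here's a minimal sketch of that idea: pull down a site's robots.txt and print the Disallow'd paths as full URLs so you can probe them by hand. The target URL is just a placeholder from the question, swap in whatever you're looking at.

```python
import urllib.request

TARGET = "http://www.randomwebsite.com"  # placeholder target

with urllib.request.urlopen(TARGET + "/robots.txt") as resp:
    body = resp.read().decode("utf-8", errors="replace")

for line in body.splitlines():
    line = line.strip()
    # Disallow lines name the paths the site asked crawlers to skip
    if line.lower().startswith("disallow:"):
        path = line.split(":", 1)[1].strip()
        if path:
            print(TARGET + path)
```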
You can also sometimes access content normally meant for logged-in users only, by swapping your user agent for one of the allowed ones (e.g. Googlebot), since some sites serve full content to crawlers without requiring a login.
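A minimal sketch of the user-agent swap, assuming the site decides what to serve based on the User-Agent header alone; the URL is hypothetical:

```python
import urllib.request

URL = "http://www.randomwebsite.com/members-only/"  # hypothetical path
GOOGLEBOT_UA = ("Mozilla/5.0 (compatible; Googlebot/2.1; "
                "+http://www.google.com/bot.html)")

# Re-request the page claiming to be Googlebot
req = urllib.request.Request(URL, headers={"User-Agent": GOOGLEBOT_UA})
with urllib.request.urlopen(req) as resp:
    print(resp.read().decode("utf-8", errors="replace")[:500])
```

This only works when the check is the UA string itself; sites that verify crawlers by reverse DNS or IP range won't fall for it.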