New hash-cracking idea
While building a hash cracker I came up with an alternative approach. I'm not sure whether it's a good idea, but I'd like to hear your thoughts on it.
Having about 1 GB of web space, I was thinking about how to store the hashes, and I came up with the idea of letting Google permanently index my site. People searching Google for a specific hash would land on the right page and could simply read off the plaintext.
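A minimal sketch of what such an indexable page could look like, assuming a plain wordlist and MD5 (the wordlist and page layout here are made up for illustration; this is not the actual hash.php):

```python
import hashlib

# Hypothetical wordlist; a real site would use a much larger one.
words = ["password", "letmein", "dragon"]

# Emit one "hash : plaintext" line per word, so a search engine that
# indexes the page makes each hash findable by searching for it.
rows = []
for word in words:
    digest = hashlib.md5(word.encode("utf-8")).hexdigest()
    rows.append(f"<li>{digest} : {word}</li>")

page = "<html><body><ul>\n" + "\n".join(rows) + "\n</ul></body></html>"
print(page)
```

Anyone who then googles e.g. `5f4dcc3b5aa765d61d8327deb882cf99` (the MD5 of "password") would, in theory, find the page and the plaintext next to it.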
What do you guys think: could this be a good alternative to password cracking, or is there a hole in this fairytale?
I am concerned about the amount of bandwidth this might burn.
Here it is: https://root.cd/hashcracker/hash.php
By the way, this is NOT an invitation to hack the site; it's in a redesign phase, so please leave it alone. Later on I'll let you guys pentest it again, with cash prizes included.
jelmer wrote: it's not sensitive data or anything, they're just boring hashes; I don't see how Google could abuse this.
It's not abuse that you should be worried about. It would be a trivial task for Google to stop indexing large collections of MD5 hashes, or to stop people from searching for one.
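Spotting MD5-shaped strings really is trivial. A crawler could flag hash dumps with something like the following sketch (this only illustrates the point; the function name and threshold are made up, and it is not Google's actual logic):

```python
import re

# A hex-encoded MD5 digest is exactly 32 hex characters.
MD5_RE = re.compile(r"\b[0-9a-fA-F]{32}\b")

def looks_like_hash_dump(text, threshold=100):
    """Flag pages containing many MD5-shaped tokens (threshold is arbitrary)."""
    return len(MD5_RE.findall(text)) >= threshold

# A page consisting of hundreds of hashes trips the filter.
sample = " ".join("5f4dcc3b5aa765d61d8327deb882cf99" for _ in range(150))
print(looks_like_hash_dump(sample))
```

A normal page with the odd hash in it would stay below any sensible threshold, so only bulk hash listings would be filtered out of the index.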
Well, yeah, I can't really imagine a situation where Google would care, as long as they don't lose money or reputation. They know there are ways to proxy traffic through Google via the translator, and since that costs them neither money nor reputation, they don't care much about that problem either. I don't think Google intended the translator tool to double as a proxy network, and it shouldn't be too hard for them to fix such issues, right? But with no money involved and no loss of reputation involved, is there really a problem involved?
I think it's a pretty cool idea. If you have a hash, just enter it in Google and you'll find out what it is.
A couple of questions, though:
- How would you store the list of hashes?
- Up to how many hashes are you going to store on there?
- Wouldn't it take a long time, and probably more storage than is economically feasible (unless you have the money for storage and bandwidth)?
I'm not trying to be a smartass or anything; I'm just wondering how you would achieve this.
- How would you store the list of hashes? I wouldn't: the idea is that Google will index me, so I don't need more disk space. That was exactly the problem that pushed me to find a different way of storing them.
- Up to how many hashes are you going to store on there? Until I run out of bandwidth, I think.
- Wouldn't it take a long time, and probably more storage than is economically feasible? You're right, but Google will index a lot of hashes before my bandwidth runs out, and even after that it would still be possible to hit the 'cached' link on Google, or see a glimpse of the result in the Google search snippet.