Hello amazing hacker, and welcome! I hope you are doing well. Please sit back and relax; Uncle Rat is going to tell you the story of how he found his first critical vulnerability. I'm going to tell you the events that led up to this bug, what I was thinking at the time, and how I felt when I finally got that vulnerability.
It all started with me wanting to hunt a VDP (vulnerability disclosure program), and I know what you're thinking: "Eh, Uncle, no cash is no fun." But hold on. I knew this program paid out bonuses for crits, so I set my arrows to [REDACTED].
I usually hunt paid B2B programs, but this program was totally different: it was a recon target, so I had to change up my strategy completely, but I was determined to get a bug. I knew how to do recon; I just hadn't applied it up to this point. One of my heroes is https://twitter.com/stokfredrik, and he also hunts main apps only, so I'd devoted all my hunting time to main apps until this point.
I had my target. This small rat was ready to sneak past any security defences [Redacted] threw at me, and believe me, friends: rats are sneaky animals, and smart. Very smart.
The hunt is on
To start, I had to begin where everyone begins. Since the target had a very broad scope, I had to enumerate everything first. I had to scout my grounds.
I always start with subdomain enumeration, and it's my belief that to do this effectively you have to consult as many sources as possible, so I went off to run every single tool I could find:
- And the list goes on…
This left me with a bunch of lists that all contained subdomains. I merged them by writing a simple Python script that would:
- Read all the files
- Check whether it had already written the subdomain to a new file
- If not, write it to the new file
- If it had, simply discard the duplicate

This left me with one nice, clean list.
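The merge step above can be sketched in a few lines of Python. The filenames are placeholders, not the ones from the actual hunt:

```python
def merge_unique(in_paths, out_path):
    """Merge subdomain lists from in_paths into out_path, skipping duplicates."""
    seen = set()
    with open(out_path, "w") as out:
        for path in in_paths:
            with open(path) as f:
                for line in f:
                    sub = line.strip().lower()  # normalise case so Foo.com == foo.com
                    if sub and sub not in seen:
                        seen.add(sub)
                        out.write(sub + "\n")

# Example: merge_unique(["amass.txt", "subfinder.txt"], "all_subdomains.txt")
```

Using a set makes the lookup O(1) per subdomain, so this stays fast even with tens of thousands of entries.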
Narrowing down thousands of domains
Now we have a HUGE list of subdomains, but that's still not very useful, so we need to perform some more scoping steps before we can move on.
First we need to see which subdomains are live. For this, I used TomNomNom's tool httprobe, which takes a list of domains and probes for working HTTP and HTTPS servers. Install it with `go get -u github.com/tomnomnom/httprobe`, then feed it your list on stdin: `cat all_subdomains.txt | httprobe > live.txt`.
This left me with a list of subdomains I knew were live and responding. This was a much more workable list, but it still contained over 1,100 subdomains, and that's still not very handy. I can't go through all of those manually.
Looking for the gold
At this point I still had no clue at all what I was looking for. I was just grabbing blindly in the dark and seeing what happened. That had to change. I had to know what I was dealing with, so I ran the next tool in my toolbelt. I wanted a screenshot of every domain, and I used aquatone for this.
aquatone: A Tool for Domain Flyovers (github.com/michenriksen/aquatone).
This left me with a nice gallery of screenshots I could glance over to give me some ideas. I saw a lot of random stuff: a lot of error pages, a lot of default web server pages, but also a lot of login pages.
Now that was interesting. I love login pages. A big portion of them were default login pages for commercial or open-source tools, so I tried the default credentials and… NOTHING! Dang, I did not find anything, but during my search I came across what would bring me my first ever crit.
Going for the kill
Custom login pages
During my search I came across a lot of custom login pages as well, and it got me wondering how I could hack these. I tried the first thing I could think of, which was weak credentials (test/test, admin/admin, …), but none of these got me any results.
But then it occurred to me… Custom login pages… They are trying to protect something… but the developer has to protect every page separately if they are not careful…
I have the privilege of owning a copy of Burp Suite Pro, which is paid for by my boss. Burp Suite Pro has EXCELLENT engagement tools, such as content discovery, so I screwed the number of threads down to just 1, built in a delay, and let the thing go to work on several subdomains at a time. I did this because I really did not want to negatively impact the performance of any system. This is VERY important to know, guys!
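Burp's content discovery is point and click, but the core idea, requesting wordlist entries one at a time with a delay between them, can be sketched roughly like this. The target URL and wordlist here are hypothetical placeholders, and this is a bare-bones approximation of what the tool does, not Burp's actual implementation:

```python
import time
import urllib.request
import urllib.error

def discover(base_url, wordlist, delay=1.0):
    """Request base_url/<word> single-threaded, pausing `delay` seconds between requests."""
    found = []
    for word in wordlist:
        url = f"{base_url.rstrip('/')}/{word}"
        try:
            with urllib.request.urlopen(url, timeout=10) as resp:
                if resp.status == 200:
                    found.append(url)
        except (urllib.error.HTTPError, urllib.error.URLError):
            pass  # 404s, connection errors etc. are expected noise
        time.sleep(delay)  # be gentle: at most one request per `delay` seconds
    return found

# Example (hypothetical target): discover("https://sub.example.com", ["admin", "search", "backup"])
```

The single thread plus the sleep is what keeps this from hammering the target; that politeness is the whole point of the paragraph above.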
After a couple of hours, one of my content discovery windows came back with the result I was waiting for:
I rocketed out of my seat and went to work, opening this thing in my browser, not knowing what to expect, and I could not believe my eyes.
I opened Firefox, threw the URL in the browser, and lo and behold, there was a search page. This didn't mean anything yet, as I had no clue what to search for or whether I could even search in the first place. The first thing I could think of to look for was "HR", but nothing popped up. I found some boring business documents, which would make for a high-severity issue at best.
I started searching every single letter of the alphabet. There had to be something here; I knew it, I could feel it! And at "p" I got what I wanted: "Personnel.xlsx". PII of about 1,500 people, and here in Europe we take that very seriously. Critically seriously, even. I went to work writing my report, submitted it after rechecking a thousand times that I hadn't made any mistakes, and then the wait began…
The tension is killing me
For the next day I kept my phone in hand and checked my email every 5 minutes, to the point that I had to charge the phone about 7 times that day. Nothing came, but I would not hang my head; I was sure it would get accepted. This was just too good.
Just before I was ready to give up and put my phone down at 5 p.m. that day, it happened: I saw that almighty "triaged" we are all looking for.
Just one hour later I got an email that the issue was fixed and that they would offer me a 250 euro bonus for finding it. A few days later I helped them retest, and I never touched recon again.
And we all lived happily ever after, and the internet was a bit more secure again. Thank you for reading, and I hope you enjoyed it! Uncle Rat, out!