Basic Fuzzing

Directory Fuzzing

As we can see from the example above, the two main options are -w for wordlists and -u for the URL. We can assign a keyword to a wordlist and use that keyword to refer to the wordlist wherever we want to fuzz. For example, we can assign the keyword FUZZ to our wordlist by appending :FUZZ to its path:

ffuf -w /usr/share/seclists/Discovery/Web-Content/directory-list-2.3-small.txt:FUZZ

Next, as we want to be fuzzing for web directories, we can place the FUZZ keyword where the directory would be within our URL, with:

ffuf -w <SNIP> -u http://<target_URL>/FUZZ

Now, let's start our target in the question below and run our final command on it:

ffuf -w /usr/share/seclists/Discovery/Web-Content/directory-list-2.3-small.txt:FUZZ -u http://www.example.com/FUZZ

The output looks something like this:


        /'___\  /'___\           /'___\       
       /\ \__/ /\ \__/  __  __  /\ \__/       
       \ \ ,__\\ \ ,__\/\ \/\ \ \ \ ,__\      
        \ \ \_/ \ \ \_/\ \ \_\ \ \ \ \_/      
         \ \_\   \ \_\  \ \____/  \ \_\       
          \/_/    \/_/   \/___/    \/_/       

       v1.1.0-git
________________________________________________

 :: Method           : GET
 :: URL              : http://www.example.com/FUZZ
 :: Wordlist         : FUZZ: /usr/share/seclists/Discovery/Web-Content/directory-list-2.3-small.txt
 :: Follow redirects : false
 :: Calibration      : false
 :: Timeout          : 10
 :: Threads          : 40
 :: Matcher          : Response status: 200,204,301,302,307,401,403
________________________________________________

home                    [Status: 301, Size: 326, Words: 20, Lines: 10]
:: Progress: [87651/87651] :: Job [1/1] :: 9739 req/sec :: Duration: [0:00:09] :: Errors: 0 ::

We see that ffuf tested almost 90,000 URLs in under 10 seconds. The speed may vary with your internet speed and latency if you run ffuf on your own machine, but it should still be very fast.

If you're in a hurry, you can make it even faster by increasing the number of threads to 200 with -t 200. However, this is not recommended, especially against a remote site, as it might disrupt the site, cause a Denial of Service, or even saturate your own internet connection in extreme cases.

We do find a few results, and we can visit one to verify that it exists.

We get an empty page, which means the directory doesn't have a page of its own, but we do have access to it, since we don't get a 404 Not Found or 403 Forbidden error.


Page Fuzzing

We now understand the basic use of ffuf with wordlists and keywords. Next, we will learn how to locate pages.

Extension Fuzzing

In the previous section, we found that we could access /home, but the directory returned an empty page, and we couldn't find any links or pages manually. So, we'll use web fuzzing again to check whether the directory contains any hidden pages. Before starting, we need to know which page types the website uses, such as .html, .aspx, or .php.

A common way to figure out the file extension is to look at the server type in the HTTP response headers and make an educated guess. For example, an Apache server might use .php, while IIS could use .asp or .aspx. But this method isn't very reliable, so we'll use ffuf again to fuzz the extension, just like we did for directories.
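The educated guess above can be sketched with a mocked set of response headers; on a live target you would fetch real headers with a HEAD request (e.g. curl -I http://www.example.com). The header values here are made up for illustration:

```shell
# Mocked HTTP response headers (assumed example, not from a live target).
headers='HTTP/1.1 200 OK
Server: Apache/2.4.41 (Ubuntu)
Content-Type: text/html'

# Pull out the Server header to guess the likely stack
# (Apache -> .php is a reasonable first guess).
printf '%s\n' "$headers" | grep -i '^Server:'
```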

Instead of placing the FUZZ keyword where the directory name would go, we'll place it where the extension would go, as in .FUZZ, and use a wordlist of common extensions. SecLists provides one at seclists/Discovery/Web-Content/web-extensions.txt. Before we start testing, we need to decide which filename to test extensions against. We could always use two wordlists with a unique keyword for each and fuzz both at once with FUZZ_1.FUZZ_2. However, there is one file we can usually find on most websites, index.*, so we will use it as our filename and test extensions on it.
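To see how two keywords expand into candidate requests, here is a tiny sketch with toy wordlists (the names and extensions below are assumed, just to show the combinations; a real run would pass two SecLists files to ffuf as -w names.txt:FUZZ_1 -w extensions.txt:FUZZ_2 with -u http://www.example.com/blog/FUZZ_1.FUZZ_2):

```shell
# Toy wordlists (assumed for illustration only).
names="index home"
exts="php html"

# Every FUZZ_1.FUZZ_2 combination ffuf would request:
for n in $names; do
  for e in $exts; do
    echo "http://www.example.com/blog/${n}.${e}"
  done
done
```

With two wordlists, the request count is the product of their sizes, which is why fixing one side to index.* keeps the scan fast.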

Now, we can rerun our command, carefully placing our FUZZ keyword where the extension would be after index:

ffuf -w /usr/share/seclists/Discovery/Web-Content/web-extensions.txt:FUZZ -u http://www.example.com/blog/indexFUZZ

Page Fuzzing

We will now use the same concept of keywords we've been using with ffuf, use .php as the extension, place our FUZZ keyword where the filename should be, and use the same wordlist we used for fuzzing directories:

ffuf -w /usr/share/seclists/Discovery/Web-Content/directory-list-2.3-small.txt:FUZZ -u http://www.example.com/blog/FUZZ.php

Recursive Fuzzing

So far, we have been looking for directories, then checking inside them for files. But if there are many directories, each with their own subdirectories and files, this process would take a long time. To speed this up, we will use something called recursive fuzzing.

Recursive Flags

When we scan recursively, ffuf automatically starts another scan under every newly identified directory, continuing until it has fuzzed the main website and all of its subdirectories.

Some websites have many sub-directories, like /login/user/content/uploads/...etc, which can make the scanning process take a long time. To manage this, it's recommended to set a depth limit for our recursive scan, so it doesn't go deeper than that level. After scanning the first set of directories, we can choose the most interesting ones and run another scan to focus our efforts better.

In ffuf, we can turn on recursive scanning with the -recursion flag and set how deep it goes with the -recursion-depth flag. If we set -recursion-depth 1, it will only fuzz the main directories and their direct sub-directories, not anything deeper (like /login/user). When using recursion in ffuf, we can also specify our file extension with -e .php.

Finally, we will add the -v flag to show the full URLs. This way, we can easily see which .php file is in each directory.

Recursive Scanning

Let us repeat the first command we used, add the recursion flags to it while specifying .php as our extension, and see what results we get:

ffuf -w /usr/share/seclists/Discovery/Web-Content/directory-list-2.3-small.txt:FUZZ -u http://www.example.com/FUZZ -recursion -recursion-depth 1 -e .php -v

As we can see, this time the scan took much longer and sent almost six times as many requests, since the effective wordlist doubled in size (every word tested once with .php and once without). Still, we got a large number of results, including everything we previously identified, all from a single command.
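A back-of-the-envelope estimate shows where the "almost six times" figure can come from; the number of recursed directories below is an assumption for illustration, not taken from a real scan:

```shell
# Rough request-count estimate (illustrative; "dirs" is assumed).
wordlist=87651              # entries in directory-list-2.3-small.txt
per_dir=$((wordlist * 2))   # -e .php tests each word with and without the extension
dirs=3                      # main URL plus two discovered sub-directories (assumed)
echo "$((per_dir * dirs)) requests"
```

Doubling the wordlist and scanning the main URL plus two sub-directories yields exactly six times the original request count, which matches the observed slowdown.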


Matches and Filters Flags in FFUF

FFUF provides powerful matching (-m*) and filtering (-f*) options to refine fuzzing results based on HTTP responses.

Matching Options (-m flags)

These flags match responses based on status codes, response size, word count, and line count.

Flag    Description                    Example Usage
-mc     Match HTTP status codes        -mc 200,302 (show only 200 & 302 responses)
-ms     Match HTTP response size       -ms 500 (show responses with size 500 bytes)
-mw     Match response word count      -mw 50 (show responses with 50 words)
-ml     Match response line count      -ml 10 (show responses with 10 lines)
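The effect of a matcher can be sketched against mock scan output; the result lines below are made up ("path status" pairs), and awk stands in for ffuf's internal matching logic:

```shell
# Mock scan results (made up for illustration): "path status" pairs.
# Emulate what -mc 200,301 would keep: only lines whose status is 200 or 301.
printf '%s\n' 'home 301' 'secret 403' 'index 200' \
  | awk '$2 == 200 || $2 == 301'
```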

Filtering Options (-f flags)

These flags filter out unwanted responses based on status codes, response size, word count, and line count.

Flag    Description                      Example Usage
-fc     Filter out HTTP status codes     -fc 403,404 (hide 403 & 404 responses)
-fs     Filter out response size         -fs 1000 (hide responses of 1000 bytes)
-fw     Filter out response word count   -fw 20 (hide responses with 20 words)
-fl     Filter out response line count   -fl 5 (hide responses with 5 lines)
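Filtering by size is especially handy when a site returns the same default page for every miss. The sketch below uses made-up "path status size" lines, with awk emulating what -fs 1000 would hide:

```shell
# Mock results (made up): a 1000-byte default page drowns out the real hit.
# Emulate -fs 1000: drop every response whose size is exactly 1000 bytes.
printf '%s\n' 'admin 200 1520' 'foo 200 1000' 'bar 200 1000' \
  | awk '$3 != 1000'
```

In practice, you would first note the size of the junk responses in a trial run, then rerun ffuf with -fs set to that size.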