Manual Crawl (Proxy Mode)

Manual crawl is useful when you need to scan only part of a website.

  1. Start a new scan with the settings you want to use
  2. Choose Manual Crawl (Proxy Mode) from the Start Scan button


The scan will start; however, Netsparker will pause after requesting the starting URL and will not crawl any other links. You'll notice in the toolbar that the proxy has already started.


Now, open your browser and configure it to use Netsparker's proxy.

Sample Proxy Settings in Firefox


Sample Proxy Settings in Internet Explorer
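If you want to drive traffic through the proxy from a script rather than a browser, the same idea applies. Below is a minimal sketch using Python's standard library; the address 127.0.0.1:8080 is an assumption, so substitute whatever host and port Netsparker's toolbar shows for its proxy.

```python
import urllib.request

# Assumed proxy address; use the host:port shown in Netsparker's toolbar.
PROXY = "http://127.0.0.1:8080"

# Route plain-HTTP requests through the scanner's proxy so that every
# URL fetched with this opener is recorded in Netsparker's sitemap.
proxy_handler = urllib.request.ProxyHandler({"http": PROXY})
opener = urllib.request.build_opener(proxy_handler)

# Example (not executed here): fetch a page through the proxy.
# opener.open("http://test12.netsparker.com:60001/XSS-Basic/")
```

Any request made through `opener` then passes through the proxy, just as browser traffic would.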


Now all you need to do is browse the target application from your web browser. As soon as you visit new URLs, you'll see them appear in the Sitemap in Netsparker.


For example, after you visit http://test12.netsparker.com:60001/XSS-Basic/ from the browser, Netsparker captures that URL and shows it in the sitemap.

This is the new sitemap view:


Important Note

In Manual Crawl (Proxy Mode), Netsparker will not find and crawl new links on its own. This includes parameters pointing to the same page: for example, if you visit test.php, Netsparker won't attack test.php?id=1 unless you visit test.php?id=1 as well.

Netsparker will only get links and parameters from the proxy. If you want Netsparker to crawl further, use Crawl & Wait mode and start the proxy from the toolbar.
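Because only proxied requests are tested, each parameter variant has to be visited explicitly. Here is a small sketch of building the variant URLs you would then open through the proxy; the base URL and id values are illustrative, not part of any Netsparker API.

```python
from urllib.parse import urlencode

# Illustrative target; substitute your own page under test.
base = "http://test12.netsparker.com:60001/test.php"

# Each of these URLs must be opened through the proxy before
# Netsparker will include the "id" parameter with those values in scope.
variant_urls = [base + "?" + urlencode({"id": i}) for i in (1, 2, 3)]
```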

When you've crawled all the links and parameters you want to test, click the Resume button in the toolbar and Netsparker will start the attack phase and report vulnerabilities.

If you want to exclude some of the crawled links from the scan, right-click them in the sitemap and choose Exclude From Attack.


Final Notes About Manual Crawl

  • Cookies originating from proxied requests override cookies originating from Netsparker. For example, if you log in to the application from your web browser, Netsparker will use that session and see pages exactly as you see them.
  • Manual Crawl will not follow new links and parameters automatically (use Crawl & Wait mode for this).
  • You can't use a proxy during the Attack phase; a proxy can only be used during the Crawl phase.
  • If you are testing localhost, ensure that localhost is not in your browser's "bypass proxy" list.
  • While you browse pages from your web browser, the crawler makes the very same requests and reports passively identifiable vulnerabilities such as "Programming Error Messages" and "Version Disclosure".
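To illustrate the "bypass proxy list" point above: many browsers and tools honor a comma-separated bypass list (in the style of the no_proxy convention), and if localhost appears in it, those requests skip the proxy entirely and never reach Netsparker. A hedged sketch of checking such a list; the helper name is invented for illustration:

```python
def is_bypassed(host: str, bypass_list: str) -> bool:
    """Return True if `host` appears in a comma-separated proxy-bypass list.

    `bypass_list` mimics a no_proxy-style value a browser or tool might use;
    this helper is illustrative, not part of any Netsparker API.
    """
    entries = {e.strip() for e in bypass_list.split(",") if e.strip()}
    return host in entries

# If this returns True, requests to localhost skip the proxy,
# so Netsparker would never see them.
print(is_bypassed("localhost", "localhost,127.0.0.1"))  # prints True
```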

