
Manual Crawl (Proxy Mode)

Ferruh Mavituna
posted this on December 07, 2010 13:51

Manual crawl is generally useful if you need to scan only a part of the website. 

  1. Start a new scan with the settings you want to use
  2. Choose Manual Crawl (Proxy Mode) from the Start Scan button

ProxyMde.png

The scan will start; however, Netsparker will pause after requesting the starting URL and will not crawl other links. You'll notice in the toolbar that the proxy has already started.

ProxyMode2.png

Now, open your browser and configure it to use Netsparker's proxy.

Sample Proxy Settings in Firefox

Firefox.png

Sample Proxy Settings in Internet Explorer

InternetExplorer.png
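If you prefer to drive the crawl from a script rather than a browser, any HTTP client that supports a proxy will work. A minimal Python sketch follows; the proxy address 127.0.0.1:8080 is an assumption, so use whatever host and port Netsparker's toolbar actually shows:

```python
import urllib.request

# Assumed proxy address -- substitute the host:port shown in Netsparker's toolbar.
PROXY = "http://127.0.0.1:8080"

# Route both HTTP and HTTPS traffic through Netsparker's internal proxy.
proxy_handler = urllib.request.ProxyHandler({"http": PROXY, "https": PROXY})
opener = urllib.request.build_opener(proxy_handler)

# Every page fetched through this opener will appear in Netsparker's sitemap:
# opener.open("http://test12.netsparker.com:60001/XSS-Basic/")
```

The request itself is left commented out because it only succeeds while the proxy is running; the point is that anything routed through the proxy is what Netsparker crawls.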

Now all you need to do is browse the target application in your web browser. As soon as you visit new URLs, you'll see them appear in the Sitemap in Netsparker.

Sitemap.png

For example, after you visit http://test12.netsparker.com:60001/XSS-Basic/ in the browser, Netsparker will capture that URL and show it in the sitemap.

This is the new sitemap view:

Sitemap2.png

Important Note

In Manual Crawl (Proxy Mode), Netsparker will not discover and crawl new links on its own. This includes parameters pointing to the same page. For example, if you visit test.php, Netsparker won't attack "test.php?id=1" unless you visit "test.php?id=1" as well.

Netsparker only receives links and parameters through the proxy. If you want Netsparker to crawl further on its own, use Crawl & Wait mode and start the proxy from the toolbar.
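Because only traffic that passes through the proxy is attacked, each parameter variant must be requested explicitly. A hedged sketch of doing that from a script (the page URL, the id values, and the proxy address are all hypothetical examples):

```python
import urllib.request

PROXY = "http://127.0.0.1:8080"  # assumed address -- check Netsparker's toolbar
opener = urllib.request.build_opener(
    urllib.request.ProxyHandler({"http": PROXY, "https": PROXY})
)

# Hypothetical page and parameter values: visiting test.php alone is not
# enough -- each parameterized URL has to travel through the proxy itself.
base = "http://example.local/test.php"
urls = [f"{base}?id={i}" for i in (1, 2, 3)]

for url in urls:
    # opener.open(url)  # uncomment while the proxy is running
    print(url)
```

Each URL printed (and, with the proxy running, fetched) would then show up in the sitemap as a distinct entry for Netsparker to attack.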

Once you’ve crawled all the links and parameters you want to test, click the Resume button in the toolbar and Netsparker will start the attack phase and report vulnerabilities.

If you want to exclude some of the crawled links from the scan, you can right-click them in the sitemap and choose Exclude From Attack.

Exclude.png

Final Notes About Manual Crawl

  • Cookies originating from proxy requests override cookies originating from Netsparker. For example, if you log in to the application from your web browser, Netsparker will use that session and see pages as you see them.
  • Manual crawl will not crawl new links and parameters automatically (use Crawl & Wait for this).
  • You can't use a proxy during the Attack phase; a proxy can only be used during the Crawl phase.
  • If you are testing "localhost", ensure that localhost is not in your browser's "bypass proxy list".
  • While you browse pages from your web browser, the crawler will make the very same requests and will report passively identifiable vulnerabilities such as "Programming Error Messages" and "Version Disclosure".
 

Comments

Mattias Baecklund

Does it only listen on localhost, or can you set it to listen on a real IP address?

December 08, 2010 10:24
Ferruh Mavituna
Netsparker Ltd

Mattias, 1.7.0.0 only binds to localhost; however, if you think this would be a nice feature to support, we might add it to the next version.

December 08, 2010 10:34
Mattias Baecklund

It would help me pitch Netsparker as a tool for my company if it could listen on real IPs. Then I could run Netsparker on one of my machines and have the BSAs use it as a proxy when they test new functionality.

December 08, 2010 12:16
Ferruh Mavituna
Netsparker Ltd

Mattias, sounds like a good idea. I've added this to our to-do list and hopefully the next version will include this feature. I'll let you know when it's ready.

December 08, 2010 12:39
Rajesh TV

thanks

 

July 09, 2011 13:09
Ferruh Mavituna
Netsparker Ltd

Hi Mattias, FYI: we added support for listening on all IP addresses in v2.1.0.39. You can change it from the Advanced Settings by setting InternalProxyAllowRemote to True.

Thanks,

February 02, 2012 11:14