How Can You Improve Application Scanning Coverage?

For a skilled attacker, enumerating a database or making a resource unavailable is not particularly difficult once the loopholes present in an application are abused.

This results in reputational damage and degrades the brand value of the organization.

To avoid such a disastrous scenario, organizations should adopt a defense-in-depth strategy for their web applications, eradicating all vulnerabilities present in an application irrespective of their severity.

Now the question is: how do you achieve this?

One industry-standard approach is to detect vulnerabilities using commercial scanners in parallel with the manual efforts of skilled security analysts.

According to the 2015 Gartner Magic Quadrant for Application Security Testing (AST), Static AST (SAST), Dynamic AST (DAST), and Interactive AST (IAST) are the main types of security testing offered by top vendors.

HP is positioned as a leader across the capabilities mentioned; for DAST, HP offers WebInspect.

HP WebInspect is a commercial tool, and a license is required to scan a website.

Settings in WebInspect to improve the extent of scan coverage:

Before setting up a scan in HP WebInspect, manually assess the website (before using any tools) to ensure more accurate results.

Since the quality of the scan output depends on the input given to WebInspect, the following settings help achieve better scan coverage:

  • Start WebInspect and open the Scan Wizard. Select the type of scan; let's select Website Scan and then choose how the scan should run:
  1.  Standard scan: the standard way to start a scan.
  2.  List-Driven scan: lets you define a list of URLs to scan, so only those URLs are assessed. Specify all the URLs to be scanned in a text file, one URL per line.
  3.  Workflow-Driven scan: scans a part of the site rather than the entire site; the part to be scanned is specified using a workflow macro.
  4.  Manual scan: you specify the links to be scanned by manually browsing the site.
  • For better scan coverage you can select the Restrict To Folder option, which offers three choices:
  1.  Directory only: only the selected directory is assessed, not the directories inside it.
  2.  Directory and subdirectories: the selected directory and everything beneath it are assessed; WebInspect will not hit any folder that is higher in the directory tree.
  3.  Directory and parent directories: the selected directory and the directories above it are assessed; folders lower in the directory tree will not be assessed.
  • Crawl and audit are the essential activities WebInspect performs during a scan. If WebInspect crawls and audits each page at the same time, it is in "simultaneously" mode; if it crawls the entire site first and then audits it, it is in "sequential" mode. If your site content changes before the crawl is completed, it is best to use "simultaneously" mode. If you select "sequential" mode, you also need to choose the order in which the crawl and audit take place.
  • Areas that need to be excluded from the crawl should be specified so that the scan does not go out of the application's scope.
  • Under General settings, set the Maximum crawl-audit recursion depth. When a vulnerability is found on a page, WebInspect crawls that page and follows the link it exposes; if that link points to another resource, the recursion depth is 1, and if that resource links to yet another, the recursion depth is 2. The default value is 2 and the maximum is 1000.
  • Setting Depth First or Breadth First: if the web application follows an ordered sequence of requests (for example, in an online shopping cart the user must visit the shopping cart page before accessing the check-out page), select Depth First. If the web application does not follow an ordered sequence of requests, select Breadth First. A small sketch contrasting the two crawl orders follows this list.
  • Setting Limit maximum single URL hits: this limits the number of times WebInspect can hit a single page. It is important because, depending on the site architecture, WebInspect can sometimes enter an endless loop; placing a limit on single URL hits prevents this scenario (see the hit-limit sketch after this list).
  • The login macro also plays an important role in scan coverage, so login macros must be checked for errors before scanning. A login macro that is recorded incorrectly may fail to log in to the site, causing the scan to produce invalid results. Typical symptoms are abnormally short scan times, a lack of vulnerabilities, or a large number of errors in the error log. A sketch of a simple pre-scan login check follows this list.
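To make the depth-first/breadth-first distinction concrete, here is a minimal Python sketch of the two crawl orders. This is not WebInspect code: the SITE map, the crawl function, and the max_depth parameter are hypothetical, with max_depth only loosely analogous to the recursion-depth setting described above.

```python
from collections import deque

# A tiny in-memory "site map": each page lists the pages it links to.
# Purely illustrative; a real scanner discovers this by crawling responses.
SITE = {
    "/":         ["/catalog", "/about"],
    "/catalog":  ["/cart"],
    "/cart":     ["/checkout"],
    "/checkout": [],
    "/about":    [],
}

def crawl(start, depth_first=True, max_depth=2):
    """Return the order in which pages would be visited.

    depth_first=True  -> follow each chain of links to its end before backtracking
                         (suits ordered flows such as cart -> checkout).
    depth_first=False -> visit every page at one level before going deeper.
    max_depth caps how far from the start page the crawl may recurse.
    """
    visited, order = set(), []
    frontier = deque([(start, 0)])
    while frontier:
        page, depth = frontier.pop() if depth_first else frontier.popleft()
        if page in visited or depth > max_depth:
            continue
        visited.add(page)
        order.append(page)
        for link in SITE.get(page, []):
            frontier.append((link, depth + 1))
    return order

print("Depth first:  ", crawl("/", depth_first=True))
print("Breadth first:", crawl("/", depth_first=False))
```

The only difference between the two modes is whether the frontier is treated as a stack or a queue, which is why the choice matters for applications that expect requests in a fixed order.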
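The "maximum single URL hits" idea can also be sketched in a few lines. The HitLimiter class below is a hypothetical illustration of the underlying technique (a per-URL hit counter), not WebInspect's implementation.

```python
from collections import defaultdict

class HitLimiter:
    """Track how often each URL has been requested and refuse further hits
    once a per-URL limit is reached, so a crawler cannot loop forever on
    pages that keep generating links back to themselves."""

    def __init__(self, max_hits_per_url=5):
        self.max_hits = max_hits_per_url
        self.hits = defaultdict(int)

    def allow(self, url):
        if self.hits[url] >= self.max_hits:
            return False  # limit reached: skip this URL from now on
        self.hits[url] += 1
        return True

limiter = HitLimiter(max_hits_per_url=3)
# A self-referencing page (e.g. a calendar with a "next month" link)
# would otherwise be requested endlessly.
for _ in range(10):
    if not limiter.allow("http://example.com/calendar?month=next"):
        break
print(limiter.hits)  # the URL was hit only 3 times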
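Finally, a quick way to catch a broken login before wasting a scan is to replay the credentials yourself and look for a post-login indicator. The sketch below uses the Python requests library; the /login endpoint, form field names, and the "Logout" success indicator are assumptions for illustration, and this is an analogous pre-scan sanity check rather than how WebInspect validates its macros.

```python
import requests

def login_works(base_url, username, password):
    """Best-effort check that scripted credentials actually log in:
    submit the login form and look for content that only appears in an
    authenticated session (here, a 'Logout' link)."""
    session = requests.Session()
    resp = session.post(
        f"{base_url}/login",                       # hypothetical login endpoint
        data={"username": username, "password": password},
        timeout=10,
    )
    return resp.ok and "Logout" in resp.text       # hypothetical success indicator

if __name__ == "__main__":
    if not login_works("http://testsite.example", "scanuser", "scanpass"):
        print("Login check failed: fix the login macro or credentials before scanning.")
```

If this kind of check fails, expect the symptoms listed above (short scan times, few findings, many errors) until the macro is re-recorded.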

Authored by Geetanjali Das
