Need of the Hour: Continuous Vulnerability Assessment & Intelligent Data Aggregation

Technology and software are continuously evolving, driven by the need to reach a larger user base, deliver a better user experience, and collect and process information faster. In the race to do more business, many enterprises are exposing their data to the outside world. Though awareness of the need to safeguard data both at rest and in motion has grown, there is still a gap in knowing where and what security measures should be applied.

Until now, security has been seen as an assessment activity only. This focus on assessment has created a compliance requirement to perform assessments before anything goes live, which in turn drives the need for security tools (scanners) and knowledgeable security analysts (consultants). If a website and its related infrastructure systems are going live, all must be tested for vulnerabilities. The process is typically carried out by a security expert who spends roughly a week scanning, prodding at ports, modifying cookies, URLs, and hidden form fields, and finally delivers a PDF report documenting the findings.

As high-profile information leaks, retail data breaches, and other cybercrimes dominate headlines, enterprises and government bodies are racing to protect their IT systems from internal and external threats. That effort has to be much more than just vulnerability assessment and reporting. We at EVM have been running a Vulnerability Management Program (VMP) for multiple customers for many years now.
Every enterprise we work with has information security standards prescribing, at a bare minimum, the activities to be performed under vulnerability management: asset classification, scan frequency for internal and external systems, vulnerability classification by severity, remediation timelines based on criticality, and so on. Complying with these standards, we have been running scans at regular intervals and delivering timely reports. Standardized routine reports, while helpful, are insufficient. To enable enterprises to make informed security decisions and prioritize remediation effort, the following scorecard reports are prepared and shared from time to time:

  • Common Vulnerabilities: Lists the vulnerabilities most commonly detected in the environment
  • High-Severity Vulnerabilities: Lists the top 10 high-severity vulnerabilities found across hosts
  • Vulnerability Distribution by Platform: Breaks down vulnerabilities by platform / technology
  • Percentage of Systems with Severe Vulnerabilities: Percentage of hosts with high-severity vulnerabilities
  • Mean Time to Mitigate Vulnerabilities: How quickly vulnerabilities are patched
  • Percentage of Systems Scanned: Percentage of hosts scanned within a given timeframe (1 day, 2-7 days, 8-15 days, 16-30 days, 31-60 days, 61-90 days, and 90+ days)
  • Percentage of Bad Fixes: How many times vulnerabilities re-open after being reported fixed
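To make the scorecards above concrete, here is a minimal sketch of how a few of these metrics could be computed from per-finding scan records. The `Finding` record shape, field names, and sample CVE identifiers are illustrative assumptions, not tied to any particular scanner's export format.

```python
from collections import Counter
from dataclasses import dataclass
from datetime import date
from typing import Optional

# Hypothetical per-finding record; field names are illustrative only.
@dataclass
class Finding:
    host: str
    vuln_id: str
    severity: str              # e.g. "low" | "medium" | "high"
    first_seen: date
    fixed_on: Optional[date]   # None while the vulnerability is still open

def common_vulnerabilities(findings, top=10):
    """Scorecard: vulnerabilities most commonly detected in the environment."""
    return Counter(f.vuln_id for f in findings).most_common(top)

def pct_hosts_with_severe(findings):
    """Scorecard: percentage of hosts with at least one high-severity finding."""
    hosts = {f.host for f in findings}
    severe = {f.host for f in findings if f.severity == "high"}
    return 100.0 * len(severe) / len(hosts) if hosts else 0.0

def mean_time_to_mitigate(findings):
    """Scorecard: average days from detection to fix, over remediated findings."""
    closed = [f for f in findings if f.fixed_on is not None]
    if not closed:
        return None
    return sum((f.fixed_on - f.first_seen).days for f in closed) / len(closed)
```

Each function maps directly to one scorecard line item; the remaining metrics (distribution by platform, scan coverage by timeframe) follow the same grouping-and-counting pattern over the same records.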

Producing the statistics for these scorecards requires pulling results from past scans. Scanning frequently and holding on to the scan data poses a new problem: data volume. The sheer volume of data generated by a single scan, compiled over time, is staggering. Through strong partnerships and alliances with vendors, we at EVM intelligently aggregate security-relevant data (results from multiple scans) and produce a set of actionable data that enables enterprise security decision makers to prioritize where and what security measures to apply.
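One way such aggregation can work is to collapse a series of dated scan result sets into a single record per (host, vulnerability) pair, tracking when each finding first appeared, whether it is currently open, and how often it re-opened after being fixed (the "bad fixes" metric). The sketch below is a simplified assumption about the aggregation logic, not a description of any vendor's actual pipeline; the `(host, vuln_id)` pair shape is illustrative.

```python
from datetime import date

def aggregate_scans(scans):
    """
    Merge multiple scans into one record per (host, vuln_id) pair.
    `scans` is a list of (scan_date, findings) tuples, where `findings`
    is a set of (host, vuln_id) pairs -- an illustrative input shape.
    """
    state = {}  # (host, vuln_id) -> aggregated record
    for scan_date, findings in sorted(scans):
        for key in findings:
            rec = state.get(key)
            if rec is None:
                # First sighting of this host/vulnerability pair.
                state[key] = {"first_seen": scan_date, "last_seen": scan_date,
                              "open": True, "reopens": 0}
            else:
                if not rec["open"]:
                    # Seen again after being marked fixed: count as a bad fix.
                    rec["reopens"] += 1
                rec["open"] = True
                rec["last_seen"] = scan_date
        # Anything previously tracked but absent from this scan is
        # treated as remediated.
        for key, rec in state.items():
            if key not in findings and rec["open"]:
                rec["open"] = False
    return state
```

With the per-pair records in hand, the scorecard statistics reduce to simple queries over `state`, and only the aggregate (rather than every raw scan result) needs to be retained long-term, which directly addresses the data-volume problem described above.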
