BeyondTrust

Security in Context: The BeyondTrust Blog

Welcome to Security in Context

Bringing you news and commentary on solutions and strategies for protecting critical IT infrastructure in the context of your business.

Webcast Recap: Surviving the Vulnerability Data Maelstrom with Dave Shackleford

Posted May 21, 2014    Chris Burd

If your vulnerability management (VM) processes are like most, you’re drowning in information and wondering whether your scanning and reporting tools are revealing true risks or sending every tiny issue your way for review.

Unfortunately, alerting on every low-level vulnerability and false positive is still common practice. To free themselves from this onslaught of data, many IT teams are searching for a better way to ensure that their vulnerability programs deliver real value amid all the noise.

BeyondTrust recently joined Dave Shackleford, founder of Voodoo Security and SANS senior instructor, for a webcast looking at practical guidance for making your VM program more effective today. Here’s a summary of key takeaways from the presentation, plus a link to the webcast recording.

Finding context for what’s most important

One of the first challenges in effective vulnerability management is isolating what’s important from the reams of vulnerability data you collect. Scanners often detect low-level findings that don’t matter in the big picture, such as scan details, OS fingerprinting, SSL/TLS cipher enumeration, self-signed certificates, and web server internal IP disclosure.

Often – though not always – this is redundant information that you don’t need from your vulnerability scan, and you can reduce VM noise by suppressing it. Shackleford offers three guiding questions to help you decide which findings to suppress and which to keep:

  1. Is the information important to report to stakeholders?
  2. Is the vulnerability or information useful for remediation?
  3. Will we act upon this, and where does it fall in terms of priorities?

The most important information to gather and analyze is the information that is most useful to your stakeholders. If you’re processing more data than your context needs, it could obstruct the analysis of data that is actually valuable – wasting time and money.
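The suppression idea above can be sketched as a simple filter. This is an illustrative sketch only – the finding names, fields, and 0–10 severity scale are assumptions, not the output format of any particular scanner:

```python
# Informational checks that often add noise rather than actionable risk.
# These names are hypothetical examples, not a real scanner's check IDs.
SUPPRESSED = {
    "os_fingerprint",
    "ssl_cipher_enumeration",
    "self_signed_certificate",
    "internal_ip_disclosure",
}

def keep_finding(finding):
    """Apply the three guiding questions as a simple filter:
    keep findings that are report-worthy, remediable, and actionable."""
    return (
        finding["name"] not in SUPPRESSED            # worth reporting to stakeholders?
        and finding.get("remediable", True)          # useful for remediation?
        and finding.get("severity", 0) >= 3          # will we act on it? (assumed 0-10 scale)
    )

findings = [
    {"name": "os_fingerprint", "severity": 1},
    {"name": "sql_injection", "severity": 9, "remediable": True},
    {"name": "self_signed_certificate", "severity": 2},
]

actionable = [f for f in findings if keep_finding(f)]
```

In this toy run, only the high-severity injection finding survives the filter; the informational checks are suppressed before they reach stakeholders.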

Identifying participants in the VM process

Your organization may have five or more partners who need to be in on the vulnerability data loop, such as system owners and system custodians, departmental support staff, developers and QA teams, security teams, and business unit management teams.

Once you have identified these key users, you need to figure out what kind of data they want and which data is most valuable to them. For example, for participants collecting system inventory data, IP addresses and system DNS names might be the most useful. For others, process and service inventory data might matter more. Looking at the data in the context of the participants involved in the process will help you decide your next steps.
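One way to make this concrete is a mapping from each participant to the fields they care about, used to project a full finding down to a per-stakeholder view. The stakeholder names and field names below are hypothetical, chosen only to illustrate the idea:

```python
# Illustrative mapping of VM participants to the data most useful to them.
STAKEHOLDER_VIEWS = {
    "system_owners": ["ip_address", "dns_name"],
    "support_staff": ["process_inventory", "service_inventory"],
    "developers": ["affected_component", "remediation_steps"],
    "business_management": ["risk_summary", "trend"],
}

def view_for(stakeholder, finding):
    """Project a full finding down to the fields a stakeholder cares about."""
    fields = STAKEHOLDER_VIEWS.get(stakeholder, [])
    return {k: finding[k] for k in fields if k in finding}

finding = {
    "ip_address": "10.0.0.5",
    "dns_name": "web01.example.com",
    "service_inventory": ["httpd", "sshd"],
    "risk_summary": "High: remote code execution",
}

owner_view = view_for("system_owners", finding)
```

The same finding yields a different slice for each audience, so nobody has to wade through data that isn’t relevant to their role.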

Weaving VM into day-to-day operations

The final and most important challenge to overcome is weaving vulnerability management into your organization’s broader day-to-day operations. Shackleford notes that an important aspect of making VM approachable is to deliver regular, condensed vulnerability reports. These can include the top ten or twenty issues found, explicit technical details, remediation guidance and risks, and alternatives and options. The purpose of this report is to prioritize the data for your stakeholders and make the value of your VM software and process clear.
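A condensed “top N” report like the one described above amounts to ranking findings by severity and keeping only the entries a stakeholder needs. The data model and field names here are assumptions for the sketch, not a real tool’s schema:

```python
# Hypothetical sketch of a condensed "top N" vulnerability report.
def top_issues(findings, n=10):
    """Sort findings by severity (highest first) and keep the top n,
    attaching a remediation note to each entry."""
    ranked = sorted(findings, key=lambda f: f["severity"], reverse=True)
    return [
        {
            "title": f["title"],
            "severity": f["severity"],
            "remediation": f.get("remediation", "See vendor advisory"),
        }
        for f in ranked[:n]
    ]

findings = [
    {"title": "Outdated OpenSSL", "severity": 7,
     "remediation": "Upgrade to patched release"},
    {"title": "Default credentials on admin console", "severity": 9},
    {"title": "Verbose server banner", "severity": 2},
]

report = top_issues(findings, n=2)
```

Keeping the report to a fixed length forces a prioritization decision every cycle, which is exactly what makes it digestible for stakeholders.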

Much of the process of making VM more effective and accessible is to provide context and prioritization for your key stakeholders. If you want to learn more about keeping your vulnerability management processes efficient and effective in the current threat environment, check out the complete webcast below:

Surviving the Vulnerability Data Maelstrom with Dave Shackleford from BeyondTrust on Vimeo.

