BeyondTrust

Security in Context: The BeyondTrust Blog

Welcome to Security in Context

Bringing you news and commentary on solutions and strategies for protecting critical IT infrastructure in the context of your business.

Webcast Recap: Surviving the Vulnerability Data Maelstrom with Dave Shackleford

Posted May 21, 2014    Chris Burd

If your vulnerability management (VM) processes are like most, you’re drowning in information and wondering whether your scanning and reporting tools are revealing true risks or sending every tiny issue your way for review.

Unfortunately, alerts for low-level vulnerabilities and false positives are still par for the course. To free themselves from this onslaught of data, many IT teams are searching for a better way to make sure their vulnerability programs surface real risk amid all the noise.

BeyondTrust recently joined Dave Shackleford, founder of Voodoo Security and SANS senior instructor, for a webcast looking at practical guidance for making your VM program more effective today. Here’s a summary of key takeaways from the presentation, plus a link to the webcast recording.

Finding context for what’s most important

One of the first challenges in effective vulnerability management is isolating what’s important from the reams of vulnerability data you collect. Scanners often detect low-level findings that aren’t important in the big picture, such as scan details, OS fingerprinting results, SSL/TLS cipher listings, self-signed certificates, and web server internal IP disclosure.

Sometimes – though not always – this is redundant information that you don’t need from your vulnerability scan, and suppressing it is an easy way to cut VM noise. Shackleford offers three guiding questions to help you decide which findings to suppress and which to keep:

  1. Is the information important to report to stakeholders?
  2. Is the vulnerability or information useful for remediation?
  3. Will we act upon this, and where does it fall in terms of priorities?
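The three questions above amount to a simple triage filter: suppress a finding only when the answer to all three is “no.” Here is a minimal sketch in Python; the findings, field names, and flags are hypothetical illustrations, not output from any particular scanner:

```python
# Hypothetical scanner output: each finding carries illustrative yes/no
# answers to Shackleford's three guiding questions.
findings = [
    {"name": "Web server internal IP disclosure",
     "report_to_stakeholders": False, "aids_remediation": False, "will_act": False},
    {"name": "Self-signed certificate on admin portal",
     "report_to_stakeholders": False, "aids_remediation": True, "will_act": False},
    {"name": "OpenSSL heap overflow on payment server",
     "report_to_stakeholders": True, "aids_remediation": True, "will_act": True},
]

def keep(finding):
    """Keep a finding if any of the three questions is answered 'yes';
    suppress it only when all three answers are 'no'."""
    return (finding["report_to_stakeholders"]
            or finding["aids_remediation"]
            or finding["will_act"])

kept = [f for f in findings if keep(f)]
suppressed = [f for f in findings if not keep(f)]
```

In this sketch only the internal IP disclosure is suppressed: it is not worth reporting, does not aid remediation, and no one will act on it.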

The most important information to gather and analyze is the information that is most useful to your stakeholders. If you’re processing more data than your context requires, it can obstruct analysis of the data that is actually valuable – wasting time and money.

Identifying participants in the VM process

Your organization may have five or more groups of participants who need to be in the vulnerability data loop, such as system owners and custodians, departmental support staff, developers and QA teams, security teams, and business unit management.

Once you have identified these key users, you need to figure out what kind of data they want and what kind of data is most valuable to them. For example, for participants collecting system inventory data, IP addresses and system DNS names might be most useful; for others, process and service inventory data might matter more. Looking at the data in the context of the participants involved will help you decide your next steps.
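One way to make this concrete is to map each stakeholder role to the fields it cares about and project full findings down to each audience. The roles and field names below are illustrative examples, not an exhaustive or product-specific list:

```python
# Illustrative mapping of VM participants to the data each finds most useful.
stakeholder_views = {
    "system_owners": ["ip_address", "dns_name", "asset_criticality"],
    "developers_qa": ["affected_service", "remediation_steps"],
    "security_team": ["cvss_score", "exploit_available", "remediation_steps"],
    "business_mgmt": ["asset_criticality", "risk_summary"],
}

def view_for(role, finding):
    """Project a full finding down to only the fields a given role needs."""
    fields = stakeholder_views.get(role, [])
    return {k: v for k, v in finding.items() if k in fields}

# A hypothetical finding with all fields populated.
finding = {
    "ip_address": "10.0.0.12",
    "dns_name": "db01.example.com",
    "affected_service": "postgresql",
    "cvss_score": 9.8,
    "exploit_available": True,
    "remediation_steps": "Apply vendor patch",
    "asset_criticality": "high",
    "risk_summary": "Remote code execution on production database",
}
```

System owners doing inventory work then see only addresses, names, and criticality, while business management sees only criticality and a plain-language risk summary.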

Weaving VM into day-to-day operations

The final and most important challenge to overcome is weaving vulnerability management into your organization’s broader day-to-day operations. Shackleford notes that an important aspect of making VM approachable is delivering regular, condensed vulnerability reports. These can include the top ten or twenty issues found, explicit technical details, remediation guidance and risks, and alternatives and options. The purpose of this report is to prioritize the data for your stakeholders and make the value of your VM software and process clear.

Much of the process of making VM more effective and accessible is to provide context and prioritization for your key stakeholders. If you want to learn more about keeping your vulnerability management processes efficient and effective in the current threat environment, check out the complete webcast below:

Surviving the Vulnerability Data Maelstrom with Dave Shackleford from BeyondTrust on Vimeo.



Additional articles

PowerBroker for Unix & Linux helps prevent Shellshock

Posted September 25, 2014    Paul Harper

Like many other people who tinker with UNIX and Linux on a regular basis, I have always favored BASH as my shell of choice.  Dating back to the early days moving from Windows to a non-Windows platform, mapping the keys correctly to allow easy navigation and control helped ensure an explosion of use for the shell. Unfortunately,…

Bash “Shellshock” Vulnerability – Retina Updates

Posted September 24, 2014    BeyondTrust Research Team

A major vulnerability was recently discovered in bash that allows arbitrary command execution via specially crafted environment variables. This is possible because bash supports the assignment of shell functions to shell variables. When bash parses environment shell functions, it continues parsing even after the closing brace of the function definition. If…


7 Reasons Customers Switch to Password Safe for Privileged Password Management

Posted September 24, 2014    Chris Burd

It’s clear that privileged password management tools are essential for keeping mission-critical data, servers and assets safe and secure. However, as I discussed in my previous post, there are several pitfalls to look out for when deploying a privileged password management solution. At this point, you may be wondering how BeyondTrust stacks up. With that,…
