Webcast Recap: Surviving the Vulnerability Data Maelstrom with Dave Shackleford

Posted May 21, 2014    Chris Burd

If your vulnerability management (VM) processes are like most, you’re drowning in information and wondering whether your scanning and reporting tools are revealing true risks or sending every tiny issue your way for review.

Unfortunately, alerting on every low-level vulnerability and false positive is still treated as standard practice. To free themselves from this onslaught of data, many IT teams are searching for a better way to ensure their vulnerability programs deliver real value amidst all the noise.

BeyondTrust recently joined Dave Shackleford, founder of Voodoo Security and SANS senior instructor, for a webcast looking at practical guidance for making your VM program more effective today. Here’s a summary of key takeaways from the presentation, plus a link to the webcast recording.

Finding context for what’s most important

One of the first challenges in effective vulnerability management is isolating what’s important from the reams of vulnerability data you collect. Scanners often detect low-level findings that aren’t important in the big picture, such as scan details, OS fingerprinting results, SSL/TLS cipher listings, self-signed certificates, and web server internal IP disclosure.

Sometimes – though not always – this is redundant information that you don’t need from your vulnerability scan, and you can cut VM noise by suppressing it. Shackleford offers three guiding questions to help you decide which findings to suppress and which to keep:

  1. Is the information important to report to stakeholders?
  2. Is the vulnerability or information useful for remediation?
  3. Will we act upon this, and where does it fall in terms of priorities?

The most important information to gather and analyze is the information that is most useful to your stakeholders. If you’re processing more data than your context needs, it could obstruct the analysis of data that is actually valuable – wasting time and money.
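
For a concrete picture of what suppression can look like in practice, here is a minimal sketch in Python. The finding fields (category, severity, remediation) and the suppression list are hypothetical stand-ins rather than any particular scanner’s schema; the three checks loosely mirror Shackleford’s three questions.

```python
# Hypothetical noise categories to suppress before findings reach reporting.
SUPPRESSED_CATEGORIES = {
    "scan details",
    "os fingerprinting",
    "ssl/tls ciphers",
    "self-signed certificate",
    "internal ip disclosure",
}

def worth_keeping(finding: dict) -> bool:
    """Rough, automatable proxies for the three guiding questions."""
    if finding["category"].lower() in SUPPRESSED_CATEGORIES:
        return False  # Q1: not worth reporting to stakeholders
    if not finding.get("remediation"):
        return False  # Q2: nothing actionable to remediate
    return finding["severity"] >= 4  # Q3: keep only findings above our action threshold

findings = [
    {"category": "SSL/TLS ciphers", "severity": 2,
     "remediation": "Disable weak ciphers"},
    {"category": "Remote code execution", "severity": 9,
     "remediation": "Apply vendor patch"},
]
print([f["category"] for f in findings if worth_keeping(f)])
# -> ['Remote code execution']
```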

Identifying participants in the VM process

Your organization may have five or more groups of participants who need to be in the vulnerability data loop, such as system owners and system custodians, departmental support staff, developers and QA teams, security teams, and business unit management.

Once you have identified these key users, figure out what kind of data they want and which data is most valuable to them. For participants collecting system inventory data, for example, IP addresses and system DNS names might be most useful; for others, process and service inventory data might matter more. Looking at the data in the context of the participants involved will help you decide your next steps.
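
One way to make this concrete is to map each role to the finding fields it actually consumes. The sketch below is illustrative only: the role names and field names are assumptions for the example, not a prescribed schema (the sample finding uses the real Heartbleed CVE simply as familiar 2014-era data).

```python
# Hypothetical mapping from stakeholder roles to the fields each one needs.
STAKEHOLDER_VIEWS = {
    "system_owner":  ["ip_address", "dns_name", "severity"],
    "developer":     ["service", "process", "remediation"],
    "security_team": ["cve_id", "severity", "exploit_available"],
    "business_unit": ["asset_group", "risk_score"],
}

def view_for(role: str, finding: dict) -> dict:
    """Project a full finding down to the fields a given role cares about."""
    return {field: finding.get(field) for field in STAKEHOLDER_VIEWS.get(role, [])}

finding = {
    "ip_address": "10.0.0.5", "dns_name": "db01.example.com",
    "cve_id": "CVE-2014-0160", "severity": 9, "exploit_available": True,
    "service": "openssl", "remediation": "Upgrade to OpenSSL 1.0.1g",
}
print(view_for("security_team", finding))
# -> {'cve_id': 'CVE-2014-0160', 'severity': 9, 'exploit_available': True}
```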

Weaving VM into day-to-day operations

The final and most important challenge is weaving VM into your organization’s broader day-to-day operations. Shackleford notes that a key part of making VM approachable is delivering regular, condensed vulnerability reports. These can include the top ten or twenty issues found, explicit technical details, remediation guidance and risks, and alternatives and options. The purpose of such a report is to prioritize the data for your stakeholders and make the value of your VM software and process clear.
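
A condensed top-N report of this kind is straightforward to prototype. The sketch below assumes hypothetical finding fields and ranks purely by severity; a real report would also fold in exploitability, remediation risk, and business context, as discussed above.

```python
def condensed_report(findings, top_n=10):
    """Return the top_n highest-severity findings for stakeholder review."""
    ranked = sorted(findings, key=lambda f: f["severity"], reverse=True)
    return ranked[:top_n]

findings = [
    {"title": "RCE in web framework", "severity": 9,
     "remediation": "Patch to the latest release"},
    {"title": "Weak SSL cipher", "severity": 2,
     "remediation": "Harden the TLS configuration"},
    {"title": "SQL injection", "severity": 8,
     "remediation": "Parameterize queries"},
]
for f in condensed_report(findings, top_n=2):
    print(f"{f['severity']:>2}  {f['title']}: {f['remediation']}")
# ->  9  RCE in web framework: Patch to the latest release
# ->  8  SQL injection: Parameterize queries
```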

Much of making VM more effective and accessible comes down to providing context and prioritization for your key stakeholders. If you want to learn more about keeping your vulnerability management processes efficient and effective in the current threat environment, check out the complete webcast below:

Surviving the Vulnerability Data Maelstrom with Dave Shackleford from BeyondTrust on Vimeo.
