BeyondTrust

Security in Context: The BeyondTrust Blog

Welcome to Security in Context

Bringing you news and commentary on solutions and strategies for protecting critical IT infrastructure in the context of your business.

Webcast Recap: Surviving the Vulnerability Data Maelstrom with Dave Shackleford

Posted May 21, 2014    Chris Burd

If your vulnerability management (VM) processes are like most, you’re drowning in information and wondering whether your scanning and reporting tools are revealing true risks or sending every tiny issue your way for review.

Unfortunately, receiving alerts for low-level vulnerabilities and false positives is still standard practice. To free themselves from this onslaught of data, many IT teams are searching for a better way to ensure their vulnerability programs are delivering value amidst all the noise.

BeyondTrust recently joined Dave Shackleford, founder of Voodoo Security and SANS senior instructor, for a webcast looking at practical guidance for making your VM program more effective today. Here’s a summary of key takeaways from the presentation, plus a link to the webcast recording.

Finding context for what’s most important

One of the first challenges in effective vulnerability management is isolating what’s important from the reams of vulnerability data you collect. Scanners often detect low-level findings that aren’t important in the big picture, such as scan details, OS fingerprinting, SSL/TLS ciphers, self-signed certificates, and web server internal IP disclosure.

Sometimes – though not always – this is redundant information that you don’t need from your vulnerability scan, and you can reduce VM noise by suppressing it. Shackleford provides three guiding questions to help you decide which vulnerability findings to suppress and which to keep:

  1. Is the information important to report to stakeholders?
  2. Is the vulnerability or information useful for remediation?
  3. Will we act upon this, and where does it fall in terms of priorities?

The most important information to gather and analyze is the information that is most useful to your stakeholders. If you’re processing more data than your context needs, it could obstruct the analysis of data that is actually valuable – wasting time and money.
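The three guiding questions above can be sketched as a simple filter over scan output. This is a hypothetical illustration, not tied to any particular scanner’s API; the field names (`report`, `remediable`, `actionable`) and finding titles are assumptions made for the example:

```python
# Illustrative sketch: suppressing informational scan findings before review.
# Finding titles and flag names are hypothetical, not from any specific scanner.

INFORMATIONAL = {
    "OS fingerprinting",
    "SSL/TLS cipher enumeration",
    "Self-signed certificate",
    "Internal IP disclosure",
}

def suppress_noise(findings):
    """Keep only findings worth reporting, remediating, or acting upon."""
    kept = []
    for f in findings:
        if f["title"] in INFORMATIONAL:
            continue  # redundant detail that stakeholders don't need
        if not (f.get("report") or f.get("remediable") or f.get("actionable")):
            continue  # fails all three guiding questions
        kept.append(f)
    return kept

findings = [
    {"title": "OS fingerprinting", "report": False, "remediable": False, "actionable": False},
    {"title": "Outdated OpenSSL", "report": True, "remediable": True, "actionable": True},
]
print([f["title"] for f in suppress_noise(findings)])  # ['Outdated OpenSSL']
```

The point of the sketch is that suppression is a policy decision encoded up front, so every scan run produces the same de-noised view for analysts.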

Identifying participants in the VM process

Your organization may have five or more groups who need to be in the vulnerability data loop, such as system owners and system custodians, departmental support staff, developers and QA teams, security teams, and business unit management.

Once you have identified these key users, you need to figure out what kind of data they want and what kind of data is most valuable to them. For example, participants collecting system inventory data may find IP addresses and system DNS names most useful, while others may care more about process and service inventory data. Looking at the data in the context of the participants involved will help you decide your next steps.
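One way to picture this stakeholder-to-data mapping is as a projection of each raw finding down to the fields a given audience actually uses. The stakeholder names and field names below are assumptions for illustration, not part of any product:

```python
# Hypothetical mapping of stakeholders to the scan fields they find most useful.
STAKEHOLDER_FIELDS = {
    "system_owners": ["ip_address", "dns_name"],
    "security_team": ["cve_id", "severity"],
    "developers": ["affected_service", "remediation_steps"],
}

def view_for(stakeholder, finding):
    """Project a raw finding down to only the fields this audience needs."""
    return {k: finding[k] for k in STAKEHOLDER_FIELDS[stakeholder] if k in finding}

finding = {
    "ip_address": "10.0.0.5",
    "dns_name": "web01.example.com",
    "cve_id": "CVE-2014-0160",
    "severity": 7.5,
}
print(view_for("system_owners", finding))
# {'ip_address': '10.0.0.5', 'dns_name': 'web01.example.com'}
```

Each audience then receives only the slice of data it can act on, rather than the full firehose.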

Weaving VM into day-to-day operations

The final and most important challenge to overcome is weaving vulnerability management into your organization’s broader day-to-day operations. Shackleford notes that an important aspect of making VM approachable is to deliver regular, condensed vulnerability reports. These can include the top ten or twenty issues found, explicit technical details, remediation guidance and risks, and alternatives and options. The purpose of this report is to prioritize the data for your stakeholders and make the value of your VM software and process clear.
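A condensed “top issues” report like the one described above amounts to ranking findings by risk and truncating. The sketch below assumes each finding carries a numeric severity score (e.g. a CVSS base score); the titles and field names are illustrative:

```python
# Illustrative sketch of a condensed top-N vulnerability report.
# Assumes each finding has a numeric "severity" field (e.g. CVSS base score).

def top_n_report(findings, n=10):
    """Rank findings by severity, highest first, and keep the top n."""
    ranked = sorted(findings, key=lambda f: f["severity"], reverse=True)
    return ranked[:n]

findings = [
    {"title": "Default credentials on admin console", "severity": 9.0},
    {"title": "Self-signed certificate", "severity": 2.1},
    {"title": "Unpatched OpenSSL", "severity": 7.5},
]
for f in top_n_report(findings, n=2):
    print(f"{f['severity']:>4}  {f['title']}")
```

Keeping the report short and sorted is what lets stakeholders see at a glance where remediation effort should go first.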

Much of the process of making VM more effective and accessible is to provide context and prioritization for your key stakeholders. If you want to learn more about keeping your vulnerability management processes efficient and effective in the current threat environment, check out the complete webcast below:

Surviving the Vulnerability Data Maelstrom with Dave Shackleford from BeyondTrust on Vimeo.


Additional articles

How To Implement The Australian Signals Directorate’s Top 4 Strategies

Posted October 20, 2014    Morey Haber

The Australian Signals Directorate (ASD), also known as the Defence Signals Directorate, has developed a list of strategies to mitigate targeted cyber intrusions. The recommended strategies were developed through ASD’s extensive experience in operational cyber security, including responding to serious security intrusions and performing vulnerability assessments and penetration testing for Australian government agencies. These recommendations…


Exploiting MS14-059 because sometimes XSS is fun, sometimes…

Posted October 17, 2014    BeyondTrust Research Team

This October, Microsoft has provided a security update for System.Web.Mvc.dll which addresses a ‘Security Feature Bypass’. The vulnerability itself is in ASP.NET MVC technology and given its wide adoption we thought we would take a closer look. Referring to the bulletin we can glean a few useful pieces of information: “A cross-site scripting (XSS) vulnerability exists…


Four Best Practices for Passing Privileged Account Audits

Posted October 16, 2014    Chris Burd

Like most IT organizations, your team may periodically face the “dreaded” task of being audited. Your process for delegating privileged access to desktops, servers, and infrastructure devices is a massive target for the auditor’s microscope. An audit’s findings can have significant implications on technology and business strategy, so it’s critical to make sure you’re prepared…
