Modern vulnerability management: creating order from chaos

Stephen Roostan, VP EMEA at Kenna Security, considers the importance of vulnerability management and how it is an evolutionary process founded on a definition of acceptable risk.

Vulnerability management is the essential set of activities that – done well – will proactively identify, track, prioritise and remedy security weaknesses in IT infrastructure and software. Its core objective is more than protection at any cost: it is about improving efficiency and deploying resources effectively to optimise costs. It also enables the measurement of cyber risk reduction, as well as agreement on, and understanding of, an acceptable level of risk tolerance. All in all, this helps organisations prevent malware attacks, data theft and other consequences of a cyber breach, such as customer experience failures and reputational damage.

Enterprises on the road to modern, risk-based vulnerability management know which vulnerabilities pose a real threat and how to prioritise the fixes that will achieve their defined level of risk tolerance. Creating a common language for security and IT teams to assess and address risk drives data-driven decision-making. Ultimately, it is about lowering risk with continually improving economics.

At an enterprise IT level, vulnerability management requires discipline and attention to detail, given the need to manage hundreds, if not thousands, of laptops, servers and other connected assets such as IoT devices, applications and websites. While it is a central component of any holistic cybersecurity strategy, organisations vary widely in their approach and effectiveness, with too many relying on outdated models to prioritise vulnerabilities.

Part of the challenge is that most large technology environments contain millions of vulnerabilities, but trying to patch all of them isn't practical; in fact, most organisations only have the resources to patch 10%. Security and IT teams frequently disagree over which to tackle first, and companies often try to patch as much as possible in the mistaken belief that every vulnerability brings with it the potential of a disastrous breach.
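As a minimal sketch of what selection under that capacity constraint might look like (the findings and risk scores below are hypothetical, and the logic is illustrative rather than any vendor's actual model):

    # Illustrative sketch: pick the top 10% of findings by risk score.
    # Data and scores are invented, not any vendor's real model.
    vulns = [
        {"cve": "CVE-2021-0001", "risk": 92},
        {"cve": "CVE-2021-0002", "risk": 15},
        {"cve": "CVE-2021-0003", "risk": 67},
    ]  # in practice: millions of findings from scanners

    def remediation_queue(findings, capacity=0.10):
        """Return the highest-risk slice that fits the patching capacity."""
        ranked = sorted(findings, key=lambda v: v["risk"], reverse=True)
        cutoff = max(1, int(len(ranked) * capacity))
        return ranked[:cutoff]

    for v in remediation_queue(vulns):
        print(v["cve"], v["risk"])  # with three findings, prints only the riskiest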

The result can be a never-ending treadmill where IT teams are, effectively, running to stand still, and the most important and urgent vulnerabilities are not addressed in a timely manner. In contrast, modern vulnerability management is a systematic, data-driven approach offering full visibility into a technology stack to identify key vulnerabilities. Getting it right means organisations can more effectively set and meet internal SLA obligations, respond quickly to threats and have meaningful discussions about organisational risk tolerance.

Vulnerability management: a process of evolution

Developing a modern vulnerability management programme is an evolutionary process with a number of key components. The application of data science makes it possible to calculate which vulnerabilities are the most likely to be exploited. By combining data from threat intelligence platforms and vulnerability databases, organisations can leverage decades of knowledge and real-time data. These signals can include anything from which operating systems a vulnerability is found on to whether security researchers are experimenting with it in real time.
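A rough sketch of how such signals might be blended into an exploit-likelihood estimate (the field names and weights are assumptions for illustration, not Kenna Security's or any other vendor's model):

    # Hypothetical sketch of blending threat-intelligence signals with
    # vulnerability data to estimate exploit likelihood. Field names
    # and weights are illustrative assumptions, not a real scoring model.
    def exploit_likelihood(vuln):
        score = 0.0
        if vuln.get("exploit_code_public"):   # proof-of-concept code published
            score += 0.4
        if vuln.get("active_exploitation"):   # exploitation observed in the wild
            score += 0.4
        if vuln.get("remote_exploitable"):    # reachable over the network
            score += 0.2
        return min(score, 1.0)

    finding = {
        "cve": "CVE-2021-0001",
        "exploit_code_public": True,
        "active_exploitation": False,
        "remote_exploitable": True,
    }
    print(exploit_likelihood(finding))  # roughly 0.6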

The first practical step is to remediate the riskiest vulnerabilities. The desired outcome is fewer incidents where security teams fail to patch vulnerabilities with a high likelihood of exploitation, and fewer instances in which a vulnerability with a low likelihood of being exploited is patched anyway. This is a major step for an organisation to lower its overall risk profile, and from there it is well placed to use vulnerability management success to drive operational change.

To get there, security analysts must have full visibility of both risks and vulnerabilities to effectively evaluate the next best remediation choice. For instance, two parts of the same enterprise network, one public and one private, could each contain vulnerabilities with different risk scores – one moderate, one critical. Despite the variation, security teams need to prioritise patching the one that poses the highest risk to the organisation first, and this depends on where the vulnerability sits in their own environment. For example, it is possible for the organisation to lower its overall risk by patching the moderate vulnerability on the public-facing segment of the network first, because it poses a higher risk than the critical one sitting on the private network.
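A minimal worked example of that reasoning, with invented severities and exposure weights:

    # Hypothetical illustration: network context can outweigh raw severity.
    # Severities and exposure weights are invented for this example.
    EXPOSURE = {"public": 2.0, "private": 1.0}

    def contextual_risk(severity, segment):
        return severity * EXPOSURE[segment]

    moderate_public = contextual_risk(5.5, "public")    # 11.0
    critical_private = contextual_risk(9.0, "private")  # 9.0
    print(moderate_public > critical_private)  # True: patch the moderate one first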

Defining acceptable risk

Despite concerted and often successful efforts to manage vulnerabilities, organisations often struggle to set goals and boundaries for acceptable levels of risk. Aiming for zero simply isn't practical, and the law of diminishing returns will also affect long-term results: continuing to devote ever more resources to vulnerability management yields smaller and smaller benefits.

So, defining acceptable levels of risk requires wider industry context. For example, in January 2020 alone, the National Vulnerability Database added over 1,800 new entries. Even though just a fraction are high-risk, only about one-third of those are patched within a month. By definition, that means everyone else reaches the end of the month with more high-risk vulnerabilities than before, or is at best treading water.
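The back-of-envelope arithmetic, with an assumed high-risk fraction, looks like this:

    # Back-of-envelope backlog arithmetic with assumed figures.
    new_cves = 1800            # NVD entries added in January 2020
    high_risk_fraction = 0.2   # assumed for illustration
    patched_fraction = 1 / 3   # roughly a third patched within a month

    high_risk = new_cves * high_risk_fraction           # 360 high-risk entries
    carried_over = high_risk * (1 - patched_fraction)   # left unpatched
    print(round(carried_over))  # 240 carried into next month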

This data-driven insight can provide the benchmarks required to optimise internal operations and agree levels of acceptable risk. Armed with this approach, businesses can set internal SLAs that compare their vulnerability management to industry norms, or to the speed at which new risks emerge. The result is in complete contrast to the all-or-nothing approach still seen in many organisations, as it sets realistic expectations that can be communicated effectively to all relevant stakeholders.

When vulnerability management programmes are given the investment they deserve, businesses can build confidence in a process that is fit for purpose, supported by all its stakeholders and, most importantly, highly effective in dealing with the shifting sands of cybersecurity risk.


Stephen Roostan is VP EMEA at Kenna Security.
