How can we improve security if security awareness training programs are not yielding results?

In the security industry, much emphasis is placed on humans as the weakest link, and on why organizations should invest in security awareness. The argument usually assumes a direct correlation between increased spending and increased security, which is not guaranteed. This sentiment leaves me conflicted.

On one hand, I agree that we need to invest in our people. Much needs to be done to educate employees on how to better protect themselves and their organization from unintentionally giving criminals access to restricted information. (Good processes for building effective programs do exist, and are extensively covered elsewhere.)

On the other hand, I feel it is my duty to point out that the security awareness industry has never been able to provide independent measures of how effective awareness is at reducing risk [1]. Furthermore, the metrics used to measure security awareness may not give us the meaningful information we need to make the right decisions.


We need a way to assess how our activities are changing the organization, and we need to be able to measure how those activities affect security. We need to know whether our awareness and culture activities leave security better, worse or unchanged.


Measuring human factors of security: awareness metrics vs. culture metrics (infographic)

The first step is to collect data. A challenge is to identify the right kind of data to help find meaning and value.

Security awareness metrics and the results we have

One option could be to look at the number of training courses conducted per year, the number of participants in a particular training, click and open rates of simulated phishing emails, and so forth. 
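As an illustration, these activity counts are trivial to derive from training-platform logs. The sketch below uses hypothetical data and field names (not tied to any particular platform) to show how typical awareness metrics such as open, click and completion rates are computed:

```python
# Hypothetical phishing-simulation and training log entries.
# Field names and data are illustrative, not from any specific platform.
simulation = [
    {"user": "alice", "opened": True,  "clicked": True},
    {"user": "bob",   "opened": True,  "clicked": False},
    {"user": "carol", "opened": False, "clicked": False},
    {"user": "dave",  "opened": True,  "clicked": True},
]
training_completions = {"alice", "bob", "dave"}
headcount = 4

open_rate = sum(e["opened"] for e in simulation) / len(simulation)
click_rate = sum(e["clicked"] for e in simulation) / len(simulation)
completion_rate = len(training_completions) / headcount

print(f"open rate:       {open_rate:.0%}")        # 75%
print(f"click rate:      {click_rate:.0%}")       # 50%
print(f"completion rate: {completion_rate:.0%}")  # 75%
```

Note that nothing in these numbers distinguishes a genuinely more careful workforce from, say, a spam filter quarantining the simulated email or a mandatory-training deadline driving completions.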

Reports on the number of participants, opens or clicks for a particular training say something about how many users participated, opened or clicked, and little more. The problem with this approach is that it gives a false sense of success: there are many possible explanations for completion rates that have nothing to do with employees' security behaviors.

These kinds of metrics look nice and seem relevant at first glance. But what do they say about the security culture — about the security related attitudes, beliefs, norms and compliance — of those who click? When organizations fail to measure the effectiveness of their investments to change security, they may end up wasting time, money and resources.

What we want to know is: how does the culture in our company impact security? Are our colleagues talking about security at all? If they are, what do they say? Are they happy to follow security procedures or do they think security is a drag? Are they finding ways around the controls?

Metrics such as the number of participants, opens or clicks amount to little more than vanity metrics, a term coined by Eric Ries [3] for numbers that look impressive and telling at first glance but, on closer inspection, provide no relevant or actionable information about the matter under consideration. One way to avoid relying on vanity metrics is to revisit your objectives and set clear goals.

A common objective of both security awareness and security culture programs is improved security behaviors. Whilst improved awareness alone is not enough to change behavior, these programs can be used to support cultural change. Measuring culture allows you to document how it is changing and provides an indicator of the effectiveness of your program.

Security culture metrics and the results we want

Measuring security is a given when discussing technical controls. Just like technical controls, organizational controls such as security awareness and security culture must be measured (independently of the training provider) in order to understand and manage change.

One problem organizations have previously had with measuring culture is the expense and time needed to survey all employees. Limited resources and funding prevent some organizations from measuring culture at all, and lead others to survey only a selection of their employees.

Surveying some employees and extrapolating from the results is sometimes considered a cost-effective way to gain insights into your organization and its culture. Typically, organizations that choose this route set up an in-house team or hire consultants to create the survey and analyze the results (based on a sample of employees). This approach can provide relevant data if the survey has been designed and validated by qualified and experienced survey specialists [4]. Often, however, we find this is not the case.
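To see why that expertise matters, consider the basic arithmetic of extrapolating from a sample. A minimal sketch, using the standard margin-of-error formula for a sample proportion with hypothetical numbers:

```python
import math

def margin_of_error(p, n, population, z=1.96):
    """95% margin of error for a sample proportion, with a
    finite-population correction for smaller organizations."""
    se = math.sqrt(p * (1 - p) / n)                      # standard error
    fpc = math.sqrt((population - n) / (population - 1))  # correction factor
    return z * se * fpc

# Hypothetical: 60% of a 100-person sample report following
# security procedures, in a 2,000-employee organization.
moe = margin_of_error(p=0.60, n=100, population=2000)
print(f"±{moe:.1%}")  # roughly ±9.4%
```

With a sample of only 100, the true organization-wide figure could plausibly lie anywhere between roughly 51% and 69%, which illustrates why sampling and extrapolation decisions belong with qualified survey specialists.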

Moreover, breaking down the process exposes considerable costs: it takes time to create and validate the survey items, research is needed into which topics to cover, and of course it takes time to analyze and make sense of the data once it has been gathered.

Finally, if you want to be able to compare results over time, this approach relies on the same group of people (the same in-house team or consultancy) conducting the same survey year after year.

Is there a better option?

What if a standard tool existed, built to the highest scientific standards, and available at an attractive price point? If you could measure security culture across the whole of your organization, and be confident that the data you gather is reliable and valid, would you be interested?

CLTRe offers years of experience, specialist knowledge and scientific research. With this expertise, we have created a scientific instrument for measuring security culture, and wrapped it in a SaaS solution for easy access, scalability and low cost.

Talk to one of our security culture experts

Book a demo and discuss your needs with one of our security culture experts. See how easy we’ve made it to set up a security culture measurement program and what sort of insights to expect.

Request a demo

References:

[1] Bada, M; et al (2014). Cyber Security Awareness Campaigns: Why do they fail to change behaviour? Global Cyber Security Capacity Centre, University of Oxford: Oxford, UK.
[2] Metalidou, E; et al (2014). The human factor of information security: Unintentional damage perspective. Procedia-Social and Behavioral Sciences, 147, 424-428.
[3] Ries, E (2011). The Lean Startup, Crown Publishing Group.
[4] Jones, TL; et al (2013). A quick guide to survey research. Annals of the Royal College of Surgeons of England, 95(1), 5-7.

First published Nov 16 2018, this blog was updated Mar 25 2019 to improve readability.
