Are Human Beings Really the Problem in Cyber Security? Part 1


Authored by Liberty Specialty Markets Strategic Assets Underwriting Manager Matthew Hogg and Underwriter Graham Preston

In this blog, our Strategic Assets team delves into the causes of human cyber security errors, and how the design of the environments in which we work might deter employees from taking the right action to prevent breaches in the first place.

We all probably have behaviours that we would benefit from changing: eating more healthily, walking rather than driving, or making our passwords impenetrable. So why don’t we? Should we castigate ourselves for our failures, or promise that we’ll try harder next week when it’ll all be different? Or rather than considering what will make us do something next week that we’ve failed to do for countless previous weeks, should we instead wonder what prevented us from doing these things in the first place?

This final question centres on work presented by psychologist Kurt Lewin, who developed a theory known as force-field analysis that sees human behaviour as, broadly speaking, being subject to brakes and accelerators (1) i.e. things that prevent us from doing things and things that get us to do things. Professor Daniel Kahneman, whose work laid the foundations for the now well-known nudge theory, described it as ‘the most important idea in psychology ever’ (2). With that considerable billing, we hope we can do the idea justice.

Just before you think you’ve stumbled upon the advice column in a lifestyle magazine offering premature tips about how to keep New Year’s resolutions, we will pivot to the context of cyber security. In our previous article we described how human beings often stray from the narrow path of cyber security policies and become the source of errors that culminate in cyber security breaches. The statistics are hard to deny, but the cause may be less clear cut. The problem is that the typical response is to encourage stronger adherence to policies by, among other measures, linking compliance to remuneration or sending endless reminders. To use Kurt Lewin’s parlance, these measures focus on ‘accelerators’ (3). It’s a tempting idea that this alone will make things better. But people were aware of these before, so why did they still act the way they did?

Professor Kahneman advises: “When you want to influence someone’s behaviour, you can either push them, or you can ask the question ‘Why aren’t they doing it already?’” (4).

To illustrate this concept, a study was carried out in a bid to encourage people to install thermal loft insulation. Participants were offered either a free supply and installation of loft insulation, or they could pay several hundred pounds for it. Crucially, the second offer included the service of clearing out the loft and putting the items back afterwards. Uptake for the second service was much higher, despite the considerable cost. In this scenario the ‘brake’ on installing insulation was clearly not the cost, but the dread of clearing out a cluttered loft which would have brought Marie Kondo out in a cold sweat.

Likewise, the reason people don’t change their password or pay due attention to a phishing e-mail may not be for want of ‘accelerators’ in the shape of training and reminders, but that there is a ‘brake’ preventing them from doing so. The reasons will vary from individual to individual and from company to company.

Professor Kahneman elaborates further on this, laying the blame on a concept known as ‘fundamental attribution error’ (5). This is where people underestimate the impact of an environment on the behaviour of an individual. It is key to understanding why people don’t carry out a certain action, even though they know they should. For example, in a workplace with shifting and competing priorities, cyber security may not always be the first thought when people log on in the morning or during a busy afternoon. Viewed through this lens, omissions become much more understandable. Put another way, maybe it’s not people per se that are the cause of errors, but the design of the environments they work in that prevents them from taking the right action.

Perhaps, rather than thinking about how to enforce greater compliance, IT security professionals should consider what in the current environment is preventing employees from complying already? In other words, release the brake rather than press the accelerator.

In a recent meeting, an enlightened client acknowledged that remembering numerous secure passwords is impossible, even for the most diligent employee. Rather than peppering employees with more training around secure passwords or requiring increasingly complex combinations, they are moving to a ‘passwordless world’. This would rely more on One Time Passcodes (OTPs), as favoured by banks with their customers, among other measures, in place of passwords. For example, your bank texting you a single-use code to access online banking. A small slice of utopia for anyone who has ever struggled to recall a password or realised that all of their passwords are identical. In the spirit of Kurt Lewin’s work, they are removing the brake on employees having secure login credentials.
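For readers curious how such codes actually work under the bonnet, here is a minimal sketch of a time-based one-time passcode (TOTP) generator per RFC 6238, using only the Python standard library. This is an illustration of the general mechanism, not the client's actual implementation, and real deployments would add secret provisioning, clock-drift windows, and rate limiting.

```python
import hmac
import hashlib
import struct
import time

def totp(secret, timestamp=None, step=30, digits=6):
    """Generate a time-based one-time passcode (RFC 6238, HMAC-SHA1).

    secret: shared secret as bytes; timestamp: Unix time in seconds
    (defaults to now); step: validity window in seconds; digits: code length.
    """
    if timestamp is None:
        timestamp = int(time.time())
    # The moving factor is the number of time steps since the Unix epoch.
    counter = timestamp // step
    msg = struct.pack(">Q", counter)  # 8-byte big-endian counter
    digest = hmac.new(secret, msg, hashlib.sha1).digest()
    # Dynamic truncation: low nibble of the last byte picks a 4-byte slice.
    offset = digest[-1] & 0x0F
    code = struct.unpack(">I", digest[offset:offset + 4])[0] & 0x7FFFFFFF
    # Reduce to the requested number of decimal digits, zero-padded.
    return str(code % (10 ** digits)).zfill(digits)

# RFC 6238 test secret and timestamp (59 seconds after the epoch)
print(totp(b"12345678901234567890", timestamp=59))  # → 287082
```

Because both parties derive the code from a shared secret and the current time, the code expires within seconds and never needs to be memorised, which is precisely the ‘brake’ being released.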

Inevitably, some organisations will remain solely focused on ‘accelerators’ and battle on in the vain hope that another round of e-mails or warnings will finally change behaviour. A tempting sentiment, but not one that reflects the nature of human behaviour. To conclude with the elegantly concise sentiment of Professor Richard Thaler: ‘If you want people to do something, make it easy’ (6).







