Sometime in the next seven days you will be at risk of a “data breach” caused by human factors. People returning from their Christmas break will have forgotten their passwords and, keen to get stuck into their work, will overlook the practices and processes they normally follow and take short cuts. Add to this the fatigue they may feel about returning to a job they don’t like, or trauma at home such as relationship problems and family break-ups, which are so prevalent post-holiday season in the UK.
Why is this relevant? Well, we are humans, not machines. We are not conditioned to, or able to, follow rigorous procedures day after day against a backdrop of productivity requirements and consistent decision making. In most organisations, Information Security is designed and developed by those who do not use the policies, procedures and processes that keep data safe on a daily basis. Technological solutions are often designed by IT security experts with little interaction with those who use the tools every day. Data protection policy is often written by compliance professionals and lawyers who demand 100% allegiance to their policy with very little regard for the factors that affect us as humans, leading to compliance fatigue.
Compliance fatigue sets in where a constant stream of instructions or complex rules has been developed because it is seen as the only way to force humans to act like machines. My recent trip to Japan proved the point: the more rules there were (the shouting through megaphones, the pointing of wands by people helping you cross the road, the signs telling you not to do something), the more humans ignored them and picked only those they felt aided their productive day, taking shortcuts where possible and ignoring the rest.
In Data Protection we are in danger of using the term “culture change” to impose an extra burden on employees, expecting another good dose of training and awareness to solve the problems we have as humans. The “culture change” simply becomes a new way of adding layers of complexity to employees’ day jobs.
We need to address cultural change in a different way and develop employee-centric engagement at the start, not the end, of a privacy programme.
In the 1975 paper, The Protection of Information in Computer Systems, Jerome Saltzer and Michael Schroeder established ten principles for designing secure systems.
Two of those principles are rooted in the knowledge of behavioural sciences:
- Psychology: the security mechanism must be ‘psychologically acceptable’ to the humans who have to apply it;
- Human Factors and Economics: each individual user, and the organisation as a whole, should have to deal with as few distinct security mechanisms as possible.
Almost 100 years before Saltzer and Schroeder, the founding father of cryptography, Auguste Kerckhoffs, formulated six principles for operating a secure communication system, with a key focus on human factors. Among them was the requirement that the system “must be easy to use and must neither require stress of mind nor the knowledge of a long series of rules”.
Over the past 20 years, a growing body of research has examined the underlying causes of compliance and security failures and the role of human factors. The insight that has emerged is that security measures are not adopted because humans are treated as components whose behaviour can be specified through security policies and controlled through security mechanisms and sanctions. But the fault does not lie primarily with the users, as suggested by the oft-used phrase that humans are the ‘weakest link’; it lies in ignoring the requirements that Kerckhoffs and Saltzer and Schroeder so clearly identified: that security needs to be usable, acceptable and effective.
The expectation that a set of standards, policies and training sessions is all that is needed to deliver the CORE programme outputs is unfounded. Our compliance programmes need to ensure that our design “fits the task to the human, not the human to the task”.
When employees do not behave as specified by security policies, most compliance practitioners think the users are at fault: that they ‘just don’t understand the risks’ or ‘are just too lazy’. But research has shown that non-compliance, which we now refer to as ‘rule-bending’, is caused by people facing a stark choice between doing what is right by security and reducing their productivity. Most choose productivity over security, because that is what the organisation also does.
Behaviour is essentially goal-driven. We perform tasks to achieve goals, at work (‘I want to get this offer to our employee today’) or in our personal lives (‘I want to get the best utility deal for us’). To achieve these goals, people complete a series of tasks. We must be aware that compliance fatigue can easily set in amid the conflicting demands of employees’ everyday lives.
A compliance programme will therefore fail to deliver its objectives if it does not consider human factors at the design phase.
Armed with this knowledge, and building in not only privacy but human factors by design, a range of employee-supportive solutions can be developed.
These may include:
- Automating security, for instance using implicit authentication to recognise authorised users, instead of requiring them to enter passwords many times over.
- Minimising the workload and the disruption to the primary task where explicit human action is necessary in a security task.
- Designing processes that trigger security mechanisms such as authentication only when necessary.
- Designing systems that are secure by default, so that they do not push the load of security configuration and management on to employees.
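To make the last three points concrete, here is a minimal sketch of what “secure by default, with authentication triggered only when necessary” could look like in code. All names (SessionContext, risk_score, needs_step_up, RISK_THRESHOLD) and the scoring weights are illustrative assumptions, not taken from any specific product or standard.

```python
from dataclasses import dataclass

# Illustrative cut-off above which we interrupt the user for re-authentication.
RISK_THRESHOLD = 50

@dataclass
class SessionContext:
    # Secure defaults: an unconfigured session is treated as untrusted.
    known_device: bool = False      # default: device is unknown
    recent_auth_minutes: int = 999  # default: authentication assumed stale
    action_sensitivity: int = 10    # 0 (trivial) .. 100 (highly sensitive)

def risk_score(ctx: SessionContext) -> int:
    """Combine simple signals into a single risk score (higher = riskier)."""
    score = ctx.action_sensitivity
    if not ctx.known_device:
        score += 30  # unrecognised device raises risk
    if ctx.recent_auth_minutes > 30:
        score += 20  # stale authentication raises risk
    return score

def needs_step_up(ctx: SessionContext) -> bool:
    """Only interrupt the user's primary task when the risk justifies it."""
    return risk_score(ctx) >= RISK_THRESHOLD

# Routine action on a recognised device: no password prompt needed.
routine = SessionContext(known_device=True, recent_auth_minutes=5,
                         action_sensitivity=10)
# Sensitive action from an unknown device: step-up authentication triggered.
sensitive = SessionContext(known_device=False, recent_auth_minutes=120,
                           action_sensitivity=60)

print(needs_step_up(routine))    # low risk: user is not interrupted
print(needs_step_up(sensitive))  # high risk: extra authentication required
```

The point of the sketch is the shape, not the numbers: the defaults are restrictive so that an unconfigured system errs on the safe side, and the human is only asked to act when the context genuinely warrants it, rather than on every interaction.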
The benefit is that we actually make the human’s job easier, which is always an easier selling message. When employees encounter security policies that are impossible to follow, or are clearly not effective, it gives them a justification for doubting all compliance policies.
That is why compliance hygiene is essential. When policies are not being followed, compliance professionals must investigate why, in a non-confrontational manner; if it is because the policies are impossible or too onerous to follow, the solution must be redesigned.
Employees do not show blatant disregard for security; they try to manage the risk the best way they know how, designing their own solutions, a practice known as ‘shadow security’. These ‘amateur’ security solutions may not be entirely effective from a security perspective, but since they are ‘workable’, asking ‘how could we make that secure?’ is a good starting point for finding an effective solution that fits in with how people work. The programme thereby recognises employee input into compliance design.
This represents a major change of behaviour, attitude and culture for any organisation, over and above the normal remedies for ensuring “compliance sticks”. Practitioners often respond with security awareness, education and training measures when people do not follow security policies. In practice, if people keep being told that the risk is really serious and that they must follow policy, but cannot do so, they develop resentment and a negative attitude towards security and the organisation, which is counter-productive.
In practice, the three terms: awareness, education and training, are often used interchangeably but are different elements that build on each other:
Security Awareness. The purpose of security awareness is to catch people’s attention and convince them security is worth the engagement. Given that many organisations face compliance and security fatigue, we need to capture people’s attention, and get them to realise that:
(a) data security is relevant to them, that is, the risks are real and could affect them, and
(b) there are steps they can take to reduce the risk, and they are capable of taking those steps.
Crafting effective awareness messages is not an easy task for compliance professionals. Working with the communications specialists in an organisation can, therefore, help. They not only know how to craft messages, nudges and scenarios that catch people’s attention but know how to reach different audiences via the different channels available to them and integrate them into the overall set of communications to avoid message fatigue.
Security education. Once people are willing to learn more about data security, we can provide information about the risks and what they can do to protect themselves against them. Most people currently have very incomplete, and often incorrect, mental models of cyber risks. Transforming these into more accurate ones provides a basis on which to build cyber security skills.
However, it is hard to ascertain whether the education leads to more accurate mental models or at least the ones that security professionals expect people to possess.
Security Training. Training helps people to acquire skills, e.g. how to use a particular security mechanism correctly, or how to recognise and respond to a social engineering attack. In addition to showing people how to do something, we need to support the acquisition of skills by letting them practise in a setting where they can ‘experiment’ with security decision-making and reflect on their perceptions and biases. Parts of skill acquisition can be supported online, but, like all learning, it is much more likely to be successful when it takes place in the context of a social community.
A common misunderstanding is that if people complete the three steps above and know what to do, they will change their behaviour. But knowing what to do and how to do it is not enough: human activity is 90% automatic, driven by routines or habits stored in long-term memory. The new security behaviour needs to be embedded there, but its place is occupied by an existing behaviour (such as an old password). The adage that ‘old habits die hard’ accurately describes the fact that until we manage to push the old behaviour out and the new behaviour becomes automatic, all our awareness, education and training efforts may not yield the changes in behaviour we are seeking. This is a challenging undertaking. Since productive activity needs to carry on while we change security behaviour, we can only target one or two behaviours at a time, and embark on changing the next one or two only once these have become genuinely embedded. Nor should one conflate security awareness and education with security culture.
As my mate Ali Hepher, Head Coach at Exeter Chiefs rugby, said: “the expectation that professional rugby players can be 100% mentally focussed over a 30-match programme fails to recognise that they are human beings and it is impossible to be that spot on with your mental focus every time… we are humans after all”.
So, it is not employee behaviour we need to change, but the behaviour of those of us within Information Security who think “compliance can be taught”.
J. H. Saltzer and M. D. Schroeder, “The protection of information in computer systems,” Proceedings of the IEEE, vol. 63, no. 9, pp. 1278–1308, 1975.
C. Herley, “More is not the answer,” IEEE Security & Privacy, vol. 12, no. 1, pp. 14–19, 2014.
M. A. Sasse, S. Brostoff, and D. Weirich, “Transforming the ‘weakest link’—a human/computer interaction approach to usable and effective security,” BT Technology Journal, vol. 19, no. 3, pp. 122–131, 2001.