Is finding out that users don’t comply with the policy a nightmare scenario for an IT security officer, at the House of Commons for example? Hardly. Unless you find out through Twitter, of course, along with the rest of the world (see: https://twitter.com/NadineDorries/status/937019367572803590).
A policy that only demands self-evident behaviour contributes nothing, and probably does not solve a real problem. For a realistic policy in an ever-changing cyber security landscape, you should expect some aspects of compliance to be strenuous initially, and more of them over time. It is counterproductive to assume that security versus utility is a zero-sum game, but trade-offs are always likely. The research area of “usable security” works to minimise this effect.
So you have to monitor policy compliance. Probably not through social media research, though. It would be interesting to see how compliance gets checked in the House of Commons. There is a decent chance that there is education and advice, but otherwise a reliance on individual MPs’ sense of responsibility. That worked for everything, including MPs’ expense claims, until we realised that it didn’t. To complicate things, IT security where it concerns the Data Protection Act does devolve to individual MPs, as they are all separate data controllers.
IT security policy compliance should be monitored to cover the risks that the policy is supposed to mitigate. Businesses should normally link non-compliance to disciplinary procedures; as some tweets said this week, sharing logins is a sacking offence in some businesses. Non-compliance can also be an indication of changes in cyber risks and risk perception, and of changes in business processes – so the exact areas of non-compliance may be precisely where the security policy needs to reflect such changes.
Most of all, however, usable security research tells us what the ultimate value of non-compliance information is: it indicates where users have found security too burdensome, and where they have found their own workarounds. This is also known as “shadow security”. These workarounds create the seams through which cyber risks can come into the organisation.
Is the password for the shared drive too hard to remember? Sharing logins is one solution for sharing files. Another is to use the cloud (Dropbox, Google Drive, etc.) or, worse, a USB stick. So a link to just about anywhere on the internet can refer to an official document (or not), and a USB stick casually passed on can contain important official information – and be lost on the train. All this normalises dubious cyber hygiene.
Is communication by email not secure enough, maybe because emails can even be read by interns on exchange programmes? Create a WhatsApp group for gossip or conspiracy. If the Honourable Member for Backwardbury South defects to the opposition or turns out to be on Putin’s payroll, whose responsibility is it to remove them from the group? Presumably there’s no harm in Facebook knowing who is in the gang either?
These examples should give some indication of the value of knowing about non-compliance with security policy. The response is not simply to shout at the users for misbehaving – it is also to explore where business and security procedures can be integrated in a more usable way.
That does not provide an excuse for the recent behaviour of Nadine Dorries and other MPs. She didn’t exactly raise login sharing as an example of unworkable IT and its workarounds. Rather, it was to make a public argument to deflect Damian Green’s responsibility for the porn that had been found on his work computer. From an information security perspective, that is inexcusable – and that point of view should be supported by management. One role of logins is to represent a user’s permissions, responsibilities and actions in an IT system in a way that makes them checkable, recordable and auditable. Morally if not also legally, a user should always remain responsible for what is done using their login – the more so if it is willingly shared. Dorries’ alternative to the “maybe his login was hacked” excuse was ill-considered for that reason alone.
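The auditability point above can be illustrated with a minimal sketch (in Python, with entirely hypothetical names): an audit trail can only ever attribute an action to a login, not to the human at the keyboard, so sharing a login collapses individual accountability.

```python
# Minimal sketch of a login as an audit anchor (hypothetical example).
from dataclasses import dataclass, field


@dataclass
class AuditLog:
    entries: list = field(default_factory=list)

    def record(self, login: str, action: str) -> None:
        # The system can only attribute an action to the login used,
        # not to whoever actually performed it.
        self.entries.append((login, action))

    def actions_by(self, login: str) -> list:
        return [action for who, action in self.entries if who == login]


log = AuditLog()
log.record("mp_example", "opened_file:constituency_report.docx")
log.record("mp_example", "visited:questionable_site")  # a staffer on a shared login?

# Both actions land on the same login; the audit trail cannot
# distinguish the MP from anyone they shared credentials with.
print(log.actions_by("mp_example"))
```

The point of the sketch: once credentials are shared, the log’s per-login attribution is all that remains, which is exactly why the login holder stays responsible for everything recorded against it.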
This blog post was written by Eerke Boiten, Professor of Cyber Security in the Cyber Technology Institute, De Montfort University.