Security technology cargo cult: buy more boxes (part 2)

In Part 1 we looked at the deterrence quality of security controls, one of the three attributes of security controls that are often ignored, sometimes consciously but more often out of ignorance. Now we will look at another attribute that is too often neglected: awareness. Typically, when security awareness is discussed, the immediate mental image is one of mandatory courses, presentations and drab, unimaginative posters around the workplace. What this post covers instead is information security situational awareness: what is happening, where, why, and who is involved.

Security technology cargo cult: buy more boxes

Fear of reprisal is one of the most potent stimulants for action. It is also one that information security generally ignores. Despite that, "improving security by buying more technology" remains the prevalent course of action for most IT shops, in organisations large and small. That this merely perpetuates a losing race is not a message most IT security staff are willing to concede. There is a better way to improve the information security posture of large and small organisations, and it starts by mimicking physical security, where psychology has long played a significant role.

Wassenaar Arrangement and dual-use computer code

The Wassenaar Arrangement is frequently mentioned in information security (and vulnerability research in particular) since the inclusion of computer code as a dual-use good. The Arrangement does not clearly specify what is and isn't considered a controlled good subject to export controls, leaving a number of security researchers and long-term thinkers rightfully concerned about the future of security research. The section on dual-use technology relevant to software (not crypto) says:

> 4. A. 5. Systems, equipment, and components therefor, specially designed or modified for the generation, operation or delivery of, or communication with, "intrusion software".

Information security and the observer effect

The initial empirical study of the observer effect (the Hawthorne effect), which found that people change their behaviour for the better when observed, has seen equal measures of criticism and support over the years. Whilst many of the critiques were largely academic (i.e. no impact on the end effect, just argument over which factor drives it), there were also a number of empirical studies that failed to replicate the original results. Two academic papers I have used in the past on the effect of being watched (the quantum physics observer effect applied to real-world psychology, if you will) hold a lot of lessons for information security designers and architects, if only they would stop rolling out new boxes and start thinking about what it is they really need to do.

Cyber: what does it even mean?

Cyber is hot property nowadays. There is hardly a "thought leader", organisation, think tank, industry body or government body that isn't talking about it, and the list goes on and on. There's only one slight problem: no one agrees on what 'cyber' actually means and what is and isn't cyber. Every time I do risk-related work I try to make sure everyone uses the same terms to mean the same thing, to reduce the risk of misunderstanding. It is such a simple and obvious step that most people forget about it.

African Union's draft Cybercrime Convention

TL;DR:

- African Union's draft Cybercrime Convention: the majority of the Convention is spent defining legal protections for entities and persons using the internet and other electronic means to do business. A whole section of the Convention is reserved for defining the rights of persons and their protection from "strong men".
- Council of Europe's ETS 185, the Convention on Cybercrime (Budapest Convention): assumes a legal environment already exists in which the rights and obligations of persons and entities are clearly defined and protected, and provides a framework for law enforcement and legal action against crime committed by electronic means.

Why the cyber debate needs retired admirals and generals to stay out

OK, so the title may be a bit insensitive. A bit. But only until you read, yet again, what some of the best and brightest military minds have to say about cyber security:

> In the early 1980s cyber fiction film, "War Games," a young hacker played by Matthew Broderick almost managed to start World War III when he accidentally nearly launched nuclear strikes against the Soviet Union. It seemed unlikely in those relatively primitive days before the widespread use of the Internet, but it foreshadowed the emerging era of the profound intersection of national security and the cyber world.

Cybersecurity: economics does not apply

Everyone who goes to their boss lobbying for more budget believes they have an honest reason for doing so. But they don't. It's like how everyone believes they are an above-average driver. The source of this belief cannot be an honest appreciation of the facts; therefore, it must be a dishonest belief in one's own worth. Cybersecurity professionals have this in spades. They've raised their profession into some sort of quasi-religion: cybersecurity has become a moral duty rather than a matter of rational cost/benefit or threat analysis.

Unsubstantiated claims are not a way to reason

> The competition would try to get software developers to think about security as they were writing software and other applications, said John Colley from the ISC Squared security certification organisation. "It costs 100 times more to retro-fit security to an application than it does to do it from the start," he said.

No wonder (ISC)2 is getting a bad reputation, if they're making such stupid and unsubstantiated claims.

Source: BBC News, "UK seeks software writers with Cyber Security Challenge"