This is the official Perth blog site for posts, comments, and other contributions about leadership, behavioral finance and economics, and about management generally, as well as other related topics that take our fancy.

Can the cognitive biases of the intelligence agencies be exploited by our enemies?

National intelligence has been a hot topic lately. That’s a bit weird given that I have been posting on artificial intelligence. Does national intelligence come even close to AI? Can we trust its analyses?

OK so I have a tiny bit of first-hand experience in this. In my callow youth I was actually a junior diplomat for a small power allied to the US (check out my bio to see which one). I was posted to a subcontinental country that is frequently in the news, usually not for good things.

But as a bit player in the global diplomatic game I actually had a very small part in analyzing political, social and military events in my host country and its neighbors, and yes, even in preparing reports on them. Of course, I was totally en clair; no deep cover here. So I was no Smiley (my apologies to John le Carré), just a working stiff trying to do his insignificant best for his own country and the Western alliance of democratic nations, and especially for its leader, the US.

Of course, you get to know the tricks. My job, apart from attending numerous boring, boozy and tiresome diplomatic and social events, often involved preparing intelligence reports on my host country. I quickly learned that if I wrote a piece that was bad (by my own lights) I should classify it; then very few people would ever see it, which would hide my stupidity and (possibly) enhance my chances of promotion.

Like most ersatz-crypto-intelligence agents (aka diplomats) everywhere, I cut a lot of clippings from newspapers, classified them as confidential and sent them back to home base. That counted as reporting, helped me look busy to my handlers, and was quicker (and probably much better) than trying to write something myself.

But occasionally I wrote something good. Actually really good. I penned several reports that accurately predicted some world-changing black swan events precipitated by my host country some years later. But at the time, much to my shame, they were all rejected by home base, which essentially said they were imaginative but totally wrong. They had other sources, these responses said, sources way more senior, far better placed and far more experienced than yours truly, with proof that what I was saying was just plain wrong, if not totally out of court.

At the time I was junior, inexperienced, not well-placed and definitely not part of the inner circles of my particular department or of the allied intelligence networks of which I formed a tiny part.

But what these responses told me, once I thought about it, was that being senior, experienced, well-placed and part of these networks was probably a huge liability. I came to realize that you generally couldn't trust what such people were predicting, and that they had essentially no chance of predicting black swans or anything like them.

That’s because my employer, like all bureaucracies, always thought in linear ways, never nonlinearly. If you want nonlinear thinking you have to go way outside the normal intelligence networks. And no, that doesn’t include academic networks, because these are essentially the same. You usually have to go to loners, disruptors and entrepreneurs to figure out what is likely to happen. Usually these people are seen as a bit weird, people you can’t trust to go with the program. Hmmm, a bit like me….

Cut to the rest of my life until now. Naturally I quickly left said diplomatic service; it was a terrible match on both sides. I went first into government service, then the private sector, the latter for most of the time. Both taught me even more about bureaucracies and their inability to predict black swans.

I guess that’s why some of my favorite reading is books by spy-thriller author John le Carré who, not so coincidentally, was also an émigré from intelligence services, in his case from the UK’s MI6.

Same old story. What we learn from him, again, is the aversion of all intelligence services to black swans, and the importance, if not the primacy, of personal and political motives.

But here’s the point I am getting to. Amongst my other sins I happen to have conducted a lot of research into behavioral finance. So I’m very interested in the importance of irrationality and mixed rationality in decision-making.

For a recent checkpoint on this, see the book by Michael Lewis (author of the best-selling “Moneyball”) about the emergence of behavioral economics, “The Undoing Project.” It’s just the latest reminder that the irrationality and mixed rationality of the private sector are mirrored in excelsis in the doings of the public sector, maybe even more so, since in politics there are few objective measures of successful outcomes.

Under the Conservative government of David Cameron in the UK, a unit was set up to apply the findings of behavioral science to consumers and voters (the so-called “Nudge Unit”). But even the Tories were never gutsy enough to apply it to the mandarins themselves, just as we in the US haven’t been either.

Reformulating my experience in the diplomatic service in behavioral economics terms, it’s clear that it suffered from all the classic cognitive biases that are now the currency of the behavioral economics realm. The main ones I observed were:

  • Confirmation bias – people interpret almost everything so that it confirms what they already believe – WMD, anyone?
  • Loss aversion – people hate to lose what they have even more than they love to gain something new, even if the new stuff would put them way ahead of the game overall – nukes, maybe?
  • Status quo bias – most people reflexively stick with what already exists, even if doing so will do them much harm – Cold War spycraft, perhaps?
  • The endowment effect – people tend to overvalue what they already possess, even if doing so prevents them from gaining even bigger endowments in the future – the South China Sea?
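Loss aversion, at least, has been formalized. A minimal sketch of the Kahneman-Tversky prospect-theory value function, with their commonly cited illustrative parameters (loss-aversion coefficient λ ≈ 2.25, curvature α ≈ 0.88), shows why a loss looms larger than an equal gain:

```python
# Sketch of the Kahneman-Tversky prospect-theory value function.
# Parameters are illustrative estimates from their experiments,
# not a claim about any particular agency or decision.

def prospect_value(x, alpha=0.88, lam=2.25):
    """Subjective value of a gain or loss x.
    alpha: diminishing sensitivity to magnitude.
    lam:   loss-aversion coefficient (losses weigh ~2x gains)."""
    if x >= 0:
        return x ** alpha
    return -lam * ((-x) ** alpha)

gain = prospect_value(100)    # subjective value of winning $100
loss = prospect_value(-100)   # subjective value of losing $100
print(gain, loss)
print(abs(loss) > gain)       # True: the loss hurts more than the gain pleases
```

An analyst weighing a risky but accurate forecast against a safe but wrong consensus faces exactly this asymmetry: the career loss from being wrong alone outweighs the gain from being right alone.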

These are all essentially irrational biases. They are all totally natural to us. So much so that we are rarely conscious of them and the impacts they have on our decisions, usually bad. Essentially they are behavioral vulnerabilities, big ones in fact. Could behavioral vulnerabilities actually be more serious than vulnerabilities in material things such as warheads, armed men, advanced electronics, even AI?

It’s not that I think the intelligence agencies, or government departments generally, are stupid or peopled by stupid people. Rather, it’s that the presence of mass irrationality is not understood or addressed in any formal way by them.

Notable failures in intelligence (the ones we know about, anyway) include Iraq (WMD), 9/11, the fall of the Soviet Union and the fall of the Berlin Wall; of course, there are numerous others.

I can guarantee that in every single one of these cases there was a junior, inexperienced, poorly-placed loner, with no access to the elite or senior networks that make all the final calls on what’s right and wrong, who had already made the right prediction but then had it rejected. And in their turn, each of these rejectees quickly quit to do their own thing. Maybe Snowden was one of them, not that that condones what he did.

That junior was the one who didn’t have those behavioral vulnerabilities. But because of the hierarchy, and the deeply ingrained cognitive biases I have just outlined, the powers-that-be rejected such views as not credible. They were the vulnerable ones.

I think that in defense and intelligence thinking and action we are focusing too much on vulnerabilities in men, materiel and military capabilities generally, and not enough on our behavioral vulnerabilities. All government agencies have them, but the intelligence agencies are the ones on the front line, where these vulnerabilities are most serious and most likely to be exploited by our enemies.

We talk about cyber-hacking, propaganda attacks and other such militaristic exotica. But it’s clear that our enemies have already been exploiting these behavioral vulnerabilities and that’s why they’ve been so effective. We’re just struggling to figure it all out in yesterday’s terms, generals fighting the last war, etc. etc.

It’s time for the national intelligence agencies to start looking inwards. You know, we have met the enemy and he is us.

And if the new Administration is serious about reform of our intelligence agencies, they should lead them to address these vulnerabilities. Maybe a Nudge Unit for Intelligence? If not the entire Federal bureaucracy?

The only silver lining in this big black cloud is that most of our enemies (and, generally, our allies) – but certainly not all – are in exactly the same clueless position.


