The frailty and error-proneness of human cognition and the decision to torture

Under many conditions (including difficult and stressful ones), people rely on heuristics (cognitive shortcuts that enable quick decisions) and are prone to the effects of social processes such as groupthink. More than fifty years of data from experimental psychology and experimental brain research give us a guide to how human rationality and reason are bounded and error-prone. Daniel Kahneman, experimental psychologist and Nobel Memorial Prize winner, gives a marvellous and extensive exposition of these findings in his tour-de-force “Thinking, Fast and Slow”. Decision-making within organisations (political institutions, law enforcement, etc.) is often a difficult and ideologically-charged process. Political and civil service systems can undervalue expertise, suppress cognitive diversity and discount evidence in favour of ideology.

There are persistent and enduring cognitive errors which lead to faulty decision-making by individuals (political leaders, civil servants, bankers, etc.) and by institutions (social systems and organisations: Government departments, banks, churches, etc.). Here are a few.

Anosognosia is the condition of literally being ‘without knowledge’ (being unaware) of a cognitive or other impairment, and behaving as if there is no problem. Governing elites do not know that they do not know, nor do they even know what they need to know. Complex and difficult problems (such as how to collect valuable and useful intelligence from multiple humans under conditions where the targets are continually moving and changing) are best solved by groups with substantial intellectual strength and capacity (obvious) and substantial diversity of experience (not obvious). Absent these factors, people will rely on folk or lay intuitions as the basis for deciding courses of action – “we need to know quick – let’s waterboard this guy”. Government decisions are taken and implemented within a group context or contexts – documents are evaluated, discussions occur, and decisions are taken and then implemented (sometimes in ways that subvert the spirit and intentions of the original decision).

Groupthink occurs when a group makes poor decisions because of high levels of within-group cohesion and a desire to minimise conflict (as might happen in an exhausted, embattled and worn-out Government Cabinet desperate to forestall another terrorist attack).

The necessary evidence-based critical analysis does not occur. Groupthink can be reduced by the leader and the group having an extensive network of weak ties to other individuals and groups. Weak ties provide us with novel ideas and knowledge, and provide a route to ‘reality-test’ planned courses of action. An extensive national and international weak-tie network will provide Government Ministers with knowledge, insights and ideas unavailable within their usual cognitive and social bubble. It allows them to ask how the leaders of countries X, Y and Z coped with terrorist outrages – for example: “You guys have waterboarded, beaten, isolated and electrocuted terrorist suspects. How did that work out for you?” European countries have a long, sad and sorry history of doing just this – and within living memory too.

An interesting cognitive error during complex decision-making is the failure to explore counterfactuals, because these might falsify or invalidate a favoured course of action. Exploring counterfactuals forces you to ask why you might be wrong! (“Why might freezing this guy be a bad idea?”).

Cognitive dissonance is the unpleasant feeling caused by holding two contradictory beliefs simultaneously: for example, “our current interrogation methods aren’t delivering; my colleagues and I are good people and are part of the system; therefore it isn’t the system; the problems lie elsewhere” (with the terrorists – although, logically, this may also be true).

Verificationism (also known as confirmation bias) is a pervasive cognitive bias, in which evidence favouring a particular point of view is collected and weighted heavily, while contrary evidence is discounted or ignored (“All the programmes I watch on telly show torture working”). Its opposite, falsificationism, is a difficult habit of mind to acquire, but it is a must for any working scientist. Falsificationism requires considering what empirical evidence would invalidate (falsify) the position you are adopting. One way of avoiding this bias is to state clearly what empirical evidence would falsify your opinion or theory; another is to build an evidence-based brake into policy formation. In science this is done by international, anonymous, expert ‘peer review’. Peer review and similar systems can be built into the process of Government via policy-review boards.

The arguments for torture may also pivot around the focusing illusion, a cognitive error which emphasises only the upside arguments (local benefits: ‘quick and easy knowledge about terrorist networks and ticking time-bombs’) but ignores the costs (the destruction of reputations and lives, contempt for international treaties and the rule of law, and acting on what is in fact false knowledge).

Language has the important property of ‘framing’ arguments and discussions. The crime debate at one time in the UK was dominated by the phrase ‘a short, sharp, shock’, which relied on the folk theory that quick and severe punishment would shock teenagers out of criminal tendencies. (The pleasing alliteration of the successive sibilants was an important, but useless, selling point too.) Short, sharp shocks, of course, predictably have no such effect, but why let data from the psychology of punishment and from criminology influence debate? The phrase ‘cut and run’ was used to forestall debate about the palpably-failing US military strategy in Iraq, until empirical reality forced a change of direction. There are many other cognitive errors (for example, availability and affect heuristics, motivated reasoning, competence illusions, overconfidence, incentive effects), and humans are especially prone to them under duress, as cognition degrades under stress.

Individual rationality and cognition are limited and error-prone. Institutionalised decision-making supports are vital to ensure that decisions are made using the best evidence and logic available.

My book, Why Torture Doesn’t Work: The Neuroscience of Interrogation, is available on Amazon and will be released in November 2015.

Author: Shane O'Mara

Neuroscientist
