What do video games say about you?

I think the way people play video games can tell you a lot about their personality. This thought first came to me in 1999, when I was sharing a flat in London with a friend of mine. To preserve his anonymity, I will not use his real name here. For the purposes of this story, I will refer to him simply as “Christ.”

Back then, Christ and I used to spend hours every day on the PlayStation. We were particularly fond of the various Tomb Raider games, in which the player has to guide the protagonist, Lara Croft, through a series of ancient ruins and underground caverns. Along the way, Lara must fight various foes, and each blow they land on her takes away a certain amount of life. If her life indicator drops to zero, she dies, and the player must go back to the previous checkpoint and start again. Dotted around the game, however, are occasional medi-packs, which can be used to restore a certain amount of life to the wounded heroine.

My approach to playing the game was not particularly sophisticated. I would charge through the tunnels and chambers as fast as I could, hacking away at the enemies who confronted me, and picking up any medi-packs I spotted. If my life dropped to dangerously low levels, I’d use a medi-pack. Often, I would make it through to the next checkpoint by the skin of my teeth, with my life perilously low and no medi-packs left.

Christ took a very different approach. He would proceed very tentatively, exploring each section of tunnel, and every nook and cranny of each room. As a result, he managed to find a lot more medi-packs than I did. But he would never use them. If he took any damage, he’d let Lara die so he could go back to the last checkpoint, and try to make it all the way to the next one without losing any life. By the time he reached the end of the game, he had hundreds of unused medi-packs, like a miser who reaches the end of his life with a pile of banknotes under his deathbed.

At the time, I realized that these different approaches to playing Tomb Raider said something about our different attitudes to risk. Christ was clearly much more risk averse than I was. But since then, I have come to think that our different playing styles reveal much more than just this. They reflect our whole attitude to life.

Take work, for example. Christ and I are both writers, but we write very differently. I will write furiously for a few months, finish a book, and then not write again for a year or more. For a while, I live well on my advance, but then I plough it into some crazy project, or invest in a business that goes bust, and before long I’m poor again, and must return to my desk to write another book.

Christ, on the other hand, has written for several hours almost every day for the past ten years. In that time, he has steadily built up a small fortune. He is always talking about retiring to the country, but he never seems to think he has enough money to take the plunge. Even now, with over a million pounds in the bank, and large royalty checks arriving every few months, he is still living in his run-down old house in Catford. Next year, he says, he will finally stop writing and leave London. But I’ve heard that before.

A few years ago, when another one of my hare-brained schemes had failed dismally, and I was crushed and broken, Christ invited me to stay with him in Catford to lick my wounds for a week or so. Each day, as I crawled out of my bed and made my way downstairs for breakfast, I would look up admiringly at the rows of colored paper pinned to the walls of the stairway. Each piece of paper had a list of milestones that Christ had set himself for each book, all dutifully ticked off as he had achieved them. These humble records spoke eloquently of his patient setting of goals, and the diligent accomplishment of each one. At the time, they felt like a silent rebuke of my way of living, and I even caught the occasional gleam of triumph in Christ’s eyes, as if my downtrodden air was the vindication he had always been seeking that his way of life was better than mine.

But now, a few years later, I’m back on my feet again, and I see things differently. Since that sojourn in Catford, I’ve written another book, made a pile of money, and squandered it on setting up a company that went nowhere. I’m poor again now, and have started writing yet another book – my eighth. But in those few years, I’ve also got married, moved to Ireland, and spent several terms abroad teaching at universities in Beirut and Guatemala. I’ve bought a house and redecorated it, created a website that has been visited by over 100,000 people, and taken one of my employers to the High Court (twice) and won (both times). In short, I’ve packed a lot in.

Christ, meanwhile, has done nothing except write. His bank account has grown steadily, but his house is still in the same dilapidated condition it was when he moved in ten years ago. He still hasn’t married his girlfriend, despite promising to do so every year for the past five years. And he still hasn’t moved to the country, though he does say he plans to do it next year. He’s had a heart attack, so I hope he does it soon. Otherwise, I fear he’ll arrive at those pearly gates with a bunch of medi-packs which by then will be completely useless.

 

Thinking, fast and slow, and slower…

Daniel Kahneman’s recent bestseller, Thinking, Fast and Slow, is a brilliant summary of a lifetime’s work in the psychology of decision making. Together with his colleague, Amos Tversky, Kahneman revolutionized the way psychologists think about how people reason and make choices. Before these two young Turks burst on the scene in the early 1970s, psychologists had a rather rosy view of decision making that owed more to logic and mathematics than to empirical research. People were seen as utility-maximizers, rationally weighing up the pros and cons of the options available to them before opting for the one with the highest payoff. In a series of brilliant experiments, Kahneman and Tversky exposed this picture as overly optimistic, and showed that human decision making is riddled with biases and cognitive short-cuts that work well enough most of the time, but can also lead to some pretty dumb mistakes.

The central thesis of Kahneman’s book is that there are fundamentally two modes of thought, which he calls System 1 and System 2. System 1 is fast, automatic, emotional, and subconscious; System 2 is slower, more effortful, more logical, and more deliberative. The biases and cognitive short-cuts are largely features of System 1, but we can overcome these by employing System 2. It just takes more effort and more time.

This is fine as far as it goes, but it leaves a crucial third kind of thinking out of the picture. This is the meditative, creative mode of thought that the psychologist Guy Claxton calls the “undermind” in his thought-provoking book, Hare Brain, Tortoise Mind. It is much slower than Kahneman’s System 2, and works away quietly in the background, below the level of conscious awareness, helping us to register events, recognize patterns, make connections and be creative. This is the kind of thought that can bubble away beneath the surface for weeks or even months, quietly turning over a problem and looking at it from different perspectives, before suddenly thrusting a solution into consciousness in that exciting Eureka! moment.

I think Claxton is onto something in claiming that the mind possesses three different processing speeds, not two.  Think of it as a kind of “cognitive sandwich” if you like. The top half of the bun is the lightning fast System 1 identified by Kahneman, the world of snap judgments and rapid heuristics.  The bottom half of the bun is the snail-paced undermind identified by Claxton, where thoughts cook slowly in the back oven. Both of these are unconscious processes, operating below the level of conscious awareness. The hamburger in the middle is conscious thought, Kahneman’s System 2.

Where does risk intelligence come into all this? Risk intelligence tends to be domain-specific, and those with high risk intelligence build up models of a given domain slowly, often unconsciously, as they gradually accumulate experience in their specialist field. These models may involve many different variables. The expert horse handicappers I describe in chapter one of my book took at least seven different variables into account, including the speed at which a horse ran in its last race, the quality of the jockey, and the current condition of the racetrack. People with high risk intelligence manage to keep track of all the relevant variables, and perform the complex mathematical task of weighing them up and combining them. However, they usually do this unconsciously; they need not be mathematical wizards, since most of the computation goes on below the level of awareness.
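
To give a feel for what “weighing up and combining” several variables might look like if it were written out explicitly, here is a purely hypothetical weighted-sum sketch in Python. The variable names, weights, and scores are my own invented assumptions; they are not the handicappers’ actual model, which is almost certainly more complicated and at least partly non-linear.

```python
# A toy, invented model: combine several handicapping variables into a single score.
# Every weight and score below is an illustrative assumption, not real data.
WEIGHTS = {
    "last_race_speed": 0.30,
    "jockey_quality": 0.25,
    "track_condition": 0.15,
    "recent_form": 0.15,
    "class_of_race": 0.10,
    "days_since_last_race": 0.05,
}

def combined_score(horse):
    """Weighted sum of the horse's scores (each assumed to lie between 0 and 1)."""
    return sum(WEIGHTS[name] * horse.get(name, 0.0) for name in WEIGHTS)

hypothetical_horse = {
    "last_race_speed": 0.8,
    "jockey_quality": 0.6,
    "track_condition": 0.5,
    "recent_form": 0.7,
    "class_of_race": 0.9,
    "days_since_last_race": 0.4,
}
print(round(combined_score(hypothetical_horse), 3))  # 0.68
```

The point of the toy model is only that a judgment combining many cues can be expressed as a calculation; the expert, of course, does nothing so tidy or so conscious.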

There are some basic tricks for increasing risk intelligence across the board, and I discuss some of these in the book. Simply taking a general knowledge version of the RQ test can, for example, lead to rapid gains in risk intelligence because it encourages people to make more fine-grained distinctions between different degrees of belief.  Such rapid improvements in risk intelligence may well generalize to any field, so there may be a small domain-general component of risk intelligence.  But these rapid gains are the low-hanging fruit; once you have plucked them, further increases in risk intelligence may be harder to achieve, and will require immersing yourself in a particular field of study for perhaps many years. It is then that, to borrow Claxton’s metaphor, the “hare brain” must stand aside, and let the “tortoise mind” take over.

 

The unbearable lightness of expected utility

In chapter 8 of my book, Risk Intelligence, I discuss the concept of expected utility. To calculate the expected utility of a course of action, the first step is to estimate – separately – the probability of success, and the potential gains and losses that success and failure would entail. Next, one does a little math, multiplying the probability of success by the potential gains, and multiplying the probability of failure by the potential losses. Finally, one adds these two figures together to end up with the expected utility of that course of action. After doing this for each possible action, the rational choice is to pick the action with the highest expected utility.
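
To make the arithmetic concrete, here is a minimal sketch in Python; the probabilities, gains, and losses are made-up assumptions chosen purely to illustrate the calculation, not figures from the book.

```python
def expected_utility(p_success, gain, loss):
    """Probability-weighted sum of the potential gain and the potential loss."""
    return p_success * gain + (1 - p_success) * loss

# Two hypothetical courses of action (all numbers are invented for illustration).
option_a = expected_utility(p_success=0.60, gain=10_000, loss=-4_000)  # 6,000 - 1,600 = 4,400
option_b = expected_utility(p_success=0.90, gain=2_000, loss=-500)     # 1,800 - 50 = 1,750

# On this account, the rational choice is simply the option with the higher figure.
print(option_a, option_b)  # 4400.0 1750.0 -> choose option A
```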

Expected utility is an abstract concept. It doesn’t refer to any actual win or loss: it’s the average amount you would win or lose per bet if you placed the same bet an infinite number of times. But this ethereal figure takes on an almost physical nature for the expert gambler, looming even larger in his consciousness than the actual profit or loss that hypnotises the rest of us. Expert gamblers see something different when they look at a poker table or a roulette wheel. Where most people see a range of possible prizes, they see a single abstract figure.

This was brought home to me one night when a highly successful gambler invited me to accompany him to a casino. This particular gambler made all his money betting on horses. He shunned casinos, and only ventured in on this occasion because he wanted to give me and his other companions the thrill of seeing someone playing for high stakes. We followed him over to the craps table.

Craps is a dice game in which players place wagers on the outcome of the roll of two dice. My gambler friend proceeded to show off by placing bets of 1000 US dollars on the pass line, one after the other. A pass line bet is won immediately if the first roll is a 7 or 11. If the first roll is a 2, 3 or 12, the bet loses (this is known as “crapping out”). If the roll is any other value, it establishes a point; if that point is rolled again before a seven, the bet wins. If, with a point established, a seven is rolled before the point is re-rolled, the bet loses (“seven out”). A pass line win pays even money; in other words, my friend stood to win or lose a thousand dollars on each bet.

However, as he told me later over cocktails in the casino bar, it wasn’t this figure that was at the forefront of his mind. Instead, he just treated each bet as a bit of fun that cost him fourteen bucks. Although not a casino gambler, he was familiar enough with craps to know that the expected value of a pass line bet is -0.014. And this meant that, on average, each bet of a thousand dollars would leave him fourteen dollars less well off. The actual profit or loss at the end of an hour on the craps table could be anything from plus fifty thousand dollars to minus fifty thousand, and my friend would be aware of this figure too. But the figure that mattered most for him was the expected value of minus fourteen bucks per bet. That’s not a good-value bet, of course. Which is why, in any other circumstance, my gambler friend would never play craps.
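
The figure of -0.014 can be recovered from the pass line rules described above. Here is a short Python sketch, assuming two fair dice and an even-money payout, that computes the expected value of the bet per dollar staked.

```python
from fractions import Fraction

# Count the ways to roll each total with two fair dice.
ways = {total: 0 for total in range(2, 13)}
for d1 in range(1, 7):
    for d2 in range(1, 7):
        ways[d1 + d2] += 1

# Probability of winning a pass line bet.
p_win = Fraction(0)
for total, n in ways.items():
    p_come_out = Fraction(n, 36)
    if total in (7, 11):           # immediate win on the come-out roll
        p_win += p_come_out
    elif total in (2, 3, 12):      # "crapping out": immediate loss
        continue
    else:                          # a point is established
        # The bet then wins only if the point is rolled again before a seven.
        p_win += p_come_out * Fraction(n, n + ways[7])

# Even-money payout, so the expected value per unit staked is p_win minus p_lose.
ev = p_win - (1 - p_win)
print(ev, float(ev))     # -7/495, about -0.0141
print(1000 * float(ev))  # about -$14 on each $1,000 bet
```

On these assumptions the exact value is -7/495, or roughly 1.4 cents lost for every dollar wagered, which is where the fourteen dollars per thousand comes from.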

New “risk intelligence test” is a disappointment

Imagine my surprise, last week, when I read a report announcing that researchers in Germany had created “the first quick test for establishing an individual’s risk intelligence.” After all, I created an online risk intelligence test back in 2009, and I was simply following in the tracks of many researchers before me. What was so special about the new test from Germany, I wondered.

The answer, as I soon found out, is … nothing! In fact the test doesn’t measure risk intelligence at all. To be fair, the authors of the test do not pretend that it does. They claim, instead, that the test measures what they call “risk literacy.” It seems the journalists used poetic license when they described it as a risk intelligence test.

The test is certainly quick. There are only four questions, and it takes only a couple of minutes to answer them. But the questions are the same old probability puzzles that have been the mainstay of books about risk for decades.

For example, one of the questions is as follows:

Out of 1,000 people in a small town 500 are members of a choir. Out of these 500 members in a choir 100 are men. Out of the 500 inhabitants that are not in a choir 300 are men.

 What is the probability that a randomly drawn man is a member of the choir?
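
For what it is worth, the puzzle needs nothing more than counting the men, as this tiny sketch of the arithmetic (using the numbers from the question above) shows:

```python
# 100 men in the choir and 300 men outside it make 400 men in total.
men_in_choir, men_not_in_choir = 100, 300
p_choir_given_man = men_in_choir / (men_in_choir + men_not_in_choir)
print(p_choir_given_man)  # 0.25, i.e. a 25% chance
```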

Many books that purport to help people think more clearly about risk focus on such analytical puzzles. But although these puzzles can be fun to explore and their solutions are often pleasingly counterintuitive, mastering probability theory is neither necessary nor sufficient for risk intelligence. We know it is not necessary because there are people who have very high risk intelligence yet have never been acquainted with the probability calculus. And mastering probability theory is not sufficient for risk intelligence either, as is demonstrated by the existence of nerds who can crunch numbers effortlessly yet show no flair for estimating probabilities or for judging the reliability of their predictions.

Risk intelligence is not about solving probability puzzles; it is about how to make decisions when your knowledge is uncertain. Outside of some highly structured risk-taking activities, such as are found in casinos and financial markets, dealing with uncertainty is a more useful skill than probability analysis. It is also much easier to learn, primarily because it depends on common sense and simple logic rather than abstract mathematics.

It is depressing to find people still confusing these two things. The journalists who have been waxing lyrical about the German test would do well to read Agatha Christie. Fifty years ago, in The Mirror Crack’d from Side to Side, she showed her disdain for the kind of probability puzzles that the German test regurgitates, when she had Dr Haydock complain: “I can see looming ahead one of those terrible exercises in probability where six men have white hats and six men have black hats and you have to work it out by mathematics how likely it is that the hats will get mixed up and in what proportion. If you start thinking about things like that, you would go round the bend. Let me assure you of that!”

Risk intelligence in Baalbeck

Last week I went to see the Roman ruins in Baalbeck, Lebanon. I had wanted to visit Baalbeck ever since I arrived in Beirut two months ago, but had hesitated about going because I’d heard that the security situation there was dicey. Last year, seven Estonians were kidnapped while cycling near Baalbeck, in the Bekaa Valley. Last month, the US Embassy in Lebanon sent an emergency message warning Westerners not to travel to the Baalbeck area because of clashes between the Lebanese authorities and local criminal groups. The latest clash erupted only last week, when three Lebanese Army soldiers were wounded during a shootout early Tuesday in the town of Baalbeck itself.

Browsing online, however, I came across various travellers’ bulletin boards that told a slightly different story. One tourist noted, on 14 March, that they were with some friends in the Baalbeck ruins early the previous Sunday morning, when suddenly they heard lots of automatic arms fire nearby, and also “heard what sounded like something whizzing across the sky above us, which was followed by a few very large explosions.” However, it seemed to start up and die off quickly and all at once, so they assumed it was a military drill of some sort. The travellers didn’t seem that scared.

It’s important to take reporting bias into account when reading newspapers and searching online for security information. Journalists don’t file reports saying “nothing happening round here,” and tourists are far more likely to post something online when they hear gunshots than when they don’t. So it’s hardly surprising that a Google search for “Baalbeck warnings dangers” will turn up a bunch of scary stories.

One way of getting a better idea of the security situation somewhere new is simply to ask people who’ve been there recently. This way, you avoid giving too much weight to the self-selecting sample of those people who go to the trouble of writing about stuff. So last week I asked everyone I knew in Beirut if they had been to Baalbeck recently, and whether they thought it was dangerous. The overwhelming consensus was that it was fine.

Weighing up all the evidence I had gained from newspapers, online reports, and gossip, and taking into account the various biases that might distort these sources, I tentatively concluded that my chances of being shot or kidnapped in Baalbeck were minimal – low enough to constitute an acceptable risk. So my sister and I headed out there in one of the rickety minibuses that are the cheapest way of getting around in Lebanon, and for a few hours the following morning, we had the ruins to ourselves.