SXSW: Stalin and the Scope-Severity Paradox
Recently, someone pointed me to a piece by Ben Goldacre on a paper by Loran Nordgren and Mary McDonnell. They looked at the link between the punishments handed out for crimes against multiple victims and the number of victims affected. You would expect that the more people affected by a crime, the more severe the punishment, right? What they found was extraordinary – the higher the number of victims, the less harsh the punishment. They gave this the catchy title of ‘the scope-severity paradox’.
Similarly, in 1992 a study for Exxon looked at the link between disasters and the financial commitment to cleaning them up. It has a less catchy title – ‘Measuring Nonuse Damages Using Contingent Valuation: An Experimental Evaluation of Accuracy’. In one experiment, researchers asked people how much they would pay to clean up 2,000, 20,000 or 200,000 oil-affected sea birds. The answers averaged out at $80, $78 and $88, respectively. It would appear that the number of victims had no effect on the compensation people expected to pay. This is termed ‘scope-insensitivity’.
Stalin famously said (or maybe didn’t; it’s a bit murky) that one death is a tragedy, but a million is just a statistic. As monstrous as the thought is, it seems he was right.
Big Data, in name and nature, deals only with scope. It is the amalgamation of either huge datasets or multiple sources of data with the expectation we’ll be led to better understand, better predict and better serve people.
The problem is, the bigger the data, the further we move from people. It’s no coincidence that those organisations with the biggest data tend to make the biggest missteps. From Facebook’s terrible experience with Beacon, to Uber’s questionable treatment of almost everyone, to governments’ even worse invasions of privacy, it seems clear that the more access to data about people an organisation has, the less they seem to actually understand – or care – about people.
So how do we redress the balance?
This isn’t an argument against big data. Making assumptions based on what a single person said in a focus group one time isn’t a better way of doing things. I’m asking for balance.
First, let’s rehabilitate the focus group. A well-run session with an experienced moderator – whether it’s a focus group, a user-feedback session or usability testing – is a powerful weapon.
Secondly, we need to go out and talk to people. Next time you get a taxi that isn’t an Uber, ask the driver what he thinks of Uber. (Then duck). Ask a barman what he thinks of your beer. Ask your mother how she likes that iPhone. Ask your dad how he likes to wash up. And if you don’t like their answer, don’t inform them that they’re not even in the target audience and carry on regardless.
Unilever utilises a great tool called ‘people immersion’. It goes beyond questioning someone about their brand perceptions, instead choosing to have a lengthy, intimate conversation in someone’s living room to gain a true understanding of their ambitions and aspirations.
Thirdly, bring customers into the business. LEGO has fans of the brand embedded in the business through ideas.lego.com, where ideas are submitted and fans vote on which product goes into production. This is so rare that I struggle to find another example – except a gambling company I once worked with that recruited anyone who won too much money.
Finally, trust artists. John Lennon said: “My role in society, or any artist's or poet's role, is to try and express what we all feel. Not to tell people how to feel. Not as a preacher, not as a leader, but as a reflection of us all.”
Lennon didn’t rely too much on data to achieve that goal.
Martin Harrison is Planning Director at Huge. His session at SXSW Interactive, ‘The Empathy Gap: Why Stalin Nailed Big Data’, is set to take place in JW Marriott, Salon 4 on Sunday, March 15, 11:00AM - 12:00PM.