
From migration to railways, how bad data infiltrated British politics | Georgina Sturge


Modern governments rely on numbers. They are the lifeblood of departments, used to judge the success or failure of policies. Politicians use them to legitimise their views and ideas and to scrutinise, expose and attack the other side.

While in the past it might have been enough for public policy to be justified on the basis of “because I say so”, governments can no longer rely on blind faith. They are expected, even required, to back their policies with hard evidence – the unease that greeted the Liz Truss government’s mini-budget is a case in point – and we tend to view numbers as the most solid form of evidence there is.

The trouble is that numbers can’t always be trusted, even when they come from official sources. Despite the intention to act on good evidence, governments of all stripes have been continually led towards disaster by the problem of what I call “bad data” – official statistics that are patchy and inaccurate.

Sometimes the dismal state of our data is the fault of under-resourcing and a lack of attention to counting what should be counted. For decades, immigration statistics were based purely on a survey of people arriving at and departing from UK air, sea and rail ports. Millions of passengers enter and leave the UK each year, and picking migrants out of this enormous haystack has in part been a matter of luck. In the early 2010s, for example, these figures appeared to show an alarming situation in which half of all international students were overstaying their visas.

Under Theresa May, the Home Office launched a multi-pronged campaign to identify illegal immigrants, which included closing bogus colleges and introducing right-to-work and right-to-rent checks. New statistics in 2017 concluded the original overstaying estimate for students had simply been wrong, a fault of failing to count people properly – and a sign of how unreliable migration statistics were as a whole. But it was too late for one group that fell on the wrong side of the so-called hostile environment policies: people who had come legally from Commonwealth countries in the postwar era but couldn’t provide enough proof of this when questioned. These victims of the Windrush scandal, uncovered by the Guardian, suffered multiple injustices thanks to an imaginary foe in the numbers and a failure of government record-keeping.

In the mid-2000s, the Labour government was keen to be on the front foot when it came to switching the EU’s farming subsidy from one based on what farmers produced to one based on how much land was capable of being farmed. As it turned out, the patchy state of our land records meant the government had essentially no idea how much land this applied to and, when the new system was launched, the civil service was upended by an avalanche of unanticipated claims. Britain was fined by the EU for the delay to payments caused by this backlog, while farmers themselves faced bankruptcy and, in some terrible cases, took their own lives.

Other times, numbers can mislead because there isn’t necessarily a right or wrong way of counting something, so we end up with a narrow view based on what we think is important at one point in time. Debates about whether prison “works”, whether grammar schools are a good idea, and even whether crime and poverty are going up or down have been going on for decades – and will go on for decades more unless we find better, agreed-upon ways of measuring these phenomena. Data will tend to offer us solutions based on what we decided was important enough to count and measure in the first place.

The people of Ilfracombe, Devon, know this. In the 1960s their railway station was closed, spelling an end to the harbour town’s tourism industry. This was thanks to a sweeping programme of cuts to the railways on the advice of British Rail chair Richard Beeching, whose main criterion for deciding a railway line’s usefulness was the average cost per passenger, per mile, over the course of a year. The trouble was that a yearly average was a terrible reflection of the importance of the railway to summer holiday destinations such as Ilfracombe, which had substantial railway traffic for only a few months of the year.
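To make the arithmetic concrete, here is a toy sketch of the problem. The figures are invented purely for illustration and are not drawn from Beeching's report; they simply show how a single annual cost-per-passenger-mile figure can bury the months in which a seasonal line earns its keep.

```python
# Toy illustration: how an annual average cost per passenger-mile can hide
# a line's summer importance. All figures are invented for illustration,
# and running costs are assumed flat across the year for simplicity.

monthly_cost = 10_000  # hypothetical running cost per month, in pounds
passenger_miles = {
    month: 200_000 if month in ("Jul", "Aug") else 5_000
    for month in ("Jan", "Feb", "Mar", "Apr", "May", "Jun",
                  "Jul", "Aug", "Sep", "Oct", "Nov", "Dec")
}

# Cost per passenger-mile, month by month (in pence)
for month, pm in passenger_miles.items():
    print(f"{month}: {100 * monthly_cost / pm:.1f}p per passenger-mile")

# The single yearly figure an annual-average criterion would produce
annual_cost = 12 * monthly_cost
annual_pm = sum(passenger_miles.values())
print(f"Annual average: {100 * annual_cost / annual_pm:.1f}p per passenger-mile")
```

In this made-up example the annual figure sits several times above the summer one: judged on the yearly average alone the line looks uneconomic, yet it is cheap per passenger in exactly the months a resort town like Ilfracombe depended on it.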

Politicians are usually not experts in statistical modelling, which puts them somewhat at the mercy of academics and economists who can themselves promote their ideas with far more confidence than is warranted. In one particularly egregious case, a key economic argument of the 2010 Con-Lib coalition government’s austerity agenda was revealed to have originated in a mistake in an Excel spreadsheet. The economists Carmen Reinhart and Kenneth Rogoff had been recommending lowering the debt-to-GDP ratio, armed with a study in which they claimed to have found that debt above 90% of GDP was bad for growth. Years later, a PhD student discovered that this conclusion held only because the authors had failed to include the last five rows of their data. The authors admitted their mistake – but not before austerity had become a cornerstone of UK economic policy.
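The mechanics of that kind of slip are mundane. The sketch below does not use Reinhart and Rogoff's data or spreadsheet; it simply shows, with invented growth figures, how an averaging formula whose range stops a few rows short of the end of a column quietly changes the answer.

```python
# Toy illustration of a range error when averaging a column of figures.
# The numbers are invented; this is not the Reinhart-Rogoff dataset.

growth_rates = [2.1, 1.8, -0.3, 2.5, 1.2, 0.9, 3.0, 2.2, 1.5, 2.8]

# The intended calculation: a mean over every row in the column
full_mean = sum(growth_rates) / len(growth_rates)

# An averaging range that stops five rows short of the end of the column
# silently drops those entries from the calculation
truncated = growth_rates[:-5]
truncated_mean = sum(truncated) / len(truncated)

print(f"Mean over all rows:          {full_mean:.2f}%")
print(f"Mean missing the last 5 rows: {truncated_mean:.2f}%")
```

In a spreadsheet the error is easier to miss than in code, because a formula with a truncated range still returns a perfectly plausible-looking number.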

Bad data is not something niche or technical; it has real-world costs that can be very serious indeed, no matter which party is in power. The issues that are most important to people are, worryingly, the ones on which we have the worst data: crime, immigration, income, benefits, unemployment, poverty and equality.

Some of our architecture for collecting data is just plain under-resourced and in need of an overhaul, but governments tend to see fixing this problem as a hard sell to the taxpayer. A shift in our political culture, so that uncertainty is no longer treated as a dirty word, would also go a long way. Until then, we the public can keep up the pressure by asking questions, refusing to take figures at face value, and demanding explanations. Numbers hold enormous power, but in the end we must remember that we govern them – not the other way round.

Georgina Sturge is a statistician in the House of Commons Library, and the author of Bad Data: How Governments, Politicians and the Rest of Us Get Misled by Numbers

