Government ‘loses billions’ to bad data policy

By Cassie Chambers

Inefficient data policy is costing the government billions, a think tank has claimed.

Policy that does not use existing data to its full potential is causing the government to forgo £33 billion a year in savings, according to a report by Policy Exchange.

"Across the public sector, extraordinary quantities of data are amassed in the course of running public services – from managing welfare payments and the NHS, through to issuing passports and driving licences," said report author Chris Yiu.

"Finding ways to share or link this data together has the potential to save time for citizens and money for taxpayers," he continued.

To improve the government's use of data, the report calls for the creation of a 'Data Force' unit to examine information across departments and identify potential savings.

The new unit could, for example, analyse data on airport queues in real time to anticipate bottlenecks and ensure personnel were efficiently allocated.

Other examples cited in the report include a move towards a 'virtual census' based on combining existing data sets and increased use of data to identify citizens who do not pay taxes.

Yet the report also acknowledged a trade-off between increased data mining and the desire of individuals to avoid "being tracked".

"The government will need the capability to conduct analytics effectively, and the courage to pursue this agenda with integrity," said Mr Yiu.

He added: "This is only partly about cutting-edge data science skills. Just as important – if not more so – is ensuring its leaders are literate in the scientific method and confident combining big data with sound judgment."