Why the National Data Library could be a quiet revolution

Over the past decade, the UK government has become much better at building streamlined services. The shift from paper forms to digital services, from fragmented local processes to GOV.UK, has shown what happens when we design with users in mind: people can renew a passport, register a business, or apply for support in minutes rather than days or weeks. Each service stands as proof that public services can be simple, user-centred and digital-first.

But citizens’ lives aren’t divided neatly into departmental silos, services or policies. A family applying for childcare support will also touch benefits, housing or health services. 

Making these services work together depends on the underlying data - how it’s collected, shared and structured across departments. As the recent GDS API discovery showed, without shared standards and infrastructure, each organisation ends up solving the same data problems in isolation. The challenge now isn’t streamlining individual services - it’s connecting them, weaving together data, rules and processes so government feels like a single, joined-up system.

The same applies to the work of civil servants: linking policy ambitions and operational data across services reduces duplication, reporting burdens, and the need to re-key or hunt for information. Too often, policy is set at departmental level while delivery sits with agencies or arm’s-length bodies, making it hard to see how intent translates into outcomes.

Fixing the plumbing, not just the pipes

Over the past five years, many of the Discoveries, Alphas and Betas we have collectively worked on at Public Digital across government departments and arm's-length bodies in the UK and globally have had a distinct data flavour. Designers, who were once focused on digitising and streamlining services, are now working closely with data teams to ensure that the data systems behind services are as user-centred and robust as the services themselves, tackling complex challenges like architecture, metadata, standards, access, reuse, interoperability and ethics.

Most interventions, however, still take place at the service, platform or departmental level. But because policies, services and platforms often span multiple departments, the real challenge is fixing the plumbing across the whole system. This means creating shared standards, structures and ways of working that enable government to act as one coherent system, rather than a collection of silos.

Running and maintaining data services comes with both economic and governance challenges. Every integration, API and dataset carries ongoing costs, and across government different models have emerged - from paywalls to annual fees and API charges. Some datasets are also constrained by external agreements. For instance, HMLR’s geospatial data is subject to Royal Mail and Ordnance Survey obligations, which limit how it can be used, shared or repurposed. Even when departments want to link or leverage this data for public services, these restrictions slow down innovation and interoperability.

Too often, projects stall because the economics and governance of data sharing haven’t been fully addressed - who pays, how it’s maintained, and what incentives suppliers have to support interoperability. Without a sustainable model and clear control over key datasets, the government risks paying more while retaining less sovereignty over its own systems.

This isn’t only about efficiency. Too much government data is still locked up in outsourced supplier systems, which are costly and slow to access. If departments can’t use their own data quickly, they can’t govern effectively. Shared infrastructure is about sovereignty as much as interoperability - giving government control over its data. And that control underpins the safe and effective use of emerging technologies like AI. 

As we’ve written about before, the benefits of AI can’t be realised without a clear framework for how data is collected, shared and governed. At the same time, the risk grows that public data is shaped by private interests rather than stewarded for the public good.

Introducing the National Data Library (NDL)

The NDL, announced last year by the Department for Science, Innovation and Technology, is still a developing concept. However, provided it is positioned not as a single database or a new piece of IT kit, but as a shift in how we think about information, it has the potential to be transformative.

Instead of every department maintaining its own siloed data, the NDL offers the possibility of creating shared infrastructure: common standards, trusted governance, and ethical pathways for linking data across domains. Just as libraries collect, organise and preserve knowledge for public use, the NDL could provide the backbone for how the government collects, safeguards and reuses data for the public good.

The work starts small, with modest but foundational steps:

  • Build cross-department teams to test linking projects safely in controlled environments

  • Embed digital-ready legislation so that new rules are drafted with data use in mind from the outset

  • Agree on common identifiers

  • Publish metadata so people know what exists

These are not headline-grabbing reforms. Instead, they're the plumbing: the hidden infrastructure. But without them, the promise of a National Data Library will remain just that: a promise.
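To make the case for common identifiers concrete, here is a minimal sketch of linking two departmental datasets. The datasets, field names and `citizen_id` key are all hypothetical - the point is simply that the join is trivial when both departments use the same identifier, and a fuzzy-matching project when they don't.

```python
# Two hypothetical departmental datasets sharing a common identifier.
childcare_applications = [
    {"citizen_id": "C-1001", "status": "approved"},
    {"citizen_id": "C-1002", "status": "pending"},
]

housing_records = [
    {"citizen_id": "C-1001", "tenure": "social rent"},
    {"citizen_id": "C-1003", "tenure": "private rent"},
]

def link_on_identifier(left, right, key):
    """Join two datasets on a shared identifier - only possible
    because both departments agreed on the same key."""
    index = {row[key]: row for row in right}
    return [
        {**row, **index[row[key]]}
        for row in left
        if row[key] in index
    ]

linked = link_on_identifier(childcare_applications, housing_records, "citizen_id")
print(linked)
# One citizen's records now line up across two services.
```

Without an agreed identifier, the same linkage requires matching on names, dates of birth or addresses, with all the error and governance overhead that brings.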

Taking a test and learn approach

The emerging strategy for the NDL proposes a test and learn approach, exploring specific service use cases to inform the overall design.

From Public Digital’s work applying test and learn across the UK government, we have identified three key things we hope the NDL can achieve with this approach:

  • Cross-policy alignment: Too often, the same kind of data is collected differently across policy areas. Departments disagree on definitions, what should be collected, or the purpose of collection. The NDL can test how to reconcile these differences, building shared standards so data can be reused across services without losing the nuance each domain needs.

  • Platform unification: Core platforms like One Login, Notify, and Pay are becoming the backbone of digital government. But each risks building its own data definitions and standards in isolation. The NDL can show how shared infrastructure and identifiers allow these platforms to work together, so users experience them as a coherent whole rather than separate tools.

  • Interoperability in practice: It’s not enough to collect or even share data; it has to work together. Today, datasets are often locked in different formats, missing metadata, or hampered by incompatible permissions. The NDL can test how to make interoperability real: building the standards, connectors and trust frameworks that let data flow across boundaries without losing meaning or security.
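The interoperability point above can be sketched as a catalogue check. This is an illustrative convention, not a real NDL schema: the record fields, the `hmlr/title-register` identifier and the open-format list are all assumptions, used only to show that "published metadata plus a shared key" is a checkable property, not a vague aspiration.

```python
from dataclasses import dataclass, field

@dataclass
class DatasetRecord:
    """A minimal published metadata record (illustrative fields only)."""
    identifier: str                 # stable ID others can reference
    title: str
    owner: str                      # accountable department or body
    fmt: str                        # serialisation format, e.g. "csv"
    licence: str                    # reuse terms, surfaced up front
    keys: list = field(default_factory=list)  # common identifiers present

OPEN_FORMATS = {"csv", "json"}

def interoperable(record: DatasetRecord, shared_key: str) -> bool:
    """Usable across boundaries only if the format is open and the
    dataset carries the shared identifier services join on."""
    return record.fmt in OPEN_FORMATS and shared_key in record.keys

rec = DatasetRecord(
    identifier="hmlr/title-register",   # hypothetical catalogue entry
    title="Title register extract",
    owner="HM Land Registry",
    fmt="csv",
    licence="restricted",               # external obligations limit reuse
    keys=["uprn"],
)
print(interoperable(rec, "uprn"))
```

Even this toy check makes the failure modes from the bullet above visible: a proprietary format or a missing shared key fails fast, before anyone builds on the data.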

These are big, chunky problems. The way through is to start small: pick use cases that cut across these challenges, deliver something tangible, and learn from it. Each experiment generates insights that can be scaled and adapted - building shared infrastructure step by step, without needing a DMA, a full strategy, or a top-down mandate. Foundations are laid not through one big-bang reform, but through a series of small, practical wins that add up to something transformative.

A quiet revolution

If the last decade was about streamlining services, the next decade must be about interconnecting them.

Done well, the National Data Library could be the quiet revolution that makes that possible.
