A prototype that compares coronavirus response sites

    Since the start of the crisis, and through our work with teams around the world, we’ve kept a close eye on government response websites.

    We’ve paid particular attention to:

    1. How fast they are. Fast sites avoid frustration and reassure users – particularly important when people are searching for information in a stressful situation. They work well on low-cost devices and don’t cost citizens a fortune in data to access.
    2. The reading age of the content. Clearly written, concise and actionable content opens information up to more people – particularly important when concentration is low and emotions are high. [We’ve removed this while we do a bit more thinking about whether we can make an automated reading score a helpful and fair metric.]
    3. How accessible they are. Sites that work well for people with accessibility needs work better for everyone, across devices, connection qualities and more. (A rough sketch of how checks like these can be automated follows this list.)
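
    To make this concrete, here’s a minimal sketch of the kind of baseline speed check involved – assuming Python’s requests library, with placeholder URLs – rather than our actual measurement pipeline. A real audit of performance and accessibility would use a dedicated tool such as Lighthouse or axe, not a simple timed fetch.

    ```python
    # A minimal sketch of a baseline speed check, not the prototype's actual
    # methodology. It times a full page fetch and records the downloaded body
    # size (after decompression), a rough proxy for how much data a visit costs.
    import time

    import requests

    SITES = [
        "https://example.gov/coronavirus",  # placeholder URLs, not our real list
        "https://example.org/covid-19",
    ]

    for url in SITES:
        start = time.monotonic()
        response = requests.get(url, timeout=30)
        elapsed = time.monotonic() - start
        size_kb = len(response.content) / 1024  # decompressed body size in KB
        print(f"{url}: {elapsed:.2f}s to fetch, {size_kb:.0f} KB of HTML")
    ```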

    These metrics represent a baseline, and sites that don’t get these things right are failing their users. However, you need to get into the details, analyse decision-making and look for feedback loops to find out whether you’re really meeting user needs.

    Leaders, how does your site compare?

    We’ve been tracking those metrics across approximately 90 coronavirus response sites, and to help make sense of it all we’ve made a prototype tool – covidsites.public.digital – from our analysis.

    The aim is to present data simply, in a way that makes it easy to compare with data from other response sites. In the past, we’ve seen leaders respond and give the go-ahead to fix problems really quickly when we’ve ‘shown the thing’ and presented information in a similar way.

    And if you’re a leader, you can use these metrics to get a sense of whether your site meets a baseline, and to make decisions about whether the team responsible for it is equipped with the right understanding, skills and focus. Progressive parts of the web industry have already recognised that if you want to reach users (customers) who don’t have the latest devices and limitless bandwidth, you need to do the work to keep things simple. It’s not hard, but it does require understanding and focus.

    As David Eaves of Harvard Kennedy School said at our joint digital services convening: “teams went into the crisis with the digital team they had, not necessarily the digital team they wanted.” That’s reflected in these sites. This work helps illustrate whether there’s a gap there, not to judge but to see where improvement is needed.

    Feedback on the prototype

    covidsites.public.digital is very much a prototype – we can’t promise rigorous consistency in the data, we’re running it in time snatched from other activities, some of these areas are complex to quantify, and there are definitely gaps – but it shows how the 90 or so sites we’ve been analysing fare.

    The prototype is regularly updated, because the web is a living medium (these sites should be changing regularly) and because it’s easy to do. The code we’re using to power it and the data we’re collecting are also public, to make sure that our methodology is completely transparent.
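
    Because both the code and the data are open, anyone can rerun a comparison themselves. Here’s a purely hypothetical sketch of reusing the published results – the file name and column names below are assumptions, and the real schema lives in the public repository:

    ```python
    # Hypothetical sketch of reusing the published data. The file name and the
    # "country" and "load_time_seconds" columns are assumptions, not the
    # prototype's real schema; check the public repository for that.
    import csv

    with open("results.csv", newline="", encoding="utf-8") as f:
        rows = list(csv.DictReader(f))

    # Rank the sites fastest-first by measured load time.
    rows.sort(key=lambda row: float(row["load_time_seconds"]))

    for row in rows[:10]:
        print(f"{row['country']}: {row['load_time_seconds']}s")
    ```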

    And because making things open makes things better, we’re putting it out there to get some feedback, to see what everyone makes of it and whether there are ways we can make it more useful. The best way to send us feedback is by email to [email protected].

    We’ll keep it going for a few months and then retire it gracefully.

    Written by

    public digital
