Collaborating to measure digital transformation: Sharing DIAL’s draft Digital Transformation Indicator Library for consultation and comment

DIAL is excited to share a draft of our Digital Transformation Indicator Library for comment and consultation. The Indicator Library is a key resource for our work to accelerate country demand for digital transformation and responsible data use. The draft is in a Google Sheet and you can access and provide feedback on it here. Read on to learn more about the Indicator Library, what it means to DIAL and hopefully to the wider digital ecosystem, and how you can help us to evolve it.

In October 2020, DIAL and Smart Africa published a Listening Study confirming that countries are actively prioritizing national digital transformation in order to achieve their broader strategic objectives. That same month, we also published a shorter Leadership Brief that calls for global development actors to get behind national and regional strategies and adopt a whole-of-government approach to supporting national digital transformation.

National Digital Transformation: The economic and societal effects of digitalization as it disrupts and reinvents innovative domains across the economy and society of a country, including government institutions.

In line with our philosophy of reusing and improving upon existing knowledge, we built on this research by compiling indicators from across the existing ecosystem into a Digital Transformation Indicator Library. The Indicator Library includes a preliminary list of what we believe are the highest-quality, data-backed indicators, drawn from 2,000+ data points across more than fifty digital transformation indices, assessments, and evaluations. Our goal is to use it to design an assessment during our upcoming country support in Sierra Leone, and to build on and evolve it over time.

Many of the frameworks in the Indicator Library are difficult to neatly categorize as they employ a wide range of methodologies to collect data and cut across several broad areas of evaluation. That said, some of the different types of applications of frameworks we have found through our research include:

  1. Diagnostic Tools: Diagnostic tools – including those from Pathways for Prosperity, UNCDF, UNDP, and the World Bank – are concerned with the replicability and usability of the assessment, and are supported by a significant body of publicly available evidence. Because they are intended to be repeated across many countries, knowledge sharing between organizations that conduct these assessments is critical to advancing progress on national digital transformation agendas and guaranteeing efficient use of resources. We believe more work is needed to transparently share the methodologies, insights, and data from diagnostic tools for the benefit of low- and middle-income countries.
  2. Indices: Indices and surveys – including the GSMA Mobile Connectivity Index, the Network Readiness Index, and the Economist Intelligence Unit’s Inclusive Internet Index – gather data in a large number of countries on a regular basis. These indices have a strong conceptual framework and add value by organizing data persuasively and using transparent methodologies. However, the underlying data is rarely publicly available and is often translated into composite scores, making evaluation and strategic planning difficult unless they are done by the organization hosting the index.
  3. Country Reviews: Country reviews – conducted by organizations such as USAID, UNCTAD, and Microsoft – largely focus on assessing a country’s digital landscape. These reviews tailor their analyses to the specific country rather than prioritizing replicability and quantitative data collection, which often limits their reusability. Like diagnostic tools, they can sometimes also be used to inform strategic planning.
  4. Regional Strategies: Regional strategies and blueprints, such as the African Union’s Digital Transformation Strategy for Africa and Smart Africa’s Digital Economy Blueprint, provide valuable information about transnational priorities enshrined in continental strategies. However, these regional strategies lack methodologies for collecting data on indicators, which can limit their usefulness for monitoring countries’ progress toward national digital transformation.

We see these assessments as public goods that help produce the evidence countries need to implement national digital transformation and monitor their progress toward it. Nevertheless, collecting this information has been challenging because the tools and methodologies that inform assessments differ in their focus and scope. With respect to focus, some narrowly evaluate one particular sector or domain – such as e-trade, digital government, or digital innovation – while others take a broader approach to digital transformation, digital ecosystems, or the digital economy. In terms of scope, some assess one country at a time, some include indicators for an entire region, and some aim to assess many countries on an ongoing basis. Additionally, assessments are sometimes not published and do not always provide their source data.

Despite these challenges, we hope to use the Indicator Library not only for our own efforts, but to facilitate collaboration between organizations that conduct and/or finance assessments. We must work together to amplify each other’s work, improve our understanding of the different types of assessments and indicators that are available, and build alignment within the digital development ecosystem. With more collaborative effort and shared inputs, we believe that global development actors can better support national governments in implementing their digital agendas. Moreover, such collaboration should help decrease the upfront transaction costs of financing assessments, as well as improve the measurement of risk.

Please note that what we have chosen to release today is a preliminary, prioritized list of indicators intended to convey the structure and format of our Indicator Library. If you are interested in viewing the full list of indicators, please reach out and we would be happy to share it with you and/or update it to reflect data from your organization. The larger effort behind this list is ongoing: we continue to clean the database of 2,000+ indicators and grade each one against the RACER criteria (Relevant, Accepted, Credible, Easy, Robust). Over time, some indicators may be reorganized into different focus areas, variables, and categories as our understanding of the groupings improves.

This is an ongoing effort, and we need your help to improve our Indicator Library. For example, we welcome recommendations on how to improve the quality, definitions, and sourcing of the raw data inputs for the indicators we have prioritized, especially if you have worked on one of these assessments. We also welcome feedback on our initial mapping of those indicators to categories, variables, and focus areas, though this is still very much a work in progress.

You can provide feedback by making comments in the Indicator Library itself or by reaching out directly to Mary Jo Kochendorfer, Director for Policy and Research at DIAL (mkochendorfer@digitalimpactalliance.org). If you have detailed feedback or would like to start a larger discussion about providing indicators or data, viewing the full list, or aligning our efforts, please get in touch. We look forward to continuing the conversation.