Federal Data and Digital Maturity

Agencies Assess Where They Stand

The Partnership for Public Service is a nonpartisan, nonprofit organization that strives to build a better government and a stronger democracy.
BCG is a global management consulting firm dedicated to advising clients in the private, public and not-for-profit sectors. We partner with our clients to identify their highest-value opportunities, address their challenges and transform their enterprises so that they achieve sustainable competitive advantage, build more capable organizations and secure lasting results. In our work with the federal government, BCG is recognized for bringing commercial insights and best practices to our public sector clients. To learn more, visit bcg.com.

Introduction

A federal scientist holds up a tablet during a demonstration.

Photo credit: Robert Turtil/Department of Veterans Affairs

Federal agencies are increasingly turning to digital technologies and data to improve their decision-making and service delivery.1 Examples include the Department of the Treasury analyzing federal spending and demographic data to determine whether COVID relief funds were being equitably distributed,2 and the Department of Veterans Affairs using artificial intelligence to identify veterans most at risk of suicide.3

It is a promising development that could lead to a more efficient and effective government that produces better results for the public.

In 2019, the Office of Management and Budget identified actions for agencies to take over the next 10 years to use data to the fullest extent possible. The actions are part of OMB’s Federal Data Strategy, which the agency released shortly after the Foundations for Evidence-Based Policymaking Act was signed into law.

One Federal Data Strategy requirement calls for agencies to conduct a data maturity assessment to analyze “all aspects of agency policies, procedures, and operations related to data and data infrastructure.” The areas for agencies to assess include data-related governance, management, culture, analytics, systems and tools, as well as staff skills and capacity, resource capacity, and compliance with law and policy.4 After completing the assessment, agencies should have a better understanding of their strengths and weaknesses, and how they can allocate their limited resources to boost their data maturity.5

While the Federal Data Strategy provides guidance on what to include in a data maturity assessment, agencies can customize the assessment to fit their needs. For example, an agency that wants to understand how effectively it is using artificial intelligence to analyze data might design an assessment that explores that topic in depth. Other agencies might emphasize topics such as the quality of data that employees have at their disposal or how data is being managed.

To help federal agencies fulfill the data maturity assessment requirement or build on the results from a completed maturity assessment, the Partnership for Public Service and Boston Consulting Group (BCG) last year launched the Federal Data and Digital Maturity Index, or FDDMI—a survey that assesses data and digital maturity.

Respondents are asked to evaluate their agency on numerous dimensions, including the strength of its data and digital strategy; its use of data and technology to simplify processes, improve how work gets done and serve its customers better; and leadership’s advocacy for evidence-building and data-driven decision-making. Because human capital is integral to every agency’s data and digital maturity, the survey also includes a set of questions that measures how effectively agencies recruit, hire, develop, engage and retain a high-caliber data and digital workforce.

The Partnership and BCG administered the survey, collected the responses, and presented the results to participating agencies. Between September 2021 and February 2022, six federal agencies completed the inaugural edition of the assessment: the Department of Labor, the Department of the Treasury, the Federal Emergency Management Agency, the Office of Personnel Management, the Small Business Administration and the U.S. Patent and Trademark Office. The Partnership and BCG intend to survey additional agencies the next time the assessment is administered.

This issue brief presents the results from the Federal Data and Digital Maturity Index survey. The results show that government’s data and digital maturity trails the global public and private sectors, but that agencies have big aspirations to improve in the next five years. The issue brief also looks at other ways agencies assess data and digital maturity, to highlight leading practices agencies could adopt as they develop, administer and act on the findings of their own maturity assessment—whether or not it is the FDDMI.

The Federal Data and Digital Maturity Index Survey

Photo credit: Shutterstock

METHODOLOGY

The Federal Data and Digital Maturity Index survey has 29 questions, each addressing a specific topic related to digital technology or data—for example, the use of artificial intelligence. Each question asks respondents to evaluate their agency’s current level of maturity, its target level of maturity in the next five years, and how important the question topic is to their agency, on a scale of 0-100.

An agency’s overall score for both its current and target data and digital maturity is calculated by averaging its scores across the 29 survey questions—with higher scores indicating a greater maturity level.6

The questions—referred to as “dimensions” in the survey—are grouped into eight categories, which represent the building blocks of data and digital maturity.


The score of each building block is calculated by taking the average score of its questions. For a description of what each building block measures, see Appendix II.
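
To make the scoring arithmetic concrete, the sketch below computes building-block and overall scores from per-question ratings and maps the overall score onto the survey's four maturity levels (see footnote 6 and Appendix I). The building-block groupings and the ratings are invented for illustration; only the averaging method, the current-versus-target gap, and the level bands come from the survey.

```python
# A minimal sketch of the FDDMI scoring arithmetic described above.
# The groupings and ratings below are hypothetical; the averaging
# method and the four maturity-level bands come from the survey.
from statistics import mean

# Hypothetical per-question ratings on the 0-100 scale, grouped by
# building block: (current maturity, target maturity in five years).
ratings = {
    "Mission and Strategy": [(45, 80), (40, 78), (41, 82)],
    "Data and Analytics":   [(35, 80), (38, 79), (36, 81)],
    "Reimagine Government": [(30, 66), (29, 67)],
}

def level(score: float) -> str:
    """Map a 0-100 score onto the survey's four maturity levels."""
    if score < 25:
        return "starter"
    if score < 50:
        return "literate"
    if score < 75:
        return "performer"
    return "leader"

# A building block's score is the average of its questions' scores.
for block, questions in ratings.items():
    current = mean(c for c, _ in questions)
    target = mean(t for _, t in questions)
    print(f"{block}: current {current:.0f}, target {target:.0f}, "
          f"gap {target - current:.0f}")

# The overall score averages across all questions (29 in the survey).
overall = mean(c for questions in ratings.values() for c, _ in questions)
print(f"Overall: {overall:.0f} ({level(overall)})")
```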

After completing the survey, agencies should come away with an understanding of their current state of data and digital maturity, the areas with the largest gaps between current and target state, and whether the data and digital activities deemed most important are also the most mature.

To gather diverse employee perspectives, participating agencies are encouraged to administer the maturity assessment to a broad set of stakeholders, including individuals from the offices of the chief data officer, chief information officer, chief human capital officer, and administration and management, as well as program offices.

Survey results here reflect a sample of six federal agencies of varying size, with diverse missions.


KEY FINDINGS OF THE FEDERAL DATA AND DIGITAL MATURITY INDEX

Federal data and digital maturity trails the global public and private sectors, but agencies aim to improve dramatically in the next five years.

The overall data and digital maturity score for federal agencies that completed the assessment is 36 out of 100, putting government at a “literate” level of maturity. By comparison, the global public sector maturity score is 11 points higher at 47 out of 100, and the global private sector score is 18 points higher at 54 out of 100.7 Government should aim to meet or exceed the global public and private sector scores, boosting its ability to address this country’s biggest challenges.


Participating federal agencies also registered lower scores than the global public and private sectors on all eight data and digital maturity building blocks.

The gap was most stark in the Reimagine Government category, where government fell behind the public sector by 17 points and the private sector by 24 points. This category includes survey questions that assess how innovative and open to experimentation agencies are when it comes to using data and digital technologies to improve their outcomes, including whether they pursue opportunities to collaborate with and learn from peer agencies. For example, an agency that administers programs to combat childhood poverty could learn more about the problem by supplementing its data with data housed at the departments of Agriculture, Education, Health and Human Services, Labor or other departments and agencies that work on the issue.

Government came closest to matching the global public and private sectors in the Data and Analytics category, trailing the private sector by 14 points and the public sector by just 6 points. Survey questions in the Data and Analytics category assess the strength of an agency’s data and digital culture, exploring whether artificial intelligence is used to its fullest potential, data governance is robust, and agencies strive to share their data with the public. Federal agencies have been compelled to make improvements in this area in recent years by a spate of legislation and executive directives, including the Foundations for Evidence-Based Policymaking Act, the Federal Data Strategy, and the Digital Accountability and Transparency Act.

In addition to evaluating current data and digital maturity, survey respondents were asked to assess how mature their agencies aspire to be in five years. The data reveals lofty goals, with participating agencies registering a target maturity score of 75 out of 100, nearly 40 points higher than their current level of maturity.  


Agencies’ target scores far exceeded their current scores for all eight data and digital maturity building blocks, with the largest gap, at 47 points, in the Modular Technology category. The survey questions in this category assess how closely and collaboratively IT and program offices work together, and whether an agency’s technology is modern and nimble, and not overburdened by legacy systems. The smallest gap between current and target maturity, at 36 points, was in the Reimagine Government category.

Participating agencies scored highest on Mission and Strategy, lowest on Reimagine Government.

Of the eight data and digital maturity building blocks measured by the FDDMI, the six federal agencies scored highest overall on Mission and Strategy, with a maturity score of 42 out of 100. Survey questions in this category assess whether an agency has an ambitious vision for accelerating the use of data and digital technologies, a plan to do so, and a process in place to measure progress. Government’s strength on Mission and Strategy suggests it has a good foundation in place to build its data and digital maturity, but lower scores elsewhere indicate government has yet to fully turn its vision into results.

Participating federal agencies registered the lowest level of maturity on the Reimagine Government building block, with a score of 30 out of 100.


The most important data and digital activities are often the most mature, signaling a strong focus on government’s top priorities.

In addition to evaluating maturity, survey respondents are asked to assess how important the topic of each survey question is to their agency. For example, if asked about cybersecurity, respondents evaluate its maturity as well as its importance to their agency.

Results reveal that data and digital topics deemed to be most important also tend to be the most mature. For example, the Mission and Strategy building block logged the highest current and target maturity scores, and the highest importance score. Conversely, the Reimagine Government building block had the lowest current and target maturity scores, and the lowest importance score.

While maturity and importance do not always align perfectly, they tend to move in tandem. This could suggest that agencies have been successful at identifying and prioritizing the development of areas most important to them.
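
While the survey reports this alignment descriptively, an agency could quantify it in its own results with a rank correlation between importance and maturity scores across the eight building blocks. Below is a minimal sketch using SciPy's spearmanr; the building blocks shown and all scores are invented for illustration.

```python
# A hypothetical illustration of checking whether importance and
# maturity scores "move in tandem" across building blocks, using a
# Spearman rank correlation. The values below are invented; only the
# idea of comparing the two rankings comes from the survey results.
from scipy.stats import spearmanr

blocks = ["Mission and Strategy", "Data and Analytics",
          "Modular Technology", "Digital Talent", "Reimagine Government"]
maturity = [42, 38, 35, 34, 30]    # hypothetical current maturity scores
importance = [86, 83, 85, 75, 70]  # hypothetical importance scores

rho, _ = spearmanr(maturity, importance)
# A coefficient near 1 means the areas rated most important
# are also the most mature.
print(f"Spearman rank correlation: {rho:.2f}")
```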


Improving how technology is used to strengthen internal processes is a top government priority, while findings suggest there is less emphasis on building the data and digital workforce.

Among the scores measuring target maturity and importance, some of government’s highest are concentrated in questions about how technology is used to improve internal processes, including how work gets done.

For example, the Data and Analytics building block, which includes questions about AI, cybersecurity, and using technology to process data, is tied for the largest target maturity score at 80, and has the third largest importance score at 83. The Modular Technology building block has the third largest target maturity score at 79 out of 100, and is tied for the largest importance score at 85.

Conversely, survey respondents suggested there is less ambition to grow and develop the data and digital workforce than other areas, with the Digital Talent building block registering only the fifth largest target maturity and importance scores. Nevertheless, the Digital Talent building block’s target maturity score of 74—coming in at 40 points above its current maturity level—suggests that government aspires to improve.

While government’s interest in making the most of technology to improve its internal processes is laudable, federal agencies should also ensure that advances in technology translate into more seamless and effective delivery of services to the public—a goal of President Biden’s December 2021 executive order.8 Further, agencies must not neglect building a world-class data and digital workforce. While government has lofty ambitions to grow its data and digital maturity in the next five years, its ability to do so will hinge on its talent.

Human Capital

Photo credit: Shutterstock

Human capital is at the core of every agency’s journey to boost its data and digital maturity, and success depends on having employees with the right data and digital skills in the right positions at the right time. Given the importance of human capital, the inaugural edition of the Federal Data and Digital Maturity Index included a separate set of 20 questions to assess how agencies recruit, hire, develop, engage, reward and retain their data and digital workforce. As with the rest of the survey, the goal is to help agencies understand where they are performing well, and where they might allocate their limited resources to improve.

While human capital maturity is key to agencies’ efforts to reach data and digital maturity, the overall score for federal agencies landed at just 28 points out of 100.

Survey questions are grouped into six categories, or building blocks, each reflecting a part of the human capital life cycle. A category score is calculated by taking the average score of its component questions. For a description of each category, see Appendix III.


While maturity scores across the six building blocks are similar, Leadership and Cultural Change received the highest mark at 30 points out of 100. This category evaluates how effectively agencies develop the leadership acumen of their data and digital workforce, promote diversity and inclusion among this group of employees, and ensure that everyone in the organization has at least a basic level of data and digital fluency. The Organizational Transformation category, the building block scoring the lowest at 25 points out of 100, assesses how thoroughly employees with data and digital expertise are involved in an agency’s projects.

Just as federal agencies aspire to grow their data and digital maturity in the next five years, they also have big aspirations for their data and digital workforces. The average target maturity score for human capital was 67 points out of 100—a 39-point increase over the current level. And the target maturity scores for all six human capital building blocks were at least 37 points higher than current levels, according to the survey. Notably, the Organizational Transformation category has the lowest current maturity score, but the highest target score.

Four Keys to a Successful Data and Digital Maturity Assessment: Lessons from Across Government

Photo credit: Partnership for Public Service and the Noun Project

A data and digital maturity assessment, whether it is the Federal Data and Digital Maturity Index or an alternative, can offer valuable information about agencies’ strengths and weaknesses and help them chart a path for growth. Yet these assessments will only be useful if agencies design and administer them effectively and act on the results. To help agencies get the most value from an assessment, the Partnership and BCG interviewed federal leaders at several agencies that conducted their own maturity assessments to learn about their best practices. They identified four important components for success: invest ample time in the design phase; boost participation by fostering buy-in; pilot the assessment before rolling it out widely; and demonstrate value by acting on the results.


INVEST AMPLE TIME IN THE DESIGN PHASE

Agencies should carefully consider what they want to learn from their data and digital maturity assessments and design them to elicit that information, according to the leaders we spoke with.

In the design phase, agencies need to consider who should complete the maturity assessment, which topics the assessment should probe, and how to write questions that are clear, not leading, and limited to one issue each. Short-changing the design phase can lead to results an agency is unable to use. “It can be impossible to recover from design flaws,” said Clemencia Cosentino, section head and chief evaluation officer at the National Science Foundation, which runs its own data maturity assessment.

The team in charge of the NSF assessment worked with experts inside and outside the agency to get the design right, including having a contractor administer the assessment. “NSF staff know the agency, what it wants to learn from the assessment, and how to word questions in a way that will make sense to our employees and get us information about what we are trying to measure,” Cosentino said. “Our outside partners brought independence, objectivity and additional expertise.”

To identify the right people to complete the maturity assessment, the NSF team designated an ambassador in each office or directorate, shared selection criteria with those ambassadors, and asked for their help selecting people who fit the bill. “Selection makes all the difference,” Cosentino said. “We needed to make sure we identified the right people.”

While some agencies administer a standard survey with fixed response options that participants complete on their own time, NSF brought groups of people together instead and conducted interviews about the data maturity in the offices where they worked. Cosentino realized it was beneficial to have administrators in the room who could clarify what the assessment’s questions were getting at, since different offices talk about data differently, including the definition of the term itself. The person administering the assessment could ensure that all participants interpreted the questions the same way.


BOOST PARTICIPATION BY FOSTERING BUY-IN

A data and digital maturity assessment will only yield credible findings if enough people complete it. Yet employees are busy, and it can be challenging to convince them to participate, according to leaders we interviewed. Agencies have experimented with a range of techniques to convince the staff that completing the maturity assessment is a worthwhile investment of their time.

At the Department of Education, the Office of the Chief Data Officer strives to ensure the maturity assessment focuses on topics the agency cares about most. “The big thing is to make sure it’s relevant for people,” said Soumya Sathya, a management and program analyst in the organization. “I think that’s hugely important. We don’t want to [come off as], ‘We’re the data folks and we’re assessing other people.’”

To foster better communication and collaboration with headquarters, Education recently designated one person in each office to serve as the “data coordinator.” Data coordinators speak on behalf of all data professionals in their offices, providing headquarters with a single point of contact. The department hopes its streamlined approach will make it easier to learn about each office’s data priorities and challenges, and what resources they need to excel.

Other ways to encourage participation in data and digital maturity assessments include making involvement as easy as possible, marshalling leadership support, emphasizing that the assessment is meant solely to help the agency learn and improve, and sharing the assessment results across the organization.

The assessment team at the Department of Education held an orientation session and office hours for employees taking the department’s data maturity assessment, to ensure they felt prepared to answer the questions. The Nuclear Regulatory Commission invited every employee to take a data maturity survey. But rather than email it and risk that employees would overlook it or procrastinate for too long, the NRC presented a choice of days and times for taking the survey and asked prospective participants to pick one. On that day and time, participating employees signed in to receive and complete the assessment.

Leaning on agency leaders to encourage participation can also be effective, according to people we spoke with. At the Nuclear Regulatory Commission, the assessment team ensured executives knew the survey was coming, what it aimed to accomplish, and why it was important to the agency. With the executives bought in, the assessment team could count on them to encourage their staff to complete the survey.

Finally, breaking down the agencywide results of a maturity assessment and showing each work unit that completed it how it fared is a valuable way to generate buy-in and strong participation. So is treating the assessment as a learning exercise rather than a “gotcha” activity. At the National Science Foundation, the agency’s aggregate results are released to everyone, but each office or directorate that participated also receives the specifics about its own organization. “We incentivized people to complete the assessment,” said the NSF’s Cosentino. “We said, ‘We will tell you how NSF is doing, but we will also tell you how your organization is doing, and you can do with that information what you please. We’re not going to share it with anyone.’” Cosentino credits staff across NSF, whose participation in the assessment generated findings that will help the agency improve.


PILOT THE MATURITY ASSESSMENT BEFORE ROLLING IT OUT WIDELY

Agencies should consider piloting their data and digital maturity assessment with a small sample of the workforce, to determine what changes should be made before rolling the assessment out more widely, according to agency experts we interviewed.

Both the Department of Education and the National Science Foundation did so and received valuable feedback. Participants in the Department of Education’s pilot identified where terminology could be clearer.

The NSF found that the maturity assessment in its pilot took too long to complete, potentially jeopardizing participation. “It took five to seven hours to finish the interviews for one organization, and people said ‘No way. We’re not going to participate,’” Cosentino said. The agency streamlined the assessment so it takes two to two and a half hours to finish.


DEMONSTRATE VALUE BY ACTING ON THE RESULTS

After completing a maturity assessment, agencies should develop a plan for turning the results into action. Completing the assessment “requires a time commitment, and we owe it to the offices that took the assessment to use the results,” said Education’s Sathya. “Otherwise, when we go back next year and say we’re doing it again, they’re not going to be as engaged in the process.”

The State Department used the findings from its data maturity assessment to shape its inaugural enterprise data strategy. To make the data strategy actionable, the Office of Management Strategy and Solutions’ Center for Analytics organizes campaigns every six months during which data is used to tackle one foreign policy and one management priority.

During the campaigns, employees from data and programs offices work together to identify ways they can use data to address those priorities. In the first campaign, the agency focused on two issues: strategic competition with China and improving diversity, equity, inclusion and accessibility in the department’s workforce.

“One option is to say, ‘We have a problem with data governance, so let’s have a working group. Show up to the data standards working group, everybody,’” said Garrett Berntsen, the State Department’s deputy chief data officer. “But we didn’t feel like that’s how you get people jazzed about these things. You get people excited by saying, ‘We’re going to tackle diversity data.’ The same goes for strategic competition with China. People will show up to a working group meeting with a lot of opinions and good energy on that.”

Conclusion

Photo credit: Shutterstock

Digital technologies and data can transform federal agencies, enabling them to make better decisions, improve their programs and deliver better services to the public. Conducting a data and digital maturity assessment is a critical first step in building a more data-driven and digitally enabled agency. In a 2021 Data Foundation survey of federal chief data officers whose organizations had completed an assessment, 83% reported that assessing their agency’s maturity was “somewhat,” “a lot” or “completely” helpful.9

While the Federal Data and Digital Maturity Index showed that government has a long way to go before reaching full maturity, it aspires to make big gains.

To accelerate the journey, agencies could take some important steps, including:

  • Using the findings of their maturity assessment to spark discussions with stakeholders across the agency, getting buy-in on what the next steps should be, which areas to prioritize, and how to achieve results.
  • Periodically reevaluating and refreshing their data and digital maturity strategy to account for emerging technologies, legislation, mandates or other developments that could affect agencies’ data and digital landscape.
  • Focusing on human capital, allocating resources toward recruiting, hiring, developing, engaging and retaining a high-caliber data and digital workforce.
  • Routinely measuring progress by conducting follow-up data and digital maturity assessments.

Our hope is that agencies will go beyond complying with the requirement to conduct a maturity assessment and fully embrace the opportunity to act boldly on the findings. If they do, the people of this country will benefit.

Footnotes
  • 1. Examples of digital technologies include cloud computing, cybersecurity platforms, artificial intelligence and machine learning, and other emerging technologies.
  • 2. The Census Bureau, “Analyzing Equity in Federal COVID-19 Spending.” Retrieved from bit.ly/3D9mern
  • 3. Benedict Carey, “Can an Algorithm Prevent Suicide?” The New York Times, Nov. 23, 2020. Retrieved from nyti.ms/3wFdSqe
  • 4. Office of Management and Budget, “Federal Data Strategy 2020 Action Plan,” 25. Retrieved from bit.ly/3oKDIEx
  • 5. The FDDMI is based on BCG’s Digital Acceleration Index survey, which has been administered to 501 public and 4,247 private sector organizations globally. For more information, see: on.bcg.com/3wlVcM9
  • 6. Overall maturity scores fall within four levels. A score of 0-24 puts an agency at the “starter” level of maturity, 25-49 is “literate,” 50-74 is “performer,” and 75-100 indicates a “leader.” For a description of each maturity level, see Appendix I.
  • 7. BCG’s Digital Acceleration Index, a survey comparable to the Federal Data and Digital Maturity Index, was administered to 501 public sector agencies in more than 10 countries, and 4,247 private sector companies spanning more than 19 countries.
  • 8. The White House, “Executive Order on Transforming Federal Customer Experience and Service Delivery to Rebuild Trust in Government,” December 13, 2021. Retrieved from bit.ly/3tAEBm2
  • 9. The Data Foundation, “CDO Insights: 2021 Survey Results on the Maturation of Data Governance in U.S. Federal Agencies,” September 2021. Retrieved from bit.ly/3sbdNrI
Authors

Loren DeJonge Schulman leads the Partnership’s efforts to develop forward-thinking solutions that change the way government works and evaluate our impact. She began her career in public service as a Presidential Management Fellow and devoted ten years at the Department of Defense and National Security Council to building networks and ideas for problems only the government can solve. Before joining the Partnership, she spent five years leading research efforts at the Center for a New American Security to elevate the national security debate and prepare the national security leaders of today and tomorrow. Her favorite public servants are former Secretary of Defense William Perry, who understood that America’s innovation was a foundation of its security, and former U.S. Representative Barbara Jordan, who declared that when it came to the Constitution she would not be an idle spectator.

Email Loren

David Garcia leads the organization’s Best Places to Work in the Federal Government® research. David holds a Ph.D. in political science from the University of Maryland – College Park. David’s favorite public servant is his wife, Amanda, who works at U.S. Customs and Border Protection.

Email David

Maddie Powder supports the Partnership’s quantitative research work, including projects on developing federal data capabilities. She developed a love for research in college while researching counterterrorism and gained a passion for public service through an internship at the State Department. She is interested in evidence-based policymaking and its potential to streamline decision-making and improve government effectiveness. Maddie’s favorite public servant is Tammy Duckworth, who served her country bravely and is the first Thai American woman and the first woman with a disability elected to Congress.

Email Maddie
Appendix I

Appendix II

Appendix III

Project Team
Partnership for Public Service
Loren DeJonge Schulman
Vice President, Research, Evaluation, and Modernizing Government
Andrew Parco
Associate Digital Design Manager
Samantha Donaldson
Vice President, Communications
Ellen Perlman
Senior Writer and Editor
David Garcia
Senior Manager
Audrey Pfund
Senior Design and Web Manager
Tim Markatos
UX Design Manager
Maddie Powder
Associate
BCG
Valerio Gardelli
Lead Knowledge Analyst
Steven Mills
Managing Director and Partner
Jonathan Milde
Managing Director and Partner
Brady Wargofchik
Senior Knowledge Analyst


Header photo credit: Shutterstock