4 keys to creating a successful data and digital maturity assessment: Best practices across government

May 13, 2022 | Updated on May 16, 2022

In 2019, the Office of Management and Budget released the Federal Data Strategy. The strategy outlines how agencies can use data and evidence effectively and requires them to conduct a data maturity assessment—an assessment that analyzes “all aspects of agency policies, procedures and operations related to data and data infrastructure.”

To support this effort, the Partnership worked with Boston Consulting Group to launch the Federal Data and Digital Maturity Index—a survey that assesses data and digital maturity—and shared the results in our most recent issue brief, “Federal Data and Digital Maturity: Agencies Assess Where They Stand.”

Based on interviews with agency leaders across government who conducted their own data maturity assessments, the report offers best practices to help agencies that are working through the earlier stages of data maturity assessment planning. We explore these practices below.

Invest in the design phase

Across government, agency leaders emphasized the importance of investing time in the design phase of data maturity assessments. Agencies should carefully consider which topics the assessment should probe, who should participate in the assessment and how the results should be used.

Some agencies, such as the National Science Foundation, used an outside contractor to help get the design right. In doing so, NSF combined its internal knowledge with the objectivity and additional expertise of an external partner to successfully design a data maturity assessment.

Boost participation by fostering buy-in

Enough people need to complete a data maturity assessment for the results to be meaningful and make an impact. However, it can be challenging to convince busy government employees to participate in an exercise they might see as futile.

Therefore, it is essential that agencies make employees’ involvement as easy as possible. For example, to avoid delays or procrastination, the Nuclear Regulatory Commission offered staff a series of set days and times to take the survey.

Pilot the assessment before rolling it out widely

Agencies should pilot their data maturity assessment with a small sample of the workforce before rolling it out widely.

At NSF, the pilot maturity assessment took more than five hours to complete, so the agency streamlined the final assessment before distributing it to the full workforce. At the Department of Education, the pilot assessment revealed employee confusion about certain terminology. As a result, the agency revised and clarified some of the wording.

Demonstrate value by acting on the results

Most importantly, agencies looking to complete data maturity assessments should create a plan to turn results into action. This approach will demonstrate the value of assessments and help generate buy-in for future ones.

“We owe it to the offices that took the assessment to use the results,” said Soumya Sathya, a management and program analyst at Education.

Other agencies, such as the State Department, echoed Sathya’s sentiments. To make its data strategy actionable, the State Department organizes data-driven campaigns every six months and uses data to tackle one foreign policy priority and one management priority.

The first foreign policy campaign aimed to incorporate data into analyses of strategic competition with China. According to Garrett Berntsen, State’s deputy chief data officer, “people will show up to a working group meeting with a lot of opinions and good energy” on topics they care about.


For more insight into these practices, as well as the results of our Federal Data and Digital Maturity Index survey, see “Federal Data and Digital Maturity: Agencies Assess Where They Stand.”

