Original Broadcast Date: 02/01/2026
Sponsored by CGI Federal
As federal agencies continue implementing the Foundations for Evidence-Based Policymaking Act, the Treasury Department is taking a structured, enterprise-wide approach to transforming how it governs and uses data. On Fed Gov Today, David Ashley, Acting Chief Data Officer at Treasury, outlines how the department is moving from statutory requirements to practical, mission-driven data modernization.
Ashley explains that Phase Two of the Evidence Act places a strong emphasis on agency data responsibilities. To meet those requirements, Treasury organizes its efforts around five focus areas: data governance, data cataloging, data maturity, data literacy, and data sharing. Together, these pillars create a framework for understanding what data exists across the department, how it is managed, and how it can be used more effectively.
Data governance serves as the umbrella for this work. Treasury is a large, federated organization with roughly 100,000 employees spread across about ten bureaus, each with distinct missions. Under governance, cataloging becomes a critical first step. Ashley describes an ongoing effort to build a comprehensive data catalog that identifies Treasury’s data assets and where they are stored. A new cataloging rule, due later this year, is designed to bring greater consistency and visibility across the department.
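As a rough illustration of what cataloging makes possible, the sketch below models a minimal catalog entry and a lookup against it. The field names and sample values are assumptions for the example, not Treasury's actual schema:

```python
from dataclasses import dataclass, field

# Illustrative sketch only: these fields are assumptions about what a
# catalog entry might capture, not Treasury's actual cataloging schema.
@dataclass
class CatalogEntry:
    asset_name: str          # what the data asset is
    owning_bureau: str       # which of the ~10 bureaus stewards it
    storage_location: str    # the system or repository where it lives
    description: str = ""
    tags: list[str] = field(default_factory=list)

catalog: list[CatalogEntry] = [
    CatalogEntry("Quarterly Bond Issuance", "Bureau of the Fiscal Service",
                 "fiscal-data-warehouse", "Issuance volumes by security type"),
]

# Once assets are cataloged, "what data exists and where" becomes queryable.
fiscal_assets = [e for e in catalog
                 if e.owning_bureau == "Bureau of the Fiscal Service"]
```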
Understanding what data exists leads directly into data maturity. Treasury conducts data maturity assessments across its bureaus to determine where systems, processes, and capabilities currently stand. Each bureau completes an assessment, and the results help leadership understand both strengths and gaps. This baseline is especially important as Treasury explores more advanced analytics and artificial intelligence. Knowing where the department is today allows leaders to make informed decisions about where to invest next.
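A maturity baseline of this kind is, at its simplest, a set of scores rolled up per bureau and per capability. The dimensions, scale, and numbers below are hypothetical, sketched only to show how such a baseline can point at the next investment:

```python
# Hypothetical bureau scores on a 1-5 scale; the dimensions and values
# are illustrative assumptions, not Treasury's actual assessment rubric.
assessments = {
    "Bureau A": {"governance": 3, "infrastructure": 2, "analytics": 2},
    "Bureau B": {"governance": 4, "infrastructure": 3, "analytics": 4},
}

# Baseline per bureau: a simple average across dimensions.
for bureau, scores in assessments.items():
    print(bureau, round(sum(scores.values()) / len(scores), 2))

# Department-wide view: the dimension with the lowest average maturity
# is a candidate for the next investment.
dimensions = {d for scores in assessments.values() for d in scores}
avg = {d: sum(s[d] for s in assessments.values()) / len(assessments)
       for d in dimensions}
print("Lowest average maturity:", min(avg, key=avg.get))
```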
Breaking down data silos is another recurring theme. Ashley acknowledges that silos exist in Treasury, as they do in most large organizations, largely because bureaus support different mission sets. Not all systems need to be connected, he notes, but the assessments reveal where connections would create value. Data sharing efforts focus on linking the right data elements to tell a common story across bureaus, rather than forcing integration for its own sake.
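In practice, linking "the right data elements" often means joining records from two bureaus on a shared identifier rather than merging whole systems. The records, identifier, and field names below are assumptions made for the sketch:

```python
# Hypothetical records from two bureaus that share a common entity
# identifier; all names and fields here are illustrative assumptions.
bureau_a = {"EIN-001": {"entity": "Acme Corp", "tax_liability": 125_000}}
bureau_b = {"EIN-001": {"entity": "Acme Corp", "sanctions_flag": False}}

# Link only on the shared element, leaving each system of record intact.
linked = {
    key: {**bureau_a[key], **bureau_b[key]}
    for key in bureau_a.keys() & bureau_b.keys()
}
print(linked["EIN-001"])
# {'entity': 'Acme Corp', 'tax_liability': 125000, 'sanctions_flag': False}
```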
Infrastructure and data standards both play a role in this work. Treasury manages a vast amount of highly varied data stored in systems that do not always communicate with one another. To address this, the department is exploring the use of application programming interfaces, or APIs, to better connect datasets and improve efficiency. At the same time, Ashley stresses that modernization is not just about infrastructure. Standards, definitions, and shared understanding matter just as much.
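To make the API idea concrete, the sketch below pulls one record from Treasury's public Fiscal Data service, which publishes datasets such as the "Debt to the Penny" series over a REST API. Treat the exact path, query parameters, and field names as assumptions to verify against the published documentation:

```python
import requests

# Sketch of consuming a dataset over an API. The endpoint pattern and
# field names below should be checked against Fiscal Data's docs.
BASE = "https://api.fiscaldata.treasury.gov/services/api/fiscal_service"
resp = requests.get(
    f"{BASE}/v2/accounting/od/debt_to_penny",
    params={"sort": "-record_date", "page[size]": 1},  # newest record only
    timeout=30,
)
resp.raise_for_status()
latest = resp.json()["data"][0]
print(latest["record_date"], latest["tot_pub_debt_out_amt"])
```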
Three foundational elements guide Treasury’s approach: data quality, metadata, and data frameworks. Ashley emphasizes that data quality is paramount. Without reliable data, analysis loses its value. Treasury often triangulates multiple datasets to see whether they produce consistent results.
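A triangulation check can be as simple as comparing the same measure drawn from two independent sources and flagging periods where they disagree beyond a tolerance. The figures and the 1% threshold below are illustrative assumptions:

```python
# Illustrative triangulation: the same measure from two independent
# sources. Values and the 1% tolerance are assumptions for the sketch.
source_a = {"2025-Q1": 102.4, "2025-Q2": 98.7, "2025-Q3": 101.1}
source_b = {"2025-Q1": 102.5, "2025-Q2": 94.0, "2025-Q3": 101.0}

TOLERANCE = 0.01  # 1% relative difference

for period in sorted(source_a.keys() & source_b.keys()):
    a, b = source_a[period], source_b[period]
    if abs(a - b) / max(abs(a), abs(b)) > TOLERANCE:
        print(f"{period}: sources disagree ({a} vs {b}); review before use")
```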
Metadata—often described as data about data—is equally important. As Treasury increases data sharing and enters into more agreements across the department, metadata helps explain the context behind the data. It captures what data represents, how it can be used, and the terms under which it is shared. This clarity allows bureaus to better understand what other parts of Treasury have available and how it might support their own missions.
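A minimal sketch of such a record, assuming field names invented for the example (the agreement number and contact address are hypothetical):

```python
import json

# Sketch of a metadata record accompanying a shared dataset; every field
# name and value here is an assumption about what "context behind the
# data" might include, not an actual Treasury record.
metadata = {
    "dataset": "quarterly_bond_issuance",
    "description": "Issuance volumes by security type, aggregated quarterly",
    "producing_bureau": "Bureau of the Fiscal Service",
    "update_cadence": "quarterly",
    "allowed_uses": ["internal analysis", "cross-bureau reporting"],
    "sharing_terms": "intra-Treasury only; see agreement DSA-042",  # hypothetical
    "point_of_contact": "data-steward@example.gov",                 # hypothetical
}
print(json.dumps(metadata, indent=2))
```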
Coordination across the department happens through an enterprise data council that brings together chief data officers from each bureau. Ashley says these regular meetings improve awareness of who is doing what, foster collaboration, and help spread effective practices. They also create a forum for sharing lessons learned, particularly as bureaus explore different uses of artificial intelligence.
Ashley is careful to frame AI as a tool, not a goal. He acknowledges the temptation to treat AI as a “shiny object,” especially given how quickly the technology evolves. Treasury’s approach focuses on developing concrete use cases and putting appropriate guardrails in place. One example he highlights involves analyzing complex systems like bond markets, where many variables interact. AI techniques, such as combining multiple decision trees, can help uncover patterns and insights that would otherwise be difficult to see.
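"Combining multiple decision trees" describes an ensemble method such as a random forest. The sketch below trains one on synthetic data standing in for interacting market variables; the variable names and data are made up, and the feature-importance readout simply hints at which inputs drive the interaction:

```python
import numpy as np
from sklearn.ensemble import RandomForestRegressor

# Sketch of combining many decision trees (a random forest) on synthetic
# data standing in for interacting market variables; all values are made up.
rng = np.random.default_rng(0)
X = rng.normal(size=(500, 4))  # e.g., rate, volume, spread, volatility
y = X[:, 0] * X[:, 1] + 0.5 * X[:, 2] + rng.normal(scale=0.1, size=500)

model = RandomForestRegressor(n_estimators=200, random_state=0).fit(X, y)

# Importances hint at which variables drive the outcome, including an
# interaction a single linear view might miss.
for name, imp in zip(["rate", "volume", "spread", "volatility"],
                     model.feature_importances_):
    print(f"{name}: {imp:.2f}")
```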
Equally important are the people who use the data. Ashley notes that IT professionals are critical partners, but they are not the only ones involved. Data stewards and data users—those who create analyses and present information to leadership—play a central role. Having data is not enough; agencies must be able to tell a clear story that includes not just findings, but the “so what” and the “now what.”
To support that goal, Treasury is launching a department-wide data literacy campaign. The effort focuses on improving understanding of data concepts and tools across the workforce. Success is measured through surveys, follow-up assessments, and evidence that employees are applying what they learn over time. For Ashley, the ultimate measure of progress is whether data is being used more effectively to inform decisions and advance Treasury’s mission.