DoD’s Software Factories Come of Age
Original broadcast 5/11/25

Presented by Okta and Carahsoft

In recent years, the Department of Defense has launched an aggressive software modernization push—one built on the pillars of cloud adoption, cultural change, and agile development. At the center of this transformation is George Lamb, Director of Cloud and Software Modernization in the Office of the DoD CIO, who joined Fed Gov Today at TechNet Cyber to discuss how the department is moving from pilot projects to enterprise-scale execution.

Lamb describes the journey as one that began with urgency and experimentation but is now firmly rooted in strategy and accountability. “If you go back five years, we were still figuring out that software is never done,” Lamb said. “The old model—waiting two to three years to see if a program worked—doesn’t hold up anymore. That’s where the software modernization strategy came in.”

That strategy evolved into three primary goals: modern software development, effective cloud deployment, and the culture to support both. “If you get those right,” Lamb said, “you can start to scale.”

Cloud at the Edge, Software at Speed

One of the foundational goals of the modernization effort has been moving to the cloud—but not just in the traditional, commercial sense. DoD’s global operations, including OCONUS (outside the continental United States) and at the tactical edge, require a unique approach.

“The business of the department spans CONUS, OCONUS, and the edge,” Lamb explained. “Our cloud strategy has to reflect that. We’re not just spinning up servers in commercial regions. We need resilient, forward-deployed solutions that work for the mission.”

Meanwhile, the software modernization effort is focused on consistency—especially across the many software factories operated by the services. Lamb noted that four years ago, many of those factories operated like startups, building interesting tools in isolation, with little coordination or shared infrastructure. Today, that’s changed.

“All of the services have now assessed their software factories and are consolidating where needed,” Lamb said. “The Navy has developed guidance to direct usage, the Army has shut down insecure factories, and the Air Force has unified its software development under one directorate.”

What’s emerged is a maturing ecosystem of software factories backed by standardized platforms, reusable components, and consistent security models. “It’s no longer a question of who’s funding what or where to go,” Lamb said. “The services are managing these as business portfolios now. That’s a big shift.”

DevSecOps and Continuous ATO

At the heart of this new approach is DevSecOps—integrated development, security, and operations practices designed to speed up software delivery without compromising trust. Lamb said that moving from legacy risk management to continuous authority to operate (ATO) is a major enabler.

“When the code is done in two months instead of two years, your system needs to be ready to accept it,” he explained. “That means continuous ATO, reciprocity across systems, and automated testing pipelines.”
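The gating idea Lamb describes can be sketched in code. The following is a hypothetical illustration, not an actual DoD pipeline: a continuous-ATO-style gate in which a build is promoted only when every automated control check has passed and produced evidence. The check names and file paths are invented for the example.

```python
# Hypothetical continuous-ATO gate: promote a build only when all
# automated control checks are green. Names and paths are illustrative.
from dataclasses import dataclass

@dataclass
class ControlCheck:
    name: str        # the automated check (e.g. static analysis, dependency scan)
    passed: bool     # result of this pipeline stage
    evidence: str    # where the machine-generated artifact (report, log) lives

def ato_gate(checks: list[ControlCheck]) -> tuple[bool, list[str]]:
    """Return (promote, failed_check_names); promotion requires all checks pass."""
    failures = [c.name for c in checks if not c.passed]
    return (not failures, failures)

checks = [
    ControlCheck("static-analysis", True, "reports/sast.json"),
    ControlCheck("dependency-scan", True, "reports/sca.json"),
    ControlCheck("unit-tests", False, "reports/junit.xml"),
]
promote, failed = ato_gate(checks)
print(promote, failed)  # prints: False ['unit-tests']
```

The point of the sketch is the shift Lamb describes: authorization becomes a repeatable, automated decision made on every build rather than a one-time paperwork event.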

A forthcoming report from DoD and Carnegie Mellon’s Software Engineering Institute, titled The State of DevSecOps in the Department of Defense, provides a quantitative look at just how far the department has come. Lamb sees it as a marker of maturity—and a springboard for what’s next.

Bringing AI Into the Pipeline

Looking ahead to fiscal years 2025 and 2026, Lamb said the focus is shifting to the next frontier: integrating artificial intelligence into the software lifecycle and legacy systems.

“We’re done with the lighthouse programs and pilots,” he said. “Now it’s about taking what we’ve proven works and applying it at scale—especially with AI.”

Lamb breaks down the department’s AI efforts into two tracks. The first is using AI to improve software development itself—enhancing productivity through smart tooling. The second is injecting AI models into mission systems for real-time decision-making.

“It’s one thing to have a good model,” Lamb said. “It’s another to get that model into production. That’s the hard part. AI is still software, and it has to go through the same DevSecOps pipeline, be tested, deployed, and continuously monitored.”
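Lamb's point that "AI is still software" can be illustrated with a small sketch. This is a hypothetical example, not an actual DoD process: a model artifact is held to the same release-gate discipline as any other build, promoted to production only if its evaluation metrics clear predefined thresholds. The metric names and threshold values are invented for illustration.

```python
# Hypothetical model release gate: a trained model passes through the same
# promote-or-block decision as any other software artifact.
def ready_for_production(metrics: dict[str, float],
                         thresholds: dict[str, float]) -> bool:
    """Promote only if every required metric meets or exceeds its floor."""
    return all(metrics.get(name, float("-inf")) >= floor
               for name, floor in thresholds.items())

thresholds = {"accuracy": 0.90, "recall": 0.85}   # illustrative release floors
candidate = {"accuracy": 0.93, "recall": 0.81}    # evaluation results for a model
print(ready_for_production(candidate, thresholds))  # prints: False (recall too low)
```

In practice the same gate would sit inside the DevSecOps pipeline Lamb describes, with monitoring re-running the evaluation continuously after deployment.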

Lamb’s team is collaborating with organizations like MITRE to develop guidebooks and playbooks that help program offices integrate AI more efficiently. The emphasis is on practical enablement, not just experimentation.

From Guidance to Policy

Lamb emphasized that this evolution is being supported by a deliberate progression of governance—from goals to implementation plans, and now to instructions.

“Guidance precedes policy,” he said. “We’ve experimented, we’ve documented what works, and now we’re codifying it in a way that sticks.”

One key example is the emerging software acquisition pathway, which turns traditional, years-long acquisition models into agile cycles with built-in DevSecOps practices. “There’s now a law that says we should follow this pathway,” Lamb noted. “It’s changing how the department thinks about software—from procurement to test to deployment.”

Moving at the Speed of Conflict

Lamb closed with a reminder of why this work matters. “We’ve proven the technology works. We’ve shown we can move faster. Now the challenge is changing the culture to accept that speed—and to get new capability into production when it matters most.”

The stakes are high. As the department watches the evolving pace of decision-making and innovation in places like Ukraine, it’s clear that agile software is no longer a luxury. It’s a necessity.

“Modernization isn’t about the tools anymore,” Lamb said. “It’s about getting them into the hands of the mission—quickly, securely, and at scale.”