Presented by Carahsoft
Delivering Decision Advantage in a Contested Maritime Environment
Broadcasting directly from WEST 2026 in San Diego, this special episode of Innovation in Government explores the cutting-edge digitization and modernization efforts transforming the U.S. Navy, Marine Corps, and the broader Department of Defense. Through in-depth interviews with military leaders and top industry experts, we examine the critical shift toward data-driven decision-making, the drive for open architectures, and the race to achieve seamless interoperability from the enterprise down to the tactical edge. Viewers will gain unique insights into how the services are overcoming technical debt, leveraging commercial-off-the-shelf technologies, and preparing warfighters for future conflicts through advanced training continuums and accelerated sensor-to-shooter kill chains.
Mastering Surface Warfare: The Future of Combat Training and Readiness
RDML T. J. Zerr, Commander of the Naval Surface and Mine Warfighting Development Center for the U.S. Navy, discusses the critical mission of the Surface Warfare Combat Training Continuum and the absolute necessity of ensuring warfighters are ready for high-end conflict on day one. Drawing on recent real-world lessons from engagements in the Red Sea and other conflict zones, RDML Zerr explains how the continuum represents a fundamental shift in how the Navy approaches readiness. Instead of merely tracking whether a watch team as a whole has completed basic certification, the new model dives deep into the individual proficiency of every single watchstander. By meticulously monitoring the specific repetitions and sets of critical warfare areas—such as anti-submarine or air defense operations—the Navy ensures that every sailor achieves a master-level understanding of their role before they ever face a crisis. Data plays an instrumental part in this transformation. The Navy has moved beyond outdated spreadsheets to advanced systems that display training and readiness matrices across the entire force in real time. Furthermore, actual combat systems data is now being pulled directly from ships engaged in live operations. This information, which once took months or years to process, is now analyzed within minutes or hours by Warfare Tactics Instructors and industry experts. The resulting insights are rapidly pushed back out to the fleet as immediate software updates or tactical setup recommendations, ensuring combat systems are continually optimized against evolving threats. Ultimately, the goal is to build highly cohesive, maximally effective watch teams composed of thoroughly trained individuals who can execute flawlessly during the very first salvo of a conflict.
- The Surface Warfare Combat Training Continuum focuses on tracking the specific repetitions and sets of individual watchstanders to build complete, highly proficient watch teams.
- The Navy is utilizing data not only to track training readiness across the force but also to rapidly push software updates to combat systems based on real-world engagement data from areas like the Red Sea.
- By training individuals to a master level using simulators and targeted scenarios, the entire watch team can operate at maximum speed and effectiveness during an initial crisis.
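The "repetitions and sets" model above can be pictured as a simple per-watchstander tally by warfare area. This is a minimal illustrative sketch, not the Navy's actual readiness-matrix system; the required counts, warfare-area keys, and the sailor's name are all invented for the example.

```python
# Hypothetical sketch: tracking per-watchstander "repetitions and sets" by
# warfare area and flagging who still needs work before a team certifies.
# REQUIRED counts, area names, and the sailor name are invented.

from collections import defaultdict

REQUIRED = {"air_defense": 10, "anti_submarine": 8}

# reps[sailor][warfare_area] -> completed repetitions
reps = defaultdict(lambda: defaultdict(int))

def log_rep(sailor, area, count=1):
    """Record completed repetitions for one watchstander in one area."""
    reps[sailor][area] += count

def shortfalls(sailor):
    """Warfare areas where this watchstander is below the required reps."""
    return {a: n - reps[sailor][a]
            for a, n in REQUIRED.items() if reps[sailor][a] < n}

log_rep("OS2 Rivera", "air_defense", 10)
log_rep("OS2 Rivera", "anti_submarine", 5)
print(shortfalls("OS2 Rivera"))  # {'anti_submarine': 3}
```

The point of the structure is the shift the episode describes: readiness is queried per individual, not just per watch team.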
Harmonizing Data for True Interoperability and Zero Trust
Chad Keefer, Head of U.S. Federal at Infoblox Federal, provides critical insights into the ongoing challenge of achieving true interoperability and advancing zero trust architectures across the Department of Defense. While acknowledging the significant progress made by military departments and industry over the past decade, Keefer points out that the government still struggles with isolated pockets of excellence. Agencies frequently rely on highly effective but vertically siloed platforms—such as specific endpoints or firewalls—that fail to communicate seamlessly with one another. This lack of integration is further complicated by a thicket of overlapping guidance from entities like the DoD, NIST, and OMB, as well as differing technological baselines across mission partner environments. To overcome these hurdles, Keefer advocates for a fundamental shift toward horizontal platform architectures that utilize data as a foundational, unifying control plane. By harmonizing data across disparate systems, network operations, security operations, and cloud teams can finally operate from a single, shared baseline of factual information. This unified approach is essential for realizing complex modernization efforts like zero trust. Importantly, Keefer stresses that the solution does not always require the government to undergo costly and disruptive overhauls. Instead, industry must focus on delivering commercial-off-the-shelf technologies with open APIs that can integrate with existing tools, allowing agencies to maximize their return on investment. Ultimately, success relies on a closer partnership between industry and government, where industry helps translate complex executive mandates into practical, repeatable, and mission-forward solutions that cut through bureaucratic red tape and empower the warfighter.
- To move past isolated pockets of excellence, organizations must shift away from vertical platforms that don't communicate and embrace horizontal, data-harmonized platforms.
- Implementing commercial-off-the-shelf technologies with open APIs allows agencies to operate off a common operational picture without resorting to costly replacement strategies.
- Industry and government must partner closer together to cut through red tape and translate complex executive orders into practical, mission-forward solutions.
Back to Basics: The Foundation for AI and Cybersecurity Modernization
Brian "Stretch" Meyer, Federal Field CTO for Axonius Federal Systems, emphasizes a critical reality check for the Department of Defense: before military organizations can successfully harness the power of advanced technologies like artificial intelligence and machine learning, they must first master the absolute fundamentals of IT and cybersecurity infrastructure. Drawing on his extensive background as a cyber engineer working directly on DoD networks, Meyer observes that many agencies are eager to leap into AI initiatives while still struggling to answer basic questions about device visibility and overall security posture. He spends a significant amount of his time educating executive leadership on the true total cost of ownership for their current operations, revealing that what many believe to be automated processes are actually heavily reliant on manual labor from teams of full-time employees. By returning to the basics and establishing true correlation across the attack surface, organizations can clearly identify their gaps and build a reliable foundation for larger, more complex AI platforms down the line. Furthermore, Meyer highlights a growing shift driven by executive orders and congressional pressure to abandon highly customized, open-source-reliant government off-the-shelf software in favor of fully supported commercial off-the-shelf solutions. While some leaders are hesitant to let go of legacy systems they built their careers on, adopting commercial enterprise tools ultimately lowers costs, reduces security vulnerabilities, and speeds up development cycles. Meyer also underscores the tremendous value of cross-communication within the government, noting that bringing compartmentalized DoD groups together into collaborative working groups often sparks rapid innovation by revealing shared missions and overlapping technological goals.
- Organizations often struggle to implement artificial intelligence because they still lack basic visibility into their devices and overall cybersecurity posture.
- Executive orders are increasingly pushing the DoD to abandon expensive, custom-built government software in favor of supported, commercial-off-the-shelf tools that lower the total cost of ownership.
- Fostering communication and working groups among disparate DoD components with similar missions is proving to be a highly effective way to drive innovation.
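Meyer's point about "establishing true correlation across the attack surface" comes down to comparing device inventories from different tools. A minimal sketch of that idea, with tool names and host IDs invented for illustration:

```python
# Illustrative sketch: correlating device inventories from separate security
# tools to find coverage gaps (devices one tool sees but another misses).
# Tool names and device IDs are invented for the example.

inventories = {
    "endpoint_agent": {"host-01", "host-02", "host-04"},
    "vuln_scanner":   {"host-01", "host-02", "host-03"},
    "network_scan":   {"host-01", "host-02", "host-03", "host-04", "host-05"},
}

# The union of every source is the best available picture of the fleet.
all_devices = set().union(*inventories.values())

# For each tool, which known devices does it NOT cover?
gaps = {tool: sorted(all_devices - seen) for tool, seen in inventories.items()}

print(gaps["endpoint_agent"])  # ['host-03', 'host-05'] lack an endpoint agent
print(gaps["vuln_scanner"])    # ['host-04', 'host-05'] are never scanned
```

Even this toy version shows why correlation must precede AI initiatives: the gaps only become visible once the sources are joined.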
Empowering Mission Velocity Through Data-Driven Infrastructure
Kevin Hansen, CTO at MFGS, Inc., delves into the foundational strategies required for the Navy and Marine Corps to successfully meet their ambitious digitization and modernization objectives. At the core of his approach is the implementation of robust reference architectures, specifically leveraging open standards like the IT4IT framework from The Open Group. Hansen explains that establishing this kind of standardized infrastructure is vital for driving compliance, increasing velocity, and ensuring the high-quality delivery of software across all domains, including traditional IT, operational technology, and cybersecurity. However, simply having the infrastructure is not enough; the true catalyst for speed and innovation is becoming fully data-driven. Hansen points out that a persistent challenge within the DoD is the existence of data silos, which prevent high-fidelity information from reaching the right personnel at the exact moment they need it. To make accurate, mission-critical decisions—whether a sailor is monitoring server health on a ship console or a cyber warrior is hunting for threats in a tactical environment—systems must be seamlessly connected to allow for unhindered data flow. As the military continues to operate a vast array of disparate technologies, including legacy weapon systems, navigational tools, and modern cyber platforms, the variations in data formats can create massive communication bottlenecks. Looking to the future, Hansen anticipates that artificial intelligence will be the key to overcoming these hurdles. By utilizing AI to automatically normalize unusual and disparate data formats, the military will be able to translate complex information into readable, actionable intelligence for the warfighter, thereby accelerating mission velocity across the board.
- Leveraging open standards is foundational for adding velocity and quality to software delivery across IT, operational technology, and cyber domains.
- Breaking down silos to ensure high-fidelity data gets to the right people at the right time is critical for empowering data-driven decisions.
- Artificial intelligence will play a major role in normalizing the unusual and disparate data formats generated by the Navy's wide array of weapon, navigational, and IT systems.
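The normalization problem Hansen describes can be made concrete with a small sketch: two systems emit the same kind of reading in different shapes, and a mapping layer folds both into one common record format. All field names, source names, and values here are invented; this is not any DoD schema.

```python
# Hypothetical sketch: normalizing telemetry from disparate systems into one
# common record shape so downstream consumers read a single format.
# Source names, field names, and values are illustrative only.

from datetime import datetime, timezone

COMMON_FIELDS = ("source", "timestamp", "metric", "value")

def normalize(record: dict, source: str) -> dict:
    """Map a source-specific record onto the common schema."""
    if source == "legacy_nav":
        # e.g. {"ts": "2026-01-28T12:00:00Z", "hdg": 270}
        return {"source": source,
                "timestamp": record["ts"],
                "metric": "heading_deg",
                "value": float(record["hdg"])}
    if source == "server_health":
        # e.g. {"epoch": 1769601600, "cpu_pct": 41.5}
        return {"source": source,
                "timestamp": datetime.fromtimestamp(
                    record["epoch"], tz=timezone.utc).isoformat(),
                "metric": "cpu_pct",
                "value": record["cpu_pct"]}
    raise ValueError(f"unknown source: {source}")

rows = [normalize({"ts": "2026-01-28T12:00:00Z", "hdg": 270}, "legacy_nav"),
        normalize({"epoch": 1769601600, "cpu_pct": 41.5}, "server_health")]
assert all(tuple(r) == COMMON_FIELDS for r in rows)
```

Hansen's forecast is that AI would learn these mappings instead of a developer hand-writing one branch per source, but the target — one common shape out the other side — is the same.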
Accelerating the Sensor-to-Shooter Kill Chain Across the Joint Force
Col. Kenneth Jones, Director of Science and Technology at the Marine Corps Warfighting Lab, outlines the immense complexities and critical importance of optimizing the sensor-to-shooter kill chain in modern warfare. While the concept of pulling data from a sensor to a screen seems simple in the civilian internet age, Col. Jones explains that doing so in a combat environment involves an incredibly difficult fusion of data from a multitude of platforms—ranging from satellites to forward-deployed ground nodes—often built by competing commercial vendors. Historically, companies wanted to sell the military proprietary, end-to-end packages, but the DoD is now demanding open architectures that allow different sensors and command-and-control software to seamlessly communicate. This interoperability is vital not just within the Marine Corps, but across the entire Joint Force and with international allies and partners. However, bringing in massive amounts of joint data creates a new challenge: information overload at the tactical edge. Warfighters operating in bandwidth-constrained environments cannot afford to monitor half a dozen different laptops to build a common operating picture. Through initiatives like Project Dynamis, the Marine Corps is actively working to consolidate this actionable targeting data onto single, intuitive devices, ensuring the right data reaches the shooter without overwhelming them. Furthermore, Col. Jones addresses the critical issue of data latency and security classification when sharing targeting information across joint and coalition networks. He envisions a near future where artificial intelligence and machine learning serve as essential curation tools, automatically evaluating data quality, managing latency, and securely routing targeting information from exquisite sensors directly to the warfighter on the gun line in mere seconds.
- The military is moving away from proprietary end-to-end systems and demanding open architectures that allow different vendors' sensors and command-and-control software to communicate seamlessly.
- Through targeted operational efforts, the Marine Corps is working to consolidate incoming data onto single devices, mitigating information overload in bandwidth-constrained environments.
- Future sensor-to-shooter networks will likely rely on artificial intelligence and machine learning to curate data, manage latency, and facilitate instantaneous sharing with allies and joint partners.
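The curation role described above — evaluating data quality and managing latency before anything reaches the gun line — can be sketched as a filter-and-rank step over sensor tracks. The field names, thresholds, and track values below are invented for illustration; real curation would involve classification handling and far richer scoring.

```python
# Illustrative sketch: curating sensor tracks before forwarding to a shooter,
# dropping stale or low-confidence data so a bandwidth-constrained edge node
# only receives what is actionable. Field names and thresholds are invented.

def curate(tracks, max_age_s=30.0, min_confidence=0.7):
    """Keep fresh, high-confidence tracks, best first."""
    keep = [t for t in tracks
            if t["age_s"] <= max_age_s and t["confidence"] >= min_confidence]
    # Highest confidence first; break ties with the freshest track.
    return sorted(keep, key=lambda t: (-t["confidence"], t["age_s"]))

tracks = [
    {"id": "T1", "age_s": 5.0,   "confidence": 0.95},
    {"id": "T2", "age_s": 120.0, "confidence": 0.99},  # stale
    {"id": "T3", "age_s": 10.0,  "confidence": 0.40},  # low confidence
    {"id": "T4", "age_s": 2.0,   "confidence": 0.95},
]
print([t["id"] for t in curate(tracks)])  # ['T4', 'T1']
```

The hand-tuned thresholds are exactly what Col. Jones expects AI/ML to replace with learned judgments about quality and latency.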
Packaging Knowledge: Generative AI and the Future of Interoperability
Terry Dorsey, Evangelist and Data Architect for Denodo, explores the intricate challenges of achieving interoperability across highly distributed government agencies while allowing them to maintain their necessary autonomy. Dorsey explains that a major hurdle in DoD modernization efforts is finding a way to seamlessly share critical information without forcing every individual component into a rigid, one-size-fits-all operational mold. The most effective strategy to solve this is to centralize the fundamental meaning—or semantics—of the data. By creating a governed core of abstracted data relationships and components, the DoD can establish a universal language that allows completely agnostic consumers to access and understand the information they need, all while preserving the independence of their specific organizational structures. This approach not only solves current communication gaps but perfectly positions agencies to adopt advanced modern technologies down the road. Dorsey heavily emphasizes that for these digital transformations to be successful, the government must avoid the common pitfall of focusing solely on acquiring new technology for technology’s sake. Instead, every modernization project must be strictly aligned with defining and achieving specific business or mission outcomes. This outcome-driven mindset is particularly crucial when dealing with the rapid advent of generative AI. Dorsey cautions that generative AI cannot be treated like traditional artificial intelligence, which relies heavily on highly structured, pre-staged data sets with understood inputs. Because generative AI operates differently, organizations must fundamentally change how they model information, focusing instead on how to effectively package knowledge to help the AI understand complex intent and drive highly specific, nuanced outcomes.
- Establishing a centralized core of data semantics allows organizations to communicate effectively while letting individual agencies remain agnostic and autonomous.
- Modernization projects are most successful when they are strictly aligned with desired business outcomes, rather than just focusing on adopting the latest technology.
- Unlike traditional artificial intelligence, which relies on structured data inputs, generative AI requires organizations to package knowledge to better understand intent and drive outcomes.
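Dorsey's "governed core of abstracted data relationships" can be illustrated with a small mapping layer: each agency keeps its own local field names, and a shared semantic dictionary re-keys records into one vocabulary. Agency names, field names, and values are invented for the example.

```python
# Hypothetical sketch of a governed semantic core: each agency keeps its own
# field names, but a shared mapping gives every consumer one vocabulary.
# Concepts, aliases, and record contents are illustrative only.

SEMANTIC_CORE = {
    "vessel_id": {"agency_a": "hull_no",    "agency_b": "ship_ident"},
    "position":  {"agency_a": "pos_latlon", "agency_b": "geo"},
}

def translate(record: dict, agency: str) -> dict:
    """Re-key an agency-local record into the shared vocabulary."""
    out = {}
    for concept, aliases in SEMANTIC_CORE.items():
        local = aliases.get(agency)
        if local in record:
            out[concept] = record[local]
    return out

a = translate({"hull_no": "DDG-125", "pos_latlon": (32.7, -117.2)}, "agency_a")
b = translate({"ship_ident": "DDG-125", "geo": (32.7, -117.2)}, "agency_b")
assert a == b  # agnostic consumers see identical records
```

Only the mapping table is centrally governed; each agency's internal structures stay untouched, which is the autonomy-preserving property Dorsey emphasizes.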
Unified Architectures for Seamless Data at the Edge
Chris Betz, Global Field CISO at Omnissa, highlights the foundational importance of consistent, unified IT architectures in ensuring that critical data remains accurate and actionable for warfighters deployed at the tactical edge. Betz emphasizes that if data is going to be the tip of the spear in modern conflict, its interpretation must be absolutely uniform across all branches and environments. Currently, the Department of Defense struggles with a proliferation of unique architectures. Because these isolated systems ingest information differently, they inevitably output that information differently, creating dangerous inconsistencies in how a soldier on the ground or a sailor at sea might interpret the exact same intelligence. To solve this, Betz argues for the urgent rationalization of toolsets, eliminating overlapping technologies and consolidating onto unified platforms that span from the enterprise level all the way down to the edge. This back-to-basics focus on solid, agnostic architecture is essential to support the massive influx of autonomous vehicles and advanced software currently flooding the battlespace. Furthermore, Betz points out that this unified approach is critical for the future of artificial intelligence in combat. As the military moves away from operations that require constant cloud connectivity, there will be a massive shift toward localized AI processing. This will allow edge devices to process vast amounts of data in real-time without ever needing to phone back to a home base. To support this, defense contractors must commit to developing highly flexible software that provides on-premise capabilities and functions flawlessly regardless of the underlying infrastructure being used by the specific military branch.
- The military must eliminate unique architectures so that data interpretation is uniform and consistent across the enterprise and down to the warfighter.
- Future field operations will heavily utilize localized artificial intelligence, allowing edge devices to process data and operate in real-time without having to phone home to a central cloud.
- To ensure cross-service interoperability, defense contractors must develop software that operates consistently regardless of the underlying hypervisor, hyperscaler, or container architecture.
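The "process locally, never phone home" pattern Betz describes is essentially store-and-forward: an edge node classifies data on the spot and queues results until a connectivity window opens. This is a toy sketch under invented thresholds, not any Omnissa or DoD implementation.

```python
# Illustrative sketch: an edge node that processes readings locally and only
# syncs results opportunistically, rather than requiring constant cloud
# connectivity. The threshold and record shapes are invented.

import queue

class EdgeNode:
    def __init__(self):
        self.outbox = queue.Queue()  # results awaiting a connection window

    def process(self, reading: float) -> str:
        """Classify locally; no round trip to a central cloud."""
        label = "alert" if reading > 50.0 else "normal"
        self.outbox.put((reading, label))
        return label

    def sync(self, connected: bool) -> list:
        """Drain queued results only when a link is available."""
        if not connected:
            return []
        sent = []
        while not self.outbox.empty():
            sent.append(self.outbox.get())
        return sent

node = EdgeNode()
labels = [node.process(r) for r in (12.0, 88.0, 7.0)]
assert labels == ["normal", "alert", "normal"]
assert node.sync(connected=False) == []     # disconnected: keep working
assert len(node.sync(connected=True)) == 3  # link window: drain the queue
```

The design point is that `process` never blocks on connectivity — the warfighter gets an answer immediately, and synchronization is strictly best-effort.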
Overcoming Technical Debt: A Stepping Stone to the Hybrid Cloud
Chris Boyd, Head of Federal Sales at Zoom, addresses the immense challenge the Department of Defense faces in overcoming decades of technical debt to achieve a modernized, hybrid cloud environment. Throughout the military, and particularly within the Navy and Marine Corps, bases and stations are heavily burdened by aging, legacy telephony and IT systems. The primary hurdle for leadership is figuring out how to transition away from these outdated platforms without paralyzing their budgets with massive, middle-ground investments that ultimately hinder future innovation. Boyd advises that the key to successfully navigating this transition is to make smart, incremental investments that integrate seamlessly with the tools the government is already using. Instead of taking an incredibly disruptive and expensive replacement approach, agencies can protect their past technology investments by bridging legacy infrastructure with modern collaborative platforms, such as Microsoft Teams and existing CRM systems. By timing these incremental steps correctly, the DoD can look over the technological horizon and continuously innovate without accidentally engineering itself into a corner. Furthermore, Boyd highlights that the government is actively making strides to improve this process by consolidating its buying power at the enterprise level. By moving away from disjointed, localized purchasing and operating at an enterprise scale, the military is successfully reducing rampant IT sprawl and minimizing the reliance on insecure shadow networks. Ultimately, getting everyone rolling in the same direction through enterprise-level solutions and careful, integrated modernization steps is what will allow the DoD to reliably deliver robust communications to warfighters operating in the most bandwidth-challenged tactical environments.
- Agencies must strategically modernize aging legacy systems without over-investing in middle-ground solutions that might hinder future innovation or engineer them into a corner.
- Integrating modern capabilities with existing investments, like Microsoft Teams and CRM platforms, allows the DoD to innovate while protecting its past technology spending.
- Consolidating buying power at the enterprise level helps reduce IT sprawl and minimizes the presence of unauthorized shadow networks.
Implementing Force Design: Bringing C2 and Fires to the Tactical Edge
Col. Craig Clarkson, Commanding Officer of MCTSSA at Marine Corps Systems Command, and Col. Kevin Stepp, Assistant Chief of Staff, G-6 for the Marine Expeditionary Force, provide a joint perspective on how the Marine Corps is actively transitioning the Force Design modernization strategy from a broad initiative into daily operational reality. Col. Clarkson explains that MCTSSA is heavily focused on the command, control, cyber, communications, computers, and intelligence space, working to separate hardware from software and deliver smaller form-factor systems that enhance capabilities directly at the tactical edge. Col. Stepp notes that Force Design is now squarely in the implementation phase, heavily focused on integrating command and control with fires, creating an inextricable link that defines how staff planners operate today. A major shift in this implementation is how the Marine Corps validates new technology. Rather than relying on temporary setups for short-term exercises, they are running continuous, round-the-clock operations centers to keep systems online, integrate them in real-time, and mature both the technology and the associated staff processes. A prime example of this accelerated acquisition model is Project Dynamis.
Col. Clarkson describes Dynamis as an effort to partner end-users directly with capability developers, acquisition professionals, and industry engineers. During continuous, time-compressed events, these collaborative teams iterate on hardware and software in real-time, instantly closing technical gaps across multiple enclaves. By bringing the warfighter and the engineer into the same room, the Marine Corps ensures that they are delivering exactly the right capabilities to connect sensors to shooters at machine speed, rather than waiting years for a top-down solution.
- Force Design has moved from an initiative into an active implementation phase, deeply embedding the crucial link between command and control and fires into daily operations.
- The Marine Corps brings end-users, acquisition professionals, and industry engineers together to iteratively test and close technical gaps in real-time.
- The Marine Corps is validating these new technologies through continuous operational centers rather than relying solely on temporary exercises.
Building a Foundation of Trust for Data-Driven Decision Making
Bill Roberts, Field CTO for Federal Services at Riverbed, tackles the persistent cultural and technical hurdles that prevent government agencies from fully realizing the potential of data-driven decision-making. Despite decades of pushing toward modernization, Roberts observes that many leaders still approach problems with deeply ingrained preconceived notions, often searching for specific data points merely to justify a decision they have already made. To fundamentally shift this culture, organizations require unwavering, top-level executive support to mandate that actual data dictate the operational course. Roberts points to the Navy’s World Class Alignment Metrics as a prime example of success; driven by strong leadership, the Navy did the grueling, foundational work of meticulously mapping their data collection efforts directly to their top mission objectives. Once this framework is established, organizations can then confidently invest in the proper telemetry, instrumentation, and visualization tools. However, Roberts warns against the common pitfall of seeking out magic dashboard solutions or advanced artificial intelligence before this foundational alignment is complete; without understanding exactly what is being measured and why, no software tool will solve an agency's problems. Looking forward, Roberts notes that the sheer volume of data is forcing a shift away from simple data sampling, which often misses critical threats, toward the collection of full-fidelity data. To process this massive influx of information, agencies must increasingly rely on automated anomaly engines and trusted artificial intelligence to filter out the noise and present only the most critical, actionable intelligence to decision-makers, thereby solidifying trust in the data process.
- For a successful modernization shift, organizations must have top-level executive support to challenge preconceived notions and mandate reliance on actual data.
- Agencies must do the hard work of aligning their metrics with top mission objectives before they invest in new visualization or telemetry tools.
- The future of IT management involves trusting artificial intelligence and automation to analyze full-fidelity data to find anomalies, moving beyond the limitations of simple data sampling.
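Roberts's contrast between sampling and full-fidelity collection can be shown with a tiny anomaly filter: a brief spike that a full stream flags is invisible to a one-in-ten sample. The z-score rule, threshold, and data are invented for the example; real anomaly engines are far more sophisticated.

```python
# Illustrative sketch: a simple anomaly filter over full-fidelity telemetry.
# A sampled feed (every Nth point) can miss a short spike that the full
# stream surfaces. The z-score threshold and readings are invented.

from statistics import mean, stdev

def anomalies(series, z=3.0):
    """Return (index, value) pairs more than z standard deviations from the mean."""
    if len(series) < 2:
        return []
    m, s = mean(series), stdev(series)
    return [(i, v) for i, v in enumerate(series) if s and abs(v - m) > z * s]

# 200 steady readings with one brief spike at index 57
feed = [100.0] * 200
feed[57] = 480.0

print(anomalies(feed))        # the spike is flagged: [(57, 480.0)]
print(anomalies(feed[::10]))  # a 1-in-10 sample misses it entirely: []
```

This is the trade Roberts highlights: sampling keeps volume manageable but drops exactly the transient events that matter, so the answer is full-fidelity collection plus automated filtering of the noise.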
