Presented by Riverbed & Carahsoft
Bill Roberts, Field CTO for Federal Services at Riverbed, still finds himself on a soapbox advocating for the importance of trusting data. Even 10 to 15 years into the military's push toward data-driven decision-making, he notes that leaders frequently approach problems with ingrained biases. "I think part of it is folks come to those decisions with preconceived notions sometimes," Roberts explains, observing that leaders often search for data that supports a decision they have already made rather than allowing the data to change their minds.
To overcome this cultural hurdle, agencies require unwavering, top-level executive support. Roberts points to the Navy’s World Class Alignment Metrics, driven by Admiral Small, as the premier model for success. Because Admiral Small was "100% behind it and drove it through," the Navy was able to implement a massive culture shift. They went through a grueling process of tying their data collection directly to their top mission objectives. Once this foundational framework is established, with top-level priorities agreed upon and consolidated metrics mapped to those objectives, organizations know exactly what they need to measure and can confidently trust the data flowing up the chain.

Unfortunately, many agencies try to skip this difficult foundational work. "I think part of the challenge is people want to jump to the answer, and they want to go find that new BI tool, that new visualization tool that's going to solve all their problems without having had that framework built first," Roberts cautions. He notes that people often mistakenly believe "there is magic out there, and if we just all stir it all together, that the magic is going to come out." Coming from a company that makes telemetry tools, Roberts emphasizes that without alignment between what you are measuring and why you are measuring it, no tooling will solve the problem.
Looking forward, Roberts sees organizations realizing the limitations of data sampling and moving toward processing full-fidelity data. Sampling is only so effective and can cause agencies to miss highly important security threats. The next evolution involves trusting artificial intelligence and automation to handle this massive influx of full-fidelity information. "Being able to apply anomaly engines that are looking at that large mass amount of data and events and alerts that are coming up, and reduce that down into what's important for you... trusting that process, I think, is kind of the next evolution," Roberts states.
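Roberts' point about sampling can be illustrated with a small, hypothetical sketch (the event counts, positions, and sampling rate below are invented for illustration, not drawn from Riverbed's products): when telemetry is sampled at a fixed rate, rare security events that do not land on a sampled index are simply never seen, while full-fidelity analysis always catches them.

```python
# Hypothetical illustration: 1-in-100 sampling vs. full-fidelity analysis.
# All numbers here are made up for the sketch.

TOTAL_EVENTS = 100_000

# Five rare "threat" events hidden in a stream of normal telemetry.
# None of these positions is a multiple of 100, so a 1-in-100 sampler
# (which keeps indices 0, 100, 200, ...) will miss every one of them.
threat_positions = [137, 23_456, 57_891, 88_001, 99_999]

events = ["normal"] * TOTAL_EVENTS
for pos in threat_positions:
    events[pos] = "threat"

# Full-fidelity analysis: examine every event.
threats_seen_full = events.count("threat")

# Sampled analysis: keep only every 100th event.
sampled = events[::100]
threats_seen_sampled = sampled.count("threat")

print(f"full fidelity: {threats_seen_full} threats found")      # 5
print(f"1-in-100 sample: {threats_seen_sampled} threats found")  # 0
```

The sampled pipeline inspects 1% of the data and, in this arrangement, reports zero threats; this is the class of blind spot Roberts argues agencies will close by processing full-fidelity data and letting anomaly engines do the reduction instead of discarding events up front.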
This interview appeared in the program Innovation in Government at WEST 2026.
