Big Data, Big Challenges: Will Big Government Be Able to Handle It?
WASHINGTON — Along with the rapid creation of data come the complications of capitalizing on new capabilities for storing, computing, analyzing and visualizing the information. The Federal News Network convened a panel of top government data officials to discuss how agencies are addressing the challenges and opportunities of big data.
“A lot of us are in the same position… first Chief Data Officer,” said Ron Thompson, chief data officer at NASA. Thompson was joined by Tom Sasala, chief data officer of the Navy, and Michael Conlin, chief business analytics officer for the Department of Defense. Their newly created positions are a direct response to the data revolution, designed to make data accessible and to take its use to the next level.
“[DOD] had been grappling with over 100,000 operational IT systems, each generating data in completely different ways,” said Conlin, adding that even before his tenure the agency had been working to standardize these systems for better analysis and operational decision-making. Now, according to Conlin, DOD is “running smoothly on high quality data to be used by principal staff advisors” and has a plan for a “balanced scorecard,” or reporting on what has already happened; “predicting the future,” meaning leading indicators of performance; and “telemetry to drive operational processes through automation” rather than manual operation.
For its part, the Navy is working toward “better interoperability for any paramilitary conflict we might end up in in the future,” according to Sasala. Its fundamental data work includes “getting governance in line, getting the enterprise data environment off the ground, and working with a new platform — Jupiter — to get our analytics in order.”
“Jupiter takes disparate data sources and integrates them into a package running in the cloud,” Sasala explained. Before Jupiter, the Navy had a variety of end-user systems and lacked the ability to stitch them together for a detailed look at any given topic.
“[Jupiter is] a dramatic departure from where [the Navy has] been in the past, where data has been siloed,” Sasala said, adding that it lets analysts view and use high-end visualization and AI tools to write algorithms, a “complete radical change from the way we’ve done this in the past.”
NASA, with huge data sets from its science community, human exploration programs and machine learning efforts, is focused on sharing data to benefit the larger community while also protecting the data of proprietary partners. “[We’re getting an] idea of what we have and focusing on standards and interoperability,” said Thompson, who looks forward to sharing NASA’s appropriate data with the general public and using it to aid future research.
“There’s a lot of questions folks haven’t even thought of yet, and by [putting data together we’re] going to be able to answer some interesting questions in the future,” he said.
According to Nick Psaki, principal system engineer at Pure Storage, which develops all-flash data storage hardware and software products, what we’re seeing from these government agencies is a “conscious recognition that it is really about agility and the availability of data to service itself… not beating your infrastructure into submission so you can actually get work done.”
“It’s all about volume, velocity, variety and veracity,” Psaki said. “A year ago, we had… data distributed across a variety of agencies, [now the focus is on] smart, timely, relevant, and accurate decisions.”
But how fast you can know depends on how fast you can move. “We want to pull data through fire hoses rather than soda straws,” Psaki quipped, warning that data sources will need to remain agile without constant refactoring or rebuilding. Looking ahead, he sees an increase in cloud-based platforms delivering seamless opportunities to leverage data without on-premises infrastructure.
Henry Sowell, chief information officer of Cloudera Government Solutions, which provides an enterprise data cloud platform, agrees. “There’s been a generational shift in technology and the use of tech from top to bottom,” Sowell said, one that “involved a pivot of data strategy that greatly impacts machines coming into the future… Now, gaining actionable intelligence from real-time data is more important than ever.”
Citing the pandemic as an impetus, Sowell argued that “you can no longer depend on a static state of resources… COVID created the need for a real time data analysis pipeline with flux to include a multitude of data services quickly.” He believes a cloud-based enterprise platform is the solution.
From the census to NASA, and from batch-job and supply chain scheduling to fleet management and maintenance, government agencies are finding new efficiencies through their digital transformations. With technology embedded in everyday life and every organization so tech- and data-driven, the future of government is sure to look far different than it does today.
“All that we are doing around data and analysis is to be more capable of responding to stakeholders,” said Conlin. “We need a much more agile and nimble process to serve our stakeholders — both the war fighter and the taxpayer — and there are plenty of opportunities ahead for us.”