Nathan McGuirt started working in the cloud about ten years ago. Now a senior solutions architect at Amazon Web Services, McGuirt can testify that a lot in the space has changed.

The risks are changing, said McGuirt this week, and “people are trying to get data out of silos.”

Ten years ago, data existed somewhere within four walls, said David Blankenhorn, chief technology officer at DLT Solutions, on a March 19 webinar with McGuirt and Skip Farmer, chief architect at Veritas.


McGuirt said that while a lot of attention is placed on data confidentiality, the integrity and availability of data often do not get enough attention. The old way of putting meaningful data into a data warehouse was expensive, hard to scale, and made data intake difficult.

The question now, McGuirt said, is: “How do I make sure that those new systems that I build are available and scalable?”

McGuirt now sees customers moving to more flexible data lake architectures.

Farmer said there used to be no mechanisms in place to differentiate data. “Everything was managed the same way,” he said. “We can’t do that anymore.” Value must be assigned to data so it can be moved, he said.

The challenge is how to identify and classify data to make it useful, said McGuirt, and that comes not only from knowing your data, but also your stakeholders.

McGuirt used the Department of Housing and Urban Development’s Stella tool, which measures progress toward preventing and ending homelessness, as an example.

Having cloud providers and government agencies share data management responsibilities gives agencies an advantage, said AWS’ McGuirt. “It lets them put more focus on what makes the data unique.”

Dwight Weingarten is a MeriTalk Staff Reporter covering the intersection of government and technology.