Government IT leaders may feel a little punch drunk lately. Between new mandates, an understaffed workforce, and aging systems, they are getting hit from all angles.


Indeed, Chad Sheridan, CIO of USDA’s Risk Management Agency, kicked off his keynote at last Thursday’s Veritas Public Sector Vision Day in Washington, D.C., by quoting Mike Tyson: “Everybody has a plan until they get punched in the mouth.”

Sheridan noted that the way government agencies have traditionally planned for and implemented technology is wrong in today’s cloud age.

“If you’re out there building an aircraft carrier, it’s a complex project, but you know the scope. The plan is king,” Sheridan said. “Except I’m building software, and it doesn’t work that way. We now have racks of servers dedicated to different parts of the business. There’s no incentive for us to work together. The way we budget, plan, and program manage, everything goes against the horizontal flow of value, so what we end up with is overhead. We’re in a never-ending death spiral of infrastructure.”

So how do we get out of it?

“Cloud allows us to get away from a place in which we cannot win. It’s been eight years since Cloud First came on the scene, but uptake in government hasn’t exactly been stellar,” Sheridan said, also noting that the Technology Modernization Fund will help government pivot and focus on doing work in a more efficient, horizontal fashion by providing incentive for change.

But as data volumes continue to skyrocket, with much of the growth in unstructured data, agencies are hampered by data classification challenges that make it difficult to protect information throughout its lifecycle, regardless of where it is located: on premises, or in a public or private cloud.

“You can outsource your IT infrastructure, your real estate, even your workforce,” said Tom Kennedy, vice president and general manager, Veritas Public Sector. “But you always need to manage your data effectively so it’s an asset, not a burden.”

Government executives sitting on a panel discussion at the Veritas event agreed that this is a top priority for their agencies.

Brian Brotsos, chief data officer for the Federal Highway Administration, noted that his team is in the process of taking inventory of all of its data. “We want one IT platform,” Brotsos said. “But everything is siloed and we need to build trust within the program areas. We’re working to identify everything so we can build trust and move forward.”

Data identification and classification is a major hurdle in state and local government as well. Jeff Porter, director of IT infrastructure in the Fairfax County, Va., IT department, said, “We’re trying to get a handle around where the data is and who owns it. It wasn’t that long ago that a lot of our stuff was in file cabinets. Change management is a challenge; a lot of people think we’re going too fast.”

Sonny Bhagowalia, senior advisor on technology and cybersecurity in the Commissioner’s Office of the Department of the Treasury’s Bureau of the Fiscal Service, echoed these sentiments. “The real challenge is how do you bring it all under governance? Ownership is a big challenge. So is the reward structure. We need to know what data is really important so we can make sure it survives, no matter what.”

Veritas, which hosted Thursday’s event, is focused on helping government agencies get a handle on their data so they can be more effective and secure.


“Agencies need to understand the assets they have so they can determine what needs to be encrypted and/or kept in secure locations,” said David Noy, vice president, Product Management, Veritas. “The amount of governance you put around your information is critical when that information is sensitive and the lifeblood of what you do.”

“We’re positioned at the nexus of digital transformation and modernization,” Noy said. “We want to help you understand what you have; if it’s sensitive; if it should be where it is; if it’s in compliance; and if it’s exposing risk.”

New solutions from Veritas’ 360 Data Management portfolio combine data protection with global visibility, regulatory compliance, business continuity, workload mobility and storage optimization.

“Fifty percent of enterprise data is dark data, meaning organizations don’t understand it at all. A third of it is at least three years old. It’s only going to get worse,” Noy said. “The world was home to six zettabytes of data in 2014. By 2020, we’ll reach 52 zettabytes, due to unstructured data, web-scale workloads, and IoT. How can we ever understand what we have when our data is that large, and the landscape is becoming increasingly complex?”

Veritas launched the latest version, 8.1.1, of its flagship NetBackup product on Thursday at the event. The new iteration is optimized for Federal Information Processing Standards (FIPS) certification, as well as all applicable DISA Security Technical Implementation Guides (STIGs).

“NetBackup allows agencies to protect data everywhere–to the cloud, in the cloud, between clouds,” Noy said.

Noy noted that Veritas has launched eight new products over the last year and is taking a more vertical-centric approach to its product development, building solution architect teams focused on key industries like government and healthcare. He also showcased two new technology appliances released earlier this month, the Veritas Flex Appliance and the Veritas Access Appliance, which enable a software-defined storage approach to data management and cost-effective storage across hybrid cloud environments.
