A senior technology advisor at the Department of Transportation is urging federal agencies to prioritize “lift-and-shift” strategies and tighter collaboration between developers and program staff to modernize legacy IT systems without disrupting mission operations.

Speaking during a Red Hat Government Symposium session, Anil “Neil” Chaudhry, senior advisor for artificial intelligence (AI) at the Department of Transportation, said agencies can accelerate infrastructure overhauls by focusing first on foundational platforms and deferring application-level changes until systems are stabilized in modern environments.

The approach, he said, addresses a persistent challenge across federal IT portfolios: how to replace aging systems while maintaining uptime, controlling costs, and meeting security requirements.

At the core of that challenge is a disconnect between technical teams and program offices. Chaudhry said program staff are primarily focused on system performance, not backend complexity.

“What the end … program staff cares about is two things, uptime … and then the second thing is reliability,” he said, adding that users expect consistent outputs without “runtime errors” or system delays. Technical teams must address a host of other issues, from scalability to security – but as they do so, they may overlook the business process they’re enabling, he noted.

That dynamic has implications for modernization strategies, acquisition decisions, and system design. Agencies that focus too narrowly on technical architecture and fail to align with mission workflows risk slowing adoption and increasing operational friction, he said.

Chaudhry emphasized that developers should map end-to-end business processes in partnership with program staff, ensuring that data flows seamlessly across systems, clouds, and edge environments. That includes designing for multicloud interoperability, security, and scalability – while maintaining simplicity for end users.

Modernization without disruption

Lift-and-shift migration is a practical starting point for agencies managing significant technical debt, he said.

“I like to take a legacy application … straight from a mainframe and put it into a hyperscale environment,” he said.

By moving applications intact into modern infrastructure, agencies can abstract backend concerns such as security, patching, and scaling into a centralized enterprise platform. This approach allows agencies to maintain a consistent user interface while incrementally modernizing code and services over time, he noted.

“When you do the big lift and shift, you get those efficiencies of scale, and you’re able to actually drive the replacement of aging systems without the downtime,” Chaudhry said.
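As a concrete illustration of what a rehost can look like in practice (a hypothetical sketch, not an example from the session), the legacy application is packaged unchanged into a container image, and the platform takes over patching, scaling, and security. The application path and entry point below are invented for illustration:

```dockerfile
# Hypothetical Containerfile for a lift-and-shift rehost: the legacy
# application is copied in as-is, with no refactoring, deferring
# code-level modernization until the system is stable on the platform.
FROM registry.access.redhat.com/ubi9/ubi:latest

# Copy the existing application binaries unchanged -- no code rewrite.
COPY ./legacy-app/ /opt/legacy-app/

# The enterprise platform now owns OS patching and scaling, while the
# application keeps its original entry point and user interface.
EXPOSE 8080
CMD ["/opt/legacy-app/run.sh"]
```

Because the application itself is untouched, end users see the same interface they always have, while the agency gains a single place to apply security baselines across every rehosted workload.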

Buy vs. build

Federal acquisition preferences that favor commercial solutions over custom development are a significant consideration as agencies modernize, Chaudhry said. He noted that agencies are increasingly leaning toward firm fixed-price contracts and shorter delivery timelines, which push IT teams to evaluate total cost of ownership rather than upfront build costs.

That includes factoring in labor, maintenance, and downtime costs – not just development expenses.

“Why would I want to spend my time [building] it when there’s a ready-made industry solution to do it that is secure by default, that can manage multiple clusters and containers from multiple hyperscale providers that are approved for use within the U.S. government?” he asked.

Zero trust and AI integration

As agencies modernize infrastructure, they must prepare for emerging demands tied to generative AI and data governance, Chaudhry said.

He emphasized starting with a zero trust architecture and designing systems that balance security with usability. Overly complex controls, he warned, can drive users to bypass official systems.

“If you make authentication too painful, folks are going to try to find a shortcut around,” he said.

To address that risk, Chaudhry advocated for shifting from moving data into applications toward deploying applications closer to data using containers and virtualization. That model supports secure, scalable AI workloads while limiting direct data exposure.

Microservices and orchestration layers can build on that approach, helping agencies operationalize AI by breaking work into discrete functions that run securely within a unified architecture, Chaudhry noted.
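A toy sketch (not Chaudhry's design, and with invented function names) of that microservices idea: each step of an AI workflow is a discrete function, and a thin orchestration layer sequences them, so each step can later be deployed, scaled, and secured independently within one architecture.

```python
import re

def extract_text(document: str) -> str:
    """Discrete function: pull clean text out of an incoming record."""
    return document.strip()

def redact_pii(text: str) -> str:
    """Discrete function: mask SSN-shaped tokens before AI processing,
    limiting direct data exposure (a stand-in for real PII controls)."""
    return re.sub(r"\d{3}-\d{2}-\d{4}", "[REDACTED]", text)

def summarize(text: str) -> str:
    """Discrete function: placeholder for a generative-AI summarization
    service; here it just truncates the text."""
    return text[:40]

def orchestrate(document: str) -> str:
    """Orchestration layer: run the discrete steps in a fixed order.
    In production each step could be its own containerized service."""
    result = document
    for step in (extract_text, redact_pii, summarize):
        result = step(result)
    return result

print(orchestrate("  SSN 123-45-6789 filed a claim  "))
```

The point of the decomposition is that the redaction step can sit closest to the data, under the tightest controls, while the summarization step can scale independently on commodity infrastructure.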

Targeted investments in data and analytics

On the data side, Chaudhry urged agencies to align tools and investments with actual user needs rather than deploying enterprise-wide solutions that exceed demand.

Most government analytics needs can be met with common business intelligence tools, while only a small subset of users requires advanced data science platforms, he said. Segmenting environments accordingly can reduce costs and improve performance.

That same prioritization should apply to data governance. Agencies should focus protections on “crown jewel” datasets rather than applying uniform controls across all data, he advised.

“Not everything is a priority, because if everything is a priority, nothing is a priority,” Chaudhry said.

Implications for federal IT leaders

Chaudhry’s emphasis on platform-first modernization, commercial solutions, and zero trust-aligned architectures reflects a broader shift toward enterprise-wide approaches that prioritize scalability, interoperability, and mission continuity.

For federal IT leaders, modernizing legacy systems comes down to sequencing, Chaudhry advised – stabilizing infrastructure first, then iterating toward more advanced capabilities without disrupting operations.

Watch the Red Hat Government Symposium session, “Infrastructure Overhaul: Revamp and Replace Legacy Systems Without Interruption,” and explore more sessions from the event.
