The General Services Administration’s (GSA) adoption of robotic process automation (RPA) technologies is progressing at a measured but steady pace, and the official heading the agency’s efforts said Wednesday that he hopes some of those efforts prove to be “home runs” that will help spur further adoption.

Speaking at an event organized by the American Council for Technology and Industry Advisory Council (ACT-IAC), Edward Burrows, robotic process automation program manager at GSA, said the agency’s RPA effort began last year and has resulted in the deployment of about one RPA “bot” per month since then.

He said that while some private sector entities boast deployment of thousands of RPA bots, the relatively small group of GSA staff working on the projects makes it hard for the agency to match anything close to that pace. “It seems hard to scale up, we just haven’t gotten there yet,” he said.

Burrows said the potential payoffs of RPA bot projects in labor savings seem clear, and that the success of existing projects may help create high-profile examples that will provide incentive for further investments.

“What we need is some home runs” from RPA projects that eliminate many thousands of hours of labor from current processes, he said. In those cases, he said, “your ROI [return on investment] shoots up.” Among the best projects thus far is one that has saved 7,000 hours per month of labor, most of that on data entry tasks, he said. In addition to generating cost savings, Burrows said RPA can also function as a tool for improving data quality.
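The ROI logic Burrows describes can be illustrated with back-of-the-envelope arithmetic. The 7,000 hours-per-month figure is from the project he cited; the hourly labor rate and annual bot cost below are purely illustrative assumptions, not figures from the event.

```python
# Hypothetical ROI sketch for an RPA project. Only hours_saved_per_month comes
# from the article; the rate and cost figures are assumed for illustration.
hours_saved_per_month = 7_000        # from the project Burrows cited
loaded_hourly_rate = 40.0            # assumed fully loaded labor cost, $/hour
annual_bot_cost = 250_000.0          # assumed build + maintenance cost, $/year

annual_savings = hours_saved_per_month * 12 * loaded_hourly_rate
roi = (annual_savings - annual_bot_cost) / annual_bot_cost
print(f"Annual savings: ${annual_savings:,.0f}, ROI: {roi:.0%}")
```

Under these assumed numbers, eliminating thousands of hours of labor dwarfs the cost of building the bot, which is why such projects make ROI “shoot up.”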

The GSA official drew a distinction between RPA and artificial intelligence (AI), saying that GSA had yet to tackle any project that he thought involved “truly AI.” “All of our use cases are RPA,” he said, adding that RPA “give you the appearance of being intelligent…but it’s really just code.”
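Burrows’s point that RPA “give[s] you the appearance of being intelligent…but it’s really just code” can be made concrete with a toy example. The sketch below is purely illustrative, with hypothetical field names; it shows how a data-entry “bot” is just fixed, scripted rules with no learning involved.

```python
# Illustrative sketch of an RPA-style bot: hand-written rules applied to each
# record, mimicking a human data-entry clerk. No machine learning anywhere.

def rpa_copy_records(source_rows):
    """Apply fixed formatting rules to each record before entry."""
    entered = []
    for row in source_rows:
        # Rule 1: normalize whitespace and casing, as a clerk would.
        name = " ".join(row["name"].split()).title()
        # Rule 2: skip records missing a required field instead of guessing.
        if not row.get("amount"):
            continue
        # Rule 3: format currency consistently for the target system.
        entered.append({"name": name, "amount": f"${float(row['amount']):,.2f}"})
    return entered

rows = [
    {"name": "  jane   doe ", "amount": "1234.5"},
    {"name": "bob", "amount": ""},  # dropped by rule 2
]
print(rpa_copy_records(rows))  # → [{'name': 'Jane Doe', 'amount': '$1,234.50'}]
```

The bot behaves consistently and quickly, which can look intelligent, but every decision is a rule someone wrote down in advance.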

Burrows also said increased adoption of RPA technologies could help the government create more interesting tech jobs, including conceiving of and managing RPA bots, in line with broader objectives by the government to direct IT workers to “higher-value” tasks.

That sentiment was echoed by Carolyn Marsh, Federal client executive at IBM, who said she believes there are many tech jobs that can be made more attractive by turning existing staff into “data scientists” with more challenging tasks.

And she said the nature of some government agency data provides a good reason for RPA and AI technology adoptions. AI technologies can be useful in extracting data from large amounts of unstructured text in order to produce “insight and knowledge” from the data, she said, and “in government, there is a large amount of unstructured text out there.” Marsh added, “that’s where we are seeing a big use case for AI.”
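The structured-output idea Marsh describes can be sketched in miniature. A production system would use trained NLP models; the toy example below, with hypothetical text and field names, uses plain regular expressions just to show how structured fields can be pulled out of free-form text.

```python
import re

# Illustrative sketch only: turning unstructured text into structured fields.
# Real AI-based extraction would use NLP models, not hand-written patterns.

def extract_fields(text):
    """Pull dollar amounts and four-digit years out of free-form text."""
    amounts = re.findall(r"\$[\d,]+(?:\.\d{2})?", text)
    years = re.findall(r"\b(?:19|20)\d{2}\b", text)
    return {"amounts": amounts, "years": years}

memo = "The 2018 contract totaled $1,200,000, up from $950,000 in 2017."
print(extract_fields(memo))
```

Even this crude version shows the payoff: once the fields are structured, they can be compared, totaled, and audited across thousands of documents.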

Sukumar Iyer, CEO of Brillient Corp., said his firm is seeing “islands of opportunity” in the Federal government for RPA adoption, but also said the technology is fostering “a lot of hype and anxiety out there.”

Asked whether a fear of job losses is holding back the government’s RPA adoption, Venkat Kodumudi, director of innovation and outreach at CGI Federal, said adoption of RPA technologies has to be pitched as a “win-win” result for organizations and employees.

Marsh noted Federal government efforts to employ RPA and AI in procurement to discover purchase price disparities, which she said can “save a whole lot of money.” And Michael Bruce, vice president and division manager at Leidos, cited past efforts to discover purchase price disparities for commodity goods by the Department of Health and Human Services, and said such discoveries “can save the taxpayers millions of dollars.”

Marc Mancher, principal, U.S. Government and Public Services Robotics and Cognitive Lead at consulting firm Deloitte, presented statistics indicating that more than 100 RPA bots were currently deployed across the Federal government, and were “helping them realize rapid ROI.”

And William Carroll, reliability centered maintenance engineer at the U.S. Navy’s Military Sealift Command, said that deployment of AI technology generated a savings of 30,000 labor hours in a project looking at maintenance and repair data over 20 years for 52 different classes of ships. In addition to labor savings, he said the project gave his organization greater visibility into its data, and that without it, “our analysis would be incomplete and possibly inaccurate.”

During a separate presentation, Col. Stoney Trent, implementation team chief at the Pentagon’s Joint Artificial Intelligence Center (JAIC), recapped the JAIC’s operations since its formation last year and said the organization was poised to boost staffing significantly this year. He said the JAIC was authorized to add 75 people over the next year, “plus contractors.”

Col. Trent said “we need really highly skilled personnel inside the JAIC,” and told attendees at Wednesday’s meeting to watch for the organization to use public-private partnerships to further that goal.

Kate Polit
Kate Polit is MeriTalk's Assistant Copy & Production Editor covering the intersection of government and technology.