Not All Intelligent Process Automation Requires Million-dollar Hardware

June 29, 2020 / artificial intelligence, Intelligent Process Automation, Machine Learning, Uncategorized


While the artificial intelligence market is unquestionably enjoying rapid growth, cost is a gating factor that gives some companies pause. It’s understandable given the price tag for software and hardware required to run some complex AI applications, notably those involving deep learning. But when it comes to intelligent process automation (IPA), it doesn’t have to be that way. 

Looking at the numbers, you’d think absolutely everyone is on the AI bandwagon. Grand View Research expects the global AI market to reach $390 billion by 2025, growing at a compound annual growth rate of more than 46% from 2019. 

Hardware accounts for a significant chunk of that total. The AI hardware market is expected to reach more than $230 billion by 2025, up from about $42 billion in 2019, according to Statista. And a significant share of that hardware spending is attributable to GPUs, or graphics processing units, the high-powered chips behind the most demanding AI applications, including deep learning. (Google’s cloud-based Tensor Processing Units, or TPUs, are roughly equivalent.) Revenue from GPUs is expected to rise from about $15 billion in 2019 to $54.5 billion in 2025. 


Some deep learning requires deep pockets

To be sure, plenty of AI applications can be performed with relatively lightweight compute power, including traditional CPUs, whether on premises or (increasingly) in the cloud. But deep learning applications are not among them.

Deep learning is complex because it requires training models to perform functions that mimic what a human would do. It’s the technology behind autonomous vehicles, for example, enabling them to recognize traffic lights, pedestrians, other cars and the like. 

As you can imagine, that kind of intelligence requires an extensive amount of training on an almost infinite set of possibilities. And that training requires plenty of processing power. 

Automating everyday business processes normally performed by humans, without relying on templates and rules, requires a deep learning approach. Here’s where AI can get costly. With the kind of large data sets that are often required to train a model, it’s not unusual for a company to have to buy (or rent) 10, 12 or more GPUs or TPUs, at costs that can easily exceed $1 million. (This story has a detailed analysis from researchers at AI21 Labs and elsewhere that proves the point.)

Imagine a financial services company that has to process myriad documents every time it onboards a new customer – statements from various financial institutions, tax returns and more, nearly all of them in PDFs. Someone has to look at each document, identify the pertinent information – names, dates, amounts and such – then enter that data into another downstream processing tool. 

If all the statements were exactly the same, perhaps a templated approach would work, using optical character recognition to identify appropriate fields and extract data. But in this instance, the content is considered unstructured because it varies from one document to another. This requires a cognitive automation approach that involves training a deep learning model to identify the various types of relevant content, no matter where it may show up on a given document, and extract it. Starting from scratch, that would be another compute-intensive endeavor. 
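To see why the templated approach breaks down, consider a hypothetical sketch of a template-based extractor. The field name and regex here are invented for illustration; the point is that a fixed pattern only works while every document phrases and positions the field the same way:

```python
import re

# Hypothetical template-based extractor: a fixed pattern tied to one
# document layout. Works for documents that match the template exactly,
# and silently fails on the same information expressed differently.
TEMPLATE = re.compile(r"Account Number:\s*(\d+)")

def extract_account(text):
    """Return the account number if the template matches, else None."""
    m = TEMPLATE.search(text)
    return m.group(1) if m else None

doc_a = "Account Number: 12345"   # layout the template was built for
doc_b = "Acct #: 12345"           # same information, different wording
```

Here `extract_account(doc_a)` succeeds while `extract_account(doc_b)` returns nothing, even though a human reads both instantly. That gap between layouts is exactly what the trained model handles.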

A better approach to intelligent process automation

But, as noted up top, it doesn’t have to be that way. Another approach to deep learning is to train a model on literally millions of data points, giving it a deep level of context. Using technology known as transfer learning, the model can then be trained relatively quickly to focus on a specific task – like automating that financial services onboarding process. In that case, it would take maybe 100 to 200 documents to train the model to find the pertinent information that needs to be extracted.
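The transfer-learning workflow described above can be sketched in miniature. This is an illustrative toy, not Indico’s actual implementation: the “pretrained” model is stood in for by a frozen keyword-feature extractor, and only a small logistic-regression head is trained on a handful of labeled examples, standing in for those 100 to 200 documents:

```python
import math

# Frozen stand-in for a pretrained model's feature extractor. In a real
# transfer-learning setup this would be a large model trained on millions
# of data points; the key property is that it is NOT updated below.
VOCAB = ["balance", "statement", "tax", "return", "invoice", "total"]

def extract_features(text):
    """Frozen extractor: keyword indicator features (illustrative only)."""
    words = text.lower().split()
    return [1.0 if w in words else 0.0 for w in VOCAB]

# Tiny labeled fine-tuning set: 1 = tax document, 0 = bank statement.
train = [
    ("annual tax return total due", 1),
    ("quarterly tax return filed", 1),
    ("monthly bank statement balance", 0),
    ("savings statement balance summary", 0),
]

# Trainable head: a single logistic-regression layer -- the only part
# that learns, which is why fine-tuning is cheap.
w = [0.0] * len(VOCAB)
b = 0.0
lr = 1.0
for _ in range(500):
    for text, y in train:
        f = extract_features(text)
        z = sum(wi * fi for wi, fi in zip(w, f)) + b
        p = 1 / (1 + math.exp(-z))
        g = p - y                       # gradient of log loss
        w = [wi - lr * g * fi for wi, fi in zip(w, f)]
        b -= lr * g

def predict(text):
    f = extract_features(text)
    z = sum(wi * fi for wi, fi in zip(w, f)) + b
    return 1 if z > 0 else 0
```

Because the extractor is reused rather than retrained, only a few weights change during fine-tuning, which is the same economy that lets a production platform get by with one or two GPUs instead of a rack of them.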

What’s more, because most of the training is already done up front, the platform can run on just one or two GPUs, and scale up using low-cost CPUs. Overall, you get a highly effective intelligent automation tool that can likely pay for itself in short order by dramatically reducing both process cycle times and the human resources required to perform the process. 

That is the benefit of Indico’s intelligent automation tool; it doesn’t require a million-dollar investment to run. To learn more, download this free white paper from process automation software experts at the Everest Group, “Unstructured Data Process Automation.”
