Does AI require high-end infrastructure?
Agencies can often use the equipment they have to experiment with artificial intelligence, but they may want to opt for high-performance computers for applications that need greater compute power and scalability.
There’s no shortage of buzz around artificial intelligence applications in the public sector. They’ve been touted as something of a digital panacea that can address almost any agency problem, whether that’s a chatbot offloading work from customer service staff or software aiding in fraud detection. Still up for debate, however, is what infrastructure agencies must have in place to make the most of AI.
Data science teams are spending less than a quarter of their time on AI model training and refinement because they’re mired in infrastructure and deployment issues, according to a survey by machine-learning deployment firm Algorithmia.
To address those issues, several vendors say high-performance computing is the must-have item for agencies looking to launch AI projects. In a white paper published in September, Intel outlined why the two go so well together. “Given that AI and HPC both require strong compute and performance capabilities, existing HPC users who already have HPC-optimized hardware are well placed to start taking advantage of AI,” according to the paper. HPC systems also give users the opportunity to improve efficiency and reduce costs by running multiple applications on one infrastructure.
A 2017 report on AI and HPC convergence put the emphasis on scalability: “Scalability is the key to AI-HPC so scientists can address the big compute, big data challenges facing them and to make sense from the wealth of measured and modeled or simulated data that is now available to them.”
Lenovo has also recognized the connection, last year announcing a software solution to ease the convergence of HPC and AI. Lenovo Intelligent Computing Orchestration (LiCO) helps AI users by providing templates for submitting training jobs -- the data feeds that teach AI applications what patterns to look for -- while letting HPC users continue to work with command-line tools.
But agencies that don’t have high-performance machines shouldn’t despair, according to Steve Conway, senior vice president of research, chief operating officer and AI/high-performance data analytics lead at Hyperion Research Holdings.
“You can get into this with -- a lot of times -- the kinds of computers you have in your data centers,” Conway said. “Almost all of the agencies have data centers or access to data centers where there are server systems or clusters, and you can run some of these [AI] applications on those.”
A main benefit of high-end computers is that they can move, process and store much more data in short periods of time, and AI is data-intensive. But chances are that if an agency doesn’t have HPC, it doesn’t have a need for ultra-sophisticated AI.
“At the very forefront of this stuff, you really do need high-performance computers, but the good news there is that they start at under $50,000 now, so they’re not that expensive, and there are a lot of folks who don’t need to spend even that kind of money,” Conway said. “They can use the equipment that they have and start exploring and experimenting with machine learning.”
The biggest use cases for AI are fraud and anomaly detection, autonomous driving, precision medicine and affinity marketing, which Conway said is the mathematical twin of fraud and anomaly detection but with different goals. In detection, the objective is to spot the “oddball,” he said, whereas affinity marketing looks for as many similar data points as possible.
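To make that contrast concrete, here is a minimal Python sketch using scikit-learn on synthetic data. The dataset, model choices and thresholds are illustrative assumptions, not anything Conway described; the point is that the same nearness math powers both tasks, with one flagging the point far from everything else and the other retrieving the points closest to a target.

```python
import numpy as np
from sklearn.ensemble import IsolationForest
from sklearn.neighbors import NearestNeighbors

rng = np.random.default_rng(42)
# Synthetic "transactions": a dense cluster plus a few outliers.
normal = rng.normal(loc=0.0, scale=1.0, size=(500, 2))
odd = rng.normal(loc=6.0, scale=0.5, size=(5, 2))
X = np.vstack([normal, odd])

# Fraud/anomaly detection: spot the "oddball" records.
iso = IsolationForest(contamination=0.01, random_state=0).fit(X)
labels = iso.predict(X)  # -1 flags an anomaly, 1 a normal point
print("records flagged as anomalous:", int((labels == -1).sum()))

# Affinity-style search: the same distance math, opposite goal --
# find the records most similar to a given one.
nn = NearestNeighbors(n_neighbors=5).fit(X)
_, idx = nn.kneighbors(X[:1])  # the 5 records most like record 0
print("nearest neighbors of record 0:", idx[0])
```

An experiment like this runs in seconds on an ordinary workstation, which is Conway’s broader point about getting started without HPC.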
But being AI-ready is about more than the machines that power it, said Adelaide O’Brien, research director for IDC’s Government Digital Transformation Strategies. To be most effective with AI, agencies must do their homework.
“It’s really important to have good, basic data management practices,” O’Brien said. “I know that’s not glamorous and it’s not exciting, but [agencies] need to ensure that there’s information access. They also have to have the strategy in place” and a “very robust data foundation” for machine learning, she said. “You need to train that machine with lots and lots of data.”
She also recommended documenting data sources to verify the information’s veracity and building diverse samples. “You don’t want it based on limited demographic information or even a preponderance of historical data -- which government agencies have a lot of -- because that may not reflect today’s reality,” O’Brien said. “It’s so important to train that machine on relevant data.”
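As a rough illustration of that advice, the pandas sketch below checks whether any one demographic slice dominates a training sample. The column names, values and 60 percent threshold are hypothetical stand-ins, not anything IDC prescribes.

```python
import pandas as pd

# Hypothetical training records -- the column names and values
# here are illustrative, not from any agency dataset.
df = pd.DataFrame({
    "age_group": ["65+", "65+", "65+", "18-34", "65+", "35-64"],
    "region": ["north", "north", "south", "north", "north", "north"],
})

# Share of each demographic slice in the training sample.
for col in ["age_group", "region"]:
    shares = df[col].value_counts(normalize=True)
    print(shares.to_string())
    # Flag a sample where one slice dominates (threshold is a guess).
    if shares.iloc[0] > 0.6:
        print(f"warning: '{shares.index[0]}' dominates '{col}'")
```

A check like this is one simple way to catch the “preponderance of historical data” problem O’Brien warns about before a model is ever trained.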
The AI industry is well positioned for growth. Globally, the business value derived from AI is projected to total $1.2 trillion in 2018, according to research firm Gartner. And that business value could hit $3.9 trillion by 2022.
In September, several senators introduced the Artificial Intelligence in Government Act, which would “improve the use of AI across the federal government by providing resources and directing federal agencies to include AI in data-related planning.”
To keep up with AI, agencies don’t have to wait to acquire HPC. “It’s important to get started,” Conway said. “It doesn’t take necessarily a very expensive IBM Watson to do this kind of stuff. They’re doing it with the kinds of normal cluster computers that are very, very common in both the public and private sectors.”