Artificial intelligence (AI) is “a huge thing,” Ericsson told Silverlinings at Mobile World Congress (MWC) Las Vegas last week, where the company was busy disrupting the Open RAN status quo and talking about its role in the growing world of AI.

“The biggest challenge today is that the AI field is fragmented,” said Elena Fersman, VP and the head of the global AI accelerator at Ericsson, who spoke to Silverlinings briefly in the sunny halls of the convention center. “We are building so many use cases everywhere [on] how [it all] fits together.”

She suggested that internal ontologies and large language models (LLMs) are some of the structures that will help hold the fragmented use cases of AI together. She also said that Ericsson is focused on “intent-based” AI systems that will allow its customers to manage all of their systems.

Beyond the chatbots

Operators are currently a bit cautious about deploying AI in circumstances more ambitious than customer-facing chatbots, Fersman said.

“Exactly, absolutely, of course. That’s why trustworthy algorithms are so important, and of course, if you can build the system so you retrain it on the fly... then it becomes very sensitive,” she said.

“Our task is to scan all the hot algorithms that are coming out right now,” Fersman said, noting that Ericsson isn’t doing academic work on AI, but is instead looking to deliver products based on the technology. “Our portfolio is both for enterprises and for operators,” she told us.

Some of the new areas she is looking at include neuro-symbolic AI and machine reasoning. “This AI field is evolving very fast,” Fersman said.

