Cast AI gets $35M in new funding, readies next AI automation update

  • Cast AI just scored another $35 million in funding

  • Its platform automates and optimizes cloud workloads

  • The next release will further optimize the amount of compute power needed for cloud tasks

Automation maven Cast AI has just grabbed another $35 million in funding, and the startup has announced its latest machine learning-based artificial intelligence (AI) tool.

Chief product officer and co-founder Laurent Gil talked to Silverlinings about what’s been happening at the Kubernetes cost optimization company. Your correspondent first interviewed Gil for our sister publication, Fierce Telecom, in December 2020.

The Series B funding of $35 million follows a $20 million investment in March this year, bringing the total raised so far to $73 million, Gil said. He was especially proud of investment this round from Israeli venture firm Vintage, partly because Tel Aviv is considered a “big, big” DevOps center, he said, “especially in these times.”

“Last year, about this time, we were around 40 people, now we are 120,” Gil said of the growing Florida-based startup. “I would not be surprised if next year we are over 300.”


Customers include Akamai, Phlexglobal and Wohlig, among others, although Gil notes there are other customers he can’t name yet. The system works with AWS, Google Cloud and Azure.

So how does Cast AI cut cloud costs?

The initial release used machine learning to optimize Kubernetes clusters so that containers use only as much compute power as they need and no more. Cast AI claimed the initial platform cut cloud costs in half for AWS, GCP or Azure users.

“We provide an instant return and you can measure it in dollar terms,” the CPO stated.

Gil said that the new release will further optimize cloud costs. Most of the time, the developer doesn’t know exactly how many compute resources their workload needs. “We made Kubernetes completely autonomous,” Gil said. If a workload can run in a single pod, there is no need to operate additional containers for it.

This means a developer won’t even need to think about how much compute will be necessary; the AI engine will set it up and eliminate any other provisioning based on it, Gil continued. For instance, Cast AI took a container that was using 0.2 of a CPU and reduced it to 0.021.

“So we reduce the size by 10x with zero impact on performance,” Gil claimed.
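
Cast AI hasn’t detailed the mechanics of that adjustment, but in plain Kubernetes terms it amounts to shrinking a container’s CPU request from 200 millicores (0.2 CPU) to 21 millicores (0.021 CPU). A minimal sketch of that kind of rightsizing, using the official Kubernetes Python client with hypothetical deployment, namespace and container names, might look like this:

    # Sketch only: shrink a container's CPU request from 200m (0.2 CPU) to 21m (0.021 CPU),
    # illustrating the kind of rightsizing described above. The deployment, namespace and
    # container names are hypothetical; this is not Cast AI's actual API.
    from kubernetes import client, config

    config.load_kube_config()   # authenticate using the local kubeconfig
    apps = client.AppsV1Api()

    patch = {
        "spec": {
            "template": {
                "spec": {
                    "containers": [
                        {
                            "name": "web-api",                   # hypothetical container name
                            "resources": {
                                "requests": {"cpu": "21m"}       # was "200m" (0.2 CPU)
                            },
                        }
                    ]
                }
            }
        }
    }

    # Apply a strategic-merge patch; Kubernetes reschedules the pod with the smaller request.
    apps.patch_namespaced_deployment(name="web-api", namespace="default", body=patch)

In practice an operator would derive the new request from observed usage before applying it; the snippet only shows the final resize step.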

He noted that the latest Cast AI update will work on CPU resources and memory.