Amazon’s CEO Andy Jassy describes 3 layers of GenAI

  • Amazon's CEO framed GenAI as the next set of "primitives" the company is building to enable new customer experiences

  • He said there are 3 layers of the stack

  • And GenAI will be built from the start on the cloud

Amazon CEO Andy Jassy published his annual letter to shareholders today, and he devoted a section to the company’s work with generative artificial intelligence (GenAI).

Amazon has a wide variety of businesses (to say the least), so Jassy’s letter was long (it takes about 28 minutes to read).

But in the section about AI, Jassy wrote, “Sometimes, people ask us ‘What’s your next pillar? You have Marketplace, Prime, and AWS. What’s next?’ This, of course, is a thought-provoking question. However, a question people never ask, and might be even more interesting is what’s the next set of primitives you’re building that enables breakthrough customer experiences? If you asked me today, I’d lead with Generative AI.” 

Amazon CEO Andy Jassy (Amazon)

He then proceeded to talk about three layers of the GenAI stack. 

He said the bottom layer is for developers and companies that want to build foundation models (FMs). This layer relies on massive compute to train models and run inference, which in turn requires high-end chips; to date, all the leading FMs have been trained on Nvidia chips. But Amazon has also built its own custom AI chips: Trainium for training and Inferentia for inference. Jassy said, “This past fall, leading FM-maker, Anthropic, announced it would use Trainium and Inferentia to build, train and deploy its future FMs.”

It's no coincidence that Jassy called out Anthropic. Amazon has invested $4 billion in the hot San Francisco startup that has created an AI chatbot called Claude to compete with OpenAI's ChatGPT.

The middle layer, Jassy thinks, is where AI primitives will rely on a cloud provider such as Amazon. This layer is for customers who want to take an existing FM, customize it with their own data, and then use a cloud provider’s security and features to build a GenAI application.

“Amazon Bedrock invented this layer and provides customers with the easiest way to build and scale GenAI applications with the broadest selection of first- and third-party FMs,” he said. “Bedrock is off to a very strong start with tens of thousands of active customers after just a few months.”

In terms of third-party FMs, he mentioned Anthropic’s Claude, Meta’s Llama 2, and models from Mistral, Stability AI and Cohere, alongside Amazon’s own Titan family.

Jassy said, “Customers don’t want only one model. They want access to various models and model sizes for different types of applications.”
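To make the middle layer a bit more concrete, here is a minimal sketch of what calling an existing FM through Bedrock can look like, using the boto3 bedrock-runtime client in Python. The model ID, request fields, generation settings and region shown are illustrative assumptions; each provider on Bedrock defines its own payload format, and the call requires AWS credentials with access to the chosen model.

# Minimal sketch: invoking a foundation model through Amazon Bedrock.
# Assumptions: the Titan Text Express model ID and its request/response
# shape, the us-east-1 region, and AWS credentials with Bedrock access.
import json

import boto3

bedrock = boto3.client("bedrock-runtime", region_name="us-east-1")


def ask_model(prompt: str) -> str:
    """Send a prompt to an Amazon Titan text model and return its reply."""
    body = {
        "inputText": prompt,
        "textGenerationConfig": {"maxTokenCount": 200, "temperature": 0.5},
    }
    response = bedrock.invoke_model(
        # Swapping this ID for another provider's model (Claude, Llama 2,
        # Mistral, etc.) is how Bedrock exposes "various models and model
        # sizes" -- note that the payload format changes with the provider.
        modelId="amazon.titan-text-express-v1",
        contentType="application/json",
        accept="application/json",
        body=json.dumps(body),
    )
    result = json.loads(response["body"].read())
    return result["results"][0]["outputText"]


if __name__ == "__main__":
    print(ask_model("Summarize the three layers of the GenAI stack."))

The sketch only covers the basic invoke path; the "customize it with your own data" step Jassy describes would typically lean on Bedrock's customization and retrieval features rather than a raw model call.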

Finally, the top layer of this primitives stack, in Jassy’s view, is the application layer. “We’re building a substantial number of GenAI applications across every Amazon consumer business,” he wrote. “We’re also building several apps in AWS, including arguably the most compelling early GenAI use case—a coding companion.”

The companion, called Amazon Q, has been trained on all things AWS, and it can write, debug, test and implement code.

Using GenAI to write code has become one of the technology’s more popular applications.

In February, Chivas Nambiar, the general manager of the global telecom business unit at AWS, talked to Fierce about AWS’ CodeWhisperer, which uses AI to assist developers in the telco realm with their coding.

The British telco BT is already using CodeWhisperer. According to BT and AWS, the tool generated more than 100,000 lines of code in the first four months after BT adopted it. Nambiar said, “They’ve already seen 12% of the code that developers are creating being auto generated.”

Today, Jassy summed up Amazon’s opportunity by saying, “Generative AI may be the largest technology transformation since the cloud (which itself, is still in the early stages), and perhaps since the internet…this GenAI revolution will be built from the start on top of the cloud.”