“There’s scarcely a part of the company that is unaffected by AI,” said Vishal Sharma, Amazon’s VP of Artificial General Intelligence, on Monday at Mobile World Congress in Barcelona. He dismissed the idea that open-source models might reduce compute needs and deflected when asked whether European companies would change their generative AI strategies in light of geopolitical tensions with the US.
Sharma said onstage at the startup conference that Amazon was now deploying AI through its own foundational models across Amazon Web Services (Amazon’s cloud computing division), the robotics in its warehouses, and the Alexa consumer product, among other applications.
“We have something like three-quarters of a million robots now, and they are doing everything from picking things to running themselves within the warehouse. The Alexa product is probably the most widely deployed home AI product in existence … There’s no part of Amazon that’s untouched by generative AI.”
In December, AWS introduced a new suite of four text-generating models, part of a family of multimodal generative AI models it calls Nova.
Sharma said these models are tested against public benchmarks: “It became pretty clear there’s a huge diversity of use cases. There’s not a one-size-fits-all. There are some places where you need video generation … and other places, like Alexa, where you ask it to do specific things, and the response needs to be very, very quick, and it needs to be highly predictable. You can’t hallucinate ‘unlock the back door’.”
However, he said reducing compute needs with smaller open-source models was unlikely to happen: “As you begin to implement it in different scenarios, you just need more and more and more intelligence,” he said.
Amazon has also launched “Bedrock,” a service within AWS aimed at companies and startups that want to mix and match various foundational models, including China’s DeepSeek. It enables users to switch between models seamlessly, he said.
Amazon is also building a huge AI compute cluster on its Trainium 2 chips in partnership with Anthropic, in which it has invested $8 billion. Meanwhile, Elon Musk’s xAI recently launched its latest flagship AI model, Grok 3, using an enormous data center in Memphis that contains around 200,000 GPUs.
Asked about this level of compute resources, Sharma said: “My personal opinion is that compute will be a part of the conversation for a very long time to come.”
He didn’t think Amazon was under pressure from the blizzard of open-source models that had recently emerged from China: “I wouldn’t describe it like that,” he said. On the contrary, Amazon is comfortable deploying DeepSeek and other models on AWS, he suggested. “We’re a company that believes in choice … We are open to adopting whatever trends and technologies are good from a customer perspective,” Sharma said.
When OpenAI launched ChatGPT in late 2022, did he think Amazon was caught napping?
“No, I think I would disagree with that line of thought,” he said. “Amazon has been working on AI for about 25 years. If you look at something like Alexa, there’s something like 20 different AI models that are running at Alexa… We had billions of parameters that existed already for language. We’ve been looking at this for quite some time.”
On the issue of the recent controversy surrounding Trump and Zelenskyy, and the subsequent strain on U.S. relations with many European nations, did he think European companies might look elsewhere for GenAI resources in the future?
Sharma admitted this issue was “outside” of his “zone of expertise” and that the consequences are “very hard for me to predict …” But he did, somewhat diplomatically, hint that some companies might adjust their strategy. “What I will say is that it is the case that technical innovation responds to incentives,” he said.