At its inaugural LlamaCon AI developer conference on Tuesday, Meta introduced an API for its Llama series of AI models: the Llama API.
Available in limited preview, the Llama API lets developers explore and experiment with products powered by different Llama models, per Meta. Paired with Meta's SDKs, it allows developers to build Llama-driven services, tools, and applications. Meta didn't immediately share the API's pricing with TechCrunch.
The rollout of the API comes as Meta looks to maintain a lead in the fiercely competitive open model space. While Llama models have racked up more than a billion downloads to date, according to Meta, rivals such as DeepSeek and Alibaba's Qwen threaten to upend Meta's efforts to establish a far-reaching ecosystem with Llama.
The Llama API offers tools to fine-tune and evaluate the performance of Llama models, starting with Llama 3.3 8B. Customers can generate data, train on it, and then use Meta's evaluation suite within the Llama API to test the quality of their custom model.
Meta said it won't use Llama API customer data to train the company's own models, and that models built using the Llama API can be transferred to another host.
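In rough terms, that workflow looks something like the sketch below: upload or generate data, start a tuning job on Llama 3.3 8B, then run the evaluation suite against the resulting model. This is a minimal illustration only; the base URL, endpoint paths, and payload fields are assumptions, not Meta's documented Llama API.

```python
# Hypothetical sketch of the fine-tune-and-evaluate workflow described above.
# The URL, paths, and fields are illustrative assumptions, not Meta's real API.
import requests

BASE_URL = "https://example-llama-api.invalid/v1"  # placeholder, not a real endpoint
HEADERS = {"Authorization": "Bearer YOUR_API_KEY"}

# 1. Upload (or generate) training data for the tuning job.
dataset = requests.post(
    f"{BASE_URL}/datasets",
    headers=HEADERS,
    json={"name": "support-tickets", "records": [{"prompt": "example prompt", "completion": "example answer"}]},
).json()

# 2. Kick off a fine-tuning job against Llama 3.3 8B.
job = requests.post(
    f"{BASE_URL}/fine_tuning/jobs",
    headers=HEADERS,
    json={"base_model": "llama-3.3-8b", "dataset_id": dataset["id"]},
).json()

# 3. Run the evaluation suite to check the quality of the custom model.
evaluation = requests.post(
    f"{BASE_URL}/evaluations",
    headers=HEADERS,
    json={"model": job["fine_tuned_model"], "suite": "default"},
).json()
print(evaluation)
```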
For devs building on top of Meta's recently released Llama 4 models specifically, the Llama API offers model-serving options via partnerships with Cerebras and Groq. These "early experimental" options are "available by request" to help developers prototype their AI apps, Meta said.
"By simply selecting the Cerebras or Groq model names in the API, developers can […] enjoy a streamlined experience with all usage tracked in one location," wrote Meta in a blog post provided to TechCrunch. "[W]e look forward to expanding partnerships with additional providers to bring even more options to build on top of Llama."
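As a rough illustration of what "selecting the Cerebras or Groq model names" could look like in practice, the sketch below swaps only the model identifier between requests. The endpoint and model names are hypothetical placeholders, not the identifiers Meta or its partners actually expose.

```python
# Hypothetical sketch: routing requests to an accelerated serving partner by
# choosing a provider-specific model name. All names below are assumptions.
import requests

BASE_URL = "https://example-llama-api.invalid/v1"  # placeholder
HEADERS = {"Authorization": "Bearer YOUR_API_KEY"}

for model_name in ("llama-4-cerebras", "llama-4-groq"):  # hypothetical identifiers
    resp = requests.post(
        f"{BASE_URL}/chat/completions",
        headers=HEADERS,
        json={
            "model": model_name,  # the model name alone selects the serving partner
            "messages": [{"role": "user", "content": "Summarize LlamaCon in one line."}],
        },
    )
    print(model_name, resp.json())
```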
Meta said it will expand access to the Llama API "in the coming weeks and months."