At its inaugural LlamaCon developer conference on Tuesday, Meta announced an API for its Llama series of models: the Llama API.
Available in limited preview, the Llama API lets developers explore and experiment with products powered by Meta's various Llama models. Paired with Meta's SDKs, it allows developers to build Llama-driven services, tools, and applications. Meta did not immediately share the API's pricing with TechCrunch.
The API's launch comes as Meta looks to maintain a lead in the fiercely competitive open model space. While Llama models have racked up more than a billion downloads to date, according to Meta, rivals such as DeepSeek and Alibaba's Qwen threaten to upend Meta's efforts to build a far-reaching ecosystem around Llama.
The Llama API offers tools to fine-tune and evaluate the performance of Llama models, starting with Llama 3.3 8B. Customers can generate data, train on it, and then use Meta's evaluation suite in the Llama API to test the quality of their custom model.
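Meta hasn't published the Llama API's request schema, so purely as a hypothetical sketch, the generate-data → fine-tune → evaluate flow described above might look something like the following. Every endpoint, field name, and value here is an illustrative assumption, not a documented part of Meta's API:

```python
# Hypothetical sketch of the fine-tune-then-evaluate workflow described
# above. All field names and values are assumptions for illustration;
# Meta has not published the Llama API's actual request format.

def build_finetune_job(base_model: str, training_file: str) -> dict:
    """Assemble a (hypothetical) fine-tuning job payload."""
    return {
        "base_model": base_model,        # starting checkpoint, e.g. Llama 3.3 8B
        "training_file": training_file,  # customer-generated training data
    }

def build_eval_job(model_id: str, eval_suite: str) -> dict:
    """Assemble a (hypothetical) evaluation request for a custom model."""
    return {
        "model": model_id,        # the fine-tuned model to assess
        "eval_suite": eval_suite, # Meta-provided evaluation suite (name assumed)
    }

job = build_finetune_job("llama-3.3-8b", "data/train.jsonl")
evaluation = build_eval_job("my-custom-llama", "meta-default-evals")
```

The point is the shape of the workflow the article describes, not the literal calls: train a custom model from a Llama 3.3 8B base, then hand the result to Meta's evaluations to gauge its quality.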
Meta said it won't use Llama API customer data to train the company's own models, and that models built using the Llama API can be transferred to another host.
For devs building on top of Meta's recently released Llama 4 models specifically, the Llama API offers model-serving options through partnerships with Cerebras and Groq. These "early experimental" options are "available by request" to help developers prototype their AI apps, Meta said.
"By simply selecting the Cerebras or Groq model names in the API, developers can (…) enjoy a streamlined experience with all usage tracked in one place," Meta wrote in a blog post provided to TechCrunch. "(W)e look forward to expanding partnerships with additional providers to bring even more options to build on top of Llama."
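If the Llama API follows the common chat-completions convention, "selecting the model name" would amount to a one-field change in the request body. This is a hypothetical sketch: the model identifiers and field names below are assumptions, not values Meta has documented:

```python
import json

# Hypothetical request builder. The field names follow the widespread
# chat-completions convention; the model identifiers are invented for
# illustration and are not Meta's documented names.
def build_chat_request(model: str, prompt: str) -> dict:
    return {
        "model": model,  # e.g. a Cerebras- or Groq-served Llama 4 variant
        "messages": [{"role": "user", "content": prompt}],
    }

# Switching serving partners would then be a one-field change:
req_groq = build_chat_request("llama-4-groq", "Hello")
req_cerebras = build_chat_request("llama-4-cerebras", "Hello")
print(json.dumps(req_groq, indent=2))
```

Under this reading, usage from both partners flows through the same API surface, which is what would let Meta track it "in one place."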
Meta said it will expand access to the Llama API "in the coming weeks and months."