Starting today, developers can access the latest Gemini models via the OpenAI Library and REST API, making it easier to get started with Gemini. We will initially support the Chat Completions API and Embeddings API, with plans for additional compatibility in the weeks and months ahead. You can read more in the Gemini API docs, and if you aren't already using the OpenAI libraries, we recommend that you call the Gemini API directly.
python
from openai import OpenAI
client = OpenAI(
    api_key="gemini_api_key",
    base_url="https://generativelanguage.googleapis.com/v1beta/"
)

response = client.chat.completions.create(
    model="gemini-1.5-flash",
    n=1,
    messages=[
        {"role": "system", "content": "You are a helpful assistant."},
        {
            "role": "user",
            "content": "Explain to me how AI works"
        }
    ]
)

print(response.choices[0].message)
nodejs
import OpenAI from "openai";
const openai = new OpenAI({
    apiKey: "gemini_api_key",
    baseURL: "https://generativelanguage.googleapis.com/v1beta/"
});

const response = await openai.chat.completions.create({
    model: "gemini-1.5-flash",
    messages: [
        { role: "system", content: "You are a helpful assistant." },
        {
            role: "user",
            content: "Explain to me how AI works",
        },
    ],
});

console.log(response.choices[0].message);
bash
curl "https://generativelanguage.googleapis.com/v1beta/chat/completions"
-H "Content material-Sort: software/json"
-H "Authorization: Bearer $gemini_api_key"
-d '{
"mannequin": "gemini-1.5-flash",
"messages": [
{"role": "user", "content": "Explain to me how AI works"}
]
}'
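The Embeddings API mentioned above is the other endpoint supported initially. As a rough illustration, here is a minimal Python sketch of what that call could look like through the OpenAI library, assuming the same base URL as the chat example and an embedding model name such as text-embedding-004; check the Gemini API docs for the exact model identifiers exposed through this endpoint.
python
from openai import OpenAI

# Same placeholder key and base URL as the chat completions example above.
client = OpenAI(
    api_key="gemini_api_key",
    base_url="https://generativelanguage.googleapis.com/v1beta/"
)

# "text-embedding-004" is an assumed model name for illustration; the
# supported embedding models are listed in the Gemini API docs.
response = client.embeddings.create(
    model="text-embedding-004",
    input="Explain to me how AI works"
)

# Each item in response.data carries the embedding vector for one input.
print(len(response.data[0].embedding))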
For a list of supported Gemini API parameters, you can read our API Reference. We're excited for more developers to get the chance to start building with Gemini and will have more updates to share soon. If you're a Vertex AI Enterprise customer, we also support OpenAI compatibility. Happy building!