Announcing Gemini Support!
We're thrilled to release BAML support for Gemini, Google AI's family of large language models!
The latest Gemini 1.5 Flash and Pro models offer an impressive context window of up to 1 million tokens, advanced safety and generation configurations, and a broader range of multimodal inputs, such as video and audio, outpacing the current capabilities of Anthropic's and OpenAI's public models.
Integrating with Gemini gives BAML users access to these advancements while keeping the structured outputs and type safety that BAML provides.
How to add Gemini to your BAML project:
- Generate a Google Generative AI API key.
- Add your API key to BAML's VSCode playground or save it as the `GOOGLE_API_KEY` environment variable.
- Instantiate a BAML client for Gemini:
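A minimal client definition might look like the sketch below. The client name and model choice are illustrative; check the BAML documentation for the exact provider name and option keys supported by your BAML version.

```baml
// Illustrative sketch: a BAML client backed by Gemini.
// "Gemini" and the model string are example choices, not requirements.
client<llm> Gemini {
  provider google-ai
  options {
    model "gemini-1.5-flash"
    api_key env.GOOGLE_API_KEY
  }
}
```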
- Craft your function using the Gemini client:
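As a sketch, a BAML function that returns a structured type from the client above could look like this. The `Recipe` class and `ExtractRecipe` function are hypothetical names for illustration.

```baml
// Illustrative sketch: a BAML function that extracts structured data
// via the Gemini client. Class and function names are hypothetical.
class Recipe {
  title string
  ingredients string[]
}

function ExtractRecipe(text: string) -> Recipe {
  client Gemini
  prompt #"
    Extract the recipe from this text:
    {{ text }}

    {{ ctx.output_format }}
  "#
}
```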
- And finally, effortlessly refine your prompt and data types in our playground and instantly use the generated functions from your Python, TypeScript, or Ruby code!
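For example, calling the hypothetical `ExtractRecipe` function defined above from Python might look like this sketch. It assumes you have run BAML's code generation, which produces the `baml_client` module; it will not run without that generated code and a valid API key.

```python
# Illustrative sketch: calling a BAML-generated function from Python.
# `baml_client` is the module BAML's code generation produces from your
# .baml files; `ExtractRecipe` is the hypothetical function sketched above.
from baml_client import b

recipe = b.ExtractRecipe("Combine 2 cups of flour with 1 cup of sugar...")
print(recipe.title)
print(recipe.ingredients)
```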