Boundary Blog
Learn about the latest updates to Boundary's toolkit
How to do tool-calling or function-calling with Gemini 2.0
How to do tool-calling or function-calling with o3-mini
Roadmap to BAML 1.0
Semantic Streaming
Type System
BAML Chat
VS Code LLM Playground 2.0
A proposal for a new way to build AI applications and agents
Join us on January 27-31
How to do tool-calling or function-calling with DeepSeek R1
How to do function calling or tool calls with OpenAI o1
BAML is now supported in Cursor
BAML now supports parsing triple-backtick code blocks in LLM outputs
Use BAML to evaluate your LLM applications regardless of the language you use to call them
Modify LLM response models at runtime.
A survey of every framework for extracting structured output from LLMs, and how they compare.
A new technique for streaming structured output from LLMs
BAML now integrates with OpenAPI, allowing you to call BAML functions from any language.
Getting structured output out of Ollama, using novel parsing techniques.
We leveraged a novel technique, schema-aligned parsing, to achieve SOTA on BFCL with every LLM.
A technical explanation of every way to extract structured data from an LLM
Exposing the inner workings of BAML
An overview of the work that goes into building a new programming language.
Capturing Non-Text Information and Richer Context with LLMs
Applying structure to Gemini output with BAML
How to do RAG with Ruby streaming AI APIs
How to do RAG with NextJS streaming AI APIs
A deep dive into using type definitions instead of JSON schemas in prompt engineering to improve accuracy and reduce costs
BAML is a lightweight programming language to help perform structured prompting in a typesafe way.
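As a rough illustration of what "structured prompting in a typesafe way" means, here is a minimal BAML sketch; the class, function, and field names are illustrative examples, not taken from any post above:

```baml
// A typed schema for the data we want the LLM to return
class Resume {
  name string
  skills string[]
}

// A BAML function: typed input, typed output, prompt template inline
function ExtractResume(resume_text: string) -> Resume {
  client "openai/gpt-4o"
  prompt #"
    Extract the following resume:

    {{ resume_text }}

    {{ ctx.output_format }}
  "#
}
```

Calling `ExtractResume` from generated client code returns a parsed, type-checked `Resume` rather than raw JSON text.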