Dec 6, 2024
Structured outputs with OpenAI o1
OpenAI recently announced o1, its new model built for harder reasoning tasks.
BAML is our prompting framework that lets you do function calling and tool calling with any LLM, even when the model doesn't officially support those features.
How to use OpenAI o1 with BAML
Let's look at an example!
Classifying messages with o1
First we write some BAML code to classify messages: a function that takes a message and returns a category, and is executed by an LLM. BAML parses the LLM's raw output into the right enum type for you.
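Here's a minimal sketch of what that BAML function could look like. The category names and the "openai/o1" client string are assumptions for illustration; swap in your own categories and client configuration.

// Illustrative categories for this sketch
enum Category {
  CancelSubscription
  RefundRequest
  TechnicalSupport
  Other
}

function ClassifyMessage(message: string) -> Category {
  // Shorthand client reference (assumed); point this at your own client definition.
  client "openai/o1"
  prompt #"
    Classify the user's message into one of the categories.

    {{ ctx.output_format }}

    {{ _.role("user") }}
    {{ message }}
  "#
}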
You can run this in Python like this:
from baml_client import b
response = b.ClassifyMessage(message="I want to cancel my subscription")
print(response)  # the parsed Category enum value
Tool calling with o1
BAML lets you define tools as ordinary types and pick between them with a union. Here's another example with two tools: one that gets the weather and one that sends an email.
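Here's a sketch of what those tool definitions and the ChooseOneTool function could look like. The field descriptions, the emailBody field, and the "openai/o1" client string are assumptions; the field names match the Python example below.

class WeatherAPI {
  city string @description("The city to get the weather for")
  timeOfDay string @description("The requested time, as an ISO8601 timestamp")
}

class SendEmail {
  emailTo string
  emailSubject string
  emailBody string
}

function ChooseOneTool(user_message: string) -> WeatherAPI | SendEmail {
  // Shorthand client reference (assumed); swap in your own client definition.
  client "openai/o1"
  prompt #"
    Choose the right tool for the user's request and fill in its fields.

    {{ ctx.output_format }}

    {{ _.role("user") }}
    {{ user_message }}
  "#
}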
You can call this in Python like this (we also support other languages!):
from baml_client import b
from baml_client.types import WeatherAPI, SendEmail
response = b.ChooseOneTool(user_message="what's the weather in san francisco tomorrow april 23rd 2024 at 3pm?")
print(response)
if isinstance(response, WeatherAPI):
    print("Weather API called:")
    print(f"City: {response.city}")
    print(f"Time of Day: {response.timeOfDay}")
elif isinstance(response, SendEmail):
    print("SendEmail called:")
    print(f"To: {response.emailTo}")
    print(f"Subject: {response.emailSubject}")