If you don't have it installed already, install the LangChain OpenAI package using pip:
pip install langchain_openai
Set the OpenAI base URL to the AI Router base URL https://api.airouter.io in your code, and set the API key to the one you generated. Then call the OpenAI methods as you normally would:
from langchain_openai import ChatOpenAI

llm = ChatOpenAI(
    base_url="https://api.airouter.io",
    api_key="<THE-API-KEY-YOU-GENERATED>",
    model="gpt-4o-mini",
)
This example now uses the AI Router to select the most appropriate model for each request, with gpt-4o-mini as the default model. You can also omit the model entirely, specify a different one, or provide a list of models for the AI Router to choose from.
If you want to pass additional parameters, as described in Model Selection or Weighting, add the extra_body parameter to model_kwargs:
llm = ChatOpenAI(
    model="auto",
    base_url="https://api.airouter.io",
    api_key="<THE-API-KEY-YOU-GENERATED>",
    model_kwargs={
        "extra_body": {
            "models": ["gpt-4o-mini", "gpt-4o"],
            "weighting": {
                "quality": 1.0,
                "costs": 1.0,
                "latency": 1.0,
            },
        }
    },
)
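For context, the OpenAI-compatible client merges any extra_body fields into the top level of the JSON body of the chat-completions request, next to the standard fields. A minimal sketch of the resulting payload (the exact wire format is an assumption based on that merge behavior, not something verified against the AI Router API):

```python
import json

# Base chat-completions payload as the client would build it (sketch).
payload = {
    "model": "auto",
    "messages": [{"role": "user", "content": "Hello"}],
}

# Fields from extra_body are merged into the top level of the request body.
extra_body = {
    "models": ["gpt-4o-mini", "gpt-4o"],
    "weighting": {"quality": 1.0, "costs": 1.0, "latency": 1.0},
}
payload.update(extra_body)

print(json.dumps(payload, indent=2))
```

This is why the AI Router can read models and weighting directly from the request body even though they are not standard OpenAI parameters.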