By default, the AI Router considers all available models when determining the most appropriate model for a request. To restrict the AI Router to only certain models, pass a list of model names for it to consider. First, make sure everything is set up as described in Quickstart. The list of available models can be found in Available Models.
Here are examples of how to limit the models used:
Using OpenAI:
...
client.chat.completions.create(
    model="auto",
    messages=[<omitted>],
    extra_body={
        'models': ['mistral-large-2402', 'gpt-4-0125-preview']
    }
)
Using langchain:
llm = ChatOpenAI(
    model="auto",
    base_url='https://api.airouter.io',
    model_kwargs={
        'extra_body': {
            'models': ['mistral-large-2402', 'gpt-4-0125-preview'],
        }
    }
)
Using langchain.js:
import { ChatOpenAI } from '@langchain/openai';
import { PromptTemplate } from '@langchain/core/prompts';
const llm = new ChatOpenAI({
  model: 'auto',
  apiKey: '<THE-API-KEY-YOU-GENERATED>',
  configuration: { baseURL: 'https://api.airouter.io' },
  modelKwargs: {
    models: ['mistral-large-2402', 'gpt-4-0125-preview'],
  },
});
const prompt = PromptTemplate.fromTemplate(
'What is the capital of France?',
);
const chain = prompt.pipe(llm);
const result = await chain.invoke({});
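All three clients above produce the same effect: the `models` list is merged into the JSON body of the chat-completions request alongside the standard fields. As a minimal sketch (assuming the AI Router accepts an OpenAI-compatible request body), the payload looks like this:

```python
import json

# Hypothetical request body; the "models" key restricts routing,
# the remaining fields follow the standard chat-completions schema.
payload = {
    "model": "auto",  # let the AI Router pick from the subset below
    "messages": [
        {"role": "user", "content": "What is the capital of France?"}
    ],
    # Only these models are considered during routing:
    "models": ["mistral-large-2402", "gpt-4-0125-preview"],
}

print(json.dumps(payload, indent=2))
```

This is why `extra_body` (Python) and `modelKwargs` (JavaScript) are the right hooks: both inject additional top-level keys into the request body that the standard client types do not model.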