LiteLLM Integration
Learn how to integrate LiteLLM agents with the Scenario testing framework
LiteLLM agents work seamlessly with Scenario through the `AgentAdapter` interface. LiteLLM provides a unified interface to 100+ LLMs, making it ideal for testing agents across different models.
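Because LiteLLM normalizes every provider behind the same `completion` call and OpenAI-style message format, switching the model under test is just a change of the model string. A minimal sketch (the model identifiers below are illustrative examples of LiteLLM's provider/model naming):

```python
import litellm

# The same call shape works across providers; only the model string changes.
for model in ["openai/gpt-4.1-mini", "anthropic/claude-3-5-sonnet-20240620"]:
    response = litellm.completion(
        model=model,
        messages=[{"role": "user", "content": "Suggest a quick vegetarian dinner."}],
    )
    print(model, response.choices[0].message.content)
```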
Basic Integration Pattern
You can integrate LiteLLM directly, as its messages are already in the OpenAI format:
```python
import scenario
import litellm


class LiteLLMAgentAdapter(scenario.AgentAdapter):
    async def call(self, input: scenario.AgentInput) -> scenario.AgentReturnTypes:
        response = litellm.completion(
            model="openai/gpt-4.1-mini",
            messages=[
                {
                    "role": "system",
                    "content": """
                        You are a vegetarian recipe agent.
                        Given the user request, ask AT MOST ONE follow-up question,
                        then provide a complete recipe. Keep your responses concise and focused.
                    """,
                },
                # Forward the conversation history Scenario has accumulated so far
                *input.messages,
            ],
        )

        # The message is already in OpenAI format, which Scenario accepts directly
        return response.choices[0].message  # type: ignore
```
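Once the adapter is defined, it can be dropped into a simulation like any other Scenario agent. The sketch below assumes the `scenario.run` API with a simulated user and judge, as covered in Scenario Basics; the test name, description, and judge criteria are illustrative:

```python
import pytest
import scenario


@pytest.mark.asyncio
async def test_vegetarian_recipe_agent():
    result = await scenario.run(
        name="vegetarian dinner",
        description="The user asks for a quick vegetarian dinner recipe.",
        agents=[
            LiteLLMAgentAdapter(),
            scenario.UserSimulatorAgent(),
            scenario.JudgeAgent(
                criteria=[
                    "The agent asks at most one follow-up question",
                    "The agent provides a complete vegetarian recipe",
                ]
            ),
        ],
    )

    assert result.success
```

Because the adapter only depends on LiteLLM's model string, the same test can be re-run against other providers by changing `model=` in the adapter.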
Full Example Project
For a complete working example with a vegetarian recipe agent, check out the Scenario examples.
Next Steps
- Explore Scenario Basics for advanced testing patterns
- Learn about Scripted Simulations for precise control
- Check out more Agent Integration patterns for other frameworks