OpenRouter Provider
Access 100+ models from multiple providers through OpenRouter's unified gateway using SimplerLLM.
Overview
OpenRouter acts as a unified gateway to access models from multiple providers (OpenAI, Anthropic, Google, Meta, and more) through a single API. SimplerLLM makes it easy to use any model available on OpenRouter.
What is OpenRouter?
OpenRouter provides access to 100+ models from different providers through a single API endpoint. You can switch between GPT-4, Claude, Llama, Gemini, and many others without changing your code - just change the model name.
Using OpenRouter Models
Model Examples
# OpenAI models through OpenRouter
model_name="openai/gpt-4o"
# Anthropic models through OpenRouter
model_name="anthropic/claude-3-5-sonnet"
# Google models through OpenRouter
model_name="google/gemini-pro"
# Meta models through OpenRouter
model_name="meta-llama/llama-3-70b"
# Any model from OpenRouter's catalog
Finding Available Models
Browse all 100+ available models in OpenRouter's model directory at openrouter.ai/models. New models are added regularly.
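If you prefer to query the catalog programmatically, OpenRouter also exposes a public models endpoint. The sketch below calls it with the requests library directly (outside SimplerLLM) and assumes the response body contains a data list whose entries carry the model id you would pass as model_name:
import requests

# Fetch the current model catalog from OpenRouter's public API
resp = requests.get("https://openrouter.ai/api/v1/models", timeout=30)
resp.raise_for_status()

models = resp.json().get("data", [])

# Print the first ten model identifiers (the slugs used as model_name)
for model in models[:10]:
    print(model["id"])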
Setup and Authentication
Get Your API Key
Sign up at openrouter.ai and create an API key from your account dashboard.
Configure Environment Variables
# .env file
OPENROUTER_API_KEY=your-openrouter-api-key-here
# Optional: Add your site info for better rankings
OPENROUTER_SITE_URL=https://your-site.com
OPENROUTER_SITE_NAME=Your App Name
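SimplerLLM reads the API key from the environment. If your setup does not load .env files automatically, a common approach is python-dotenv; a minimal sketch:
import os
from dotenv import load_dotenv  # pip install python-dotenv

# Load variables from the .env file into the process environment
load_dotenv()

# Sanity-check that the key is visible (never print the key itself)
assert os.getenv("OPENROUTER_API_KEY"), "OPENROUTER_API_KEY is not set"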
Basic Usage
from SimplerLLM.language.llm import LLM, LLMProvider

# Create an OpenRouter LLM instance
llm = LLM.create(
    provider=LLMProvider.OPENROUTER,
    model_name="openai/gpt-4o"  # Use any OpenRouter model
)

# Generate a response
response = llm.generate_response(
    prompt="Explain machine learning in simple terms"
)

print(response)
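Depending on your SimplerLLM version, generate_response also accepts common generation settings. The parameter names below (system_prompt, max_tokens, temperature) are assumptions based on typical SimplerLLM usage; check the library's signature if they differ:
# Assumed optional parameters; verify against your SimplerLLM version
response = llm.generate_response(
    prompt="Explain machine learning in simple terms",
    system_prompt="You are a concise technical writer",
    max_tokens=300,
    temperature=0.7
)

print(response)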
Switching Models
# Switch to Claude
llm = LLM.create(
    provider=LLMProvider.OPENROUTER,
    model_name="anthropic/claude-3.5-sonnet"
)

# Switch to Llama
llm = LLM.create(
    provider=LLMProvider.OPENROUTER,
    model_name="meta-llama/llama-3-70b-instruct"
)

# Switch to Gemini
llm = LLM.create(
    provider=LLMProvider.OPENROUTER,
    model_name="google/gemini-pro"
)

# All use the same SimplerLLM interface!
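Because the interface stays the same, comparing providers is a short loop. A minimal sketch (the slugs below are examples; check the catalog for current names):
from SimplerLLM.language.llm import LLM, LLMProvider

model_slugs = [
    "openai/gpt-4o",
    "anthropic/claude-3.5-sonnet",
    "meta-llama/llama-3-70b-instruct",
]

prompt = "Give one sentence on why unit tests matter."

# Run the same prompt against each model and print the answers side by side
for slug in model_slugs:
    llm = LLM.create(provider=LLMProvider.OPENROUTER, model_name=slug)
    print(f"--- {slug} ---")
    print(llm.generate_response(prompt=prompt))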
Advanced Features
Structured JSON Output
from pydantic import BaseModel, Field
from SimplerLLM.language.llm import LLM, LLMProvider
from SimplerLLM.language.llm_addons import generate_pydantic_json_model

class Summary(BaseModel):
    title: str = Field(description="Title")
    points: list[str] = Field(description="Key points")

llm = LLM.create(
    provider=LLMProvider.OPENROUTER,
    model_name="openai/gpt-4o"  # Or any other model
)

result = generate_pydantic_json_model(
    llm_instance=llm,
    prompt="Summarize the benefits of cloud computing",
    model_class=Summary
)

print(f"Title: {result.title}")
Pricing Considerations
OpenRouter charges based on the model you use, and each model has its own rates. View current pricing in OpenRouter's model directory at openrouter.ai/models.
Cost Optimization
- Compare model pricing in OpenRouter's directory
- Use cheaper models for simple tasks (see the routing sketch after this list)
- OpenRouter often offers competitive pricing vs. direct provider access
- Set up billing alerts in your OpenRouter dashboard
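One way to apply the cheaper-models-for-simple-tasks idea is a small routing helper. The slugs and the length-based heuristic below are illustrative assumptions, not OpenRouter recommendations:
from SimplerLLM.language.llm import LLM, LLMProvider

# Illustrative slugs; verify current names and prices in the model directory
CHEAP_MODEL = "openai/gpt-4o-mini"
STRONG_MODEL = "openai/gpt-4o"

def answer(prompt: str) -> str:
    # Naive heuristic: route short prompts to the cheaper model
    model_name = CHEAP_MODEL if len(prompt) < 500 else STRONG_MODEL
    llm = LLM.create(provider=LLMProvider.OPENROUTER, model_name=model_name)
    return llm.generate_response(prompt=prompt)

print(answer("What is a vector database?"))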
Best Practices
1. Explore Different Models
Try various models to find the best fit for your use case
2. Compare Pricing
Check OpenRouter's pricing - it can be more cost-effective
3. Use Site Info
Set OPENROUTER_SITE_URL and OPENROUTER_SITE_NAME so your app is attributed in OpenRouter's rankings
4. Implement Error Handling
Wrap API calls in try-except blocks, as shown in the sketch below
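A minimal sketch of the error-handling practice above. The exception types depend on your SimplerLLM version, so a broad except is shown; narrow it to the library's exceptions where possible:
from SimplerLLM.language.llm import LLM, LLMProvider

llm = LLM.create(provider=LLMProvider.OPENROUTER, model_name="openai/gpt-4o")

try:
    response = llm.generate_response(prompt="Summarize the GDPR in one paragraph")
    print(response)
except Exception as exc:  # replace with the library's exception types if available
    print(f"OpenRouter request failed: {exc}")
    # Retry with backoff, fall back to another model, or surface the error here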