OpenRouter Chat Model
Access multiple LLM providers through OpenRouter's unified API
Written By pvdyck
Connects to OpenRouter's unified API gateway to access models from multiple providers (Anthropic, OpenAI, Meta, Mistral, Google, and more) through a single credential. This is the recommended model node for most workflows.
How It Works
OpenRouter provides a unified API endpoint that routes your requests to the selected model provider. It handles provider-level fallbacks and cost optimization automatically. You specify a model ID (e.g., anthropic/claude-sonnet-4) and OpenRouter routes the request to the best available provider for that model.
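The routing described above uses an OpenAI-compatible request schema: you name the model with a provider/model ID and OpenRouter picks the upstream provider. A minimal sketch of the request body (field names follow OpenRouter's chat-completions schema; the message content is illustrative):

```python
import json

# Sketch of a chat-completion request body for OpenRouter's
# OpenAI-compatible endpoint. The "model" field is the provider/model
# ID; OpenRouter routes the request and handles provider fallbacks.
payload = {
    "model": "anthropic/claude-sonnet-4",  # provider/model ID
    "messages": [
        {"role": "user", "content": "Summarize this ticket in one sentence."}
    ],
}

# The body is sent as JSON; here we just serialize it to show the shape.
print(json.dumps(payload, indent=2))
```

Swapping providers is a one-line change to the `model` field; the rest of the request stays the same.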
Authentication
Requires an OpenRouter API key configured via Secure Vault. Pricing varies by model; check openrouter.ai/models for current rates.
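OpenRouter authenticates requests with a Bearer token in the Authorization header. A minimal sketch (the `OPENROUTER_API_KEY` environment variable name is an assumption for illustration; in this tool the key is supplied by Secure Vault):

```python
import os

# Hedged sketch: OpenRouter expects a Bearer token.
# "OPENROUTER_API_KEY" is an assumed env-var name for this example;
# the placeholder default is not a real key.
api_key = os.environ.get("OPENROUTER_API_KEY", "sk-or-placeholder")

headers = {
    "Authorization": f"Bearer {api_key}",
    "Content-Type": "application/json",
}
```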
Parameters
Key Features
Popular Models
Connects To
Limitations
- Must be connected to a parent node; it cannot run standalone.
- No streaming support in the Worker environment.
- Token pricing varies significantly by model; monitor costs.
- Some models may have rate limits or availability constraints.
Tips
- Start with anthropic/claude-sonnet-4 for the best overall quality in agent workflows and tool calling.
- Use openai/gpt-4o-mini for high-volume, cost-sensitive workloads.
- Browse available models at openrouter.ai/models to compare pricing and capabilities.
- OpenRouter is the simplest way to try different models without managing multiple API keys.
- Set Temperature to 0 for data extraction and classification tasks.
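The temperature tip above can be sketched as a request body (field names follow OpenRouter's OpenAI-compatible schema; the prompt text is illustrative):

```python
import json

# Hedged sketch: temperature 0 makes sampling greedy, which suits
# extraction and classification tasks where you want repeatable output.
payload = {
    "model": "openai/gpt-4o-mini",
    "temperature": 0,
    "messages": [
        {
            "role": "system",
            "content": "Classify the sentiment as positive, negative, or neutral.",
        },
        {"role": "user", "content": "The product arrived late and damaged."},
    ],
}

print(json.dumps(payload, indent=2))
```

For creative or open-ended generation, leave temperature at its default instead.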