Anthropic Chat Model
Use Anthropic's Claude models as the LLM in your AI agent workflows
Written By pvdyck
Last updated about 5 hours ago
Connects to Anthropic's Claude API to provide LLM capabilities for Agent and Chain LLM workflows.
Authentication
Requires an Anthropic API key configured via Secure Vault. Token usage counts toward your Anthropic API billing.
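As a rough sketch of what the node does under the hood, the snippet below builds a request for Anthropic's Messages API. The key value and model id are placeholders, not values from this document; the real key is injected from Secure Vault at runtime.

```python
import json

# Placeholder key; in the workflow the real key is read from Secure Vault.
API_KEY = "sk-ant-..."

# Headers the Anthropic Messages API expects. Calls made with this key
# count toward that account's Anthropic API billing.
headers = {
    "x-api-key": API_KEY,
    "anthropic-version": "2023-06-01",
    "content-type": "application/json",
}

# Minimal single-turn request body. The model id is an assumption;
# availability depends on your Anthropic API plan.
body = {
    "model": "claude-sonnet-4-20250514",
    "max_tokens": 256,
    "messages": [{"role": "user", "content": "Hello, Claude"}],
}

payload = json.dumps(body)  # sent as POST https://api.anthropic.com/v1/messages
```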
Parameters
Available Models
Model availability depends on your Anthropic API plan.
Options
Connects To
Limitations
- Must be connected to a parent node; it cannot run standalone.
- No streaming support in the Worker environment.
- Token limits vary by model (check Anthropic docs for current limits).
Tips
- Use Claude 4 Sonnet for most workflows; it offers the best cost/performance ratio.
- Set Temperature to 0 for deterministic, reproducible outputs (e.g., data extraction, classification).
- Set Temperature to 0.7-1.0 for creative tasks (e.g., content generation).
- Consider using OpenRouter Chat Model instead for access to Claude plus other providers through a single credential.
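The temperature tips above can be sketched as request bodies. This is a minimal illustration, not the node's actual implementation; the model id and the helper name `build_request` are assumptions.

```python
def build_request(prompt: str, temperature: float) -> dict:
    """Build a minimal Messages API body; model id is an assumption."""
    return {
        "model": "claude-sonnet-4-20250514",
        "max_tokens": 512,
        "temperature": temperature,
        "messages": [{"role": "user", "content": prompt}],
    }

# Temperature 0 for deterministic tasks such as extraction or classification.
extract = build_request("Classify this support ticket: ...", temperature=0.0)

# Higher temperature (0.7-1.0) for creative tasks such as content generation.
creative = build_request("Write a product tagline.", temperature=0.7)
```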