Chain LLM Node
Connect an LLM to a prompt template for text generation.
Written By pvdyck
Last updated 18 days ago
🧪 Labs: Experimental, may change or break without notice
What It Does
Sends a prompt to a connected LLM and returns the response. Use it for text generation, summarization, or any task where you need a single LLM call per item.
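A minimal sketch of that per-item behavior, assuming a generic model client (the names `callLlm`, `chainLlm`, and the `Item` shape are hypothetical, not the node's actual internals):

```typescript
type Item = { json: Record<string, unknown> };

// Placeholder model client; stands in for whatever LLM is connected.
async function callLlm(prompt: string): Promise<string> {
  return `echo: ${prompt}`;
}

// One LLM call per input item: render the prompt template, send it,
// and attach the response to the item.
async function chainLlm(
  items: Item[],
  template: (item: Item) => string,
): Promise<Item[]> {
  const out: Item[] = [];
  for (const item of items) {
    const text = await callLlm(template(item)); // single call per item
    out.push({ json: { ...item.json, text } });
  }
  return out;
}
```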
Connections
Compatibility
Error Handling
When the LLM encounters issues, the node provides clear error messages:
- Timeout (HTTP 524/504) → "AI model timed out" with retry guidance
- Rate limit (HTTP 429) → "AI model rate limited" with wait guidance
- Unavailable (HTTP 502/503) → "AI model temporarily unavailable"
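The mapping above could be sketched as follows; the function name and the default branch are assumptions for illustration, not the node's actual code:

```typescript
// Translate an HTTP status from the model provider into the
// error message shown by the node (per the list above).
function llmErrorMessage(status: number): string {
  switch (status) {
    case 524:
    case 504:
      return "AI model timed out"; // retry guidance applies
    case 429:
      return "AI model rate limited"; // wait before retrying
    case 502:
    case 503:
      return "AI model temporarily unavailable";
    default:
      return `AI model request failed (HTTP ${status})`; // assumed fallback
  }
}
```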
If Continue on Fail is enabled, failed items are returned with an error field instead of stopping the workflow.
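A sketch of the Continue on Fail behavior under the same assumed item shape (the function and its signature are hypothetical):

```typescript
type Item = { json: Record<string, unknown> };

// With continueOnFail, a failing item is returned with an `error`
// field and processing moves on; otherwise the error stops the run.
async function processWithContinueOnFail(
  items: Item[],
  run: (item: Item) => Promise<string>,
  continueOnFail: boolean,
): Promise<Item[]> {
  const out: Item[] = [];
  for (const item of items) {
    try {
      out.push({ json: { ...item.json, text: await run(item) } });
    } catch (err) {
      if (!continueOnFail) throw err; // stop the workflow
      out.push({ json: { ...item.json, error: (err as Error).message } });
    }
  }
  return out;
}
```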
Status
This integration has been tested with basic operations on indie.money. Some advanced features may not work as expected. Report issues if you encounter them.