LLM Plugin is a Node-RED sidebar extension for chatting with LLMs, generating/modifying flows, and importing results into the active tab.

Add from "Manage palette", or install with npm: `npm install @background404/node-red-contrib-llm-plugin`. Restart Node-RED after installation.
- Open the LLM Plugin sidebar.
- Configure the provider in Settings:
  - Ollama: set the URL (default `http://localhost:11434`)
  - OpenAI: set the API key
- Pick which flow tabs to include via the flow selector (defaults to Current Open Flow; check additional tabs in the dropdown to send them too).
- Select Agent mode for auto-apply, or Ask mode for manual import.
- Enter model and prompt.
- Click Send to generate and/or apply the flow.
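The flows the plugin generates and imports are standard Node-RED JSON. A minimal sketch of that format is shown below; all ids, names, and coordinates are illustrative, and real model output varies:

```javascript
// A minimal flow in raw Node-RED JSON: an array of node objects whose
// "wires" arrays describe the connections between them.
const flow = [
  { id: "tab1", type: "tab", label: "LLM Demo" },
  {
    id: "inject1", type: "inject", z: "tab1", name: "tick",
    repeat: "5", x: 120, y: 80, wires: [["debug1"]] // inject1 -> debug1
  },
  {
    id: "debug1", type: "debug", z: "tab1", name: "log",
    active: true, tosidebar: true, x: 300, y: 80, wires: []
  }
];

console.log(flow.map(n => n.type).join(" -> ")); // prints "tab -> inject -> debug"
```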
It is highly recommended to add custom or non-core nodes to your flow before passing them to the LLM. The LLM does not inherently know the required properties of custom nodes, so keeping a small sample flow in the active tab ensures it is sent as the Current Open Flow. The model then follows real node/property patterns from that sample instead of relying on fixed per-node prompt rules.
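For instance, a single sample instance of a custom node left in the active tab gives the model concrete property names to copy. The node type and properties below are purely hypothetical:

```javascript
// Hypothetical custom node kept in the active tab as a sample.
// The model can copy real property names (topic, interval, broker, ...)
// from an instance like this instead of guessing them.
const sampleNode = {
  id: "sample1",
  type: "my-custom-sensor",  // hypothetical non-core node type
  z: "tab1",
  topic: "sensors/temp",
  interval: 60,
  broker: "brokerConfig1",   // reference to a config node
  x: 200, y: 120,
  wires: [[]]
};

// Property names the LLM can learn from this sample:
console.log(Object.keys(sampleNode).join(", "));
```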
- Chat history: conversations are persisted on the server and can be loaded, deleted, or continued across sessions.
- Checkpoint / Restore: flow snapshots are saved before and after each import, allowing rollback to any previous state.
- Custom system prompt: add persistent instructions (preferred node types, coding style, language) via Settings.
- Supports Vibe Schema and raw Node-RED JSON.
- Accepts mixed response text + JSON (with or without code fences).
- Parses robustly even when function code inside JSON strings contains comment tokens.
- Agent mode seamlessly handles connection updates and LLM-driven deletions.
- Apply strategy is model-driven via a top-level `applyMode` (`edit-only`, `merge`, `overwrite`, `delete-only`, with a safe fallback to `edit-only`).
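The parsing and fallback behavior described above can be sketched as follows. The `nodes` key and the extraction logic here are illustrative assumptions, not the plugin's actual implementation (see src/README.md for that):

```javascript
// Sketch: extract a fenced JSON block from a mixed text + JSON response,
// then validate applyMode with the safe fallback to "edit-only".
const fence = "`".repeat(3); // built indirectly to keep this example readable
const response = [
  "Here is the updated flow.",
  "",
  fence + "json",
  '{ "applyMode": "merge", "nodes": [] }',
  fence
].join("\n");

// Match the fenced JSON block (the plugin's real parser is more robust).
const re = new RegExp(fence + "json\\s*([\\s\\S]*?)" + fence);
const parsed = JSON.parse(response.match(re)[1]);

const validModes = ["edit-only", "merge", "overwrite", "delete-only"];
const applyMode = validModes.includes(parsed.applyMode)
  ? parsed.applyMode
  : "edit-only"; // safe fallback
console.log(applyMode); // prints "merge"
```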
- Implementation guide: src/README.md
- Prompt template: src/prompt_system.txt
When sharing your Node-RED project (e.g., via Git, exporting the user directory, or sharing the environment), please be aware that this plugin saves your API keys (like OpenAI) using the standard Node-RED settings API (RED.settings).
Depending on your Node-RED configuration, these settings may be stored in files such as flows_cred.json or .config.json within your Node-RED user directory.
This is not unique to this plugin; any Node-RED node or plugin storing credentials works similarly.
Always ensure you ignore these credential files in your .gitignore and do not share them publicly to prevent accidental leakage of your private API keys. The plugin itself masks keys in the UI and redacts them from logs, but the underlying storage file on your disk remains sensitive.
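For example, entries along these lines in your `.gitignore` keep the credential files out of version control (the exact file names depend on your Node-RED storage configuration):

```gitignore
# Node-RED files that may contain credentials such as API keys
flows_cred.json
.config.json
```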
- This plugin is under active development.
- Model output quality varies by model and prompt.
Please report issues at: GitHub Issues
My article (in Japanese): 『Developing a Node-RED Plugin, Part 2 (LLM Plugin v0.4.0)』

