This project transforms static Blender files into agentic tools. By bridging OpenClaw with Blender's Python API (bpy), an LLM can autonomously manipulate procedural node parameters (such as noise scale, roughness, or color intensity) and trigger high-quality Cycles renders based on natural language prompts.
The goal is to move away from manual "slider-pushing." Instead of opening Blender to tweak a material, you can tell the agent:
"Generate a render where the texture is much grainier and the surface looks like wet marble."
The agent identifies the correct Value Nodes, updates their defaults, and returns a rendered image for review.
- CYCLES: Production quality, photorealistic (slower, 30 min timeout)
- EEVEE: Fast real-time rendering (testing, 1 min timeout)
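The engine/timeout pairing above can be expressed as a simple mapping (a sketch; `CYCLES` and `BLENDER_EEVEE` are Blender's internal engine identifiers, but how this skill actually stores its limits is an assumption, not documented here):

```python
# Engine-to-timeout mapping mirroring the list above (illustrative only).
ENGINE_TIMEOUTS = {
    "CYCLES": 30 * 60,        # production quality, photorealistic: 30 min
    "BLENDER_EEVEE": 1 * 60,  # fast real-time preview: 1 min
}

def timeout_for(engine: str) -> int:
    """Look up the render timeout (in seconds) for a given engine."""
    return ENGINE_TIMEOUTS[engine]
```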
This repository is designed as an installable Python package for OpenClaw agents:
- `src/synthclaw/`: Contains the OpenClaw execution wrappers (`blender_skill.py`, `analyze_skill.py`) that trigger background system processes.
- `scripts/`: Internal Blender execution code (`agent_bridge.py`, `analyze_blends.py`). These are loaded directly into Blender by the wrappers.
- `config/`: JSON tool schemas provided to your LLM agent (`render_schema.json`, `analyze_schema.json`).
- `assets/`: Contains `low.blend` and `high.blend`, required for testing agent configurations.
- `tests/`: Automated unit tests for validating Blender environment setup and output.
- `pyproject.toml`: Local package definition.
For the agent to "see" your controls, you must explicitly name your Value Nodes:
- Open your `.blend` file.
- In the Shader Editor, add an **Input > Value** node.
- Connect it to your procedural math/texture inputs.
- Critical Step: Select the Value node, press `N` for the sidebar, and under **Item > Name**, set a unique name (e.g., `GrainScale`).
- Note: The script looks at the internal **Name**, not just the Label.
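The lookup this naming enables can be sketched as follows. The helper below is illustrative, not the project's exact API; plain dicts stand in for `bpy` node objects so the logic is easy to follow:

```python
# Find a Value node by its internal Name (node.name), never its Label.
# Inside Blender the equivalent walk over a material is roughly:
#   node = material.node_tree.nodes.get("GrainScale")
#   if node and node.type == 'VALUE':
#       node.outputs[0].default_value = 2.5
def find_value_node(nodes, target_name):
    """Return the first Value node whose internal name matches target_name."""
    for node in nodes:
        if node.get("type") == "VALUE" and node.get("name") == target_name:
            return node
    return None
```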
Ensure `blender` is on your system `$PATH`. Test this by running:

```shell
blender --version
```
Requirement: Version 4.0.0 or higher.
- Place `agent_bridge.py` in your project root or a known scripts directory.
- Register the `blender_skill.py` function within your OpenClaw worker.
- Update the `blend_file` path in your tool call to point to your specific `.blend` assets.
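A tool call following this setup might look like the payload below. This is a sketch: only `blend_file` and the named-node parameters are implied by this README, and the other field names are assumptions:

```python
# Hypothetical payload an OpenClaw agent could send to the render skill.
tool_call = {
    "blend_file": "assets/low.blend",   # point this at your own .blend asset
    "engine": "CYCLES",                 # or EEVEE for fast test renders
    "parameters": {"GrainScale": 2.5},  # named Value nodes to override
}
```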
- User Input: "Make the displacement more aggressive."
- LLM Reasoning: The LLM looks at the tool schema, sees a parameter named `DisplacementStrength`, and decides to increase its value from `1.0` to `2.5`.
- Execution: OpenClaw runs Blender in Headless Mode (`-b`).
- Modification: `agent_bridge.py` iterates through all materials, finds the node named `DisplacementStrength`, and updates it.
- Output: A `.png` is rendered and its path is returned to the user.
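Under the hood, the Execution step amounts to building a headless Blender command. The `-b` and `--python` flags are standard Blender CLI; the arguments after `--` are hypothetical and depend on how `agent_bridge.py` reads `sys.argv`:

```python
# Construct the headless render command. Blender ignores everything after
# "--", leaving those arguments for agent_bridge.py to parse from sys.argv.
cmd = [
    "blender", "-b", "assets/low.blend",
    "--python", "scripts/agent_bridge.py",
    "--",
    "--node", "DisplacementStrength", "--value", "2.5",
]
# The wrapper would then run it with the engine's timeout, e.g.:
# import subprocess
# subprocess.run(cmd, check=True, timeout=30 * 60)  # 30 min for Cycles
```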
Before rendering, the agent can run `analyze_blend`. This script parses the `.blend` file without launching the GUI, examining poly count, material configurations, lighting rigs, and render settings to produce a Complexity & Realism Score. This gives agents foresight into the visual quality they can generate.
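As a toy illustration of the scoring idea only (the real heuristics inside `analyze_blends.py` are not documented in this README and certainly differ):

```python
def complexity_score(poly_count: int, material_count: int, light_count: int) -> float:
    """Toy 0-100 score: more geometry, materials, and lights raise the score."""
    geometry = min(poly_count / 10_000, 40)  # capped at 40 points
    materials = min(material_count * 5, 30)  # capped at 30 points
    lighting = min(light_count * 10, 30)     # capped at 30 points
    return round(geometry + materials + lighting, 1)
```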
When generating outputs, the tool supports quantitative evaluation of the rendered image:
- Naturalness (GraNatPy): When `compute_metrics: true` is passed, the skill leverages `granatpy` to compute the naturalness factor of the generated image.
- Reference Image Similarity (LPIPS): If an optional `reference_image` path (a real-world photo) is provided alongside `compute_metrics: true`, the skill compares the render against the real photo. It returns the Learned Perceptual Image Patch Similarity (LPIPS) score and the delta Naturalness Factor (dNf), measuring how close the synthetic image is to reality.
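When both metrics are requested, the result might carry a block like the one below. Key names are assumptions, and the dNf formula shown is a guess at a simple difference; only Nf, LPIPS, and dNf are named in this README:

```python
def delta_naturalness(nf_render: float, nf_reference: float) -> float:
    """dNf sketch: difference between render and reference naturalness."""
    return round(nf_render - nf_reference, 4)

# Hypothetical metrics payload returned alongside the rendered image path.
metrics = {
    "naturalness": 0.82,
    "lpips": 0.31,  # lower = perceptually closer to the reference photo
    "dNf": delta_naturalness(0.82, 0.95),
}
```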
- Cycles Rendering: This skill defaults to the Cycles engine. If running on a headless server without a GPU, the script falls back to CPU rendering to prevent driver crashes.
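The fallback decision can be sketched like this (illustrative; the project's actual detection code is not shown in this README). Inside Blender, the equivalent check queries `bpy.context.preferences.addons['cycles'].preferences.devices`:

```python
def choose_cycles_device(available_device_types):
    """Fall back to CPU rendering when no GPU compute device is present."""
    has_gpu = any(t != "CPU" for t in available_device_types)
    return "GPU" if has_gpu else "CPU"
```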
- Injection Safety: The bridge uses `sys.argv` filtering. However, ensure the LLM is restricted from passing arbitrary string commands into the `parameters` dictionary.
- Version Lock: This project uses `bpy` syntax specific to the 4.0+ mesh and node system. It is not backwards compatible with 2.7x or early 2.8x versions.
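The `sys.argv` filtering mentioned above relies on Blender's convention of ignoring arguments after `--`. Below is a minimal sketch; `agent_bridge.py`'s real parsing and the whitelist contents are assumptions:

```python
ALLOWED_FLAGS = {"--node", "--value"}  # hypothetical whitelist of script flags

def script_args(argv):
    """Return only the arguments after '--', which Blender passes through."""
    return argv[argv.index("--") + 1:] if "--" in argv else []

def validate(args):
    """Reject any flag not on the whitelist before touching the scene."""
    flags = args[0::2]  # expects alternating --flag value pairs
    return all(flag in ALLOWED_FLAGS for flag in flags)
```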
This project is licensed under the MIT License - see the LICENSE file for details.