
CHORD Model Workflow in ComfyUI | PBR Material Generation

Updated: Apr 1, 2026


Download: 1 variant available — Archive (Other), 2.86 KB

Type: Workflows
Reviews: 52
Published: Apr 1, 2026
Base Model: Other
Hash (AutoV2): 2B668D6F99

RunComfy

Turns images into true PBR texture maps fast.

Who it's for: creators who want this pipeline in ComfyUI without assembling nodes from scratch. Not for: one-click results with zero tuning — you still choose inputs, prompts, and settings.

Open preloaded workflow on RunComfy


Why RunComfy first
- Fewer missing-node surprises — run the graph in a managed environment before you mirror it locally.
- Quick GPU tryout — useful if your local VRAM or install time is the bottleneck.
- Matches the published JSON — the zip follows the same runnable workflow you can open on RunComfy.

When downloading for local ComfyUI makes sense — you want full control over models on disk, batch scripting, or offline runs.

How to use (local ComfyUI)
1. Load inputs (images/video/audio) in the marked loader nodes.
2. Set prompts, resolution, and seeds; start with a short test run.
3. Export from the Save / Write nodes shown in the graph.
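Once the graph runs locally, the same steps can be scripted against ComfyUI's HTTP API (a `POST` to the server's `/prompt` endpoint with an API-format workflow dict). A minimal sketch, assuming illustrative node IDs that mirror the ones discussed below — patch the seed and prompt, then submit:

```python
import json

def build_payload(workflow: dict, seed: int, prompt_text: str) -> dict:
    """Return a /prompt payload with seed and prompt patched in.

    Node IDs "7" (KSampler) and "4" (CLIPTextEncode) are illustrative
    placeholders; check the IDs in your exported API-format JSON.
    """
    wf = json.loads(json.dumps(workflow))  # deep copy, leave the original intact
    wf["7"]["inputs"]["seed"] = seed
    wf["4"]["inputs"]["text"] = prompt_text
    return {"prompt": wf}

# Tiny stand-in for a workflow exported via "Save (API Format)".
demo_workflow = {
    "4": {"class_type": "CLIPTextEncode", "inputs": {"text": ""}},
    "7": {"class_type": "KSampler", "inputs": {"seed": 0, "steps": 20}},
}

payload = build_payload(demo_workflow, seed=42, prompt_text="seamless oak planks")
# To submit against a running local server:
# urllib.request.urlopen("http://127.0.0.1:8188/prompt",
#                        data=json.dumps(payload).encode())
```

Submitting requires a running ComfyUI instance; the payload construction alone is enough for batch scripting experiments.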

Expectations — First run may pull large weights; cloud runs may require a free RunComfy account.


Overview

This workflow creates production-ready PBR textures for realistic rendering. Using Ubisoft's rendering-decomposition research, it converts reference or generated textures into full SVBRDF maps, giving you direct control over output materials for real-time engines and VFX pipelines. It suits designers who need high-fidelity detail and modular material outputs, producing richer, more accurate textures with predictable surface properties.

Important nodes:


CLIPTextEncode (#4)

Encodes your text into conditioning for the texture generator. Be explicit about material class, surface qualities, and tiling intent. Terms like orthographic, seamless, grout lines, pores, fibers, or micro-scratches help the generator produce structures that the CHORD Model can decompose reliably.
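If you batch-generate prompts, a small helper keeps the cue words above consistent. This is a hypothetical convenience function, not part of the workflow; the vocabulary is only a starting point:

```python
def material_prompt(material: str, qualities: list[str], tiling: bool = True) -> str:
    """Compose a texture prompt from material class, surface cues, and tiling intent."""
    parts = [material, "orthographic top-down texture"]
    if tiling:
        parts.append("seamless, tileable")
    parts.extend(qualities)  # e.g. grout lines, pores, fibers, micro-scratches
    return ", ".join(parts)

prompt = material_prompt("rough terracotta tiles",
                         ["grout lines", "micro-scratches", "matte surface"])
```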

KSampler (#7)

Drives the latent diffusion process that creates the texture. Use it to trade speed for fidelity, switch samplers, and explore variations via the seed. A blank negative prompt is provided by ConditioningZeroOut (#5); add typical negatives only if you see artifacts you want to suppress.
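The speed-versus-fidelity trade-off can be captured as two presets — one for quick seed exploration, one for final output. The step counts, CFG values, and sampler names below are assumptions to tune per model, not values prescribed by the workflow:

```python
# Illustrative KSampler presets; tune per model and hardware.
PRESETS = {
    "preview": {"steps": 8,  "cfg": 2.0, "sampler_name": "euler",    "denoise": 1.0},
    "final":   {"steps": 28, "cfg": 4.5, "sampler_name": "dpmpp_2m", "denoise": 1.0},
}

def ksampler_settings(quality: str, seed: int) -> dict:
    """Merge a preset with an explicit seed so variations stay reproducible."""
    return {**PRESETS[quality], "seed": seed}

settings = ksampler_settings("final", seed=123)
```

Keeping the seed separate from the preset makes it easy to re-render a preview-quality find at final quality.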

ModelSamplingAuraFlow (#2)

Applies AuraFlow-style sampling scheduling to the UNet for sharper, more coherent texture synthesis with z_image_turbo. Adjust it here when experimenting with the different sampling behaviors the model supports.

ChordMaterialEstimation (#20)

Runs the CHORD Model to estimate SVBRDF maps from the input texture. Results are production-ready base color, normal, roughness, and metalness. Use flat, evenly lit inputs without perspective to maximize accuracy; complex shadows or highlights can bias the decomposition.
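A rough pre-flight check for the "flat, evenly lit" requirement — this heuristic is an assumption of mine, not part of the workflow: flag inputs whose luminance varies too strongly, since baked shadows or highlights can bias the decomposition.

```python
from statistics import mean, pstdev

def lighting_is_even(luma: list[float], max_cv: float = 0.25) -> bool:
    """luma: flattened per-pixel luminance in [0, 1].

    Uses the coefficient of variation as a crude unevenness score;
    the 0.25 threshold is an illustrative default, not a calibrated value.
    """
    m = mean(luma)
    if m == 0:
        return False
    return pstdev(luma) / m <= max_cv

flat_patch = [0.5, 0.52, 0.49, 0.51]   # evenly lit sample
shadowed   = [0.9, 0.1, 0.85, 0.05]    # strong baked shadow
```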

ChordNormalToHeight (#18)

Converts the CHORD-predicted normal into a height map suited for displacement. Treat height as a relative surface signal and calibrate intensity in your renderer to match the intended scale.
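The calibration note can be sketched as a rescale from relative height values to a physical displacement amplitude. The 2 mm amplitude below is an example target, not a value from the workflow:

```python
def calibrate_height(height: list[float], amplitude_mm: float = 2.0) -> list[float]:
    """Normalize a relative height signal to [0, amplitude_mm] for displacement."""
    lo, hi = min(height), max(height)
    span = (hi - lo) or 1.0  # guard against a perfectly flat map
    return [(h - lo) / span * amplitude_mm for h in height]

raw = [0.31, 0.75, 0.52, 0.31]            # relative heights from the node
mm = calibrate_height(raw, amplitude_mm=2.0)
```

In practice you would apply the same idea via your renderer's displacement scale rather than rewriting pixels.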

EmptySD3LatentImage (#6)

Sets the canvas size and batch for texture synthesis. Choose a square resolution that matches your downstream material targets and keep this consistent across generations for predictable texel density.
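Texel density is just resolution divided by the physical size the tile will cover; keeping it constant across generations is what makes materials interchangeable. The 2 m tile size below is an illustrative assumption:

```python
def texels_per_meter(resolution_px: int, tile_size_m: float) -> float:
    """Texel density for a square texture covering a square physical tile."""
    return resolution_px / tile_size_m

density_1k = texels_per_meter(1024, 2.0)  # 1K texture over a 2 m tile
density_2k = texels_per_meter(2048, 2.0)  # doubling resolution doubles density
```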

Notes

CHORD Model Workflow in ComfyUI | PBR Material Generation — see RunComfy page for the latest node requirements.