Training-Free Conditional Diffusion
Cross-source consensus on Training-Free Conditional Diffusion, drawn from 1 source and 5 claims.
Highlighted claims
- Training-free conditional diffusion reuses a pretrained unconditional diffusion prior and injects the measurement likelihood only at sampling time, with no task-specific training. — Tempered Guided Diffusion
- Independent best-of-N sampling improves robustness but spends computation uniformly across trajectories, even when some trajectories are clearly poor early on. — Tempered Guided Diffusion
- Practical TGD can use MPGD-, DPS-, or DAPS-style modules as approximate conditional reconstruction solvers. — Tempered Guided Diffusion
- The TGD framework unifies several existing training-free samplers under different schedule and weighting choices. — Tempered Guided Diffusion
- Existing training-free samplers can be unreliable on challenging inverse problems. — Tempered Guided Diffusion
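The first highlighted claim — reuse an unconditional prior and add the likelihood only at sampling time — can be sketched on a toy linear inverse problem. Everything below (the closed-form Gaussian prior score, the step sizes, `sigma_y`) is an illustrative assumption, not TGD itself; the likelihood-gradient step is only DPS-flavored.

```python
import numpy as np

def prior_score(x, sigma):
    # Toy unconditional prior: standard Gaussian, so the score is closed-form.
    # In a real sampler this would be a pretrained diffusion network.
    return -x / (1.0 + sigma**2)

def guided_step(x, y, A, sigma, step=0.05, sigma_y=1.0):
    # Training-free conditioning: add grad_x log p(y | x) = A^T (y - A x) / sigma_y^2
    # to the unconditional score at sampling time only -- no retraining.
    likelihood_grad = A.T @ (y - A @ x) / sigma_y**2
    return x + step * (prior_score(x, sigma) + likelihood_grad)

rng = np.random.default_rng(0)
A = rng.standard_normal((3, 5))   # forward operator of the inverse problem
y = A @ rng.standard_normal(5)    # observed measurements

x = rng.standard_normal(5)
r0 = np.linalg.norm(y - A @ x)    # initial measurement residual
for sigma in np.linspace(1.0, 0.01, 200):
    x = guided_step(x, y, A, sigma)
r1 = np.linalg.norm(y - A @ x)    # residual after guided sampling
```

The guided iterates drift toward measurement consistency (`r1 < r0`) even though the prior was never trained on this task — that is the whole mechanism the claim describes.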
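The best-of-N claim can be illustrated with the same kind of toy dynamics. `run_trajectory` and all the numbers here are hypothetical; the point is only that each of the N independent runs receives the same fixed step budget, regardless of how it is doing early on.

```python
import numpy as np

def run_trajectory(rng, y, A, n_steps=200):
    # One independent guided trajectory. Every trajectory consumes the
    # same step budget, even if its residual is already large early on.
    x = rng.standard_normal(A.shape[1])
    for _ in range(n_steps):
        x = x + 0.05 * (-x + A.T @ (y - A @ x))
    return x

rng = np.random.default_rng(1)
A = rng.standard_normal((3, 5))
y = A @ rng.standard_normal(5)

N = 8
candidates = [run_trajectory(rng, y, A) for _ in range(N)]
residuals = [np.linalg.norm(y - A @ c) for c in candidates]
best = candidates[int(np.argmin(residuals))]
# Total compute is N * n_steps steps, spent uniformly: no trajectory is
# pruned early, which is exactly the inefficiency the claim points at.
```

Selecting the minimum-residual candidate at the end improves robustness, but the budget spent on the worst trajectories is never recovered.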