Group Parity
Cross-source consensus on Group Parity from 1 source and 6 claims.
Highlighted claims
- The main testbed is group-level even parity on 36-bit binary images. — The two clocks and the innovation window: When and how generative models learn rules
- Increasing the group size increases bit interaction order and provides direct control over rule complexity. — The two clocks and the innovation window: When and how generative models learn rules
- Diffusion transformers learned parity well only for small group sizes at the training endpoint. — The two clocks and the innovation window: When and how generative models learn rules
- High-G parity is difficult because group parity requires degree-G multiplicative interactions. — The two clocks and the innovation window: When and how generative models learn rules
- Novel full samples in parity mainly arise by recombining memorized groups into unseen full samples. — The two clocks and the innovation window: When and how generative models learn rules
- For larger parity groups, more rule-valid generations were exact training samples. — The two clocks and the innovation window: When and how generative models learn rules