SHAPE
Cross-source consensus on SHAPE from 1 source and 6 claims.
Highlighted claims
- SHAPE uses an event-stage optimizer composed of a slow planner and a fast local port-Hamiltonian controller. — When Descent Is Too Stable: Event-Triggered Hamiltonian Learning to Optimize
- SHAPE treats fixed-budget nonconvex optimization as a two-timescale phase-space navigation problem. — When Descent Is Too Stable: Event-Triggered Hamiltonian Learning to Optimize
- SHAPE was implemented in PyTorch and trained on a single NVIDIA A100 for functional benchmarks. — When Descent Is Too Stable: Event-Triggered Hamiltonian Learning to Optimize
- SHAPE lifts optimizer state from position alone to position and momentum-like coordinates. — When Descent Is Too Stable: Event-Triggered Hamiltonian Learning to Optimize
- At each stage, SHAPE's planner outputs a mode, anchor, structural gain modifiers, anchor strength, and a horizon or budget. — When Descent Is Too Stable: Event-Triggered Hamiltonian Learning to Optimize
- SHAPE's advantage is interpreted as coming mainly from fixed-budget navigation rather than terminal convergence alone. — When Descent Is Too Stable: Event-Triggered Hamiltonian Learning to Optimize
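The claims above describe SHAPE as a two-timescale loop: a slow planner that, at each event/stage, picks an anchor, gain modifiers, anchor strength, and a horizon, wrapped around a fast local controller that integrates dissipative port-Hamiltonian dynamics over position and momentum coordinates. A minimal sketch of that structure, under stated assumptions (the function names, the fixed-gain planner heuristic of anchoring at the best point seen so far, and all numeric values are hypothetical, not the paper's learned planner or controller):

```python
import numpy as np

def phs_controller(q, p, grad_f, anchor, damping, kappa, step, horizon):
    """Fast local controller: damped (port-Hamiltonian-style) steps for
    H(q, p) = f(q) + ||p||^2 / 2, plus a spring of strength `kappa`
    pulling the position toward the planner-supplied `anchor`:
        q' = p
        p' = -grad f(q) - kappa * (q - anchor) - damping * p
    """
    for _ in range(horizon):
        force = -grad_f(q) - kappa * (q - anchor)
        p = (1.0 - damping * step) * p + step * force
        q = q + step * p
    return q, p

def shape_like_optimize(f, grad_f, q0, stages=20, horizon=100, step=0.01):
    """Slow planner loop: at each stage (event), re-anchor and re-run
    the fast controller for a fixed per-stage budget `horizon`.
    State is lifted from position alone to (position, momentum)."""
    q = np.asarray(q0, dtype=float)
    p = np.zeros_like(q)            # momentum-like coordinate
    best_q, best_val = q.copy(), f(q)
    for _ in range(stages):
        # Hypothetical planner decision: anchor at the best point seen
        # so far, with fixed gains (SHAPE's planner is learned and also
        # outputs a mode and structural gain modifiers).
        anchor, damping, kappa = best_q, 1.0, 1.0
        q, p = phs_controller(q, p, grad_f, anchor, damping, kappa,
                              step, horizon)
        val = f(q)
        if val < best_val:
            best_q, best_val = q.copy(), val
    return best_q, best_val
```

On a smooth test function (e.g. a quadratic bowl) this loop drives the iterate toward the minimizer while the momentum coordinate lets it traverse flat regions faster than pure descent, which is the fixed-budget navigation behavior the last claim attributes SHAPE's advantage to.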