Prompt-MoE Surrogate
Cross-source consensus on the Prompt-MoE surrogate, drawn from 1 source and 5 claims.
How it works
The surrogate takes normalized parameters, the current context, and a retrieved soft prompt as input. Six expert MLPs each produce a prediction, and a gating network combines them into a weighted output; the weighted squared deviation among the expert predictions serves as a disagreement signal. During routine adaptation only the retrieved prompts are updated, with expert and gate weights frozen; when an anomaly score or the predictive variance crosses a threshold, an emergency full update retrains the experts and gate from a replay buffer.
Highlighted claims
- The Prompt-MoE surrogate consumes normalized parameters, context, and the retrieved prompt. — RASP-Tuner: Retrieval-Augmented Soft Prompts for Context-Aware Black-Box Optimization in Non-Stationary Environments
- The surrogate uses six expert MLPs and a gating network to produce weighted predictions. — RASP-Tuner: Retrieval-Augmented Soft Prompts for Context-Aware Black-Box Optimization in Non-Stationary Environments
- Expert disagreement is computed as the weighted squared deviation among expert predictions. — RASP-Tuner: Retrieval-Augmented Soft Prompts for Context-Aware Black-Box Optimization in Non-Stationary Environments
- Most adaptation steps update only the retrieved prompts while keeping expert and gate weights frozen. — RASP-Tuner: Retrieval-Augmented Soft Prompts for Context-Aware Black-Box Optimization in Non-Stationary Environments
- Emergency full updates train experts and gate from a replay buffer when anomaly or predictive variance crosses a threshold. — RASP-Tuner: Retrieval-Augmented Soft Prompts for Context-Aware Black-Box Optimization in Non-Stationary Environments
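The prediction and disagreement claims above can be sketched in a few lines of numpy. This is a minimal illustration, not the paper's implementation: the input dimension, hidden width, and random weights are assumptions, and the input vector stands in for the concatenated normalized parameters, context, and retrieved prompt.

```python
import numpy as np

rng = np.random.default_rng(0)

N_EXPERTS = 6   # six expert MLPs, per the claims above
D_IN = 8        # assumed dim of [normalized params; context; retrieved prompt]

# Hypothetical one-hidden-layer MLP weights for each expert.
experts = [
    {"W1": rng.normal(size=(D_IN, 16)), "b1": np.zeros(16),
     "W2": rng.normal(size=(16, 1)), "b2": np.zeros(1)}
    for _ in range(N_EXPERTS)
]
# Gating network sketched as a single linear layer followed by a softmax.
gate_W = rng.normal(size=(D_IN, N_EXPERTS))

def expert_forward(p, x):
    """One expert's scalar prediction for input x."""
    h = np.tanh(x @ p["W1"] + p["b1"])
    return (h @ p["W2"] + p["b2"]).item()

def surrogate(x):
    """Gate-weighted MoE prediction plus expert disagreement."""
    logits = x @ gate_W
    w = np.exp(logits - logits.max())
    w /= w.sum()                                     # gate weights, sum to 1
    preds = np.array([expert_forward(p, x) for p in experts])
    y_hat = float(w @ preds)                         # weighted prediction
    disagreement = float(w @ (preds - y_hat) ** 2)   # weighted squared deviation
    return y_hat, disagreement

x = rng.normal(size=D_IN)       # stand-in for the concatenated input
y_hat, d = surrogate(x)
```

In this sketch, `disagreement` is the quantity the claims describe as the trigger signal: routine steps would leave `experts` and `gate_W` frozen and adjust only the retrieved prompt, while a threshold crossing on disagreement (or an anomaly score) would trigger a full update of both from a replay buffer.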