
What is: QHM?

Source: Quasi-hyperbolic momentum and Adam for deep learning
Year: 2018
Data Source: CC BY-SA - https://paperswithcode.com

Quasi-Hyperbolic Momentum (QHM) is a stochastic optimization algorithm that alters momentum SGD by averaging a plain SGD step with a momentum step:

$$g_{t+1} = \beta g_{t} + (1-\beta)\cdot\nabla\hat{L}_{t}(\theta_{t})$$

$$\theta_{t+1} = \theta_{t} - \alpha\left[(1-v)\cdot\nabla\hat{L}_{t}(\theta_{t}) + v\cdot g_{t+1}\right]$$

The authors suggest a rule of thumb of $v = 0.7$ and $\beta = 0.999$.
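
As a minimal sketch, the two update equations translate directly into NumPy. The function name `qhm_step` and the quadratic test problem are illustrative choices, not part of the paper:

```python
import numpy as np

def qhm_step(theta, g, grad, alpha=0.01, beta=0.999, v=0.7):
    """One QHM update (illustrative sketch, not the reference implementation).

    theta: parameters; g: momentum buffer; grad: gradient of the loss at theta.
    alpha, beta, v correspond to the symbols in the equations above,
    with the suggested rule-of-thumb defaults beta=0.999, v=0.7.
    """
    # Momentum buffer: exponential moving average of gradients.
    g = beta * g + (1 - beta) * grad
    # Parameter step: weighted average of a plain SGD step and a momentum step.
    theta = theta - alpha * ((1 - v) * grad + v * g)
    return theta, g

# Toy usage: minimize f(theta) = 0.5 * ||theta||^2, whose gradient is theta.
theta = np.array([1.0, -2.0])
g = np.zeros_like(theta)
for _ in range(5000):
    theta, g = qhm_step(theta, g, grad=theta, alpha=0.1)
print(np.linalg.norm(theta))  # norm shrinks toward 0
```

Note that with $v = 1$ the update reduces to standard momentum SGD, and with $v = 0$ it reduces to plain SGD; intermediate $v$ interpolates between the two.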