# UFO-BLO: Unbiased First-Order Bilevel Optimization

Likhosherstov, V., Song, X., Choromanski, K., Davis, J., and Weller, A. UFO-BLO: Unbiased First-Order Bilevel Optimization. (Unpublished)

Full text not available from this repository.

## Abstract

Bilevel optimization (BLO) is a popular approach with many applications including hyperparameter optimization, neural architecture search, adversarial robustness, and model-agnostic meta-learning. However, the approach suffers from time and memory complexity proportional to the length $r$ of its inner optimization loop, which has led to several modifications being proposed. One such modification is *first-order* BLO (FO-BLO), which approximates outer-level gradients by zeroing out second-derivative terms, yielding significant speed gains and requiring only constant memory as $r$ varies. Despite FO-BLO's popularity, there is a lack of theoretical understanding of its convergence properties. We make progress by demonstrating a rich family of examples where FO-BLO-based stochastic optimization does not converge to a stationary point of the BLO objective. We address this concern by proposing a new FO-BLO-based unbiased estimate of outer-level gradients, enabling us to theoretically guarantee this convergence, with no harm to memory and expected time complexity. Our findings are supported by experimental results on Omniglot and Mini-ImageNet, popular few-shot meta-learning benchmarks.
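To make the FO-BLO approximation concrete, here is a minimal toy sketch (not the paper's method or its unbiased estimator) on a one-dimensional quadratic bilevel problem, where the exact outer gradient is available in closed form for comparison. All names, losses, and constants below are illustrative assumptions.

```python
# Toy illustration of first-order BLO (FO-BLO), assuming quadratic losses.
# Inner loss:  g(w) = 0.5 * (w - b)^2, minimized from initialization theta.
# Outer loss:  f(w) = 0.5 * (w - c)^2, evaluated at the adapted weights.

b, c = 2.0, 5.0      # hypothetical inner/outer targets
alpha, r = 0.5, 3    # inner-loop learning rate and number of inner steps

def inner_grad(w):
    return w - b

def outer_grad(w):
    return w - c

def fo_blo_grad(theta):
    """FO-BLO estimate: run r inner gradient steps, then return the outer
    gradient at the adapted weights, zeroing out second-derivative terms
    (i.e. treating d w_r / d theta as the identity)."""
    w = theta
    for _ in range(r):
        w = w - alpha * inner_grad(w)
    return outer_grad(w)

def exact_blo_grad(theta):
    """Exact outer gradient for this quadratic case: w_r depends linearly
    on theta with slope (1 - alpha)^r, so the chain rule contributes that
    factor, which FO-BLO drops."""
    w = theta
    for _ in range(r):
        w = w - alpha * inner_grad(w)
    return outer_grad(w) * (1 - alpha) ** r

print(fo_blo_grad(0.0))    # -> -3.25
print(exact_blo_grad(0.0)) # -> -0.40625
```

Even in this trivial setting, the FO-BLO gradient differs from the exact one by the dropped factor $(1-\alpha)^r$, which is the kind of bias the abstract's non-convergence examples exploit.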

- Item Type: Article
- Subjects: cs.LG, math.OC, stat.ML
- Divisions: Div F > Computational and Biological Learning
- Depositing User: Cron Job
- Date Deposited: 12 Jun 2020 20:01
- Last Modified: 17 Jun 2021 05:40