
What is: Progressive Neural Architecture Search?

Source: Progressive Neural Architecture Search
Year: 2017
Data Source: CC BY-SA - https://paperswithcode.com

Progressive Neural Architecture Search, or PNAS, is a method for learning the structure of convolutional neural networks (CNNs). It uses a sequential model-based optimization (SMBO) strategy, where we search the space of cell structures, starting with simple (shallow) models and progressing to complex ones, pruning out unpromising structures as we go.

At iteration b of the algorithm, we have a set of K candidate cells (each of size b blocks), which we train and evaluate on a dataset of interest. Since this process is expensive, PNAS also learns a model, or surrogate function, which can predict the performance of a structure without needing to train it. We then expand the K candidates of size b into K' ≫ K children, each of size b+1. The surrogate function is used to rank all of the K' children, pick the top K, and then train and evaluate them. We continue in this way until b = B, which is the maximum number of blocks we want to use in a cell.
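The progressive loop above can be sketched in a few lines of Python. This is a toy illustration only: the operation set, the scoring function, and the op-averaging surrogate are all stand-ins invented for this sketch (the paper searches over real CNN blocks and uses a learned RNN/MLP predictor as the surrogate).

```python
import random

random.seed(0)

# Hypothetical search space: a "cell" is a tuple of block choices.
OPS = ["conv3x3", "conv5x5", "maxpool", "identity"]

def train_and_eval(cell):
    """Stand-in for the expensive step: train the cell and return accuracy."""
    value = {"conv3x3": 0.9, "conv5x5": 0.8, "maxpool": 0.5, "identity": 0.3}
    return sum(value[op] for op in cell) / len(cell) + random.uniform(-0.05, 0.05)

def fit_surrogate(history):
    """Trivial surrogate: score a cell by the mean observed score of its ops
    (a crude proxy for the paper's learned performance predictor)."""
    op_scores = {}
    for cell, score in history:
        for op in cell:
            op_scores.setdefault(op, []).append(score)
    mean = {op: sum(s) / len(s) for op, s in op_scores.items()}
    overall = sum(score for _, score in history) / len(history)
    def predict(cell):
        return sum(mean.get(op, overall) for op in cell) / len(cell)
    return predict

def pnas_search(B=3, K=4):
    # b = 1: enumerate and evaluate all single-block cells.
    candidates = [(op,) for op in OPS]
    history = [(c, train_and_eval(c)) for c in candidates]
    top = sorted(history, key=lambda x: -x[1])[:K]
    for b in range(2, B + 1):
        predict = fit_surrogate(history)
        # Expand each size-(b-1) candidate into K' >> K children of size b.
        children = [c + (op,) for c, _ in top for op in OPS]
        # Rank all children with the cheap surrogate and keep the top K...
        ranked = sorted(children, key=predict, reverse=True)[:K]
        # ...then spend the training budget only on those K.
        evals = [(c, train_and_eval(c)) for c in ranked]
        history.extend(evals)
        top = sorted(evals, key=lambda x: -x[1])[:K]
    return top[0][0]

best = pnas_search()
print(best)
```

The key cost saving is visible in the loop: of the K' = K × |OPS| children generated at each step, only K are ever trained; the rest are discarded based on the surrogate's prediction alone.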