Graph HyperNetworks for Neural Architecture Search

    Abstract

    Neural architecture search (NAS) automatically finds the best task-specific neural network topology, outperforming many manual architecture designs. However, it can be prohibitively expensive, as the search requires training thousands of different networks, each of which can take hours. In this work, we propose the Graph HyperNetwork (GHN) to amortize the search cost: given an architecture, it directly generates the weights by running inference on a graph neural network. GHNs model the topology of an architecture and can therefore predict network performance more accurately than regular hypernetworks and premature early stopping. To perform NAS, we randomly sample architectures and use the validation accuracy of networks with GHN-generated weights as the surrogate search signal. GHNs are fast – they can search nearly 10× faster than other random search methods on CIFAR-10 and ImageNet. GHNs can be further extended to the anytime prediction setting, where they have found networks with a better speed-accuracy tradeoff than state-of-the-art manual designs.
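    The search procedure described above can be sketched in a few lines: sample random architectures, generate their weights with a single GHN forward pass instead of training, and rank candidates by surrogate validation accuracy. The sketch below is illustrative only – `ghn_generate_weights` and `surrogate_accuracy` are hypothetical stand-ins for the paper's graph hypernetwork and validation evaluation, not the authors' implementation.

    ```python
    import random

    def sample_architecture(rng):
        # Toy stand-in for sampling from a NAS search space:
        # a small "graph" described by a node count and op choices.
        return {"num_nodes": rng.randint(2, 6),
                "ops": [rng.choice(["conv3x3", "conv5x5", "pool"])
                        for _ in range(4)]}

    def ghn_generate_weights(arch):
        # Hypothetical stand-in for the GHN forward pass: in the paper,
        # a graph neural network reads the architecture's computation
        # graph and emits weights for every node. Here we return dummies.
        return {f"node_{i}": [0.0] for i in range(arch["num_nodes"])}

    def surrogate_accuracy(arch, weights, rng):
        # Stand-in for validation accuracy measured with the
        # GHN-generated weights (no per-candidate training).
        return rng.random()

    def ghn_random_search(num_samples=100, seed=0):
        # Random search where each candidate is scored by the surrogate
        # signal rather than by training it from scratch.
        rng = random.Random(seed)
        best_arch, best_acc = None, -1.0
        for _ in range(num_samples):
            arch = sample_architecture(rng)
            weights = ghn_generate_weights(arch)  # inference only
            acc = surrogate_accuracy(arch, weights, rng)
            if acc > best_acc:
                best_arch, best_acc = arch, acc
        # In practice the top-ranked candidates would then be trained
        # normally to select the final architecture.
        return best_arch, best_acc
    ```

    The amortization comes from replacing thousands of training runs with cheap hypernetwork inference; only a handful of top-ranked architectures ever need full training.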

    Authors

    Chris Zhang, Mengye Ren, Raquel Urtasun

    Conference

    Meta Learning workshop @ NeurIPS 2018

    Full Paper

    ‘Graph HyperNetworks for Neural Architecture Search’ (PDF)

    Uber ATG

    Mengye Ren
    Mengye Ren is a research scientist at Uber ATG Toronto. He is also a PhD student in the machine learning group of the Department of Computer Science at the University of Toronto, where he also completed his undergraduate degree in Engineering Science. His research interests are machine learning, neural networks, and computer vision. He is originally from Shanghai, China.
    Raquel Urtasun
    Raquel Urtasun is the Chief Scientist for Uber ATG and the Head of Uber ATG Toronto. She is also a Professor at the University of Toronto, a Canada Research Chair in Machine Learning and Computer Vision, and a co-founder of the Vector Institute for AI. She is a recipient of an NSERC EWR Steacie Award, an NVIDIA Pioneers of AI Award, a Ministry of Education and Innovation Early Researcher Award, three Google Faculty Research Awards, an Amazon Faculty Research Award, a Connaught New Researcher Award, a Fallona Family Research Award, and two Best Paper Runner-Up Prizes awarded at CVPR in 2013 and 2017. She was also named Chatelaine 2018 Woman of the Year and one of Toronto's top influencers of 2018 by Adweek magazine.