Reviving and Improving Recurrent Back Propagation

    Abstract

    In this paper, we revisit the recurrent back-propagation (RBP) algorithm, discuss the conditions under which it applies as well as how to satisfy them in deep neural networks. We show that RBP can be unstable and propose two variants based on conjugate gradient on the normal equations (CG-RBP) and Neumann series (Neumann-RBP). We further investigate the relationship between Neumann-RBP and back propagation through time (BPTT) and its truncated version (TBPTT). Our Neumann-RBP has the same time complexity as TBPTT but only requires constant memory, whereas TBPTT’s memory cost scales linearly with the number of truncation steps. We examine all RBP variants along with BPTT and TBPTT in three different application domains: associative memory with continuous Hopfield networks, document classification in citation networks using graph neural networks and hyperparameter optimization for fully connected networks. All experiments demonstrate that RBPs, especially the Neumann-RBP variant, are efficient and effective for optimizing convergent recurrent neural networks.
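The core idea behind Neumann-RBP can be illustrated concretely. For a convergent recurrent update h = f(h, x) with fixed point h*, the gradient of a loss L(h*) requires the adjoint solve z = (I - J)^{-T} ∂L/∂h*, where J is the Jacobian of f at h*; Neumann-RBP approximates this inverse with the truncated Neumann series Σₖ (Jᵀ)ᵏ, needing only one vector of state. Below is a minimal NumPy sketch of this computation (not the authors' implementation): the contractive map tanh(W h + x), the quadratic loss, and all dimensions are illustrative assumptions, and the truncated series is compared against an exact linear solve.

```python
import numpy as np

rng = np.random.default_rng(0)
n = 5
W = 0.2 * rng.standard_normal((n, n))  # small weights -> contractive map
x = rng.standard_normal(n)

def f(h):
    # Illustrative convergent recurrent update (an assumption, not the paper's model).
    return np.tanh(W @ h + x)

# Run the recurrence to (approximate) convergence to the fixed point h*.
h = np.zeros(n)
for _ in range(200):
    h = f(h)

D = 1.0 - h**2        # derivative of tanh at the fixed point (diagonal)
J = D[:, None] * W    # Jacobian df/dh at h*
g = h.copy()          # dL/dh* for the toy loss L = 0.5 * ||h*||^2

# Neumann-RBP: z ~= sum_{k=0}^{K} (J^T)^k g, using constant memory --
# only the running sum z and the current term v are stored.
z = np.zeros(n)
v = g.copy()
K = 100  # truncation steps (plays the role of TBPTT's truncation length)
for _ in range(K + 1):
    z += v
    v = J.T @ v

grad_x_neumann = D * z  # chain rule through df/dx = diag(D)

# Exact adjoint via a dense linear solve, for comparison.
z_exact = np.linalg.solve(np.eye(n) - J.T, g)
grad_x_exact = D * z_exact
```

Because the map is a contraction, the series converges geometrically, so moderate K already matches the exact solve closely; this is the sense in which Neumann-RBP trades TBPTT's O(K) memory for O(1).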

    Authors

    Renjie Liao, Yuwen Xiong, Ethan Fetaya, Lisa Zhang, KiJung Yoon, Xaq Pitkow, Raquel Urtasun, Richard Zemel

    Conference

    ICML 2018

    Full Paper

    ‘Reviving and Improving Recurrent Back Propagation’ (PDF)

    Uber ATG

    Renjie Liao
    Renjie Liao is a PhD student in the Machine Learning Group, Department of Computer Science, University of Toronto, supervised by Prof. Raquel Urtasun and Prof. Richard Zemel. He is also a Research Scientist at Uber Advanced Technologies Group (ATG) Toronto and is affiliated with the Vector Institute. He received an M.Phil. degree from the Department of Computer Science and Engineering at the Chinese University of Hong Kong, under the supervision of Prof. Jiaya Jia, and a B.Eng. degree from the School of Automation Science and Electrical Engineering at Beihang University (formerly Beijing University of Aeronautics and Astronautics).
    Yuwen Xiong
    Yuwen Xiong is a graduate student in the Machine Learning Group at the University of Toronto and a Research Scientist at Uber ATG Toronto, in both roles supervised by Prof. Raquel Urtasun. He received his bachelor's degree in Computer Science from Zhejiang University in June 2018. His research interests include computer vision and machine learning, especially deep learning.
    Raquel Urtasun
    Raquel Urtasun is the Chief Scientist for Uber ATG and the Head of Uber ATG Toronto. She is also a Professor at the University of Toronto, a Canada Research Chair in Machine Learning and Computer Vision, and a co-founder of the Vector Institute for AI. She is a recipient of an NSERC EWR Steacie Award, an NVIDIA Pioneers of AI Award, a Ministry of Education and Innovation Early Researcher Award, three Google Faculty Research Awards, an Amazon Faculty Research Award, a Connaught New Researcher Award, a Fallona Family Research Award, and two Best Paper Runner-Up Prizes awarded at CVPR in 2013 and 2017. She was also named Chatelaine's 2018 Woman of the Year and one of Toronto's top influencers of 2018 by Adweek magazine.