Positional normalization

Research output: Contribution to journal › Conference article › Research › peer-review

A popular method to reduce the training time of deep neural networks is to normalize activations at each layer. Although various normalization schemes have been proposed, they all follow a common theme: normalize across spatial dimensions and discard the extracted statistics. In this paper, we propose an alternative normalization method that noticeably departs from this convention and normalizes exclusively across channels. We argue that the channel dimension is naturally appealing as it allows us to extract the first and second moments of features extracted at a particular image position. These moments capture structural information about the input image and extracted features, which opens a new avenue along which a network can benefit from feature normalization: Instead of disregarding the normalization constants, we propose to re-inject them into later layers to preserve or transfer structural information in generative networks.
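Concretely, for activations of shape (batch, channels, height, width), the scheme described in the abstract computes the mean and standard deviation over the channel axis at every spatial position, normalizes with them, and keeps them for later use. The sketch below is a minimal Python/NumPy illustration of that idea; the function names and the re-injection step (scaling and shifting a later feature map with the stored moments) are assumptions inferred from the abstract, not the authors' reference implementation.

```python
import numpy as np

def positional_normalization(x, eps=1e-5):
    """Normalize across the channel axis at each spatial position.

    x: activations of shape (batch, channels, height, width).
    Returns the normalized tensor plus the per-position mean and std,
    so a caller can re-inject them into a later layer.
    """
    mu = x.mean(axis=1, keepdims=True)                   # (B, 1, H, W)
    sigma = np.sqrt(x.var(axis=1, keepdims=True) + eps)  # (B, 1, H, W)
    return (x - mu) / sigma, mu, sigma

def reinject_moments(y, mu, sigma):
    """Illustrative re-injection: shift and scale a later feature map y
    with previously extracted positional moments (broadcast over channels)."""
    return y * sigma + mu

# Toy usage: normalize early features, then restore their positional structure later.
x = np.random.randn(2, 64, 32, 32).astype(np.float32)
x_norm, mu, sigma = positional_normalization(x)
y = np.random.randn(2, 64, 32, 32).astype(np.float32)  # stand-in for later-layer features
y_structured = reinject_moments(y, mu, sigma)
```

In practice the later feature map may have a different spatial resolution, in which case the stored moments would need to be resized before re-injection; that detail is omitted here.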

Original language: English
Journal: Advances in Neural Information Processing Systems
Volume: 32
ISSN: 1049-5258
Publication status: Published - 2019
Externally published: Yes
Event: 33rd Annual Conference on Neural Information Processing Systems, NeurIPS 2019 - Vancouver, Canada
Duration: 8 Dec 2019 - 14 Dec 2019

Conference

Conference: 33rd Annual Conference on Neural Information Processing Systems, NeurIPS 2019
Country: Canada
City: Vancouver
Period: 08/12/2019 - 14/12/2019
Sponsor: Citadel, Doc.AI, et al., Lambda, Lyft, Microsoft Research

Bibliographical note

Funding Information:
This research is supported in part by grants from Facebook, the National Science Foundation (III-1618134, III-1526012, IIS-1149882, IIS-1724282, and TRIPODS-1740822), the Office of Naval Research DOD (N00014-17-1-2175), and the Bill and Melinda Gates Foundation. We are thankful for generous support by Zillow and SAP America Inc.

Publisher Copyright:
© 2019 Neural information processing systems foundation. All rights reserved.

ID: 301823632