We propose a new way of constructing invertible neural networks by combining simple building blocks with a novel set of composition rules. This leads to a rich set of invertible architectures, including those similar to ResNets. Inversion is achieved with a locally convergent iterative procedure that is parallelizable and very fast in practice. Additionally, the determinant of the Jacobian can be computed analytically and efficiently, enabling their use as generative flow models. To demonstrate their flexibility, we show that our invertible neural networks are competitive with ResNets on MNIST and CIFAR-10 classification. When trained as generative models, our invertible networks achieve competitive likelihoods on MNIST, CIFAR-10, and ImageNet 32×32, with bits per dimension of 0.98, 3.32, and 4.06, respectively.
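To make the two key claims concrete, the following is a minimal, hypothetical sketch, not the paper's actual construction: a toy invertible layer whose Jacobian is lower-triangular. For such a layer the log-determinant is read off analytically from the Jacobian's diagonal, and inversion proceeds by a parallelizable fixed-point iteration. All names here (`forward`, `invert`, `d`, `W`, `b`) are illustrative and not taken from the paper.

```python
import numpy as np

def forward(x, d, W, b):
    """Toy invertible layer: y_i = d_i * x_i + tanh(sum_{j<i} W_ij x_j + b_i).

    With W strictly lower-triangular and d > 0, the Jacobian dy/dx is
    lower-triangular with diagonal d, so log|det J| = sum(log d): analytic
    and O(n), as the change-of-variables likelihood requires.
    """
    return d * x + np.tanh(W @ x + b)

def log_det_jacobian(d):
    # Triangular Jacobian: the determinant is the product of the diagonal.
    return np.sum(np.log(d))

def invert(y, d, W, b, n_iters=50):
    """Fixed-point iteration x <- (y - tanh(W x + b)) / d.

    Each step is a single matrix-vector product, so it parallelizes well.
    For this strictly triangular W the iteration converges exactly after
    at most len(y) steps; in practice far sooner.
    """
    x = y / d  # initial guess
    for _ in range(n_iters):
        x = (y - np.tanh(W @ x + b)) / d
    return x

rng = np.random.default_rng(0)
n = 8
d = np.exp(rng.normal(size=n))            # positive diagonal scales
W = np.tril(rng.normal(size=(n, n)), -1)  # strictly lower-triangular coupling
b = rng.normal(size=n)

x = rng.normal(size=n)
y = forward(x, d, W, b)
print(np.max(np.abs(invert(y, d, W, b) - x)))  # ~0 up to float error
print(log_det_jacobian(d))
```

In this toy layer the triangular dependence makes the iteration converge in finitely many steps; for the more general architectures the abstract describes, such an inversion procedure is only locally convergent, which matches the claim above.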