Convolutions and More as Einsum: A Tensor Network Perspective with Advances for Second-Order Methods

Felix Dangel

Abstract: Despite their simple intuition, convolutions are more tedious to analyze than dense layers, which complicates the transfer of theoretical and algorithmic ideas to convolutions. We simplify convolutions by viewing them as tensor networks (TNs) that allow reasoning about the underlying tensor multiplications by drawing diagrams, manipulating them to perform function transformations like differentiation, and efficiently evaluating them with einsum. To demonstrate their simplicity and expressiveness, we derive diagrams of various autodiff operations and popular curvature approximations with full hyper-parameter support, batching, channel groups, and generalization to any convolution dimension. Further, we provide convolution-specific transformations, based on the connectivity pattern, that allow diagrams to be simplified before evaluation. Finally, we probe performance. Our TN implementation accelerates a recently proposed KFAC variant by up to 4.5x while removing the standard implementation’s memory overhead, and enables new hardware-efficient tensor dropout for approximate backpropagation.
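The abstract's central idea, that a convolution is an einsum contraction once the input is unfolded, can be illustrated with a short sketch. The following is a minimal PyTorch illustration, not the paper's TN implementation; the shapes, variable names, and index labels are chosen for the example.

```python
import torch
import torch.nn.functional as F

# Minimal sketch (assumed setup, not the paper's code): a 2D convolution
# expressed as an einsum over the unfolded (im2col) input.
N, C_in, H, W = 2, 3, 8, 8            # batch size, input channels, spatial size
C_out, K = 4, 3                       # output channels, square kernel size

x = torch.randn(N, C_in, H, W)
weight = torch.randn(C_out, C_in, K, K)

# im2col: extract all K x K patches -> (N, C_in * K * K, L), L output locations.
patches = F.unfold(x, kernel_size=K).view(N, C_in, K * K, -1)
w = weight.view(C_out, C_in, K * K)

# Forward pass: contract the channel (c) and kernel-position (k) indices.
out = torch.einsum("nckl,ock->nol", patches, w)
out = out.view(N, C_out, H - K + 1, W - K + 1)
assert torch.allclose(out, F.conv2d(x, weight), atol=1e-5)

# Differentiation re-wires the same contraction: the weight gradient is the
# einsum over the remaining indices, with the weight tensor left out.
grad_out = torch.randn_like(out).flatten(start_dim=2)   # (N, C_out, L)
grad_w = torch.einsum("nol,nckl->ock", grad_out, patches).view_as(weight)
```

Note how the backward pass reuses the same contraction with a different set of output indices; this index re-wiring is the einsum analogue of the diagram manipulations the paper describes for function transformations such as differentiation.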

Submission history

From: Felix Dangel [view email]
[v1]
Wed, 5 Jul 2023 13:19:41 UTC (1,973 KB)
[v2]
Wed, 23 Oct 2024 22:47:01 UTC (3,726 KB)


