Valentin Wolf:

EwaldBlocks: Translating the Idea of Ewald Summation to Neural Networks for Global and Local Feature Extraction

Abstract

This Bachelor's thesis introduces the EwaldBlock, a neural network component for efficiently extracting local and global features from images. Inspired by Ewald summation, a method for efficiently computing long-range interactions in particle systems, the EwaldBlock splits its input into high- and low-frequency components and processes them separately. The high-frequency part contains local features that standard convolutional layers capture well. Global features remain in the low-frequency part of the image; convolutional layers are ill-suited to capture them, since they can only represent features up to the size of their kernel. The EwaldBlock therefore multiplies the lowest frequency components pointwise with learned kernels in Fourier space, which, by the circular convolution theorem, approximates a convolution with a kernel spanning the entire image. EwaldBlocks are designed to replace convolutional layers in existing Convolutional Neural Network (CNN) architectures. Experiments show that the EwaldBlock can improve the performance of shallow architectures, and that computation in the spectral domain could lead to models that are more robust against noise perturbations and less biased towards local textures than CNNs.
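To illustrate the mechanism described above, the following is a minimal, hypothetical PyTorch-style sketch of an EwaldBlock-like layer. It is not the thesis implementation; the class name, the `low_freq_size` parameter, and the simplification of keeping only the k×k lowest FFT modes (ignoring the negative-frequency corners) are assumptions made for brevity.

```python
import torch
import torch.nn as nn


class EwaldBlockSketch(nn.Module):
    """Hypothetical sketch of an EwaldBlock-style layer (not the thesis code).

    The input is split in Fourier space into low- and high-frequency parts.
    The high-frequency (local) part is processed by a standard convolution;
    the lowest frequency components are multiplied pointwise with a learned
    complex kernel, which by the circular convolution theorem approximates a
    convolution whose kernel spans the whole image.
    """

    def __init__(self, channels: int, low_freq_size: int = 8):
        super().__init__()
        self.k = low_freq_size  # number of low-frequency modes kept per axis (assumed)
        # Learned complex-valued kernel for the low-frequency (global) branch.
        self.spectral_weight = nn.Parameter(
            0.02 * torch.randn(channels, low_freq_size, low_freq_size, dtype=torch.cfloat)
        )
        # Standard small-kernel convolution for the high-frequency (local) branch.
        self.local_conv = nn.Conv2d(channels, channels, kernel_size=3, padding=1)

    def forward(self, x: torch.Tensor) -> torch.Tensor:
        k = self.k
        # Full 2D FFT of the input image batch (shape: batch, channels, H, W).
        x_hat = torch.fft.fft2(x)

        # Global branch: keep only the k x k lowest modes and multiply them
        # pointwise with the learned spectral kernel, then transform back.
        low = torch.zeros_like(x_hat)
        low[..., :k, :k] = x_hat[..., :k, :k] * self.spectral_weight
        global_features = torch.fft.ifft2(low).real

        # Local branch: the remaining (high-frequency) modes are transformed
        # back and processed by an ordinary convolutional layer.
        high = x_hat.clone()
        high[..., :k, :k] = 0
        local_features = self.local_conv(torch.fft.ifft2(high).real)

        # Recombine local and global features.
        return local_features + global_features
```

In this sketch the two branches are simply summed; the actual combination, normalization, and mode selection used in the thesis may differ.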

Supervisors
Frank Noé, Tim Landgraf
Degree
Bachelor of Science (B.Sc.)
Submission date
20.08.2019
Language
English