Abstract: Convolutional neural networks (CNNs) have the natural property that a translated input produces correspondingly translated feature maps. In other words, the feature maps change in a predictable way when the input is translated. This property is called equivariance. In many tasks it would be desirable to have equivariant behavior for other transformations as well, but classic CNNs do not provide it. Since the introduction of group convolutional networks in 2016, this topic has attracted a lot of attention. That paper introduced equivariance to rotations by multiples of 90 degrees. We will discuss a follow-up by Maurice Weiler et al. that generalized the concept to any discrete group of rotations and achieved state-of-the-art performance on rotated MNIST. It relies on elegant mathematical tools, the so-called steerable filters.
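To make the translation-equivariance property concrete, here is a minimal NumPy/SciPy sketch (not part of the talk, just an illustration): with periodic boundary conditions, convolving a shifted image gives exactly the same result as shifting the convolved image.

```python
import numpy as np
from scipy.ndimage import convolve

rng = np.random.default_rng(0)
image = rng.random((16, 16))   # toy "input image"
kernel = rng.random((3, 3))    # toy convolution filter

def translate(x, dy, dx):
    # Cyclic shift, so the identity holds exactly (no boundary effects).
    return np.roll(np.roll(x, dy, axis=0), dx, axis=1)

def conv(x):
    # Periodic ('wrap') boundary to match the cyclic shift above.
    return convolve(x, kernel, mode="wrap")

lhs = conv(translate(image, 2, 3))   # translate first, then convolve
rhs = translate(conv(image), 2, 3)   # convolve first, then translate
assert np.allclose(lhs, rhs)         # equivariance: both orders agree
print("max difference:", np.abs(lhs - rhs).max())
```

Replacing the translation with, say, a rotation breaks this identity for an ordinary convolution, which is exactly the gap that group convolutions and steerable filters address.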
Who: Václav Košík
When: 10:00 a.m. Friday, February 23
Where: The session will take place in person at the Institute of Information Theory and Automation (UTIA), in room 25 or room 45 (café), depending on the number of attendees. For directions to the institute, please refer to the following link: https://www.utia.cas.cz/contacts#way
Language: Czech (if you require English, please let us know in advance)