Romain Hermary

Learning grayscale mathematical morphology with smooth morphological layers

Abstract

The integration of mathematical morphology operations within convolutional neural network architectures has received increasing attention lately. However, replacing standard convolution layers with morphological layers performing erosions or dilations is particularly challenging because the min and max operations are not differentiable. P-convolution layers were proposed as a possible solution to this issue, since they can act as smooth, differentiable approximations of the min and max operations, yielding pseudo-dilation or pseudo-erosion layers. In a recent work, we proposed two novel morphological layers based on the same principle as the p-convolution while circumventing its principal drawbacks, and showcased their capacity to efficiently learn grayscale morphological operators, while also identifying several edge cases. In this work, we complete those previous results by thoroughly analyzing the behavior of the proposed layers and by investigating and resolving the reported edge cases. We also demonstrate the compatibility of one of the proposed morphological layers with binary morphological frameworks.
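To make the idea of a smooth approximation of min and max concrete, the sketch below illustrates the p-convolution as a counter-harmonic mean filter, a common formulation of this layer; the function name p_convolution, the flat structuring element, and the chosen values of p are illustrative assumptions, and this is not the exact formulation of the layers proposed in the paper.

```python
# Minimal sketch of a p-convolution (counter-harmonic mean) acting as a
# smooth pseudo-dilation / pseudo-erosion. Assumes strictly positive inputs.
import torch
import torch.nn.functional as F

def p_convolution(f, w, p):
    """Counter-harmonic mean filter: (f^(p+1) * w) / (f^p * w).

    Large positive p -> pseudo-dilation (soft local max),
    large negative p -> pseudo-erosion (soft local min),
    p = 0 -> plain convolution.
    """
    pad = w.shape[-1] // 2
    num = F.conv2d(f.pow(p + 1), w, padding=pad)
    den = F.conv2d(f.pow(p), w, padding=pad)
    return num / (den + 1e-12)  # small epsilon for numerical stability

# Toy usage: a flat 3x3 structuring element on a random positive image.
f = torch.rand(1, 1, 32, 32) + 0.1            # strictly positive grayscale image
w = torch.ones(1, 1, 3, 3)                     # flat structuring element
pseudo_dilation = p_convolution(f, w, p=20.0)  # approaches the local max
pseudo_erosion = p_convolution(f, w, p=-20.0)  # approaches the local min
```

Because the whole expression is built from differentiable operations, gradients flow through p and w, which is what allows such layers to be trained end-to-end in place of standard convolutions.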
