225 Attention U-Net. What is attention and why is it needed for U-Net?
Attention in U-Net is a mechanism that highlights only the relevant activations during training. It reduces the computation wasted on irrelevant activations and improves the generalization of the network.

There are two types of attention:

1. Hard attention: highlights relevant regions by cropping, attending to one region of the image at a time. The choice is discrete: the network either pays attention to a region or it does not, with nothing in between. This makes the operation non-differentiable, so backpropagation cannot be used and the mechanism must be trained with reinforcement learning.

2. Soft attention: weights different parts of the image. Relevant parts of the image get large weights and less relevant parts get small weights. Because the weighting is continuous, it is differentiable and can be trained with backpropagation. During training the attention weights are learned along with the rest of the network, so the model pays progressively more attention to relevant regions. In summary, it assigns each pixel a weight based on its relevance.

Why is attention needed in U-Net? U-Net's skip connections combine spatial information from the downsampling path with feature information from the upsampling path. These skip connections also carry across many irrelevant low-level activations; a soft-attention gate placed at each skip connection suppresses activations in irrelevant regions, so that only relevant features are passed on to the decoder (see the sketch below).
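To make the soft-attention idea concrete, here is a minimal sketch of an additive attention gate in the style of Attention U-Net (Oktay et al., 2018), written in PyTorch. The class name AttentionGate and the channel sizes are illustrative choices, not part of the original text, and for simplicity the gating signal is assumed to already match the spatial size of the skip features (the paper uses a strided projection to align them).

```python
import torch
import torch.nn as nn

class AttentionGate(nn.Module):
    """Additive soft-attention gate (sketch, Attention U-Net style).

    g: gating signal from the decoder (coarser, semantically richer).
    x: skip-connection features from the encoder (finer spatial detail).
    Returns x re-weighted by a learned attention map with values in [0, 1].
    """
    def __init__(self, g_channels, x_channels, inter_channels):
        super().__init__()
        # Project both inputs to a common intermediate channel count.
        self.W_g = nn.Conv2d(g_channels, inter_channels, kernel_size=1)
        self.W_x = nn.Conv2d(x_channels, inter_channels, kernel_size=1)
        # Collapse to one attention coefficient per pixel.
        self.psi = nn.Conv2d(inter_channels, 1, kernel_size=1)
        self.relu = nn.ReLU(inplace=True)
        self.sigmoid = nn.Sigmoid()

    def forward(self, g, x):
        # Additive attention: alpha = sigmoid(psi(relu(W_g*g + W_x*x))).
        a = self.relu(self.W_g(g) + self.W_x(x))
        alpha = self.sigmoid(self.psi(a))   # shape (N, 1, H, W), soft weights
        return x * alpha                    # suppress irrelevant skip features

# Usage: gate encoder skip features with a decoder gating signal.
gate = AttentionGate(g_channels=64, x_channels=64, inter_channels=32)
g = torch.randn(1, 64, 32, 32)   # decoder gating signal
x = torch.randn(1, 64, 32, 32)   # encoder skip-connection features
out = gate(g, x)                 # same shape as x, re-weighted per pixel
```

Because the sigmoid produces continuous weights rather than a hard crop, the whole gate is differentiable and trains end to end with the rest of the U-Net via backpropagation.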