Deep Learning: UNIT-II: CNN: 3. Pooling Layers
3. Pooling Layer
Apart from Conv2D, we also used another specific module called MaxPooling2D in our model, which had no learnable parameters. Let's try to understand what pooling is and what the significance of a pooling module is.
The pooling layer helps in reducing the height and width of the image (or feature map). Thus, it helps in reducing the number of parameters and also makes the model invariant to small shifts in the position of features in the input. In other words, it induces "compositionality" in our model architecture.
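As a quick illustration of the position-invariance point, here is a minimal NumPy sketch (the values are made up for illustration, not taken from the notes): max pooling over a window returns the same value no matter where inside that window the strong activation sits.
import numpy as np

# Two 2x2 regions: the strong activation sits at different positions inside the window,
# but max pooling over the window returns the same value in both cases.
region_a = np.array([[9, 0],
                     [0, 0]])
region_b = np.array([[0, 0],
                     [0, 9]])
print(region_a.max(), region_b.max())  # 9 9 -> the pooled value ignores the exact position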
There are two kinds of pooling:
- Max pooling: A filter of size (f, f) is slid over the input with a stride, and we take the maximum value in that region. It is typically used in between the Conv layers.
- Average pooling: A filter of size (f, f) is slid over the input with a stride, and we take the average of the values in that region. This type of pooling is generally used just before the fully connected layers for feature extraction.
The figure above illustrates max pooling using a (2, 2) filter with a stride of 2.
The figure above illustrates average pooling using a (2, 2) filter with a stride of 2.
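In case the figures do not render, the following minimal NumPy sketch (with arbitrary example values) reproduces the same computation: a 4x4 input pooled with a (2, 2) window and a stride of 2 gives a 2x2 output for both max and average pooling.
import numpy as np

# A 4x4 input feature map (values chosen only for illustration).
x = np.array([[1, 3, 2, 4],
              [5, 6, 1, 2],
              [7, 2, 9, 0],
              [3, 8, 4, 5]])

f, s = 2, 2  # pooling window size and stride
out_h = (x.shape[0] - f) // s + 1
out_w = (x.shape[1] - f) // s + 1

max_pool = np.zeros((out_h, out_w))
avg_pool = np.zeros((out_h, out_w))
for i in range(out_h):
    for j in range(out_w):
        window = x[i * s:i * s + f, j * s:j * s + f]
        max_pool[i, j] = window.max()   # max pooling keeps the largest value in the window
        avg_pool[i, j] = window.mean()  # average pooling keeps the mean of the window

print(max_pool)  # maxima of the four windows: 6, 4, 8, 9
print(avg_pool)  # averages of the four windows: 3.75, 2.25, 5.0, 4.5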
Do pooling layers add any learnable weights to the network?
- No. Notice that no kernel/filter matrix with learnable values is involved.
- Hence, no parameters are added while computing pooling.
What is the output shape after a pooling layer?
- The output shape is given by:
  output size = floor((n - f) / s) + 1
where n is the input size, f is the pooling filter size and s is the stride size.
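A quick worked check of this formula, using the same settings as the figures (a 4x4 input, a (2, 2) filter and a stride of 2; the numbers are only illustrative):
# Worked example of the output-size formula
n, f, s = 4, 2, 2        # 4x4 input, (2, 2) pooling filter, stride of 2
out = (n - f) // s + 1   # floor((n - f) / s) + 1
print(out)               # 2 -> the pooled output is 2x2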
How does pooling work in Keras?
layers.MaxPooling2D()
layers.AveragePooling2D()
Both these Pooling layers
have 2 important arguments:
- pool_size: This is the size of the window over which pooling is applied. By default this size is 2. It can be an integer or a tuple of 2 integers. It can also be thought of as a non-learnable filter that either takes the maximum over the region or averages the values in it.
- strides: This is the same as in the Conv2D layer. The default is None, which means the stride size is the same as the pool size.
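A minimal usage sketch of both layers (shapes and values are illustrative only; assumes TensorFlow 2.x, where Keras layers can be called directly on arrays):
import numpy as np
from tensorflow.keras import layers

x = np.random.rand(1, 32, 32, 3).astype("float32")  # one 32x32 "image" with 3 channels

max_pool = layers.MaxPooling2D(pool_size=2)                 # strides defaults to pool_size
avg_pool = layers.AveragePooling2D(pool_size=2, strides=2)

print(max_pool(x).shape)        # (1, 16, 16, 3) -> height and width halved, channels unchanged
print(avg_pool(x).shape)        # (1, 16, 16, 3)
print(max_pool.count_params())  # 0 -> pooling adds no learnable parameters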