# Max Pooling

*In continuation of my previous post, **"Getting started with Deep Learning"**, in this post I will be discussing Max pooling.*

*Glimpse of what to expect:*

- *What is Pooling?*
- *Why is Max pooling used?*
- *How does the Max pooling operation work?*

*So let’s start this pool(ing) party!!*

**What is Pooling?**

*Pooling is an approach to down-sampling. It is a technique used to reduce the dimensionality of the feature map obtained from the previous convolutional layer, by reducing the number of pixels in the output. A pooling layer is a new layer added after the convolutional layer. Commonly used pooling methods are Max pooling, Average pooling and Min pooling.*

- **Max Pooling** - calculates the **maximum** of each block of the feature map.
- **Average Pooling** - calculates the **average** of each block of the feature map.
- **Min Pooling** - calculates the **minimum** of each block of the feature map.
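As a quick sketch, here is what the three methods compute on a single block. The 2×2 block below is a made-up example, not taken from any real feature map:

```python
import numpy as np

# A hypothetical 2x2 block of a feature map (values chosen for illustration)
block = np.array([[1.0, 3.0],
                  [2.0, 8.0]])

print(block.max())   # Max pooling keeps the largest value: 8.0
print(block.mean())  # Average pooling keeps the mean: 3.5
print(block.min())   # Min pooling keeps the smallest value: 1.0
```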

*We will focus on Max pooling for this post!!*

**Why is Max Pooling used?**

*Two main reasons why max pooling is effective are:*

- *It reduces the number of parameters passed forward, and hence the computational load.*
- *Higher-valued pixels are the most activated, and they are the ones captured by this operation.*
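The reduction is easy to see in code. In this sketch (the feature map values are arbitrary), a 2×2 max pool with stride 2 shrinks a 4×4 feature map, i.e. 16 values, down to 2×2, i.e. 4 values, so a quarter as many activations flow to the next layer:

```python
import numpy as np

# Hypothetical 4x4 feature map with arbitrary values
fmap = np.arange(16, dtype=float).reshape(4, 4)

# Split into non-overlapping 2x2 blocks and keep the max of each block
pooled = fmap.reshape(2, 2, 2, 2).max(axis=(1, 3))

print(fmap.shape, "->", pooled.shape)  # (4, 4) -> (2, 2)
print(pooled)
```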

# Max Pooling Process

*Now that we have a basic idea of what max pooling intends to do, let’s discuss the operation itself.*

*We discussed the role of **kernels** in the previous post. They act as filters, and when they convolve over an image, they create an output corresponding to the computations the filter applies.*

*For max pooling, we define:*

- *a filter (or kernel) of size n×n*
- *a stride value k (by how many pixels we want our filter to move)*

*For each position of the filter, the maximum value of the n×n block of the image under it is captured as the output for the next layer. The filter is then moved by k pixels (the stride) to perform the same operation again. One key point to note is that max pooling is applied after a convolutional layer.*
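The steps above can be sketched as a short NumPy function. The function name `max_pool` and the sample feature map are my own choices for illustration; `n` is the filter size and `k` is the stride, as defined above:

```python
import numpy as np

def max_pool(image, n, k):
    """Slide an n x n window over `image` with stride k,
    keeping the maximum value under each window position."""
    h, w = image.shape
    out_h = (h - n) // k + 1  # number of vertical window positions
    out_w = (w - n) // k + 1  # number of horizontal window positions
    out = np.empty((out_h, out_w), dtype=image.dtype)
    for i in range(out_h):
        for j in range(out_w):
            # Take the max of the n x n block at this filter position
            out[i, j] = image[i*k:i*k+n, j*k:j*k+n].max()
    return out

# Hypothetical 4x4 feature map
fmap = np.array([[1, 3, 2, 4],
                 [5, 6, 1, 2],
                 [7, 2, 9, 0],
                 [1, 4, 3, 8]], dtype=float)

print(max_pool(fmap, n=2, k=2))
# [[6. 4.]
#  [7. 9.]]
```

With `n=2, k=2` the windows do not overlap; choosing `k < n` gives overlapping windows, which is also common in practice.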

*You will also hear about Global Average Pooling (the GAP layer) while working with deep learning models; I will touch on that in coming posts. Till then, stay tuned! Stay safe!*