The output of the convolutional layer is often passed through the ReLU activation function to bring non-linearity into the model. It takes the feature map and replaces all of the negative values with zero. In the convolution layer, we slide the filter/kernel to every possible position on the input matrix. Element-wise multiplication is performed between the filter and the overlapping region of the input, and the results are summed to produce a single value of the feature map.
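
The two steps above can be sketched in a few lines of NumPy. This is a minimal illustration, not an optimized implementation: `conv2d` slides the kernel over every valid position, multiplies element-wise, and sums, while `relu` zeroes out the negative entries of the resulting feature map. The example image and kernel values are made up for demonstration.

```python
import numpy as np

def conv2d(image, kernel):
    # Slide the kernel over every valid position of the input,
    # multiply element-wise, and sum to build the feature map.
    kh, kw = kernel.shape
    oh = image.shape[0] - kh + 1
    ow = image.shape[1] - kw + 1
    out = np.zeros((oh, ow))
    for i in range(oh):
        for j in range(ow):
            out[i, j] = np.sum(image[i:i + kh, j:j + kw] * kernel)
    return out

def relu(x):
    # Replace all negative values with zero.
    return np.maximum(x, 0)

image = np.array([[1., 2., 0.],
                  [0., 1., 3.],
                  [4., 0., 1.]])
kernel = np.array([[1., -1.],
                   [-1., 1.]])

feature_map = conv2d(image, kernel)   # [[ 0.,  4.], [-5., -1.]]
activated = relu(feature_map)         # [[ 0.,  4.], [ 0.,  0.]]
```

Note how `relu` leaves the positive response at position (0, 1) untouched but suppresses both negative responses in the second row, which is exactly the non-linearity described above.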