ReLU (Rectified Linear Unit) is one of the most widely used activation functions. Because its gradient is 1 for all positive inputs, it does not saturate the way sigmoid or tanh do, which helps mitigate the vanishing gradient problem in deep networks.
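
As a minimal sketch (assuming the standard definition f(x) = max(0, x)), here is ReLU and its gradient implemented with NumPy; the function and variable names are illustrative, not from any particular library:

```python
import numpy as np

def relu(x):
    # ReLU: element-wise max(0, x)
    return np.maximum(0, x)

def relu_grad(x):
    # Gradient is 1 where x > 0 and 0 where x < 0.
    # (The derivative is undefined at x == 0; 0 is used here by convention.)
    return (x > 0).astype(x.dtype)

x = np.array([-2.0, -0.5, 0.0, 1.5, 3.0])
print(relu(x))       # [0.  0.  0.  1.5 3. ]
print(relu_grad(x))  # [0. 0. 0. 1. 1.]
```

The constant gradient of 1 on the positive side is what keeps gradients from shrinking as they propagate backward through many layers, though neurons whose inputs stay negative receive zero gradient (the "dying ReLU" issue).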