Gradient descent (GD) is a first-order optimization algorithm that finds a local minimum of a differentiable function by taking steps proportional to the negative of the function's gradient at the current point.
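The update rule can be sketched in a few lines of Python. This is a minimal illustration on the function f(x) = x², whose gradient is 2x; the learning rate, starting point, and step count here are illustrative choices, not prescribed values.

```python
def gradient_descent(grad, x0, lr=0.1, steps=100):
    """Iteratively step against the gradient to approach a local minimum."""
    x = x0
    for _ in range(steps):
        x = x - lr * grad(x)  # step proportional to the negative gradient
    return x

# Minimize f(x) = x**2 starting from x = 5.0; the gradient is 2*x.
x_min = gradient_descent(lambda x: 2 * x, x0=5.0)
print(x_min)  # converges toward the minimizer x = 0
```

Too large a learning rate can cause the iterates to overshoot and diverge, while too small a rate converges slowly; in practice the rate is tuned or scheduled.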