Mathematics Behind Gradient Descent, Simply Explained

#artificialintelligence 

In previous articles we discussed linear regression and gradient descent, getting a simple overview of the concepts and a practical tutorial on how they work. In this article, we will look at the mathematics behind gradient descent and how an "optimizer" reaches the global minimum. If the term "optimizer" is new to you, it is simply the procedure that searches for the global minimum of the loss function, which in linear regression corresponds to the coefficients of the best-fit line. Similar concepts are also used in deep learning algorithms.
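To make the idea concrete, here is a minimal sketch of gradient descent fitting a line by minimizing mean squared error. The function name, learning rate, iteration count, and toy data are all illustrative choices, not code from the original tutorial:

```python
import numpy as np

def gradient_descent(x, y, lr=0.01, epochs=1000):
    """Fit y ≈ m*x + b by gradient descent on mean squared error."""
    m, b = 0.0, 0.0               # start both coefficients at zero
    n = len(x)
    for _ in range(epochs):
        y_pred = m * x + b
        # Partial derivatives of MSE with respect to m and b
        dm = (-2 / n) * np.sum(x * (y - y_pred))
        db = (-2 / n) * np.sum(y - y_pred)
        m -= lr * dm              # step opposite the gradient
        b -= lr * db
    return m, b

# Toy data lying exactly on y = 2x + 1, so the optimizer
# should recover m close to 2 and b close to 1.
x = np.array([0.0, 1.0, 2.0, 3.0, 4.0])
y = 2 * x + 1
m, b = gradient_descent(x, y)
```

Each iteration computes the gradient of the loss and moves the coefficients a small step in the opposite direction; with a suitable learning rate, the steps shrink as the loss surface flattens near the minimum.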
