Gradient descent is an algorithm used to minimize a cost function. We use gradient descent in machine learning to update our model parameters. In linear regression, the parameters are the coefficients; in deep learning, they are called weights.

Here, we will look into the gradient descent algorithm in machine learning. Let's get started!

#### Linear regression:

Linear regression is a statistical technique that models the relationship between one or more independent variables and one dependent variable.

In linear regression, we have a cost function, and we need to minimize it to get the best predictions.
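The cost function here is the mean squared error (MSE), the same quantity computed in the code later in this post:

```latex
J(m, b) = \frac{1}{n}\sum_{i=1}^{n}\bigl(y_i - \hat{y}_i\bigr)^2,
\qquad \hat{y}_i = m x_i + b
```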

We need to find the best-fit line that approximately predicts future data. We calculate the line equation with the parameters m and b.

**Read More here:**

Linear Regression in Machine learning – A Simple Way of understanding

## Terminology in Gradient descent

Gradient descent involves a little math, but don't worry, you will only need a small part of it here. Let's discuss the terminology in the gradient descent algorithm.

#### Learning rate:

The learning rate controls the size of the baby steps we take toward the local minimum. It is a small float value such as 0.001, 0.0001, or 0.01, and we usually find a good value through trial and error. It has a big impact on how the cost decreases.
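As a toy illustration (not part of the linear-regression code below), here is a single update step on the stand-in cost f(m) = m², whose derivative is 2m. The learning rate scales how far each step moves:

```python
# One gradient-descent step on the toy cost f(m) = m**2 (derivative 2*m).
def step(m, learning_rate):
    gradient = 2 * m                    # derivative of f at the current m
    return m - learning_rate * gradient

print(step(1.0, 0.001))  # small learning rate: tiny step toward the minimum at 0
print(step(1.0, 0.1))    # larger learning rate: bigger step
```

A learning rate that is too large can overshoot the minimum, making the cost grow instead of shrink.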

#### Iterations

Iterations – an integer that refers to the number of times we repeat the update. In each iteration, we update m and b and calculate the cost. Typical values are n = 1000, 2000, or 2500.

#### Derivatives of m and b:

Derivatives are used to minimize the function. In linear regression, we have the parameters slope **m** and intercept **b**. Initially, we pick m and b randomly.
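For the MSE cost, the two derivatives (the same expressions computed as `md` and `bd` in the code below) are:

```latex
\frac{\partial J}{\partial m} = -\frac{2}{n}\sum_{i=1}^{n} x_i \,(y_i - \hat{y}_i),
\qquad
\frac{\partial J}{\partial b} = -\frac{2}{n}\sum_{i=1}^{n} (y_i - \hat{y}_i)
```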

#### Update m and b:

We need to update the current **m** and **b** in each iteration. We compute the new **m** and **b** from the **old m**, the **old b**, and the **learning rate**.

After updating the **m** and **b** parameters, we recalculate the line equation with the new **m** and **b**.
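In symbols, the update in each iteration is:

```latex
m_{\text{new}} = m_{\text{old}} - \text{learning\_rate} \times \frac{\partial J}{\partial m},
\qquad
b_{\text{new}} = b_{\text{old}} - \text{learning\_rate} \times \frac{\partial J}{\partial b}
```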

### Steps

Here are the simple steps to implement the gradient descent algorithm.

**Step-1:** Define the learning rate and the number of iterations. Initialize the slope and intercept randomly.

**Step-2:** Calculate the line equation in each iteration.

**Step-3:** Calculate the cost function.

**Step-4:** Calculate the derivative of **m** and the derivative of **b**.

**Step-5:** Update **m** and **b** using the learning rate and the current **m** and **b**.

**Step-6:** Whenever the cost stops decreasing (it is close to its minimum), we stop the process and take those parameters to define our best-fit line in linear regression.

Let's build the gradient descent algorithm from scratch. Here is the code.

```
import numpy as np

def gradient_descent(x, y, learning_rate=0.001, iterations=200):
    # start with random m and b
    m_curr = 1
    b_curr = 1
    n = len(x)
    for i in range(iterations):
        y_pred = m_curr * x + b_curr                         # y = mx + b
        cost = (1/n) * sum([val**2 for val in y - y_pred])   # MSE: 1/n * sum((yi - y_pred)**2)
        # derivatives of m and b
        md = -(2/n) * sum(x * (y - y_pred))
        bd = -(2/n) * sum(y - y_pred)
        # update m and b
        m_curr = m_curr - learning_rate * md
        b_curr = b_curr - learning_rate * bd
        print('Iteration: {} --- m={} and b={} and cost={}'.format(i, m_curr, b_curr, cost))

x = np.array([1, 2, 3, 4, 5, 6, 7])
y = np.array([12, 20, 34, 41, 52, 63, 72])
gradient_descent(x, y, iterations=1000, learning_rate=0.001)
```

In this example, I got a cost of about **1.5**. By changing the iterations and the learning rate, you can get an even lower cost.

And my parameters are

```
Iteration: 999 --- m=9.955591558442103 and b=2.3558995569235415 and cost=1.51912343893408
```

So now, using those parameters, I can build my line equation like this:

```
y = 9.96 * Xi + 2.35 # It is y = mx + b
```
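As a quick check, we can plug a new x into that line to get a prediction (using the rounded parameters from the run above):

```python
# Predict with the fitted parameters (rounded values from the run above).
m, b = 9.96, 2.35

def predict(x_i):
    return m * x_i + b   # y = mx + b

print(predict(8))  # prediction for the unseen input x = 8
```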

I hope you now have a clear idea of the gradient descent algorithm in machine learning. Subscribe to our newsletter to get more on machine learning. Thank you for being here.

