# Linear Regression

Given a set of points, find a line that fits the data best! Even this simple form of regression allows us to predict future points.

## Key Concepts

Review the core concepts you need to master this subject.

### Gradient Descent in Regression

`Gradient Descent` is an iterative algorithm used to tune the parameters in regression models to minimize loss.
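To make the definition concrete, here is a minimal sketch of gradient descent on a toy one-variable function (the function and all names are illustrative, not from the lesson): repeatedly step opposite the gradient, scaled by a learning rate, until the updates become negligible.

```python
def gradient_descent(grad, start, learning_rate=0.1, tolerance=1e-8, max_steps=100_000):
    """Minimize a function of one variable given its gradient.

    Steps in the direction opposite the gradient until the step size
    drops below `tolerance` (a simple convergence criterion).
    """
    x = start
    for _ in range(max_steps):
        step = learning_rate * grad(x)
        x -= step
        if abs(step) < tolerance:
            break
    return x

# Minimize f(x) = (x - 3)^2, whose gradient is 2 * (x - 3).
minimum = gradient_descent(lambda x: 2 * (x - 3), start=0.0)
print(round(minimum, 4))  # → 3.0
```

The same idea scales to regression: instead of one variable, we step both the intercept and the slope downhill on the loss surface.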

## Linear Regression (Lesson 1 of 1)

1. The purpose of machine learning is often to create a model that explains some real-world data, so that we can predict what may happen next, with different inputs. The simplest model that we can fi…
2. In the last exercise, you were probably able to make a rough estimate about the next data point for Sandra’s lemonade stand without thinking too hard about it. For our program to make the same leve…
4. The goal of a linear regression model is to find the slope and intercept pair that minimizes loss on average across all of the data. The interactive visualization in the browser lets you try to fi…
5. As we try to minimize loss, we take each parameter we are changing, and move it as long as we are decreasing loss. It’s like we are moving down a hill, and stop once we reach the bottom: The p…
6. We have a function to find the gradient of b at every point. To find the m gradient, or the way the loss changes as the slope of our line changes, we can use this formula: -\frac{2}{N}\sum_{i=1}^{…
7. Now that we know how to calculate the gradient, we want to take a “step” in that direction. However, it’s important to think about whether that step is too big or too small. We don’t want to oversh…
8. How do we know when we should stop changing the parameters m and b? How will we know when our program has learned enough? To answer this, we have to define convergence. **Convergence** is when the…
9. We want our program to be able to iteratively *learn* what the best m and b values are. So for each m and b pair that we guess, we want to move them in the direction of the gradients we’ve calculat…
10. At each step, we know how to calculate the gradient and move in that direction with a step size proportional to our learning rate. Now, we want to make these steps until we reach convergence.
11. We have constructed a way to find the “best” b and m values using gradient descent! Let’s try this on the set of baseball players’ heights and weights that we saw at the beginning of the lesson.
12. Congratulations! You’ve now built a linear regression algorithm from scratch. Luckily, we don’t have to do this every time we want to use linear regression. We can use Python’s scikit-learn librar…
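The outline above can be sketched end to end. The snippet below is a minimal from-scratch version, under the assumption that the loss is mean squared error: the gradients of the loss with respect to the intercept b and slope m are the standard MSE gradients, the step size is scaled by a learning rate, and we stop at convergence, when the parameters stop changing meaningfully. The sample data and the names `fit`, `learning_rate`, and `tolerance` are illustrative, not from the lesson.

```python
def gradients(x, y, m, b):
    """Gradients of mean squared error with respect to b and m.

    b gradient: -(2/N) * sum(y_i - (m*x_i + b))
    m gradient: -(2/N) * sum(x_i * (y_i - (m*x_i + b)))
    """
    n = len(x)
    b_grad = -(2 / n) * sum(y[i] - (m * x[i] + b) for i in range(n))
    m_grad = -(2 / n) * sum(x[i] * (y[i] - (m * x[i] + b)) for i in range(n))
    return b_grad, m_grad

def fit(x, y, learning_rate=0.01, tolerance=1e-6, max_steps=100_000):
    """Find the best-fit slope m and intercept b via gradient descent."""
    m, b = 0.0, 0.0
    for _ in range(max_steps):
        b_grad, m_grad = gradients(x, y, m, b)
        new_b = b - learning_rate * b_grad
        new_m = m - learning_rate * m_grad
        # Convergence: both parameters stopped changing meaningfully.
        if abs(new_b - b) < tolerance and abs(new_m - m) < tolerance:
            return new_m, new_b
        m, b = new_m, new_b
    return m, b

x = [1, 2, 3, 4, 5]
y = [2, 4, 6, 8, 10]  # data lies exactly on y = 2x
m, b = fit(x, y)
print(round(m, 2), round(b, 2))  # → approximately 2.0 and 0.0
```

In practice, as the final item notes, scikit-learn packages all of this: `from sklearn.linear_model import LinearRegression`, then `model = LinearRegression()`, `model.fit(X, y)` (where `X` is a 2-D array of samples), and `model.predict(...)`; the learned parameters are available as `model.coef_` and `model.intercept_`.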

## What you'll create

Portfolio projects that showcase your new skills

## How you'll master it

Stress-test your knowledge with quizzes that help commit syntax to memory