# #30 Machine Learning Specialization [Course 1, Week 2, Lesson 2] | Summary and Q&A

15.3K views • December 1, 2022 • by DeepLearningAI

## TL;DR

Learn how to fit curves and non-linear functions to your data using polynomial regression and feature engineering.

## Key Insights

• Polynomial regression allows fitting curves and non-linear functions to data, expanding modeling capabilities beyond simple linear relationships.
• Feature scaling becomes increasingly important when using higher-power polynomial features, as their value ranges can differ significantly from the original features.
• There is a wide range of choices for additional features in polynomial regression, such as squared, cubed, or square-root functions of the original feature.
• Polynomial regression and feature engineering can be implemented with libraries like scikit-learn, a widely used machine learning toolkit.
• Understanding how to implement linear regression manually is crucial for a solid grasp of the algorithm, even when using libraries like scikit-learn.
• The optional labs accompanying the video provide practical examples and code for implementing polynomial regression and using scikit-learn for linear regression.
• In practice, machine learning practitioners often rely on libraries like scikit-learn for their efficiency and convenience in training models.
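The insights above mention using scikit-learn for linear regression. As a minimal sketch (the housing numbers below are made up for illustration, not from the course), fitting a model takes only a few lines:

```python
import numpy as np
from sklearn.linear_model import LinearRegression

# Illustrative housing data: size in square feet -> price in $1000s (made-up values)
X = np.array([[650.0], [785.0], [1200.0], [1500.0], [2230.0]])
y = np.array([178.0, 205.0, 290.0, 330.0, 435.0])

model = LinearRegression()
model.fit(X, y)

# Learned parameters: slope (price per square foot) and intercept
print(model.coef_, model.intercept_)
```

Under the hood this solves the same least-squares problem covered when implementing linear regression manually, which is why the course still walks through the manual version first.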

## Transcript

So far we've just been fitting straight lines to our data. Let's take the ideas of multiple linear regression and feature engineering to come up with a new algorithm called polynomial regression, which lets you fit curves (non-linear functions) to your data. Let's say you have a housing data set that looks like this, where feature x is the size in square...

### Q: What is polynomial regression?

Polynomial regression is an algorithm that allows fitting curves and non-linear functions to data by adding features raised to different powers, such as squared or cubed features.
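In code, adding powers of an existing feature is a one-step transformation. A sketch using scikit-learn's `PolynomialFeatures` (the input values here are arbitrary examples):

```python
import numpy as np
from sklearn.preprocessing import PolynomialFeatures

# A single feature x (e.g. house size); expand it to [x, x^2, x^3]
x = np.array([[2.0], [3.0]])
poly = PolynomialFeatures(degree=3, include_bias=False)
X_poly = poly.fit_transform(x)
print(X_poly)  # rows [2, 4, 8] and [3, 9, 27]
```

A plain linear regression fit on these expanded columns is exactly polynomial regression: the model is still linear in its parameters, but non-linear in the original feature.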

### Q: Why is feature scaling important in polynomial regression?

Feature scaling is critical in polynomial regression because the added polynomial features can have drastically different ranges of values compared to the original features. Feature scaling ensures that all features are on comparable scales for efficient gradient descent optimization.
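To see why: if house size ranges up to 2,000 square feet, the cubed feature ranges into the billions. A common way to handle this (a sketch, not the course's exact lab code; the data is made up) is to scale the polynomial features before running gradient descent:

```python
import numpy as np
from sklearn.linear_model import SGDRegressor
from sklearn.pipeline import make_pipeline
from sklearn.preprocessing import PolynomialFeatures, StandardScaler

# Made-up sizes in square feet; without scaling, x^3 reaches into the billions
X = np.array([[500.0], [1000.0], [1500.0], [2000.0]])
y = np.array([150.0, 280.0, 390.0, 480.0])

model = make_pipeline(
    PolynomialFeatures(degree=3, include_bias=False),
    StandardScaler(),  # z-score each column so gradient descent converges well
    SGDRegressor(max_iter=10000, tol=1e-6, random_state=0),
)
model.fit(X, y)
```

`SGDRegressor` is a gradient-descent-based fitter, so it shows the convergence benefit directly; the closed-form `LinearRegression` is less sensitive to scale.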

### Q: What are some alternative choices of features in polynomial regression?

Besides squared and cubed features, alternative choices of features in polynomial regression can include square root functions of the original feature. These alternative features can result in different curve shapes and help improve model fit.
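The square-root feature is attractive for housing data because it keeps increasing but flattens out as size grows, which may match how prices behave. A minimal illustration:

```python
import numpy as np

# Engineered feature: sqrt(x) rises quickly for small x, then levels off,
# unlike x^2, which curves upward ever more steeply
x = np.array([1.0, 4.0, 9.0, 16.0])
x_sqrt = np.sqrt(x)
print(x_sqrt)  # [1. 2. 3. 4.]
```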

### Q: How can one determine which features to use in polynomial regression?

Determining which features to use in polynomial regression involves evaluating different models with various features and measuring their performance. The second course of the specialization covers methods for comparing models and selecting the most suitable features.
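One common way to compare such models (not necessarily the exact method taught in Course 2) is cross-validated scoring. A sketch with synthetic, roughly quadratic data:

```python
import numpy as np
from sklearn.linear_model import LinearRegression
from sklearn.model_selection import cross_val_score
from sklearn.pipeline import make_pipeline
from sklearn.preprocessing import PolynomialFeatures, StandardScaler

# Synthetic data following a quadratic trend plus noise
rng = np.random.default_rng(0)
x = rng.uniform(0, 10, size=(40, 1))
y = 0.5 * x[:, 0] ** 2 + rng.normal(scale=2.0, size=40)

scores = {}
for degree in (1, 2, 3):
    model = make_pipeline(
        PolynomialFeatures(degree=degree, include_bias=False),
        StandardScaler(),
        LinearRegression(),
    )
    scores[degree] = cross_val_score(model, x, y, cv=5).mean()
    print(f"degree {degree}: mean CV R^2 = {scores[degree]:.3f}")
```

Since the data is quadratic, the degree-2 model should score at least as well as the straight-line fit, which is the kind of comparison used to pick features.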

## Summary & Key Takeaways

• Polynomial regression allows fitting curves and non-linear functions to data by incorporating additional features raised to different powers.

• Feature scaling is crucial when creating polynomial features, as they can have significantly different value ranges compared to the original features.

• Different choices of features, such as squared, cubed, or square root of the original feature, can be used in polynomial regression to improve model performance.