#30 Machine Learning Specialization [Course 1, Week 2, Lesson 2]  Summary and Q&A
TL;DR
Learn how to fit curves and nonlinear functions to your data using polynomial regression and feature engineering.
Key Insights
- Polynomial regression allows fitting curves and nonlinear functions to data, expanding modeling capabilities beyond simple linear relationships.
- Feature scaling becomes increasingly important when using high-powered polynomial features, as their value ranges can differ significantly from those of the original features.
- There is a wide range of choices for additional features in polynomial regression, such as squared, cubed, or square-root functions of the original feature.
- Polynomial regression and feature engineering can be implemented in code using libraries like scikit-learn, a widely used machine learning toolkit.
- Understanding how to implement linear regression manually is crucial for a solid understanding of the algorithm, even when using libraries like scikit-learn.
- The optional labs accompanying the video provide practical examples and code for implementing polynomial regression and using scikit-learn for linear regression.
- In practice, machine learning practitioners often rely on libraries like scikit-learn for their efficiency and convenience in training models.
Transcript
So far we've just been fitting straight lines to our data. Let's take the ideas of multiple linear regression and feature engineering to come up with a new algorithm called polynomial regression, which lets you fit curves, nonlinear functions, to your data. Let's say you have a housing data set that looks like this, where feature x is the size in square...
Questions & Answers
Q: What is polynomial regression?
Polynomial regression is an algorithm that allows fitting curves and nonlinear functions to data by adding features raised to different powers, such as squared or cubed features.
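A minimal sketch of this idea, using made-up housing numbers (sizes and prices here are hypothetical, not from the course): engineering the extra features x² and x³ lets an ordinary linear-regression fit trace a cubic curve in x.

```python
import numpy as np
from sklearn.linear_model import LinearRegression

# Hypothetical housing data: x = size in square feet, y = price in $1000s.
x = np.array([650., 785., 1200., 1480., 1940., 2300.])
y = np.array([178., 205., 290., 316., 355., 400.])

# Feature engineering: turn one feature into three -- x, x^2, x^3 --
# so a model that is linear in these features fits a cubic curve in x.
X_poly = np.c_[x, x**2, x**3]

model = LinearRegression().fit(X_poly, y)

# Predict the price of a 1500 sq ft house using the same engineered features.
x_new = 1500.0
prediction = model.predict(np.array([[x_new, x_new**2, x_new**3]]))
```

The model is still "linear regression" in the sense that it is linear in its parameters; the nonlinearity lives entirely in the engineered features.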
Q: Why is feature scaling important in polynomial regression?
Feature scaling is critical in polynomial regression because the added polynomial features can have drastically different ranges of values compared to the original features. Feature scaling ensures that all features are on comparable scales for efficient gradient descent optimization.
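To see how wide those ranges get, here is a short illustration with assumed square-footage values: x is on the order of 10³, x² of 10⁶, and x³ of 10⁹, and z-score normalization brings all three columns back to comparable scales.

```python
import numpy as np
from sklearn.preprocessing import StandardScaler

# Assumed sizes in square feet; raising them to powers blows up the ranges:
# x is ~1e3, x^2 is ~1e6, x^3 is ~1e9.
x = np.array([650., 785., 1200., 1480., 1940., 2300.])
X_poly = np.c_[x, x**2, x**3]

# Z-score normalization rescales every column to mean 0 and std 1,
# so gradient descent is not dominated by the largest-magnitude feature.
X_scaled = StandardScaler().fit_transform(X_poly)
```

After scaling, each engineered feature contributes on an even footing to the gradient updates.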
Q: What are some alternative choices of features in polynomial regression?
Besides squared and cubed features, alternative choices of features in polynomial regression can include square root functions of the original feature. These alternative features can result in different curve shapes and help improve model fit.
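As a sketch of the square-root alternative (again with hypothetical data): a √x feature rises steeply for small sizes and then flattens out, which, unlike a squared term, never curves back down as size grows.

```python
import numpy as np
from sklearn.linear_model import LinearRegression

# Hypothetical housing data: x = size in square feet, y = price in $1000s.
x = np.array([650., 785., 1200., 1480., 1940., 2300.])
y = np.array([178., 205., 290., 316., 355., 400.])

# A square-root feature: steep at first, flattening as x grows,
# and (unlike x^2) monotonically increasing over the whole range.
X_sqrt = np.c_[x, np.sqrt(x)]

model = LinearRegression().fit(X_sqrt, y)
r2 = model.score(X_sqrt, y)  # coefficient of determination on the training data
```

Which feature set works best depends on the shape of the data; comparing fits like this is exactly the model-selection question the next answer points to.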
Q: How can one determine which features to use in polynomial regression?
Determining which features to use in polynomial regression involves evaluating different models with various features and measuring their performance. The second course of the specialization covers systematic methods for comparing models and selecting the most suitable features.
Summary & Key Takeaways
- Polynomial regression allows fitting curves and nonlinear functions to data by incorporating additional features raised to different powers.
- Feature scaling is crucial when creating polynomial features, as they can have significantly different value ranges compared to the original features.
- Different choices of features, such as squared, cubed, or square root of the original feature, can be used in polynomial regression to improve model performance.
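The pieces above can be combined the way the video's optional labs suggest, using scikit-learn end to end (the data here is hypothetical): generate the polynomial features, scale them, and fit linear regression in one pipeline.

```python
import numpy as np
from sklearn.pipeline import make_pipeline
from sklearn.preprocessing import PolynomialFeatures, StandardScaler
from sklearn.linear_model import LinearRegression

# Hypothetical housing data: size (sq ft) -> price ($1000s).
X = np.array([[650.], [785.], [1200.], [1480.], [1940.], [2300.]])
y = np.array([178., 205., 290., 316., 355., 400.])

# Pipeline: engineer x, x^2 features, scale them, then fit linear regression.
model = make_pipeline(
    PolynomialFeatures(degree=2, include_bias=False),
    StandardScaler(),
    LinearRegression(),
)
model.fit(X, y)

pred = model.predict([[1500.]])  # price estimate for a 1500 sq ft house
```

The pipeline applies the same feature engineering and scaling to new inputs automatically, which avoids the common bug of scaling training and prediction data inconsistently.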