# Stanford ENGR108: Introduction to Applied Linear Algebra | 2020 | Lecture 35-VMLS LS data fitting | Summary and Q&A

February 26, 2021
by Stanford Online

## TL;DR

This content explains the concept of least squares data fitting, which involves creating a model that approximates the relationship between an independent variable and an outcome variable.

## Questions & Answers

### Q: What is the purpose of least squares data fitting?

The purpose is to create a model that approximates the relationship between an independent variable and an outcome variable, allowing for predictions to be made based on the independent variable.

### Q: How is the prediction model constructed?

The prediction model is constructed using basis functions, which are combined through a linear combination with model parameters.
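The construction above can be sketched in NumPy. This is a minimal illustration with hypothetical data, using the two basis functions f_1(x) = 1 and f_2(x) = x (a straight-line model); the columns of the matrix `A` are the basis functions evaluated at the data points, and the model parameters are the least squares solution.

```python
import numpy as np

# Hypothetical data: x is the independent variable, y the outcome variable.
x = np.array([0.0, 1.0, 2.0, 3.0, 4.0])
y = np.array([1.0, 2.9, 5.1, 6.8, 9.2])

# Basis functions f_1(x) = 1 and f_2(x) = x.
# Each column of A is one basis function evaluated at all data points.
A = np.column_stack([np.ones_like(x), x])

# Model parameters theta: least squares solution of A theta ~ y.
theta, *_ = np.linalg.lstsq(A, y, rcond=None)

# Prediction: a linear combination of the basis functions.
y_hat = A @ theta
```

Any other basis (polynomials, piecewise-linear functions, etc.) works the same way: only the columns of `A` change, and the fit remains a linear least squares problem in the parameters.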

### Q: What does the least squares method minimize?

The least squares method minimizes the sum of the squared residuals. Equivalently, it minimizes the RMS prediction error, which is the square root of the mean of the squared residuals (the sum of squared residuals divided by the number of data points, then square-rooted).
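As a small numeric check (with hypothetical residual values), the RMS prediction error is the square root of the mean of the squared residuals:

```python
import numpy as np

# Hypothetical residuals r_i = y_i - y_hat_i for N = 4 data points.
r = np.array([0.5, -0.5, 0.5, -0.5])

# Mean squared error: sum of squared residuals divided by N.
mse = np.sum(r**2) / len(r)

# RMS error: square root of the mean squared error.
rms = np.sqrt(mse)
```

Since the square root is monotone, the parameters that minimize the sum of squared residuals also minimize the RMS error.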

### Q: What is the simplest model in least squares data fitting?

The simplest model is a constant model, where the prediction is a single value that does not depend on the independent variable; the least squares choice of that constant is the mean of the outcome values.
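The constant model fits into the same least squares framework with a single all-ones basis function. A minimal sketch with hypothetical outcome data, confirming that the optimal constant is the mean:

```python
import numpy as np

# Hypothetical outcomes; the constant model ignores the independent variable.
y = np.array([2.0, 4.0, 6.0, 8.0])

# A single basis function f_1(x) = 1: A is a column of ones.
A = np.ones((len(y), 1))

# The least squares solution is the mean of y.
theta, *_ = np.linalg.lstsq(A, y, rcond=None)
```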

## Summary & Key Takeaways

• Least squares data fitting is the process of creating a model that predicts the outcome variable based on an independent variable.

• The model is built using basis functions, and the parameters of the model are chosen to minimize the root mean square (RMS) prediction error.

• The simplest model is a constant model, where the prediction is the mean of the outcome variable.