What are the uses of minimization?

Minimization is useful for finding stable equilibria, for reformulating systems of equations, and for solving least squares problems. We can frame this with a linear system Ax = b, with m equations in n unknowns. We seek to minimize the function p(x) = ||Ax − b||², where || || refers to a norm as seen in Inner Products.
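A minimal numerical sketch of evaluating p(x) with NumPy; the matrix A, vector b, and trial point x are made-up illustration values:

```python
import numpy as np

# Evaluate p(x) = ||Ax - b||^2 for an example system with
# m = 3 equations and n = 2 unknowns (values are made up).
A = np.array([[1.0, 2.0],
              [3.0, 4.0],
              [5.0, 6.0]])
b = np.array([1.0, 2.0, 3.0])
x = np.array([0.5, 0.25])

r = b - A @ x      # residual vector
p = r @ r          # squared norm ||Ax - b||^2
print(p)
```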

What is the minimum of p(x)?

The function p(x) has a minimum value of 0, which is attained exactly when x is a solution to Ax = b. If there is no solution x to the equation, then we find the closest possible approximation.
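To see this numerically, here is a small sketch with a made-up invertible system, where an exact solution exists and p(x) evaluates to 0 up to floating point rounding:

```python
import numpy as np

# When Ax = b has an exact solution, p(x) = ||Ax - b||^2 = 0.
A = np.array([[2.0, 1.0],
              [1.0, 3.0]])
b = np.array([3.0, 5.0])

x = np.linalg.solve(A, b)            # exact solution of Ax = b
p = np.linalg.norm(A @ x - b) ** 2   # ~0, up to rounding error
print(p)
```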

What is least squares solution?

If a linear system has no solution, we look for an approximate solution by minimizing the residual vector r = b − Ax. We want to minimize the squared residual norm ||r||² = ||b − Ax||². The value that minimizes the squared residual norm function is called the least squares solution, denoted x*.
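A sketch using NumPy's built-in solver np.linalg.lstsq on a made-up overdetermined system:

```python
import numpy as np

# An overdetermined system (3 equations, 2 unknowns) with no exact
# solution; lstsq returns the least squares solution x*.
A = np.array([[1.0, 1.0],
              [1.0, 2.0],
              [1.0, 3.0]])
b = np.array([1.0, 2.0, 2.0])

x_star, res, rank, sv = np.linalg.lstsq(A, b, rcond=None)
r = b - A @ x_star       # residual at the minimizer
print(x_star, r @ r)     # x* and the minimized ||r||^2
```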

What are the flexibilities in LSM?

Least squares solutions depend on the choice of inner product: different inner products on the vector space can yield different solutions. Given a vector b, we seek the vector v* in the subspace V that is closest to b. This is equivalent to an orthogonal projection of b onto the subspace V, which in turn is equivalent to solving the least squares problem, i.e., minimizing the residual.
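Under the standard Euclidean inner product, this connection can be sketched via the normal equations AᵀAx = Aᵀb (the values of A and b below are made up): the minimizer x* gives v* = Ax*, the orthogonal projection of b onto V = col(A).

```python
import numpy as np

# Normal equations A^T A x = A^T b give the least squares solution;
# v* = A x* is the orthogonal projection of b onto V = col(A).
A = np.array([[1.0, 0.0],
              [1.0, 1.0],
              [1.0, 2.0]])
b = np.array([6.0, 0.0, 0.0])

x_star = np.linalg.solve(A.T @ A, A.T @ b)
v_star = A @ x_star
print(v_star)
print(A.T @ (b - v_star))   # ~0: the residual is orthogonal to V
```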

What is linear regression?

Linear regression applies least squares to fit a line y ≈ c₀ + c₁x to data points (xᵢ, yᵢ), minimizing the error E = Σᵢ (yᵢ − (c₀ + c₁xᵢ))².
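A sketch of fitting such a line to made-up data points by least squares:

```python
import numpy as np

# Fit y ~ c0 + c1*x by minimizing E = sum_i (y_i - (c0 + c1*x_i))^2.
x = np.array([0.0, 1.0, 2.0, 3.0])
y = np.array([1.1, 1.9, 3.2, 3.8])

A = np.column_stack([np.ones_like(x), x])   # design matrix [1, x]
c, *_ = np.linalg.lstsq(A, y, rcond=None)
print(c)   # intercept c0 and slope c1
```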

What is gradient descent?

Gradient descent is an iterative optimization algorithm that minimizes a loss function f by repeatedly stepping in the direction of the negative gradient, updating x ← x − α∇f(x), where α is the step size (learning rate).
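A minimal sketch of gradient descent applied to p(x) = ||Ax − b||², whose gradient is 2Aᵀ(Ax − b); the step size alpha and iteration count are made-up choices:

```python
import numpy as np

# Minimize p(x) = ||Ax - b||^2 by gradient descent.
A = np.array([[1.0, 1.0],
              [1.0, 2.0],
              [1.0, 3.0]])
b = np.array([1.0, 2.0, 2.0])

x = np.zeros(2)
alpha = 0.05                         # step size (learning rate)
for _ in range(500):
    grad = 2 * A.T @ (A @ x - b)     # gradient of the squared residual
    x -= alpha * grad                # step opposite the gradient
print(x)   # converges toward the least squares solution
```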