Least Squares Regression

We want to fit a + bt^2 to the data points (t, y): (-2, 1), (-1, 1), (0, 2), (1, 3), (2, -2)

Set up the design matrix A

In [1]:
import numpy as np

A = np.transpose(np.matrix([[1,1,1,1,1], [(-2)**2, (-1)**2, 0, 1, 2**2]]))
A
Out[1]:
matrix([[1, 4],
        [1, 1],
        [1, 0],
        [1, 1],
        [1, 4]])
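The same design matrix can also be built with plain arrays, which current NumPy recommends over the `np.matrix` class; a minimal sketch:

```python
import numpy as np

# A column of ones (for the intercept a) next to a column of t**2 (for b)
t = np.array([-2, -1, 0, 1, 2])
A = np.column_stack([np.ones_like(t), t**2])
print(A)
# [[1 4]
#  [1 1]
#  [1 0]
#  [1 1]
#  [1 4]]
```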

Set up the right-hand side b

In [2]:
b = np.transpose(np.matrix([[1,1,2,3,-2]]))
b
Out[2]:
matrix([[ 1],
        [ 1],
        [ 2],
        [ 3],
        [-2]])

Find the inverse of A^T A

In [3]:
# First, we find A^T A

C = np.transpose(A)*A

Cinv = np.linalg.inv(C)

Cinv
Out[3]:
matrix([[ 0.48571429, -0.14285714],
        [-0.14285714,  0.07142857]])
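For a 2x2 matrix, the inverse above can be checked by hand with the closed-form formula inv([[p, q], [r, s]]) = [[s, -q], [-r, p]] / (ps - qr); a quick sketch using the entries of A^T A computed above:

```python
import numpy as np

C = np.array([[5, 10], [10, 34]])            # A^T A from the cell above
det = C[0, 0] * C[1, 1] - C[0, 1] * C[1, 0]  # 5*34 - 10*10 = 70
Cinv = np.array([[ C[1, 1], -C[0, 1]],
                 [-C[1, 0],  C[0, 0]]]) / det
print(Cinv)  # matches np.linalg.inv(C)
```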

Solve the system

Now we solve the normal equations (A^T A) x = A^T b by multiplying both sides by the inverse of A^T A

In [4]:
x = Cinv * np.transpose(A) * b
x
Out[4]:
matrix([[ 2.42857143],
        [-0.71428571]])
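Explicitly inverting A^T A is fine at this size, but in general it is preferable to solve the normal equations with `np.linalg.solve`, or to minimize ||Ax - b|| directly with `np.linalg.lstsq`; a sketch verifying the result:

```python
import numpy as np

A = np.array([[1, 4], [1, 1], [1, 0], [1, 1], [1, 4]])
b = np.array([1, 1, 2, 3, -2])

# Solve (A^T A) x = A^T b without forming an explicit inverse
x_solve = np.linalg.solve(A.T @ A, A.T @ b)

# Or let lstsq minimize ||Ax - b|| directly
x_lstsq, *_ = np.linalg.lstsq(A, b, rcond=None)
print(x_solve)  # ≈ [ 2.42857143 -0.71428571]
```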

Conclusion

In [5]:
# a coefficient
x[0,:]
Out[5]:
matrix([[ 2.42857143]])
In [6]:
# b coefficient
x[1,:]
Out[6]:
matrix([[-0.71428571]])
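So the least-squares fit is y ≈ 2.4286 - 0.7143 t^2. As a sanity check, the fitted values and residuals at the five data points can be computed (a sketch using the coefficients found above):

```python
import numpy as np

a, b_coef = 2.42857143, -0.71428571  # coefficients from the solution x
t = np.array([-2, -1, 0, 1, 2])
y = np.array([1, 1, 2, 3, -2])

fitted = a + b_coef * t**2   # value of the fitted curve at each t
residuals = y - fitted       # leftover error the fit could not remove
print(fitted)
print(residuals)
```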