Tutorial 08 (AY24/25 Sem 1)¶
# Required imports
import sympy as sym
from ma1522 import Matrix
Question 1¶
Apply the Gram-Schmidt process to convert
(a)¶
$\left\{ \begin{pmatrix} 1 \\ 1 \\ 1 \\ 1 \end{pmatrix}, \begin{pmatrix} 1 \\ -1 \\ 1 \\ 0 \end{pmatrix}, \begin{pmatrix} 1 \\ 1 \\ -1 \\ -1 \end{pmatrix}, \begin{pmatrix} 1 \\ 2 \\ 0 \\ 1 \end{pmatrix} \right\}$ into an orthonormal basis for $\mathbb{R}^4$.
U = Matrix.from_str("1 1 1 1; 1 -1 1 0; 1 1 -1 -1; 1 2 0 1").T
U
U.gram_schmidt(factor=True, verbosity=1)
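As a cross-check outside the `ma1522` helper, the same computation can be sketched in plain SymPy using its built-in `GramSchmidt` (the vectors below are copied from the question; `orthonormal=True` normalises each vector after orthogonalisation):

```python
from sympy import GramSchmidt, Matrix, eye, simplify

# The four vectors from part (a), as SymPy column matrices
vectors = [Matrix([1, 1, 1, 1]), Matrix([1, -1, 1, 0]),
           Matrix([1, 1, -1, -1]), Matrix([1, 2, 0, 1])]

# orthonormal=True normalises each vector after orthogonalisation
basis = GramSchmidt(vectors, orthonormal=True)

# Stack the basis vectors as columns; Q then has orthonormal columns
Q = Matrix.hstack(*basis)
print(simplify(Q.T * Q) == eye(4))  # True: the columns are orthonormal
```

Since the four vectors are linearly independent, the resulting orthonormal set has four vectors and is therefore a basis for $\mathbb{R}^4$.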
(b)¶
$\left\{ \begin{pmatrix} 1 \\ 2 \\ 2 \\ 1 \end{pmatrix}, \begin{pmatrix} 1 \\ 2 \\ 1 \\ 0 \end{pmatrix}, \begin{pmatrix} 1 \\ 0 \\ 1 \\ 0 \end{pmatrix}, \begin{pmatrix} 1 \\ 0 \\ 2 \\ 1 \end{pmatrix} \right\}$ into an orthonormal set. Is the set obtained an orthonormal basis? Why?
U = Matrix.from_str("1 2 2 1; 1 2 1 0; 1 0 1 0; 1 0 2 1").T
U
U.gram_schmidt(factor=True, verbosity=1) # UserWarning raised because a zero vector is found
UserWarning: Vectors are linearly dependent. Note that there is no QR factorisation
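The warning can be anticipated before running Gram-Schmidt by checking the rank. Here is a sketch in plain SymPy (the matrix columns are the four vectors from the question):

```python
from sympy import Matrix

# Columns are the four given vectors from part (b)
M = Matrix([[1, 1, 1, 1],
            [2, 2, 0, 0],
            [2, 1, 1, 2],
            [1, 0, 0, 1]])

# rank(M) < 4 means the vectors are linearly dependent, so Gram-Schmidt
# produces a zero vector and the resulting set cannot be a basis of R^4
print(M.rank())  # 3
```

Discarding the zero vector leaves an orthonormal set of only three vectors, so the set obtained is not a basis for $\mathbb{R}^4$.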
Question 2¶
Let $\mathbf{A} = \begin{pmatrix} 0 & 1 & 1 & 0 \\ 1 & -1 & 1 & -1 \\ 1 & 0 & 1 & 0 \\ 1 & 1 & 1 & 1 \end{pmatrix}$ and $\mathbf{b} = \begin{pmatrix} 6 \\ 3 \\ -1 \\ 1 \end{pmatrix}$.
(a)¶
Is the linear system $\mathbf{A}\mathbf{x} = \mathbf{b}$ inconsistent?
A = Matrix.from_str("0 1 1 0; 1 -1 1 -1; 1 0 1 0; 1 1 1 1")
b = Matrix.from_str("6; 3; -1; 1")
A, b
# Check if the system is inconsistent
b.is_subspace_of(A, verbosity=2)
Check if span(self) is subspace of span(other)
Before RREF: [other | self]
After RREF:
Span(self) is not a subspace of span(other).
False
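The same conclusion follows from a rank comparison in plain SymPy: a system $\mathbf{A}\mathbf{x} = \mathbf{b}$ is consistent if and only if $\mathbf{A}$ and the augmented matrix $[\mathbf{A} \mid \mathbf{b}]$ have the same rank.

```python
from sympy import Matrix

A = Matrix([[0, 1, 1, 0],
            [1, -1, 1, -1],
            [1, 0, 1, 0],
            [1, 1, 1, 1]])
b = Matrix([6, 3, -1, 1])

# Ax = b is consistent iff rank(A) == rank([A | b])
aug = A.row_join(b)
print(A.rank(), aug.rank())  # 3 4 -> ranks differ, so the system is inconsistent
```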
(b)¶
Find a least squares solution to the system. Is the solution unique?
sol = A.solve_least_squares(b, verbosity=2)
sol
Before RREF: [self.T @ self | self.T @ rhs]
After RREF:
# separate the solution into particular and general solution
sol.sep_part_gen()
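A hedged cross-check in plain SymPy: every least squares solution solves the normal equations $\mathbf{A}^T\mathbf{A}\mathbf{x} = \mathbf{A}^T\mathbf{b}$, and the free parameter below shows why the solution is not unique.

```python
from sympy import Matrix

A = Matrix([[0, 1, 1, 0],
            [1, -1, 1, -1],
            [1, 0, 1, 0],
            [1, 1, 1, 1]])
b = Matrix([6, 3, -1, 1])

# Every least squares solution solves the normal equations A^T A x = A^T b
sol, params = (A.T * A).gauss_jordan_solve(A.T * b)

# A free parameter remains because A^T A is singular (rank(A) = 3),
# so the least squares solution is not unique
print(len(params))  # 1
```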
(c)¶
Use your answer in (b) to compute the projection of $\mathbf{b}$ onto the column space of $\mathbf{A}$. Is the projection unique?
proj = A @ sol
proj
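Although the least squares solution is not unique, the projection is: any two least squares solutions differ by a vector in the null space of $\mathbf{A}$, which $\mathbf{A}$ maps to zero. A sketch in plain SymPy (reusing the normal equations approach above):

```python
from sympy import Matrix

A = Matrix([[0, 1, 1, 0],
            [1, -1, 1, -1],
            [1, 0, 1, 0],
            [1, 1, 1, 1]])
b = Matrix([6, 3, -1, 1])

sol, params = (A.T * A).gauss_jordan_solve(A.T * b)
t = params[0]

# Any two least squares solutions differ by a null space vector of A,
# so A @ sol gives the same projection for every value of the parameter
p0 = (A * sol).subs(t, 0)
p1 = (A * sol).subs(t, 7)
print(p0 == p1, p0.T)  # True Matrix([[6, 2, 1, 0]])
```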
Question 3 (Application)¶
A line $p(x) = a_1 x + a_0$ is said to be the least squares approximating line for a given set of data points $(x_1, y_1), (x_2, y_2), \ldots, (x_m, y_m)$ if the sum $$S = [y_1 - p(x_1)]^2 + [y_2 - p(x_2)]^2 + \cdots + [y_m - p(x_m)]^2$$ is minimized. Writing $$\mathbf{x} = \begin{pmatrix} x_1 \\ x_2 \\ \vdots \\ x_m \end{pmatrix}, \quad \mathbf{y} = \begin{pmatrix} y_1 \\ y_2 \\ \vdots \\ y_m \end{pmatrix}, \quad \text{and} \quad p(\mathbf{x}) = \begin{pmatrix} p(x_1) \\ p(x_2) \\ \vdots \\ p(x_m) \end{pmatrix} = \begin{pmatrix} a_1 x_1 + a_0 \\ a_1 x_2 + a_0 \\ \vdots \\ a_1 x_m + a_0 \end{pmatrix}$$
the problem is now rephrased as finding $a_0, a_1$ such that $$S = ||\mathbf{y} - p(\mathbf{x})||^2$$ is minimized. Observe that if we let $$\mathbf{N} = \begin{pmatrix} 1 & x_1 \\ 1 & x_2 \\ \vdots & \vdots \\ 1 & x_m \end{pmatrix} \quad \text{and} \quad \mathbf{a} = \begin{pmatrix} a_0 \\ a_1 \end{pmatrix}$$
then $\mathbf{N}\mathbf{a} = p(\mathbf{x})$, and so our aim is to find $\mathbf{a}$ that minimizes $||\mathbf{y} - \mathbf{N}\mathbf{a}||^2$.
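The setup above can be sketched on a toy dataset (made-up points, not from the question) in plain SymPy; when the points happen to be collinear, the normal equations recover the exact line.

```python
from sympy import Matrix

# Hypothetical toy data: (0, 1), (1, 3), (2, 5) lie exactly on y = 2x + 1
xs = [0, 1, 2]
y = Matrix([1, 3, 5])

# Design matrix N with rows (1, x_i); unknowns a = (a0, a1)
N = Matrix([[1, x] for x in xs])

# Normal equations: a = (N^T N)^{-1} N^T y
a = (N.T * N).inv() * N.T * y
print(a.T)  # Matrix([[1, 2]]) -> a0 = 1, a1 = 2
```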
It is known that the resistance of a cylindrically shaped conductor (a wire) at $20°C$ is given by $$R = \rho \frac{L}{A}$$ where $R$ is the resistance in Ohms ($\Omega$), $L$ is the length of the material in meters ($m$), $A$ is the cross-sectional area of the material in square meters ($m^2$), and $\rho$ is the resistivity of the material in Ohm meters ($\Omega m$).
A student wants to measure the resistivity of a certain material. Keeping the cross-sectional area constant at $0.002 m^2$, he connected the power source across the material at various lengths, measured the resistance, and obtained the following data.
| $L$ | $0.01$ | $0.012$ | $0.015$ | $0.02$ |
|---|---|---|---|---|
| $R$ | $2.75 \times 10^{-4}$ | $3.31 \times 10^{-4}$ | $3.92 \times 10^{-4}$ | $4.95 \times 10^{-4}$ |
It is known that the ohmmeter might not be calibrated. Taking that into account, the student wants to find a linear graph $R = \frac{\rho}{0.002} L + R_0$ from the data obtained to compute the resistivity of the material.
(a)¶
Relabeling, we let $R = y$, $\frac{\rho}{0.002} = a_1$ and $R_0 = a_0$. Is it possible to find a graph $y = a_1 x + a_0$ satisfying the points?
A = Matrix.from_str("1 0.01; 1 0.012; 1 0.015; 1 0.02", aug_pos=1)
# scientific notation supported using E
b = Matrix.from_str("2.75E-4; 3.31E-4; 3.92E-4; 4.95E-4")
A, b
# Observe that the numbers are represented as decimals.
# This is undesirable, because SymPy may then fall back to
# floating-point rather than exact arithmetic. Use the `simplify`
# method to convert the numbers to symbolic fractions.
A.simplify(tolerance=1e-10)
b.simplify(tolerance=1e-10)
A, b
b.is_subspace_of(A, verbosity=2)
Check if span(self) is subspace of span(other)
Before RREF: [other | self]
After RREF:
Span(self) is not a subspace of span(other).
False
(b)¶
Find the least squares approximating line for the data points and hence find the resistivity of the material. Would this material make a good wire?
sol = A.solve_least_squares(b, verbosity=2)
sol
self.T @ self is invertible. The least squares solution is unique.
# Unlike MATLAB, we can get the exact solution with rational numbers.
# However, as it is unwieldy and MA1522 does not require exact solutions,
# we can use `evalf(n)` to evaluate the solution to `n` significant figures.
sol.evalf(4)
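The fit can be cross-checked in plain SymPy by solving the normal equations directly (the data are copied from the table; the recovered slope equals $\rho / 0.002$, so $\rho = 0.002 \times$ slope):

```python
from sympy import Matrix, Rational

# Data from the table: lengths L (m) and resistances R (Ohm)
L_vals = ['0.01', '0.012', '0.015', '0.02']
R_vals = ['0.000275', '0.000331', '0.000392', '0.000495']

N = Matrix([[1, Rational(s)] for s in L_vals])   # rows (1, L_i)
y = Matrix([Rational(s) for s in R_vals])

# Normal equations give a = (R_0, slope); slope = rho / 0.002
a = (N.T * N).inv() * N.T * y
rho = Rational('0.002') * a[1]
print(a[1].evalf(4), rho.evalf(4))
```

The slope works out to roughly $0.0216$, giving $\rho \approx 4.3 \times 10^{-5}\ \Omega m$. This is several orders of magnitude above the resistivity of copper ($\approx 1.7 \times 10^{-8}\ \Omega m$), so the material would likely make a poor wire.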
Question 4 (Application)¶
Suppose the equation governing the relation between data pairs is not known. We may want to then find a polynomial $$p(x) = a_0 + a_1 x + a_2 x^2 + \cdots + a_n x^n$$ of degree $n$, $n \leq m - 1$, that best approximates the data pairs $(x_1, y_1), (x_2, y_2), \ldots, (x_m, y_m)$. A least squares approximating polynomial of degree $n$ is such that $$||\mathbf{y} - p(\mathbf{x})||^2$$ is minimized. If we write $$\mathbf{x} = \begin{pmatrix} x_1 \\ x_2 \\ \vdots \\ x_m \end{pmatrix}, \quad \mathbf{y} = \begin{pmatrix} y_1 \\ y_2 \\ \vdots \\ y_m \end{pmatrix}, \quad \mathbf{N} = \begin{pmatrix} 1 & x_1 & x_1^2 & \cdots & x_1^n \\ 1 & x_2 & x_2^2 & \cdots & x_2^n \\ \vdots & \vdots & \vdots & \ddots & \vdots \\ 1 & x_m & x_m^2 & \cdots & x_m^n \end{pmatrix} \quad \text{and} \quad \mathbf{a} = \begin{pmatrix} a_0 \\ a_1 \\ \vdots \\ a_n \end{pmatrix}$$
then $p(\mathbf{x}) = \mathbf{N}\mathbf{a}$, and the task is to find $\mathbf{a}$ such that $||\mathbf{y} - \mathbf{N}\mathbf{a}||^2$ is minimized. Observe that $\mathbf{N}$ is a submatrix of a Vandermonde matrix. If at least $n + 1$ of the $x$-values $x_1, x_2, \ldots, x_m$ are distinct, the columns of $\mathbf{N}$ are linearly independent, and thus $\mathbf{a}$ is uniquely determined by $$\mathbf{a} = (\mathbf{N}^T \mathbf{N})^{-1} \mathbf{N}^T \mathbf{y}$$
We shall now find a quartic polynomial $$p(x) = a_0 + a_1 x + a_2 x^2 + a_3 x^3 + a_4 x^4$$ that is a least squares approximating polynomial for the following data points
| $x$ | $4$ | $4.5$ | $5$ | $5.5$ | $6$ | $6.5$ | $7$ | $8$ | $8.5$ |
|---|---|---|---|---|---|---|---|---|---|
| $y$ | $0.8651$ | $0.4828$ | $2.590$ | $-4.389$ | $-7.858$ | $3.103$ | $7.456$ | $0.0965$ | $4.326$ |
x = Matrix.from_str("4 4.5 5 5.5 6 6.5 7 8 8.5").T
y = Matrix.from_str("0.8651 0.4828 2.590 -4.389 -7.858 3.103 7.456 0.0965 4.326").T
x, y
# Similar to the previous example, we can use `simplify` to
# convert the numbers to symbolic fractions.
x.simplify(tolerance=1e-10)
y.simplify(tolerance=1e-10)
x, y
# Create the Vandermonde matrix
N = Matrix.create_vander(num_rows=x.rows, num_cols=5)
N
# To substitute the values of x into the Vandermonde matrix
N = N.apply_vander(x)
N
sol = N.solve_least_squares(y, verbosity=2)
sol
self.T @ self is invertible. The least squares solution is unique.
sol.evalf(8) # Evaluate the solution to 8 significant figures
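As a sanity check in plain SymPy (data copied from the table): building the Vandermonde-style design matrix directly and solving the normal equations confirms that the coefficient vector is uniquely determined.

```python
from sympy import Matrix, Rational

xs = ['4', '4.5', '5', '5.5', '6', '6.5', '7', '8', '8.5']
ys = ['0.8651', '0.4828', '2.590', '-4.389', '-7.858',
      '3.103', '7.456', '0.0965', '4.326']

# Vandermonde-style design matrix with columns 1, x, x^2, x^3, x^4
N = Matrix([[Rational(s)**k for k in range(5)] for s in xs])
y = Matrix([Rational(s) for s in ys])

# The nine x-values are distinct, so N has full column rank
# and the normal equations determine a uniquely
a = (N.T * N).inv() * N.T * y
print(N.T * N * a == N.T * y)  # True: a solves the normal equations exactly
```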
Question 5¶
(a)¶
Find a QR factorisation of $\mathbf{A} = \begin{pmatrix} 1 & 1 & 0 \\ 1 & 1 & 0 \\ 1 & 1 & 1 \\ 0 & 1 & 1 \end{pmatrix}$.
A = Matrix.from_str("1 1 0; 1 1 0; 1 1 1; 0 1 1")
A
A.gram_schmidt(factor=True, verbosity=1)
QR = A.QRdecomposition(verbosity=1)
QR
Finding orthogonal basis via Gram-Schmidt process:
Q matrix:
R matrix: Q.T @ self
(b)¶
Use your answer in (a) to find the least square solution to $\mathbf{A}\mathbf{x} = \mathbf{b}$, where $\mathbf{b} = \begin{pmatrix} 1 \\ 1 \\ 0 \\ 0 \end{pmatrix}$.
b = Matrix.from_str("1; 1; 0; 0")
b
Q, R = QR.Q, QR.R
sol = R.inverse() @ Q.T @ b
sol
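The same computation can be sketched in plain SymPy, whose built-in `QRdecomposition` returns $\mathbf{Q}$ with orthonormal columns and upper triangular $\mathbf{R}$: when $\mathbf{A}$ has full column rank, the normal equations reduce to $\mathbf{R}\mathbf{x} = \mathbf{Q}^T\mathbf{b}$.

```python
from sympy import Matrix, simplify

A = Matrix([[1, 1, 0],
            [1, 1, 0],
            [1, 1, 1],
            [0, 1, 1]])
b = Matrix([1, 1, 0, 0])

# SymPy's QRdecomposition gives A = QR with orthonormal columns in Q
Q, R = A.QRdecomposition()

# For full-column-rank A, the normal equations reduce to R x = Q^T b
x = simplify(R.solve(Q.T * b))
print(x.T)  # Matrix([[0, 1, -1]])
```

Solving the triangular system this way avoids forming $\mathbf{A}^T\mathbf{A}$, which is the usual reason for preferring the QR route.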
# Or brute force
sol = A.solve_least_squares(b, verbosity=2)
sol
self.T @ self is invertible. The least squares solution is unique.