In the context of square matrices, what is the defining property of an inverse matrix $B$ for a matrix $A \in \mathbb{R}^{n \times n}$?
$AB = I_n = BA$, where $I_n$ is the identity matrix.
What term describes a square matrix that does not possess an inverse?
Singular or noninvertible.
How is the inverse of a matrix product $(AB)^{-1}$ calculated in terms of the individual inverses?
$(AB)^{-1} = B^{-1}A^{-1}$.
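A quick numerical sanity check of this identity in NumPy (sizes and seed are arbitrary):

```python
import numpy as np

rng = np.random.default_rng(0)
A = rng.standard_normal((3, 3))
B = rng.standard_normal((3, 3))  # random matrices are invertible almost surely

lhs = np.linalg.inv(A @ B)
rhs = np.linalg.inv(B) @ np.linalg.inv(A)  # note the reversed order
print(np.allclose(lhs, rhs))  # True
```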
Given the inverse of a matrix $A$, what is the relationship between $(A^{-1})^{\top}$ and $(A^{\top})^{-1}$?
They are equal, denoted as $A^{-\top}$.
True or False: For any two invertible matrices $A$ and $B$, $(A+B)^{-1} = A^{-1} + B^{-1}$.
False.
If a square matrix $A$ is symmetric, what property is guaranteed for its inverse $A^{-1}$?
$A^{-1}$ is also symmetric.
In the case of a non-square matrix $A$ with linearly independent columns, what formula yields the unique least-squares solution to $Ax=b$?
$x = (A^{\top}A)^{-1}A^{\top}b$.
What is the name of the operator $A^{+} = (A^{\top}A)^{-1}A^{\top}$ used in solving overdetermined systems?
The Moore-Penrose pseudo-inverse.
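A minimal sketch in NumPy with made-up data; `np.linalg.pinv` computes the pseudo-inverse via the SVD and is numerically preferable to forming $A^{\top}A$ explicitly:

```python
import numpy as np

rng = np.random.default_rng(1)
A = rng.standard_normal((10, 3))  # 10 equations, 3 unknowns (full column rank)
b = rng.standard_normal(10)

# Normal-equations form of the pseudo-inverse
x_normal = np.linalg.inv(A.T @ A) @ A.T @ b
# Library pseudo-inverse (SVD-based, numerically more robust)
x_pinv = np.linalg.pinv(A) @ b

print(np.allclose(x_normal, x_pinv))  # True
```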
What are the four essential axioms a set $G$ and an operation $\otimes$ must satisfy to be considered a Group?
Closure, Associativity, Neutral Element, and Inverse Element.
What additional property distinguishes an Abelian group from a standard group?
Commutativity: $x \otimes y = y \otimes x$ for all $x, y \in G$.
Why is the set of natural numbers $(\mathbb{N}_0, +)$ not considered a group despite having a neutral element (0)?
Not every element has an additive inverse; e.g., there is no $x \in \mathbb{N}_0$ with $1 + x = 0$.
Why is the set of integers under multiplication $(\mathbb{Z}, \cdot)$ not a group?
Multiplicative inverses are missing for every element except $\pm 1$ (e.g., the inverse of 2 would be $0.5 \notin \mathbb{Z}$).
In linear algebra, what is the 'General Linear Group' $GL(n, \mathbb{R})$?
The set of regular (invertible) $n \times n$ matrices under multiplication.
A real-valued Vector Space $V$ consists of a set $V$ equipped with which two operations?
Addition (inner operation) and Scalar Multiplication (outer operation).
According to the vector space axioms, what kind of group must $(V, +)$ be?
An Abelian group.
What are the two distributive laws that must hold in a vector space?
$\lambda \cdot (x + y) = \lambda \cdot x + \lambda \cdot y$ and $(\lambda + \psi) \cdot x = \lambda \cdot x + \psi \cdot x$.
The set of complex numbers $\mathbb{C}$ is isomorphic to which real vector space?
$\mathbb{R}^2$.
What is the formal definition of a vector subspace $U \subseteq V$?
A subset that is itself a vector space under the restricted operations of $V$ (closed under addition and scalar multiplication).
Definition: Linear Independence
A set of vectors $x_1, \dots, x_k$ where the equation $\sum_{i=1}^{k} \lambda_i x_i = 0$ has only the trivial solution $\lambda_1 = \dots = \lambda_k = 0$.
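One practical test, sketched in NumPy with made-up vectors: stack them as columns and compare the matrix rank with the number of vectors:

```python
import numpy as np

x1 = np.array([1.0, 2.0, 3.0])
x2 = np.array([0.0, 1.0, 1.0])
x3 = x1 + 2 * x2  # deliberately a linear combination of x1 and x2

X = np.column_stack([x1, x2, x3])
independent = np.linalg.matrix_rank(X) == X.shape[1]
print(independent)  # False: a non-trivial combination gives the zero vector
```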
What defines a 'Basis' of a vector space $V$?
A linearly independent generating set of $V$.
How is the dimension of a finite-dimensional vector space $V$ defined?
The number of basis vectors of $V$; this is well defined because every basis of $V$ contains the same number of vectors.
What is the 'Rank' of a matrix $A \in \mathbb{R}^{m \times n}$?
The number of linearly independent columns, which equals the number of linearly independent rows (column rank equals row rank).
In data science, what does the rank of a data matrix $A$ signify?
The number of independent directions (intrinsic dimensionality) in the data.
What condition must a mapping $\Phi: V \to W$ satisfy to be considered 'linear' (a homomorphism)?
$\Phi(\lambda x + \psi y) = \lambda \Phi(x) + \psi \Phi(y)$ for all vectors $x, y$ and scalars $\lambda, \psi$.
Define an 'Injective' mapping.
A mapping where $\Phi(x) = \Phi(y)$ implies $x = y$ (one-to-one).
What is an 'Isomorphism' in the context of linear mappings?
A linear mapping that is both injective and surjective (bijective).
What is an 'Endomorphism'?
A linear mapping from a vector space to itself ($\Phi: V \to V$).
How is the transformation matrix $A_{\Phi}$ of a linear mapping $\Phi: V \to W$ constructed relative to bases $B$ and $C$?
The columns of $A_{\Phi}$ are the coordinate vectors of $\Phi(b_j)$ represented in basis $C$.
What is the formal definition of 'Similar' matrices $A, \tilde{A} \in \mathbb{R}^{n \times n}$?
There exists a regular matrix $S$ such that $\tilde{A} = S^{-1}AS$.
Distinguish between 'Equivalence' and 'Similarity' for matrices.
Similarity is the special case $V = W$ with a single basis-change matrix $S$ on both sides ($\tilde{A} = S^{-1}AS$); equivalence permits different regular matrices $S$ and $T$ for the domain and codomain bases ($\tilde{A} = T^{-1}AS$). Similar matrices are always equivalent, but not vice versa.
What is the 'Kernel' (or Null Space) of a linear mapping $\Phi$?
The set of all vectors in the domain that map to the zero vector in the codomain.
What is the 'Image' (or Range) of a linear mapping $\Phi$?
The set of all vectors in the codomain that can be expressed as $\Phi(v)$ for some $v$ in the domain.
How is the rank of a matrix $A$ related to the image of its corresponding linear mapping $\Phi$?
$rk(A) = dim(Im(\Phi))$.
What is an 'Affine Mapping' $\phi(x)$?
A composition of a linear mapping $\Phi(x)$ and a translation vector $a$, written as $\phi(x) = a + \Phi(x)$.
What property is preserved by an orthogonal matrix $A$ during a transformation?
It preserves lengths of vectors and angles between them (distances and dot products).
What is the defining characteristic of an orthogonal matrix $A$ regarding its inverse?
$A^{-1} = A^{\top}$.
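A sketch using a 2D rotation matrix, a standard example of an orthogonal matrix (the angle is arbitrary):

```python
import numpy as np

theta = 0.7  # arbitrary rotation angle
A = np.array([[np.cos(theta), -np.sin(theta)],
              [np.sin(theta),  np.cos(theta)]])

print(np.allclose(A.T @ A, np.eye(2)))  # A^T = A^{-1}
x = np.array([3.0, 4.0])
print(np.isclose(np.linalg.norm(A @ x), np.linalg.norm(x)))  # length preserved
```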
A square matrix $A$ is invertible if and only if its determinant $det(A)$ satisfies what condition?
$det(A) \ne 0$.
What is the 'Trace' of a square matrix $A$?
The sum of the diagonal elements of the matrix.
What is the cyclic permutation property of the trace for three matrices $A, K, L$?
$tr(AKL) = tr(KLA) = tr(LAK)$.
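A numerical check in NumPy; the shapes are chosen so that all three cyclic products are defined even though the individual factors are non-square:

```python
import numpy as np

rng = np.random.default_rng(2)
A = rng.standard_normal((2, 3))
K = rng.standard_normal((3, 4))
L = rng.standard_normal((4, 2))

t1 = np.trace(A @ K @ L)  # 2x2 product
t2 = np.trace(K @ L @ A)  # 3x3 product
t3 = np.trace(L @ A @ K)  # 4x4 product
print(np.isclose(t1, t2) and np.isclose(t2, t3))  # True
```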
The Cholesky decomposition $A = LL^{\top}$ is defined for which class of matrices?
Symmetric, positive definite matrices.
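A minimal NumPy sketch; the matrix is constructed to be symmetric positive definite so the factorization exists:

```python
import numpy as np

rng = np.random.default_rng(3)
M = rng.standard_normal((4, 4))
A = M @ M.T + 4 * np.eye(4)  # symmetric, positive definite by construction

L = np.linalg.cholesky(A)    # lower-triangular factor
print(np.allclose(L @ L.T, A))  # True
```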
What are the components of the Singular Value Decomposition (SVD) of a matrix $A$?
$A = U \Sigma V^{\top}$, where $U$ and $V$ are orthogonal and $\Sigma$ is a diagonal matrix of singular values.
In SVD, what do the columns of $U$ and $V$ represent?
$U$ contains the left-singular vectors and $V$ contains the right-singular vectors.
How are the nonzero singular values $\sigma_i$ of $A$ related to the eigenvalues of $A^{\top}A$?
The singular values are the square roots of the nonzero eigenvalues of $A^{\top}A$.
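A NumPy check of this relationship on an arbitrary matrix:

```python
import numpy as np

rng = np.random.default_rng(5)
A = rng.standard_normal((5, 3))

sigma = np.linalg.svd(A, compute_uv=False)   # singular values, descending
eigvals = np.linalg.eigvalsh(A.T @ A)[::-1]  # eigenvalues of A^T A, descending
print(np.allclose(sigma, np.sqrt(eigvals)))  # True
```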
What makes the SVD more flexible than the eigendecomposition?
SVD exists for all matrices (including non-square), whereas eigendecomposition requires a square, non-defective matrix.
According to the Eckart-Young theorem, how is the best rank-$k$ approximation of a matrix $A$ obtained?
By truncating the SVD to the top $k$ singular values (truncated SVD).
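A sketch of the truncated SVD in NumPy; the spectral-norm error of the rank-$k$ approximation equals the first discarded singular value $\sigma_{k+1}$:

```python
import numpy as np

rng = np.random.default_rng(6)
A = rng.standard_normal((6, 5))
k = 2

U, S, Vt = np.linalg.svd(A, full_matrices=False)
A_k = U[:, :k] @ np.diag(S[:k]) @ Vt[:k, :]  # best rank-k approximation

print(np.linalg.matrix_rank(A_k))                 # 2
print(np.linalg.norm(A - A_k, 2), S[k])           # spectral error = sigma_{k+1}
```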
What is the primary objective of Principal Component Analysis (PCA)?
To find a lower-dimensional representation that maximizes the variance of the projected data.
In PCA, the principal components correspond to which specific vectors of the data covariance matrix $S$?
The eigenvectors associated with the largest eigenvalues.
How is the variance lost during PCA data compression calculated?
It is the sum of the eigenvalues associated with the discarded principal components.
What is the relationship between PCA and the reconstruction error?
PCA finds the subspace that minimizes the average squared reconstruction error.
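A from-scratch PCA sketch in NumPy on synthetic data (dimensions, seed, and $k$ are illustrative):

```python
import numpy as np

rng = np.random.default_rng(7)
X = rng.standard_normal((100, 5))     # 100 samples, 5 features (synthetic)
Xc = X - X.mean(axis=0)               # center the data

S = np.cov(Xc, rowvar=False)          # data covariance matrix
eigvals, eigvecs = np.linalg.eigh(S)  # eigh returns ascending eigenvalues
order = np.argsort(eigvals)[::-1]
eigvals, eigvecs = eigvals[order], eigvecs[:, order]

k = 2
W = eigvecs[:, :k]                    # top-k principal components
Z = Xc @ W                            # compressed representation
lost_variance = eigvals[k:].sum()     # variance lost by discarding components
print(Z.shape, lost_variance)
```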
In the context of SVMs, what is a 'Separating Hyperplane'?
A hyperplane, i.e., an affine subspace of dimension $n-1$, that partitions the feature space into two half-spaces corresponding to the two classes.
What is the 'Kernel Trick' in machine learning?
Computing inner products in a high-dimensional feature space implicitly using a kernel function $k(x_i, x_j)$.
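A sketch of the idea with the polynomial kernel $k(x, y) = (x^{\top}y)^2$, whose explicit feature map in 2D is known, so both sides can be compared (function names are illustrative):

```python
import numpy as np

def phi(x):
    # Explicit feature map for k(x, y) = (x^T y)^2 in 2D
    return np.array([x[0]**2, np.sqrt(2) * x[0] * x[1], x[1]**2])

def k(x, y):
    # Kernel trick: the same inner product without building phi explicitly
    return (x @ y) ** 2

x, y = np.array([1.0, 2.0]), np.array([3.0, 0.5])
print(np.isclose(phi(x) @ phi(y), k(x, y)))  # True
```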
Define 'Linear Combination' in a vector space $V$.
A vector $v = \sum_{i=1}^{k} \lambda_i x_i$, where $x_i \in V$ and $\lambda_i$ are real scalars.
A set of vectors is 'linearly dependent' if what condition regarding the zero vector is met?
There exists a non-trivial linear combination (not all scalars zero) that equals the zero vector.
What is the 'Span' of a set of vectors $A$?
The set of all possible linear combinations of the vectors in $A$.
If $U$ is a subspace of $V$, what is the relationship between their dimensions?
$dim(U) \le dim(V)$; for finite-dimensional $V$, equality holds if and only if $U = V$.
In the SVD transformation sequence, what operation does the matrix $V^{\top}$ perform first?
A change of basis in the domain $\mathbb{R}^n$.
What is an 'Automorphism'?
A bijective linear mapping from a space to itself.
In a coordinate system, what are the 'coordinates' of a vector $x$ relative to an ordered basis $B = (b_1, \dots, b_n)$?
The unique scalars $\alpha_1, \dots, \alpha_n$ such that $x = \sum \alpha_i b_i$.
Under what condition is a linear mapping $\Phi$ injective?
If and only if its kernel consists solely of the zero vector ($ker(\Phi) = \{0_V\}$).
Formula: Determinant of the product of two square matrices $A$ and $B$.
$det(AB) = det(A)det(B)$.
What does the 'rank-nullity theorem' imply about the dimensions of the kernel and image of $\Phi: V \to W$?
$dim(V) = dim(ker(\Phi)) + dim(Im(\Phi))$.
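A numerical check of the theorem, assuming SciPy is available for the null-space computation:

```python
import numpy as np
from scipy.linalg import null_space

rng = np.random.default_rng(8)
A = rng.standard_normal((4, 6))        # maps R^6 -> R^4

dim_ker = null_space(A).shape[1]       # dimension of the kernel
dim_im = np.linalg.matrix_rank(A)      # dimension of the image
print(dim_ker + dim_im == A.shape[1])  # True: equals dim(V) = 6
```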
How does the Cholesky decomposition facilitate efficient determinant calculation?
$det(A) = det(L)^2$, which is the square of the product of the diagonal entries of the triangular matrix $L$.
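A NumPy sketch comparing the Cholesky-based determinant with `np.linalg.det`:

```python
import numpy as np

rng = np.random.default_rng(4)
M = rng.standard_normal((4, 4))
A = M @ M.T + 4 * np.eye(4)  # symmetric, positive definite

L = np.linalg.cholesky(A)
det_via_chol = np.prod(np.diag(L)) ** 2  # square of the product of diagonals
print(np.isclose(det_via_chol, np.linalg.det(A)))  # True
```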
Why is array-based 'element-wise' multiplication rarely used in standard linear algebra compared to matrix multiplication?
It is only defined for arrays of identical shape and does not implement the composition of linear mappings; when vectors are treated as $n \times 1$ matrices, an element-wise product corresponds to a dimension-mismatched matrix product.
What is the 'Spectral Theorem' regarding symmetric matrices?
A symmetric matrix has only real eigenvalues and can be diagonalized by an orthonormal basis of eigenvectors, i.e., $A = PDP^{\top}$ with $P$ orthogonal and $D$ diagonal.
What is the 'null space' of a matrix $A$ in terms of linear equations?
The set of all solutions to the homogeneous system $Ax = 0$.
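A small example, again assuming SciPy's `null_space` is available; the matrix is chosen to have rank 1 so the null space is 2-dimensional:

```python
import numpy as np
from scipy.linalg import null_space

A = np.array([[1.0, 2.0, 3.0],
              [2.0, 4.0, 6.0]])  # rank 1

N = null_space(A)              # orthonormal basis for {x : Ax = 0}
print(N.shape[1])              # 2
print(np.allclose(A @ N, 0))   # True
```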