A symmetric matrix is a square matrix that is equal to its transpose. In other words, if we reflect the elements of the matrix along its main diagonal (from the top-left to the bottom-right), the resulting matrix will be the same as the original matrix.
Let’s consider a symmetric matrix \(\textbf{A}\) of size \(n \times n\):
\[
\textbf{A} =
\begin{bmatrix}
a_{11} & a_{12} & \ldots & a_{1n} \\
a_{21} & a_{22} & \ldots & a_{2n} \\
\vdots & \vdots & \ddots & \vdots \\
a_{n1} & a_{n2} & \ldots & a_{nn}
\end{bmatrix}
\]
For a matrix to be symmetric, it must satisfy the condition \(a_{ij} = a_{ji}\) for all indices \(i\) and \(j\). In other words, the element in row \(i\), column \(j\) is equal to the element in row \(j\), column \(i\).
To express the symmetry condition mathematically, we can write:
\[
\textbf{A} = \textbf{A}^T
\]
where \(\textbf{A}^T\) denotes the transpose of matrix \(\textbf{A}\).
Example:
Let’s consider an example of a symmetric matrix:
\[
\textbf{A} =
\begin{bmatrix}
2 & 4 & 6 \\
4 & 5 & -1 \\
6 & -1 & 3
\end{bmatrix}
\]
In this matrix, the element in row 2, column 1 (\(a_{21} = 4\)) equals the element in row 1, column 2 (\(a_{12} = 4\)), and likewise \(a_{31} = a_{13} = 6\) and \(a_{32} = a_{23} = -1\). The symmetry condition holds for every pair of elements.
To verify the symmetry, we can also calculate the transpose of \(\textbf{A}\):
\[
\textbf{A}^T =
\begin{bmatrix}
2 & 4 & 6 \\
4 & 5 & -1 \\
6 & -1 & 3
\end{bmatrix}
\]
As we can see, the transpose of \(\textbf{A}\) is identical to the original matrix \(\textbf{A}\), confirming that \(\textbf{A}\) is symmetric.
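The transpose check above is easy to carry out programmatically. Here is a minimal sketch using NumPy (assumed to be available; the variable name `A` is just an illustration) that tests whether the example matrix equals its transpose:

```python
import numpy as np

# The example matrix from the text
A = np.array([[2, 4, 6],
              [4, 5, -1],
              [6, -1, 3]])

# A matrix is symmetric exactly when it equals its transpose
print(np.array_equal(A, A.T))  # True
```

Comparing `A` with `A.T` element by element is the direct translation of the condition \(a_{ij} = a_{ji}\).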
Properties:
Symmetric matrices have several important properties.
- The eigenvalues of a real symmetric matrix are always real.
- Eigenvectors corresponding to distinct eigenvalues are orthogonal to each other.
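Both properties can be observed numerically. The sketch below (again assuming NumPy) uses `np.linalg.eigh`, which is designed for symmetric matrices and returns real eigenvalues together with an orthonormal set of eigenvectors:

```python
import numpy as np

A = np.array([[2, 4, 6],
              [4, 5, -1],
              [6, -1, 3]], dtype=float)

# eigh is specialized for symmetric (Hermitian) matrices
eigenvalues, eigenvectors = np.linalg.eigh(A)

# Property 1: the eigenvalues are real numbers
print(np.isrealobj(eigenvalues))  # True

# Property 2: the eigenvectors are mutually orthogonal
# (the columns of `eigenvectors` satisfy V^T V = I)
print(np.allclose(eigenvectors.T @ eigenvectors, np.eye(3)))  # True
```

Since the eigenvectors form an orthonormal basis, a symmetric matrix can always be written as \(\textbf{A} = \textbf{V} \boldsymbol{\Lambda} \textbf{V}^T\), where \(\boldsymbol{\Lambda}\) is the diagonal matrix of eigenvalues.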
