Matrix multiplication is a fundamental operation in linear algebra. Consider two matrices, $A$ and $B$, that we want to multiply together.
\[
A =
\begin{bmatrix}
a_{11} & a_{12} & \dots & a_{1n} \\
a_{21} & a_{22} & \dots & a_{2n} \\
\vdots & \vdots & \ddots & \vdots \\
a_{m1} & a_{m2} & \dots & a_{mn}
\end{bmatrix}
\]
\[
B =
\begin{bmatrix}
b_{11} & b_{12} & \dots & b_{1p} \\
b_{21} & b_{22} & \dots & b_{2p} \\
\vdots & \vdots & \ddots & \vdots \\
b_{n1} & b_{n2} & \dots & b_{np}
\end{bmatrix}
\]
To multiply matrices $A$ and $B$, we need to ensure that the number of columns in $A$ matches the number of rows in $B$. In this case, $A$ is an $m \times n$ matrix and $B$ is an $n \times p$ matrix. The resulting matrix $C$ will be an $m \times p$ matrix.
The element at row $i$ and column $j$ in matrix $C$ is calculated as follows:
\[
c_{ij} = a_{i1} \cdot b_{1j} + a_{i2} \cdot b_{2j} + \dots + a_{in} \cdot b_{nj}
\]
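Equivalently, in summation notation:
\[
c_{ij} = \sum_{k=1}^{n} a_{ik} \, b_{kj}
\]
Each entry of $C$ is therefore the dot product of the $i$-th row of $A$ with the $j$-th column of $B$.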
This can be expressed in a compact form using matrix notation:
\[
C = A \cdot B
\]
where $C$ is the resulting matrix obtained by multiplying matrices $A$ and $B$.
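The definition of $c_{ij}$ translates directly into nested loops over rows, columns, and the shared index. The following is a minimal sketch in Python (the function name `matmul` and the representation of matrices as lists of lists are choices made here for illustration):

```python
def matmul(A, B):
    """Multiply an m x n matrix A by an n x p matrix B, given as lists of lists."""
    m, n, p = len(A), len(B), len(B[0])
    if len(A[0]) != n:
        raise ValueError("number of columns of A must equal number of rows of B")
    # c_ij = sum over k of a_ik * b_kj
    return [[sum(A[i][k] * B[k][j] for k in range(n)) for j in range(p)]
            for i in range(m)]

# Example: a 2 x 2 product
C = matmul([[1, 2], [3, 4]], [[5, 6], [7, 8]])
# C == [[19, 22], [43, 50]]
```

In practice, library routines such as NumPy's `@` operator use highly optimized implementations, but the arithmetic they perform is exactly this sum of products.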
Matrix multiplication is not commutative: in general, $A \cdot B \neq B \cdot A$, so the order of the factors matters. (Indeed, unless $m = p$, the product $B \cdot A$ is not even defined.)
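A small $2 \times 2$ example makes the non-commutativity concrete:
\[
A = \begin{bmatrix} 1 & 2 \\ 3 & 4 \end{bmatrix}, \quad
B = \begin{bmatrix} 0 & 1 \\ 1 & 0 \end{bmatrix}, \quad
A \cdot B = \begin{bmatrix} 2 & 1 \\ 4 & 3 \end{bmatrix} \neq
\begin{bmatrix} 3 & 4 \\ 1 & 2 \end{bmatrix} = B \cdot A.
\]
Here multiplying by $B$ on the right swaps the columns of $A$, while multiplying on the left swaps its rows.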
