ButterAddict

Matrix & Tensor

Word count: 1.6k · Reading time: 9 min
2019/03/30

Matrices

Vector space

Definition

  • A set of elements on which vector addition and scalar multiplication are defined
  • Closed under these operations
  • Includes the identity \(\mathbf{0}\) for vector addition

Note:

  • Although both are 2D, \(\mathbb{R}^2 \neq \mathbb{C}\): the extended complex plane has a single point at \(\infty\), whereas \(\mathbb{R}^2\) does not.

Span

The set of all vectors that can be written as linear combinations of the vectors in a set \(\mathbb{S}\).

Linearly independent

None of the vectors is a linear combination of the others, i.e. \[ x_1\mathbf{e}_1+x_2\mathbf{e}_2+x_3\mathbf{e}_3+\dots=0 \iff x_1=x_2=x_3=\dots=0 \]

Basis

A set that spans \(\mathbb{V}\) and is linearly independent.

  • The number of elements is the dimension of \(\mathbb{V}\).
  • Any vector can be expressed as a linear combination of the basis set in a unique way.
  • Can be infinite-dimensional - e.g. the Fourier basis

Change of bases

Note: the primed/unprimed convention here is the opposite of the lecture notes.

  • Idea: the same vector has different components in different bases.

\[ \mathbf{e}_j'=\mathbf{e}_i\mathbf{R}_{ij} \]

where \(\mathbf{R}_{ij}\) is the \(i\)th component of \(\mathbf{e}_j'\) in the unprimed basis, i.e. the columns of \(\mathbf{R}\) are the primed basis vectors: \[ \mathbf{R}= \begin{pmatrix} \uparrow & \uparrow & \uparrow\\ \mathbf{e_1'} & \mathbf{e_2'} &\mathbf{e_3'}\\ \downarrow &\downarrow &\downarrow \end{pmatrix} \] Check by reversing the argument: applying \(\mathbf{R}\) to the components of a primed basis vector picks out the corresponding column, i.e. that vector expressed in the unprimed basis \[ \begin{pmatrix} \uparrow & \uparrow & \uparrow\\ \mathbf{e_1'} & \mathbf{e_2'} &\mathbf{e_3'}\\ \downarrow &\downarrow &\downarrow \end{pmatrix} \begin{pmatrix} \uparrow \\ \mathbf{e}_j'\,\mathrm{in}\,\mathbf{e}'\\ \downarrow \end{pmatrix} =\begin{pmatrix} \uparrow \\ \mathbf{e}_j'\,\mathrm{in}\,\mathbf{e}\\ \downarrow \end{pmatrix} \] Now, \[ \mathbf{x}=x_i\mathbf{e}_i=x_j'\mathbf{e}_j'=x_j'\mathbf{e}_i\mathbf{R}_{ij}\\ \implies x_i=\mathbf{R}_{ij}x_j' \]
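As a sanity check, a minimal numpy sketch of this bookkeeping (the primed basis here is an arbitrary rotation, purely for illustration):

```python
import numpy as np

# Primed basis vectors expressed in the unprimed (standard) basis:
# an arbitrary example, a rotation by 30 degrees about the z-axis.
th = np.pi / 6
e1p = np.array([np.cos(th), np.sin(th), 0.0])
e2p = np.array([-np.sin(th), np.cos(th), 0.0])
e3p = np.array([0.0, 0.0, 1.0])

R = np.column_stack([e1p, e2p, e3p])  # columns of R are the primed basis

xp = np.array([1.0, 2.0, 3.0])  # components x'_j in the primed basis
x = R @ xp                      # x_i = R_ij x'_j, components in the unprimed basis

# Same vector either way: sum_j x'_j e'_j equals sum_i x_i e_i
assert np.allclose(x, xp[0] * e1p + xp[1] * e2p + xp[2] * e3p)
```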

Inverse

Define the transformations \[ \mathbf{e}_j'=\mathbf{e}_i\mathbf{R}_{ij} \\ \mathbf{e}_j=\mathbf{e}_i'\mathbf{S}_{ij} \] Substituting one into the other, \[ \mathbf{e}_j=\mathbf{e}_k\mathbf{R}_{ki}\mathbf{S}_{ij} \] True iff \(\mathbf{R}_{ki}\mathbf{S}_{ij}=\delta_{kj}\); similarly, \(\mathbf{S}_{ki}\mathbf{R}_{ij}=\delta_{kj}\). Hence the inverse transform is represented by the inverse matrix, \(\mathbf{S}=\mathbf{R}^{-1}\).

Transforming matrices

\[\begin{align} \mathbf{A}\mathbf{x}=\mathbf{e}_i\mathbf{A}_{ij}x_j&=\mathbf{e}_i'\mathbf{A'}_{ij}x_j'\\ \mathbf{e}_i\mathbf{A}_{ij}x_j&=\mathbf{e}_k\mathbf{R}_{ki}\mathbf{A'}_{ij}x_j'\\ \mathbf{A}\mathbf{x}&=\mathbf{R}\mathbf{A'}\mathbf{R}^{-1}\mathbf{x}\\ \mathbf{A}&=\mathbf{R}\mathbf{A'}\mathbf{R}^{-1} \end{align}\] using \(x_j'=\mathbf{R}^{-1}_{jk}x_k\) in the last two lines.
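And a quick numerical check that \(\mathbf{A'}=\mathbf{R}^{-1}\mathbf{A}\mathbf{R}\) represents the same operator (random matrices, illustrative only):

```python
import numpy as np

rng = np.random.default_rng(0)
A = rng.standard_normal((3, 3))      # operator in the unprimed basis
R = rng.standard_normal((3, 3))      # any invertible change of basis

Ap = np.linalg.inv(R) @ A @ R        # A' = R^{-1} A R

x = rng.standard_normal(3)           # unprimed components
xp = np.linalg.inv(R) @ x            # x' = R^{-1} x

# A x (unprimed) must equal R (A' x') mapped back to the unprimed basis
assert np.allclose(A @ x, R @ (Ap @ xp))
```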

Linear Operators

\[ \mathbf{A}(\alpha\mathbf{x}+\mathbf{y})=\alpha\mathbf{A}\mathbf{x}+\mathbf{A}\mathbf{y} \]

This definition makes no reference to any basis.

Inner Product

\[ \langle\mathbf{x}|\mathbf{y}\rangle=x_i^*y_i \]

  • Linear in the 2nd argument
  • Anti-linear in the 1st argument
  • Hermitian: \(\langle\mathbf{x}|\mathbf{y}\rangle^*=\langle\mathbf{y}|\mathbf{x}\rangle\)
  • Positive-definite: \(\langle\mathbf{x}|\mathbf{x}\rangle\geq0\), with equality iff \(\mathbf{x}=\mathbf{0}\)

Inner Product and bases

\[ \langle\mathbf{x}|\mathbf{y}\rangle=\langle x_i\mathbf{e}_i|y_j\mathbf{e}_j\rangle=x_i^*y_j\langle\mathbf{e}_i|\mathbf{e}_j\rangle=x_i^*y_j\mathbf{G}_{ij} \]

  • Metric coefficients: \(\mathbf{G}_{ij}=\langle\mathbf{e}_i|\mathbf{e}_j\rangle\); the matrix \(\mathbf{G}\) is Hermitian.

Cauchy-Schwarz

\[ |\langle\mathbf{x}|\mathbf{y}\rangle|\leq\|\mathbf{x}\|\,\|\mathbf{y}\| \]

Proof by considering \(\langle\mathbf{x}-\alpha\mathbf{y}|\mathbf{x}-\alpha\mathbf{y}\rangle\geq0\): choose the phase of \(\alpha\) so that \(\alpha\langle\mathbf{x}|\mathbf{y}\rangle\) is real and positive, and the modulus \(|\alpha|=\|\mathbf{x}\|/\|\mathbf{y}\|\).
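Filling in the steps of that sketch: \[ 0\leq\langle\mathbf{x}-\alpha\mathbf{y}|\mathbf{x}-\alpha\mathbf{y}\rangle =\|\mathbf{x}\|^2-\alpha\langle\mathbf{x}|\mathbf{y}\rangle-\alpha^*\langle\mathbf{y}|\mathbf{x}\rangle+|\alpha|^2\|\mathbf{y}\|^2 \] With the phase chosen as above, both cross terms equal \(-|\alpha|\,|\langle\mathbf{x}|\mathbf{y}\rangle|\), so \[ 0\leq\|\mathbf{x}\|^2-2|\alpha|\,|\langle\mathbf{x}|\mathbf{y}\rangle|+|\alpha|^2\|\mathbf{y}\|^2 \] and setting \(|\alpha|=\|\mathbf{x}\|/\|\mathbf{y}\|\) gives \(2\|\mathbf{x}\|^2\geq2\frac{\|\mathbf{x}\|}{\|\mathbf{y}\|}|\langle\mathbf{x}|\mathbf{y}\rangle|\), i.e. \(|\langle\mathbf{x}|\mathbf{y}\rangle|\leq\|\mathbf{x}\|\,\|\mathbf{y}\|\).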

Hermitian Conjugate

\[ \mathbf{A}^\dagger=(\mathbf{A}^T)^*\\ (\mathbf{AB})^\dagger=\mathbf{B}^\dagger\mathbf{A}^\dagger \]

The adjoint \(\mathbf{A}^\dagger\) is defined by \(\langle\mathbf{A}^\dagger\mathbf{x}|\mathbf{y}\rangle=\langle\mathbf{x}|\mathbf{A}\mathbf{y}\rangle\); an operator is self-adjoint if \(\mathbf{A}^\dagger=\mathbf{A}\).
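A minimal numpy check of both identities (random complex matrices, illustrative; note np.vdot conjugates its first argument, matching the inner product convention above):

```python
import numpy as np

rng = np.random.default_rng(1)
A = rng.standard_normal((3, 3)) + 1j * rng.standard_normal((3, 3))
B = rng.standard_normal((3, 3)) + 1j * rng.standard_normal((3, 3))

dag = lambda M: M.conj().T  # Hermitian conjugate: (M^T)^*

# (AB)^dagger = B^dagger A^dagger
assert np.allclose(dag(A @ B), dag(B) @ dag(A))

# Defining property of the adjoint: <A^dagger x | y> = <x | A y>
x = rng.standard_normal(3) + 1j * rng.standard_normal(3)
y = rng.standard_normal(3) + 1j * rng.standard_normal(3)
assert np.isclose(np.vdot(dag(A) @ x, y), np.vdot(x, A @ y))
```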

Special square matrices

| Type | Relation |
| --- | --- |
| Real symmetric | \(A_{ij}=A_{ji}\) |
| Real antisymmetric | \(A_{ij}=-A_{ji}\) |
| Orthogonal | \(\mathbf{A}^\mathrm{T}=\mathbf{A}^{-1}\) |
| Hermitian | \(\mathbf{A}^\dagger=\mathbf{A}\) |
| Anti-Hermitian | \(\mathbf{A}^\dagger=-\mathbf{A}\) |
| Unitary | \(\mathbf{A}^\dagger=\mathbf{A}^{-1}\) |
| Normal | \(\mathbf{A}^\dagger\mathbf{A}=\mathbf{A}\mathbf{A}^\dagger\) |

Special results

  1. If \(\mathbf{A}\) is Hermitian then \(i\mathbf{A}\) is anti-Hermitian.
  2. If \(\mathbf{A}\) is Hermitian then \(\exp(i\mathbf{A})\) is unitary.

cf. real numbers: Hermitian is the matrix analogue of real, anti-Hermitian of imaginary, and unitary of a unit-modulus phase \(e^{i\theta}\).
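A quick numerical check of result 2, assuming scipy is available (expm is the matrix exponential; the Hermitian matrix is built from a random example):

```python
import numpy as np
from scipy.linalg import expm

rng = np.random.default_rng(2)
M = rng.standard_normal((3, 3)) + 1j * rng.standard_normal((3, 3))
H = (M + M.conj().T) / 2          # Hermitian by construction

U = expm(1j * H)                  # exp(iA) for Hermitian A

# Unitary: U^dagger U = I
assert np.allclose(U.conj().T @ U, np.eye(3))
```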

Eigenvectors

Two things to remember: \[ \mathbf{Ax}=\lambda\mathbf{x}\\ \mathbf{AS}=\mathbf{S\Lambda} \] where the columns of \(\mathbf{S}\) are the eigenvectors and \(\mathbf{\Lambda}\) is the diagonal matrix of eigenvalues.

  1. Always find the eigenvalues first, by solving the characteristic equation \(\det{(\mathbf{A}-\lambda\mathbf{I})}=0\).
  2. Roots
    1. \(n\) distinct roots - \(n\) linearly independent eigenvectors.
    2. An \(m\)-fold repeated root - between 1 and \(m\) linearly independent eigenvectors correspond to it, and any linear combination of those is also an eigenvector.

Derive the eigenvalue and eigenvector properties of the special matrices above (e.g. Hermitian matrices have real eigenvalues, and eigenvectors with distinct eigenvalues are orthogonal).

Diagonalisation

Diagonalisation is also a similarity transformation, into the eigenvector basis.

  • Only diagonalisable if the matrix has \(n\) linearly independent eigenvectors; then \(\mathbf{S}^{-1}\mathbf{A}\mathbf{S}=\mathbf{\Lambda}\).

Normal matrices can be diagonalised by a unitary transformation - the transformation between orthonormal bases is described by a unitary matrix.

  • Tr and Det are invariant under similarity transformations, so they are most easily read off in the diagonal form: \(\mathrm{Tr}\,\mathbf{A}=\sum_i\lambda_i\), \(\det\mathbf{A}=\prod_i\lambda_i\).
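A numpy sketch of diagonalisation and these invariants, using a real symmetric (hence normal) example:

```python
import numpy as np

rng = np.random.default_rng(3)
M = rng.standard_normal((3, 3))
A = M + M.T                        # real symmetric, hence normal

lam, S = np.linalg.eigh(A)         # eigh: eigenpairs of a Hermitian matrix
Lam = np.diag(lam)

# A S = S Lambda, and S is unitary (here orthogonal), so S^{-1} = S^T
assert np.allclose(A @ S, S @ Lam)
assert np.allclose(S.T @ A @ S, Lam)

# Tr and det are invariant under the similarity transformation
assert np.isclose(np.trace(A), lam.sum())
assert np.isclose(np.linalg.det(A), lam.prod())
```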

Forms

Quadratic form

Any homogeneous quadratic function is a quadratic form of a symmetric matrix. \[ Q(\mathbf{x})=\mathbf{x}^T\mathbf{A}\mathbf{x}=A_{ij}x_i x_j \] By diagonalisation we get a \(Q(\mathbf{x}')\) with no cross-terms: \(Q=\sum_i\lambda_i x_i'^2\).

  • The eigenvectors of \(\mathbf{A}\) define the principal axes of the quadratic form.
  • Positive (semi-)definite means all eigenvalues are greater than (or equal to) 0.
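A small worked example (numbers chosen purely for illustration): take \(Q=2x^2+2xy+2y^2\), i.e. \[ \mathbf{A}=\begin{pmatrix}2&1\\1&2\end{pmatrix},\qquad \det(\mathbf{A}-\lambda\mathbf{I})=0\implies\lambda=3,\ 1 \] with eigenvectors \((1,1)/\sqrt{2}\) and \((1,-1)/\sqrt{2}\). In the eigenvector basis \[ Q=3x'^2+y'^2 \] so the cross-term is gone and the principal axes lie along the eigenvectors.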

Quadratic surface

\[ Q(\mathbf{x})=k \]

  • Ellipsoids - all \(\lambda\) of the same sign

  • Hyperboloids - \(\lambda\) of different signs

  • \(\lambda_1=\lambda_2=\lambda_3\) - sphere

  • \(\lambda_1=\lambda_2\) - surface of revolution about z' (the axis of symmetry)

  • \(\lambda_3=0\) - a cylinder: the conic section \(\lambda_1x'^2+\lambda_2y'^2=k\) translated along z'

Hermitian Form

Similar idea for a complex vector space; after unitary diagonalisation, \[ H(\mathbf{x})=\mathbf{x}^\dagger\mathbf{H}\mathbf{x}=\sum_i\lambda_i|x_i'|^2 \]

Stationary property

Define Rayleigh quotient \[ \lambda(\mathbf{x})=\frac{\mathbf{x}^\dagger\mathbf{A}\mathbf{x}}{\mathbf{x}^\dagger\mathbf{x}} \]

  • \(\lambda\) is a scalar.
  • \(\lambda(\alpha\mathbf{x})=\lambda(\mathbf{x})\) - independent of normalisation.
  • When \(\mathbf{x}\) is an eigenvector, \(\lambda\) is the corresponding eigenvalue.

Rayleigh-Ritz

The eigenvalues of a matrix are the stationary values of the Rayleigh quotient.

Proof by considering \(\lambda(\mathbf{x}+\delta \mathbf{x})-\lambda(\mathbf{x})\) and showing it vanishes to first order in \(\delta\mathbf{x}\) exactly when \(\mathbf{x}\) is an eigenvector.
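A numerical illustration of the stationary property, using a random symmetric matrix (all names illustrative): near an eigenvector the quotient differs from the eigenvalue only at second order in the perturbation.

```python
import numpy as np

rng = np.random.default_rng(4)
M = rng.standard_normal((4, 4))
A = M + M.T                                  # symmetric, so eigh applies

rayleigh = lambda x: (x @ A @ x) / (x @ x)

lam, S = np.linalg.eigh(A)
v = S[:, 0]                                  # an eigenvector
dx = rng.standard_normal(4)                  # arbitrary perturbation direction

for eps in (1e-2, 1e-3, 1e-4):
    # lambda(v + eps*dx) - lambda(v) shrinks like eps**2, not eps
    print(eps, abs(rayleigh(v + eps * dx) - lam[0]))
```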

Cartesian Tensor

Use the transformation matrix \(\mathbf{L}\): \[ L_{ij}=\mathbf{e}_i'\cdot\mathbf{e}_j,\qquad \mathbf{L}= \begin{pmatrix} \leftarrow & \mathbf{e_1'} & \rightarrow\\ \leftarrow & \mathbf{e_2'} &\rightarrow\\ \leftarrow &\mathbf{e_3'} &\rightarrow \end{pmatrix} =\begin{pmatrix} \uparrow & \uparrow & \uparrow\\ \mathbf{e_1}\,\mathrm{in}\,\mathbf{e}' & \mathbf{e_2}\,\mathrm{in}\,\mathbf{e}' & \mathbf{e_3}\,\mathrm{in}\,\mathbf{e}'\\ \downarrow &\downarrow &\downarrow \end{pmatrix} \] (rows are the primed basis vectors in the unprimed basis; columns are the unprimed basis vectors in the primed basis). So the transformation law for a vector is (only this form makes sense for me…) \[ v_i'=L_{ij}v_j= \begin{pmatrix} \uparrow & \uparrow & \uparrow\\ \mathbf{e_1}\,\mathrm{in}\,\mathbf{e}' & \mathbf{e_2}\,\mathrm{in}\,\mathbf{e}' & \mathbf{e_3}\,\mathrm{in}\,\mathbf{e}'\\ \downarrow &\downarrow &\downarrow \end{pmatrix} \begin{pmatrix} \uparrow\\ \mathbf{v}\\\downarrow \end{pmatrix} \]

Definition in terms of transformation laws

Vector: a set of coefficients \(v_i\), defined wrt an orthonormal basis \(\mathbf{e}_i\), such that the coefficients \(v_i'\) wrt another orthonormal basis \(\mathbf{e}_i'\) are given by \(v_i'=L_{ij}v_j\).

Axial vectors: \(v_i'=\det(\mathbf{L})L_{ij}v_j\)

Tensors: \(T_{ijk...}'=L_{ia}L_{jb}L_{kc}...T_{abc...}\)

Pseudo-tensors: \(T_{ijk...}'=\det(\mathbf{L})L_{ia}L_{jb}L_{kc}...T_{abc...}\)

Examples

Everything is checked by the transformation law above.

  • The Kronecker delta \(\delta_{ij}\) is a 2nd-order tensor
  • The Levi-Civita epsilon \(\epsilon_{ijk}\) is a 3rd-order pseudo-tensor
  • Inertia tensor: \(I_{ij}=\int_V\rho(\mathbf{x})(x_kx_k\delta_{ij}-x_ix_j)\,dV\)

Derive from the angular momentum definition.
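Sketch of that derivation: each mass element of a rigid body rotating with angular velocity \(\boldsymbol{\omega}\) has velocity \(\boldsymbol{\omega}\times\mathbf{x}\), so \[ L_i=\int_V\rho\,[\mathbf{x}\times(\boldsymbol{\omega}\times\mathbf{x})]_i\,dV =\int_V\rho\,(x_kx_k\,\omega_i-x_ix_j\omega_j)\,dV =I_{ij}\,\omega_j \] using \(\mathbf{a}\times(\mathbf{b}\times\mathbf{c})=\mathbf{b}(\mathbf{a}\cdot\mathbf{c})-\mathbf{c}(\mathbf{a}\cdot\mathbf{b})\).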

Operations

  1. Addition
  2. Outer product
  3. Inner product (contraction)

Note: the cross-product is defined using the epsilon symbol, so any cross product is a pseudo-tensor (an axial vector).

  1. Symmetric and antisymmetric properties (in a pair of indices) are invariant under transformation.
    • The contraction of a symmetric with an antisymmetric pair of indices vanishes: \(S_{ijk...}A_{ijr...} = 0\).

Proof: use the symmetry properties once each and relabel the dummy indices once, as below.
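Explicitly, for one contracted pair: \[ S_{ij}A_{ij}=S_{ji}A_{ij}=-S_{ji}A_{ji}=-S_{ij}A_{ij}\implies S_{ij}A_{ij}=0 \] (symmetry of \(S\), then antisymmetry of \(A\), then relabelling \(i\leftrightarrow j\)).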

Second-order tensors

Two new things

Antisymmetric (3D)

Only three degrees of freedom, hence can be represented by a vector. \[ A_{ij}=\epsilon_{ijk}\omega_k \] The vector \(\omega\) is a dual vector, defined by \[ \omega_k=\frac{1}{2}\epsilon_{klm}A_{lm} \]
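Consistency check, using \(\epsilon_{klm}\epsilon_{nlm}=2\delta_{kn}\): \[ \frac{1}{2}\epsilon_{klm}A_{lm}=\frac{1}{2}\epsilon_{klm}\epsilon_{lmn}\omega_n=\delta_{kn}\omega_n=\omega_k \] since \(\epsilon_{lmn}=\epsilon_{nlm}\) (cyclic).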

Symmetric (3D)

Can be further decomposed, uniquely, into a traceless symmetric part and a multiple of the identity. \[ S=\tilde{S}+\frac{\mathrm{Tr}(S)}{3}I \] where \(\mathrm{Tr}(\tilde{S})=0\) by construction.

Isotropic tensors

Tensors or pseudo-tensors with the same components in all frames.

  • Scalars are isotropic
  • No non-zero rank 1 isotropic tensor
  • Rank 2 isotropic tensors are \(\lambda\delta_{ij}\)

Proof by (wlog) considering rotations of \(\pi/2\) about the z-axis and about the y-axis.

  • Rank 3 isotropic tensors are \(\lambda\epsilon_{ijk}\)
  • Rank 4 isotropic tensors are \(\lambda\delta_{ij}\delta_{kl}+\mu\delta_{ik}\delta_{jl}+\nu\delta_{il}\delta_{jk}\)

Applications

  1. Since the components are invariant, we can pick the most convenient frame (e.g. the eigenvector frame).
  2. Integrals

Integral 1

\[ X_i=\int_{r\leq a}x_i\rho(r)\,dV \]

Relabel the integration variables as primed and transform with \(x_i'=R_{ij}x_j\): since \(\rho\) depends only on \(r\) and the domain \(r\leq a\) is spherical, we find \(X_i'=X_i\) for any rotation, i.e. \(X_i\) is isotropic. But the only isotropic rank-1 tensor is \(\mathbf{0}\), so \(X_i=0\).

Integral 2

\[ K_{ij}=\int_{r\leq a}x_ix_j\rho(r)\,dV \]

This is also isotropic, so \(K_{ij}=\lambda\delta_{ij}\). Taking the trace fixes \(\lambda\): \(K_{ii}=3\lambda=\int_{r\leq a}r^2\rho(r)\,dV\), hence \[ K_{ij}=\frac{1}{3}\left(\int_{r\leq a}r^2\rho(r)\,dV\right)\delta_{ij} \] Only one scalar integral needs doing.
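A Monte Carlo sanity check of this (a sketch, assuming uniform density \(\rho=1\) on the unit ball; all names are illustrative):

```python
import numpy as np

rng = np.random.default_rng(5)

# Sample points uniformly in the unit ball by rejection from the cube.
pts = rng.uniform(-1, 1, size=(200_000, 3))
pts = pts[(pts**2).sum(axis=1) <= 1.0]
vol = 4 / 3 * np.pi                       # volume of the unit ball

# K_ij = integral of x_i x_j over the ball ~ volume * mean(x_i x_j)
K = vol * (pts[:, :, None] * pts[:, None, :]).mean(axis=0)

# Prediction: K = (1/3) * (integral of r^2 dV) * delta_ij
lam = vol * (pts**2).sum(axis=1).mean() / 3
print(K)                                   # ~ lam * I: off-diagonals ~ 0
print(lam)
```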

Tensor fields

  • \(\nabla\), with components \(\partial_i\), transforms as a vector in Cartesian coordinates only.

One idea that is (maybe) new:

  • The derivative of a second-order tensor \(\sigma_{ij}\) is a rank 3 tensor field \(\partial_i\sigma_{jk}\).

