a discrete-time sinusoid is periodic only if its frequency, \(\omega \), is a rational multiple of \( 2\pi \)
or equivalently, if \(e^{j\omega N} = 1\) for some integer \(N\)
so not every sinusoid is periodic in discrete time
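a quick numpy check of this (the frequencies here are chosen just for illustration):

```python
import numpy as np

n = np.arange(24)

# omega = 2*pi/8 is a rational multiple of 2*pi -> periodic with N = 8
x_per = np.cos(2 * np.pi / 8 * n)
print(np.allclose(x_per[:8], x_per[8:16]))    # True: repeats every 8 samples

# omega = 1 rad/sample: 1/(2*pi) is irrational -> never repeats exactly
x_aper = np.cos(1.0 * n)
print(np.allclose(x_aper[:8], x_aper[8:16]))  # False
```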
aliasing
the same point on the unit circle may have many names:
the point at \(e^{j\alpha}\) can also be written as
\(e^{j(\alpha + 2\pi)}\)
\(e^{j(\alpha + 6\pi)}\)
\(e^{j(\alpha - 2\pi)}\)
this is called aliasing
a natural property of the complex exponential
in discrete time, this limits how fast we can go around the unit circle
the range of distinct frequencies is limited:
\(0 \leq \omega < 2\pi\)
when \(\omega\) goes beyond \(2\pi\), the periodicity of the complex exponential
folds it back into this range via a modulo operation
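a minimal sketch of this wrap-around, with an arbitrary \(\omega\):

```python
import numpy as np

n = np.arange(8)
w = 0.3 * np.pi  # arbitrary frequency for illustration

# over integer n, the extra factor e^{j 2*pi n} is identically 1,
# so omega and omega + 2*pi produce exactly the same samples
x1 = np.exp(1j * w * n)
x2 = np.exp(1j * (w + 2 * np.pi) * n)
print(np.allclose(x1, x2))  # True
```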
even within this range, care must be taken to distinguish forwards from backwards motion
when \(\omega = \pi \), \(e^{j\pi n} = (-1)^n\):
the point simply oscillates between \( 1 \) and \(-1\) on the unit circle
when \(\omega > \pi\),
the motion can also be viewed as a backwards rotation reaching the same point
and the backwards step, \(2\pi - \omega\), is the shorter way to get there
so anytime \(\omega > \pi\),
the motion appears as a smaller step in the clockwise direction
this backwards-motion aliasing is even more pronounced when \(\omega\) is close to \(2\pi\)
if \(\omega \approx 2\pi \), the rotating point appears nearly stationary due to aliasing
so at different frequencies, i.e. different \(\omega \) values, aliasing introduces different illusory artifacts
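the backwards-motion illusion can be checked numerically too (frequencies again arbitrary):

```python
import numpy as np

n = np.arange(6)

# a counterclockwise step of 1.9*pi per sample...
x_fwd = np.exp(1j * 1.9 * np.pi * n)
# ...is indistinguishable from a clockwise step of 0.1*pi per sample
x_back = np.exp(-1j * 0.1 * np.pi * n)
print(np.allclose(x_fwd, x_back))  # True
```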
eigenvalues and eigenvectors
almost all vectors change direction when they are multiplied by a matrix
however, certain vectors remain in the same direction even after multiplication by that matrix
those vectors are the eigenvectors
since the direction is unchanged, multiplication by the matrix amounts to scaling the original vector
the corresponding scaling factors are called eigenvalues
consider equation: \( Ax = \lambda x \)
\(A\): square matrix
\(\lambda\): eigenvalues of \(A\)
\(x\): eigenvectors of \(A\)
rearranging this, we get: \( (A - \lambda I)x = 0\)
a nonzero \(x\) exists only if \(A - \lambda I\) is singular, which gives A’s characteristic equation:
\(\lvert A - \lambda I \rvert = 0\)
where \(\lvert A \rvert\): determinant of matrix \(A\)
the roots of this equation are the eigenvalues \(\lambda\)
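as a small worked example (the matrix is arbitrary): for \(A = \begin{pmatrix} 2 & 1 \\ 1 & 2 \end{pmatrix}\), the characteristic equation is \((2-\lambda)^2 - 1 = 0\), giving \(\lambda_1 = 3\) and \(\lambda_2 = 1\)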
having computed the eigenvalues \(\lambda\), the eigenvectors \(x\) can be found using \( Ax = \lambda x \)
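continuing the example: substituting \(\lambda_1 = 3\) into \(Ax = \lambda x\) gives \(2x_1 + x_2 = 3x_1\), i.e. \(x_2 = x_1\), so any vector \((c, c)\) with \(c \neq 0\) is an eigenvector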
the equation fixes eigenvector elements only up to a common scale (only the ratios between elements are determined), so normalize them with the square root of the sum of squares of all elements
i.e. divide by the length (the Euclidean norm, i.e. the modulus) of the vector
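the same computation with numpy, using the same arbitrary matrix as above; note that np.linalg.eig already returns unit-norm eigenvectors as columns:

```python
import numpy as np

A = np.array([[2.0, 1.0],
              [1.0, 2.0]])

# eigenvalues and eigenvectors (the eigenvectors are the columns of eigvecs)
eigvals, eigvecs = np.linalg.eig(A)
print(eigvals)                          # [3. 1.] (order not guaranteed)
print(np.linalg.norm(eigvecs, axis=0))  # [1. 1.]: columns are already unit length

# sanity check: A x = lambda x for each eigenpair
for lam, x in zip(eigvals, eigvecs.T):
    print(np.allclose(A @ x, lam * x))  # True, True
```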