For a diagonal matrix, the eigenvalues are the diagonal entries, so the matrix 2-norm equals the largest absolute value of an eigenvalue. Show that this is not true for an arbitrary matrix by providing a counterexample in MATLAB/Octave. (In MATLAB/Octave, eig(A) computes the eigenvalues of A.) For a matrix A with real entries, there is a relationship between the norm of A and the eigenvalues of A^T A. Can you guess what it is using MATLAB/Octave?
Expert Answer
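A simple counterexample is A = [0 1; 0 0]: both of its eigenvalues are 0, yet norm(A) = 1, so for a general matrix the 2-norm need not equal the largest absolute eigenvalue. Experimenting with eig(A'*A) suggests the relationship norm(A) = sqrt(max(eig(A'*A))), i.e. the 2-norm of A is the square root of the largest eigenvalue of A^T A (equivalently, the largest singular value of A). The script below is a minimal sketch of this experiment; the second test matrix B = [1 2; 3 4] is an arbitrary choice.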
Executable Code:
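% Counterexample: a non-diagonal matrix whose 2-norm is not
% the largest absolute value of its eigenvalues.
A = [0 1; 0 0];
ev = eig(A)                  % both eigenvalues are 0
nA = norm(A)                 % but the 2-norm is 1

% Conjecture: norm(B) equals the square root of the largest
% eigenvalue of B'*B (the largest singular value of B).
B = [1 2; 3 4];              % an arbitrary test matrix
guess = sqrt(max(eig(B'*B)))
nB = norm(B)                 % matches the conjectured value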
Sample Output:
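Running the script in Octave should print roughly the following (values verified by hand; display formatting differs slightly in MATLAB):

ev =

   0
   0

nA = 1
guess = 5.4650
nB = 5.4650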