Let $M$ be the $n\times n$ matrix with $(i,j)$th entry $\min\{i,j\}$. From a previous post, we know $M$ has a tridiagonal inverse $M^{-1}$ with $(i,j)$th entry[^1]
$$[M^{-1}]_{ij}=\begin{cases}2&\text{if }i=j<n\\1&\text{if }i=j=n\\-1&\text{if }\vert i-j\vert=1\\0&\text{otherwise.}\end{cases}$$
For example, if $n=4$ then
$$M=\begin{bmatrix}1&1&1&1\\1&2&2&2\\1&2&3&3\\1&2&3&4\end{bmatrix}$$
has inverse
$$M^{-1}=\begin{bmatrix}2&-1&0&0\\-1&2&-1&0\\0&-1&2&-1\\0&0&-1&1\end{bmatrix}.$$
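As a quick numerical check, here is a minimal NumPy sketch (the helper `min_matrix` and the variable names are mine, not from the post) that builds $M$ for $n=4$ and confirms the claimed tridiagonal inverse:

```python
import numpy as np

def min_matrix(n):
    """Return the n x n matrix with (i, j) entry min(i, j), 1-based."""
    i = np.arange(1, n + 1)
    return np.minimum.outer(i, i)

n = 4
M = min_matrix(n)

# Claimed inverse: 2 on the diagonal (but 1 in the last position)
# and -1 on the sub- and super-diagonals.
Minv = 2 * np.eye(n) - np.eye(n, k=1) - np.eye(n, k=-1)
Minv[-1, -1] = 1

assert np.allclose(M @ Minv, np.eye(n))  # so Minv is indeed M^{-1}
```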
We can use our knowledge of $M^{-1}$ to eigendecompose $M$. To see how, let $(\lambda_k,v_k)$ for $k\in\{1,2,\ldots,n\}$ be the eigenpairs of $M^{-1}$. Yueh (2005, Theorem 1) shows that the eigenvector $v_k$ corresponding to the eigenvalue
$$\lambda_k=2-2\cos\left(\frac{(2k-1)\pi}{2n+1}\right)$$
has $i$th component
$$v_{ki}=c\sin\left(\frac{(2k-1)i\pi}{2n+1}\right),$$
where $c$ is an arbitrary scalar. This vector has length
$$\Vert v_k\Vert=\sqrt{\sum_{i=1}^n v_{ki}^2}=\vert c\vert\,\frac{\sqrt{2n+1}}{2},$$
where the last equality can be verified using Wolfram Alpha and proved using complex analysis. So choosing $c=2/\sqrt{2n+1}$ ensures that the eigenvectors of $M^{-1}$ have unit length. Then, by the spectral theorem, these vectors form an orthonormal basis for $\mathbb{R}^n$. As a result, the matrix $V$ with $(i,k)$th entry $v_{ki}$ is orthogonal. Moreover, letting $\Lambda$ be the diagonal matrix with $k$th diagonal entry $\lambda_k$ yields the eigendecomposition $M^{-1}=V\Lambda V^\top$ of $M^{-1}$. It follows from the orthogonality of $V$ that $M=V\Lambda^{-1}V^\top$ is the eigendecomposition of $M$. Thus $M$ and $M^{-1}$ have the same eigenvectors, but the eigenvalues of $M$ are the reciprocated eigenvalues of $M^{-1}$.
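To make the decomposition concrete, here is a short NumPy sketch (variable names are my own) that constructs $V$ and $\Lambda$ from the formulas above and checks that $V$ is orthogonal, that $M^{-1}=V\Lambda V^\top$, and that $M=V\Lambda^{-1}V^\top$:

```python
import numpy as np

n = 4
i = np.arange(1, n + 1)                 # component indices
k = np.arange(1, n + 1)                 # eigenpair indices
theta = (2 * k - 1) * np.pi / (2 * n + 1)

lam = 2 - 2 * np.cos(theta)             # eigenvalues of M^{-1}
c = 2 / np.sqrt(2 * n + 1)              # normalizing constant
V = c * np.sin(np.outer(i, theta))      # (i, k) entry is v_{ki}
L = np.diag(lam)

M = np.minimum.outer(i, i)              # the min matrix from above

assert np.allclose(V @ V.T, np.eye(n))             # V is orthogonal
assert np.allclose(V @ L @ V.T, np.linalg.inv(M))  # M^{-1} = V Lam V'
assert np.allclose(V @ np.diag(1 / lam) @ V.T, M)  # M = V Lam^{-1} V'
```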
Here’s one scenario in which this decomposition is useful: Suppose I observe data $y=(y_1,y_2,\ldots,y_n)$ generated by the process
$$y_t=x_t+\epsilon_t,$$
where $x=(x_1,x_2,\ldots,x_n)$ is a sample path of a standard Wiener process and where the errors $\epsilon_t$ are iid normally distributed with variance $\sigma^2$. I use these data to estimate $x_T$ for some $T\in\{1,2,\ldots,n\}$.[^2] My estimator $\hat{x}_T=m_T^\top\Sigma^{-1}y$ has conditional variance
$$\mathrm{Var}\left(\hat{x}_T\mid x\right)=\sigma^2m_T^\top\Sigma^{-2}m_T,$$
where $m_T$ is the vector with $t$th component $\min\{t,T\}$ and where $\Sigma$ is the covariance matrix with $(s,t)$th entry $\mathrm{Cov}(y_s,y_t)$. If the errors are independent of $x$, then $\mathrm{Cov}(y_s,y_t)=\min\{s,t\}+\sigma^2\,\mathbb{1}\{s=t\}$ for each $s$ and $t$, and we can express this matrix as the sum
$$\Sigma=M+\sigma^2I,$$
where $M$ is the matrix defined above and where $I$ is the $n\times n$ identity matrix. But we know $M=V\Lambda^{-1}V^\top$. We also know $I=VV^\top$, since $V$ is orthogonal. It follows that
$$\Sigma=V\left(\Lambda^{-1}+\sigma^2I\right)V^\top,$$
from which we can derive a (relatively) closed-form expression for the conditional variance of $\hat{x}_T$ given $x$.
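Continuing the sketch in NumPy (again, variable names are mine, and the estimator follows my reading of the setup above), we can check that routing the computation through the eigendecomposition reproduces the conditional variance obtained by direct matrix inversion:

```python
import numpy as np

n, T, sigma2 = 10, 5, 0.5     # sample size, target index, noise variance

i = np.arange(1, n + 1)
theta = (2 * i - 1) * np.pi / (2 * n + 1)
lam = 2 - 2 * np.cos(theta)
V = (2 / np.sqrt(2 * n + 1)) * np.sin(np.outer(i, theta))

M = np.minimum.outer(i, i)    # Cov(x_s, x_t) = min(s, t)
m_T = np.minimum(i, T)        # vector with t-th component min(t, T)

# Direct: Var = sigma^2 * m_T' Sigma^{-2} m_T with Sigma = M + sigma^2 I.
Sigma = M + sigma2 * np.eye(n)
S_inv = np.linalg.inv(Sigma)
direct = sigma2 * m_T @ S_inv @ S_inv @ m_T

# Via the eigendecomposition: Sigma^{-2} = V (Lam^{-1} + sigma^2 I)^{-2} V',
# so the variance reduces to a weighted sum of squares.
w = V.T @ m_T
via_eig = sigma2 * np.sum(w**2 / (1 / lam + sigma2) ** 2)

assert np.allclose(direct, via_eig)
```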
[^1]: One can verify this claim by showing $MM^{-1}$ equals the identity matrix.

[^2]: I discuss this estimation problem in a recent paper.