Let \(0<x_1<x_2<\ldots<x_n\) and let \(A\) be the symmetric \(n\times n\) matrix with \({ij}^\text{th}\) entry \(A_{ij}=\min\{x_i,x_j\}\).[^1] This matrix is invertible: subtracting each row from the one below it, working from the bottom row up, leaves an upper triangular matrix with positive diagonal entries \(x_1,\,x_2-x_1,\,\ldots,\,x_n-x_{n-1}\).
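It follows that
$$\det A=x_1\prod_{i=2}^n(x_i-x_{i-1})>0.$$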
Its inverse \(A^{-1}\)
is symmetric, tridiagonal, and has \({ij}^\text{th}\)
entry
$$[A^{-1}]_{ij}=\begin{cases} \frac{1}{x_1}+\frac{1}{x_2-x_1} & \text{if}\ i=j=1 \\ \frac{1}{x_i-x_{i-1}}+\frac{1}{x_{i+1}-x_i} & \text{if}\ 1<i=j<n \\ \frac{1}{x_n-x_{n-1}} & \text{if}\ i=j=n \\ -\frac{1}{x_j-x_i} & \text{if}\ i=j-1 \\ -\frac{1}{x_i-x_j} & \text{if}\ i=j+1 \\ 0 & \text{otherwise}. \end{cases}$$
For example, if \(x_i=2^{i-1}\)
for each \(i\le n=5\)
then
$$A=\begin{bmatrix} 1 & 1 & 1 & 1 & 1 \\ 1 & 2 & 2 & 2 & 2 \\ 1 & 2 & 4 & 4 & 4 \\ 1 & 2 & 4 & 8 & 8 \\ 1 & 2 & 4 & 8 & 16 \\ \end{bmatrix}$$
and
$$A^{-1}=\begin{bmatrix} 2 & -1 & 0 & 0 & 0 \\ -1 & 1.5 & -0.5 & 0 & 0 \\ 0 & -0.5 & 0.75 & -0.25 & 0 \\ 0 & 0 & -0.25 & 0.375 & -0.125 \\ 0 & 0 & 0 & -0.125 & 0.125 \\ \end{bmatrix}$$
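As a sanity check, here is a short Python sketch (using NumPy; the variable names are my own) that builds \(A^{-1}\) from the closed-form expression above and compares it with a direct numerical inverse for this example:

```python
import numpy as np

# Points 0 < x_1 < ... < x_n from the example: x_i = 2^(i-1).
x = np.array([1.0, 2.0, 4.0, 8.0, 16.0])
n = len(x)

# A has ij-th entry min(x_i, x_j).
A = np.minimum.outer(x, x)

# Build A^{-1} from the closed-form expression.
inv = np.zeros((n, n))
d = np.diff(x)  # the gaps x_{i+1} - x_i
inv[0, 0] = 1 / x[0] + 1 / d[0]
for i in range(1, n - 1):
    inv[i, i] = 1 / d[i - 1] + 1 / d[i]
inv[n - 1, n - 1] = 1 / d[n - 2]
for i in range(n - 1):
    inv[i, i + 1] = inv[i + 1, i] = -1 / d[i]

assert np.allclose(inv, np.linalg.inv(A))  # agrees with the numeric inverse
```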
You may wonder: why is this useful?
Suppose I observe data \(\{(x_i,y_i)\}_{i=1}^n\), where the function \(f:[0,\infty)\to\mathbb{R}\) mapping regressors \(x_i>0\) to outcomes \(y_i=f(x_i)\) is a realization of a Wiener process. I use these data to estimate some value \(f(x)\) via Bayesian regression. My estimate depends on the inverse of the covariance matrix of the outcome vector \(y=(y_1,y_2,\ldots,y_n)\). This matrix has \({ij}^\text{th}\) entry \(\min\{x_i,x_j\}\), so I can compute its inverse using the expression above.
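For concreteness, here is a minimal sketch of that regression step, assuming noiseless observations and a pure Wiener-process prior (the function name and the sample data are hypothetical):

```python
import numpy as np

def wiener_posterior_mean(x_obs, y_obs, x_new):
    """Posterior mean of f(x_new) under a Wiener-process prior,
    given noiseless observations y_obs = f(x_obs).

    The prior covariance is k(s, t) = min(s, t), so the covariance
    matrix of the observations is exactly the matrix A above.
    """
    A = np.minimum.outer(x_obs, x_obs)    # covariance of the outcome vector
    k = np.minimum(x_new, x_obs)          # cross-covariances min(x, x_i)
    return k @ np.linalg.solve(A, y_obs)  # k^T A^{-1} y

x_obs = np.array([1.0, 2.0, 4.0, 8.0, 16.0])
y_obs = np.array([0.3, -0.1, 0.8, 1.2, 0.5])
print(wiener_posterior_mean(x_obs, y_obs, 5.0))
```

Since \(A^{-1}\) is tridiagonal and known in closed form, the product \(A^{-1}y\) costs only \(O(n)\) operations; in fact the posterior mean reduces to linear interpolation between the two observations bracketing \(x\) (and stays constant beyond \(x_n\)), a familiar Brownian-bridge property.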
[^1]: Let me know if the family of such matrices has a name!