Tea Hour No. 1: A matrix of sums

Let \(n\) be a positive integer and \(a = (a_1, a_2, \dotsc, a_n)\) a vector in \(\mathbb{R}^n\). What are the eigenvalues of the \((n\times n)\)-matrix \[ A = (a_i + a_j)_{ij}, \] and when is this matrix positive semidefinite?


For any two vectors \(x = (x_1, x_2, \dotsc, x_n)\) and \(y = (y_1, y_2, \dotsc, y_n)\) in \(\mathbb{R}^n\), let \[ \langle x, y \rangle = \sum_{i = 1}^n x_i y_i \] denote the standard inner product, and let \(\|x\| = \sqrt{\langle x, x \rangle}\) be the induced norm. Writing \(e\) for the vector \((1, 1, \dotsc, 1)\) in \(\mathbb{R}^n\), observe that for every vector \(x = (x_1, x_2, \dotsc, x_n)\) in \(\mathbb{R}^n\) we have \[ Ax = \langle x , a \rangle e + \langle x, e \rangle a. \] Hence the kernel of \(A\) consists exactly of the vectors orthogonal to both \(e\) and \(a\). If \(a\) and \(e\) are linearly dependent, say \(a = \alpha e\) for some \(\alpha \in \mathbb{R}\), then \(Ax = 2\alpha \langle x, e \rangle e\), so the eigenvalues of \(A\) are \(2\alpha n\) (with eigenvector \(e\)) and, when \(n \ge 2\), zero (with multiplicity \(n - 1\)).
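This identity is easy to check numerically. The following plain-Python sketch (the variable names are mine, not part of the argument) builds \(Ax\) directly from the definition \(A = (a_i + a_j)_{ij}\) and compares it against \(\langle x, a \rangle e + \langle x, e \rangle a\) for a random choice of \(a\) and \(x\):

```python
from math import isclose
from random import random, seed

# Numerical check of the identity Ax = <x, a> e + <x, e> a
# for A = (a_i + a_j)_{ij}, with randomly chosen a and x.
seed(0)
n = 5
a = [random() for _ in range(n)]
x = [random() for _ in range(n)]
e = [1.0] * n

def dot(u, v):
    return sum(p * q for p, q in zip(u, v))

# Left-hand side: A x computed directly from the definition of A.
Ax = [sum((a[i] + a[j]) * x[j] for j in range(n)) for i in range(n)]

# Right-hand side: <x, a> e + <x, e> a, computed componentwise.
rhs = [dot(x, a) * e[i] + dot(x, e) * a[i] for i in range(n)]

assert all(isclose(p, q) for p, q in zip(Ax, rhs))
```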

Now let us assume that \(a\) and \(e\) are linearly independent, so in particular \(n \ge 2\). Since \(e\) and \(a\) span a two-dimensional subspace, the kernel of \(A\) has dimension \(n - 2\); let \((v_1, v_2, \dotsc, v_{n - 2})\) be a basis of it. Passing to the basis \((e, a, v_1, v_2, \dotsc, v_{n - 2})\) of \(\mathbb{R}^n\), we see that \(A\) is similar to the block matrix \[ \begin{pmatrix} A' & 0 \\ 0 & 0 \end{pmatrix}, \quad \text{where} \quad A' = \begin{pmatrix} \langle e, a \rangle & \|a\|^2 \\ n & \langle e, a \rangle \end{pmatrix}, \] so that it suffices to compute the eigenvalues of \(A'\) to find the nonzero eigenvalues of \(A\). These are \[ \lambda_{\pm} = \langle e, a \rangle \pm \sqrt{n} \|a\|. \] Note how this formula is still valid if \(a\) and \(e\) are linearly dependent. In particular, since \(\|e\| = \sqrt{n}\), the smaller eigenvalue \(\lambda_-\) is nonnegative if and only if \(\langle e, a \rangle \ge \|e\| \|a\|\), which by the Cauchy–Schwarz inequality forces equality, that is, \(a = \alpha e\) for some \(\alpha \ge 0\). Hence \(A\) is positive semidefinite if and only if \(a\) is a nonnegative multiple of \(e\).
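As a numerical sanity check, the \(2 \times 2\) block \(A'\) also yields explicit eigenvectors, namely \(\|a\| e \pm \sqrt{n}\, a\) for \(\lambda_\pm\); this is not stated above but follows by a direct computation. The following plain-Python sketch verifies \(Av = \lambda_\pm v\) for a random \(a\), together with the trace identity \(\lambda_+ + \lambda_- = \operatorname{tr} A = 2 \langle e, a \rangle\):

```python
from math import isclose, sqrt
from random import random, seed

# Check that lambda_{+-} = <e, a> +- sqrt(n) ||a|| are eigenvalues of
# A = (a_i + a_j)_{ij}, using the candidate eigenvectors
# ||a|| e +- sqrt(n) a derived from the 2x2 block A'.
seed(1)
n = 6
a = [random() for _ in range(n)]   # generic a, linearly independent of e
e = [1.0] * n

def dot(u, v):
    return sum(p * q for p, q in zip(u, v))

norm_a = sqrt(dot(a, a))
ip = dot(e, a)                     # <e, a>

for sign in (1.0, -1.0):
    lam = ip + sign * sqrt(n) * norm_a                        # lambda_{+-}
    v = [norm_a * ei + sign * sqrt(n) * ai for ei, ai in zip(e, a)]
    Av = [sum((a[i] + a[j]) * v[j] for j in range(n)) for i in range(n)]
    assert all(isclose(p, lam * q) for p, q in zip(Av, v))

# Cross-check: the trace of A equals lambda_+ + lambda_- = 2 <e, a>.
assert isclose(sum(2 * ai for ai in a), 2 * ip)
```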