# Daniel Kressner (EPF Lausanne): Randomized joint diagonalization

It is a basic linear algebra result that a family of two or more commuting symmetric matrices has a common eigenvector basis and can thus be jointly diagonalized. Perhaps surprisingly, the development of a robust numerical algorithm for effecting such a joint diagonalization is by no means trivial. To start with, roundoff error or other forms of error will inevitably destroy the commutativity assumption. Consequently, one can at best hope to find an orthogonal matrix that almost diagonalizes every matrix in the family. Most current methods solve this problem using optimization techniques. In this talk, we propose a novel randomized method that reduces the problem to a standard eigenvalue problem through random linear combinations. Unlike existing optimization-based methods, our algorithm is easy to implement and leverages existing high-quality linear algebra software packages. We establish its robustness through a perturbation analysis, proving that the backward error of the output orthogonal joint diagonalizer is small with high probability. The algorithm can be further improved by deflation techniques. Through numerical experiments on synthetic and real-world data, we show that our algorithm achieves state-of-the-art performance.
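The core idea can be illustrated in a few lines. The sketch below (a minimal illustration, not the full algorithm from the talk, which additionally handles noise, robustness guarantees, and deflation) builds two exactly commuting symmetric matrices, takes a random linear combination, and diagonalizes both with the eigenvectors of that combination:

```python
import numpy as np

rng = np.random.default_rng(0)
n = 5

# Two commuting symmetric matrices, constructed from a shared eigenvector basis Q.
Q, _ = np.linalg.qr(rng.standard_normal((n, n)))
A = Q @ np.diag(rng.standard_normal(n)) @ Q.T
B = Q @ np.diag(rng.standard_normal(n)) @ Q.T
A, B = (A + A.T) / 2, (B + B.T) / 2  # symmetrize against roundoff

# Random linear combination: with probability 1 it has distinct eigenvalues,
# so its eigenvector matrix V is (up to signs/permutation) the common basis.
mu = rng.standard_normal(2)
C = mu[0] * A + mu[1] * B
_, V = np.linalg.eigh(C)  # standard symmetric eigenvalue problem

# Off-diagonal residue of V.T @ A @ V and V.T @ B @ V should be tiny.
def off(M):
    return np.linalg.norm(M - np.diag(np.diag(M)))

print(off(V.T @ A @ V), off(V.T @ B @ V))
```

When the input matrices only approximately commute, `V` no longer diagonalizes them exactly; the point of the perturbation analysis mentioned above is that the resulting off-diagonal residue stays comparable to the commutation error with high probability.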

This talk is based on joint work with Haoze He.

Supported by the joint SNSF research project Probabilistic methods for joint and singular eigenvalue problems (grant 200021L_192049) and the Slovenian Research Agency (grant N1-0154).