
Suppose that we have an $n \times n$ symmetric positive definite (SPD) matrix $\bf Q$ and that we would like to compute its condition number via convex optimization. In section 3.2 of Boyd et al.$^\color{magenta}{\star}$, the following eigenvalue problem (EVP) is proposed

$$ \begin{array}{ll} \underset {\gamma, \nu \in {\Bbb R}} {\text{minimize}} & \gamma \\ \text{subject to} & {\bf I}_n \preceq \nu \, {\bf Q} \preceq \gamma {\bf I}_n\end{array} \tag{EVP} $$
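For the record, here is a short argument (mine, not spelled out in the book, so please check it) for why the optimal value of (EVP) is exactly the condition number: the constraint ${\bf I}_n \preceq \nu \, {\bf Q}$ forces $\nu \, \lambda_{\min} ({\bf Q}) \geq 1$, and $\nu \, {\bf Q} \preceq \gamma {\bf I}_n$ forces $\nu \, \lambda_{\max} ({\bf Q}) \leq \gamma$, so every feasible point satisfies

$$ \gamma \geq \nu \, \lambda_{\max} ({\bf Q}) \geq \frac{\lambda_{\max} ({\bf Q})}{\lambda_{\min} ({\bf Q})} = \kappa ({\bf Q}) $$

and the choice $\nu = 1 / \lambda_{\min} ({\bf Q})$, $\gamma = \kappa ({\bf Q})$ attains the bound.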

Alternatively, assuming I made no mistakes, the mid-range of the (real) spectrum of $\bf Q$ can be found via minimization of a spectral norm, as follows

$$ x_{\min} := \arg\min_{x \in {\Bbb R}} \left\| x {\bf I}_n - {\bf Q} \right\|_2 = \frac{\lambda_{\min} ({\bf Q}) + \lambda_{\max} ({\bf Q})}{2} = \frac{\sigma_{\min} ({\bf Q}) + \sigma_{\max} ({\bf Q})}{2} $$

where the last equality holds because the matrix $\bf Q$ is SPD, so its eigenvalues and singular values coincide. Let the corresponding minimum be denoted by

$$ y_{\min} := \left\| x_{\min} {\bf I}_n - {\bf Q} \right\|_2 = \frac{\lambda_{\max} ({\bf Q}) - \lambda_{\min} ({\bf Q})}{2} = \frac{\sigma_{\max} ({\bf Q}) - \sigma_{\min} ({\bf Q})}{2} $$

Thus, the condition number of $\bf Q$ is

$$ \kappa ({\bf Q}) := \frac{\sigma_{\max} ({\bf Q})}{\sigma_{\min} ({\bf Q})} = \color{blue}{\frac{x_{\min} + y_{\min}}{x_{\min} - y_{\min}}} $$
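As a quick sanity check of the blue formula (my own check, not from any reference): for a random SPD matrix, computing $x_{\min}$ and $y_{\min}$ directly from the eigenvalues and plugging them into the formula reproduces NumPy's 2-norm condition number.

```python
import numpy as np

# random SPD test matrix (shifted Gram matrix, so positive definite)
rng = np.random.default_rng(1)
A = rng.standard_normal((8, 8))
Q = A @ A.T + 8 * np.identity(8)

lam = np.linalg.eigvalsh(Q)       # eigenvalues = singular values for SPD Q
x_min = (lam[0] + lam[-1]) / 2    # mid-range of the spectrum
y_min = (lam[-1] - lam[0]) / 2    # half the spectral spread

kappa = (x_min + y_min) / (x_min - y_min)
print(kappa, np.linalg.cond(Q, 2))  # the two values agree
```

Indeed, $x_{\min} + y_{\min} = \lambda_{\max}$ and $x_{\min} - y_{\min} = \lambda_{\min}$, so the ratio is $\lambda_{\max} / \lambda_{\min}$ by construction.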

I am looking for references, as it seems unlikely that no one has arrived at this result before. Of course, perhaps it was never published because it is too computationally expensive or too numerically unstable; the difference $x_{\min} - y_{\min}$ in the denominator is a bit worrying, for example. Any references on the computation of condition numbers of SPD matrices via convex optimization would be welcome.


Numerical experiment

Amongst SPD matrices, Hilbert matrices are infamously ill-conditioned. Let ${\bf H}_5$ denote the $5 \times 5$ Hilbert matrix; its condition number is $\kappa \left( {\bf H}_5 \right) \approx 4.8 \cdot 10^5$.

$$ {\bf H}_5 := \begin{bmatrix} 1 & \frac{1}{2} & \frac{1}{3} & \frac{1}{4} & \frac{1}{5} \\ \frac{1}{2} & \frac{1}{3} & \frac{1}{4} & \frac{1}{5} & \frac{1}{6} \\ \frac{1}{3} & \frac{1}{4} & \frac{1}{5} & \frac{1}{6} & \frac{1}{7} \\ \frac{1}{4} & \frac{1}{5} & \frac{1}{6} & \frac{1}{7} & \frac{1}{8} \\ \frac{1}{5} & \frac{1}{6} & \frac{1}{7} & \frac{1}{8} & \frac{1}{9} \end{bmatrix} $$
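The quoted figure is easy to reproduce on its own, without any convex optimization (the entrywise formula $({\bf H}_5)_{ij} = 1/(i+j+1)$ below uses 0-based indices):

```python
import numpy as np

# build the 5 x 5 Hilbert matrix entrywise
H_5 = np.fromfunction(lambda i, j: 1.0 / (i + j + 1), (5, 5))

# 2-norm condition number, roughly 4.8e5
print(np.linalg.cond(H_5))
```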

The following Python / NumPy / CVXPY script computes the condition number of ${\bf H}_5$ in three ways: via NumPy's function linalg.cond, via the spectral norm minimization problem (SNMP) above, and via the eigenvalue problem (EVP) proposed by Boyd et al.$^\color{magenta}{\star}$:

import cvxpy as cp
import numpy as np


# compute the condition number of symmetric positive definite matrix Q
# via the mid-range of the (real) spectrum via spectral norm minimization
def cond_SNMP(Q):

    n = Q.shape[0]

    # create n x n identity matrix
    I_n = np.identity(n)

    # solve the spectral norm minimization problem (SNMP)
    x    = cp.Variable()
    SNMP = cp.Problem(cp.Minimize(cp.norm((x * I_n) - Q, 2)))
    SNMP.solve(verbose=False)

    x_min = x.value     # mid-range of the spectrum
    y_min = SNMP.value  # half the spectral spread

    # if the SNMP was solved, return the estimate of the condition number of Q
    if SNMP.status == "optimal":
        return (x_min + y_min) / (x_min - y_min)
    return "Error!"


# compute the condition number of symmetric positive definite matrix Q
# via Boyd, El Ghaoui, Feron & Balakrishnan's eigenvalue problem (EVP)
def cond_EVP(Q):

    n = Q.shape[0]

    # create n x n identity matrix
    I_n = np.identity(n)

    # solve the eigenvalue problem (EVP): I <= nu Q <= gamma I
    gamma       = cp.Variable()
    nu          = cp.Variable()
    constraints = [ (nu    *   Q) -     I_n  >> 0,
                    (gamma * I_n) - (nu * Q) >> 0,
                                          nu >= 0 ]
    EVP         = cp.Problem(cp.Minimize(gamma), constraints)
    EVP.solve(max_iters=1000000, verbose=False)

    # if the EVP was solved, return the estimate of the condition number of Q
    if EVP.status == "optimal":
        return gamma.value
    return "Error!"


def main():

    # create the 5 x 5 Hilbert matrix
    H_5 = np.fromfunction(lambda i, j: 1 / (i + j + 1), (5, 5))

    print("Estimate of the condition number of H_5 via NumPy:", np.linalg.cond(H_5))
    print("Estimate of the condition number of H_5 via SNMP :",      cond_SNMP(H_5))
    print("Estimate of the condition number of H_5 via EVP  :",       cond_EVP(H_5))


if __name__ == "__main__":
    main()

The output of this script is as follows:

Estimate of the condition number of H_5 via NumPy: 476607.2502419222
Estimate of the condition number of H_5 via SNMP : 476607.50854445784
Estimate of the condition number of H_5 via EVP  : 476611.42072007037

Surprisingly, solving the SNMP produces an estimate of the condition number $\kappa \left( {\bf H}_5 \right)$ that is closer to the estimate produced by NumPy's function linalg.cond than the estimate produced by solving the EVP. Also, setting verbose = True to inspect the solver output shows that both the SNMP and the EVP were solved numerically with Brendan O'Donoghue's Splitting Conic Solver (SCS, version 3.2.7). However, the solver required approximately $6 \cdot 10^2$ iterations to solve the SNMP, whereas, shockingly, it required approximately $2.5 \cdot 10^5$ iterations to solve the EVP!


References

$\color{magenta}{\star}$ Stephen Boyd, Laurent El Ghaoui, Eric Feron, Venkataramanan Balakrishnan, Linear Matrix Inequalities in System and Control Theory, Volume 15 of Studies in Applied Mathematics, SIAM, 1994.
