
A generic $k \times k$ block symmetric matrix $\Sigma$ is denoted as \begin{align} \Sigma = \begin{bmatrix}\Sigma_{11} & \Sigma_{12} & \ldots & \Sigma_{1k} \\ \Sigma_{21} & \Sigma_{22} & \ldots & \Sigma_{2k} \\ \vdots & \vdots & \ddots & \vdots \\ \Sigma_{k1} & \Sigma_{k2} & \ldots & \Sigma_{kk}\end{bmatrix}. \end{align} It has the following structure:

  1. For all $i$, $\Sigma_{ii} \in \mathbb{S}^{p_i}$, the space of symmetric $p_i \times p_i$ matrices, and it is positive definite.

  2. For $i \neq j$, $\Sigma_{ij} \in \mathbb{R}^{p_i \times p_j} $ and $\Sigma_{ij} = \lambda \Sigma_{ii} \theta_{i} \theta_{j}^\top \Sigma_{jj}$, where $\lambda \in \mathbb{R}$, $\theta_{i} \in \mathbb{R}^{p_i}$ and $\theta_{i}^\top \Sigma_{ii} \theta_{i} = 1$ for all $i$.

Fixing the blocks $\Sigma_{ii}$ and $\lambda$, we see that $\Sigma$ is fully characterized by $\theta = (\theta_1, \ldots, \theta_k)$, so we write $\Sigma_{\theta}$ for the block symmetric matrix characterized by $\theta$.
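For concreteness, this parametrization can be sketched in numpy (the helper names `normalize` and `build_sigma` are mine, not standard):

```python
import numpy as np

def normalize(theta_i, S_ii):
    """Rescale theta_i so that theta_i^T S_ii theta_i = 1."""
    return theta_i / np.sqrt(theta_i @ S_ii @ theta_i)

def build_sigma(blocks, lam, theta):
    """Assemble Sigma_theta from the diagonal blocks Sigma_ii, the scalar
    lambda, and theta = (theta_1, ..., theta_k), where each theta_i
    satisfies theta_i^T Sigma_ii theta_i = 1."""
    k = len(blocks)
    rows = []
    for i in range(k):
        row = []
        for j in range(k):
            if i == j:
                row.append(blocks[i])
            else:
                # Sigma_ij = lam * Sigma_ii theta_i theta_j^T Sigma_jj
                row.append(lam * blocks[i] @ np.outer(theta[i], theta[j]) @ blocks[j])
        rows.append(row)
    return np.block(rows)
```

The result is automatically symmetric, since $\Sigma_{ji} = \Sigma_{ij}^\top$.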

We consider the following three matrices: let $\Sigma_{u}$ be the matrix characterized by $u = (u_1, \ldots, u_k)$, let $\Sigma_{v}$ be the matrix characterized by $v = (v_1, \ldots, v_k)$, and let $\Sigma_{0}$ be the block diagonal matrix with diagonal blocks $\{\Sigma_{ii}\}$. We want to compute the following determinant

\begin{align} \text{det}(\Sigma_{u}^{-1} + \Sigma_{v}^{-1} - \Sigma_{0}^{-1}). \end{align}

I have calculated this determinant for $k = 2$, where the answer is

\begin{align} \left(\frac{1 - \lambda^2 a b}{1 - \lambda^2}\right)^2 \frac{1}{\text{det}(\Sigma_{11})\text{det}(\Sigma_{22})}, \end{align}

with $a = u_1^\top \Sigma_{11} v_1$ and $b = u_2^\top \Sigma_{22} v_2$.
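As a sanity check (not a proof), the $k = 2$ formula can be verified numerically; the dimensions, seed, and the way I generate SPD blocks below are arbitrary choices:

```python
import numpy as np

rng = np.random.default_rng(1)
p1, p2, lam = 2, 3, 0.4

def spd(p):
    """A well-conditioned random symmetric positive definite p x p matrix."""
    A = rng.standard_normal((p, p))
    return A @ A.T + p * np.eye(p)

S11, S22 = spd(p1), spd(p2)

def norm(t, S):                      # enforce t^T S t = 1
    return t / np.sqrt(t @ S @ t)

def sigma(t1, t2):                   # Sigma_theta for k = 2
    off = lam * S11 @ np.outer(t1, t2) @ S22
    return np.block([[S11, off], [off.T, S22]])

u = [norm(rng.standard_normal(p1), S11), norm(rng.standard_normal(p2), S22)]
v = [norm(rng.standard_normal(p1), S11), norm(rng.standard_normal(p2), S22)]

Sig_u, Sig_v = sigma(*u), sigma(*v)
Sig_0 = np.block([[S11, np.zeros((p1, p2))], [np.zeros((p2, p1)), S22]])

M = np.linalg.inv(Sig_u) + np.linalg.inv(Sig_v) - np.linalg.inv(Sig_0)
lhs = np.linalg.det(M)

a = u[0] @ S11 @ v[0]
b = u[1] @ S22 @ v[1]
rhs = ((1 - lam**2 * a * b) / (1 - lam**2))**2 \
      / (np.linalg.det(S11) * np.linalg.det(S22))

print(lhs, rhs)  # the two values agree up to floating-point error
```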

How can I compute the determinant for general $k$?

Thank you!

  • For better legibility, I'd suggest that you replace $\theta^{1} = (\theta^{1}_1, \ldots, \theta^{1}_k)$ by e.g. $V = (v_1, \ldots, v_k)$ and $\theta^{2} = (\theta^{2}_1, \ldots, \theta^{2}_k)$ by $W = (w_1, \ldots, w_k)$, where the $v_i$ and $w_i$ are vectors of length $p_i$. Using upper indexes is not a good idea when powers are also involved! So you'd have $a = v_1^{\top} \Sigma_{11} w_1$ (I suppose, not $w_1^{\top}$), etc. Commented Apr 22, 2016 at 8:57
  • Thank you for your suggestion. I have changed the notation. Commented Apr 22, 2016 at 18:19

1 Answer


Perhaps you should write this more geometrically. Denote by $(-,-)_i$ the canonical Euclidean inner product on $\newcommand{\bR}{\mathbb{R}}$ $\bR^{p_i}$. Then each $\Sigma_{ii}$ can be identified with a symmetric positive operator, and $\Sigma_{ij}: \bR^{p_j}\to\bR^{p_i}$ is the operator with the invariant description

$$\Sigma_{ij}=\lambda \Sigma_{ii}\circ \theta_i\otimes \theta_j^T\circ \Sigma_{jj}. $$

Now fix $(-,-)_i$-orthonormal bases of $\bR^{p_i}$ in which all the operators $\Sigma_{ii}$ are diagonal. Perform the computations in this case and express the result in invariant terms.

Try to understand what happens in the "simplest" case $p_1=\cdots=p_k=1$.
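Following this suggestion, here is a small numerical experiment for the scalar case (the function names are my own). When $p_i = 1$ and $\Sigma_{ii} = s_i > 0$, the constraint $\theta_i^\top \Sigma_{ii} \theta_i = 1$ forces $\theta_i = \varepsilon_i/\sqrt{s_i}$ with $\varepsilon_i = \pm 1$, so each parametrizing vector is just a sign pattern:

```python
import numpy as np
from itertools import product

lam = 0.3

def sigma_scalar(eps, s, lam):
    """Sigma_theta in the scalar case p_i = 1: theta_i = eps_i / sqrt(s_i),
    so Sigma_ij = lam * eps_i * eps_j * sqrt(s_i * s_j) for i != j."""
    r = np.sqrt(s)
    S = lam * np.outer(eps * r, eps * r)
    np.fill_diagonal(S, s)
    return S

def target_det(eps, delta, s, lam):
    """det(Sigma_u^{-1} + Sigma_v^{-1} - Sigma_0^{-1}) for sign patterns eps, delta."""
    Su = sigma_scalar(eps, s, lam)
    Sv = sigma_scalar(delta, s, lam)
    S0 = np.diag(s)
    M = np.linalg.inv(Su) + np.linalg.inv(Sv) - np.linalg.inv(S0)
    return np.linalg.det(M)

# k = 3: enumerate the sign patterns and look for structure in the output
s = np.array([1.5, 0.7, 2.0])
for eps in product([1, -1], repeat=3):
    print(eps, target_det(np.array(eps), np.ones(3), s, lam))
```

For $k = 2$ this reproduces the formula in the question with $a = \varepsilon_1 \delta_1$ and $b = \varepsilon_2 \delta_2$.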

  • Can you explain more on that? I'm not familiar with the terms you used, especially the "big" equation you used. Thank you very much! Commented Apr 22, 2016 at 18:20
  • Don't worry about the meaning of $\otimes$. What I wrote is equivalent to your definition. My not-really-an-answer makes two points: 1) it suffices to assume the matrices $\Sigma_{ii}$ are diagonal; 2) even the simplest case $p_1=\cdots =p_k=1$ is nontrivial. Commented Apr 23, 2016 at 15:42
