
$\DeclareMathOperator\SO{SO}\DeclareMathOperator\PL{PL}$I was trying to reduce the Hadamard problem of calculating the maximum value of the determinant of a $\{1,-1\}$-matrix to the problem of maximizing the 1-norm over the set of orthogonal matrices.

The issue is that, in the process, I make some assumptions that look true but that I do not know how to prove or disprove. They are the three hypotheses below.

To this end, consider the sets:

\begin{align*}V_{n^2}&=\{ A\in M_n(\mathbb{R}); |A_{ij}|=1 \text{ for all } i,j \}\\ n\mathbb{S}^{n^2 -1}&=\{ A\in M_n(\mathbb{R}) ; \| A\|=n \} \\ \sqrt{n} \SO_n &=\{ \sqrt{n}A; A\in \SO_n \} \end{align*}

where $M_n(\mathbb{R})$ is the set of real $n \times n$ matrices, $\|A\|$ is the Frobenius norm of matrix $A$, and $\SO_n$ is the special orthogonal group. It is very important to note for what follows that $\sqrt{n} \SO_n \subset n\mathbb{S}^{n^2 -1}$ and $V_{n^2}\subset n\mathbb{S}^{n^2 -1}$.
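As a quick numerical sanity check (not part of the argument; the construction of a random element of $\SO_n$ via QR is my own choice), both inclusions amount to saying that sign matrices and scaled rotations have Frobenius norm $n$:

```python
import numpy as np

rng = np.random.default_rng(0)
n = 5

# A random sign matrix in V_{n^2}: Frobenius norm sqrt(n^2) = n
A = rng.choice([-1.0, 1.0], size=(n, n))
assert np.isclose(np.linalg.norm(A, 'fro'), n)

# A random element of sqrt(n)*SO_n, built from the QR factorization
# of a Gaussian matrix
Q, R = np.linalg.qr(rng.standard_normal((n, n)))
Q *= np.sign(np.diag(R))      # fix column signs
if np.linalg.det(Q) < 0:      # force det = +1, so Q lies in SO_n
    Q[:, 0] *= -1
B = np.sqrt(n) * Q
assert np.isclose(np.linalg.norm(B, 'fro'), n)
```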

Our next point is that the maximum value of $\det(A)$ for a matrix in $n\mathbb{S}^{n^2-1}$ is $n^{\frac{n}{2}}$, which occurs only when $A\in \sqrt{n} \SO_n$. This leads us to the first hypothesis:


Hypothesis $1$.$\,$ Let $A\in V_{n^2}$. The value of $\det(A)$ will be maximal when the distance (in the Frobenius norm) between $A$ and $\sqrt{n}\SO_n$ is minimal.
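For completeness, the maximality claim preceding Hypothesis 1 follows from the AM–GM inequality applied to the squared singular values $\sigma_1^2, \dots, \sigma_n^2$ of $A$:

$$\det(A)^2 = \prod_{i=1}^n \sigma_i^2 \leq \left( \frac{1}{n} \sum_{i=1}^n \sigma_i^2 \right)^{\!n} = \left( \frac{\|A\|^2}{n} \right)^{\!n} = n^n,$$

with equality exactly when all $\sigma_i = \sqrt{n}$, that is, when $\frac{1}{\sqrt{n}}A$ is orthogonal; the condition $\det(A) = n^{n/2} > 0$ then forces $\frac{1}{\sqrt{n}}A \in \SO_n$.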


Even if Hypothesis 1 is correct, finding the matrix $A$ that is as close as possible to $\sqrt{n}\SO_n$ does not seem any easier. However, we can look at the problem backward and search for the matrix $B \in \sqrt{n} \SO_n$ whose distance to $V_{n^2}$ is minimal; in this case, the matrix $A \in V_{n^2}$ closest to $B$ is given by $A_{ij} = \mathrm{sgn}(B_{ij})$, where the sgn function is defined as:

$$ \operatorname{sgn}(t) := \begin{cases} +1, & t \geq 0 \\ -1, & t < 0 \end{cases} $$
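Since the Frobenius distance decouples over entries, the entry-wise sign really is the nearest point of $V_{n^2}$. A minimal sketch (the function names are mine):

```python
import numpy as np

def sgn(t):
    # sgn as defined above: +1 for t >= 0, -1 for t < 0
    return np.where(t >= 0, 1.0, -1.0)

def nearest_sign_matrix(B):
    # Frobenius-nearest point of V_{n^2}: project each entry independently
    return sgn(B)

# Sanity check: sgn(B) is at least as close to B as a random sign matrix
rng = np.random.default_rng(1)
B = rng.standard_normal((4, 4))
A = nearest_sign_matrix(B)
C = rng.choice([-1.0, 1.0], size=(4, 4))
assert np.linalg.norm(B - A, 'fro') <= np.linalg.norm(B - C, 'fro')
```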

Perhaps defining $\mathrm{sgn}(0)$ was unnecessary. To see why, let us define the set:

$$\PL_{n^2}=\{A\in M_n(\mathbb{R}); A_{ij}=0 \text{ for at least one pair } i,j\}$$

that is, $\PL_{n^2}$ is the set of matrices possessing at least one null entry. The points in $n\mathbb{S}^{n^2-1}$ that are at the greatest possible distance from the set $\PL_{n^2}$ are precisely those in the set $V_{n^2}$. Thus, we are led to the hypothesis that if we maximize the distance from $B \in \SO_n$ to the set $\PL_{n^2}$, we will be minimizing the distance from $B$ to $V_{n^2}$. Observing that the distance from a matrix $A$ to $\PL_{n^2}$ is given by $\min\{|A_{i,j}|; 1\leq i,j\leq n\}$, our second hypothesis can be formalized as:


Hypothesis $2$.$\,$ The matrices $B \in \SO_n$ for which the distance to $V_{n^2}$ is minimal coincide with the matrices for which $f(B) = \min\left\{ |B_{i,j}|; 1\leq i,j \leq n \right\}$ is maximal.
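The distance formula used above can be illustrated with a small sketch (the helper name is mine): zeroing the entry of smallest magnitude produces a matrix in $\PL_{n^2}$ that realizes the distance.

```python
import numpy as np

def dist_to_PL(A):
    # Frobenius distance from A to the matrices with at least one zero entry
    return np.min(np.abs(A))

A = np.array([[3.0, -1.0],
              [2.0,  4.0]])
A0 = A.copy()
A0[0, 1] = 0.0   # zero out the entry of smallest magnitude
assert np.isclose(np.linalg.norm(A - A0, 'fro'), dist_to_PL(A))
```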


Note then that if Hypothesis 2 is correct, the entries of $B$ will all be non-zero. How, then, can we find the matrix $B$? One way would be to find the maximum points of the function $f(B)$ on $\SO_n$, but the formula defining this function is not very friendly. Another strategy is to observe that the matrices in $V_{n^2}$ are also the maximum points in $n\mathbb{S}^{n^2-1}$ of the entry-wise sum norm:

$$|A|_1 = \sum_{i,j=1}^n |A_{ij}|$$

which leads us to the third hypothesis:


Hypothesis $3$.$\,$ The matrices $B \in \SO_n$ for which the distance to $V_{n^2}$ is minimal coincide with the matrices for which the norm $|B|_1$ is maximal.


If Hypotheses 1 and 3 are correct, the problem of finding a matrix $A \in V_{n^2}$ with maximum determinant can be converted into the problem of finding a matrix $B \in \SO_n$ with maximum $|B|_1$. Now, if Hypothesis 2 is correct, the extreme points of the sum norm in $\SO_n$ will be points where this function is differentiable, allowing us to use methods from calculus and analysis to find them.

As an example, I tried using the map $\exp: \mathfrak{o}(n) \to \SO_n$ and then maximizing the function $g(X) = |\exp(X)|_1$ for $X \in \mathfrak{o}(n)$, requiring its gradient to be a symmetric matrix, since it would then be orthogonal (with respect to the Frobenius inner product) to $\mathfrak{o}(n)$, the space of skew-symmetric matrices. However, the calculations became very complicated, and I did not know how to proceed. Perhaps a more clever parametrization exists.
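For what it is worth, one can avoid differentiating through $\exp$ by ascending intrinsically: at a matrix $B \in \SO_n$ with no zero entries, the Euclidean gradient of $|B|_1$ is the sign matrix of $B$, and projecting it onto the tangent space at $B$ gives a skew-symmetric direction that the matrix exponential retracts back onto $\SO_n$. A rough sketch, where the step size, iteration count, and all names are my ad-hoc choices:

```python
import numpy as np
from scipy.linalg import expm

def sum_norm(B):
    return np.abs(B).sum()

def ascent_step(B, eta=0.05):
    G = np.sign(B)            # Euclidean gradient of |B|_1 (no zero entries)
    M = B.T @ G
    X = (M - M.T) / 2         # tangent direction, in skew-symmetric form
    return B @ expm(eta * X)  # retraction back onto SO(n)

# Random starting point in SO(4)
rng = np.random.default_rng(2)
n = 4
Q, R = np.linalg.qr(rng.standard_normal((n, n)))
B = Q * np.sign(np.diag(R))
if np.linalg.det(B) < 0:
    B[:, 0] *= -1

vals = [sum_norm(B)]
for _ in range(200):
    B = ascent_step(B)
    vals.append(sum_norm(B))

assert np.allclose(B.T @ B, np.eye(n), atol=1e-8)  # iterates stayed orthogonal
assert vals[-1] > vals[0]                           # the sum norm increased
```

Near a critical point the projected gradient $X$ vanishes (e.g. at $B = H/2$ for a $4 \times 4$ Hadamard matrix $H$, where $B^\top \operatorname{sign}(B) = 2I$ is symmetric), so the steps shrink on their own.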

In any case, if the three hypotheses are correct, we do not need the exact matrix $B \in \SO_n$ that maximizes $|B|_1$. If we have a numerical method that generates a sequence of matrices $\{B(k)\}_{k \in \mathbb{N}}$ such that $B = \lim_{k \to \infty} B(k)$ (such as gradient ascent, which is applicable since the sum norm is differentiable at the matrices of interest), then for some $N$, we will have:

$$k \geq N \implies \operatorname{sgn}(B(k)_{ij}) = \operatorname{sgn}(B_{ij}) \text{ for all } i,j$$

and thus the matrix $A$ given by $A_{ij} = \operatorname{sgn}(B(k)_{ij})$ will have the maximum determinant for matrices in $V_{n^2}$. Note that even if the hypotheses are correct, this method does not give us the maximum value of the determinant in $V_{n^2}$, but rather produces a matrix in that set whose determinant is maximal.
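For $n = 2$, the whole pipeline can be checked by hand: on $\SO_2$, the sum norm of the rotation by $\theta$ is $2(|\cos\theta| + |\sin\theta|)$, maximal at $\theta = \pi/4$, and taking entry-wise signs yields a matrix of maximal determinant $2 = n^{n/2}$. A minimal sketch:

```python
import numpy as np

t = np.pi / 4
B = np.array([[np.cos(t), -np.sin(t)],
              [np.sin(t),  np.cos(t)]])   # maximizer of |B|_1 in SO(2)
A = np.where(B >= 0, 1.0, -1.0)           # A_ij = sgn(B_ij)
assert np.isclose(np.linalg.det(A), 2.0)  # = n^{n/2} for n = 2
```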

  • I'm sorry to be dense, but what is the question? (Commented Feb 17 at 5:17)
  • @LSpice some people use "hypothesis" in the German way, i.e., not in the sense of "assumption" but in the sense of "conjecture", and I assume that the question is whether the three "hypotheses" are true. (Commented Feb 17 at 7:27)
  • This isn't an answer, but local maximizers of $|A|_1$ in the orthogonal group have been explored for the purposes of trying to learn about Hadamard matrices: arxiv.org/abs/1202.2025 (Commented Feb 17 at 13:57)
  • @LSpice I made an edit to make the problem more obvious, but as Ycor says, the problem is to prove or refute the hypotheses I make. (Commented Feb 17 at 14:10)
  • Doesn't Hypothesis 1 follow from Hadamard's inequality? (Commented Feb 20 at 0:18)
