
Determine all $n\times n$ matrices $A$ with the following property:

For any square submatrix $B$ of $A$, $\det(B)\det(\bar B)=0$. Here, $\bar B$ is the complementary submatrix of $B$, obtained by deleting from $A$ the rows and columns containing entries of $B$.

By Laplace expansion, this property automatically implies $\det(A)=0$. There are some easy-to-see sufficient conditions that guarantee the property. For instance, $A$ has a zero row (or column); or $\operatorname{rank} A$ is less than half of $n$ (in this case, at least one of $B$ and $\bar B$ has size greater than $\operatorname{rank} A$, and hence zero determinant).
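To make the rank condition concrete, here is a small numerical sketch (not part of the original question) that enumerates all proper square submatrices of a rank-one $4\times 4$ matrix and checks $\det(B)\det(\bar B)=0$ up to floating-point tolerance; the matrix $A=uu^\top$ and the tolerance are illustrative choices.

```python
# Sketch: verify det(B)*det(Bbar) = 0 for every proper square submatrix
# of a matrix with rank 1 < n/2. A = u u^T is an arbitrary rank-one example.
import numpy as np
from itertools import combinations

n = 4
u = np.array([1.0, 2.0, -1.0, 3.0])
A = np.outer(u, u)  # rank(A) = 1 < n/2, so the property should hold

for k in range(1, n):  # k = size of B; Bbar has size n - k
    for rows in combinations(range(n), k):
        for cols in combinations(range(n), k):
            B = A[np.ix_(rows, cols)]
            crows = [i for i in range(n) if i not in rows]
            ccols = [j for j in range(n) if j not in cols]
            Bbar = A[np.ix_(crows, ccols)]
            # at least one of B, Bbar has size > rank(A), hence det ~ 0
            prod = np.linalg.det(B) * np.linalg.det(Bbar)
            assert abs(prod) < 1e-9, (rows, cols, prod)
print("property verified for all proper square submatrices")
```

The check succeeds because whichever of $B$, $\bar B$ has size exceeding $\operatorname{rank} A = 1$ is singular.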


Motivation

I have a collection of polynomials $\{p_i\}_{i=1}^m$ such that for each $p$ in the collection, with monomial expansion $\sum a_I z_I$, the coefficients $a_I$ are expressed in terms of the entries of the matrix $A$ above. My goal is to show that these polynomials are linearly dependent. What I already have is an identity $\sum c_i p_i=0$, where each $c_i$ is a scalar of the form $\det(B)\det(\bar B)$. If some $c_i$ is nonzero, I am done. Otherwise, every $\det(B)\det(\bar B)=0$, and I hope this condition is strong enough to force linear dependence as well.
