Determine all $n\times n$ matrices $A$ with the following property:
For any square submatrix $B$ of $A$, $\det(B)\det(\bar B)=0$. Here, $\bar B$ is the complementary matrix of $B$ obtained by deleting from $A$ the rows and columns containing entries of $B$.
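To make the condition concrete, here is a brute-force check of the property for a given matrix (the helper names `complementary_minor_products` and `has_property` are mine, and the tolerance is an arbitrary choice for floating-point zero-testing):

```python
import itertools
import numpy as np

def complementary_minor_products(A):
    """Yield det(B) * det(B_bar) for every square submatrix B = A[I, J]
    with 1 <= |I| = |J| <= n-1, where B_bar = A[I^c, J^c] is the
    complementary submatrix obtained by deleting the rows and columns of B."""
    n = A.shape[0]
    for k in range(1, n):  # sizes 1 .. n-1, so both B and B_bar are nonempty
        for I in itertools.combinations(range(n), k):
            Ic = [i for i in range(n) if i not in I]
            for J in itertools.combinations(range(n), k):
                Jc = [j for j in range(n) if j not in J]
                B = A[np.ix_(I, J)]
                Bbar = A[np.ix_(Ic, Jc)]
                yield np.linalg.det(B) * np.linalg.det(Bbar)

def has_property(A, tol=1e-9):
    return all(abs(p) < tol for p in complementary_minor_products(A))

# A matrix with a zero row trivially satisfies the property: every
# complementary pair (B, B_bar) puts the zero row into one of the two factors.
A = np.array([[1.0, 2.0, 3.0],
              [0.0, 0.0, 0.0],
              [4.0, 5.0, 6.0]])
print(has_property(A))  # prints True
```

The identity matrix fails the check already at $k=1$: taking $B$ to be the $(1,1)$ entry gives $\det(B)\det(\bar B)=1$.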
By the generalized Laplace expansion, this property automatically implies $\det(A)=0$. There are some easy-to-see sufficient conditions that guarantee it. For instance, $A$ has a zero row (or column); or $\operatorname{rank} A < n/2$ (in this case, for every complementary pair, at least one of $B$ and $\bar B$ has size greater than $\operatorname{rank} A$, hence zero determinant).
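For reference, the implication follows by expanding along any fixed set $I$ of $k$ rows (with $\sigma(S)$ denoting the sum of the indices in $S$):

```latex
\det(A) \;=\; \sum_{\substack{J \subseteq \{1,\dots,n\} \\ |J| = k}}
  (-1)^{\sigma(I)+\sigma(J)} \, \det(A_{I,J}) \, \det(A_{I^c,J^c}).
```

Every term on the right is, up to sign, a product $\det(B)\det(\bar B)$ for a complementary pair, so if all such products vanish, then $\det(A)=0$.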
Motivation
I have a collection of polynomials $\{p_i\}_{i=1}^m$ such that, for each $p$ in the collection with monomial expansion $\sum a_I z_I$, the coefficients $a_I$ are expressed in terms of the entries of the matrix $A$ above. My goal is to show that these polynomials are linearly dependent. What I already have is an identity $\sum c_i p_i=0$, where each $c_i$ is a scalar of the form $\det(B)\det(\bar B)$. If some $c_i$ is nonzero, then I am done. Otherwise all the products $\det(B)\det(\bar B)$ vanish, and I hope this condition is strong enough to force linear dependence as well.