The maximal and minimal ranks of matrix expression with applications
Journal of Inequalities and Applications volume 2012, Article number: 54 (2012)
Abstract
We give in this article the maximal and minimal ranks of the matrix expression A - B1V1C1 - B2V2C2 - B3V3C3 - B4V4C4 with respect to V1, V2, V3, and V4. As applications, we derive the extremal ranks of the generalized Schur complement A - BM(1)C - DN(1)G and the partial matrix (A BM(1)C DN(1)G) with respect to the generalized inverses M(1) ∈ M{1} and N(1) ∈ N{1}.
AMS classifications: 15A03; 15A09; 15A24.
1 Introduction
Let Cm×n be the set of all m × n matrices with complex entries. I_n denotes the identity matrix of order n and O_{m×n} denotes the m × n matrix of all zero entries (when no confusion can occur, we omit the subscripts). For a given matrix A ∈ Cm×n, the symbols A* and r(A) stand for the conjugate transpose and the rank of A, respectively. Recall that a generalized inverse X ∈ Cn×m of A ∈ Cm×n is a matrix which satisfies some of the following four Penrose equations [1]:

(1) AXA = A; (2) XAX = X; (3) (AX)* = AX; (4) (XA)* = XA.
For a subset {i,j,...,k} of the set {1,2,3,4}, the set of n × m matrices satisfying the equations (i), (j), ..., (k) from among the above four Penrose equations (1)-(4) is denoted by A{i,j,...,k}. A matrix X from A{i,j,...,k} is called an {i,j,...,k}-inverse of A and is denoted by A(i,j,...,k). In particular, an n × m matrix X of the set A{1} is called a g-inverse of A and denoted by A(1). The unique {1,2,3,4}-inverse of A is denoted by A†, which is called the Moore-Penrose inverse of A. Throughout this article, the abbreviated symbols E_A and F_A stand for the two projectors E_A = I - AA† and F_A = I - A†A induced by A, respectively. We refer the reader to [2, 3] for basic results on generalized inverses.
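The defining properties above are easy to verify numerically. The following numpy sketch (our own illustration, not part of the original text) checks the four Penrose equations for A† and the projector identities for E_A and F_A:

```python
import numpy as np

# A rank-deficient 4x3 matrix: its Moore-Penrose inverse A† is the unique
# matrix satisfying all four Penrose equations.
rng = np.random.default_rng(0)
A = rng.standard_normal((4, 2)) @ rng.standard_normal((2, 3))  # rank 2
Apinv = np.linalg.pinv(A)

# The four Penrose equations: (1) AXA = A, (2) XAX = X,
# (3) (AX)* = AX, (4) (XA)* = XA.
assert np.allclose(A @ Apinv @ A, A)
assert np.allclose(Apinv @ A @ Apinv, Apinv)
assert np.allclose((A @ Apinv).conj().T, A @ Apinv)
assert np.allclose((Apinv @ A).conj().T, Apinv @ A)

# The projectors E_A = I - AA† and F_A = I - A†A are idempotent and
# annihilate A from the appropriate side: E_A A = O and A F_A = O.
E_A = np.eye(4) - A @ Apinv
F_A = np.eye(3) - Apinv @ A
assert np.allclose(E_A @ E_A, E_A) and np.allclose(F_A @ F_A, F_A)
assert np.allclose(E_A @ A, 0) and np.allclose(A @ F_A, 0)
print("all Penrose and projector identities verified")
```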
Given a matrix with some variant entries (often called a partial matrix), or a matrix expression with some variant matrices in it, the rank of the partial matrix or matrix expression varies with the variant entries or variant matrices. Because the rank of a matrix is an integer between 0 and the minimum of its numbers of rows and columns, the maximal and minimal ranks of a partial matrix or matrix expression with respect to its variant entries or variant matrices must exist. Many problems in matrix theory and its applications are closely related to the maximal and minimal possible ranks of matrix expressions with variant entries. For example, a matrix equation AXB = C is consistent if and only if the minimal rank of C - AXB with respect to X is zero, see [4-6]; there is a matrix X such that the partial matrix AXB of order n is nonsingular if and only if the maximal rank of AXB with respect to X is n, see [7-11].
The maximal and minimal ranks of a matrix expression or partial matrix are two basic quantities in matrix theory for describing the dimension of the row or column space of the expression; both are well understood and easy to compute by the well-known elementary (or congruent) matrix operations, see [5, 7, 8, 10-16]. These two quantities play an essential role in characterizing algebraic properties of matrix expressions or partial matrices, and they have long been central objects of study in matrix theory and its applications. Earlier systematic studies of the maximal and minimal ranks of matrix expressions or partial matrices and their applications can be found in [17-20]. In recent years, these problems were reconsidered by means of some tricky operations on block matrices and generalized inverses of matrices, leading to many new formulas for the maximal and minimal ranks of matrix expressions or partial matrices, together with their applications, see [4, 6, 9, 21-28].
In this article, given matrices A ∈ Cm×n, B_i ∈ Cm×p_i, and C_i ∈ Cq_i×n, i = 1, 2, 3, 4, we present the maximal and minimal ranks of the matrix expression A - B1V1C1 - B2V2C2 - B3V3C3 - B4V4C4 with respect to V1, V2, V3, and V4. As applications, the maximal and minimal ranks of the generalized Schur complement A - BM(1)C - DN(1)G and the partial matrix (A BM(1)C DN(1)G) with respect to the generalized inverses M(1) ∈ M{1} and N(1) ∈ N{1} are also considered. The results in this article extend earlier work by various authors, see, e.g., [4-6, 11, 16, 18, 21, 25, 26].
We first introduce some well-known results which will be used in this article.

Lemma 1.1. Let p(X, Y) = A - BX - YC, where A ∈ Cm×n, B ∈ Cm×k, and C ∈ Cl×n are given, and X ∈ Ck×n and Y ∈ Cm×l are two variant matrices. Then

(1) max r(A - BX - YC) = min{ m, n, r([A B; C O]) },

(2) min r(A - BX - YC) = r([A B; C O]) - r(B) - r(C),

where the maximum and the minimum are taken over X and Y.
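The two classical rank formulas of Lemma 1.1 — the maximal rank min{m, n, r([A B; C O])} and the minimal rank r([A B; C O]) - r(B) - r(C) of A - BX - YC — can be checked numerically. In the numpy sketch below (the helper `rank` is ours), random choices of X and Y attain the maximal rank almost surely, while every choice respects the minimal rank as a lower bound:

```python
import numpy as np

def rank(M):
    return np.linalg.matrix_rank(M)

rng = np.random.default_rng(2)
m, n, k, l = 5, 6, 2, 2
A = rng.standard_normal((m, n))
B = rng.standard_normal((m, k))
C = rng.standard_normal((l, n))

# The block matrix [A B; C O] appearing in both formulas of Lemma 1.1.
block = np.block([[A, B], [C, np.zeros((l, k))]])
max_pred = min(m, n, rank(block))
min_pred = rank(block) - rank(B) - rank(C)

# Random X, Y attain the maximal rank almost surely; the minimal rank
# is a valid lower bound for every choice of X and Y.
ranks = [rank(A - B @ rng.standard_normal((k, n))
                - rng.standard_normal((m, l)) @ C)
         for _ in range(50)]
assert max(ranks) == max_pred
assert all(r >= min_pred for r in ranks)
print("max rank:", max_pred, " min rank (lower bound):", min_pred)
```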
Lemma 1.2 [2]. Let A ∈ Cm×n. Then the {1}-inverses of A can be written as

A(1) = A† + F_A W + Z E_A, (3)

where W ∈ Cn×m and Z ∈ Cn×m are arbitrary.
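The parametrization in Lemma 1.2 is easy to test: since A F_A = O and E_A A = O, every choice of W and Z yields a matrix satisfying Penrose equation (1). A minimal numpy sketch (our own illustration):

```python
import numpy as np

rng = np.random.default_rng(3)
A = rng.standard_normal((4, 2)) @ rng.standard_normal((2, 3))  # 4x3, rank 2
Ap = np.linalg.pinv(A)
E_A = np.eye(4) - A @ Ap
F_A = np.eye(3) - Ap @ A

# Every choice of W, Z gives a {1}-inverse X = A† + F_A W + Z E_A, because
# A X A = A A†A + (A F_A) W A + A Z (E_A A) = A + O + O = A.
for _ in range(5):
    W = rng.standard_normal((3, 4))
    Z = rng.standard_normal((3, 4))
    X = Ap + F_A @ W + Z @ E_A
    assert np.allclose(A @ X @ A, A)   # Penrose equation (1)
print("every sampled X is a {1}-inverse of A")
```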
Lemma 1.3 [9]. Let A ∈ Cm×n, B ∈ Cm×k, and C ∈ Cl×n. Then

r([A B]) = r(A) + r(E_A B),

r([A; C]) = r(A) + r(C F_A),

where E_A = I_m - AA† and F_A = I_n - A†A.
2 The maximal and minimal ranks of A - B1V1C1 - B2V2C2 - B3V3C3 - B4V4C4
In this section, we will present the maximal and minimal ranks of the linear matrix expression

P(V1, V2, V3, V4) = A - B1V1C1 - B2V2C2 - B3V3C3 - B4V4C4, (4)

where A ∈ Cm×n, B_i ∈ Cm×p_i, and C_i ∈ Cq_i×n are given matrices, with respect to the four variant matrices V_i ∈ Cp_i×q_i, i = 1, 2, 3, 4. Applying the formula (1) in Lemma 1.1 to the linear matrix expression in (4) and simplifying, we obtain the following result.
Theorem 2.1 Let P(V1, V2, V3, V4) be given as (4). Then
where
Proof. It is easy to verify by block Gaussian elimination that the rank of P(V1,V2,V3, V4) in (4) can be expressed as
where the block matrices T and S are as displayed above, and I_{p_i} and I_{q_i} denote the identity matrices of order p_i and q_i, respectively.
and
According to this result, we have
Then applying the formula (1) in Lemma 1.1 to matrix T, we have
where
Again applying the formula (1) in Lemma 1.1, we get
and
Substituting (8)-(11) into (7) and (6) yields (5).
Recall the simple fact that a matrix equation AXB = C holds for every variant matrix X if and only if the maximal rank of C - AXB with respect to X is zero. Thus, by Theorem 2.1 we immediately obtain the following result.
Corollary 2.2 Let P(V1,V2,V3, V4) be given as (4). Then the matrix equation A = B1V1C1 + B2V2C2 + B3V3C3 + B4V4C4 holds for any V1, V2, V3, and V4 if and only if T1 = O or T2 = O or T3 = O or T4 = O.
Because the right-hand side of (5) is composed entirely of ranks of block matrices, it can easily be simplified by block Gaussian elimination when the given matrices in (4) satisfy certain restrictions.
Theorem 2.3 Let P(V1, V2, V3, V4) be given as (4) and let R(B1) ⊆ R(B2), R(B3) ⊆ R(B4), R(C2*) ⊆ R(C1*), and R(C4*) ⊆ R(C3*). Then
where
Proof. In fact, under the hypotheses of Theorem 2.3 we can write B1 = B2X, B3 = B4Y, C2 = ZC1, and C4 = WC3. In this case, we have
and
and
and
Combining (5) with (13)-(16) yields (12).
Corollary 2.4 Let P(V1, V2, V3, V4) be given as (4) and let R(B1) ⊆ R(B2), R(B3) ⊆ R(B4), R(C2*) ⊆ R(C1*), and R(C4*) ⊆ R(C3*). Then the matrix equation A = B1V1C1 + B2V2C2 + B3V3C3 + B4V4C4 holds for any V1, V2, V3, and V4 if and only if τ1 = O or τ2 = O or τ3 = O.
In the rest of this section, we will find the minimal rank of the linear matrix expression P(V1, V2, V3, V4) in (4) with respect to the four variant matrices V_i, i = 1, 2, 3, 4, when the given matrices in (4) satisfy some restrictions.
Theorem 2.5 Let P(V1, V2, V3, V4) be given as (4) and let R(B1) ⊆ R(B2), R(B3) ⊆ R(B4), R(C2*) ⊆ R(C1*), and R(C4*) ⊆ R(C3*). Then
where
Proof. From the proof of Theorem 2.1, it is easy to verify that the minimal rank of P(V1, V2, V3, V4) in (4) can be expressed as
where T, S, E_i, p_i, and q_i, i = 1, 2, 3, 4, are given as in the proof of Theorem 2.1. Then applying the formula (2) in Lemma 1.1 to the matrix T, we have
In this case, we derive from (19) that
Again applying the formula (2) in Lemma 1.1, we have
where S3 is given in Equation (7) of the proof of Theorem 2.1. Since B1 = B2X, B3 = B4Y, C2 = ZC1, and C4 = WC3, (21) reduces to
The last equality holds by the well-known Frobenius rank inequality r(ABC) ≥ r(AB) + r(BC) - r(B); hence
By a similar method, we also have
On the other hand, by the formula (1) in Lemma 1.1, we have
Combining (18), (20), and (22)-(29) yields (17).
Corollary 2.6 Let P(V1, V2, V3, V4) be given as (4) and let R(B1) ⊆ R(B2), R(B3) ⊆ R(B4), R(C2*) ⊆ R(C1*), and R(C4*) ⊆ R(C3*). Then the matrix equation A = B1V1C1 + B2V2C2 + B3V3C3 + B4V4C4 is consistent if and only if the right-hand side of (17) is zero.
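The Frobenius rank inequality r(ABC) ≥ r(AB) + r(BC) - r(B) invoked in the proof of Theorem 2.5 can be spot-checked numerically; a short numpy sketch of our own:

```python
import numpy as np

def rank(M):
    return np.linalg.matrix_rank(M)

rng = np.random.default_rng(5)
# Frobenius rank inequality: r(ABC) >= r(AB) + r(BC) - r(B),
# checked on random triples with a deliberately rank-deficient middle factor.
for _ in range(20):
    A = rng.standard_normal((5, 4))
    B = rng.standard_normal((4, 2)) @ rng.standard_normal((2, 4))  # rank <= 2
    C = rng.standard_normal((4, 6))
    assert rank(A @ B @ C) >= rank(A @ B) + rank(B @ C) - rank(B)
print("Frobenius rank inequality verified on 20 random triples")
```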
3 Some applications to generalized Schur complement and partial matrix
As direct applications of the results in Section 2, we determine in this section the maximal and minimal ranks of the generalized Schur complement A-BM(1)C-DN(1)G and the partial matrix (A BM(1)C DN(1)G) with respect to two variant matrices M(1) ∈ M{1} and N(1) ∈ N{1}.
Theorem 3.1 Let A ∈ Cm × n, B ∈ Cm × p, C ∈ Cq × n, D ∈ Cm × s, G ∈ Ct × n, M ∈ Cq × p, and N ∈ Ct × s.
Then
where
Proof. Applying Lemma 1.2, we have

M(1) = M† + F_M W1 + W2 E_M (31)

and

N(1) = N† + F_N W3 + W4 E_N, (32)

where W_i, i = 1, 2, 3, 4, are arbitrary, E_M = I_q - MM†, F_M = I_p - M†M, E_N = I_t - NN†, and F_N = I_s - N†N. Substituting the Equations (31) and (32) into the generalized Schur complement A - BM(1)C - DN(1)G yields

A - BM(1)C - DN(1)G = A1 - BF_M W1 C - BW2 E_M C - DF_N W3 G - DW4 E_N G, (33)

where A1 = A - BM†C - DN†G.
In fact, A1 - BF_M W1 C - BW2 E_M C - DF_N W3 G - DW4 E_N G is a special case of the matrix expression P(V1, V2, V3, V4), with R(BF_M) ⊆ R(B), R(DF_N) ⊆ R(D), R((E_M C)*) ⊆ R(C*), and R((E_N G)*) ⊆ R(G*). In this case, from the formula (12) in Theorem 2.3, we have
where
For T1′, simplifying the ranks of matrices by Lemma 1.3 and block Gaussian elimination, we find that:
Combining the rank equalities (35) and (36) with (37), we obtain the simplified form of T1′. By a similar approach, the remaining terms can be simplified analogously. This completes the proof of the theorem.
Corollary 3.2 Let A ∈ Cm×n, B ∈ Cm×p, C ∈ Cq×n, D ∈ Cm×s, G ∈ Ct×n, M ∈ Cq×p, and N ∈ Ct×s. Then the identity A = BM(1)C + DN(1)G holds for every M(1) ∈ M{1} and N(1) ∈ N{1} if and only if the maximal rank given in Theorem 3.1 is zero.
From the proof of Theorem 3.1, we know that A - BM(1)C - DN(1)G = A1 - BF_M W1 C - BW2 E_M C - DF_N W3 G - DW4 E_N G, where A1 = A - BM†C - DN†G. In this case, A - BM(1)C - DN(1)G is a special case of the matrix expression P(V1, V2, V3, V4), with R(BF_M) ⊆ R(B), R(DF_N) ⊆ R(D), R((E_M C)*) ⊆ R(C*), and R((E_N G)*) ⊆ R(G*). Then from Theorem 2.5, we have
Theorem 3.3 Let A ∈ Cm × n, B ∈ Cm × p, C ∈ Cq × n, D ∈ Cm × s, G ∈ Ct × n, M ∈ Cq × p, and N ∈ Ct × s.
Then
where
Corollary 3.4 Let A ∈ Cm×n, B ∈ Cm×p, C ∈ Cq×n, D ∈ Cm×s, G ∈ Ct×n, M ∈ Cq×p, and N ∈ Ct×s. Then the equation A = BM(1)C + DN(1)G is consistent for some M(1) ∈ M{1} and N(1) ∈ N{1} if and only if the right-hand side of (38) is zero.
Next, we will determine the maximal and minimal ranks of the partial matrix
with respect to M(1) ∈ M{1} and N(1) ∈ N{1}, by applying the results in Section 2.
It is quite obvious that the partial matrix (A BM(1)C DN(1)G) may be written as
Then from (39) and Theorems 2.3 and 3.1, we have
Theorem 3.5 Let A ∈ Cm × n, B ∈ Cm × p, C ∈ Cq × n, D ∈ Cm × s, G ∈ Ct × n, M ∈ Cq × p, and N ∈ Ct × s.
Then
where
Corollary 3.6 Let A ∈ Cm×n, B ∈ Cm×p, C ∈ Cq×n, D ∈ Cm×s, G ∈ Ct×n, M ∈ Cq×p, and N ∈ Ct×s. Then the inclusion R(BM(1)C + DN(1)G) ⊆ R(A) holds for every M(1) ∈ M{1} and N(1) ∈ N{1} if and only if the maximal rank given in Theorem 3.5 equals r(A).
On the other hand, from (39) and Theorems 2.5 and 3.3, we can easily obtain the minimal rank of the partial matrix (A BM(1)C DN(1)G) with respect to M(1) ∈ M{1} and N(1) ∈ N{1}.
Theorem 3.7 Let A ∈ Cm × n, B ∈ Cm × p, C ∈ Cq × n, D ∈ Cm × s, G ∈ Ct × n, M ∈ Cq × p, and N ∈ Ct × s.
Then
where
Corollary 3.8 Let A ∈ Cm×n, B ∈ Cm×p, C ∈ Cq×n, D ∈ Cm×s, G ∈ Ct×n, M ∈ Cq×p, and N ∈ Ct×s. Then there exist M(1) ∈ M{1} and N(1) ∈ N{1} such that the inclusion R(BM(1)C + DN(1)G) ⊆ R(A) holds if and only if the right-hand side of (41) is zero.
References
Penrose R: A generalized inverse for matrices. Proc Cambridge Philos Soc 1955, 51: 406–413. 10.1017/S0305004100030401
Ben-Israel A, Greville TNE: Generalized Inverses: Theory and Applications. 2nd edition. Springer-Verlag, New York; 2002.
Wang G, Wei Y, Qiao S: Generalized Inverses: Theory and Computations. Science Press, Beijing; 2004.
Braden HW: The matrix equation ATX ± XTA = B . SIAM J Matrix Anal Appl 1998, 20: 295–302. 10.1137/S0895479897323270
Johnson CR, Whitney GT: Minimum rank completions. Linear and Multilinear Algebra 1991, 28: 271–273. 10.1080/03081089108818051
Tian Y: Ranks of solutions of the matrix equation AXB = C . Linear and Multilinear Algebra 2003, 51: 111–125. 10.1080/0308108031000114631
Bostain AA, Woerdeman HJ: Unicity of minimal rank completions for tri-diagonal partial block matrices. Linear Algebra Appl 2001, 325: 23–25. 10.1016/S0024-3795(00)00253-6
Cohen N, Johnson CR, Rodman L, Woerdeman HJ: Ranks of completions of partial matrices. Oper Theory Adv Appl 1989, 40: 165–185.
Marsaglia G, Styan GPH: Equalities and inequalities for ranks of matrices. Linear and Multilinear Algebra 1974, 2: 269–292. 10.1080/03081087408817070
Woerdeman HJ: Minimal rank completions for block matrices. Linear Algebra Appl 1989, 121: 105–122.
Woerdeman HJ: Minimal rank completions of partial banded matrices. Linear and Multilinear Algebra 1993, 36: 59–69. 10.1080/03081089308818275
Davis C: Completing a matrix so as to minimize its rank. Oper Theory Adv Appl 1988, 29: 87–95.
Johnson CR: Matrix completion problems: a survey in matrix theory and applications. Proc Sympos Appl Math AMS 1990, 40: 171–179.
Rao CR, Mitra SK: Generalized Inverse of Matrices and Its Applications. Wiley, New York; 1971.
Tian Y: Completing block matrices with maximal and minimal ranks. Linear Algebra Appl 2000, 321: 327–345. 10.1016/S0024-3795(00)00224-X
Tian Y: The minimal rank of a 3 × 3 partial block matrix. Linear and Multilinear Algebra 2002, 50: 125–131. 10.1080/03081080290019531
Cohen N, Dancis J: Maximal rank Hermitian completions of partially specified Hermitian matrices. Linear Algebra Appl 1996, 244: 265–276.
Liu Y, Tian Y: More on extremal ranks of the matrix expressions A - BX ± X * B * with statistical applications. Numer Linear Algebra Appl 2008, 15: 307–325. 10.1002/nla.553
Liu Y, Tian Y: Extremal ranks of submatrices in an Hermitian solution to the matrix equation AX A * = B with applications. J Appl Math Comput 2010, 32: 289–301. 10.1007/s12190-009-0251-8
Tian Y, Liu Y: Extremal ranks of some symmetric matrix expressions with applications. SIAM J Matrix Anal Appl 2006, 28: 890–905. 10.1137/S0895479802415545
Mitra SK: A pair of simultaneous linear matrix equations A1X1B1 = C1 and A2X2B2 = C2 and a programming problem. Linear Algebra Appl 1990, 131: 107–123.
Puntanen S, Styan GPH, Tian Y: Three rank formulas associated with the covariance matrices of the BLUE and the OLSE in the general linear model. Econometric Theory 2005, 21: 659–664.
Rao CR: Unified theory of linear estimation. Sankhyā Ser A 1971, 33: 371–394.
Rao CR: Representations of best linear unbiased estimators in the Gauss-Markoff model with a singular dispersion matrix. J Multivariate Anal 1973, 3: 276–292. 10.1016/0047-259X(73)90042-0
Tian Y: Upper and lower bounds for ranks of matrix expressions using generalized inverses. Linear Algebra Appl 2002, 355: 187–214. 10.1016/S0024-3795(02)00345-2
Tian Y: Using rank formulas to characterize equalities for Moore-Penrose inverse of matrix products. Appl Math Comput 2004, 147: 581–600. 10.1016/S0096-3003(02)00796-8
Tian Y: More on maximal and minimal ranks of Schur complements with applications. Appl Math Comput 2004, 152: 675–692. 10.1016/S0096-3003(03)00585-X
Tian Y, Cheng S: The maximal and minimal ranks of A - BXC with applications. New York J Math 2003, 9: 345–362.
Acknowledgements
The authors would like to thank the Editor-in-Chief and the anonymous referees for their very detailed comments, which greatly improved the presentation of this article. The research of Z. Xiong was supported by the start-up fund of Wuyi University, Jiangmen 529020, Guangdong Province, P.R. China, and by the Foundation for High-Level Talents in Guangdong Province, P.R. China. The research of S. Yuan was supported by the Guangdong Natural Science Fund of China (No. 10452902001005845).
Competing interests
The authors declare that they have no competing interests.
Authors' contributions
The authors jointly worked on deriving the results. All authors read and approved the final manuscript.
Rights and permissions
Open Access This article is distributed under the terms of the Creative Commons Attribution 2.0 International License ( https://creativecommons.org/licenses/by/2.0 ), which permits unrestricted use, distribution, and reproduction in any medium, provided the original work is properly cited.
About this article
Cite this article
Xiong, Z., Qin, Y. & Yuan, S. The maximal and minimal ranks of matrix expression with applications. J Inequal Appl 2012, 54 (2012). https://doi.org/10.1186/1029-242X-2012-54