[Bhatia.Matrix Analysis.Solutions to Exercises and Problems]Contents

I find that writing up these solutions to the exercises and problems costs me a great deal of time... I am sorry that I could not keep at it persistently... I hope to pick it up again later on.

 

[Bhatia.Matrix Analysis.Solutions to Exercises and Problems]PrI.6.1

Given a basis $U=(u_1,\cdots,u_n)$, not necessarily orthonormal, in $\scrH$, how would you compute the biorthogonal basis $\sex{v_1,\cdots,v_n}$? Find a formula that expresses $\sef{v_j,x}$ for each $x\in\scrH$ and $j=1,\cdots,n$ in terms of Gram matrices.
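
A quick numerical sanity check of the kind of formula being asked for (a sketch under the convention $\sef{x,y}=x^*y$, not necessarily the intended solution): with $G$ the Gram matrix of the $u_i$, the candidate biorthogonal basis is $V=UG^{-1}$, and $\sef{v_j,x}$ is the $j$-th entry of $G^{-1}$ applied to the vector of the $\sef{u_i,x}$. NumPy is assumed; all names are illustrative.

```python
import numpy as np

rng = np.random.default_rng(0)
n = 4
U = rng.standard_normal((n, n)) + 1j * rng.standard_normal((n, n))   # columns u_1,...,u_n

G = U.conj().T @ U                      # Gram matrix, G_ij = <u_i, u_j>
V = U @ np.linalg.inv(G)                # candidate biorthogonal basis, columns v_1,...,v_n

print(np.allclose(V.conj().T @ U, np.eye(n)))      # biorthogonality: <v_i, u_j> = delta_ij

x = rng.standard_normal(n) + 1j * rng.standard_normal(n)
lhs = V.conj().T @ x                               # the numbers <v_j, x>
rhs = np.linalg.inv(G) @ (U.conj().T @ x)          # expressed through the Gram matrix
print(np.allclose(lhs, rhs))
```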

 

[Bhatia.Matrix Analysis.Solutions to Exercises and Problems]ExI.5.10

Every $k\times k$ positive matrix $A=(a_{ij})$ can be realised as a Gram matrix, i.e., vectors $x_j$, $1\leq j\leq k$, can be found so that $a_{ij}=\sef{x_i,x_j}$ for all $i,j$.
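
One way to see this concretely (a numerical illustration, not the only possible choice of vectors): take $x_j$ to be the columns of the Hermitian square root $A^{1/2}$, so that $\sef{x_i,x_j}=(A^{1/2}A^{1/2})_{ij}=a_{ij}$. NumPy is assumed; names are illustrative.

```python
import numpy as np

rng = np.random.default_rng(1)
k = 4
C = rng.standard_normal((k, k)) + 1j * rng.standard_normal((k, k))
A = C.conj().T @ C                                   # a k x k positive matrix

w, Q = np.linalg.eigh(A)                             # spectral decomposition of A
S = Q @ np.diag(np.sqrt(np.clip(w, 0, None))) @ Q.conj().T   # Hermitian square root A^{1/2}

Gram = S.conj().T @ S                                # Gram matrix of the columns x_j of S
print(np.allclose(Gram, A))
```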

 

[Bhatia.Matrix Analysis.Solutions to Exercises and Problems]ExI.5.9

(Schur's Theorem) If $A$ is positive, then $$\bex \per(A)\geq \det A. \eex$$
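
A brute-force numerical check of the inequality on a random positive matrix (illustration only, assuming NumPy; the permanent is computed directly from its definition, which is feasible only for small $k$):

```python
import itertools
import numpy as np

def permanent(M):
    # sum over all permutations, per the definition of the permanent
    n = M.shape[0]
    return sum(np.prod([M[i, s[i]] for i in range(n)])
               for s in itertools.permutations(range(n)))

rng = np.random.default_rng(2)
k = 4
B = rng.standard_normal((k, k)) + 1j * rng.standard_normal((k, k))
A = B.conj().T @ B                                   # a positive matrix

print(permanent(A).real >= np.linalg.det(A).real)    # Schur: per(A) >= det(A)
```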

 

[Bhatia.Matrix Analysis.Solutions to Exercises and Problems]ExI.5.8

Prove that for any matrices $A,B$ we have $$\bex |\per (AB)|^2\leq \per (AA^*)\cdot \per (B^*B). \eex$$ (The corresponding relation for determinants is an easy equality.)

 

[Bhatia.Matrix Analysis.Solutions to Exercises and Problems]ExI.5.7

Prove that for any vectors $$\bex u_1,\cdots,u_k,\quad v_1,\cdots,v_k, \eex$$ we have $$\bex |\det(\sef{u_i,v_j})|^2 \leq \det\sex{\sef{u_i,u_j}}\cdot \det \sex{\sef{v_i,v_j}}, \eex$$ $$\bex |\per(\sef{u_i,v_j})|^2 \leq \per\sex{\sef{u_i,u_j}}\cdot \per \sex{\sef{v_i,v_j}}. \eex$$

 

[Bhatia.Matrix Analysis.Solutions to Exercises and Problems]ExI.5.6

Let $A$ be a nilpotent operator. Show how to obtain, from a Jordan basis for $A$, a Jordan basis of $\wedge^2A$.

 

[Bhatia.Matrix Analysis.Solutions to Exercises and Problems]ExI.5.5

Show that the inner product $$\bex \sef{x_1\vee \cdots \vee x_k,y_1\vee \cdots\vee y_k} \eex$$ is equal to the permanent of the $k\times k$ matrix $\sex{\sef{x_i,y_j}}$.

 

[Bhatia.Matrix Analysis.Solutions to Exercises and Problems]ExI.5.4

If $\dim \scrH=3$, then $\dim \otimes^3\scrH =27$, $\dim \wedge^3\scrH =1$ and $\dim \vee^3\scrH =10$. In terms of an orthonormal basis of $\scrH$, write an element of $(\wedge^3\scrH \oplus \vee^3\scrH)^\perp$.

 

[Bhatia.Matrix Analysis.Solutions to Exercises and Problems]ExI.5.3

Let $\scrM$ be a $p$-dimensional subspace of $\scrH$ and $\scrN$ its orthogonal complement. Choosing $j$ vectors from $\scrM$ and $k-j$ vectors from $\scrN$ and forming the linear span of the antisymmetric tensor products of all such vectors, we get different subspaces of $\wedge^k\scrH$; for example, one of those is $\wedge^k\scrM$. Determine all the subspaces thus obtained and their dimensionalities. Do the same for $\vee^k\scrH$.

 

[Bhatia.Matrix Analysis.Solutions to Exercises and Problems]ExI.5.2

The elementary tensors $x\otimes \cdots \otimes x$, with all factors equal, are all in the subspace $\vee^k\scrH$. Do they span it?

 

[Bhatia.Matrix Analysis.Solutions to Exercises and Problems]ExI.5.1

Show that the inner product $$\bex \sef{x_1\wedge \cdots \wedge x_k,y_1\wedge \cdots\wedge y_k} \eex$$ is equal to the determinant of the $k\times k$ matrix $\sex{\sef{x_i,y_j}}$.

 

[Bhatia.Matrix Analysis.Solutions to Exercises and Problems]ExI.4.6

Let $A$ and $B$ be two matrices (not necessarily of the same size). Relative to the lexicographically ordered basis on the space of tensors, the matrix for $A\otimes B$ can be written in block form as follows: if $A=(a_{ij})$, then $$\bex A\otimes B=\sex{\ba{ccc} a_{11}B&\cdots&a_{1n}B\\ \vdots&\ddots&\vdots\\ a_{n1}B&\cdots&a_{nn}B \ea}. \eex$$
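
This block description is exactly what NumPy's Kronecker product computes; a quick check (illustration only, assuming NumPy):

```python
import numpy as np

rng = np.random.default_rng(3)
A = rng.standard_normal((2, 2))
B = rng.standard_normal((3, 3))

K = np.kron(A, B)                                    # matrix of A tensor B, lexicographic basis
blocks = np.block([[A[i, j] * B for j in range(2)] for i in range(2)])
print(np.allclose(K, blocks))
```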

 

[Bhatia.Matrix Analysis.Solutions to Exercises and Problems]ExI.4.5

Suppose it is known that $\scrM$ is an invariant subspace for $A$. What invariant subspaces for $A\otimes A$ can be obtained from this information alone?

 

[Bhatia.Matrix Analysis.Solutions to Exercises and Problems]ExI.4.4

(1). There is a natural isomorphism between the spaces $\scrK\otimes \scrH^*$ and $\scrL(\scrH,\scrK)$ in which the elementary tensor $k\otimes h^*$ corresponds to the linear map that takes a vector $u$ of $\scrH$ to $\sef{h,u}k$. This linear transformation has rank one, and every rank one transformation can be obtained in this way.

 

(2). An explicit description of this isomorphism $\varphi$ is outlined below. Let $e_1,\cdots,e_n$ be an orthonormal basis for $\scrH$ and for $\scrH^*$. Let $f_1,\cdots,f_m$ be an orthonormal basis of $\scrK$. Identify each element of $\scrL(\scrH,\scrK)$ with its matrix with respect to these bases. Let $E_{ij}$ be the matrix all of whose entries are zero except the $(i,j)$-entry, which is $1$. Show that $\varphi(f_i\otimes e_j)=E_{ij}$ for all $1\leq i\leq m$, $1\leq j\leq n$. Thus, if $A$ is any $m\times n$ matrix with entries $a_{ij}$, then $$\bex \varphi^{-1}(A)=\sum_{i,j}a_{ij}(f_i\otimes e_j) =\sum_{j}(Ae_j)\otimes e_j. \eex$$

 

(3). The space $\scrL(\scrH,\scrK)$ is a Hilbert space with inner product $$\bex \sef{A,B}=\tr A^*B. \eex$$ The set $\sed{E_{ij}: 1\leq i\leq m,\ 1\leq j\leq n}$ is an orthonormal basis for this space. Show that the map $\varphi$ is a Hilbert space isomorphism; i.e., $$\bex \sef{\varphi^{-1}(A),\varphi^{-1}(B)} =\sef{A,B},\quad\forall\ A,B. \eex$$

 

[Bhatia.Matrix Analysis.Solutions to Exercises and Problems]ExI.4.1

 

Let $x,y,z$ be linearly independent vectors in $\scrH$. Find a necessary and sufficient condition that a vector $w$ must satisfy in order that the bilinear functional $$\bex F(u,v)=\sef{x,u}\sef{y,v}+\sef{z,u}\sef{w,v} \eex$$ is elementary.

 

[Bhatia.Matrix Analysis.Solutions to Exercises and Problems]ExI.3.7

For every matrix $A$, the matrix $$\bex \sex{\ba{cc} I&A\\ 0&I \ea} \eex$$ is invertible and its inverse is $$\bex \sex{\ba{cc} I&-A\\ 0&I \ea}. \eex$$ Use this to show that if $A,B$ are any two $n\times n$ matrices, then $$\bex \sex{\ba{cc} I&A\\ 0&I \ea}^{-1}\sex{\ba{cc} AB&0\\ B&0 \ea} \sex{\ba{cc} I&A\\ 0&I \ea}=\sex{\ba{cc} 0&0\\ B&BA \ea}. \eex$$ This implies that $AB$ and $BA$ have the same eigenvalues. (This last fact can be proved in another way as follows. If $B$ is invertible, then $AB=B^{-1}(BA)B$, so $AB$ and $BA$ have the same eigenvalues. Since invertible matrices are dense in the space of all matrices, and the roots of a polynomial vary continuously with its coefficients, the conclusion also holds in general.)
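
A numerical verification of the displayed identity and of the equality of spectra (illustration only, assuming NumPy; the names are arbitrary):

```python
import numpy as np

rng = np.random.default_rng(4)
n = 3
A = rng.standard_normal((n, n))
B = rng.standard_normal((n, n))
I, Z = np.eye(n), np.zeros((n, n))

T = np.block([[I, A], [Z, I]])
Tinv = np.block([[I, -A], [Z, I]])                   # the claimed inverse
print(np.allclose(T @ Tinv, np.eye(2 * n)))

M = np.block([[A @ B, Z], [B, Z]])
print(np.allclose(Tinv @ M @ T, np.block([[Z, Z], [B, B @ A]])))

# AB and BA have the same eigenvalues
print(np.allclose(np.sort_complex(np.linalg.eigvals(A @ B)),
                  np.sort_complex(np.linalg.eigvals(B @ A))))
```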

 

[Bhatia.Matrix Analysis.Solutions to Exercises and Problems]ExI.3.6

If $A$ is a contraction, show that $$\bex A^*(I-AA^*)^{1/2}=(I-A^*A)^{1/2}A^*. \eex$$ Use this to show that if $A$ is a contraction on $\scrH$, then the operators $$\bex U=\sex{\ba{cc} A&(I-AA^*)^{1/2}\\ (I-A^*A)^{1/2}&-A^* \ea}, \eex$$ $$\bex V=\sex{\ba{cc} A&-(I-AA^*)^{1/2}\\ (I-A^*A)^{1/2}&A^* \ea} \eex$$ are unitary operators on $\scrH\oplus \scrH$.
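
A numerical check that the block operator $U$ is indeed unitary for a random contraction (illustration only; SciPy's `sqrtm` is assumed for the positive square roots, and the names are illustrative):

```python
import numpy as np
from scipy.linalg import sqrtm

rng = np.random.default_rng(5)
n = 3
A = rng.standard_normal((n, n)) + 1j * rng.standard_normal((n, n))
A = A / (2 * np.linalg.norm(A, 2))                   # scale so that ||A|| < 1

I = np.eye(n)
DA = sqrtm(I - A @ A.conj().T)                       # (I - AA*)^{1/2}
DAs = sqrtm(I - A.conj().T @ A)                      # (I - A*A)^{1/2}

print(np.allclose(A.conj().T @ DA, DAs @ A.conj().T))        # the intertwining relation

U = np.block([[A, DA], [DAs, -A.conj().T]])
print(np.allclose(U.conj().T @ U, np.eye(2 * n)))            # U is unitary
```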

 

[Bhatia.Matrix Analysis.Solutions to Exercises and Problems]ExI.3.1

Let $A=A_1\oplus A_2$. Show that

(1). $W(A)$ is the convex hull of $W(A_1)$ and $W(A_2)$; i.e., the smallest convex set containing $W(A_1)\cup W(A_2)$.

(2). $$\beex \bea \sen{A}&=\max\sed{\sen{A_1},\sen{A_2}},\\ \spr(A)&=\max\sed{\spr(A_1),\spr(A_2)},\\ w(A)&=\max\sed{w(A_1),w(A_2)}. \eea \eeex$$

 

[Bhatia.Matrix Analysis.Solutions to Exercises and Problems]ExI.2.10

(1). The numerical radius defines a norm on $\scrL(\scrH)$.

(2). $w(UAU^*)=w(A)$ for all $U\in \U(n)$.

(3). $w(A)\leq \sen{A}\leq 2w(A)$ for all $A$.

(4). $w(A)=\sen{A}$ if (but not only if) $A$ is normal.

 

[Bhatia.Matrix Analysis.Solutions to Exercises and Problems]ExI.2.9

(1). When $A$ is normal, the set $W(A)$ is the convex hull of the eigenvalues of $A$. For nonnormal matrices, $W(A)$ may be bigger than the convex hull of the eigenvalues. For Hermitian operators, the first statement says that $W(A)$ is the closed interval whose endpoints are the smallest and the largest eigenvalues of $A$.

(2). If a unit vector $x$ belongs to the linear span of the eigenspaces corresponding to eigenvalues $\lm_1,\cdots,\lm_k$ of a normal operator $A$, then $\sef{x,Ax}$ lies in the convex hull of $\lm_1,\cdots,\lm_k$. (This fact will be used frequently in Chapter III.)

 

[Bhatia.Matrix Analysis.Solutions to Exercises and Problems]ExI.2.8

For any matrix $A$ the series $$\bex \exp A=I+A+\frac{A^2}{2!}+\cdots+\frac{A^n}{n!}+\cdots \eex$$ converges. This is called the exponential of $A$. The matrix $\exp A$ is always invertible and $$\bex (\exp A)^{-1}=\exp(-A). \eex$$ Conversely, every invertible matrix can be expressed as the exponential of some matrix. Every unitary matrix can be expressed as the exponential of a skew-Hermitian matrix.
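
A numerical illustration of the invertibility statement and of the unitary case (not a proof; SciPy's `expm` is assumed):

```python
import numpy as np
from scipy.linalg import expm

rng = np.random.default_rng(6)
n = 3
A = rng.standard_normal((n, n)) + 1j * rng.standard_normal((n, n))

print(np.allclose(expm(A) @ expm(-A), np.eye(n)))    # (exp A)^{-1} = exp(-A)

S = A - A.conj().T                                   # a skew-Hermitian matrix
print(np.allclose(expm(S).conj().T @ expm(S), np.eye(n)))    # exp(S) is unitary
```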

 

[Bhatia.Matrix Analysis.Solutions to Exercises and Problems]ExI.2.7

The set of all invertible matrices is a dense open subset of the set of all $n\times n$ matrices. The set of all unitary matrices is a compact subset of the set of all $n\times n$ matrices. These two sets are also groups under multiplication. They are called the general linear group $\GL(n)$ and the unitary group $\U(n)$, respectively.

 

[Bhatia.Matrix Analysis.Solutions to Exercises and Problems]ExI.2.6

If $\sen{A}<1$, then $I-A$ is invertible, and $$\bex (I-A)^{-1}=I+A+A^2+\cdots, \eex$$ a convergent power series. This is called the Neumann series.
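
Partial sums of the Neumann series for a random $A$ with $\sen{A}<1$ (numerical illustration only, assuming NumPy):

```python
import numpy as np

rng = np.random.default_rng(7)
n = 4
A = rng.standard_normal((n, n))
A = A / (2 * np.linalg.norm(A, 2))                   # ensure ||A|| < 1

S = np.zeros((n, n))
term = np.eye(n)
for _ in range(200):                                 # accumulate I + A + A^2 + ...
    S = S + term
    term = term @ A

print(np.allclose(S, np.linalg.inv(np.eye(n) - A)))
```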

 

[Bhatia.Matrix Analysis.Solutions to Exercises and Problems]ExI.2.5

Show that matrices with distinct eigenvalues are dense in the space of all $n\times n$ matrices. (Use the Schur triangularisation)
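
One possible line of argument, following the hint (a sketch only, not necessarily the intended one): write $A=QTQ^*$ with $Q$ unitary and $T$ upper triangular, and for small numbers $\varepsilon_1,\cdots,\varepsilon_n$ chosen so that the $t_{jj}+\varepsilon_j$ are distinct, set $$\bex A_\varepsilon=Q\sex{T+\mathrm{diag}(\varepsilon_1,\cdots,\varepsilon_n)}Q^*. \eex$$ Then the eigenvalues of $A_\varepsilon$ are the distinct numbers $t_{jj}+\varepsilon_j$, while $\sen{A_\varepsilon-A}=\max_j|\varepsilon_j|$ can be made arbitrarily small.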

 

[Bhatia.Matrix Analysis.Solutions to Exercises and Problems]ExI.2.4

(1). The singular value decomposition leads to the polar decomposition: Every operator $A$ can be written as $A=UP$, where $U$ is unitary and $P$ is positive. In this decomposition the positive part $P$ is unique, $P=|A|$. The unitary part $U$ is unique if $A$ is invertible. (A numerical sketch of this decomposition is given after part (3) below.)

(2). An operator $A$ is normal if and only if the factors $U$ and $P$ in the polar decomposition of $A$ commute.

(3). We have derived the polar decomposition from the singular value decomposition. Show that it is possible to derive the latter from the former.
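
A numerical sketch of part (1), deriving $U$ and $P$ from the singular value decomposition (illustration only, assuming NumPy): if $A=WSV^*$, take $U=WV^*$ and $P=VSV^*$.

```python
import numpy as np

rng = np.random.default_rng(8)
n = 3
A = rng.standard_normal((n, n)) + 1j * rng.standard_normal((n, n))

W, s, Vh = np.linalg.svd(A)                          # A = W diag(s) Vh
U = W @ Vh                                           # unitary part
P = Vh.conj().T @ np.diag(s) @ Vh                    # positive part, P = |A|

print(np.allclose(A, U @ P))
print(np.allclose(U.conj().T @ U, np.eye(n)))
print(np.all(np.linalg.eigvalsh(P) >= -1e-12))       # P is positive
```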

 

[Bhatia.Matrix Analysis.Solutions to Exercises and Problems]ExI.2.3

(1). Let $\sed{A_\al}$ be a family of mutually commuting operators. Then, there exists a common Schur basis for $\sed{A_\al}$. In other words, there exists a unitary $Q$ such that $Q^*A_\al Q$ is upper triangular for all $\al$.

(2). Let $\sed{A_\al}$ be a family of mutually commuting normal operators. Then, there exists a unitary $Q$ such that $Q^*A_\al Q$ is diagonal for all $\al$.

 

[Bhatia.Matrix Analysis.Solutions to Exercises and Problems]ExI.2.2

Show that the following statements are equivalent:

(1). $A$ is positive.

(2). $A=B^*B$ for some $B$.

(3). $A=T^*T$ for some upper triangular $T$.

(4). $A=T^*T$ for some upper triangular $T$ with nonnegative diagonal entries. If $A$ is positive definite, then the factorization in (4) is unique. This is called the Cholesky decomposition of $A$.
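
A numerical illustration of (4): NumPy's Cholesky routine returns a lower triangular $L$ with $A=LL^*$, so $T=L^*$ is an upper triangular factor with nonnegative diagonal and $A=T^*T$ (illustration only):

```python
import numpy as np

rng = np.random.default_rng(9)
n = 4
B = rng.standard_normal((n, n)) + 1j * rng.standard_normal((n, n))
A = B.conj().T @ B + np.eye(n)                       # a positive definite matrix

L = np.linalg.cholesky(A)                            # A = L L*
T = L.conj().T                                       # upper triangular, nonnegative diagonal
print(np.allclose(A, T.conj().T @ T))
print(np.allclose(T, np.triu(T)), np.all(np.diag(T).real > 0))
```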

 

[Bhatia.Matrix Analysis.Solutions to Exercises and Problems]ExI.2.1

For fixed bases in $\scrH$ and $\scrK$, the matrix of $A^*$ is the conjugate transpose of the matrix of $A$.

 

[Bhatia.Matrix Analysis.Solutions to Exercises and Problems]ExI.1.3

Use the QR decomposition to prove Hadamard's inequality: if $X=(x_1,\cdots,x_n)$, then $$\bex |\det X|\leq \prod_{j=1}^n \sen{x_j}. \eex$$ Equality holds here if and only if the $x_j$ are mutually orthogonal or some $x_j$ are zero. 
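
A numerical check of Hadamard's inequality via the QR decomposition (illustration only, assuming NumPy): if $X=QR$ with $Q$ unitary, then $|\det X|=\prod_j|r_{jj}|$ and $|r_{jj}|\leq\sen{x_j}$.

```python
import numpy as np

rng = np.random.default_rng(10)
n = 4
X = rng.standard_normal((n, n)) + 1j * rng.standard_normal((n, n))

Q, R = np.linalg.qr(X)
lhs = abs(np.linalg.det(X))
print(np.isclose(lhs, np.prod(np.abs(np.diag(R)))))          # |det X| = prod |r_jj|
print(lhs <= np.prod(np.linalg.norm(X, axis=0)) + 1e-12)     # Hadamard's inequality
```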

 

[Bhatia.Matrix Analysis.Solutions to Exercises and Problems]ExI.1.2

Let $X$ be any basis of $\scrH$ and let $Y$ be the basis biorthogonal to it. Using matrix multiplication, $X$ gives a linear transformation from $\bbC^n$ to $\scrH$. The inverse of this is given by $Y^*$. In the special case when $X$ is orthonormal (so that $Y=X$), this transformation is inner-product-preserving if the standard inner product is used on $\bbC^n$.

 

[Bhatia.Matrix Analysis.Solutions to Exercises and Problems]ExI.1.1

Given any $k$-tuple of linearly independent vectors $X$ as above, there exists a $k$-tuple $Y$ biorthogonal to it. If $k=n$, this $Y$ is unique.
