In this post all my matrices will be $\mathbb R^{N\times N}$ symmetric positive semi-definite (psd), but I am also interested in the Hermitian case. In particular, the square root $A^{\frac 12}$ of a psd matrix $A$ is defined unambiguously via the spectral theorem.

Also, I use the usual Frobenius scalar product and norm

$$
\langle A,B\rangle:=\operatorname{Tr}(A^tB),
\qquad
|A|^2:=\langle A,A\rangle
$$

**Question**: is the following inequality true

$$
|A^{\frac 12}-B^{\frac 12}|^2\leq C_N\,|A-B|\quad ???
$$

for all psd matrices $A,B$ and a positive constant $C_N$ depending on the dimension only.

For non-negative scalar numbers (i.e. $N=1$) this amounts to asking whether $|\sqrt a-\sqrt b|^2\leq C|a-b|$, which of course is true due to $|\sqrt a-\sqrt b|^2=|\sqrt a-\sqrt b|\times |\sqrt a-\sqrt b|\leq |\sqrt a-\sqrt b|\times |\sqrt a+\sqrt b|=|a-b|$.
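This scalar bound is easy to spot-check numerically; a minimal sketch in Python (just sampling random pairs, nothing deep):

```python
import random

random.seed(0)
for _ in range(10_000):
    a, b = random.uniform(0, 100), random.uniform(0, 100)
    # |sqrt(a) - sqrt(b)|^2 <= |a - b|, with a tiny tolerance for rounding
    assert (a ** 0.5 - b ** 0.5) ** 2 <= abs(a - b) + 1e-9
print("scalar inequality holds on all samples")
```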

If $A$ and $B$ commute then by simultaneous diagonalisation we can assume that $A=\operatorname{diag}(a_i)$ and $B=\operatorname{diag}(b_i)$, hence from the scalar case

$$
|A^{\frac 12}-B^{\frac 12}|^2
=\sum\limits_{i=1}^N |\sqrt{a_i}-\sqrt{b_i}|^2
\leq \sum\limits_{i=1}^N |a_i-b_i|
\leq \sqrt N\left(\sum\limits_{i=1}^N |a_i-b_i|^2\right)^{\frac 12}
=\sqrt N\,|A-B|
$$

Some hidden convexity seems to be involved, but in the general (non-commuting) case I am embarrassingly not even sure that the statement holds true, and I cannot even get started. Since I am pretty sure that this is either blatantly false or otherwise well known and referenced, I would like to avoid wasting more time reinventing the wheel than I already have.
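In the meantime one can at least probe for counterexamples numerically; here is a minimal sketch in Python (assuming numpy; `psd_sqrt` is just a hypothetical helper computing the square root via the spectral theorem):

```python
import numpy as np

def psd_sqrt(M):
    # symmetric square root via the spectral theorem (eigenvalues clipped at 0)
    w, V = np.linalg.eigh(M)
    return (V * np.sqrt(np.clip(w, 0.0, None))) @ V.T

rng = np.random.default_rng(0)
N = 5
worst = 0.0
for _ in range(1000):
    X = rng.standard_normal((N, N))
    Y = rng.standard_normal((N, N))
    A, B = X @ X.T, Y @ Y.T  # generic (non-commuting) psd pair
    num = np.linalg.norm(psd_sqrt(A) - psd_sqrt(B), 'fro') ** 2
    den = np.linalg.norm(A - B, 'fro')
    worst = max(worst, num / den)
print("largest observed ratio:", worst)
```

If the ratio blew up for some sampled pair, that pair would be a counterexample; a bounded ratio of course proves nothing.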

This post and that post seem to be related but do not quite get me where I want (unless I missed something?).

**Context**: this question arises for technical purposes in a problem I’m currently working on, related to the Bures distance between psd matrices, defined as

$$
d(A,B)=\min\limits_U |A^{\frac 12}-B^{\frac 12}U|
$$

(the minimum runs over unitary matrices $UU^t=\mathrm{Id}$)
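For concreteness, in the real symmetric setting the minimum in this definition is attained at the orthogonal polar factor of $B^{\frac 12}A^{\frac 12}$, which yields the standard closed form $d(A,B)^2=\operatorname{Tr}A+\operatorname{Tr}B-2\operatorname{Tr}\big((A^{\frac 12}BA^{\frac 12})^{\frac 12}\big)$. A minimal numerical sketch in Python (assuming numpy; `psd_sqrt` and `bures_distance` are hypothetical helper names):

```python
import numpy as np

def psd_sqrt(M):
    # symmetric square root via the spectral theorem (eigenvalues clipped at 0)
    w, V = np.linalg.eigh(M)
    return (V * np.sqrt(np.clip(w, 0.0, None))) @ V.T

def bures_distance(A, B):
    # closed form: d(A,B)^2 = Tr A + Tr B - 2 Tr (A^{1/2} B A^{1/2})^{1/2}
    sA = psd_sqrt(A)
    cross = np.trace(psd_sqrt(sA @ B @ sA))
    return np.sqrt(max(np.trace(A) + np.trace(B) - 2.0 * cross, 0.0))

rng = np.random.default_rng(1)
N = 4
X, Y = rng.standard_normal((N, N)), rng.standard_normal((N, N))
A, B = X @ X.T, Y @ Y.T

# the optimal U is the orthogonal polar factor of B^{1/2} A^{1/2}
sA, sB = psd_sqrt(A), psd_sqrt(B)
U_svd, s, Vt = np.linalg.svd(sB @ sA)
U_opt = U_svd @ Vt
direct = np.linalg.norm(sA - sB @ U_opt, 'fro')
print(bures_distance(A, B), direct)  # the two values should agree
```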