Machine learning: for the linear equation $\boldsymbol{Ax} = \boldsymbol{b}$, why can there not be more than one, but fewer than infinitely many, solutions for a particular $\boldsymbol{b}$?

It is easy to find examples for the cases of $0$, $1$, or $\infty$ many solutions (a one-dimensional instance of each is given below). We will rule out all other cases by contradiction.
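For concreteness (these one-dimensional instances are illustrative additions, not part of the original argument):

$$0 \cdot x = 1 \ (\text{no solution}), \qquad 2x = 4 \ (\text{exactly one solution, } x = 2), \qquad 0 \cdot x = 0 \ (\text{infinitely many solutions}).$$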

Now suppose that we have at least two distinct solutions, but not infinitely many. Pick $\boldsymbol{x}_1$ and $\boldsymbol{x}_2$ as two such solutions. Then we construct $\boldsymbol{z} = \boldsymbol{x}_1 + \mu \left[\boldsymbol{x}_1 - \boldsymbol{x}_2\right]$ for an arbitrary scalar $\mu$ and show that $\boldsymbol{z}$ lets us build infinitely many solutions from only the two solutions $\boldsymbol{x}_1$ and $\boldsymbol{x}_2$. We can check this by plugging $\boldsymbol{z}$ into the equation:

$$\boldsymbol{A}\boldsymbol{z} = \boldsymbol{A}\boldsymbol{x}_1 + \mu \boldsymbol{A}\left[\boldsymbol{x}_1 - \boldsymbol{x}_2\right] = \boldsymbol{b} + \mu \left[\boldsymbol{b} - \boldsymbol{b}\right] = \boldsymbol{b}.$$

We have used $\boldsymbol{Ax}_1 = \boldsymbol{Ax}_2 = \boldsymbol{b}$ in this derivation. Since $\boldsymbol{x}_1 \neq \boldsymbol{x}_2$, every distinct value of $\mu$ yields a distinct solution $\boldsymbol{z}$, which contradicts the assumption that there are only finitely many solutions. Therefore, we can conclude that there can only be $0$, $1$, or $\infty$ many solutions.
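As a quick numerical sanity check (a minimal sketch, not part of the original answer; the matrix `A`, vector `b`, and solutions `x1`, `x2` are made-up examples), we can verify that $\boldsymbol{z} = \boldsymbol{x}_1 + \mu \left[\boldsymbol{x}_1 - \boldsymbol{x}_2\right]$ solves the system for several values of $\mu$:

```python
import numpy as np

# Illustrative singular system: the second row of A is twice the first,
# so this system has infinitely many solutions for this b.
A = np.array([[1.0, 1.0],
              [2.0, 2.0]])
b = np.array([3.0, 6.0])

# Two distinct solutions of A x = b, found by inspection.
x1 = np.array([1.0, 2.0])   # 1 + 2 = 3 and 2 + 4 = 6
x2 = np.array([3.0, 0.0])   # 3 + 0 = 3 and 6 + 0 = 6

# For every scalar mu, z = x1 + mu * (x1 - x2) is another solution,
# exactly as the derivation above shows.
for mu in (-2.0, -0.5, 0.0, 1.0, 10.0):
    z = x1 + mu * (x1 - x2)
    assert np.allclose(A @ z, b)
    print(f"mu = {mu:5}: z = {z} solves A z = b")
```

Each distinct $\mu$ gives a distinct $\boldsymbol{z}$ here because $\boldsymbol{x}_1 - \boldsymbol{x}_2 \neq \boldsymbol{0}$, so the loop is sampling five points on an infinite line of solutions.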