Relative error of a number in machine epsilon units

I came across an estimate of the relative error between two representations of the same number, one computed in C++ and the other with a computer algebra system, expressed in units of machine epsilon. My question is trivial, though I wasn't able to find an answer:
If I say that a number has a relative error of 3 machine epsilons, how many significant digits does the approximation lose compared to the true value?