How to characterize the variance of a linear Gaussian system with switching?

Consider a random process described by the following linear dynamics:

$$
x_{k+1} = a x_k + n_k,
$$

where $|a|<1$ and the $n_k$ are i.i.d. standard normal.

It is quite easy to prove that $x_k$ converges in distribution to a zero-mean normal with variance $1/(1-a^2)$.
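
(For the record, the computation behind this claim: with $\sigma_k^2 := \operatorname{Var}(x_k)$ and $n_k$ independent of $x_k$,

$$
\sigma_{k+1}^2 = a^2 \sigma_k^2 + 1 \longrightarrow \frac{1}{1-a^2} \quad \text{as } k \to \infty,
$$

since the map $\sigma^2 \mapsto a^2 \sigma^2 + 1$ is a contraction for $|a|<1$.)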

However, if we consider the following process,
$$
x_{k+1} = \begin{cases}
a x_k + n_k, & \text{if } |x_k| < M\\
b x_k + n_k, & \text{if } |x_k| \geq M
\end{cases},
$$

where $|b|<1$, so the second regime is also stable. I think it is quite easy to show that $x_k$ still converges to some stationary distribution with zero mean.
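
(A sketch of why I believe this, via the standard geometric-drift argument with Lyapunov function $V(x) = x^2$: writing $c := \max(|a|,|b|) < 1$, the independence of $n_k$ from $x_k$ gives

$$
\mathbb E\left[x_{k+1}^2 \mid x_k\right] \leq c^2 x_k^2 + 1,
$$

so the chain satisfies a Foster–Lyapunov drift condition and, with the Gaussian noise providing irreducibility, admits a unique stationary distribution; since both regimes and the noise are symmetric under $x \mapsto -x$, that distribution has zero mean.)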

On the other hand, is there a way to characterize the variance of this stationary distribution, especially when $M$ is very large? For example, something like
$$
\left|\lim_{k\rightarrow \infty} \mathbb E\, x_k^2 - \frac{1}{1-a^2}\right| \leq C_1 \times \exp(-C_2 M^2),
$$

where $C_1$ and $C_2$ are constants depending on $a$ and $b$.
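
Though not a proof, a quick Monte Carlo experiment seems consistent with this picture. Below is a sketch only; the helper `second_moment` and all parameter values are my own arbitrary choices:

```python
import numpy as np

# Estimate the stationary second moment of the switching recursion
#   x_{k+1} = (a if |x_k| < M else b) * x_k + n_k,   n_k ~ N(0, 1),
# and compare against the unswitched limit 1/(1 - a^2).

def second_moment(a, b, M, n_steps=1_000_000, burn_in=10_000, seed=0):
    """Long-run average of x_k^2 after discarding a burn-in period."""
    rng = np.random.default_rng(seed)
    x, acc = 0.0, 0.0
    for k in range(n_steps):
        x = (a if abs(x) < M else b) * x + rng.standard_normal()
        if k >= burn_in:
            acc += x * x
    return acc / (n_steps - burn_in)

a, b = 0.9, 0.5          # both regimes stable; values are illustrative only
target = 1.0 / (1.0 - a**2)
for M in (1.0, 2.0, 3.0, 4.0):
    gap = abs(second_moment(a, b, M) - target)
    print(f"M = {M}: |E[x^2] - 1/(1-a^2)| ~ {gap:.4f}")
```

If the conjectured bound holds, the printed gap should decay roughly like $\exp(-C_2 M^2)$ as $M$ grows.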

The reason for believing the above inequality is that when $M$ is very large, there is only a very small probability for $x_k$ to exit the region $\{|x|<M\}$ (which I think should be related to the error function of the normal distribution, although $x_k$ is not exactly normally distributed), and even if it exits the region, it comes back very quickly since $b$ is stable. However, I am having trouble putting this in a rigorous way.
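
For what it is worth, here is the back-of-the-envelope version of that tail estimate, pretending that $x$ follows the unswitched stationary law $\mathcal N(0, 1/(1-a^2))$:

$$
\Pr(|x| \geq M) = \operatorname{erfc}\left(M\sqrt{\tfrac{1-a^2}{2}}\right) \leq \exp\left(-\tfrac{(1-a^2)M^2}{2}\right),
$$

which would be consistent with the conjectured $\exp(-C_2 M^2)$ decay, e.g. with $C_2 = (1-a^2)/2$. But of course $x_k$ is not exactly Gaussian, which is where I get stuck.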