# Probability: why is the event $$\{X_{(j)} \le x_i\}$$ equivalent to the event $$\{Y_i \ge j\}$$?

From *Statistical Inference* by Casella and Berger:

Let $$X_1, \dots, X_n$$ be a random sample from a discrete distribution
with $$f_X(x_i) = p_i$$, where $$x_1 < x_2 < \dots$$ are the possible
values of $$X$$ in ascending order. Let $$X_{(1)}, \dots, X_{(n)}$$
denote the order statistics of the sample. Define $$Y_i$$ as the number of $$X_j$$ that are less than or equal to
$$x_i$$. Let $$P_0 = 0,\; P_1 = p_1,\; \dots,\; P_i = p_1 + p_2 + \dots + p_i$$.

If $$\{X_j \le x_i\}$$ is a "success" and $$\{X_j > x_i\}$$ is a "failure", then $$Y_i$$ is binomial with parameters $$(n, P_i)$$.

Then the event $$\{X_{(j)} \le x_i\}$$ is equivalent to the event $$\{Y_i \ge j\}$$.

Can anyone explain why these two are equivalent?

$$\{X_{(j)} \le x_i\} = \{s \in \text{dom}(X_{(j)}) : X_{(j)}(s) \le x_i\}$$

$$\{Y_i \ge j\} = \{s' \in \text{dom}(Y_i) : Y_i(s') \ge j\}$$

I have trouble understanding how these functions of random variables show this equivalence.
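For what it's worth, the equivalence does check out numerically. Here is a quick simulation sketch (my own, using an arbitrary three-point distribution, not one from the book): for every sample drawn, every cutoff $$x_i$$, and every rank $$j$$, the event "the $$j$$-th order statistic is at most $$x_i$$" coincides with the event "at least $$j$$ of the sample values are at most $$x_i$$".

```python
import random

# Hypothetical discrete distribution: values x_1 < x_2 < x_3 with probabilities p_i
values = [1, 2, 3]
probs = [0.2, 0.5, 0.3]

random.seed(0)
n = 5  # sample size

for trial in range(1000):
    sample = random.choices(values, weights=probs, k=n)
    order = sorted(sample)  # order statistics X_(1) <= ... <= X_(n)
    for x_i in values:
        y_i = sum(1 for x in sample if x <= x_i)  # Y_i: count of X_j <= x_i
        for j in range(1, n + 1):
            # Check that {X_(j) <= x_i} occurs exactly when {Y_i >= j} occurs
            assert (order[j - 1] <= x_i) == (y_i >= j)

print("equivalence held in every trial")
```

The intuition the simulation reflects: $$X_{(j)} \le x_i$$ says the $$j$$-th smallest value is at most $$x_i$$, which forces all of $$X_{(1)}, \dots, X_{(j)}$$ to be at most $$x_i$$, i.e. at least $$j$$ sample values are at most $$x_i$$, which is exactly $$Y_i \ge j$$; the converse runs the same way.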