In a book I came across a formula for the mathematical expectation of a random variable $\xi$ with distribution function $F(x)$:

$$ M\xi = -\int_{-\infty}^{0} F(x)\,dx + \int_{0}^{\infty} (1-F(x))\,dx $$

How can I prove it?
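Before attempting a proof, the formula can be checked numerically. A quick sketch (the shifted exponential distribution here is an arbitrary choice for the check, not from the book; it has a known mean of $0.5$ and a nonzero contribution from both integrals):

```python
import math

# Check  M xi = -∫_{-inf}^0 F(x) dx + ∫_0^inf (1 - F(x)) dx
# for X = E - 0.5 with E ~ Exp(1), so the true mean is E[X] = 0.5.
# Its CDF is F(x) = 1 - exp(-(x + 0.5)) for x >= -0.5, and 0 otherwise.

def F(x):
    return 1.0 - math.exp(-(x + 0.5)) if x >= -0.5 else 0.0

def riemann(g, a, b, n=200_000):
    """Midpoint Riemann sum of g over [a, b]."""
    h = (b - a) / n
    return sum(g(a + (i + 0.5) * h) for i in range(n)) * h

# Truncate the infinite tails at +/-50; both integrands decay exponentially.
neg_part = riemann(F, -50.0, 0.0)                    # ∫_{-inf}^0 F(x) dx
pos_part = riemann(lambda x: 1.0 - F(x), 0.0, 50.0)  # ∫_0^inf (1 - F(x)) dx

mean = -neg_part + pos_part
print(mean)  # ≈ 0.5, the true expectation
```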

Here is my attempt so far:

$ M\xi \equiv \int_{-\infty}^{\infty} x\,dF(x) = \lim_{a\to-\infty,\ b\to\infty} \int_{a}^{b} x\,dF(x) $

Integrating by parts, I get

$ \int_{a}^{b} x\,dF(x) = xF(x)\big\rvert_{a}^{b} - \int_{a}^{b} F(x)\,dx = bF(b) - aF(a) - \int_{a}^{0} F(x)\,dx - \int_{0}^{b} F(x)\,dx = bF(b) - aF(a) - \int_{a}^{0} F(x)\,dx + \int_{0}^{b} (1-F(x))\,dx - b = \left[-\int_{a}^{0} F(x)\,dx + \int_{0}^{b} (1-F(x))\,dx\right] + b(F(b)-1) - aF(a). $
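The integration-by-parts step can also be verified numerically on a finite interval. A minimal check (using the standard normal CDF as an arbitrary concrete $F$, with density $f = F'$):

```python
import math

# Check  ∫_a^b x dF(x) = bF(b) - aF(a) - ∫_a^b F(x) dx
# for F the standard normal CDF (an arbitrary concrete choice).

def F(x):
    """Standard normal CDF."""
    return 0.5 * (1.0 + math.erf(x / math.sqrt(2.0)))

def f(x):
    """Standard normal density, F'(x)."""
    return math.exp(-x * x / 2.0) / math.sqrt(2.0 * math.pi)

def riemann(g, a, b, n=100_000):
    """Midpoint Riemann sum of g over [a, b]."""
    h = (b - a) / n
    return sum(g(a + (i + 0.5) * h) for i in range(n)) * h

a, b = -2.0, 3.0
lhs = riemann(lambda x: x * f(x), a, b)       # ∫_a^b x dF(x)
rhs = b * F(b) - a * F(a) - riemann(F, a, b)  # boundary terms minus ∫_a^b F
print(lhs, rhs)  # the two sides agree
```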

Passing to the limit, I get

$ M\xi = -\int_{-\infty}^{0} F(x)\,dx + \int_{0}^{\infty} (1-F(x))\,dx - \lim_{a\to-\infty} aF(a) + \lim_{b\to\infty} b(F(b)-1) $

So, to prove the original statement, I need to show that for an arbitrary distribution function $F$

$ \lim_{a\to-\infty} aF(a) = 0 $

and

$ \lim_{b\to\infty} b(F(b)-1) = 0 $

However, I have no idea how to prove this, and moreover I doubt it holds for an arbitrary $F$.
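For what it's worth, both limits do hold whenever the expectation exists, i.e. $M|\xi| = \int_{-\infty}^{\infty} |x|\,dF(x) < \infty$ (and the doubt is justified otherwise, e.g. for the Cauchy distribution). A standard tail-bound sketch, under that integrability assumption:

```latex
% For a < 0 we have |x| \ge |a| on (-\infty, a], hence
|a|\,F(a) = |a| \int_{-\infty}^{a} dF(x)
          \le \int_{-\infty}^{a} |x|\, dF(x)
          \xrightarrow[a \to -\infty]{} 0,
% since the tails of the convergent integral \int |x|\,dF(x) vanish.
% Symmetrically, for b > 0,
%   b\,(1 - F(b)) \le \int_{b}^{\infty} x\, dF(x) \to 0
% handles the other limit.
```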