## probability: two cards are drawn and shuffled into another deck

Two random cards are chosen without replacement from a deck and inserted into another deck. This second deck is shuffled and a card is drawn. If this card is an ace, what is the probability that no ace was moved from the first deck?
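Whatever closed form one derives can be sanity-checked by simulation. Below is a minimal sketch, assuming both decks are standard 52-card decks with four aces each (the question does not actually say this):

```python
import random

def trial(rng):
    # Deck 1: relabel cards so indices 0-3 are the aces, 4-51 are non-aces.
    moved = rng.sample(range(52), 2)           # two cards drawn without replacement
    moved_aces = sum(1 for c in moved if c < 4)
    # Deck 2 now holds its own 4 aces plus the 2 moved cards (54 cards total);
    # shuffling and drawing the top card is the same as drawing uniformly.
    drawn = rng.randrange(54)
    is_ace = drawn < 4 + moved_aces            # again relabel aces to low indices
    return is_ace, moved_aces == 0

rng = random.Random(0)
ace_draws = no_ace_moved = 0
for _ in range(200_000):
    is_ace, clean = trial(rng)
    if is_ace:
        ace_draws += 1
        no_ace_moved += clean

print(no_ace_moved / ace_draws)   # estimate of P(no ace moved | ace drawn)
```

The conditioning is handled by simply discarding trials where the drawn card is not an ace.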

## Probability – Definition of an \$r\$-permutation

I am trying to understand the definition of an $$r$$-permutation. Suppose you have $$7$$ seats in a row and $$7$$ different people; there are $$7!$$ different ways to seat them.

Suppose I am trying to seat $$7$$ different people in $$9$$ seats; is that when I use the formula $$P(n, r) = \frac{n!}{(n-r)!}$$?

My thinking is that there are two identical (empty) seats, so the number of distinguishable permutations is $$9!/2!$$, which is exactly the previous formula. So my question is: is an $$r$$-permutation a way to count arrangements of $$r$$ objects in $$n$$ slots when $$n \ge r$$?

Follow-up, if you have time: if the seats formed a circle, we would divide by $$7$$. Why are seatings that differ only by a rotation considered equivalent?
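The formula can be checked by brute force for the numbers in the question (a small sketch using itertools; fine for this size of $$n$$ and $$r$$):

```python
from itertools import permutations
from math import factorial

n, r = 9, 7
# Count ordered ways to pick seats for 7 people among 9 seats.
count = sum(1 for _ in permutations(range(n), r))

print(count)                              # 181440
print(factorial(n) // factorial(n - r))   # P(n, r) = n!/(n-r)! gives the same
print(factorial(n) // factorial(2))       # 9!/2!: the two empty seats are
                                          # indistinguishable, same number again
```

All three counts agree, which matches the reasoning that the two empty seats act as identical objects.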

## Probability of measured value – MathOverflow

I sent a sample of material A to a laboratory to test for its content of another material B.

The laboratory reported that material A contains 0.26 mg of material B, and that the accuracy of the analysis is ±0.05 mg.

With the information given, can I calculate the probability that the content of material B in material A is greater than 0.3 mg?

PS Sorry for my English …
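With only a point estimate and a ± figure, no probability follows without a distributional assumption. One common reading (purely an assumption here) is that the measurement error is Gaussian and ±0.05 mg denotes one standard deviation; under that reading the tail probability is:

```python
from math import erf, sqrt

mean, sigma, threshold = 0.26, 0.05, 0.30   # assumption: +-0.05 mg = one std dev

# P(X > threshold) for X ~ N(mean, sigma^2), via the standard normal CDF
z = (threshold - mean) / sigma
p = 0.5 * (1 - erf(z / sqrt(2)))
print(p)   # about 0.21 under these assumptions
```

If ±0.05 mg instead denotes, say, a 95% confidence interval, the standard deviation would be roughly 0.0255 mg and the probability drops sharply; the distributional assumption does all the work.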

## combinatorial: probability that a randomly drawn subset of a randomly drawn subset has \$k\$ elements

Let $$A$$ be a uniformly random subset of $$\{1, 2, \ldots, n\}$$. Let $$B$$ be a random subset of $$A$$, also chosen uniformly. What is the probability that $$B$$ has $$k$$ elements?

My approach was as follows:

Note that the probability that a uniformly random subset of a size-$$N$$ set has $$m$$ elements is $$\frac{\binom{N}{m}}{2^N}$$, since there are $$2^N$$ possible subsets and $$\binom{N}{m}$$ ways to form a subset with $$m$$ elements. Then

$$\begin{align*} \operatorname{Pr}(B \text{ has } k) &= \sum_j \operatorname{Pr}(B \text{ has } k,\ A \text{ has } j) \\ &= \sum_j \operatorname{Pr}(B \text{ has } k \mid A \text{ has } j)\, \operatorname{Pr}(A \text{ has } j) \\ &= \sum_{j=0}^n \Big( \frac{\binom{j}{k}}{2^j} \Big) \Big( \frac{\binom{n}{j}}{2^n} \Big) \\ &= \frac{1}{2^n} \sum_{j=0}^n \frac{1}{2^j} \binom{j}{k} \binom{n}{j} \end{align*}$$

But the sum $$\sum_{j=0}^n \frac{1}{2^j} \binom{j}{k} \binom{n}{j}$$ comes out rather unpleasant, which makes me doubt my answer. What did I do wrong?
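For what it is worth, the sum may be less unpleasant than it looks: the identity $$\sum_j \binom{n}{j}\binom{j}{k} x^j = \binom{n}{k} x^k (1+x)^{n-k}$$ with $$x = 1/2$$ suggests the closed form $$\binom{n}{k}\, 3^{n-k} / 4^n$$. A brute-force numerical comparison (my own check, not from the question):

```python
from math import comb

def p_sum(n, k):
    # the sum derived in the question, including the 1/2^n prefactor
    return sum(comb(n, j) * comb(j, k) / 2**j for j in range(n + 1)) / 2**n

def p_closed(n, k):
    # candidate closed form: C(n, k) * 3^(n-k) / 4^n
    return comb(n, k) * 3**(n - k) / 4**n

for n in range(1, 8):
    for k in range(n + 1):
        assert abs(p_sum(n, k) - p_closed(n, k)) < 1e-12
print("sum matches the closed form for all tested n, k")
```

As a sanity check, $$n = 1, k = 0$$ gives $$3/4$$ both ways: $$B$$ is empty unless $$A = \{1\}$$ is drawn and then its nonempty subset is drawn.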

## probability: show that \$\Pr(g(X) \geq c) \leq \frac{E(g(X))}{c}\$

Let $$X$$ be a random variable, let $$g: \mathbb{R} \to \mathbb{R}$$ satisfy $$g(x) \geq 0 \; \forall x \in \mathbb{R}$$, and let $$c > 0$$. Then show that (as long as the expectation exists)
$$\Pr(g(X) \geq c) \leq \frac{E(g(X))}{c}$$

I will attempt the case when $$X$$ is continuous. Let $$f$$ be the PDF of $$X$$; then

$$E(g(X)) = \int_{-\infty}^{\infty} g(x) f(x)\,dx = \int_{g(x) \geq c} g(x) f(x)\,dx + \int_{g(x) < c} g(x) f(x)\,dx \geq \int_{g(x) \geq c} g(x) f(x)\,dx \geq c \int_{g(x) \geq c} f(x)\,dx = c \Pr(g(X) \geq c).$$

This proves the claim. Is this proof correct?
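As a quick numeric illustration of the inequality (not part of the proof), take $$X \sim \text{Exp}(1)$$, $$g(x) = x$$, and $$c = 2$$:

```python
import random

rng = random.Random(1)
c = 2.0
samples = [rng.expovariate(1.0) for _ in range(100_000)]   # X ~ Exp(1), E[X] = 1

p_hat = sum(x >= c for x in samples) / len(samples)        # estimate of P(X >= c)
bound = sum(samples) / len(samples) / c                    # estimate of E[X]/c

print(p_hat, bound)       # theory: P(X >= 2) = e^{-2}, bound E[X]/c = 1/2
assert p_hat <= bound     # the Markov bound holds on this sample
```

The bound is loose here, as Markov's inequality usually is, but it holds.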

## probability theory – Decomposition of mutual information

I found a book where the author uses the following property of mutual information:

Let $$X$$, $$Y$$, $$Z$$ be arbitrary discrete random variables and let $$W$$ be an indicator random variable.

$$(1)\quad I(X : Y \mid Z) = \Pr(W=0)\, I(X : Y \mid Z, W=0) + \Pr(W=1)\, I(X : Y \mid Z, W=1)$$

I do not understand why this property holds in general.
To show it, I was thinking of proceeding as follows:
$$\begin{align} I(X : Y \mid Z) &= E_z\big(I(X : Y \mid Z=z)\big) \\ &= E_w\big(E_z(I(X : Y \mid Z=z) \mid W=w)\big) \\ &= \Pr(W=0)\, E_z\big(I(X : Y \mid Z=z) \mid W=0\big) \\ &\quad + \Pr(W=1)\, E_z\big(I(X : Y \mid Z=z) \mid W=1\big), \end{align}$$
where the second line follows from the law of total expectation.
However, this does not seem to be the right approach since it is not clear to me that
$$E_z\big(I(X : Y \mid Z=z) \mid W=w\big) = I(X : Y \mid Z, W=w)$$
holds.

What is the correct way to show (1)?
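One way to probe whether (1) can hold for a completely arbitrary $$W$$ is a small numeric example. In the sketch below (my own construction, not from the book), $$Z$$ is constant and $$W = X = Y$$ is a fair coin: the left side is $$I(X;Y) = 1$$ bit while each conditional term on the right is $$0$$, which suggests the book is implicitly using extra structure, e.g. that $$W$$ is a function of $$Z$$.

```python
from math import log2

# Joint pmf of (x, y, w); Z is constant and omitted.
# X = Y = W = fair coin: only (0,0,0) and (1,1,1) occur.
pmf = {(0, 0, 0): 0.5, (1, 1, 1): 0.5}

def mutual_info(pairs):
    # I(X;Y) computed directly from a joint pmf over (x, y)
    px, py = {}, {}
    for (x, y), p in pairs.items():
        px[x] = px.get(x, 0) + p
        py[y] = py.get(y, 0) + p
    return sum(p * log2(p / (px[x] * py[y]))
               for (x, y), p in pairs.items() if p > 0)

# Left side of (1): I(X;Y | Z) = I(X;Y) since Z is constant.
lhs = mutual_info({(x, y): p for (x, y, w), p in pmf.items()})

# Right side of (1): sum over w of Pr(W=w) * I(X;Y | W=w).
rhs = 0.0
for w0 in (0, 1):
    pw = sum(p for (x, y, w), p in pmf.items() if w == w0)
    cond = {(x, y): p / pw for (x, y, w), p in pmf.items() if w == w0}
    rhs += pw * mutual_info(cond)

print(lhs, rhs)   # 1.0 and 0.0: the two sides differ for this choice of W
```

So if the book's claim is right, it must be relying on a relationship between $$W$$ and $$Z$$ that rules out examples like this one.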

## Probability: why is Polish space a standard measurable space?

A measurable space ($$\Omega$$, $$\mathcal{F}$$) is called a standard measurable space if it is Borel isomorphic to one of the following measurable spaces: ($$\langle 1, n \rangle$$, $$\mathcal{B}(\langle 1, n \rangle)$$), ($$\mathbb{N}$$, $$\mathcal{B}(\mathbb{N})$$), or ($$M$$, $$\mathcal{B}(M)$$), where $$\langle 1, n \rangle = \{1, 2, \ldots, n\}$$ with the discrete topology, $$\mathbb{N} = \{1, 2, \ldots\}$$ with the discrete topology, and $$M = \{0,1\}^{\mathbb{N}}$$ with the product topology.

Here Borel isomorphic means that there is a measurable bijection whose inverse is also measurable.

My question is: why is every Polish space a standard measurable space?

## probability theory: find the pdf of the sum of two independent squared normal variables using the pdf of a squared normal variable

I know that the pdf of $$X^2$$, where $$X \sim N(\mu, \sigma^2)$$, is $$f(t) = \frac{1}{\sqrt{2\pi}\,\sigma} \exp\left(-\frac{(\sqrt{t}-\mu)^2}{2\sigma^2}\right) \frac{1}{\sqrt{t}}$$. I want to find the pdf of $$X_1^2 + X_2^2$$ using $$f(t)$$. The pdf of a sum of independent variables is $$\int_{-\infty}^{\infty} f_{X_1}(s)\, f_{X_1}(t-s)\,ds$$.

Let $$X_1, X_2 \sim N(0, \sigma^2)$$.

$$f_{X_1}(s)\, f_{X_1}(t-s) = \frac{1}{2\pi\sigma^2} \exp\left(-\frac{t}{2\sigma^2}\right) \frac{1}{\sqrt{s(t-s)}}$$

Integrate this:

$$\frac{1}{2\pi\sigma^2} \exp\left(-\frac{t}{2\sigma^2}\right) \int_{-\infty}^{\infty} \frac{1}{\sqrt{s(t-s)}}\,ds$$

But the integral does not converge.

Where is my mistake?
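One relevant observation: since $$f$$ is the density of a nonnegative variable, it vanishes for negative arguments, so the integrand is supported only on $$0 \le s \le t$$. As a separate sanity check on what the answer should look like, here is a direct simulation of $$X_1^2 + X_2^2$$ with $$\sigma = 1$$, compared against the standard fact that the sum of two squared independent $$N(0, \sigma^2)$$ variables is exponential with mean $$2\sigma^2$$:

```python
import math
import random

rng = random.Random(0)
sigma = 1.0
n = 200_000

# Simulate S = X1^2 + X2^2 with X1, X2 ~ N(0, sigma^2) independent.
s = [rng.gauss(0, sigma)**2 + rng.gauss(0, sigma)**2 for _ in range(n)]

mean = sum(s) / n
tail = sum(v > 2.0 for v in s) / n

print(mean)                    # should be close to 2*sigma^2 = 2
print(tail, math.exp(-1.0))    # empirical P(S > 2) vs exp(-2 / (2*sigma^2))
```

Both statistics match the exponential target closely, so whatever the fix to the convolution is, it should produce that density.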

## probability theory – Reservoir sampling vs Round Robin

You are given a list of numbers (unknown length).

Let's say the length is 10.

GetRandom(List) is called once. If implemented correctly, each number has a 1/10 chance of being returned.

GetRandom(List) is called 100 times. If implemented correctly, each number will appear about 10 times in the result.

Fine so far?

Now you have to do the same for a stream of numbers.

GetRandom(Stream, 5) is called. This appends 5 to the stream. The stream now has length N = 1, so 5 is returned (probability = 1/N = 1).

GetRandom(Stream, 3) is called. 3 is appended to the stream. N = 2. Either 3 or 5 is returned (prob = 1/2).

How can we check that this is correct?
If GetRandom(Stream) (without adding more numbers) is called 10 times when the stream's length is 2, each number (3 and 5) must be returned ~5 times.

GetRandom(Stream, 7) is called. 7 is appended to the stream. N = 3. One of the 3 numbers (5, 3, 7) is returned (probability = 1/3).

But how can we check that this is correct?
If GetRandom(Stream) is called 10 times when N = 3, each number should be returned ~3 times.

So far so good?

Alright, here is my algorithm:

```python
N = 0        # how many numbers are in the stream so far
Pointer = 0  # 1-based index of the last number returned

def GetRandom(Stream, Number=None):
    global N, Pointer
    Pointer += 1

    if Number is not None:    # a new number arrives: append it
        Stream.append(Number)
        N += 1
    elif Pointer > N:         # walked past the last number
        Pointer = 1           # Reset

    return Stream[Pointer - 1]    # Stream is a 0-based Python list
```

This simply goes through all the numbers in order / round-robin.

If GetRandom(Stream) is called 1000 times on a stream with 100 numbers, each number will appear exactly 10 times.

Suppose GetRandom(Stream, 77) is called on a stream that already has 100 numbers (so 77 becomes number 101), and the Pointer is reset to the initial position 1. Then, when GetRandom(Stream) is called 101 times, 77 comes out on call 101, which satisfies the required probability of 1/101. If it is called 202 times, 77 comes out again on call 202, which satisfies 2/202.

So why bother with reservoir sampling, with its k/(k+1) replacement probabilities, etc.?
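For contrast, here is a minimal reservoir-sampling sketch ($$k = 1$$, Algorithm R style): each arriving item replaces the current pick with probability 1/N, so after N arrivals every item is the pick with probability 1/N, without storing the stream at all. Unlike the round-robin version, repeated draws are genuinely random rather than following a fixed, predictable order, and it needs O(1) memory instead of keeping every number around (the class name here is my own):

```python
import random

class ReservoirSampler:
    """Keep one uniformly random item from a stream using O(1) memory."""

    def __init__(self, rng=None):
        self.rng = rng or random.Random()
        self.n = 0
        self.pick = None

    def add(self, number):
        self.n += 1
        # Replace the current pick with probability 1/n; a short induction
        # shows every item seen so far remains the pick with probability 1/n.
        if self.rng.randrange(self.n) == 0:
            self.pick = number
        return self.pick

# Empirical check: over many independent runs of the stream 5, 3, 7,
# each number should be the final pick about a third of the time.
rng = random.Random(42)
counts = {5: 0, 3: 0, 7: 0}
for _ in range(30_000):
    sampler = ReservoirSampler(rng)
    for x in (5, 3, 7):
        sampler.add(x)
    counts[sampler.pick] += 1
print(counts)
```

The round-robin scheme passes frequency checks only because averages over many calls look uniform; any single call is completely deterministic, which is exactly what reservoir sampling avoids.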

## geometry – Probability that the product of distances is less than 1

Here is a probability problem that I modified. Essentially, you start with a circle of radius $$1$$ and throw a dart (we assume you are a terrible player, or blindfolded, so that we can neglect rotational symmetry, etc.). If the dart lands outside the circle, you lose. If it lands inside, the radius is divided by the distance between the target and the point where the dart landed.

Consider the first throw. The initial radius is $$r_0 = 1$$, and the dart lands at a point with coordinates $$(x_0, y_0)$$, where the origin is centered on the target. Then the next radius will be:
$$r_1 = \frac{1}{\sqrt{x_0^2 + y_0^2}}.$$

Then we throw the second dart, which lands at a point $$(x_1, y_1)$$. So, for this second dart to be inside the circle, we must have:
$$\sqrt{x_1^2 + y_1^2} < r_1.$$
Then the probability that the second dart lands inside the circle equals the probability that $$(x_0^2 + y_0^2)(x_1^2 + y_1^2) < 1$$ for the four numbers $$x_0, y_0, x_1, y_1 \in (0,1)$$. My question is: what is this probability?

My guess is that it should equal the probability that the product of two numbers drawn uniformly from $$(0, 2)$$ is less than 1. Is this correct?
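The proposed equivalence can be probed by simulation. The sketch below estimates both probabilities exactly as stated: four independent uniforms on $$(0,1)$$ for the darts, and two independent uniforms on $$(0,2)$$ for the guess. (It takes the question's $$(0,1)$$ formulation at face value rather than sampling uniformly from a disk.)

```python
import random

rng = random.Random(0)
n = 200_000

# P((x0^2 + y0^2)(x1^2 + y1^2) < 1), all four coordinates uniform on (0, 1)
hits = sum(
    (rng.random()**2 + rng.random()**2) * (rng.random()**2 + rng.random()**2) < 1
    for _ in range(n)
)
p_darts = hits / n

# P(A * B < 1) with A, B independent uniform on (0, 2)
p_guess = sum(2 * rng.random() * 2 * rng.random() < 1 for _ in range(n)) / n

print(p_darts, p_guess)
```

The second probability has the exact value $$(1 + \ln 4)/4 \approx 0.5966$$, and the simulation shows the two estimates are not close to each other, so the proposed equivalence looks doubtful.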