## reference request – de Finetti-style theorem for point processes

I am new to point processes. I know there are several theorems along the lines that if a point process $$\eta$$ satisfies:

1. Complete independence (the random variables $$\eta(B_1), \ldots, \eta(B_n)$$ are independent for bounded disjoint measurable sets $$B_1, \ldots, B_n$$); and

2. Some regularity conditions, such as being simple and uniformly $$\sigma$$-finite on a Borel subset of a complete separable metric space,

then $$\eta$$ is a Poisson process.

It seems that something like the following should be known. I think (I haven't worked out all the details) that if $$\eta$$ satisfies:

1. (a) A condition that could be called "complete exchangeability": for any bounded disjoint measurable $$B_1, \ldots, B_n$$, there exist $$A_i \subseteq B_i$$, with equality for at least one $$i$$, such that $$\eta(A_1), \ldots, \eta(A_n)$$ are exchangeable random variables;

or the equivalent

1. (b) For any bounded disjoint measurable $$B_1, \ldots, B_n$$ with $$\mathbb{E}\eta(B_1) = \ldots = \mathbb{E}\eta(B_n)$$, the variables $$\eta(B_1), \ldots, \eta(B_n)$$ are exchangeable;

as well as

2. Similar regularity conditions, including $$\mathbb{E}\eta(B) < \infty$$ for all bounded measurable $$B$$; and
3. $$\mathbb{E}\eta(\text{whole space}) = \infty$$,

then there is a non-negative scalar random variable $$G$$ with $$\mathbb{E}G = 1$$ such that, conditioned on $$G$$, $$\eta$$ is a Poisson process with intensity measure $$G\,\mathbb{E}\eta$$.

Without (3) there are simple counterexamples, e.g. $$\eta = \delta_x$$ where $$x$$ is distributed according to a non-atomic probability distribution.
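The conjectured conclusion is easy to illustrate numerically. Below is a minimal sketch of such a mixed (Cox) Poisson process on an interval; the choice $$G \sim \mathrm{Exp}(1)$$ (so that $$\mathbb{E}G = 1$$) and the base rate are illustrative assumptions of mine, not fixed by the question.

```python
import numpy as np

rng = np.random.default_rng(0)

def sample_mixed_poisson(base_rate, T, rng):
    """One realization of a mixed (Cox) Poisson process on [0, T]."""
    G = rng.exponential(1.0)            # non-negative scalar mixer, E[G] = 1
    n = rng.poisson(G * base_rate * T)  # conditioned on G, the total count is Poisson
    return G, np.sort(rng.uniform(0.0, T, size=n))  # given the count, points are i.i.d. uniform

G, points = sample_mixed_poisson(base_rate=2.0, T=10.0, rng=rng)
print(G, points.size)
```

Conditioned on $$G$$, counts in disjoint sets of equal measure are i.i.d. Poisson, hence exchangeable, which is the content of condition 1(b).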

Could anyone provide a reference for such a de Finetti theorem for point processes?

## stochastic processes: variance of a random variable obtained from a linear transformation

Edit: I have revised this question as suggested.

Suppose there are $$N$$ realizations of a Gaussian process, denoted as vectors $$\mathbf{z}_j \in \mathbb{R}^n$$ for $$j = 1, \ldots, N$$. Let $$y$$ be a random variable such that
$$y = \sum_{j=1}^{N} (\mathbf{B}\mathbf{z}_j)(i),$$
where $$\mathbf{B}$$ is a unitary matrix. What is the variance of $$y$$?

Clarification: boldface denotes a vector or matrix. $$(\mathbf{B}\mathbf{x})(i)$$ denotes the $$i$$-th entry of the vector $$\mathbf{B}\mathbf{x}$$.
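If one assumes (the question does not say) that the $$\mathbf{z}_j$$ are i.i.d. with zero mean and covariance $$\boldsymbol{\Sigma}$$, independence across $$j$$ gives $$\operatorname{Var}(y) = N(\mathbf{B}\boldsymbol{\Sigma}\mathbf{B}^\top)_{ii}$$. A quick Monte-Carlo check of that formula:

```python
import numpy as np

rng = np.random.default_rng(1)

# Assumed setup (not stated in the question): z_j i.i.d. N(0, Sigma) with
# zero mean, B real orthogonal ("unitary"), and a fixed entry index i.
n, N, i = 4, 3, 2
A = rng.standard_normal((n, n))
B, _ = np.linalg.qr(A)               # a random orthogonal matrix
Sigma = A @ A.T + n * np.eye(n)      # a positive-definite covariance
L = np.linalg.cholesky(Sigma)

# Monte-Carlo estimate of Var(y) for y = sum_j (B z_j)(i)
samples = 200_000
Z = L @ rng.standard_normal((n, N * samples))    # all z-draws at once, as columns
Y = (B @ Z)[i].reshape(samples, N).sum(axis=1)   # one y per sample

empirical = Y.var()
theory = N * (B @ Sigma @ B.T)[i, i]  # independence adds the per-term variances
print(empirical, theory)              # the two should be close
```

Note that orthogonality of $$\mathbf{B}$$ is not actually needed for this identity; it holds for any fixed matrix.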

## performance: kernel-based conditional increments from multiple stochastic processes

I have written this function as part of a research project that involves analyzing time-series data from stochastic processes. We have a small number (from 1 to 3) of independent observations of a scalar time series. The observations have different lengths, and each contains approximately $$10^4$$ to $$10^5$$ data points. The function below, `nKBR_moments.m`, takes a cell array of observations as input, along with other settings, and computes statistical quantities known as "conditional moments of the increments". These are the variables `M1` and `M2`. For more details of the theory, this research paper describes a similar method.

For research purposes, the function will eventually be evaluated tens of thousands of times on a desktop computer. One evaluation of this function takes approximately 3 seconds with the test script I have provided below. Thoughts on optimizing the code's performance, memory usage, or scalability are appreciated.
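As orientation for reviewers less familiar with the method: the core computation is a Nadaraya-Watson estimate of the conditional moments of increments. A simplified single-series Python reformulation of mine (not the code under review, which adds binning and multiple data sets):

```python
import numpy as np

def nw_increment_moments(x, tau, centres, h):
    """Nadaraya-Watson conditional moments of x[t+tau]-x[t], given x[t] near each centre."""
    x0 = x[:-tau]                     # conditioning values
    dx = x[tau:] - x[:-tau]           # increments
    # Epanechnikov kernel with Silverman's sqrt(5) scaling; the overall
    # normalisation constants cancel in the Nadaraya-Watson ratio.
    u = (centres[:, None] - x0[None, :]) / (h * np.sqrt(5))
    K = np.where(u**2 <= 1, 0.75 * (1 - u**2), 0.0)
    w = K / K.sum(axis=1, keepdims=True)               # weights per centre
    M1 = w @ dx                                        # first conditional moment
    M2 = (w * (dx[None, :] - M1[:, None])**2).sum(axis=1)  # centred second moment
    return M1, M2

# Tiny usage example on synthetic data:
rng = np.random.default_rng(4)
x = np.cumsum(rng.standard_normal(10_000)) * 0.01
M1, M2 = nw_increment_moments(x, tau=5, centres=np.linspace(-1, 1, 11), h=0.5)
```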

MATLAB function:

``````function [Xcentre,M1,M2] = nKBR_moments(X,tau_in,Npoints,xLims,h)
%Kernel based moments, n-data
%
%   Notes:
%   Calculates kernel based moments for a given stochastic time-series.
%   Uses Epanechnikov kernel with built in computational advantages. Uses
%   Nadaraya-Watson estimator. Calculates moments from n sources of data.
%
%
%   Inputs:
%   - "X"                       Observed variables, cell array of data
%   - "tau_in"                  Time-shift indexes
%   - "Npoints"                 Number of evaluation points
%   - "xLims"                   Limits in upper and lower evaluation points
%   - "h"                       Bandwidth
%
%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%
%% Processing
dX = (xLims(2)-xLims(1))/(Npoints-1); % Bins increment
Xcentre = xLims(1):dX:xLims(2); % Grid
heff = h*sqrt(5); % Effective bandwidth, for setting up bins
eta = floor(heff/dX+0.5); % Bandwidth for bins optimizing

% Epanechnikov kernel
K= @(u) 0*(u.^2>1)+3/4*(1-u.^2).*(u.^2<=1);
Ks = @(u) K(u/sqrt(5))/sqrt(5); % Silverman's definition of the kernel (Silverman, 1986)
Kh = @(u) Ks(u/h)/h; % Changing bandwidth

% Sort all data into bins
Bextend = dX*(eta+0.5); % Extend bins
edges = xLims(1)-Bextend:dX:xLims(2)+Bextend; % Edges
ndata = numel(X); % Number of data-sets
Xloc = cell(1,ndata); % Preallocate histogram location data
nXdata = cellfun(@numel,X); % Number of x data
key = 1:max(nXdata); % Data key
for nd = 1:ndata
[~,~,Xloc{nd}] = histcounts(X{nd},edges); % Sort
end
Xbinloc = eta+(1:Npoints); % Bin locations
BinBeg = Xbinloc-eta; % Bin beginnings
BinEnd = Xbinloc+eta; % Bin endings

% Preallocate
Ntau = numel(tau_in); % Number of time-steps
[M1,M2] = deal(zeros(Ntau,Npoints)); % Moments
[iX,iXkey,XU,Khj,yinc,Khjt] = deal(cell(1,ndata)); % Preallocate increment data

% Pre calculate increments
inc = cell(Ntau,ndata);
for nd = 1:ndata
poss_tau_ind = 1:nXdata(nd); % Possible time-shifts
for tt = 1:Ntau
tau_c = tau_in(tt); % Chosen shift
tau_ind = poss_tau_ind(1+tau_c:end); % Chosen indices
inc{tt,nd} = X{nd}(tau_ind) - X{nd}(tau_ind - tau_c);
end
end

% Loop over evaluation points
for ii = 1:Npoints

% Start and end bins
kBinBeg = BinBeg(ii);
kBinEnd = BinEnd(ii);

% Data and weights
for nd = 1:ndata
iX{nd} = and(kBinBeg<=Xloc{nd},Xloc{nd}<=kBinEnd); % Data in bins
iXkey{nd} = key(iX{nd}); % Data key
XU{nd} = X{nd}(iX{nd}); % Unshifted data
Khj{nd} = Kh(Xcentre(ii)-XU{nd}); % Weights
end

% For each shift
for tt = 1:Ntau
tau_c = tau_in(tt); % Chosen shift

% Get data
for nd = 1:ndata
XUin = iXkey{nd}; % Unshifted data indices
XUin(XUin>nXdata(nd)-tau_c) = []; % Clip overflow
yinc{nd} = inc{tt,nd}(XUin); % Increments
Khjt{nd} = Khj{nd}(1:numel(yinc{nd})); % Clipped weight vector
end

% Concatenate data
ytt = [yinc{:}];
Khjtt = [Khjt{:}];

% Increments and moments
sumKhjtt = sum(Khjtt);
M1(tt,ii) = sum(Khjtt.*ytt)/sumKhjtt;

y2 = (ytt - M1(tt,ii)).^2; % Squared (with correction)
M2(tt,ii) = sum(Khjtt.*y2)/sumKhjtt;
end
end
end
``````

MATLAB test script (no review comments needed for this part):

``````%% nKBR_testing
clearvars,close all

%% Parameters

% Simulation settings
n_sims = 10; % Number of simulations
dt = 0.001; % Time-step
tend1 = 40; % Time-end, process 1
tend2 = 36; % Time-end, process 2
x0 = 0; % Start position
eta = 0; % Mean
D = 1; % Noise amplitude
gamma = 1; % Drift slope

% Analysis settings
tau_in = 1:60; % Time-shift indexes
Npoints = 50; % Number of evaluation points
xLims = [-1,1]; % Limits of evaluation
h = 0.5; % Kernel bandwidth

%% Simulating
t1 = 0:dt:tend1;
t2 = 0:dt:tend2;

% Realize an Ornstein Uhlenbeck process
rng('default')
ex1 = exp(-gamma*t1);
ex2 = exp(-gamma*t2);
x1 = x0*ex1 + eta*(1-ex1) + sqrt(D)*ex1.*cumsum(exp(gamma*t1).*[0,sqrt(2*dt)*randn(1,numel(t1)-1)]);
x2 = x0*ex2 + eta*(1-ex2) + sqrt(D)*ex2.*cumsum(exp(gamma*t2).*[0,sqrt(2*dt)*randn(1,numel(t2)-1)]);

%% Calculating and timing moments

tic
for ns = 1:n_sims
[~,M1,M2] = nKBR_moments({x1,x2},tau_in,Npoints,xLims,h);
end
nKBR_moments_time = toc;
nKBR_average_time = nKBR_moments_time/n_sims

%% Plotting

figure
hold on,box on
plot(t1,x1)
plot(t2,x2)
xlabel('Time')
ylabel('Amplitude')
title('Two Ornstein-Uhlenbeck processes')

figure
subplot(1,2,1)
box on
plot(dt*tau_in,M1,'k')
xlabel('Time-shift, tau')
title('M^{(1)}')
subplot(1,2,2)
box on
plot(dt*tau_in,M2,'k')
xlabel('Time-shift, tau')
title('M^{(2)}')
``````

The test script will create two figures similar to the following.

## Probability: stochastic processes and continuity of expectations

Let $$X$$ be a continuous stochastic process on $$(0, 1)$$ such that $$\mathbb{E}(X_t)$$ is finite for every $$t \in (0, 1)$$. Given any non-null subset $$Y$$ of the probability space, define $$\mathbb{Q}_Y$$ to be the restricted probability measure $$\mathbb{Q}_Y(E) = P(E \cap Y)/P(Y)$$.

Does there always exist some non-null $$Y$$ such that the function $$f: (0, 1) \to \mathbb{R}$$ defined by $$f(t) = \mathbb{E}_{\mathbb{Q}_Y}(X_t)$$ is continuous almost everywhere?

## Stochastic processes: the probability distribution of the "derivative" of a random variable

Disclaimer: cross-posted on math.SE.

Let's set the stage:

Consider a stochastic PDE of the form

$$\partial_t h(x, t) = H(x, t) + \chi(x, t),$$
where $$H$$ is a deterministic function and $$\chi(x, t)$$ is a random variable.

In my case, the approximate solution of this SPDE is known (from experiments and numerical simulations):

$$h(x, t) \approx G(x, t) + \epsilon(x, t),$$
where $$\epsilon$$ is a stochastic variable.

Of course, the solution $$h$$ is not differentiable in the usual sense, but if the underlying distribution of $$\epsilon$$ is symmetric, like a Gaussian, then by observing $$\partial_t h$$ long enough, the deviations of $$h$$ from $$G$$ will cancel out, so that $$G$$ can be determined quite accurately, experimentally or numerically.

However, this requires us to know the distribution of the changes in the values of $$\epsilon$$.

In this sense, this amounts to "taking the derivative" of $$\epsilon$$.

A brief analysis suggested to me that, if the underlying probability distribution of $$\epsilon$$ is $$g$$ (suppose $$\epsilon$$ is a function of $$t$$ only, for the sake of argument), then

the probability that a change of $$z-w$$ occurs is $$g(z)g(w)$$: if the value of $$\epsilon$$ at time $$t$$ is $$z$$, then at $$t + dt$$ the probability that $$\epsilon(t + dt) = w$$ is $$g(w)$$; therefore, since the probability that $$\epsilon(t) = z$$ in the first place is $$g(z)$$, the probability that (warning: abuse of notation) $$d\epsilon = z-w$$ is $$g(w)g(z)$$ (of course, this needs some normalization, but that is irrelevant to what I want to ask here).

For example, if my analysis is correct, the "derivative" of a Gaussian random variable remains Gaussian.
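A quick numerical sanity check of that last claim, under the assumption (implicit in the $$g(z)g(w)$$ factorisation) that the two successive values are independent: the difference of two independent $$\mathcal{N}(0, s^2)$$ variables is again Gaussian, with variance $$2s^2$$.

```python
import numpy as np

rng = np.random.default_rng(2)

# Assumption: epsilon(t) and epsilon(t + dt) independent, both N(0, s^2).
s = 1.5
z = rng.normal(0.0, s, 500_000)    # epsilon(t)
w = rng.normal(0.0, s, 500_000)    # epsilon(t + dt)
d = w - z                          # the increment "d epsilon"

kurt = np.mean((d / d.std())**4)   # 4th standardised moment; equals 3 for a Gaussian
print(d.mean(), d.var(), kurt)     # ~0, ~2*s**2 = 4.5, ~3
```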

Question:

Considering how "elementary" this idea seemed to me, I wonder: is there a theory that captures calculus on such random variables? I would also like to integrate random variables (although I have not thought about what that would mean physically or intuitively). I am looking for references/papers that deal with this kind of theory; not just "differentiating" a random variable in some abstract sense, but exactly the above way of thinking in the theory.

I mean, I am aware of the existence of Ito calculus and Malliavin calculus, but every time I tried to learn what they are, or what the underlying idea is (such as what differentiating means physically in these theories), people would start throwing around terminology I don't know. Don't get me wrong, I'm also a math student, but in math, doing theory without giving any motivation or basic idea is so common, and I hate it so much, that I don't read math books anymore.

## Stochastic processes: if they exist, are the limits of "almost sure convergence" and "convergence in mean" the same for a sequence of random variables?

I have a sequence of random variables $$(X_n)_{n \in \mathbb{N}_0}$$ that converges both "almost surely" and "in mean", to random variables $$A$$ and $$B$$ respectively:
$$P(\lim X_n = A) = 1 \text{ (almost sure convergence)} \\ \mathbb{E}(|X_n - B|) \rightarrow 0 \text{ (convergence in mean)}$$

My guess would be that $$A = B$$ almost surely, but I could not find a proof.

Do you know any results about it?

What additional properties does $$(X_n)_n$$ need to have so that $$A = B$$ almost surely?

Do you know any example in which $$P(A \neq B) > 0$$?
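For what it is worth, the guess appears to follow with no extra assumptions, since both modes of convergence imply convergence in probability, and limits in probability are almost surely unique. A sketch:

```latex
% Almost sure convergence implies convergence in probability:
X_n \xrightarrow{\text{a.s.}} A \implies X_n \xrightarrow{P} A;
% convergence in mean implies convergence in probability, by Markov's inequality:
P(|X_n - B| > \varepsilon) \le \frac{\mathbb{E}|X_n - B|}{\varepsilon} \longrightarrow 0;
% hence, for every fixed \varepsilon > 0,
P(|A - B| > 2\varepsilon) \le P(|X_n - A| > \varepsilon) + P(|X_n - B| > \varepsilon) \longrightarrow 0,
% and letting \varepsilon \downarrow 0 gives P(A \neq B) = 0, i.e. A = B almost surely.
```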

## linux – graphics card error (nvidia-smi prints "ERR!" in Fan and Usage), the processes are not killed, and the GPU is not reset

I have a problem using a GPU on an Ubuntu server (nvidia-smi prints "ERR!" in the Fan and Usage columns, the GPU does not reset, and the processes are not killed).

When I searched Google for the problem, I found that I could either reset the GPU or kill the processes using that GPU.

• So I tried to kill the processes using that GPU with `sudo kill -9 pid`, but it does not work!!
I searched Google for what to do when `sudo kill` does not work.

I thought they might be zombie processes, so I looked for zombie processes and removed them.

After that, no more zombie processes turned up when I searched.

But those three processes still have not been killed.
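One check that distinguishes the two cases is the process state column of `ps` (shown here on the current shell's own PID `$$` as a runnable stand-in; substitute the stuck process's PID):

```shell
# STAT "Z" marks a zombie (already dead; only its parent can reap it), while
# STAT "D" marks uninterruptible sleep (ignores SIGKILL, typically stuck on
# driver or device I/O, which fits a wedged GPU).
ps -o pid=,stat=,comm= -p $$
```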

• I tried resetting the GPU with `nvidia-smi --gpu-reset -i 0`, BUT it prints "GPU Reset could not be executed because GPU 00000000:01:00.0 is the main GPU."
Since other people are using the other GPUs on that server, I just want to reset the index-0 GPU.

I searched for this problem on Google too, and the answer was to kill the processes running on that GPU. So I am back to the first problem!

(I'm not used to working with GPUs; is it not a good idea to run several codes on the same GPU?)

## Processes: background applications are killed (by something other than battery optimization)

All background applications are killed when I turn off the screen.
This is particularly annoying for my most-used messaging application, WhatsApp.

I have checked the obvious "Battery optimization" option:

Following advice found elsewhere, I have verified that the background process limit in the developer options is at its default (I have never changed this, AFAIK):

And yet, apps die just as battery optimization would kill them. If I look at the running background processes:

When I then press the power button and turn the screen on again within <10 s, the list is the same. But if I wait longer than that:

… WhatsApp has disappeared.

This happens for every application I install that has some kind of useful background feature: the battery optimization setting no longer seems to do its job.

What else can I try to diagnose this?

NB0: Android 7.0 on a DOOGEE S60

NB1: This only started happening some time ago, i.e. everything worked correctly when I got the phone.

NB2: I know how to use `adb`

NB3: I also like to try new applications and have ~400 apps installed. Ideally I would not uninstall/disable every one of them 1 by 1 to find the culprit …

## Postgresql 10 backend processes – Database Administrators Stack Exchange

I have a particular problem with Postgresql 10:
Postgresql creates back-end processes for each connection in different ports. The problem is that some of our applications also use many ports and sometimes collapse with pgsql, which results in a port link error. One solution is to start the applications first and then the postgres service, but I am looking for a cleaner solution. Do you know if there is a way to specify a range of ports used by postgresql for backend processes? I have not found something built in postgres so far.
## Gaussian processes: mean distance between up- and down-crossings of a level

Given a Gaussian process $$g := \mathcal{GP}(\mu, \Sigma)$$, where $$\mu$$ is the mean and $$\Sigma$$ is the covariance function, I am interested in estimating the average value $$L_m$$ of the distances between up-crossings and down-crossings of a constant level $$u$$, that is, these distances:

In this plot, I use $$u = 0$$, but ideally I would like $$u$$ to be generic. I suspect this is related to Rice's formula, which gives the expected number of up-crossings for a given Gaussian process over a domain of given length, but I do not know how to proceed.
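In the absence of an analytic answer, a brute-force Monte-Carlo sketch is at least possible. The smooth stationary Gaussian process $$X(t) = A\cos t + B\sin t$$ with $$A, B \sim \mathcal{N}(0, 1)$$ i.i.d. is an illustrative choice of mine: for $$u = 0$$ every excursion above the level lasts exactly half a period, $$\pi$$, so the estimate can be checked.

```python
import numpy as np

rng = np.random.default_rng(3)

def mean_up_down_distance(u=0.0, n_paths=200, dt=0.005, T=50.0):
    """Average distance from an up-crossing of level u to the next down-crossing."""
    t = np.arange(0.0, T, dt)
    lengths = []
    for _ in range(n_paths):
        a, b = rng.standard_normal(2)
        x = a * np.cos(t) + b * np.sin(t)                # one smooth Gaussian path
        above = x > u
        ups = np.flatnonzero(~above[:-1] & above[1:])    # grid index just before each up-crossing
        downs = np.flatnonzero(above[:-1] & ~above[1:])  # grid index just before each down-crossing
        if ups.size == 0:
            continue                                     # this path never rises above u
        downs = downs[downs > ups[0]]                    # crossings alternate, so this pairs them
        m = min(ups.size, downs.size)
        lengths.extend(dt * (downs[:m] - ups[:m]))
    return float(np.mean(lengths))

L_est = mean_up_down_distance()
print(L_est)   # close to pi for u = 0 with this particular process
```

For a general stationary process one would swap in a path generator matching the given covariance $$\Sigma$$ (e.g. spectral simulation); Rice's formula then predicts the rate of up-crossings, and the mean excursion length is the fraction of time spent above $$u$$ divided by that rate.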