Statistics – Formula for weighted average difference

In my application I have cases where $m_1 \ge 1$ and $m_2 \ge 1$ models have produced (after some other calculations) activity values $a_1, a_2 \in (0,1)$ respectively ($0$ means inactive, $1$ means fully activated); the $m_1$ models represent the "good" ones and the $m_2$ models the "bad" ones, so to speak. I always take the difference: $$d = a_1 - a_2 \in (-1,1)$$ where values closer to $-1$ indicate that, overall, the good models are more deactivated than the bad ones of that grouping ($m_1$ vs $m_2$ models), values closer to $d = 1$ indicate that they are more activated, and values closer to $0$ indicate that there is no activation difference between the two groups of models.

That works fine, but I wanted to take the number of models $m_1, m_2$ into account to counteract the bias that gets introduced, as you can see in the following example: $a_1 = 0.9, a_2 = 0.1, d = 0.8$, with $m_1 = 5, m_2 = 1000$. Here I would like to get a smaller difference than $0.8$, since $m_1 \ll m_2$. So I computed a (sort of) weighted average as: $$d_{w} = \frac{m_1 a_1 - m_2 a_2}{m_1 + m_2}$$
The problem is that the penalty is now far too large, e.g. for the previous example $d_w = -0.095$, which is wrong, since I would never expect it to drop below zero in this case.

So I wanted to reduce the penalty by bringing the weights closer together, and what better way to do exactly that than $\log_{10}$: $$d_{lw} = \frac{\log(m_1)\,a_1 - \log(m_2)\,a_2}{\log(m_1) + \log(m_2)}$$

Now the previous example gives $d_{lw} = 0.0889$, which is much more sensible! The good models are more active, but since I only have $5$ of them vs $1000$ bad ones, the estimate of the activity difference is penalized.

For $m_1 = 1, 2$ models, though, $d_{lw}$ comes out negative in the example! And if we set $m_1 = m_2$ I would expect the result to equal the original difference, $d_{lw} = d$, but it does not 🙁 I tried doubling the last difference, $d_2 = 2 \cdot d_{lw}$, which fixes this, but then of course $d_2 \in (-2,2)$, which I don't want.
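For concreteness, here is a minimal Python sketch (illustration only) that reproduces the numbers above and the $m_1 = m_2$ issue:

    import math

    # Example from above: 5 "good" models vs 1000 "bad" ones.
    a1, a2 = 0.9, 0.1
    m1, m2 = 5, 1000

    d = a1 - a2                                    # plain difference: 0.8
    d_w = (m1 * a1 - m2 * a2) / (m1 + m2)          # weighted: about -0.095 (too harsh)
    d_lw = (math.log10(m1) * a1 - math.log10(m2) * a2) / (math.log10(m1) + math.log10(m2))
    print(d, d_w, d_lw)                            # 0.8  -0.0950...  0.0889...

    # With equal model counts the common log(m) factor cancels, leaving (a1 - a2) / 2,
    # which is the "d_lw != d when m1 == m2" problem described above:
    m = 50
    d_lw_equal = (math.log10(m) * (a1 - a2)) / (2 * math.log10(m))
    print(d_lw_equal)                              # 0.4 == d / 2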

I'm looking for a difference function $f(m_1, m_2, a_1, a_2) \in (-1,1)$ with the following properties:

  • $f(m, m, a_1, a_2) = a_1 - a_2 = d$
  • $\lim_{m_1 \ll m_2} f(m_1, m_2, a_1, a_2) = 0$ (as in the previous example), and likewise for $m_1 \gg m_2$.
  • The transition from $m_1 = m_2$ to the extreme cases (where the model numbers differ a lot) should not be steep. I don't know how to express this in a single word, but what I mean is that as the model numbers start to move away from equality $m_1 = m_2$, the difference $d$ should be hard to change, and only near the extremes should a notable change appear. I have also called this last property the "mountain" property (it would be interesting to see what it corresponds to in mathematical terms), since equality is like the top of the mountain and to its right and left are the slopes, which in my case I want to be gentle, that is, not steep.

Go through an array inside an array, then get the average – Laravel PHP

I have the following array

array:3 [
  0 => array:1 [
    "value" => 1
  ]
  1 => array:1 [
    "value" => 1
  ]
  2 => array:1 [
    "value" => 1
  ]
]

To compute the average according to my logic, I have to go through the array and pull out each of the elements. Is that correct?

In this case I tried with for and foreach, but apparently I'm doing something wrong.
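For what it's worth, the computation itself is just: walk the outer array, collect each nested value, and divide the sum by the count. A minimal sketch of that idea (written in Python purely to illustrate the loop; the question is about PHP/Laravel, so none of the names below are Laravel helpers):

    # The structure from the question, rebuilt as a Python list of dicts purely for illustration.
    rows = [{"value": 1}, {"value": 1}, {"value": 1}]

    total = 0
    for row in rows:              # go through the outer array
        total += row["value"]     # take the "value" out of each inner array
    average = total / len(rows)   # divide the sum by the number of elements
    print(average)                # 1.0 for the example data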

Python: replace certain values with the average in a pandas DataFrame

Hi

I have a DataFrame as shown in the image below. I would like to replace the NaN values of the QTDVENDADIARIA column with the average of the two previous records ((40 + 27) / 2), and likewise for the PRECO column ((2.38 + 2.38) / 2); in the EMISSAO column, where the value is NaT, I want to use the date from the SEQ_DATA column. The same goes for the next NaN value of the QTDVENDADIARIA column ((68 + 54) / 2) and of PRECO ((2.38 + 2.38) / 2). I tried to do it as follows:

df.fillna(df.mean(),axis=1)

However, this way it replaces the missing values with the average of the entire column, and it does not put anything in the date column.

[image: screenshot of the DataFrame]
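One possible direction (a sketch only, assuming the column names described in the question and that the missing values are isolated) is to build the mean of the two previous rows with a shifted rolling window, use it only where the original value is missing, and copy the SEQ_DATA date into EMISSAO where it is NaT. On a small stand-in frame with made-up rows:

    import pandas as pd

    # Stand-in for the DataFrame in the question (column names taken from the text,
    # row values invented for illustration).
    df = pd.DataFrame({
        "SEQ_DATA": pd.to_datetime(["2020-01-01", "2020-01-02", "2020-01-03", "2020-01-04"]),
        "EMISSAO": pd.to_datetime(["2020-01-01", "2020-01-02", None, "2020-01-04"]),
        "QTDVENDADIARIA": [40.0, 27.0, None, 68.0],
        "PRECO": [2.38, 2.38, None, 2.38],
    })

    # Mean of the two previous rows, aligned with the current row, used only where NaN.
    for col in ["QTDVENDADIARIA", "PRECO"]:
        prev_two_mean = df[col].rolling(window=2, min_periods=2).mean().shift(1)
        df[col] = df[col].fillna(prev_two_mean)

    # Where EMISSAO is NaT, take the date from SEQ_DATA.
    df["EMISSAO"] = df["EMISSAO"].fillna(df["SEQ_DATA"])
    print(df)

Two consecutive NaN values would need an extra pass (or interpolation), since the rolling mean itself would then be missing.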

What is the average decay rate of the population of St. Louis, Missouri?

Using the information here
http://worldpopulationreview.com/us-cities/st-louis-population/
I am trying to find the average decay rate. I'm just having trouble doing it, and it is taking me forever. If someone has a faster way to calculate this, could you tell me what the rate is?

Average CPC

Hello guys!

I just wonder what everyone's average CPC is. Without trying to steal anyone's idea, I would like to get a sense of what the average is for some AdSense users.

For me, my average CPC is about $2, and I am earning around ~$60/day. So what is your CPC?

exchange rate: where can I find the average monthly price published?

I am looking for simple raw data like:

2020-01,8150,USD
2019-12,7485,USD

It's fine if there is a user interface wrapped around it. It doesn't have to be raw.

I am aware that I could calculate it myself using daily historical data, but I hope to find a published source, thanks.

plotting – How to plot a time-dependent parametric diagram against a varying parameter, taking the time average?

I am trying to plot the solution given in the code with respect to "delc". The problem is that it can be plotted for a particular value of "t", such as t = 10, 20, 50, 60, ... up to 100, but what I need is the average over all of these frames. Is it possible to plot that, adjusting the axes, so that I get a single graph in which time also varies up to 100?

 w1 = 1;
 gma1 = 0.005;
 n1 = 1;
 gma2 = 0.005;
 G1 = 0.005;
 k1 = .1;
 k2 = 0.1;
 a1 = 0.07;
 a2 = 0.58;
 k0 = 0.1;
 Q1 = 1.268;
 del0 = 1;
 N1 = 1;
 ome = 1;
 M1 = del0*(1 - Cos[ome*t]);
 s = ParametricNDSolveValue[{V11'[t] - V21[t]*w1 - V12[t]*w1 == 0, 
   V12'[t] - V22[t]*w1 + w1*V11[t] + gma1*V12[t] - 
     Sqrt[2]*G1*a1*V13[t] - Sqrt[2]*G1*a2*V14[t] == 0, 
   V13'[t] - V23[t]*w1 + k1*V13[t] + 
     Sqrt[2]*G1*a2*V11[t] - (-G1*Q1 + delc)*V14[t] == 0, 
   V14'[t] - V24[t]*w1 + k1*V14[t] - 
     Sqrt[2]*G1*a1*V11[t] - (G1*Q1 - delc)*V13[t] == 0, 
   V21'[t] + V11[t]*w1 + gma1*V21[t] - Sqrt[2]*G1*a1*V31[t] - 
     Sqrt[2]*G1*a2*V41[t] - w1*V22[t] == 0, 
   V22'[t] + V12[t]*w1 + gma1*V22[t] - Sqrt[2]*G1*a1*V32[t] - 
     Sqrt[2]*G1*a2*V42[t] + w1*V21[t] + gma1*V22[t] - 
     Sqrt[2]*G1*a1*V23[t] - Sqrt[2]*G1*a2*V24[t] - gma2*(2*n1 + 1) == 0, 
   V23'[t] + w1*V13[t] + gma1*V23[t] - Sqrt[2]*G1*a1*V33[t] - 
     Sqrt[2]*G1*a2*V43[t] + k1*V23[t] + 
     Sqrt[2]*G1*a2*V21[t] - (-G1*Q1 + delc)*V24[t] == 0,
   V24'[t] + V14[t]*w1 + gma1*V24[t] - Sqrt[2]*G1*a1*V34[t] - 
     Sqrt[2]*G1*a2*V44[t] + k1*V24[t] - 
     Sqrt[2]*G1*a1*V21[t] - (G1*Q1 - delc)*V23[t] == 0, 
   V31'[t] + k1*V31[t] + 
     Sqrt[2]*G1*a2*V11[t] - (-G1*Q1 + delc)*V41[t] - w1*V32[t] == 0, 
   V32'[t] + k1*V32[t] + 
     Sqrt[2]*G1*a2*V12[t] - (-G1*Q1 + delc)*V42[t] + w1*V31[t] - 
     Sqrt[2]*G1*a2*V34[t] - Sqrt[2]*G1*a1*V33[t] + gma1*V32[t] == 0, 
   V33'[t] + k1*V33[t] + 
     Sqrt[2]*G1*a2*V13[t] - (-G1*Q1 + delc)*V43[t] + k1*V33[t] + 
     Sqrt[2]*G1*a2*V31[t] - (-G1*Q1 + delc)*V34[t] - k0 == 0, 
   V34'[t] + k1*V34[t] + 
     Sqrt[2]*G1*a2*V14[t] - (-G1*Q1 + delc)*V44[t] + k1*V34[t] - 
     Sqrt[2]*G1*a1*V31[t] - (G1*Q1 - delc)*V33[t] == 0, 
   V41'[t] + k1*V41[t] - 
     Sqrt[2]*G1*a1*V11[t] - (G1*Q1 - delc)*V31[t] - w1*V42[t] == 0, 
   V42'[t] + k1*V42[t] + 
     Sqrt[2]*G1*a1*V12[t] - (G1*Q1 - delc)*V32[t] + w1*V41[t] - 
     Sqrt[2]*G1*a2*V44[t] - Sqrt[2]*G1*a1*V43[t] + gma1*V42[t] == 0, 
   V43'[t] + k1*V43[t] - 
     Sqrt[2]*G1*a1*V13[t] - (G1*Q1 - delc)*V33[t] + k1*V43[t] + 
     Sqrt[2]*G1*a2*V41[t] - (-G1*Q1 + delc)*V44[t] == 0, 
   V44'[t] + k1*V44[t] - 
     Sqrt[2]*G1*a1*V14[t] - (G1*Q1 - delc)*V34[t] + k1*V44[t] - 
     Sqrt[2]*G1*a1*V41[t] - (G1*Q1 - delc)*V43[t] - k0 == 0, 
   V11[0] == 1, V12[0] == 1, V13[0] == 0, V14[0] == 0, V21[0] == 0, 
   V22[0] == 1, V23[0] == 0, V24[0] == 0, V31[0] == 0, V32[0] == 0, 
   V33[0] == 0, V34[0] == 0, V41[0] == 0, V42[0] == 0, V43[0] == 0, 
   V44[0] == 0}, {V11, V12, V13, V14, V21, V22, V23, V24, V31, V32, 
   V33, V34, V41, V42, V43, V44}, {t, 0, 100}, delc];
 P2 = Plot[{Evaluate[1/2*(V11[t] + V22[t] - 2*V12[t])^(-1) /. s]}, {t, 
   0, 60}, PlotRange -> {0, 1}, Frame -> True, 
   FrameLabel -> {Style["Time", Bold, 20], 
     Style["\!\(\*SubscriptBox[\(S\), \(q\)]\)", Bold, 20]}, 
   FrameTicksStyle -> Directive[FontSize -> 20], 
   PlotStyle -> {Thickness[0.0005], Thickness[0.008]}]

What is the average profit margin in currency trading?

First you must understand what the margin is.

A currency margin account is very similar to a stock margin account: the investor is taking a short-term loan from the broker. The loan is equal to the amount of leverage that the investor is assuming. Before the investor can place a trade, he must first deposit money into the margin account. For more details visit https://www.vpsforextrader.com/

We will explain how you can use leverage and how much you can earn.

Newly Registered Customers >>

Leverage 1:1 = 1% PROFIT = 0.1% RISK
If we take 1:1 leverage it means that a 100% margin is used, so 1,000 USD can buy 1,000 USD worth of commodities.
Assume that the price of XYZ Commodity is 1,000 USD and you have an account balance of 1,000 USD, so you can only buy 1 Qty of XYZ = 1000 x 1 = 1,000 USD.
RISK >> If we keep the STOP LOSS at 0.1% (price 1,000 USD, 0.1% risk, i.e. 1 USD), then your loss will be 1 USD (1000 USD – 1 USD = 999 USD).
RETURN >> If the stock price moves from 1000 to 1010 = a 1% price movement. With 1:1 leverage you used 100% of your 1,000 USD, so you get 10 USD as PROFIT.

Leverage 1:10 = 10% PROFIT = 1% RISK
If we take 1:10 leverage it means that a 10% margin is used, so 1,000 USD can buy 10,000 USD worth of commodities.
Assume that the price of XYZ Commodity is 1,000 USD and you have an account balance of 1,000 USD; then you can buy 10 Qty of XYZ = 1000 x 10 = 10,000 USD of exposure against 1,000 USD of capital.
RISK >> If we keep the STOP LOSS at 0.1% of the price (0.1% of 10,000 USD, i.e. 10 USD), your LOSS will be 10 USD, which is 1% of your 1,000 USD capital (1000 USD – 10 USD = 990 USD).

RETURN >> If the stock moves from 1000 to 1010 = a 1% price movement. You used 1:10 leverage, so 1% of 10,000 USD = 100 USD = a 10% return on your 1,000 USD of investment capital.

Leverage 1:200 = 200% PROFIT = 20% RISK
If we take 1:200 leverage it means that a 0.5% margin is used, so 1,000 USD can buy 200,000 USD worth of commodities.
Assume that the price of XYZ Commodity is 1,000 USD and you have an account balance of 1,000 USD; then you can buy 200 Qty of XYZ = 1000 x 200 = 200,000 USD.

RISK >> If we keep the STOP LOSS at 0.1% of the price (0.1% of 200,000 USD, i.e. 200 USD), your LOSS will be 200 USD = a 20% loss of your invested capital (1000 USD – 200 USD = 800 USD).

RETURN >> If the stock moves from 1000 to 1010 = a 1% price movement. You used 1:200 leverage, so 1% of 200,000 USD = 2,000 USD of PROFIT, i.e. 200% of your 1,000 USD capital.
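The three cases above all follow the same arithmetic; here is a minimal Python sketch (illustration only, using the post's assumptions of 1,000 USD capital, a 0.1% stop-loss and a 1% favourable price move):

    capital = 1000.0   # account balance in USD, as in the examples above
    price_move = 0.01  # 1% favourable move in the instrument's price
    stop_loss = 0.001  # 0.1% adverse move before the stop-loss triggers

    for leverage in (1, 10, 200):
        exposure = capital * leverage   # position size bought on margin
        risk = exposure * stop_loss     # USD lost if the stop-loss is hit
        profit = exposure * price_move  # USD gained on the 1% move
        print(f"1:{leverage:<3}  exposure {exposure:>9,.0f} USD   "
              f"risk {risk:>6,.0f} USD ({risk / capital:.1%} of capital)   "
              f"profit {profit:>7,.0f} USD ({profit / capital:.0%} of capital)")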

flags – R: flagging how much data is present in a three-year average

I have a data set that contains an annual time series and a three-year moving average.

country      city          2014   2015   2016   2017   2018   2019  2014-16   2015-17   2016-18  2017-19
US           NYC            2      5      4       5     8       1      3.6       4.6       5.6      4.6 
France       Paris          NA     2      1       4     NA      1      1.5       2        2.5      1.6    
Iran         Tehran         1      NA     NA      NA     1      1      1         NA       1        1

Some of the three-year averages include all 3 data points, some only 2, 1 or 0.
I would like to create an additional column for each three-year period flagging whether the average was computed from 3, 2, 1 or 0 data points, such as:

country      city          2014   2015   2016   2017   2018   2019  2014-16  n. of data 14-16  2015-17  n. of data 15-17
US           NYC            2      5      4       5     8       1      3.6        3              etc       etc
France       Paris          NA     2      1       4     NA      1      1.5        2              etc       etc
Iran         Tehran         1      NA     NA      NA     1      1      1          1              etc       etc

Any clue?
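The flag itself is just, for each row, the number of non-missing values among the three years of the window. A minimal sketch of that count (shown in Python/pandas purely as illustration; in R the analogue would be rowSums(!is.na(...)) over the same year columns):

    import numpy as np
    import pandas as pd

    # Stand-in for the data set above (values copied from the example table).
    df = pd.DataFrame({
        "country": ["US", "France", "Iran"],
        "city": ["NYC", "Paris", "Tehran"],
        "2014": [2, np.nan, 1], "2015": [5, 2, np.nan], "2016": [4, 1, np.nan],
        "2017": [5, 4, np.nan], "2018": [8, np.nan, 1], "2019": [1, 1, 1],
    })

    # For each three-year window, count how many of the three yearly values are present.
    for start in range(2014, 2018):
        window = [str(y) for y in range(start, start + 3)]
        df[f"n_of_data_{start}-{start + 2}"] = df[window].notna().sum(axis=1)

    print(df)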

On average, how many times a night do liberals wake up sweating and think there are Russians under the bed?

The left is not afraid of Russians. The left is simply ANGRY that the Russians do not like or respect them. These are the former Soviets. The leftists who set the standard. The ideology that the card-carrying communist Bernie Sanders literally believes we should emulate! "Why don't they love us?"

They are like the teenager who throws a tantrum because the boy she likes brushed her off completely for another girl. Beware the scorn of a teenager … and of the American left.