github – Free git repository hosting that allows large data usage?

I’ve got a few repositories that are pushing 5–6 GB, which is more than GitHub allows (even with a Pro account), and GitLab allows 10 GB per repository but the hook always hangs if a single push contains more than a few GB. Are there any hosting services out there that can handle pushes this large?

numerical integration – Is there a way to speed up Integrate when the integrand contains a product of polynomials, each of which has a large degree?

I have integrals of the form
$$\int_0^\infty \mathrm{d}x\; e^{-x^2}\,(\text{polynomial of degree } n)\times(\text{polynomial of degree } m),$$
where $n,m$ can be as large as 300. (If this is helpful, these polynomials are Laguerre polynomials, provided by the function LaguerreL[...].)

When I use Integrate[f[x], {x, 0, Infinity}], the calculation is too slow for large polynomial degrees.

Is there a way to speed up this calculation? (I tried to use NIntegrate, but unfortunately, since the integrand is highly oscillatory at large $n,m$, I’m unable to get reliable results from numerical methods).
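For concreteness, here is a minimal sketch of the setup, assuming the integrand is literally Exp[-x^2] LaguerreL[n, x] LaguerreL[m, x] (the degrees below are only illustrative). One possible direction, not taken from the question itself, is to expand the product once and integrate term by term using $\int_0^\infty x^k e^{-x^2}\,\mathrm{d}x = \tfrac{1}{2}\Gamma\!\left(\tfrac{k+1}{2}\right)$:

n = 300; m = 280;
poly = Expand[LaguerreL[n, x] LaguerreL[m, x]];    (* exact rational coefficients *)
coeffs = CoefficientList[poly, x];                 (* coeffs[[k + 1]] is the coefficient of x^k *)

(* Integrate[x^k Exp[-x^2], {x, 0, Infinity}] == Gamma[(k + 1)/2]/2 for k >= 0 *)
result = Sum[coeffs[[k + 1]] Gamma[(k + 1)/2]/2, {k, 0, Length[coeffs] - 1}];

Whether this beats a single Integrate call depends on how expensive the expansion is at these degrees, so treat it as a sketch rather than a definitive fix.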

dnd 5e – Does spike growth inflict cumulative damage on large and bigger creatures?

Spike growth:

The ground in a 20-foot radius centered on a point within range twists and sprouts hard spikes and thorns. The area becomes difficult terrain for the duration. When a creature moves into or within the area, it takes 2d4 piercing damage for every 5 feet it travels.
The transformation of the ground is camouflaged to look natural. Any creature that can’t see the area at the time the spell is cast must make a Wisdom (Perception) check against your spell save DC to recognize the terrain as hazardous before entering it.

“That day, the druid cast spike growth where a gargantuan, half-burrowed Sandworm stood… and for the next half hour, everybody stopped playing and started frantically browsing through the manuals to figure out what to do.”

So, the question is: does size matter?

I take for granted that you can choose as the spell's point of origin the point where the creature touches the ground: it won't affect the space the creature's body occupies, nor anything below it, but the surrounding terrain at ground level should be affected (and although you don't actually see that point, the spell doesn't require you to, contrary to the usual routine). So the token of this sizeable creature occupies a 16-square (4×4) space on the grid (12 hexagons if you're into that). At the start of its turn, it's going to find itself in the middle of a semi-hidden spike field. Since the spell only hurts whoever moves into or within the area, I infer that creatures already inside it that decide not to move won't get hurt, all the more if they're half-burrowed.

Now, if the creature notices the danger (and it should, since it was there when the spell was cast) but still decides to move across the terrain (although I guess it could return underground, where half its body lies, without repercussions), how much does it get hurt? The spell gives a damage-per-movement ratio, and with smaller creatures that's no problem. But what about bigger monsters? Is this 2nd-level spell a colossus bane that indirectly deals ×4 damage to Large monsters, ×9 to Huge ones and ×16 to Gargantuan ones per square (= 5 feet)?

RAW, I'd rule against it: bigger creatures aren't affected multiple times by effects that cover more than one of their squares (think fireball: no matter the size, whether the spell overlaps just one square or the token's entire footprint, the damage is applied only once). That said, it seems to me like this huge AoE spell should indeed scale with the size of its victims, as more spikes pierce through their flesh. Also, it wouldn't be the first time that a low-level spell was hugely effective against specific creatures (heat metal against full-plated enemies comes to mind).

What do you think?

dnd 5e – Can you use a large creature’s dead body as a means to walk over a grease spell so that you are unaffected by that spell?

Here’s the spell description:

Slick grease covers the ground in a 10-foot square centered on a point within range and turns it into difficult terrain for the duration.

When the grease appears, each creature standing in its area must succeed on a Dexterity saving throw or fall prone. A creature that enters the area or ends its turn there must also succeed on a Dexterity saving throw or fall prone.

There is no rule saying "the grease spell's effect is negated when a Large creature is lying on it" in any official source book. We on the internet can only tell you what spells do based on their descriptions, because we are not DMing your game. We cannot change or expand upon the rules and say "this is part of how the spell works". We do not have that authority; the DM of the game does, and we aren't the DM.

The grease spell covers a 10-foot square, and a Large creature does not cover a 10-foot square:

A creature’s space is the area in feet that it effectively controls in combat, not an expression of its physical dimensions. (PHB p. 191)

You can try to walk over it, but the result will depend on the situation: how exactly the creature is lying, what kind of body it has, et cetera. It is the DM's job to adjudicate such things, so this is exactly an "ask your DM" type of question.

One thing we can say, though: there is no ignorable "fluff" text in that description. It says "slick grease covers the ground", so this slick grease is the exact reason why a walking creature "must succeed on a Dexterity saving throw or fall prone". If you completely cover the grease with something big and heavy enough to be a sound surface, that probably negates the spell's effect (but still, ask your DM).

google analytics – How to calculate a conversion goal value for a large sample size?

I need to assign a conversion goal value for user actions made on a website.

I have already processed the data for a sample size of 210 paid invoices. The statistical analysis of the income values can be seen in the results below…

[Screenshots of the statistical analysis of the invoice income values]

Which parameter should I use as the conversion value and why?

Please note:
I have posted the same question to Mathematics.SE and Stack Overflow, based on the feedback in this Meta discussion: https://meta.stackexchange.com/questions/260823/which-site-should-i-ask-this-math-question-on

performance tuning – Pair-wise equality over large sets of large vectors

I’ve got an interesting performance tuning/algorithmic problem that I’m running into in an optimization context.

I've got a set of ~16–50 lists of integers (usually in Range[0, 5], but not restricted to that).
The data might look like this (although obviously the real data is not random):

maxQ = 5;
ndim = 16;
nstates = 100000;
braVecs = RandomInteger[{0, maxQ}, {ndim, nstates}];
ketVecs = braVecs + RandomInteger[{0, 1}, {ndim, nstates}];

Now, for every element q ∈ Subsets[Range[ndim], 4] I need to determine for which states braVecs and ketVecs agree everywhere except possibly at the indices in q, i.e. for every possible q I need to calculate this:

qComp = Complement[Range[ndim], q];
diffs = braVecs[[qComp]] - ketVecs[[qComp]];
Pick[Range[nstates], Times @@ (1 - Unitize[diffs]), 1]

Just as an example, this is the kind of thing I expect to get out

q = RandomSample[Range[ndim], 4];
qComp = Complement[Range[ndim], q];
diffs = braVecs[[qComp]] - ketVecs[[qComp]];
{q, Pick[Range[nstates], Times @@ (1 - Unitize[diffs]), 1]}

{{2, 9, 6, 4}, {825, 1993, 5577, 5666, 9690, 9856, 11502, 13515, 15680, 18570, 
  19207, 23131, 26986, 27269, 31889, 39396, 39942, 51688, 52520, 54905, 55360, 
  60180, 61682, 66258, 66458, 68742, 71871, 78489, 80906, 90275, 91520, 93184}}

This can obviously be done just by looping, but I'm sure there is an algorithmically more efficient way to do this, maybe using a trie or maybe using some Boolean algebra to reuse prior calculations. This is important because my ndim can get up to ~50, and there is a huge number of elements in Subsets[Range[50], 4]… I just don't know what the best way to approach this kind of thing is.
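Along the lines of reusing prior calculations, here is a minimal sketch (one possible approach, not something established above): precompute the per-dimension equality mask once, so that each q only costs a column sum over the ndim - 4 kept rows. Names follow the code above:

sameMask = 1 - Unitize[braVecs - ketVecs];   (* ndim x nstates, 1 where bra and ket agree *)

matchesFor[q_] := Module[{qComp = Complement[Range[ndim], q]},
   (* a state matches iff it agrees on every dimension outside q *)
   Pick[Range[nstates], Total[sameMask[[qComp]]], Length[qComp]]];

matchesFor[{2, 9, 6, 4}]   (* same positions as the Pick-based snippet above *)

This doesn't remove the loop over Subsets[Range[ndim], 4], but it moves the subtraction and the Unitize call out of it.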

convex optimization – Conjugate gradient and the eigenvectors corresponding to the large eigenvalues

I am working on an optimization problem (for example, conjugate gradient) to solve $Ax=b$, where $A$ is a symmetric positive definite matrix. I understand that CG (conjugate gradient) performs better when the matrix $A$ has a smaller condition number. But I am wondering whether there is a relationship between the eigenvectors corresponding to the largest few eigenvalues and the first few update directions of CG. Any suggestions would be helpful. Thanks!
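For reference (standard CG theory, stated here as context rather than an answer): after $k$ steps the CG search directions span the Krylov subspace generated by the initial residual $r_0 = b - Ax_0$,
$$\mathcal{K}_k(A, r_0) = \operatorname{span}\{r_0,\, Ar_0,\, \dots,\, A^{k-1}r_0\},$$
so any connection to the eigenvectors of the largest eigenvalues enters through how strongly $r_0$ overlaps with them.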

Can a Large or larger creature see if it is only partially in darkness?

I don't remember and cannot find the ruling anywhere. If a Large or larger creature is only partially inside an area of darkness (magical or mundane), can it see or not? I think it can, but I'd like to be sure.

http – Nginx showing 504 Gateway timed out when uploading large files

I have a simple PHP file upload script. The server setup consists of an nginx reverse proxy and another nginx server (the upstream). The problem is that when I try to upload very large files (~2 GB), I get an error:

504 Gateway Time-out

Here is my reverse proxy configuration:

server {
    listen 80;
    
    server_name upload.mycloudhost.com;

    proxy_buffer_size 1024k;
    proxy_buffers 4 1024k;
    proxy_busy_buffers_size 1024k;
    proxy_connect_timeout       600;
    proxy_send_timeout          600;
    proxy_read_timeout          600;
    send_timeout                600;
    client_body_timeout         600;
    client_header_timeout       600;
    keepalive_timeout           600;
    uwsgi_read_timeout          600;

    location / {
        proxy_set_header X-Real-IP  $remote_addr;
        proxy_set_header X-Forwarded-For $remote_addr;
        proxy_set_header Host $host;
        proxy_pass http://localhost:13670;
    }
}

And here is the other nginx server (upstream):

server {
    listen 80;
    
    server_name upload.mycloudhost.com;

    client_max_body_size 80G;
    proxy_buffer_size 1024k;
    proxy_buffers 4 1024k;
    proxy_busy_buffers_size 1024k;
    proxy_connect_timeout       600;
    proxy_send_timeout          600;
    proxy_read_timeout          600;
    send_timeout                600;
    client_body_timeout         600;
    client_header_timeout       600;
    keepalive_timeout           600;
    uwsgi_read_timeout          600;

    root /var/www/mycloudhost;
    index index.php index.html index.htm index.nginx-debian.html;

    location ~ \.php$ {
        try_files $uri =404;
        fastcgi_split_path_info ^(.+\.php)(/.+)$;
        fastcgi_pass php:9000;
        fastcgi_index index.php;
        include fastcgi_params;
        fastcgi_param SCRIPT_FILENAME $document_root$fastcgi_script_name;
        fastcgi_param PATH_INFO $fastcgi_path_info;
    }

    location / {
        try_files $uri $uri/ =404;
    }
}

php – Large array or a query?

I have a static word list that has 2048 words in it. It's the BIP39 word list, if you are familiar with it.

This is going to be used for a specific niche tool, and I would be surprised if it gets used more than a few hundred times in a week.

Is it better to generate an array with all 2048 words, or just drop them into SQL and query them that way?