dnd 5e: How does the increase in size affect adjacent squares and enemies?

That's just it. As far as I know, the rules do not specify how a creature occupies space after being transformed. In particular, the core books say very little about playing with miniatures/grids.

As for a ruling, as Rykara mentions in the related answer:

the grid space of a map does not represent the actual space that a creature occupies, it represents the space that that creature controls

So, there are some things to consider: First, enlarging a creature may not require it to actually occupy the suggested space for a larger creature, which is also backed by the DMG (p. 251):

For example, you can use a miniature that has a Large base to represent a Huge giant. That giant takes up less space on the battlefield than its size suggests, but it is still Huge for the purposes of rules such as grappling.

Second, if you are using the optional "facing" rules (DMG, p. 252), it would make sense for the space the creature controls to be defined by the way it is facing.

Finally, for your scenarios where the creature is, for example, surrounded, it would make sense for the surrounding creatures to be pushed back, if the growing creature really does become large enough to occupy their spaces. Also, when making your ruling, keep in mind that

Whether a creature is a friend or an enemy, you can't willingly end your move in its space.

This means creatures can occupy the same space; they simply cannot do so voluntarily (in particular, they cannot voluntarily move into another creature's space and remain there). Therefore, another possible ruling in this case (a surrounded creature) is simply to allow them to occupy the same space and, on the monsters' turns, force the monsters to move out of that space (since they cannot willingly remain there). (The character doesn't get a choice unless the monsters are Small or smaller, since the character cannot move through the monsters.)

In my experience,1 none of this should ultimately have a significant impact on the game, so you can rule it however you want, and it should be fine if this situation comes up in a real game.


1 Just as a reference: I have played many wizards with Polymorph and Enlarge/Reduce at online tables (with grids/miniatures), and I don't remember a single time when the exact spaces occupied changed the game. The DM/I just increased the token size as we saw fit and moved on.

complexity theory: given a Boolean matrix A of size n, A^p can be computed in space O(log n log p)

The solution to this problem can be found here. It says:

To multiply $k$ matrices, we generate the result entry by entry, running a counter $t$ and generating the $(i,t)$-th entry of the product of the first $k-1$ matrices and the $(t,j)$-th entry of the last matrix. Inductively, we need to keep $k$ counters, which can be done in $O(k \log n)$ space. Finally, note that using repeated squaring, we can compute $A^p$ using $O(\log p)$ matrices, which are different powers of $A$. To generate each of these matrices, we just need $A$ and a single counter. Therefore, the total space required is $O(\log p \log n)$.

What I don't understand is this: in the case of multiplying $k$ fixed matrices, they are part of the input tape, so no space is needed for them on the work tapes. But in the case of computing $A^p$, the intermediate results $A^{2^t}$ must be stored somewhere, requiring more space. So how can this be done in $O(\log p \log n)$ space?
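For concreteness, this is how I currently picture the recursion (my own Python sketch, not from the linked solution): am I right that the intermediate powers are meant to be generated entry by entry like this, rather than stored?

def entry(A, m, i, j):
    # Boolean entry (i, j) of A^(2^m), generated on demand: no
    # intermediate matrix is ever materialized, only one counter t
    # per recursion level (O(log p) levels of O(log n)-bit counters).
    if m == 0:
        return A[i][j]
    return any(entry(A, m - 1, i, t) and entry(A, m - 1, t, j)
               for t in range(len(A)))

print(entry([[0, 1], [1, 0]], 3, 0, 0))  # (A^8)[0][0] for a 2x2 permutation matrix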

Thank you!

Is it good practice to get an array of unknown size from C into Python using ctypes?

I am new to ctypes. I am writing a piece of code that brings an array of unknown size from C to Python using ctypes. "Unknown size" here means that Python does not initially know the size of the array; the C code contains the algorithm that creates the array.
Somehow, after many trials, I managed to write code for it:

Python code: sample.py

from ctypes import *
#import needed packages
import os



def main():
    lib_path = os.path.join(os.path.dirname(__file__), 'Debug', '{}.dll'.format("sum"))
    #print(lib_path)
    try:
        lib = cdll.LoadLibrary(lib_path)
    except Exception as e:
        raise RuntimeError("Could not load library with this path: {}. {}".format(lib_path, e))
    class truct(Structure):
        # must match the layout of struct A on the C side
        _fields_ = (("array", POINTER(c_double)),
                    ("len", c_int))
    va = truct()
    ty = (c_double * 5)   # buffer the C side fills; must hold the 5 values it writes
    va.array = ty()
    va.len = 5
    lib.sample_function.argtypes = (POINTER(truct),)   # argtypes must be a sequence
    lib.sample_function.restype = c_int
    try:
        ret = lib.sample_function(byref(va))
    except Exception as e:
        raise RuntimeError("Library call raised an exception: {}".format(e))

if __name__ == '__main__':
    main()
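Inside main(), after the call, the values written by the C side can be read back into a Python list like this (my own addition, using the va.array pointer and va.len as defined above):

values = [va.array[i] for i in range(va.len)]
print(values)   # expected: [0.0, 0.0, 2.0, 3.0, 0.0]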

C code:

typedef struct A {
    double *array;
    int len;            /* matches the "len" field on the Python side */
} A;

void sample_function2(A* structure){
    int array_local[] = { 0, 0, 1, 1, 0 };
    int iter2;
    for (iter2 = 0; iter2 < 5; iter2++){
        if (array_local[iter2]){
            structure->array[iter2] = iter2;
        } else {
            structure->array[iter2] = 0;
        }
    }
}

int sample_function(A* structure)
{
    sample_function2(structure);
    return 0;
}

This code works as expected. But is it good practice to use this method?

postgresql – Vacuum increases the bloat size as reported by the Heroku bloat SQL query

I am using the Heroku query described here to detect bloat in tables. One of our tables reported a bloat size of about 7 GB, but after running VACUUM on it and then executing the same bloat query, it reports about 21 GB of "waste". I have no idea how this can happen. I understand the various conditions under which a table can bloat, even this dramatically, but not after a vacuum; I have never seen such a case.

Any idea why this could happen? The table in question has some TOAST tuples; maybe that's it. I also can't tell whether that Heroku query accounts for TOAST waste, but I understand that VACUUM would take care of that too. Local tests show that the "waste" column of that bloat query remains fairly stable for small sizes and decreases for larger ones, but never increases, which makes this quite strange.

How to draw an n x m grid in draw.io?

This may be a newbie question, but I don't know how to generate an N x M grid in draw.io.

My current solution for the 3×3 grid:

Step 1. Draw 9 squares

[image: 9 separate squares]

Step 2. Drag and drop them into the grid formation.

[image: squares arranged into a 3×3 grid]

However, this method does not scale well. If I want to draw a 9×9 grid, it takes a long time.

Is there any better way?
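One workaround I have been trying (my own sketch; the mxGraphModel XML below is inferred from files saved by draw.io, so attribute names may need adjusting): generate the cells as XML with a small Python script and paste the output via Extras > Edit Diagram.

def grid_xml(rows, cols, size=40):
    # Emit draw.io (mxGraphModel) XML for a rows x cols grid of squares.
    # Structure inferred from diagrams exported by draw.io; ids 0 and 1
    # are the reserved root cells.
    cells = []
    for r in range(rows):
        for c in range(cols):
            cid = 2 + r * cols + c
            cells.append(
                '<mxCell id="{}" vertex="1" parent="1" style="rounded=0;">'
                '<mxGeometry x="{}" y="{}" width="{}" height="{}" as="geometry"/>'
                '</mxCell>'.format(cid, c * size, r * size, size, size))
    return ('<mxGraphModel><root><mxCell id="0"/><mxCell id="1" parent="0"/>'
            + ''.join(cells) + '</root></mxGraphModel>')

print(grid_xml(9, 9))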

Is there a block explorer that calculates the virtual size of transactions?

It is not clear why someone voted this down without leaving a comment…

When I look at the original Bitcoin specifications, I can see: "Added in Bitcoin Core 0.13.0 – The virtual transaction size. Differs from size for witness transactions." I tried going through the page "https://bitcoin.org/en/release/v0.13.0", but it doesn't mention the virtual size. So it seems to be hidden in one of the many SegWit additions. At https://bitcoincore.org/en/segwit_wallet_dev/ I finally found this:

Transaction Fee Estimation

- Instead of transaction size, a new metric is defined, called “virtual size” (vsize)
- vsize of a transaction equals to 3 times of the size with original serialization, plus the size with new serialization, divide the result by 4 and round up to the next integer. For example, if a transaction is 200 bytes with new serialization, and becomes 99 bytes with marker, flag, and witness removed, the vsize is (99 * 3 + 200) / 4 = 125 with round up.
- vsize of a non-segwit transaction is simply its size
- Transaction fee should be estimated by comparing the vsize with other transactions, not the size.
- Developers should be careful not to make an off-by-4-times mistake in fee estimation.
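In code, the quoted definition is just this arithmetic (my own small sketch):

import math

def vsize(stripped_size, total_size):
    # stripped_size: bytes with marker, flag, and witness removed
    # total_size: bytes with the new (witness) serialization
    return math.ceil((3 * stripped_size + total_size) / 4)

print(vsize(99, 200))  # 125, matching the example above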

Then I looked at

  • blockchain.info
  • blockexplorer.com
  • live.blockcypher.com
  • www.blockcypher.com
  • www.blocktrail.com

just to discover that they don't provide this value …

Monogame / XNA: how do you change the size of a Texture2D and then scale it?

Issue:
It seems that the Draw() call allows us to use either the destination rectangle to resize (overload 5 of 8) or a scale factor (overloads 6 of 8 / 7 of 8), but not both.

Purpose:
I would like to change the size of a button from 567×634 to 400×480 and then use the scale to reduce it further by a factor of 0.8, to 320×384.
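A workaround I am considering is to fold the extra scale into the destination rectangle itself, since the destination rectangle already performs an arbitrary resize. A sketch of the arithmetic only (not MonoGame API code):

base_w, base_h = 400, 480      # the resized button
scale = 0.8
dest_w, dest_h = int(base_w * scale), int(base_h * scale)
print(dest_w, dest_h)          # 320 384 -- the final on-screen size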

Locker size in Frankfurt (Main) Hbf [closed]

I have 2 large suitcases (size no. 2) and 1 travel bag (flight cabin size). Is there one locker big enough for all of them to fit? Otherwise, I will pack everything into just 2 pieces of luggage.
Thank you!

Interpolation fails on data of a certain size

I am trying to interpolate the following set of unstructured data in order to calculate and plot the gradient.

data = Import["https://raw.githubusercontent.com/tomginsberg/motorsim/master/fealite/solutions/bldc_nonlinear.txt", "Data"];

data // RandomChoice
(* {0.49383, -0.114352, -0.000014} *)
data // Dimensions
(* {2255, 3} *)

ListPlot3D and ListDensityPlot work well

GraphicsRow[{ListDensityPlot[#], ListPlot3D[#]} &@data]

[image: ListDensityPlot and ListPlot3D output]

But Interpolation[data] fails:

[image: Interpolation error message]

However, I can interpolate a subset of 500 elements.

A = Interpolation[data[[1 ;; 500]]]
[image: InterpolatingFunction output]
A = Interpolation[data[[500 ;; 1000]]]
[image: InterpolatingFunction output]
But with A = Interpolation[data[[1 ;; 1000]]] I still get
[image: Interpolation error message]

As a sanity check that it's not just the size, I tried

Interpolation[RandomReal[1, {4000, 3}]]

And it seems to work well

Any idea what the problem may be? Thank you

cpanel: the website's backup file size differs on the server and when downloaded via FTP

It is not a virus or some kind of FTP compression.

There could be a couple of things happening:

For example, the server is likely running Linux and your desktop is probably running Windows. Your server and your desktop are likely configured to display file sizes differently.

Some operating systems tend to express file sizes in binary units (for example, 1 GiB = 1,024 MiB), while others usually express sizes in decimal units (for example, 1 GB = 1,000 MB). The same file therefore shows different numbers on the two systems.
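A quick way to see the difference (my own Python sketch):

size = 7_000_000_000                # the same file, in bytes
print(size / 10**9)                 # 7.0  "GB"  (decimal units)
print(round(size / 2**30, 2))       # 6.52 "GiB" (binary units)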

Another thing that could be happening is that some operating systems report the "true file size" while others report the "file size on disk."

These may be different numbers depending on the size of the hard disk's file allocation unit, etc. When a hard disk is formatted with larger allocation units, there is a bigger difference between the actual file size and the size on disk.

As a very rough example: if the allocation unit is 4 kb, a 10 kb file will occupy three blocks (12 kb "on disk"); if the allocation unit is 8 kb, the same 10 kb file will occupy two blocks (16 kb "on disk"); and if the allocation unit is 2 kb, it will occupy 5 blocks (10 kb "on disk").
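The same rough example as a Python sketch:

import math

def size_on_disk(file_size_kb, allocation_unit_kb):
    # round the file size up to whole allocation units
    return math.ceil(file_size_kb / allocation_unit_kb) * allocation_unit_kb

for unit in (4, 8, 2):
    print(unit, size_on_disk(10, unit))   # 4 -> 12, 8 -> 16, 2 -> 10 (kb)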

In any of these cases, as long as the files are not damaged, it is nothing to worry about. It's just a "that's how it is" situation.

FileZilla (and probably most other FTP programs) can be configured to display sizes consistently somewhere in its settings dialogs.
Here is a screenshot of my FileZilla settings showing the relevant option. Yours may vary, but I suspect it is at least similar:

[screenshot: FileZilla file size display settings]