Windows – Slow transfer of files between OVH data centers

I have two dedicated servers with OVH.com, one in Canada and the other in France. OVH has an internal network called "vRack" that promises low latency between servers. Both servers run XenServer 7.4 as the operating system, and both have a 10 Mbit port connected to the vRack.

I use VyOS 1.8 as the firewall; VyOS is installed in one virtual machine per server, and only the VyOS VM has an internet connection. Each server has an internal LAN.

Network architecture

Test01 / OK
Sending a file from VyOS1101 to VyOS1201: speed reaches 10 Mb/s.

Test02 / OK
Sending files from WEB1121 (CentOS 7) to VyOS1201: speed reaches 10 Mb/s.

Test03 / OK
Sending files from WEB1121 (CentOS 7) to WEB-1221 (CentOS 7): speed reaches 10 Mb/s.

Test04 / SLOW
Sending files from DB1111 (Windows 2018 R2) to VyOS1201: speed tops out at 1 Mb/s (using FileZilla).

Test05 / SLOW
Sending files from DB1111 (Windows 2018 R2) to DB1211 (Windows 2018 R2): speed tops out at 1 Mb/s (using SMB).

I've tried everything, but I cannot get Windows to reach 10 Mb/s.
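To rule out SMB and FileZilla themselves, I plan to measure raw TCP throughput between the two Windows VMs with a quick script. A minimal sketch (assuming Python 3 is available on both machines; the port is a placeholder and must be open in both firewalls):

import socket
import sys
import time

CHUNK = 64 * 1024
TOTAL = 100 * 1024 * 1024   # send 100 MB of zeros
PORT = 5001                 # placeholder port

def serve():
    # Run on the receiving VM (e.g. DB1211): python tcptest.py serve
    with socket.socket() as srv:
        srv.bind(("", PORT))
        srv.listen(1)
        conn, _ = srv.accept()
        with conn:
            while conn.recv(CHUNK):
                pass

def send(host):
    # Run on the sending VM (e.g. DB1111): python tcptest.py send <receiver-ip>
    data = b"\x00" * CHUNK
    start = time.time()
    with socket.create_connection((host, PORT)) as s:
        for _ in range(TOTAL // CHUNK):
            s.sendall(data)
    print(f"{TOTAL / (time.time() - start) / 1e6:.1f} MB/s")

if __name__ == "__main__":
    serve() if sys.argv[1] == "serve" else send(sys.argv[2])

If raw TCP also tops out around 1 Mb/s, the problem would be in the Windows TCP stack (window scaling/autotuning) rather than in the transfer protocols.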

Thanks in advance

python – Encrypting files using pynacl

I need a fairly simple file encryptor/decryptor in Python. After some research, I decided to use the PyNaCl library to read the file in blocks, encrypt and write them, and finally use Blake2b to generate a signature for the file. Each file is encrypted with a unique key, which will be distributed together with the encrypted file, with the key encrypted using RSA with a previously shared key pair, and the entire message signed with ECDSA to verify that it comes from me.

The encryption/decryption example code:

import base64
import struct

import nacl.encoding
import nacl.hash
import nacl.hashlib
import nacl.secret
import nacl.utils

BUFFER_SIZE = 4 * (1024 * 1024)

def read_file_blocks(file, extra_bytes=0):
    while True:
        data = file.read(BUFFER_SIZE + extra_bytes)
        if not data:
            break
        yield data

def hmac_file(file, key):
    blake = nacl.hashlib.blake2b(key=key)
    with open(file, 'rb') as in_file:
        for block in read_file_blocks(in_file):
            blake.update(block)
    return blake.hexdigest()

def encrypt_archive(archive_name, encrypted_name):
    key = nacl.utils.random(nacl.secret.SecretBox.KEY_SIZE)
    # Use 4 bytes less than the nonce size to make room for the block counter
    nonce = nacl.utils.random(nacl.secret.SecretBox.NONCE_SIZE - 4)
    block_num = 0

    box = nacl.secret.SecretBox(key)
    with open(archive_name, 'rb') as in_file, open(encrypted_name, 'wb') as out_file:
        for data in read_file_blocks(in_file):
            # Append the block counter to the nonce, so each block has a unique nonce
            block_nonce = nonce + struct.pack(">I", block_num)
            block = box.encrypt(data, block_nonce)
            out_file.write(block.ciphertext)
            block_num += 1

    hmac_key = nacl.hash.sha256(key + nonce, encoder=nacl.encoding.RawEncoder)
    output = {}
    output['key'] = base64.b64encode(key + nonce)
    output['signature'] = hmac_file(encrypted_name, hmac_key)
    return output

def decrypt_archive(encrypted_name, archive_name, key_info):
    key_bytes = base64.b64decode(key_info['key'])

    key = key_bytes[:nacl.secret.SecretBox.KEY_SIZE]
    nonce = key_bytes[nacl.secret.SecretBox.KEY_SIZE:]

    extra_bytes = nacl.secret.SecretBox.MACBYTES
    hmac_key = nacl.hash.sha256(key_bytes, encoder=nacl.encoding.RawEncoder)
    hmac = hmac_file(encrypted_name, hmac_key)
    if hmac != key_info['signature']:
        print('hmac mismatch')
        return

    block_num = 0
    box = nacl.secret.SecretBox(key)
    with open(encrypted_name, 'rb') as in_file, open(archive_name, 'wb') as out_file:
        # NaCl appends a MAC to each block; account for it when reading the file
        for data in read_file_blocks(in_file, extra_bytes=extra_bytes):
            block_nonce = nonce + struct.pack(">I", block_num)
            block = box.decrypt(data, block_nonce)
            out_file.write(block)
            block_num += 1

key_info = encrypt_archive(r"C:\temp\test.csv", r"C:\temp\test.enc")
print(key_info)
decrypt_archive(r"C:\temp\test.enc", r"C:\temp\test.enc.csv", key_info)

Beyond general mistakes, the two things I'm doing that I'm not entirely sure about are:

  1. To keep the block nonces unique, I create a random byte string slightly smaller than the required nonce size; then, when encrypting the blocks, I append the block number to the nonce as a four-byte integer.

  2. When generating the blake2b hash, for a key I concatenate the key and the nonce of the file. This seems a bit pointless overall, because anyone who has the key can simply re-encrypt and replace the file. Still, I can't think of a better alternative that doesn't have similar weaknesses (see the sketch below). Should I just get rid of that bit, since NaCl applies a MAC per block anyway? (Which I discovered only after writing the HMAC code.)
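For reference, the alternative I keep circling back to is a MAC key generated independently of the encryption key, so that holding the encryption key alone is not enough to forge the signature. A rough sketch (not what the code above does; mac_key would have to be shipped alongside key + nonce):

import nacl.hashlib
import nacl.utils

mac_key = nacl.utils.random(32)  # independent 32-byte MAC key

def tag_file(path, mac_key):
    # Keyed blake2b over the ciphertext file, same pattern as hmac_file() above
    blake = nacl.hashlib.blake2b(key=mac_key)
    with open(path, 'rb') as f:
        for block in iter(lambda: f.read(4 * 1024 * 1024), b''):
            blake.update(block)
    return blake.hexdigest()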

Deleting duplicate audio files with different UTC timestamps in the file names, using PowerShell 5

I have a disk with many thousands of sound effects. The files have been stored on different operating systems and NAS devices over time, and now there are duplicates of many files, but the names of the files contain different UTC timestamps. For example:

1-14 Busted bowling alley (2016_06_28 02_14_41 UTC).aif

1-14 Bowling full of people (2016_02_18 05_56_59 UTC).aif

I would like to delete the duplicate files. There are also unwanted files left behind by audio software and operating-system indexing that I would like to remove. On a copied subset of the data, I tried the following PowerShell script in PS version 5.1, and it seems to give me what I'm looking for:

# Remove leftover files from audio editors
Get-ChildItem -Name -Recurse -Filter "*.clip*" | Remove-Item
Get-ChildItem -Name -Recurse -Filter "*.ptf" | Remove-Item
Get-ChildItem -Name -Recurse -Filter "*.wfm" | Remove-Item
Get-ChildItem -Name -Recurse -Filter "*.repeaks" | Remove-Item

# Delete old indexing files
Get-ChildItem -Name -Recurse -Filter "*.DS_STORE" | Remove-Item
Get-ChildItem -Name -Recurse -Filter "._.*" | Remove-Item
Get-ChildItem -Name -Recurse -Filter "Thumbs*.db" | Remove-Item
Get-ChildItem -Name -Recurse -Filter "*.ini" | Remove-Item

# Remove the UTC stamp from all files (will fail on duplicates)
Get-ChildItem -Recurse -Filter "*(?????????? ???????? UTC).*" | Rename-Item -NewName { $_.BaseName.Substring(0, $_.BaseName.Length - 26) + $_.Extension }

# Delete files that still have the UTC stamp (these were the duplicates)
Get-ChildItem -Name -Recurse -Filter "*(?????????? ???????? UTC).*" | Remove-Item

However, I would love a critique of the script and to learn a few things. In particular, it feels a bit hacky to use an expected command failure as part of the workflow. It also seemed sensible to get feedback before running it on the whole set of files.
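One alternative I've been considering, to avoid leaning on the expected rename failures: do the grouping non-destructively first. A dry-run sketch of that idea (in Python, just to pin down the logic; the folder path is a placeholder):

import os
import re

# Group files by their name with the "(YYYY_MM_DD HH_MM_SS UTC)" stamp removed,
# keep one file per group, and only report the rest. Nothing is deleted.
STAMP = re.compile(r" \(\d{4}_\d{2}_\d{2} \d{2}_\d{2}_\d{2} UTC\)")

groups = {}
for root, _, files in os.walk(r"D:\sfx"):   # placeholder path
    for name in files:
        key = os.path.join(root, STAMP.sub("", name))
        groups.setdefault(key, []).append(os.path.join(root, name))

for key, paths in groups.items():
    for dup in sorted(paths)[1:]:           # keep the first, report the rest
        print("would delete:", dup)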

18.04 – Missing files

I recently downloaded a folder containing many different folders and files. In one of those folders, 2 files were downloaded; I created a new file there and started working on it. My computer shut down unexpectedly, and now the file I was working on has disappeared along with its executable. I understand this could happen if the file had not been saved, but I had saved it and run it before, and both the file and the executable are gone. What puzzles me is that the other 2 files in the folder I was working in, which came with the original downloaded folder, are still there. I tried to find it with the command (find . -name nameoffile.cpp), but nothing happens: it does not tell me whether the file exists or not, it runs but shows nothing. What's going on?

wireless networks: browsers only use 1% of Internet capacity when downloading files

This just started happening yesterday. I tried to download some of the materials for my courses, but my browser seems to be using only 1% of my Wi-Fi capacity; more specifically, I could not even reach 1 Mb/s of download speed. I tried another laptop and got 13 Mb/s of download speed. Any ideas? Thanks!

c# – Using TPL when unpacking files

I have been working on an implementation for my own Astron library, and I wanted to get my use of the TPL reviewed because I'm not sure about this technology :/

My application currently unpacks .d2p files, a custom file format from a French game called Dofus; it is simply an archive of other, deflate-compressed files in another custom format, .dlm. Here is the current output of my application:

application output
The progress bars came from here.

So everything seems to work as expected: the files are processed concurrently, and 308 MB of .dlm files are parsed in 10 seconds, which is exactly what I want. But it is possible that I have misused the TPL. The entire project can be found at this address, but the part of the code I want reviewed is src/Astron.Unpacker/Managers/D2PManager.cs:

public class D2PManager : BaseFileManager
{
    private readonly ILogger _logger;
    private readonly string _dlmFilesFolder;

    public D2PManager(IContainer container) : base(container)
    {
        _logger = ServiceLocator.Logger;
        _dlmFilesFolder = container.GetInstance().DlmOutputFolder;
    }

    public async Task UnpackAll(string[] filesPath)
    {
        _logger.Log(LogLevel.Info, $"Trying to unpack {filesPath.Length} d2p files ...");

        var tasks = new List<Task>(filesPath.Length);
        tasks.AddRange(filesPath.Select(d2PFilePath => UnpackD2PFile(d2PFilePath)));

        await Task.WhenAll(tasks).ConfigureAwait(false);
    }

    public async Task UnpackD2PFile(string path)
    {
        var d2PFile = new FileAccessor(path);
        var metaUnpacker = new D2PFileMetadataUnpacker(_binaryFactory, _serDes);
        metaUnpacker.Unpack(d2PFile);

        var archiveUnpacker = new DlmArchivesUnpacker(_binaryFactory, _serDes, metaUnpacker.Value);
        archiveUnpacker.Unpack(d2PFile);

        var progressCount = 1;
        var progressBar = new ProgressBar(PbStyle.SingleLine, archiveUnpacker.Values.Count);
        progressBar.Refresh(0, Path.GetFileName(d2PFile.FullPath));
        await Task.Delay(10); // doesn't print at all without this
        foreach (var archive in archiveUnpacker.Values)
        {
            var filePath = (_dlmFilesFolder + archive.RelativePath).Replace('/', '\\');
            var fileDirectory = Path.GetDirectoryName(filePath);
            var deflatedStream =
                new DeflateStream(new MemoryStream(archive.CompressedData), CompressionMode.Decompress);

            var decompressedData = new MemoryStream();
            await deflatedStream.CopyToAsync(decompressedData);
            if (!Directory.Exists(fileDirectory)) Directory.CreateDirectory(fileDirectory);

            File.WriteAllBytes(filePath, decompressedData.GetBuffer());
            progressBar.Refresh(progressCount, filePath);
            progressCount++;
        }
    }
}

Here are my questions:

  • If I do not add the Task.Delay() right after the progress bar is initialized, the files seem to be processed synchronously (the progress bar is only displayed once the last one has completed). Why does that happen?
  • Is it correct to use .ConfigureAwait(false) on Task.WhenAll()?
  • Am I starting each task the right way? Shouldn't I use Task.Run() with Task.WaitAll() instead?

Windows 10: How would you merge two files and keep the additions in CMD?

I come to you with a problem. I have two text files, inserted below. I need to merge them, eliminating duplicates, keeping the ADDITIONS, and doing nothing about the strings that have been removed.

That is to say:
(file 1)

Hello

circle

my

first name

meme

is

jeff

(file 2)

Hello

my

is

jeff

square

with a final result of

Hello

circle

my

first name

meme

is

jeff

square

It is difficult to explain. Here are the files.

https://pastebin.com/qy99XQXP
https://pastebin.com/Yctxxxtf
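To pin down the semantics, here is a sketch of the merge I'm after (Python only to make the logic concrete; the file names are placeholders). A CMD-friendly equivalent of this is what I'm looking for:

import difflib

with open("file1.txt") as f1, open("file2.txt") as f2:
    old = f1.read().splitlines()
    new = f2.read().splitlines()

# Keep common lines, lines removed in file 2, and lines added in file 2;
# drop only difflib's "?" hint lines. This yields the union in order.
merged = [line[2:] for line in difflib.Differ().compare(old, new)
          if not line.startswith("?")]

print("\n".join(merged))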

THANK YOU

8 – How do I override the base URL for private files?

Why is it possible in Drupal 8 to override the base URL for public files using the file_public_base_url setting, but not for private files?
There are scenarios in which Drupal is installed behind a reverse proxy, and private file URLs point to the host name of the backend server instead of the frontend server.

Perhaps one solution is to use the mod_substitute module (http://httpd.apache.org/docs/2.4/mod/mod_substitute.html), but I do not know if it is the best one.

If I uninstall Dropbox, does Dropbox delete files from your computer?

If I uninstall Dropbox, does it delete the files from my computer?

How can I delete files previously removed from darktable?

Darktable has two different operations: remove and delete/trash. By default, the Del key is bound to the first, which only removes the information about that file from the database and does not affect the actual file. You can change this in the preferences under "shortcuts":

darktable preferences dialog

Double-click the line "delete from disk or send to trash" and then press Del. From now on, deletion will be "real", on the file system, not just in the database.

Unfortunately for you, there is no way to do this retroactively, because by definition darktable no longer knows about those files.

Darktable includes a script to do the opposite: remove files from the database when they no longer exist on disk. If you have a little shell and SQL knowledge, that script should get you started on doing the reverse.
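As a starting point, something along these lines might work. This is only a sketch: it assumes darktable's default library.db location and its film_rolls/images schema, and it just prints candidates instead of deleting anything:

import os
import sqlite3

DB = os.path.expanduser("~/.config/darktable/library.db")  # default location

con = sqlite3.connect(DB)
rows = con.execute(
    "SELECT film_rolls.folder, images.filename "
    "FROM images JOIN film_rolls ON film_rolls.id = images.film_id")
known = {os.path.join(folder, name) for folder, name in rows}

# Walk a photo directory and report files darktable no longer knows about
for root, _, files in os.walk("/path/to/photos"):   # placeholder
    for name in files:
        path = os.path.join(root, name)
        if path not in known and not name.endswith(".xmp"):
            print(path)   # review carefully before deleting anything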

Or you could use the clever suggestion from @junkyardsparkle.