File uploads to a SharePoint library using jQuery and the SharePointPlus js lib only on the second click

I'm using the createFile() method of the SharepointPlus 5.2 js library to upload a file into a SharePoint 2013 library. It works perfectly in most SP libraries, but in newly created SP libraries it throws an error ("the file content is required") on the first click and then performs the upload correctly on the second click. How can I solve this problem?

The header section includes:

The HTML body:

The JS function:

function _uploadFile() {
  var Title = $('div[data-field="Title"] input').val();
  // FileList from <input id="fileUploadInput" type="file">, converted to a real array
  var files = [].slice.call(document.querySelector('#fileUploadInput').files);
  // read the files
  Promise.all( {
    return new Promise(function(prom_res, prom_rej) {
      // use FileReader to read each file's content as an ArrayBuffer
      var fileReader = new FileReader();
      fileReader.onloadend = function(e) {
        file.content =;
        prom_res(file);
      };
      fileReader.onerror = function(e) {
        prom_rej(e);
      };
      fileReader.readAsArrayBuffer(file);
    });
  })).then(function(files) {
    // upload the files
    return Promise.all( {
      return $SP().list("8D5132D2-A458-4961-BCD6-FFD9CE964C0F") // Doc lib UID
        .createFile({
          content: file.content,
          fields: { Title: Title }
        });
    }));
  }).then(function(files) {
    alert(files[0].Name + " has been uploaded");
    console.log("Document " + files[0].Name + " has been uploaded to the Outgoing documents library");
  }, function(error) {
    console.log("Error: ", error);
  });
}

The error in the Firefox console is

uncaught exception: [SharepointPlus 'createFile'] the file content is required.

Debugging the file object shows that its content exists (the size in bytes is displayed). The error handler does not return an error message. On a second click on the same form (without reloading the page) the file uploads and the success message appears.
I can't tell whether the problem is in SharePoint, jQuery, or SharepointPlus, or a conflict between them. Normally the file should upload on the first click, since the same code works in other existing libraries.

builds – CMake, ARM gcc and libatomic – what is the correct way to handle this dependency?

I am working on a port (a rebuild) of a fairly complex application to the ARM architecture. It has many Conan packages as dependencies. I have realized that many dependencies cannot be built because on ARM they need to link libatomic, whereas on other architectures gcc provides the atomic operations as intrinsics. GTest, gRPC, Benchmark, CppMicroServices, etc. all suffer from this problem.

What is the preferred (best) way to handle this dependency?

For now I add the following CMake code, but is there perhaps a better, more native CMake way?

    set(IS_ARM TRUE)

(Not to mention that this should probably also be conditional on the GCC compiler identification?)
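To make the question concrete, here is a fuller sketch of what I mean (the `IS_ARM` variable name is mine, and I am assuming `CMAKE_SYSTEM_PROCESSOR` is reliable for this check):

```cmake
# Sketch: detect ARM targets and link libatomic for every target.
if(CMAKE_SYSTEM_PROCESSOR MATCHES "^(arm|aarch64)")
  set(IS_ARM TRUE)
endif()

# Only GCC needs the explicit libatomic link; other compilers may differ.
if(IS_ARM AND CMAKE_CXX_COMPILER_ID STREQUAL "GNU")
  link_libraries(atomic)
endif()
```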

Should I disable btrfs CoW for /var/lib/docker?

I have read that it is not a good idea to use btrfs's CoW functionality for large files that are rewritten in place, such as the data directories of a PostgreSQL database.

Since I use Docker for databases, I now wonder whether I should disable CoW for the entire /var/lib/docker directory. But I'm not sure, because Docker's layered file system makes use of this feature, doesn't it?

Or is it possible to disable CoW only for some specific volumes?
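For the per-volume case, my understanding (an assumption about the general btrfs mechanism, not something Docker-specific) is that the `C` (No_COW) attribute can be set on a directory so files created in it afterwards skip copy-on-write; the volume path below is a made-up example:

```shell
# Create the volume's data directory first, then mark it No_COW:
# the +C flag only affects files created after it is set.
mkdir -p /var/lib/docker/volumes/pgdata/_data
chattr +C /var/lib/docker/volumes/pgdata/_data
# Verify: the 'C' attribute should appear in the listing.
lsattr -d /var/lib/docker/volumes/pgdata/_data
```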

c++ – Runtime shared lib symbol lookup error

I ran into a slightly strange error in my C++ program. Assume a project where I have a shared library. Inside the library I have:

class EXPORT_MACRO SomeClass {
public:
    enum ValueTypes {
        Type1, Type2, Type3, TypeCount
    };
    typedef SomeTemplate<…> Types;
};

Within main.cpp I have the constructor of another class with the following interface:

MainClass::MainClass(SomeClass::Types i_val)

The problem is that this code is legacy and everything used to work fine. But then I added Type4 to the SomeClass enum, so that it looks like this:

class EXPORT_MACRO SomeClass {
public:
    enum ValueTypes {
        Type1, Type2, Type3, Type4, TypeCount
    };
    typedef SomeTemplate<…> Types;
};

My program now fails at runtime with: symbol lookup error: undefined symbol, followed by the mangled name of the MainClass constructor. However, it still links successfully every time; it is purely a runtime problem.
What I tried to do:

  1. I deleted all object files and .so files related to my library.
  2. I recompiled all executables.
  3. I looked through the binaries and discovered that the constructor's mangled interface had changed after my modifications, in both binaries. It used to be:

0000000000000060 T MainClass::MainClass(SomeTemplate<…>)
                 U MainClass::MainClass(SomeTemplate<…>)

And after adding the new Type4 it became:

0000000000000060 T MainClass::MainClass(SomeTemplate<…>)
                 U MainClass::MainClass(SomeTemplate<…>)

and it began to crash at runtime. As you can see, the library function's symbol is undefined on the executable's side, but it was the same before the changes. Any idea what could be happening?

java – NetBeans does not generate the lib folder

Hi, I have a problem building a project with NetBeans 11 and JDK 13. For some strange reason, the build places only the jar in the dist folder; the lib folder is not generated alongside it.

The message generated by NetBeans at the end of the execution is as follows:

To run this application from the command line without Ant, try:
C:\Program Files\Java\jdk-13\bin\java -cp C:\Program Files\NetBeans-11.1\netbeans\java\modules\ext\AbsoluteLayout.jar;C:\libs\ireport\ant-1.7.1.jar;C:\libs\ireport\antlr-2.7.6… etc.

Anyway, apparently it is telling me to run the jar specifying all the libraries (from the terminal).

I don't know if it's something related to PATH, because I already set JAVA_HOME=C:\Program Files\Java\jdk-13

Anyone with the same problem?

mariadb – mariabackup from a remote mariadb 10.3 Galera cluster | '/var/lib/mysql/' (Errcode: 2 "No such file or directory")

The setup is as follows:

  • 3-node Galera cluster running in Docker
  • a backup container that should run mariabackup to pull data from the Galera cluster
  • the network is fine; the backup container can ping the Galera nodes
  • mariadb / mariabackup version 10.3
  • the backup container only has mariabackup installed

The command I used

mariabackup --backup --host= --port=3306 --user=root --password= --target-dir=/backup

When I try to take a backup of the Galera cluster from the backup container, the following error appears:

(00) 2019-09-15 23:38:34 Connecting to MySQL server host: mdb_mariadb, user: root, password: set, port: 3306, socket: /var/run/mysqld/mysqld.sock
(00) 2019-09-15 23:38:34 Using server version 10.3.18-MariaDB-1:10.3.18+maria~bionic-log
(00) 2019-09-15 23:38:34 Warning: option 'datadir' points to nonexistent directory '/var/lib/mysql/'
(00) 2019-09-15 23:38:34 Warning: MySQL variable 'datadir' points to nonexistent directory '/var/lib/mysql/'
(00) 2019-09-15 23:38:34 Warning: option 'datadir' has different values:
  '/var/lib/mysql/' in defaults file
  '/var/lib/mysql/' in SHOW VARIABLES
mariabackup based on MariaDB server 10.3.18-MariaDB debian-linux-gnu (x86_64)
(00) 2019-09-15 23:38:34 uses posix_fadvise().
mariabackup: Can't change dir to '/var/lib/mysql/' (Errcode: 2 "No such file or directory")
(00) 2019-09-15 23:38:34 my_setwd() failed , /var/lib/mysql/

What I tried:

  • digging through various gists looking for clues
  • reading the mariadb scripts shipped with the server installation
  • trying different combinations of options, but always with the same error


  • Has anyone made remote backups, and can you tell me how you did it?
  • Or is the approach wrong: is there a better way to take a hot backup once a day, plus incremental backups during the day?

Document library: SharePoint Online, get a list of files checked out from a library using PnP

I have a document library that has been neglected for some years. I am trying to clean it up, but there are more than 1000 documents that are checked out to another user. I am a super administrator, so I can go in and take ownership of / check in these documents individually, but that would take HOURS, and there may be many more libraries I need to sort through.

I see there is a Set-PnPFileCheckedIn PowerShell command, which is good, but what I don't see is a command to take ownership of checked-out files, or an option on the following commands that would let me iterate over the checked-out items:




Get-PnPFile -AsListItem

As of now, the commands I listed above only return checked-in items, or items that are checked out to the account of the logged-in user.

Do I need to execute the command as the user who has the documents checked out?

I know the ListItem fields include something like CheckedOutTo, but that doesn't help when you can't get a list of those checked-out items in order to update who currently has the document checked out.

I'm looking for a PnP command solution for this, if possible. I see some solutions using older SPO commands, and if that is the only way to do it, so be it, but I am trying to stay within the realm of PnP commands.

Thanks to everyone in advance!

the dpkg database directory /var/lib/dpkg cannot be accessed: no such file or directory

sudo dpkg --configure -a

gave the error below:

dpkg: error: cannot access the dpkg database directory /var/lib/dpkg: No such file or directory

I have already tried:

sudo rm /var/lib/dpkg
ps aux | grep -i apt

and also lsof /var/lib/dpkg/lock-frontend as well…

apt – Could not get lock /var/lib/dpkg/lock-frontend – specific to a single .deb package

I have a custom package that I am trying to install on my device.

It returns this error:

nvidia@tegra-ubuntu:~$ sudo dpkg -i MyPackage.deb 
(Reading database ... 181574 files and directories currently installed.)
Preparing to unpack .../MyPackage.deb ...
E: Could not get lock /var/lib/dpkg/lock-frontend - open (11: Resource temporarily unavailable)
E: Unable to acquire the dpkg frontend lock (/var/lib/dpkg/lock-frontend), is another process using it?
dpkg: error processing archive MyPackage.deb (--install):
 subprocess new pre-installation script returned error exit status 100

I already tried all the traditional ways to remove a lock:

None of those worked for me. I can't find any running process holding the lock or lock-frontend files. I also noticed that the problem is specific to this package: any other version of this same package, or any other package, installs correctly and does not cause any error.

Any ideas?

After restoring, the mysql databases are not in /var/lib/mysql but can be queried

I have restored a database from a mysqldump, and the restored database and its data are there. Unfortunately, the database files are missing from /var/lib/mysql, even though /etc/my.cnf shows that this is the data directory. This is causing problems because remote queries get a "database not found" error. Is there any way to fix this? I tried creating a new database and restoring the previous database again, but the files still do not appear in /var/lib/mysql.
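One way I have been checking where the server actually believes its data lives (a sketch; the credentials are placeholders, and I am assuming a stock mysql/mariadb install):

```shell
# Ask the running server which datadir it is really using;
# it may differ from the one in the config file you are reading.
mysql -u root -p -e "SELECT @@datadir;"

# Show which option files the server reads, and in what order,
# to spot a second config file overriding /etc/my.cnf.
mysqld --verbose --help 2>/dev/null | grep -A1 "Default options"
```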