How to export/import content from inside a table using MySQL commands

I’m transferring a WordPress site to a new server and trying to keep the process as clean as possible. Since I have to transfer a lot of posts to the new database and the WP export feature doesn’t work well, I exported only the essential tables, and I’m configuring everything else again from scratch.

The only problem is that I’m using the TablePress plugin, which stores its data in the wp_posts and wp_postmeta tables and its settings in the wp_options table.

Since I already restored the posts using wp_posts and wp_postmeta, the TablePress entries are already in the database, but the content doesn’t show up in the plugin, because the index and options live in wp_options, which I’m configuring again from scratch.

The plugin developer has suggested that I export the tablepress_tables and tablepress_options entries from wp_options, but neither he nor I knows the syntax to export and import individual entries from a table (in this case wp_options) in MySQL without using phpMyAdmin, since it isn’t available on the server.

So I wanted to ask if anyone could help me figure out how to export the tablepress_tables and tablepress_options entries from the wp_options table of a database called wordpress, using the command line on MySQL 5.7.
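A minimal sketch of one way to do this is mysqldump’s `--where` option, which dumps only the rows matching a condition. The user name and output file name below are assumptions; adjust them to your setup:

```shell
# Export only the two TablePress rows from wp_options.
# --no-create-info skips the CREATE TABLE statement, so the
# import only inserts rows into the existing table.
mysqldump -u root -p --no-create-info \
  --where="option_name IN ('tablepress_tables','tablepress_options')" \
  wordpress wp_options > tablepress_rows.sql

# Import the rows into the target database:
mysql -u root -p wordpress < tablepress_rows.sql
```

Note that wp_options has a unique key on option_name, so if the plugin has already created those options on the new site, either delete them there first or add `--replace` to the mysqldump call so it writes REPLACE statements instead of INSERTs.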


export – Notebook WindowSize is limiting ImageResize and affects result of CreateDocument

In the following example CreateDocument or Export seem to be overriding the size of the graphic being written to a PDF file.

First use a screen grab to create an image and paste into a notebook:

image = <your image>;

then get the dimensions:

{pageX, pageY} = ImageDimensions[image]

Create a symbol to play with the effect of increasing the size:

sizeIncrease = 1200;
imageLarger = ImageResize[image, pageX + sizeIncrease]

If you increase the value of sizeIncrease, there comes a point where MMA ignores the over-sizing in order to ensure the resulting graphic fits within the WindowSize (a bigger window size allows a larger graphic). Now try this with a few values of sizeIncrease:

nb = CreateDocument[imageLarger];
Export["test.pdf", nb];

Again it seems the width of the image in nb is constrained by the window size; however, the size of the image in the exported file is also constrained. In my actual case, this means the image in the PDF is too small. At first I thought it was the exported PDF file’s PrintingMargins, so I tried this:

nb = CreateDocument[imageLarger, PageBreakBelow -> True, Visible -> True, 
   WindowSize -> {1200, 900}];
Export["test.pdf", nb,
  PrintingOptions -> {"PrintingMargins" -> {{50, 50}, {50, 50}}, 
    "PaperOrientation" -> "Portrait"}];

This, however, does not help – something else is constraining the size of the image. Any ideas on how I can get the image sized to fill the printable page?

The overall goal here is to create a function that allows me (now us) to import a PDF file created by a scanner, then binarize, crop, and resize it before exporting it back to a much smaller, cleaned-up PDF. MMA does a fantastic job of reducing dirty-looking 200 MB PDFs down to a few megabytes. The only problem is the size of the image in the PDF. Here is my first stab at the function. The Do loop is to avoid reading in the whole PDF file (if I do that with a 200 MB PDF, the ByteCount of the Graphics is around 7 GB!)

compressPDF[file_String, folder_String, width_, crop_List] := Module[
  {pageCount, compressedPages, pagei, binarizedImage, pageX, pageY, 
   cropX, cropY, pageCropped, pageLarger},
  pageCount = Import[folder <> file, {"PageCount"}];
  compressedPages = {};
  Do[
   pagei = Import[folder <> file, {"Pages", i}][[1]];
   binarizedImage = Binarize[pagei];
   {pageX, pageY} = ImageDimensions[binarizedImage];
   {cropX, cropY} = crop;
   (* the cropping call was garbled in the original; ImageTake
      (row range first, then column range) is a reconstruction *)
   pageCropped = 
    ImageTake[binarizedImage, {cropY, pageY - cropY}, {cropX, pageX - cropX}];
   pageLarger = ImageResize[pageCropped, {pageX + width}];
   AppendTo[compressedPages, pageLarger],
   {i, 1, pageCount}];
  compressedPages]

magento2 – Magento 2, export data from magento 2 into a csv file using PHP

I need to export data from Magento 2 into a CSV file. The data needs to look like the example below: I want to know which customer ordered what, and how much.


AN is the product ID, Menge is the quantity, and Bpr is the gross (brutto) price.

How can I do this using PHP?

Cheers, John
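One approach is to query Magento 2’s sales tables directly with PDO and write the rows out with fputcsv. The sketch below is an assumption-heavy illustration: it builds an in-memory SQLite mock of the sales_order / sales_order_item tables so the example is self-contained, where a real store would instead point PDO at its MySQL database (and may have different column choices for price):

```php
<?php
// Mock schema standing in for Magento 2's sales tables; on a real
// store, connect to MySQL and skip the CREATE/INSERT statements.
$pdo = new PDO('sqlite::memory:');
$pdo->setAttribute(PDO::ATTR_ERRMODE, PDO::ERRMODE_EXCEPTION);

$pdo->exec("CREATE TABLE sales_order (entity_id INTEGER, customer_email TEXT)");
$pdo->exec("CREATE TABLE sales_order_item (order_id INTEGER, product_id INTEGER,
            qty_ordered REAL, price_incl_tax REAL)");
$pdo->exec("INSERT INTO sales_order VALUES (1, 'jane@example.com')");
$pdo->exec("INSERT INTO sales_order_item VALUES (1, 42, 3, 9.99)");

// One row per ordered item: customer, product ID (AN),
// quantity (Menge), and gross price (Bpr).
$stmt = $pdo->query(
    "SELECT o.customer_email, i.product_id, i.qty_ordered, i.price_incl_tax
     FROM sales_order_item i
     JOIN sales_order o ON o.entity_id = i.order_id"
);

$out = fopen('export.csv', 'w');
fputcsv($out, ['Customer', 'AN', 'Menge', 'Bpr']);
while ($row = $stmt->fetch(PDO::FETCH_NUM)) {
    fputcsv($out, $row);
}
fclose($out);
echo file_get_contents('export.csv');
```

The cleaner (but longer) Magento-native route would be the OrderRepository service classes; raw SQL is shown here only because the question asks for plain PHP.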

samsung galaxy s 10 – Cannot export image from app

I have an app on my Samsung S10, and its “export image” function does not work. I tried telling the developers, but they insisted that it works. I’m trying again to understand what must be done, and whether the bug is on my side or theirs. Logcat shows:

E/DatabaseUtils: Writing exception to parcel
    java.lang.SecurityException: Permission Denial: reading androidx.core.content.FileProvider uri content://se.seb.privatkund.fileprovider/external_files/Pictures/SEB/20200525-220849.jpg from pid=8357, uid=1000 requires the provider be exported, or grantUriPermission()
        at android.content.ContentProvider.enforceReadPermissionInner(
        at android.content.ContentProvider.semEnforceReadPermission(
        at android.content.ContentProvider$Transport.enforceReadPermission(
        at android.content.ContentProvider$Transport.enforceFilePermission(
        at android.content.ContentProvider$Transport.openTypedAssetFile(
        at android.content.ContentProviderNative.onTransact(
        at android.os.Binder.execTransactInternal(
        at android.os.Binder.execTransact(

numerical integration – Export data from NIntegrate and Piecewise

I use the following to calculate my Integral.


G[ω_, τC_] := Exp[τC Sqrt[1 + ω^2] (1 - ω^2)]

position[x_] := Module[{xx = SetPrecision[x, 20]},
  -2 (mSlope/(α π)) Exp[α xx] NIntegrate[
      G[ω, τC] Cos[
         2 τC ω Sqrt[1 + ω^2] - 
          2 ArcTan[ω] + α xx ω]/(1 + ω^2),
      {ω, 0, ∞}, WorkingPrecision -> 20] + 2 mSlope xx]

positionLeft[x_] := Module[{xx = SetPrecision[x, 20]},
  -2 (mSlope/(α π)) Exp[α xx] NIntegrate[
     G[ω, τC] Cos[
        2 τC ω Sqrt[1 + ω^2] - 
         2 ArcTan[ω] + α xx ω]/(1 + ω^2),
     {ω, 0, ∞}, WorkingPrecision -> 20]]

W[x_] := Piecewise[{{position[x], x >= 0}, {positionLeft[x], x < 0}}]

τC = 1*^-6; α = 10^-3; mSlope = 1/5;

I would like to save W[x] for every x from -250 to 250. I have tried Export["result.txt", W[k_], {k, -200, 200}], but it does not save anything. How do I save the data so that the first column is x and the second is W[x]?
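For reference, a minimal sketch of the usual pattern (assuming a step size of 1 in x) is to build the {x, W[x]} pairs with Table and export them using the "Table" format, which writes tab-separated columns:

```mathematica
data = Table[{x, W[x]}, {x, -250, 250}];
Export["result.txt", data, "Table"]
```

The attempted Export["result.txt", W[k_], {k, -200, 200}] fails because Export takes the data itself as its second argument, not an iterator specification.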

export – saving edited Mathematica image at high resolution

I’ve created several graphs for a book I’m writing by using Mathematica commands to create an output image and then adding labels, etc. with Mathematica’s drawing tools. My issue is that I can’t seem to save the edited version of the images at high enough dpi resolution.

I’ve looked at Export with the ImageResolution option, but it seems only to affect the original output image, not the edited version with my added writing/labels.

Does anyone have any ideas?

export – How to import magento products’ data to Akeneo Pim?

I’ve installed the Magento-side connector, but it’s expensive to buy a PIM-side connector to import Magento data into the PIM. Are there any instructions for importing products from Magento into Akeneo PIM?
I have more than 3000 products, with their attribute sets, in Magento.

plugins – Exporting post content from wordpress using WP All Export and wpautop() php function to include paragraph tags

I have exported my post content from WordPress using the WP All Export plugin; however, the content of my posts has been exported as HTML but is missing paragraph tags, causing all the text to run together when I import it into my new website.

I have read this guide:

And would like to apply the solution described here: when exporting the Content field.

However, my knowledge of PHP is poor at best.

Below is a screenshot of what I believe I need to do. Will this work? I can’t test it, as I don’t yet have the full version of the plugin, and I don’t want to waste money purchasing the full version of WP All Export in case this won’t work.

I don’t want to use this solution, as I need to leave the existing WordPress website as-is: How do I add paragraph tags to all of my posts after relying on wpautop? What I do want to do is what that post suggests at the end: “apply the autop filter in your export routine”.

Can anyone please provide some guidance?
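For what it’s worth, WP All Export lets you pass a field through a custom PHP function defined in its Function Editor, and you can apply the autop filter that way. A minimal sketch (the function name is made up, and this assumes the paid version’s export template supports the `[function({Field})]` syntax):

```php
<?php
// Paste into WP All Export's Function Editor, then export the
// field as [wrap_in_paragraphs({Content})] instead of {Content}.
function wrap_in_paragraphs( $content ) {
    // wpautop() is WordPress's "autop" filter: it converts
    // double line breaks into <p>...</p> tags.
    return wpautop( $content );
}
```

This runs inside WordPress, so wpautop() is already available; nothing on the destination site needs to change.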


gcloud – Export Google Cloud SQL slow logs to ELK stack

I stumbled upon an issue and decided to ask for advice and eventually find someone with the same business need (and problem).

Summary – we’ve recently migrated the SQL service of one of our clients from a self-hosted MySQL to the Google CloudSQL MySQL 5.7 service and we’re now looking for a way to allow our client to access/view and analyze the MySQL slow logs in their self-hosted ELK stack.

This was not a problem before, as we had access to the SQL service’s slow log file and managed to “follow”, export, and index those logs in Elasticsearch. A quick example of how a particular Elasticsearch MySQL slow-log document looked before is shown below:

Please note that every single document contained:

- the Select/Insert/Update/Delete query
- the ability to analyze data based on lock_time, rows_queried, rows_affected, rows_returned, etc.

Here is an example of how the MySQL slow logs are displayed in the GCP Log viewer to which our client doesn’t have access:

So our goal is to stream the MySQL slow logs to the ELK stack in a way similar to the one used before.

To achieve our goal, we tried to pull the GCP MySQL slow logs via a Pub/Sub exporter and stream the logs to the ELK stack. For that purpose, we did the following:
1. created a log sink (Log Router) by using the filter below:
and exported to Google’s Pub/Sub sink service
2. on a Google Compute Engine VM, installed and configured a file-exporting service named pubsubbeat (similar to Filebeat’s pubsub input) to stream the SQL slow logs from GCP to a file on that VM
3. configured a Filebeat service to follow the logs exported by GCP on the VM, using include_lines: (‘textPayload’) to keep only the important information from each JSON object pulled from GCP

NB: GCP MySQL slow logs are collected via the google-fluentd agent and are represented using a single data type, LogEntry, which defines common data for all log entries and carries the individual payloads. The format is JSON, and each log line is encapsulated in a separate JSON object. Example:

Immediate problem – Pub/Sub makes some tradeoffs to achieve its high availability and throughput, and unfortunately receiving messages out of order is one of them. In other words, the order of the MySQL slow-log lines gets mixed up, and we cannot properly reassemble each separate slow-log entry by detecting that it starts with the “^# Time” string; instead, we have the following format:

So it would be a great help if someone could share how they export multi-line log files (or MySQL slow logs directly) from GCP to an external log-analysis system such as an ELK stack, so that we can get a better understanding of the best approach in this situation.
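For context, the reassembly that ordering loss breaks is simple when lines do arrive in order: every entry starts at a `# Time:` header. A runnable sketch (the sample payload lines are invented):

```python
import re

def group_slow_log(lines):
    """Group MySQL slow-log lines into one list per entry.

    Each entry is assumed to start with a '# Time:' header line;
    this only works when lines arrive in their original order,
    which Pub/Sub does not guarantee.
    """
    entries, current = [], []
    for line in lines:
        if re.match(r"^# Time", line):
            if current:
                entries.append(current)
            current = [line]
        elif current:
            current.append(line)
    if current:
        entries.append(current)
    return entries

# Invented sample payload lines in their original order:
sample = [
    "# Time: 2020-05-25T22:08:49.000000Z",
    "# Query_time: 12.3  Lock_time: 0.1  Rows_sent: 5  Rows_examined: 10000",
    "SELECT * FROM orders;",
    "# Time: 2020-05-25T22:09:02.000000Z",
    "# Query_time: 3.4  Lock_time: 0.0  Rows_sent: 1  Rows_examined: 2000",
    "UPDATE orders SET status = 'done';",
]
print(len(group_slow_log(sample)))  # → 2
```

With shuffled delivery, a stateless grouper like this misassigns lines, which is exactly the failure described above.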

powershell – Script to export site details SharePoint 2013

You can use the below script:

Function GetGeneralInfo($siteUrl, $OutputFile) 
{ 
    # Write CSV (TAB-separated file) header 
    "Name `t Value" | Out-File $OutputFile 
    $ContentDB = Get-SPContentDatabase -site $siteUrl 
    $ContentDatabaseSize = [Math]::Round(($ContentDB.DiskSizeRequired/1GB), 2) 
    "Database Name `t $($ContentDB.Name)" | Out-File $OutputFile -Append 
    "Database ID `t $($ContentDB.ID)" | Out-File $OutputFile -Append 
    "Site count `t $($ContentDB.CurrentSiteCount)" | Out-File $OutputFile -Append 
    "Maximum site count `t $($ContentDB.MaximumSiteCount)" | Out-File $OutputFile -Append 
    "Can Migrate `t $($ContentDB.CanMigrate)" | Out-File $OutputFile -Append 
    "Content DB Size `t $($ContentDatabaseSize) GB" | Out-File $OutputFile -Append 
    "Database Servername `t $($ContentDB.Server)" | Out-File $OutputFile -Append 
    "Connection String `t $($ContentDB.DatabaseConnectionString)" | Out-File $OutputFile -Append 
    "Display Name `t $($ContentDB.DisplayName)" | Out-File $OutputFile -Append 
    "Schema `t $($ContentDB.SchemaVersionXml)" | Out-File $OutputFile -Append 
    GetWebSizes -StartWeb $siteUrl 
}

How to use the above function?

GetGeneralInfo "http://spFarm" "C:\GeneralInfo.csv"

You can also call the above function inside a foreach loop over your site collections.


SharePoint 2013: Get Site information using PowerShell Script