linux – Speedtest Ookla No Timestamp in CSV output

When running the official Speedtest client from Ookla and outputting to a CSV file, I've noticed the output has no timestamp field. The JSON output does include one, but I'm not particularly savvy with jq, and my attempts to convert the JSON output to CSV haven't been useful.

Is there a way to take the output and pipe it to a file with a timestamp in the front?

This is the output given as JSON:

{"type":"result","timestamp":"2021-07-22T16:14:17Z","ping":{"jitter":0.035999999999999997,"latency":3.9399999999999999},"download":{"bandwidth":117078051,"bytes":884657048,"elapsed":7601},"upload":{"bandwidth":117029963,"bytes":467614102,"elapsed":4006},"packetLoss":0,"isp":"CenturyLink","interface":{"internalIp":"192.168.0.35","name":"eth0","macAddr":"E4:5F:01:2F:1D:39","isVpn":false,"externalIp":"71.214.44.165"},"server":{"id":10161,"name":"CenturyLink","location":"Orlando, FL","country":"United States","host":"orlando.speedtest.centurylink.net","port":8080,"ip":"205.171.98.14"},"result":{"id":"64657421-d008-4053-9832-2d1a9b01b649","url":"https://www.speedtest.net/result/c/64657421-d008-4053-9832-2d1a9b01b649"}}

and this is the CSV output (with headers):

"server name","server id","latency","jitter","packet loss","download","upload","download bytes","upload bytes","share url"
"The Villages - The Villages, FL","25753","33.338","0.302","0","117318528","112406990","1488776432","1053747984","https://www.speedtest.net/result/c/8bbb92b8-880d-4021-b5e5-c90206862d18"
"CenturyLink - Orlando, FL","10161","4.013","0.399","0","76816660","112435444","1158108878","473391675","https://www.speedtest.net/result/c/17508892-6fc7-4616-84bb-810d314c50af"
"CenturyLink - Orlando, FL","10161","3.533","0.407","0","115293486","97552291","1002647576","574510787","https://www.speedtest.net/result/c/9913a846-1fbf-4d69-a1e9-27430914d397"

All I'm trying to do is get the timestamp field that the JSON output includes into the CSV output so I can process it further.
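For reference, this is the shape of what I'm after, sketched in Python rather than jq: read the JSON result (as shown above) from stdin and append one CSV row with the timestamp in front. The output file name results.csv and the exact invocation are placeholders, not something the client provides.

    # Minimal sketch: pipe the JSON result into this script, e.g.
    #   speedtest --format=json | python3 speedtest_csv.py      (assumed invocation)
    # Field names below are taken from the JSON output shown above.
    import csv
    import json
    import sys

    result = json.load(sys.stdin)

    row = [
        result["timestamp"],                  # e.g. 2021-07-22T16:14:17Z
        result["server"]["name"],
        result["server"]["id"],
        result["ping"]["latency"],
        result["ping"]["jitter"],
        result["packetLoss"],
        result["download"]["bandwidth"],
        result["upload"]["bandwidth"],
        result["download"]["bytes"],
        result["upload"]["bytes"],
        result["result"]["url"],
    ]

    # "results.csv" is an assumed output path; rows are appended with the timestamp first.
    with open("results.csv", "a", newline="") as f:
        csv.writer(f).writerow(row)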


Simple JSON to CSV in MS Power Automate problem

Good Afternoon,

I am trying to convert the JSON returned by the URL https://api0.solar.sheffield.ac.uk/pvlive/v3/pes/0 to CSV via MS Power Automate.

I use an HTTP action to GET the above URL.

And this is where it goes wrong. I parse the JSON using the following schema:

    {
        "type": "object",
        "properties": {
            "data": {
                "type": "array",
                "items": {
                    "type": "array"
                }
            },
            "meta": {
                "type": "array",
                "items": {
                    "type": "string"
                }
            }
        }
    }

which I then pass on to a CSV table action, which needs an array, not an object.
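For context, this is what the transformation looks like outside Power Automate: a minimal Python sketch that assumes the response really has the {"data": [[...], ...], "meta": [...]} shape the schema above describes (the output file name is a placeholder).

    # Minimal sketch: fetch the endpoint and write the "data" rows to CSV,
    # using "meta" as the header row (assumed to hold the column names).
    import csv
    import json
    import urllib.request

    url = "https://api0.solar.sheffield.ac.uk/pvlive/v3/pes/0"
    with urllib.request.urlopen(url) as resp:
        payload = json.load(resp)

    with open("pvlive.csv", "w", newline="") as f:    # placeholder output path
        writer = csv.writer(f)
        writer.writerow(payload["meta"])              # header row from "meta"
        writer.writerows(payload["data"])             # each inner array becomes one CSV row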

Any help is most appreciated.

Bob

csv – Amasty Import Product Extension Field Mapping

Does anybody know how to properly configure Amasty Product Import Field Mappings or where to find detailed documentation about this?

On their website I can only find basic module configuration docs, nothing about how to create categories or map attributes to source data.

Platform: Magento 2.4.2.
Extension: https://amasty.com/import-products-for-magento-2.html

magento2 – How to get columns header of csv in custom import module

Hi, I created a custom module to import data with the help of this module (https://github.com/Smile-SA/magento2-module-custom-entity-import-export).

In this module I need to get the CSV header names.
I have tried several ways, but it's not working.

I tried the getSource() method, but the system says the source is not set.
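For clarity, all I ultimately need is the equivalent of reading the first row of the import file; a minimal Python sketch outside Magento (the file path is just a placeholder):

    # Minimal sketch of the goal: read the header (first) row of a CSV file.
    # "import.csv" is a placeholder path, not part of my module.
    import csv

    with open("import.csv", newline="") as f:
        headers = next(csv.reader(f))   # list of column names from the first row

    print(headers)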

If anyone knows the answer, please help me solve the issue.

Thanks

php – Download all CSV files from FTP recursively

I am trying to get the CSV files from different folders on FTP recursively, but I am getting an error while trying to download. I searched most of the related questions, but none are relevant to the situation where I am stuck.

FTP: csv file location
/home/lanein1/ftpfiles/AIN/2021-07-14/AIN.csv
/home/lanein1/ftpfiles/AOUT/2017-07-14/AOUT.csv
/home/lanein1/ftpfiles/BIN/2021-07-14/BIN.csv
/home/lanein1/ftpfiles/AOUT/2017-07-14/BOUT.csv

I can get to the ftpfiles directory and do a preg_match, but I am getting "Invalid argument supplied for foreach()".

Code:

$login_result = ftp_login($con, 'usr', 'pwd');   // placeholder credentials
ftp_pasv($con, true);

// This is where I am stuck: how to pass the different folders (AIN, AOUT, BIN, ...)
$basedir = '/home/lanein1/ftpfiles/';
$path = $basedir . date('Y-m-d', strtotime('-1 days'));   // folders use Y-m-d, e.g. 2021-07-14

$files = ftp_nlist($con, $path);
if ($files !== false) {                     // ftp_nlist() returns false on failure, which is what
    foreach ($files as $file) {             // triggers "Invalid argument supplied for foreach()"
        if (preg_match('/\.csv$/i', $file)) {
            echo "Found $file\n";
            ftp_get($con, basename($file), $file, FTP_BINARY);   // local file name, then remote path
        }
    }
}
ftp_close($con);
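The recursive part is where I'm stuck, so here is the shape of what I mean, sketched with Python's ftplib just to illustrate; the host, credentials, and the assumption that the server returns full paths from NLST are all placeholders.

    # Sketch of the recursive idea: walk every folder under ftpfiles and download
    # any *.csv found. Host/credentials are placeholders; assumes NLST returns
    # full paths (adjust if the server returns bare names).
    from ftplib import FTP, error_perm

    def fetch_csvs(ftp, path):
        for entry in ftp.nlst(path):
            name = entry.rsplit("/", 1)[-1]
            if name in (".", ".."):
                continue
            try:
                ftp.cwd(entry)              # succeeds -> entry is a directory
                fetch_csvs(ftp, entry)      # recurse into it
            except error_perm:
                if name.lower().endswith(".csv"):
                    with open(name, "wb") as f:
                        ftp.retrbinary("RETR " + entry, f.write)

    ftp = FTP("ftp.example.com")            # placeholder host
    ftp.login("usr", "pwd")                 # placeholder credentials
    fetch_csvs(ftp, "/home/lanein1/ftpfiles")
    ftp.quit()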

addtocart – How to keep ‘In stock’ even though the qty is 0 after a CSV file import

We are currently running a cron job to import sku, qty, and price every 30 min.

If the qty for a product becomes 0, the stock status changes from ‘In stock’ to ‘Out of stock’ automatically and the ‘Add to Cart’ button disappears from the product page, which makes it impossible for customers to add the part to the cart.

I also allowed backorders in the configuration, but it made no difference.


I’d like to display the Add to cart button even though the qty is 0.

Can you please share your wisdom?

The Magento version is 2.3.4.

Java: Is there an easy way to append a new column that will have the same value for all rows in a CSV?

I am processing a file as follows and closing it. Later, I do some more processing and get an ID that I want to append to every row of the previously generated CSV, so all rows will have the same value in that column.

My initial code for creating and appending data to the CSV:


    public void writeToFile(String[] tIds, PrintWriter printWriter) throws DataNotFoundException {
        int rowCount = 0;
        for (String id : tIds) {
            Data data = util.getData(id);
            csvHelper.prepareFileData(data, this.stringBuilder);
            rowCount++;

            if (rowCount == CHUNK_SIZE) {
                printWriter.println(this.stringBuilder.toString());
                this.stringBuilder = new StringBuilder();
                rowCount = 0;
            }
        }
        // Flush any rows left over from the last (partial) chunk.
        if (this.stringBuilder.length() > 0) {
            printWriter.println(this.stringBuilder.toString());
            this.stringBuilder = new StringBuilder();
        }
        printWriter.close();
    }

Now, further processing returns a processedId that I want to append to all rows as a new column.

One option is this:

public void appendAgain(String processedId) throws IOException {
    BufferedReader br = new BufferedReader(new FileReader(feedFile));
    StringBuilder output = new StringBuilder();
    String line;
    while ((line = br.readLine()) != null) {
        // Append the id as a new last column rather than after every comma.
        output.append(line).append(",").append(processedId).append(System.lineSeparator());
    }
    br.close();
    FileWriter fw = new FileWriter(feedFile, false); // false: overwrite the file with the rebuilt contents
    fw.write(output.toString());
    fw.flush();
    fw.close();
}
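The other direction I've been considering is to stream the file line by line into a temporary file and then swap it in, instead of holding everything in one string. Sketched here in Python only to show the shape (the file names are placeholders); the same structure applies in Java with a BufferedReader/BufferedWriter pair.

    # Sketch of the streaming approach: copy the CSV line by line, appending the
    # processed id as a new last column, then replace the original file.
    import os

    def append_column(csv_path, processed_id):
        tmp_path = csv_path + ".tmp"
        with open(csv_path) as src, open(tmp_path, "w") as dst:
            for line in src:
                dst.write(line.rstrip("\n") + "," + processed_id + "\n")
        os.replace(tmp_path, csv_path)      # swap the rewritten file in

    append_column("feed.csv", "processed-123")   # "feed.csv" is a placeholder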

Please comment on a better way or any suggestions on the current one. Thanks!

woocommerce offtopic – Import ID Categories from CSV file

I’m trying to import a CSV file with custom header columns.

My 4 header columns are: id categoria ; id Padre ; Nombre ; Padre

It is in Spanish, but it translates as: category ID, parent ID, Name, Parent.

The issue I'm struggling with is that I have to match my columns to specific WooCommerce descriptions, because I'm using the basic import tool provided by WooCommerce.

I don't know what to match "id Category" with. I was able to match Nombre/Name, "id Padre"/Parent and Padre/Category, because the "Padre" column shows all of the category names.

The "ID Category" column shows the numbers I have to import as category IDs, so that my products are then automatically imported into the correct categories.

Any ideas?

Thanks in advance.

tkinter – Adding SQLite data to a CSV with Python

I have this piece of code that queries a database.

Using a combobox you can select an ID that exists in the database, then press the "Generar CSV" button, which generates a document with a single column containing the selected ID.

My problem is that the same procedure should add not only the ID but also all the other data, each in a different column.

For example: when selecting ID 01 in the combobox, the CSV should show

“01, Simon, Perez, Masculino, Mexico”

Each value in its own column. I would post more code, but I'm honestly lost and this is all I have so far; I'd appreciate any help.
Below I attach a small image of the database I have.

from tkinter import *
import tkinter as tk
from tkinter import ttk
import os 
import pandas as pd
import sqlite3


root = Tk()
root.geometry("460x290")
root.config(bg="dark cyan")

#####---CSV generator---#####
def Datos1():
    Combo3_info = Combo3.get()
    # Wrap the value in a list so pandas builds a one-row column instead of failing on a scalar.
    data = {"Nombre": [Combo3_info]}

    archivo = pd.DataFrame(data)
    # Write the header only if the target file does not exist yet (check the same file being written).
    archivo.to_csv("PRUEBA_FECHA.csv", mode="a", header=not os.path.isfile("PRUEBA_FECHA.csv"))
      
      
#####---Layout---#####
Frame1 = Frame(root, bd=2, padx=10, pady=3)
Label(Frame1, text="Seleccione un ID: ", font=('Times', 14)).grid(row=1, column=0, sticky=W, pady=10)
Combo3 = ttk.Combobox(Frame1, font=('Times', 15), width=25)
Combo3.grid(row=1, column=1, pady=10, padx=20, sticky=W)

Boton1 = Button(Frame1, text="Generar CSV", width=10, command = Datos1).grid(row=4, column=1, sticky=W, pady=10)
Frame1.place(x=20, y=40)

#####---DB query---#####
def combo_Name():
    conn = sqlite3.connect('DB1.db')
    cur = conn.cursor()
    cur.execute('SELECT ID FROM Usuario')

    data = []                     # use a list; a tuple cannot be appended to
    for row in cur.fetchall():
        data.append(row[0])       # index the row with [], not ()

    cur.close()                   # close before returning; code after return never runs
    conn.close()
    return data

Combo3['values'] = combo_Name()   # set combobox values via item access, not a call
        
root.mainloop()

[screenshot of the database table mentioned above]
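A minimal sketch of the direction I think I need (not working code from my app): query the whole row for the selected ID and write every column to the CSV. The table name Usuario and the column ID come from the code above; the remaining column names are whatever the query returns.

    # Minimal sketch: fetch the full row for the selected ID and append every
    # column to the CSV, writing a header row only the first time.
    import csv
    import os
    import sqlite3

    def export_row(selected_id, csv_path="PRUEBA_FECHA.csv"):
        conn = sqlite3.connect("DB1.db")
        cur = conn.cursor()
        cur.execute("SELECT * FROM Usuario WHERE ID = ?", (selected_id,))
        row = cur.fetchone()
        columns = [d[0] for d in cur.description]   # column names straight from the query
        cur.close()
        conn.close()

        write_header = not os.path.isfile(csv_path)
        with open(csv_path, "a", newline="") as f:
            writer = csv.writer(f)
            if write_header:
                writer.writerow(columns)
            writer.writerow(row)    # e.g. 01, Simon, Perez, Masculino, Mexico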
