java – How to connect to a MySQL Database with JDBC using Electron

So I’ve been tinkering (fresh out of college) with creating a DB app via Electron as practice. I’ve successfully created a Java-only program that uses JDBC to connect to a MySQL database, but now I have to integrate that with an Electron GUI somehow, and frankly I have no idea how to proceed.

Most of what I’ve found explains that connecting to a database directly from JavaScript is possible but strongly discouraged because of the security issues, so I’d like to do this as securely as possible.

Would anyone know how to run Java within an Electron application to connect to a database, or is there some other method I’m overlooking that makes this seemingly Sisyphean task simpler?

Sorry for the possible ignorance here, still learning the ins and outs of different frameworks and such. Any help is appreciated.

TL;DR: I need a way to ‘talk’ to my Java backend from Electron. I have to be able to enter a username/password in the Electron GUI, send it to the more secure Java backend to be verified, connect, and then hand that connection back to Electron to be worked with.
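One idea I keep coming back to (just a sketch, not something from any docs: the /login path, port 8087, the database URL and the plain "user:password" body format are all made up for illustration) is running the Java side as a tiny local HTTP service that Electron calls, so the JDBC connection itself never leaves the Java process:

import com.sun.net.httpserver.HttpServer;
import java.io.OutputStream;
import java.net.InetSocketAddress;
import java.nio.charset.StandardCharsets;
import java.sql.Connection;
import java.sql.DriverManager;
import java.sql.SQLException;

public class LoginBridge {
    public static void main(String[] args) throws Exception {
        // Local-only endpoint for the Electron app; port and path are arbitrary placeholders.
        HttpServer server = HttpServer.create(new InetSocketAddress("127.0.0.1", 8087), 0);
        server.createContext("/login", exchange -> {
            // Placeholder wire format: "user:password" in the request body.
            String body = new String(exchange.getRequestBody().readAllBytes(), StandardCharsets.UTF_8);
            String[] parts = body.split(":", 2);
            String user = parts[0];
            String pass = parts.length > 1 ? parts[1] : "";
            String reply;
            int status;
            // Requires MySQL Connector/J on the classpath; the URL is a placeholder.
            try (Connection ignored = DriverManager.getConnection(
                    "jdbc:mysql://localhost:3306/mydb", user, pass)) {
                reply = "ok";       // credentials verified; keep the Connection on the Java side
                status = 200;
            } catch (SQLException e) {
                reply = "denied";
                status = 401;
            }
            byte[] out = reply.getBytes(StandardCharsets.UTF_8);
            exchange.sendResponseHeaders(status, out.length);
            try (OutputStream os = exchange.getResponseBody()) {
                os.write(out);
            }
        });
        server.start();
    }
}

From Electron it would then just be an HTTP call (e.g. fetch) to that local port. Is that a sane shape, or is there a more standard bridge?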

Also, it seems I can’t create an Electron tag since I’m not above 300 reputation, so I’ve used node.js instead.

performance tuning – Typical time for JDBC import in a PostgreSQL Database

I need to import data from 70 XML files into a Database.

In the past I converted the XML files with an XSLT into SQL INSERT statements and imported them with psql. This had some limitations (error handling), but it was quite fast.

Now I have written a Java program to import the data. I imported 327,828 rows in 70 transactions. After the import, pgAdmin 4 reports a database size of 210 MB; the XML data was 250 MB.

But the whole import took over 5 hours. That seems very slow to me; I do not understand how it is possible to spend more than 5 hours on 210 MB of data. But I have no empirical values to compare against either.

How can I find out whether the time the import took is normal for a JDBC import? Is it possible to profile the import, either on the database side or on the client side, to find any bottlenecks? Is it possible to do some kind of speed test to qualify my database setup?
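In case it matters for the answer: the sketch below shows the kind of batched insert I understand is usually recommended for JDBC bulk loads (table and column names are placeholders; reWriteBatchedInserts is a PostgreSQL-driver-specific option I have only read about), so I can compare it against what my program currently does:

import java.sql.Connection;
import java.sql.DriverManager;
import java.sql.PreparedStatement;
import java.util.List;

public class BatchImport {
    public static void importRows(List<String[]> rows) throws Exception {
        // reWriteBatchedInserts lets the PostgreSQL driver collapse a batch into multi-value INSERTs.
        String url = "jdbc:postgresql://localhost:5432/mydb?reWriteBatchedInserts=true";
        try (Connection con = DriverManager.getConnection(url, "user", "secret");
             PreparedStatement ps = con.prepareStatement(
                     "insert into target_table (col_a, col_b) values (?, ?)")) {
            con.setAutoCommit(false);            // commit per batch of work, not per row
            int count = 0;
            for (String[] row : rows) {
                ps.setString(1, row[0]);
                ps.setString(2, row[1]);
                ps.addBatch();
                if (++count % 1000 == 0) {
                    ps.executeBatch();           // flush every 1000 rows
                }
            }
            ps.executeBatch();
            con.commit();
        }
    }
}

On the database side I could presumably also enable statement duration logging (log_min_duration_statement) to see where the time goes, but I do not know what numbers would count as "normal".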

jdbc – DataModeler Oracle – Creating Data Driver Jar

Hey guys!

I’m trying to create/import a new Data Dictionary in Oracle Data Modeler from a SQL Server database. However, when I test the connection it returns the following message: “Status: Failure – Test Failed: Driver Class not found. Verify the driver location”, even though the driver location is correct.

It’s a SQL Server 2016 database and I’m using the mssql-jdbc-8.2.2.jre8.jar driver (both are in English/enu).
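Is there a quick way to rule out the jar itself? Something like the scratch check below is what I had in mind (the classpath in the comment is a placeholder; the class name is the standard one for Microsoft's driver):

public class DriverCheck {
    public static void main(String[] args) throws Exception {
        // Run with: java -cp .:/path/to/mssql-jdbc-8.2.2.jre8.jar DriverCheck
        Class<?> driver = Class.forName("com.microsoft.sqlserver.jdbc.SQLServerDriver");
        System.out.println("Loaded " + driver.getName());
    }
}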

Thanks in advance, guys!

Apache Calcite CSV packaging example in a JDBC driver

The CSV example in
https://calcite.apache.org/docs/tutorial.html
shows how to access CSV data using sqlline.

Does anyone know of a tutorial showing how to create a standalone JDBC driver from scratch for the CSV sample, so that it can be used e.g. from SquirrelSQL?
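For context, my understanding is that the CSV adapter can already be reached from plain Java over JDBC with a model file, roughly like the sketch below (the model path and the need for calcite-core plus calcite-example-csv on the classpath are my assumptions about the tutorial checkout). What I am missing is how to package that into a single driver jar that a tool like SquirrelSQL can load:

import java.sql.Connection;
import java.sql.DriverManager;
import java.sql.ResultSet;
import java.sql.Statement;
import java.util.Properties;

public class CalciteCsvSmokeTest {
    public static void main(String[] args) throws Exception {
        Properties info = new Properties();
        // Path to the tutorial's model file; adjust to wherever your checkout puts it.
        info.setProperty("model", "example/csv/src/test/resources/model.json");
        try (Connection conn = DriverManager.getConnection("jdbc:calcite:", info);
             Statement stmt = conn.createStatement();
             ResultSet rs = stmt.executeQuery("select * from emps")) {
            while (rs.next()) {
                System.out.println(rs.getString(1));
            }
        }
    }
}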

resttemplate – Spring Session JDBC, session replication in multiple applications

I am trying to use Spring Session JDBC in Spring Boot in multiple apps with the same JDBC configuration.

The intention is to have one centralized session shared by multiple applications in a microservices architecture.

I followed some tutorials and it is working very well.

Both applications are REST services and communicate via RestTemplate.

When I call the second app from my first app, I set the Spring Session cookie on the REST call.

The first time, the call to the second application does not receive a session, but all subsequent calls do pick up the session. I am sending the same session key in every case.

    List<String> cookies = new ArrayList<>();
    Cookie[] cookies2 = request.getCookies();

    // keep only the Spring Session cookie and forward it to the second service
    for (int i = 0; i < cookies2.length; i++) {
        String name = cookies2[i].getName();
        String value = cookies2[i].getValue();
        if ("SESSION".equals(name)) {
            cookies.add(name + "=" + value);
            break;
        }
    }
    System.out.println(request.getSession().getId());

    HttpHeaders headers = new HttpHeaders();
    headers.setContentType(MediaType.APPLICATION_FORM_URLENCODED);
    if (cookies.size() > 0)
        headers.set("Cookie", cookies.stream().collect(Collectors.joining("; ")));

    String requestJson = "data=" + objectMapper.writeValueAsString(user);
    System.out.println(requestJson);
    HttpEntity<String> entity = new HttpEntity<>(requestJson, headers);

    ResponseEntity<User> forEntity = restTemplate.exchange(
            "http://SESSION-DATA-SERVICE/SessionService/sessionService/accountService",
            HttpMethod.POST, entity, User.class);
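For completeness, here is a sketch of the shared session configuration I would expect both services to need so they resolve the same SESSION cookie (the explicit DefaultCookieSerializer settings are my own assumption, not something the tutorials I followed require):

    import org.springframework.context.annotation.Bean;
    import org.springframework.context.annotation.Configuration;
    import org.springframework.session.jdbc.config.annotation.web.http.EnableJdbcHttpSession;
    import org.springframework.session.web.http.CookieSerializer;
    import org.springframework.session.web.http.DefaultCookieSerializer;

    @Configuration
    @EnableJdbcHttpSession
    public class SessionConfig {

        @Bean
        public CookieSerializer cookieSerializer() {
            DefaultCookieSerializer serializer = new DefaultCookieSerializer();
            serializer.setCookieName("SESSION");   // must match the cookie name being forwarded
            serializer.setCookiePath("/");         // share the cookie across contexts
            return serializer;
        }
    }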

jdbc: how to pass the current date and time of the java servlet to db2 database?

package controller;

import java.io.IOException;
import java.io.PrintWriter;
import java.sql.Connection;
import java.sql.PreparedStatement;
import java.sql.ResultSet;

import javax.servlet.ServletException;
import javax.servlet.annotation.WebServlet;
import javax.servlet.http.HttpServlet;
import javax.servlet.http.HttpServletRequest;
import javax.servlet.http.HttpServletResponse;

import bean.mybean;
import connection.connect;
import daou.mydaou;

/**
 * Servlet implementation class uregister
 */
@WebServlet("/uregister")
public class uregister extends HttpServlet {
    private static final long serialVersionUID = 1L;

    /**
     * @see HttpServlet#HttpServlet()
     */
    public uregister() {
        super();
        // TODO Auto-generated constructor stub
    }

    /**
     * @see HttpServlet#service(HttpServletRequest request, HttpServletResponse response)
     */
    protected void service(HttpServletRequest request, HttpServletResponse response) throws ServletException, IOException {
        // TODO Auto-generated method stub
        try{
            Connection con=connect.dbcon();
            int u_id=0;
            PrintWriter out=response.getWriter();
            String name=request.getParameter("name");
            String u_mail=request.getParameter("u_mail");
            String u_password=request.getParameter("u_password");

            PreparedStatement ps=con.prepareStatement("select max(u_id) from uregister");
            ResultSet rs=ps.executeQuery();
            if(rs.next())
            {
                u_id=rs.getInt(1);
                u_id++;
                PreparedStatement p=con.prepareStatement("insert into uregister values(?,?,?,?)");
                p.setInt(1, u_id);
                p.setString(2, name);
                p.setString(3, u_mail);
                p.setString(4, u_password);

                int i=p.executeUpdate();
                if(i>0)
                {
                    response.sendRedirect("ulogin.html");
                }
                else
                {
                    response.sendRedirect("uregister.jsp");
                }
            }

        }catch(Exception e){
            e.printStackTrace();
        }
    }
}

and the table is

CREATE TABLE uregister(
u_id INT GENERATED BY DEFAULT AS IDENTITY NOT NULL,
name VARCHAR(150) NOT NULL,
u_mail VARCHAR(150) NOT NULL,
u_password VARCHAR(8) NOT NULL,
date DATE DEFAULT CURRENT_DATE,
PRIMARY KEY(u_id,u_mail)
);
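For the date/time part specifically, my understanding (a sketch, not tested against DB2; the explicit column list and leaving u_id to the identity are my assumptions) is that the servlet would pass the current date through a PreparedStatement parameter like this:

import java.sql.Connection;
import java.sql.PreparedStatement;
import java.sql.SQLException;

public class RegisterDates {
    // Sketch: explicit column list so the identity column and the DATE default are handled cleanly.
    static void insertWithDate(Connection con, String name, String u_mail, String u_password)
            throws SQLException {
        String sql = "insert into uregister (name, u_mail, u_password, date) values (?, ?, ?, ?)";
        try (PreparedStatement p = con.prepareStatement(sql)) {
            p.setString(1, name);
            p.setString(2, u_mail);
            p.setString(3, u_password);
            // Current date from the servlet's JVM clock; a DATE column carries no time of day.
            p.setDate(4, java.sql.Date.valueOf(java.time.LocalDate.now()));
            // For date AND time you would need a TIMESTAMP column and setTimestamp:
            // p.setTimestamp(4, java.sql.Timestamp.valueOf(java.time.LocalDateTime.now()));
            p.executeUpdate();
        }
    }
}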

macos – How to configure JDBC on Mac using Eclipse and Oracle

I have used Eclipse before to work on Java projects on Windows; Oracle and Eclipse were already configured by my supervisor. Now I want to do the same on my personal mid-2014 MacBook Pro, which runs Catalina.
I tried setting up Windows in VMware Fusion, but the Oracle installation always ends with a TNS protocol error. Please guide me through configuring and connecting on my Mac (or in a virtual machine), from installation to project creation.

P.S.: Right now I have successfully installed Eclipse on macOS.
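To be clear about what I think the Java side needs (a sketch with placeholder host, service name and credentials; it assumes ojdbc8.jar is on the Eclipse build path and the database itself runs somewhere reachable, e.g. inside the VM): the thin driver should not require any local Oracle client installation on the Mac.

import java.sql.Connection;
import java.sql.DriverManager;

public class OracleSmokeTest {
    public static void main(String[] args) throws Exception {
        // Requires ojdbc8.jar (or newer) on the classpath / Eclipse build path.
        String url = "jdbc:oracle:thin:@//db-host:1521/ORCLPDB1"; // host and service are placeholders
        try (Connection conn = DriverManager.getConnection(url, "scott", "tiger")) {
            System.out.println("Connected: " + !conn.isClosed());
        }
    }
}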

Java: integration of MySQL JDBC into the application from the tutorial

I was following this tutorial for JavaFX and Gradle:

JavaFX with Gradle, Eclipse and Scene Builder in OpenJDK11

After completing the tutorial, you have a simple GUI where you enter a lower and an upper limit and generate random numbers between them. The generated numbers and their limits are saved and listed in another view.

After completing this tutorial, I wanted to integrate MySQL as a backend.

The creator of the tutorial acknowledged at some point that a database such as SQLite would be the preferred backend, but that it was beyond the scope of the tutorial. Fortunately, he designed the application around a DAO class, so I just replaced his code there with my own JDBC implementation.

package RandomNumber.repository;

import java.util.*;
import java.sql.*;

import RandomNumber.models.RandomNumber;

public class LocalRandomNumberDAO implements RandomNumberDAO {

    public LocalRandomNumberDAO() {

    }

    private List<RandomNumber> numbers = new ArrayList<>();

    private Connection openConn() throws SQLException {
        return DriverManager.getConnection("jdbc:mysql://localhost:3306/numbers?" +
                                                   "user=demo_java&password=1234");
    }

    @Override
    public boolean save(RandomNumber number) {
        boolean res = false;
        try (
            var conn = openConn();
            var stmt = conn.prepareStatement("insert into numbers_tabl values(?,?,?,?,?)");
        ) {
            stmt.setInt(1, 0);
            stmt.setInt(2, number.getNumber());
            stmt.setInt(3, number.getLowerBounds());
            stmt.setInt(4, number.getUpperBounds());
            stmt.setDate(5, java.sql.Date.valueOf(number.getCreatedAt()));
            res = stmt.execute();
        } catch (SQLException e) {
            e.printStackTrace();
        }
        return res;
    }

    @Override
    public void loadNumbers() {
        try (
            var conn = openConn();
            var stmt = conn.createStatement();
        ) {
            String strQuery = "select * from numbers_tabl;";
            var rset = stmt.executeQuery(strQuery);
            numbers.clear();
            while(rset.next()) {
                numbers.add(new RandomNumber(
                        rset.getDate("created").toLocalDate(),
                        rset.getInt("number"),
                        rset.getInt("min"),
                        rset.getInt("max")
                        )
                    );
            }
        } catch (SQLException e) {
            e.printStackTrace();
        }
    }

    @Override
    public List<RandomNumber> getNumbers() {
        return Collections.unmodifiableList(numbers);
    }
}

I am also calling loadNumbers every time the generate button is pressed, so essentially I insert one row and then re-read all the rows in the database.

Questions:
1. Should I always provide a value for the primary key in the PreparedStatement in the save method (see also the sketch after these questions)? The description of my table follows:

mysql> describe numbers_tabl;
+---------+------+------+-----+---------+----------------+
| Field   | Type | Null | Key | Default | Extra          |
+---------+------+------+-----+---------+----------------+
| id      | int  | NO   | PRI | NULL    | auto_increment |
| number  | int  | NO   |     | NULL    |                |
| min     | int  | NO   |     | NULL    |                |
| max     | int  | NO   |     | NULL    |                |
| created | date | YES  |     | NULL    |                |
+---------+------+------+-----+---------+----------------+
5 rows in set (0.00 sec)

2. What best practices am I blatantly ignoring?
3. Is there a better way? Or rather, is there a library or framework that would be more worth my time?
4. This is the first time I am using Java 11; so far I have used virtually only Java 7/8 features. Is my use of the var keyword appropriate?
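Regarding question 1, this is the alternative save method I am considering (a sketch only; it keeps my table's column names, but swapping execute for executeUpdate and relying on RETURN_GENERATED_KEYS is my assumption about the idiomatic approach). It omits the id column entirely and lets auto_increment do its job:

@Override
public boolean save(RandomNumber number) {
    String sql = "insert into numbers_tabl (number, min, max, created) values (?, ?, ?, ?)";
    try (var conn = openConn();
         var stmt = conn.prepareStatement(sql, Statement.RETURN_GENERATED_KEYS)) {
        stmt.setInt(1, number.getNumber());
        stmt.setInt(2, number.getLowerBounds());
        stmt.setInt(3, number.getUpperBounds());
        stmt.setDate(4, java.sql.Date.valueOf(number.getCreatedAt()));
        int rows = stmt.executeUpdate();          // executeUpdate, not execute, for an INSERT
        try (var keys = stmt.getGeneratedKeys()) {
            if (keys.next()) {
                System.out.println("generated id = " + keys.getLong(1));
            }
        }
        return rows > 0;
    } catch (SQLException e) {
        e.printStackTrace();
        return false;
    }
}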

sql server – JDBC to BigQuery using Dataflow

I am using the GCP Dataflow template provided for "JDBC to BigQuery" (here) and I have entered all the relevant details. However, after running the job, Dataflow fails with an error saying that I have an invalid port number. I have opened the port and also verified that the SQL Server firewall allows the external Compute Engine IP.
(screenshot of the Dataflow error)
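For reference, this is my understanding of the connection string shape the SQL Server driver expects (placeholder host, database and credentials below), plus a quick local check I could run outside Dataflow to verify the string itself:

import java.sql.Connection;
import java.sql.DriverManager;

public class JdbcUrlCheck {
    public static void main(String[] args) throws Exception {
        // Placeholder values; the numeric port follows the host after a colon.
        String url = "jdbc:sqlserver://my-sql-host:1433;databaseName=mydb;encrypt=false";
        try (Connection conn = DriverManager.getConnection(url, "sa", "secret")) {
            System.out.println("Connected: " + conn.getMetaData().getDatabaseProductVersion());
        }
    }
}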

postgresql: JDBC cannot use the current primary database when configuring multiple hosts

  • node1: 192.168.0.1 as primary
  • node2: 192.168.0.2 as standby

The JDBC string is configured like this:

jdbc:postgresql://192.168.0.1:5432,192.168.0.2:5432/mydb

I am using repmgr for replication and automatic failover.

Initially, node1 is the primary:

 ID | Name  | Role    | Status    | Upstream | Location | Priority | Timeline | Connection string
----+-------+---------+-----------+----------+----------+----------+----------+--------------------------------------------------------
 1  | node1 | primary | * running |          | default  | 100      | 7        | host=node1 user=repmgr dbname=repmgr connect_timeout=2
 2  | node2 | standby |   running | node1    | default  | 100      | 7        | host=node2 user=repmgr dbname=repmgr connect_timeout=2

If node1 goes down, the primary role is switched to node2:

 ID | Name  | Role    | Status    | Upstream | Location | Priority | Timeline | Connection string
----+-------+---------+-----------+----------+----------+----------+----------+--------------------------------------------------------
 1  | node1 | primary | - failed  |          | default  | 100      | ?        | host=node1 user=repmgr dbname=repmgr connect_timeout=2
 2  | node2 | primary | * running |          | default  | 100      | 8        | host=node2 user=repmgr dbname=repmgr connect_timeout=2

The application works fine at this point.

But if I recover node1 manually (it rejoins as a standby):

 ID | Name  | Role    | Status    | Upstream | Location | Priority | Timeline | Connection string
----+-------+---------+-----------+----------+----------+----------+----------+--------------------------------------------------------
 1  | node1 | standby |   running | node2    | default  | 100      | 8        | host=node1 user=repmgr dbname=repmgr connect_timeout=2
 2  | node2 | primary | * running |          | default  | 100      | 8        | host=node2 user=repmgr dbname=repmgr connect_timeout=2

It seems that the application then tries to connect to node1 again; since node1 is now in read-only mode, it cannot insert new data. Therefore, I have to change the host order in the JDBC connection string to:

jdbc:postgresql://192.168.0.2:5432,192.168.0.1:5432/mydb

with node2 first, and restart the application. Then it works.

I even tried adding parameters to the connection string in the original host order (node1, then node2):

jdbc:postgresql://192.168.0.1:5432,192.168.0.2:5432/mydb?targetServerType=master&loginTimeout=10&connectTimeout=10&tcpKeepAlive=true

The application still cannot find its existing data and creates it again.

So, is changing the host order the only way in this case? Can't the driver pick the current primary database correctly?
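For completeness, this is the kind of connection code I am experimenting with next (a sketch with placeholder credentials; targetServerType=primary and hostRecheckSeconds are pgjdbc options I have only read about, with primary apparently superseding master in recent driver versions):

import java.sql.Connection;
import java.sql.DriverManager;
import java.sql.ResultSet;
import java.sql.Statement;

public class PrimaryCheck {
    public static void main(String[] args) throws Exception {
        // Both hosts listed; the driver should probe them and pick the writable one.
        String url = "jdbc:postgresql://192.168.0.1:5432,192.168.0.2:5432/mydb"
                + "?targetServerType=primary"   // 'primary' instead of the deprecated 'master'
                + "&hostRecheckSeconds=2";      // re-check cached host roles every 2 seconds
        try (Connection conn = DriverManager.getConnection(url, "myuser", "secret");
             Statement st = conn.createStatement();
             ResultSet rs = st.executeQuery("select pg_is_in_recovery()")) {
            rs.next();
            System.out.println("standby? " + rs.getBoolean(1)); // false means we reached the primary
        }
    }
}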