API for SQL queries with PHP PDO

Any feedback and criticism is very much welcome! I’m writing a very simple CRUD application, and I’m wondering if the way I’m using static methods throughout the code makes sense. I’d very much like to simplify the code, if possible.


<?php

class Database {

    private const DB_DSN = "127.0.0.1";
    private const DB_USER = "root";
    private const DB_PASSWORD = "root";
    private const DB_NAME = "testDB";
    private const DB_PORT = 8889;

    static function connect() {
        return new PDO(
            "mysql:host=".self::DB_DSN.";port=".self::DB_PORT.
            ";dbname=".self::DB_NAME,
            self::DB_USER,
            self::DB_PASSWORD,
            array(PDO::ATTR_ERRMODE => PDO::ERRMODE_EXCEPTION),
        );
    }

    static function create() {
        $pdo = new PDO(
            "mysql:host=".self::DB_DSN.";port=".self::DB_PORT,
            self::DB_USER,
            self::DB_PASSWORD,
            array(PDO::ATTR_ERRMODE => PDO::ERRMODE_EXCEPTION),
        );
        $pdo->exec("CREATE DATABASE IF NOT EXISTS ".self::DB_NAME);
        return self::connect();
    }
}

class User {

    private const TABLE_NAME = "users";

    public $uid;
    public $username;
    public $email;
    public $hash;
    public $created;
    public $verified;
    public $pdo;

    private function __construct() {
    }

    public static function signUp($pdo, $username, $email, $password) {

        if (self::exists($pdo, $username)) {
            return false;
        }

        $hash = password_hash($password, PASSWORD_DEFAULT);
        $stmt = $pdo->prepare("INSERT INTO `users` SET `username`=?, `email`=?, `hash`=?");
        $stmt->execute([$username, $email, $hash]);
        return self::fromUId($pdo, $pdo->lastInsertId());
    }

    public static function login($pdo, $username, $password) {

        if (!self::exists($pdo, $username)) {
            return false;
        }

        $stmt = $pdo->prepare("SELECT * FROM `users` WHERE `username` = ?");
        $stmt->execute([$username]);
        $user = $stmt->fetchObject("User");
        $user->pdo = $pdo;
        return password_verify($password, $user->hash) ? $user : false;
    }

    public static function exists($pdo, $username) {

        $stmt = $pdo->prepare("SELECT `uid` FROM `users` WHERE `username` = ?");
        $stmt->execute([$username]);
        return $stmt->fetch(PDO::FETCH_COLUMN);
    }

    public static function fromUId($pdo, $uid) {

        $stmt = $pdo->prepare("SELECT * FROM `users` WHERE `uid` = ?");
        $stmt->execute([$uid]);
        $user = $stmt->fetchObject("User");
        $user->pdo = $pdo;
        return $user;
    }

    public function verify() {

        $stmt = $this->pdo->prepare("UPDATE `users` SET `verified` = 1 WHERE `uid` = ?");
        if ($stmt->execute([$this->uid])) {
            $this->verified = true;
            return true;
        } else {
            return false;
        }
    }
}

$db = Database::create();

$db->exec(
    "CREATE TABLE IF NOT EXISTS `users` (
        `uid` int NOT NULL PRIMARY KEY AUTO_INCREMENT,
        `username` varchar(100) NOT NULL UNIQUE,
        `email` varchar(100) NOT NULL UNIQUE,
        `verified` boolean DEFAULT 0,
        `hash` varchar(255) NOT NULL,
        `created` timestamp NOT NULL DEFAULT CURRENT_TIMESTAMP
    )"
);

$db->exec(
    "CREATE TABLE IF NOT EXISTS `images` (
        `id` int NOT NULL PRIMARY KEY AUTO_INCREMENT,
        `uid` int NOT NULL,
        `image` varchar(255) NOT NULL,
        `like_count` int NOT NULL DEFAULT 0
    )"
);

$user = User::signUp($db, 'JohnDoe', 'john.doe@sample.org', '12345');

?>

hash – HaveIBeenPwned API – Code Review Stack Exchange

Please find my code for securely checking Have I Been Pwned for breached passwords.

This code sends only a partial hash over the internet (using HTTPS); the second part of the hash is checked on the user's side. Therefore, the full password is never seen by anyone, including Have I Been Pwned.
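This prefix/suffix split can be shown in isolation. A minimal sketch using the same SHA-1 scheme as the code below; only the five-character prefix ever leaves the machine:

```python
import hashlib

def split_sha1(password: str):
    """Return (prefix, suffix) of the uppercase SHA-1 hex digest.

    Only the five-character prefix is sent to the range API; the
    35-character suffix stays local for comparison.
    """
    digest = hashlib.sha1(password.encode("utf-8")).hexdigest().upper()
    return digest[:5], digest[5:]
```

For example, `split_sha1("password")` yields the prefix `5BAA6`, which is what the range query would request.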

import hashlib
import urllib.request

def check_password(firstHash, lateHash):
    link = f"https://api.pwnedpasswords.com/range/{firstHash}"
    f = urllib.request.urlopen(link)
    for line in f.readlines():
        searchedHash, hashCount = str(line, "utf-8").strip().split(":")
        if searchedHash.lower() == lateHash.lower():
            return (f"This password has been breached {hashCount} time(s)")
    return ("This password has not been breached")

def hash_password(password):
    encoded_password = password.encode("UTF-8")
    SHA1Hash = hashlib.sha1()
    SHA1Hash.update(encoded_password)
    return SHA1Hash.hexdigest()

def main():
    password = input("Please enter your password: ")
    fullHash = hash_password(password)
    firstHash, lateHash = fullHash[0:5], fullHash[5:]
    print(check_password(firstHash, lateHash))

if __name__ == "__main__":
    main()

I want to access data from an application using a Python API

I have an application that runs a quiz with 10 questions; each question has 4 radio buttons, and the quiz results have to be mailed to teachers, admins, and parents. The application offers these options:

1. email
2. run at server

Now I need this data from the application for a separate analytics dashboard, so I have created an AWS Ubuntu server, a database, and a table. I need to know how I can get this data through a Python API. Kindly help, it's a humble request.
import urllib.request
import json

url = 'https://hacker-news.firebaseio.com/v0/topstories.json?print=pretty'

def response(url):
    with urllib.request.urlopen(url) as response:
        return response.read()
res = response(url)
print(json.loads(res))

This is what I am getting everywhere I have tried. But I need an architecture where I get data from the application: the application has a "send to server" option that needs a server address. How can I receive this data? All the API scripts I find fetch or GET from a URL. Kindly help.
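If the "send to server" option can be pointed at a URL you control, the missing piece is the client side posting results to your server rather than fetching from a URL. A minimal sketch of building such a POST in Python; the endpoint path and payload fields here are hypothetical:

```python
import json
import urllib.request

def build_results_request(server_url, results):
    """Build a POST request carrying quiz results as a JSON body.

    `server_url` would be your AWS server's endpoint; `results` is
    whatever dictionary of answers/scores the application produces.
    """
    data = json.dumps(results).encode("utf-8")
    return urllib.request.Request(
        server_url,
        data=data,
        headers={"Content-Type": "application/json"},
        method="POST",
    )
```

Sending is then `urllib.request.urlopen(build_results_request(url, results))`; on the server, any small web framework (or a plain WSGI handler) can decode the JSON and insert it into your table.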

8 – Cart API doesn’t fire the “update cart” event

Following this documentation I have created a custom event subscriber for adding, updating, and removing items from the cart. It works well, and the events are fired every time you add, update, or remove items, both on the cart page and from the shop view.
I have installed Commerce Cart Flyout, which uses the Commerce Cart API and a fly-out window to show the cart, and it uses REST to update and remove. If you remove an item, my remove event is fired. But if you update the quantity, the update event is not fired. I have followed the execution with Xdebug down to the CartUpdateItemsResource class and its patch method, which saves the order item. The line item is updated correctly, but my update event is not fired, and I don't know why.

This is my custom Event subscriber:

/**
 * Cart Event Subscriber.
 */
class MyCartEventSubscriber implements EventSubscriberInterface {

  protected $database;

  protected $messenger;

  protected $cartManager;

  public function __construct(MessengerInterface $messenger, CartManagerInterface $cart_manager) {
    $this->messenger = $messenger;
    $this->cartManager = $cart_manager;
    $this->database = Drupal::database();
  }

  public static function getSubscribedEvents() {
    return [
      CartEvents::CART_ENTITY_ADD => [['addToCart', 100]],
      CartEvents::CART_ORDER_ITEM_REMOVE => [['removeFromCart', 100]],
      CartEvents::CART_ORDER_ITEM_UPDATE => [['updateCart', 100]],
    ];
  }

  public function addToCart(CartEntityAddEvent $event) {
    Drupal::logger('CartEvent')->notice('add to cart');
    // ... some code
  }

  public function removeFromCart(CartOrderItemRemoveEvent $event) {
    Drupal::logger('CartEvent')->notice('remove from cart');
    // ... some code
  }

  public function updateCart(CartOrderItemUpdateEvent $event) {
    Drupal::logger('CartEvent')->notice('update cart item');
    // ... some code
  }
}

reactjs – Ability to specify existing dynamodb source when adding Graphql api

We are facing an issue while creating an AWS AppSync GraphQL API using Amplify. We are not able to connect to existing DynamoDB tables: using GraphQL transformers (e.g. "@model") creates a new table in DynamoDB.
Here is my schema code.

type Planet @model @key(fields: ["guid"]) {
  guid: String!
  name: String!
  description: String
}

This schema code creates a new table in DynamoDB, but I don't want to create a new table; I want to connect to the existing table "Planet".

API key vs JWT – which authentication to use and when

I have read multiple pages/blog posts on API key vs JWT, and I'm still confused about when to use each of them. Most recent ones say that JWT has become the standard for API authentication, but that becomes confusing in the few cases described below.

JWT is the "no-brainer" choice for any UI app that needs to authenticate a user, as well as for any API calls that require authorization on the API, not just authentication.

Then there are APIs that require only authentication and do not need to identify an individual user. JWT in that case looks like overkill.

On the other hand, how does it look when a JWT is used by an API (direct call, no UI) and is not "static", so a refresh token must be generated for it? How, in general, should the API handle that? Should it be done per request?
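To make the comparison concrete, here is a rough stdlib-only sketch of the HS256 signing scheme JWTs use. A real service should use a maintained library such as PyJWT; the secret and claims here are placeholders. The point is that a JWT is a signed, self-contained claim set that the API can verify without a lookup, whereas an API key is an opaque shared secret the API must look up:

```python
import base64
import hashlib
import hmac
import json

def b64url(data: bytes) -> str:
    """Base64url-encode without padding, as JWT requires."""
    return base64.urlsafe_b64encode(data).rstrip(b"=").decode()

def sign_token(payload: dict, secret: bytes) -> str:
    """Produce a JWT-shaped token: header.payload.signature (HS256)."""
    header = b64url(json.dumps({"alg": "HS256", "typ": "JWT"}).encode())
    body = b64url(json.dumps(payload).encode())
    signing_input = f"{header}.{body}".encode()
    sig = b64url(hmac.new(secret, signing_input, hashlib.sha256).digest())
    return f"{header}.{body}.{sig}"

def verify_token(token: str, secret: bytes) -> bool:
    """Check the signature only (a real verifier also checks exp, aud, etc.)."""
    header, body, sig = token.split(".")
    signing_input = f"{header}.{body}".encode()
    expected = b64url(hmac.new(secret, signing_input, hashlib.sha256).digest())
    return hmac.compare_digest(sig, expected)
```

Any party holding the secret can verify the token offline, which is exactly what makes JWT attractive for per-user authorization and arguably overkill for simple service-to-service authentication.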

Can a PostgreSQL trigger send a row to an API via a POST request?

I have an application where it would be really nice to use an insert trigger on a table to format and send a POST request to an external API. This would save me having to write the code for this in another application.

I have had a look around and can see some information about using Postgres in a RESTful manner, but this seems to be mainly aimed at the database being the server for the API, which isn't the direction I need.

I have thought about using NOTIFY to trigger an external app and have played with a Node.js app to handle these notifications, which seems to work (as an aside, does anyone know of a way of subscribing to notifications from a Go app?), so that is an option too.
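As a sketch of that NOTIFY route in Python: assuming a trigger on the table calls `pg_notify('new_rows', row_to_json(NEW)::text)`, a small listener can forward each payload to the external API. The DSN, channel name, and API URL below are placeholders, and `psycopg2` is a third-party driver:

```python
import json
import select
import urllib.request

def payload_to_request(payload: str, api_url: str):
    """Turn a NOTIFY payload (a JSON-encoded row) into a POST request."""
    return urllib.request.Request(
        api_url,
        data=payload.encode("utf-8"),
        headers={"Content-Type": "application/json"},
        method="POST",
    )

def listen_and_forward(dsn: str, channel: str, api_url: str):
    """Block on a Postgres NOTIFY channel and forward each notification."""
    import psycopg2  # third-party driver, not stdlib
    import psycopg2.extensions

    conn = psycopg2.connect(dsn)
    conn.set_isolation_level(psycopg2.extensions.ISOLATION_LEVEL_AUTOCOMMIT)
    cur = conn.cursor()
    cur.execute(f"LISTEN {channel};")
    while True:
        # Wait until the connection has data, then drain the notify queue.
        if select.select([conn], [], [], 60) == ([], [], []):
            continue
        conn.poll()
        while conn.notifies:
            note = conn.notifies.pop(0)
            urllib.request.urlopen(payload_to_request(note.payload, api_url))
```

The trade-off versus doing HTTP directly from a trigger (e.g. via an untrusted procedural language) is that the slow network call happens outside the transaction, so inserts never block on the external API.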

Any pointers appreciated.

Thanks.

shipment – Post tracking in bulk REST API

We currently create tracking entries one by one using POST rest/V1/order/:orderId/ship:

{
  "notify": true,
  "tracks": [
    {
      "track_number": "1234567890",
      "title": "DPDPRE",
      "carrier_code": "custom"
    }
  ]
}

It works fine, but now we need to POST in bulk. We have tried /bulk/V1/order/byOrderId/ship:

    [
    {
        "orderId": "284",
        "shipment": {
            "notify": true,
            "tracks": [
                {
                    "track_number": "1234567890",
                    "title": "CODE",
                    "carrier_code": "null"
                }
            ]
        }
    },
    {
        "orderId": "285",
        "shipment": {
            "notify": true,
            "tracks": [
                {
                    "track_number": "0987654321",
                    "title": "CODE",
                    "carrier_code": "null"
                }
            ]
        }
    }
]

It ends with no error, but nothing is updated; the following job remains in progress:
Topic async.magento.sales.api.shiporderinterface.execute.post
Thanks for your help
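As an aside, the bulk body can be generated from a list of (order ID, tracking number) pairs instead of being written by hand. A small sketch; the title and carrier code defaults are placeholders:

```python
def build_bulk_ship_payload(shipments, title="CODE", carrier_code="custom"):
    """Build the request body for POST /bulk/V1/order/byOrderId/ship.

    `shipments` is an iterable of (order_id, track_number) pairs; each
    becomes one entry with the same shape as the single-order endpoint.
    """
    return [
        {
            "orderId": str(order_id),
            "shipment": {
                "notify": True,
                "tracks": [
                    {
                        "track_number": track_number,
                        "title": title,
                        "carrier_code": carrier_code,
                    }
                ],
            },
        }
        for order_id, track_number in shipments
    ]
```

For example, `build_bulk_ship_payload([("284", "1234567890"), ("285", "0987654321")])` reproduces the two-entry body above.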

node.js – NextJS API endpoint for a ‘form backend’ service

(This is using Next.js, but there was no tag available for that.)

I’m building a form backend service, similar to Formspree. Users can create a form however they like on their frontend and submit it to my API endpoint to be collected and stored. This becomes tricky when handling files as well as plain body fields. The code below seems to work fine, but it has become excessively messy.

import nextConnect from 'next-connect';
import bodyParser from 'body-parser';
import { nanoid } from 'nanoid';
import multer from 'multer';
import aws from 'aws-sdk';
import multerS3 from 'multer-s3';
import middleware from '../../../../middlewares/middleware';
import trimObject from '../../../../helpers/trimObject';
import runMiddleware from '../../../../helpers/runMiddleware';
import { MAX_FILE_SIZE, USER_FILE_LIMIT } from '../../../../constants';

const spacesEndpoint = new aws.Endpoint('nyc3.digitaloceanspaces.com');
const s3 = new aws.S3({
  accessKeyId: process.env.AWS_ACCESS_KEY_ID, // Set access key here
  secretAccessKey: process.env.AWS_SECRET_ACCESS_KEY,
  endpoint: spacesEndpoint,
});
const handler = nextConnect();

handler.use(middleware);
export const config = {
  api: {
    bodyParser: false,
  },
};

// Catch posts submissions
handler.post(async (req, res) => {
  try {
    // Get the form and make sure it exists
    const { formId } = req.query;
    const form = await req.db.collection('forms').findOne({ _id: formId });
    if (!form) {
      return res.redirect(302, `${process.env.NEXT_PUBLIC_WEB_URI}/invalid?error=404`);
    }

    // Get the form creator and subscription level
    const user = await req.db.collection('users').findOne({
      _id: form._creator,
    });
    if (!user) {
      return res.redirect(302, `${process.env.NEXT_PUBLIC_WEB_URI}/invalid?error=404`);
    }

    let isFileUploadAllowed = false;

    // Determine if the form is multipart
    const isMultipart = req.headers['content-type'].toLocaleLowerCase().startsWith('multipart/form-data');

    if (isMultipart) {
      // Get the file sizes of all the users files
      const userFiles = await req.db.collection('files').aggregate([
        { $match: { _owner: form._creator } },
        {
          $group: {
            _id: null,
            totalSize: {
              $sum: '$size',
            },
          },
        },
      ]).toArray();

      // Check that the user hasn't exceeded their file upload limit
      const totalFileSize = userFiles.length && userFiles[0].totalSize;
      if (totalFileSize < USER_FILE_LIMIT) {
        isFileUploadAllowed = true;
      }
    }

    // Create the transaction
    const session = req.dbClient.startSession();
    try {
      const transactionOptions = {
        readPreference: 'primary',
        readConcern: { level: 'local' },
        writeConcern: { w: 'majority' },
      };

      const transactionResults = await session.withTransaction(async () => {
        const submissionId = nanoid();

        // Is multipart form and fileupload allowed
        if (isFileUploadAllowed && isMultipart) {
          await runMiddleware(
            req,
            res,
            multer({
              limits: { fileSize: 26214400 },
              storage: multerS3({
                s3,
                bucket: process.env.AWS_S3_BUCKET,
                acl: 'public-read',
                key(_, file, cb) {
                  cb(null, `${user._id}/${form._id}/${submissionId}/${file.originalname}`);
                },
              }),
            }).array('_file', 5),
          );

          // Create File objects and insert them
          let files = [];
          req.files.forEach((file) => {
            const { key, size, originalname } = file;
            files = [...files, {
              name: originalname,
              key,
              size,
              _id: nanoid(),
              _created: new Date(),
              _submission: submissionId,
              _form: form._id,
              _owner: form._creator,
            }];
          });

          await req.db.collection('files').insert(files, { session });
        } else if (isMultipart) {
          // Is multipart form and fileupload not allowed, errors if any files attached
          await runMiddleware(
            req,
            res,
            multer().none(),
          );
        } else {
          // Is not a multipart form
          await runMiddleware(
            req,
            res,
            bodyParser.urlencoded({ extended: false }),
          );
        }
        // Create Submission object
        const body = trimObject(req.body);
        const submission = {
          ...body,
          _id: submissionId,
          _created: new Date(),
          _form: formId,
        };

        await req.db.collection('submissions').insertOne(submission, { session });
      }, transactionOptions);

      // Transaction was successful, send to thank-you page
      if (transactionResults) {
        return res.redirect(302, `${process.env.NEXT_PUBLIC_WEB_URI}/thanks`);
      }
      return res.redirect(302, `${process.env.NEXT_PUBLIC_WEB_URI}/invalid?error=500`);
    } catch (e) {
      // Error when uploading files
      if (e instanceof multer.MulterError) {
        // Files included when not allowed
        if (e.code === 'LIMIT_UNEXPECTED_FILE') {
          return res.redirect(302, `${process.env.NEXT_PUBLIC_WEB_URI}/invalid?error=400`);
        }
        // Any other error uploading file (most likely file size)
        return res.redirect(302, `${process.env.NEXT_PUBLIC_WEB_URI}/invalid?error=413`);
      }

      // Error not caused by file uploads
      return res.redirect(302, `${process.env.NEXT_PUBLIC_WEB_URI}/invalid?error=500`);
    } finally {
      await session.endSession();
    }
  } catch (e) {
    // Catch all send to 500 error invalid page
    return res.redirect(302, `${process.env.NEXT_PUBLIC_WEB_URI}/invalid?error=500`);
  }
});

// Catch all other methods
handler.all(async (req, res) => res.redirect(302, `${process.env.NEXT_PUBLIC_WEB_URI}/invalid?error=405`));

export default handler;

cloud computing – I am not able to access the Compute Engine API with gcloud

I have an instance running with the access scope 'Set access for each API', explicitly allowing the Compute Engine API with Read Write access.

So I logged into the instance and tried to run this command:

gcloud compute instances list

and I got an error:

- Required 'compute.zones.list' permission for 'projects/dotted-howl-xxx'

My user explicitly has access to the Compute Engine API, but I am still getting the error. I shouldn't get this error, right? What am I missing here?