Why is the Swift-converted base64 image smaller than the actual image?


I’m learning Swift at the moment.

In my current project, I’m using this library: https://github.com/mikaoj/BSImagePicker

The above library allows me to select multiple images from the Gallery.

Everything works as it should.

Now I need to convert the selected images into base64 so I can upload them to my server later on.

I can convert the images into Base64 format using the code below.

The issue is that when I view the base64 images in my browser, they are far too small (in both width and height).

I need to convert the images to base64 while keeping their original width/height, or at least 50% of it.
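
To illustrate what I mean by 50%, this is the kind of downscale I have in mind (an untested sketch using `UIGraphicsImageRenderer`; `resizedImage(_:scale:)` is just a helper name I made up):

    import UIKit

    // Untested sketch: downscale a UIImage to a fraction of its original size.
    func resizedImage(_ image: UIImage, scale: CGFloat) -> UIImage {
        let targetSize = CGSize(width: image.size.width * scale,
                                height: image.size.height * scale)
        let renderer = UIGraphicsImageRenderer(size: targetSize)
        return renderer.image { _ in
            // Draw the original image into the smaller rect.
            image.draw(in: CGRect(origin: .zero, size: targetSize))
        }
    }

    // Usage: let halfSize = resizedImage(fullImage, scale: 0.5)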

This is my current code:

    @IBAction func callImagePicker(_ sender: UIButton) {
        let imagePicker = ImagePickerController()

        presentImagePicker(imagePicker, select: { (asset) in
            // User selected an asset. Request the full-size image for it.
            PHImageManager.default().requestImage(for: asset,
                                                  targetSize: PHImageManagerMaximumSize,
                                                  contentMode: .aspectFit,
                                                  options: nil) { (image, info) in
                // Encode the image as JPEG. Note: compressionQuality must be
                // in the range 0.0...1.0 (1.0 = best quality), not 10.
                let imageData = image?.jpegData(compressionQuality: 1.0)

                // Convert the image data to a base64-encoded string.
                let imageBase64String = imageData?.base64EncodedString()
                print(imageBase64String ?? "Could not encode image to Base64")
            }
        }, deselect: { (asset) in
            // User deselected an asset. Cancel whatever was done on selection.
        }, cancel: { (assets) in
            // User canceled the selection.
        }, finish: { (assets) in
            // User finished selecting assets.
        })
    }
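
From what I’ve read, passing `nil` options may let `PHImageManager` deliver a degraded preview image first, which could be why my images come out small. This is the variation of the request (inside the `select` closure above) that I was considering; it’s only a sketch based on that assumption:

    import Photos

    // Sketch: explicitly ask for the full-quality image instead of a preview.
    let options = PHImageRequestOptions()
    options.deliveryMode = .highQualityFormat   // never deliver a degraded result
    options.resizeMode = .exact
    options.isNetworkAccessAllowed = true       // allow fetching from iCloud if needed

    PHImageManager.default().requestImage(for: asset,
                                          targetSize: PHImageManagerMaximumSize,
                                          contentMode: .aspectFit,
                                          options: options) { (image, info) in
        // Defensive check: skip any degraded (thumbnail) result that arrives.
        if let degraded = info?[PHImageResultIsDegradedKey] as? Bool, degraded {
            return
        }
        let imageData = image?.jpegData(compressionQuality: 0.8)
        let imageBase64String = imageData?.base64EncodedString()
        print(imageBase64String ?? "Could not encode image to Base64")
    }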

Could someone please advise on this issue?