Creating a virtual camera that shows an app (to work around the WhatsApp Web QR code)

I’m trying to work around the problem of linking WhatsApp Web when the phone has no camera to scan the QR code with.
It’s probably not important, but the computer runs Linux.

My idea is to create a virtual device that the phone (Samsung Galaxy S21) will recognize as a camera, and this camera would show:

Option 1: the remotely controlled desktop of the Linux computer (via VNC or similar).

Option 2: the Android phone is connected to the computer by USB and sees the desktop as a “camera”.

Any other idea is welcome, but it must be a synchronized WhatsApp account, not one running in an Android emulator.

Thanks!

camera settings – How do I keep the self-timer selected on the Nikon D5600?

This is really annoying. If I set the release mode to self-timer and take a photo, the release mode reverts to single frame, and I have to set it back to timer mode each time I want to take another shot.

Even when I know I am going to be shooting all my photos in timer-delay mode, I still have to switch back to it before every shot. Is this common to all Nikon cameras? How about other DSLRs?

Am I missing something in the settings?

Camera icon was deleted

My father managed to delete the camera icon from his app drawer. I searched the app drawer to see if it got hidden in a folder, and I checked his app settings to confirm the camera is enabled; it is. Double-tapping the lock button still brings up the camera. I tried resetting app defaults and permissions, which didn’t do anything. I booted into safe mode and the icon still wasn’t there. The camera app isn’t set as default, so I tried downloading another camera app to see if I could trick the phone into letting me set the camera as default again. No such luck. I have no idea what else to try short of factory resetting the phone, which he doesn’t want to do.

unity – Camera doesn’t track the player properly at different resolutions

Hello, I have a 2D mobile game where the camera follows the player’s car. On my phone, with a resolution of 2960×1440, everything works fine, but when I start the game on a tablet with a resolution of 2160×1620, the camera shows only half of the car. I made some drawings for better illustration 🙂

The first is my phone resolution

and the second is my tablet resolution.

I have been trying to fix it myself for a few days, but I am new to Unity and do not fully understand the Camera class.

I have this script on my camera:

public GameObject playerPos;
private Vector3 lastPlayerPosition;
private float distanceToMove;

void Start()
{
    lastPlayerPosition = playerPos.transform.position;
}

void Update()
{
    // Move the camera horizontally by the distance the player moved since the last frame.
    distanceToMove = playerPos.transform.position.x - lastPlayerPosition.x;
    transform.position = new Vector3(transform.position.x + distanceToMove, transform.position.y, transform.position.z);
    lastPlayerPosition = playerPos.transform.position;
}
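
If the camera is orthographic (the usual setup for a 2D game), the follow script above is probably not the culprit: Camera.orthographicSize fixes the visible height in world units, and the visible width is that height times the screen’s aspect ratio, so a squarer tablet screen (2160×1620, about 1.33) shows far less width than a 2960×1440 phone (about 2.06) and the car gets cut off. One common workaround is to scale the orthographic size so a chosen world-space width always fits. Below is a minimal sketch, not the asker’s code, assuming an orthographic camera; referenceAspect and referenceOrthoSize are made-up design-time values.

using UnityEngine;

// Sketch: keep a fixed horizontal world width visible regardless of the device's
// aspect ratio. Assumes an orthographic 2D camera; the reference values are
// placeholders for whatever the game was originally tuned on.
public class FixedWidthCamera : MonoBehaviour
{
    public float referenceAspect = 2960f / 1440f; // aspect ratio the game was designed for
    public float referenceOrthoSize = 5f;         // orthographic size that looked right at that aspect

    void Start()
    {
        Camera cam = GetComponent<Camera>();
        // Half of the world-space width the design expects to keep visible.
        float targetHalfWidth = referenceOrthoSize * referenceAspect;
        // On a squarer screen (smaller aspect), enlarge the orthographic size so the same width still fits.
        cam.orthographicSize = targetHalfWidth / cam.aspect;
    }
}

Attached to the same camera as the follow script, this trades horizontal cropping for extra vertical view on squarer screens.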

java – Android Studio application stops responding when an image is imported manually from the Gallery or Camera app

My application is an image-to-text and text-to-speech app. The idea is to scan text from any source and have it read out to the user. The basic functionality works: when I click “Capture Image” on the main screen, it opens the camera, takes a picture, and extracts the text. But I also added another function where the user can manually add pictures from the gallery by clicking the gallery icon.

The problem is that when I select the gallery option and choose my own image, it isn’t added to the main ImageView; instead, the application crashes. There is no compile error in the code, but it just won’t run.

Here are 20-second clips on Google Drive showing the precise problem. I’m very close! The app’s logcat error message is at the bottom.

https://drive.google.com/drive/folders/18oCeFkVEvi1xPv_O4DEI3EZz0TtVyZ36?usp=sharing

Please let me know what the problem is, as Android Studio does not give any warnings or errors when the project is run. By crashing, I mean it shows an “app not responding” error message.

MainActivity.java

package com.example.textrecognition;

import androidx.annotation.NonNull;
import androidx.annotation.Nullable;
import androidx.appcompat.app.AlertDialog;
import androidx.appcompat.app.AppCompatActivity;
import androidx.core.app.ActivityCompat;
import androidx.core.content.ContextCompat;

import android.Manifest;
import android.content.ContentValues;
import android.content.Context;
import android.content.DialogInterface;
import android.content.Intent;
import android.content.pm.PackageManager;
import android.graphics.Bitmap;
import android.net.Uri;
import android.os.Bundle;
import android.os.PersistableBundle;
import android.provider.MediaStore;
import android.speech.tts.TextToSpeech;
import android.view.Menu;
import android.view.MenuItem;
import android.view.View;
import android.widget.EditText;
import android.widget.ImageView;
import android.widget.TextView;
import android.widget.Toast;

import java.lang.Object;
import java.util.Locale;


import com.google.android.gms.tasks.OnFailureListener;
import com.google.android.gms.tasks.OnSuccessListener;
import com.google.android.gms.tasks.Task;
import com.google.firebase.ml.vision.FirebaseVision;
import com.google.firebase.ml.vision.common.FirebaseVisionImage;
import com.google.firebase.ml.vision.text.FirebaseVisionText;
import com.google.firebase.ml.vision.text.FirebaseVisionTextRecognizer;
import com.theartofdev.edmodo.cropper.CropImage;
import com.theartofdev.edmodo.cropper.CropImageView;

public class MainActivity extends AppCompatActivity implements TextToSpeech.OnInitListener {
    ImageView imageView;
    TextView textView;
    private TextToSpeech engine;


    //Action Bar Menu
    @Override
    public boolean onCreateOptionsMenu(Menu menu) {
        //Inflate Menu 
        getMenuInflater().inflate(R.menu.menu_main, menu);
        return true;

    }

    //Handle Action Bar clicks
    @Override
    public boolean onOptionsItemSelected(@NonNull MenuItem item) {
        int id = item.getItemId();
        if (id == R.id.addImage) {
            showImageImportDialog();
        }
        if (id == R.id.settings) {
            Toast.makeText(this, "Settings", Toast.LENGTH_SHORT).show();
        }
        return super.onOptionsItemSelected(item);
    }

    private void showImageImportDialog() {
        //Options or Items displayed in dialog once it is clicked

        String[] items = {"Camera", "Gallery"};
        AlertDialog.Builder dialog = new AlertDialog.Builder(this);

        //Set TITLE
        dialog.setTitle("Select Image");

        dialog.setItems(items, new DialogInterface.OnClickListener() {

            @Override
            public void onClick(DialogInterface dialog, int which) {
                if (which == 0) {
                    //Camera Option Clicked
                    if (!checkCameraPermission()) {
                        ///Camera permission is not allowed, thats why we request it here
                        requestCameraPermission();
                    } else {
                        //Permission allowed, take picture
                        pickCamera();
                    }
                }

                if (which == 1) {
                    //Gallery Option Clicked
                    if (!checkStoragePermission()) {
                        //Storage Permissions Granted
                        requestStoragePermission();
                    } else {
                        //Permission allowed, take picture
                        pickGallery();
                    }
                }
            }
        });
        dialog.create().show(); //SHOW DIALOG
    }

    private void pickGallery() {
        //Intent to Pick image from gallery
        Intent intent = new Intent(Intent.ACTION_PICK);
        //Set intent type to image
        intent.setType("image/*");
        startActivityForResult(intent, IMAGE_PICK_GALLERY_CODE);

        int GET_FROM_GALLERY = 3;
        startActivityForResult(new Intent(Intent.ACTION_PICK, android.provider.MediaStore.Images.Media.INTERNAL_CONTENT_URI), GET_FROM_GALLERY);
    }

    private void pickCamera() {
        //Takes Image from Camera and saves it in storage for HIGH QUALITY
        ContentValues values = new ContentValues();
        values.put(MediaStore.Images.Media.TITLE, "NewPic"); //TITLE OF THE PIC

        values.put(MediaStore.Images.Media.DESCRIPTION, "Images to text"); //Description
        image_uri = getContentResolver().insert(MediaStore.Images.Media.EXTERNAL_CONTENT_URI, values);

        Intent cameraIntent = new Intent(MediaStore.ACTION_IMAGE_CAPTURE);
        cameraIntent.putExtra(MediaStore.EXTRA_OUTPUT, image_uri);
        startActivityForResult(cameraIntent, IMAGE_PICK_CAMERA_CODE);

    }

    private void requestStoragePermission() {
        ActivityCompat.requestPermissions(this, storagePermission, STORAGE_REQUEST_CODE);
    }

    private boolean checkStoragePermission() {
        boolean result = ContextCompat.checkSelfPermission(this,
                Manifest.permission.WRITE_EXTERNAL_STORAGE) == (PackageManager.PERMISSION_GRANTED);
        return result;
    }

    private void requestCameraPermission() {
        ActivityCompat.requestPermissions(this, cameraPermission, CAMERA_REQUEST_CODE);
    }

    private boolean checkCameraPermission() {

        boolean result = ContextCompat.checkSelfPermission(this,
                Manifest.permission.CAMERA) == (PackageManager.PERMISSION_GRANTED);

        boolean result1 = ContextCompat.checkSelfPermission(this,
                Manifest.permission.WRITE_EXTERNAL_STORAGE) == (PackageManager.PERMISSION_GRANTED);
        return result && result1;

    }
    //Also add gallery permission.


    EditText mResultEt;
    ImageView mPreviewIv;

    ///******So why are these specific codes used bro*******RESEARCH******
    private static final int CAMERA_REQUEST_CODE = 200;
    private static final int STORAGE_REQUEST_CODE = 400;
    private static final int IMAGE_PICK_GALLERY_CODE = 1000;
    private static final int IMAGE_PICK_CAMERA_CODE = 1001;


    String[] cameraPermission;

    String[] storagePermission;
    Uri image_uri;

    @Override
    protected void onCreate(Bundle savedInstanceState) {
        super.onCreate(savedInstanceState);
        setContentView(R.layout.activity_main);
        engine = new TextToSpeech(this, this);

//        mResultEt = findViewById(R.id.textId);
        mPreviewIv = findViewById(R.id.imageId);

        //camera permission
        cameraPermission = new String[]{Manifest.permission.CAMERA,
                Manifest.permission.WRITE_EXTERNAL_STORAGE};

        //Storage Permission
        storagePermission = new String[]{Manifest.permission.WRITE_EXTERNAL_STORAGE};


        //Find Image view
        imageView = findViewById(R.id.imageId);
        //Find text view
        textView = findViewById(R.id.textId);

        //check app level permission is granted for camera
        if (checkSelfPermission(Manifest.permission.CAMERA) != PackageManager.PERMISSION_GRANTED) {
            //Grants Permission and Prompts User
            requestPermissions(new String[]{Manifest.permission.CAMERA}, 101);

        }
    }

    //    TextToSpeech tts;
    public void doProcess(View view) {

        //Open Camera in Phone. Intent object is created to open camera once the capture image button is clicked
        Intent intent = new Intent(MediaStore.ACTION_IMAGE_CAPTURE);

        //Activity result specifically used to continue using the app post taking pic
        startActivityForResult(intent, 101);

    }

    @Override
    public void onActivityResult(int requestCode, int resultCode, @Nullable Intent data) {
        if (resultCode == RESULT_OK) {
            super.onActivityResult(requestCode, resultCode, data);

//        if(requestCode == IMAGE_PICK_CAMERA_CODE) {
//            //Crops Image
//            CropImage.activity(data.getData()).setGuidelines(CropImageView.Guidelines.ON)
//                    .start(this);
//        }
//        if(requestCode == IMAGE_PICK_GALLERY_CODE) {
//            //Image received from gallery now cropped
//            CropImage.activity(image_uri)
//                    .setGuidelines(CropImageView.Guidelines.ON) //Enabled image guidelines
//                    .start(this);
//        }
//    }
//    if(requestCode == CropImage.CROP_IMAGE_ACTIVITY_REQUEST_CODE) {
//        CropImage.ActivityResult result = CropImage.getActivityResult(data);
    }
//    else if(resultCode == CropImage.CROP_IMAGE_ACTIVITY_RESULT_ERROR_CODE) {
//        Exception error = mResultEt.getError();
//    }
//        mResultEt.setText(getSupportActionBar().toString());

        Bundle bundle = data.getExtras();

        //From bundle extract image
        Bitmap bitmap = (Bitmap) bundle.get("data");

        //Set Image In ImageView
        imageView.setImageBitmap(bitmap);

        //Now we process image to extract text using Google ML Kit//

        //Create a firebase vision object
        FirebaseVisionImage firebaseVisionImage = FirebaseVisionImage.fromBitmap(bitmap);
        //2ndStep Get an Instance of FirebaseVision
        FirebaseVision firebaseVision = FirebaseVision.getInstance();

        //3rd Create an instance of firebasevision text recognizer
        FirebaseVisionTextRecognizer firebaseVisionTextRecognizer = firebaseVision.getOnDeviceTextRecognizer();

        //4Th Step Create a task to process the image
        Task<FirebaseVisionText> task = firebaseVisionTextRecognizer.processImage(firebaseVisionImage);

        //5th Step, if task is successful


        task.addOnSuccessListener(new OnSuccessListener<FirebaseVisionText>() {
            @Override
            public void onSuccess(FirebaseVisionText firebaseVisionText) {
                String s = firebaseVisionText.getText();
                textView.setText(s);

                //Conversion of text to speech
                String rawText = String.valueOf(textView.getText());

                speakText(rawText);
                ;               /* tts = new TextToSpeech(getApplicationContext(), new TextToSpeech.OnInitListener() {
                    @Override
                    public void onInit(int i) {
                        if (i == TextToSpeech.SUCCESS) {
                            //SELECTING LANGUAGE
                            int lang = tts.setLanguage(Locale.ENGLISH);

                        }
                    }
                });*/


            }


        });

        //6th. If task is failed
        task.addOnFailureListener(new OnFailureListener() {
            @Override
            public void onFailure(@NonNull Exception e) {
                Toast.makeText(getApplicationContext(), e.getMessage(), Toast.LENGTH_LONG).show();
            }
        });

    }

    public void speakText(String textContents) {

        //String textContents = text.getText().toString();
        engine.speak(textContents, TextToSpeech.QUEUE_FLUSH, null, null);

    }

    @Override
    protected void onPostCreate(@Nullable Bundle savedInstanceState) {
        super.onPostCreate(savedInstanceState);
    }

    @Override
    public void onInit(int i) {


        if (i == TextToSpeech.SUCCESS) {
            //Setting speech Language
            engine.setLanguage(Locale.ENGLISH);
            engine.setPitch(1);
        }
    }

    ///Handle Permission Result
    @Override
    public void onRequestPermissionsResult(int requestCode, @NonNull String[] permissions, @NonNull int[] grantResults) {
        super.onRequestPermissionsResult(requestCode, permissions, grantResults);
        switch (requestCode) {
            case CAMERA_REQUEST_CODE:

                if (grantResults.length > 0) {
                    boolean cameraAccepted = grantResults[0] ==
                            PackageManager.PERMISSION_GRANTED;
                    //Changed from 0 to 1
                    boolean writeStorageAccepted = grantResults[1] ==
                            PackageManager.PERMISSION_GRANTED;

                    if (cameraAccepted && writeStorageAccepted) {
                        pickCamera();
                    } else {
                        Toast.makeText(this, "permission denied", Toast.LENGTH_SHORT).show();

                    }
                }
                break;

            case STORAGE_REQUEST_CODE: {
                boolean writeStorageAccepted = grantResults[0] ==
                        PackageManager.PERMISSION_GRANTED;
                if (writeStorageAccepted) {
                    pickGallery();
                } else {
                    Toast.makeText(this, "permission denied", Toast.LENGTH_SHORT).show();

                }
            }
            break;

        }


    }


}

activity_main.xml

<?xml version="1.0" encoding="utf-8"?>

<LinearLayout xmlns:android="http://schemas.android.com/apk/res/android"
    xmlns:app="http://schemas.android.com/apk/res-auto"
    xmlns:tools="http://schemas.android.com/tools"
    android:layout_width="match_parent"
    android:layout_height="match_parent"
    android:orientation="vertical"
    tools:context=".MainActivity">

    <!-- Added scroll view, made sure scroll view has only one child in it. Very time consuming -->
    <ScrollView
        android:layout_width="match_parent"
        android:layout_height="match_parent"
        tools:ignore="UselessParent">

        <LinearLayout
            android:layout_width="match_parent"
            android:layout_height="wrap_content"
            android:orientation="vertical"
            tools:ignore="UselessLeaf" >
            <TextView
                android:layout_width="wrap_content"
                android:layout_height="wrap_content"
                android:text="Image Preview:"
                android:verticalScrollbarPosition="defaultPosition"
                android:textColor="@color/design_default_color_primary_dark"
                android:textSize="22sp"
                tools:ignore="HardcodedText">

            </TextView>


            <ImageView

                android:id="@+id/imageId"
                android:layout_width="353dp"
                android:layout_height="368dp"
                android:layout_margin="26dp" />

            <Button
                android:layout_width="match_parent"
                android:layout_height="63dp"
                android:onClick="doProcess"
                android:text="Capture Image" />

            <TextView
                android:id="@+id/textId"
                android:layout_width="wrap_content"
                android:layout_height="wrap_content"
                android:text="Result:"
                android:textColor="@color/design_default_color_primary_dark"
                android:textSize="22sp" />


        </LinearLayout>
    </ScrollView>

</LinearLayout>

AndroidManifest.xml


<?xml version="1.0" encoding="utf-8"?>
<manifest xmlns:android="http://schemas.android.com/apk/res/android"
    package="com.example.textrecognition">

    <!-- User Permissions to give access to camera -->
    <uses-permission android:name="android.permission.CAMERA" />
    <uses-permission android:name="android.permission.WRITE_EXTERNAL_STORAGE"/>


    <application
        android:allowBackup="true"
        android:icon="@mipmap/ic_launcher"
        android:label="Dream Eye Text Recognizer"
        android:roundIcon="@mipmap/ic_launcher_round"
        android:supportsRtl="true"
        android:theme="@style/Theme.TextRecognition">
        <activity android:name=".SplashScreen">
            <intent-filter>
                <action android:name="android.intent.action.MAIN" />

                <category android:name="android.intent.category.LAUNCHER" />
            </intent-filter>
        </activity>

        <activity
            android:name=".ui.login.LoginActivity"
            android:label="@string/title_activity_login" />

        <meta-data
            android:name="com.google.firebase.ml.vision.DEPENDENCIES"
            android:value="ocr" />

        <activity android:name=".MainActivity">
            <intent-filter>
                <action android:name="android.intent.action.MAIN" />

                <category android:name="android.intent.category.DEFAULT" />
            </intent-filter>
        </activity>

        <!-- Crop image Activity  -->
        <activity android:name="com.theartofdev.edmodo.cropper.CropImageActivity"
            android:theme="@style/Base.Theme.AppCompat"/> <!-- or AppTheme optional (needed if default theme has no action bar) -->

    </application>

</manifest>

build.gradle (project-level)

// Top-level build file where you can add configuration options common to all sub-projects/modules.
buildscript {
    repositories {
        google()
        jcenter()
    }
    dependencies {

        classpath "com.android.tools.build:gradle:4.1.1"
        classpath 'com.google.gms:google-services:4.3.3'

        // NOTE: Do not place your application dependencies here; they belong
        // in the individual module build.gradle files
    }
}

allprojects {
    repositories {
        google()
        jcenter()
    }
}

task clean(type: Delete) {
    delete rootProject.buildDir
}

LOGCAT REPORT WHEN APP CRASHES

2021-07-17 17:01:38.730 18758-18758/com.example.textrecognition E/AndroidRuntime: FATAL EXCEPTION: main
   Process: com.example.textrecognition, PID: 18758
   java.lang.RuntimeException: Failure delivering result ResultInfo{who=null, request=3, result=-1, data=Intent { dat=content://media/external/images/media/306689 flg=0x1 (has extras) }} to activity {com.example.textrecognition/com.example.textrecognition.MainActivity}: java.lang.NullPointerException: null reference
       at android.app.ActivityThread.deliverResults(ActivityThread.java:4610)
       at android.app.ActivityThread.handleSendResult(ActivityThread.java:4652)
       at android.app.servertransaction.ActivityResultItem.execute(ActivityResultItem.java:49)
       at android.app.servertransaction.TransactionExecutor.executeCallbacks(TransactionExecutor.java:108)
       at android.app.servertransaction.TransactionExecutor.execute(TransactionExecutor.java:68)
       at android.app.ActivityThread$H.handleMessage(ActivityThread.java:1948)
       at android.os.Handler.dispatchMessage(Handler.java:106)
       at android.os.Looper.loop(Looper.java:214)
       at android.app.ActivityThread.main(ActivityThread.java:7050)
       at java.lang.reflect.Method.invoke(Native Method)
       at com.android.internal.os.RuntimeInit$MethodAndArgsCaller.run(RuntimeInit.java:493)
       at com.android.internal.os.ZygoteInit.main(ZygoteInit.java:965)
    Caused by: java.lang.NullPointerException: null reference
       at com.google.android.gms.common.internal.Preconditions.checkNotNull(Unknown Source:2)
       at com.google.firebase.ml.vision.common.FirebaseVisionImage.<init>(com.google.firebase:firebase-ml-vision@@24.0.1:40)
       at com.google.firebase.ml.vision.common.FirebaseVisionImage.fromBitmap(com.google.firebase:firebase-ml-vision@@24.0.1:3)
       at com.example.textrecognition.MainActivity.onActivityResult(MainActivity.java:259)
       at android.app.Activity.dispatchActivityResult(Activity.java:7762)
       at android.app.ActivityThread.deliverResults(ActivityThread.java:4603)
       at android.app.ActivityThread.handleSendResult(ActivityThread.java:4652) 
       at android.app.servertransaction.ActivityResultItem.execute(ActivityResultItem.java:49) 
       at android.app.servertransaction.TransactionExecutor.executeCallbacks(TransactionExecutor.java:108) 
       at android.app.servertransaction.TransactionExecutor.execute(TransactionExecutor.java:68) 
       at android.app.ActivityThread$H.handleMessage(ActivityThread.java:1948) 
       at android.os.Handler.dispatchMessage(Handler.java:106) 
       at android.os.Looper.loop(Looper.java:214) 
       at android.app.ActivityThread.main(ActivityThread.java:7050) 
       at java.lang.reflect.Method.invoke(Native Method) 
       at com.android.internal.os.RuntimeInit$MethodAndArgsCaller.run(RuntimeInit.java:493) 
       at com.android.internal.os.ZygoteInit.main(ZygoteInit.java:965)
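
For what it’s worth, the stack trace points at FirebaseVisionImage.fromBitmap receiving null: a gallery pick (request=3 here) returns a content URI via data.getData() rather than a "data" bitmap extra, so bundle.get("data") comes back null. Below is a minimal sketch, not the original handler, of how onActivityResult might branch on the request code and decode the bitmap from the URI. It reuses the fields and request codes defined in MainActivity above, assumes pickGallery issues a single startActivityForResult with IMAGE_PICK_GALLERY_CODE (instead of the second call with the hard-coded code 3), and uses MediaStore.Images.Media.getBitmap, which is deprecated on newer API levels but keeps the illustration short.

    @Override
    public void onActivityResult(int requestCode, int resultCode, @Nullable Intent data) {
        super.onActivityResult(requestCode, resultCode, data);
        if (resultCode != RESULT_OK || data == null) {
            return;
        }

        Bitmap bitmap = null;
        if (requestCode == IMAGE_PICK_GALLERY_CODE) {
            try {
                // Gallery results carry a URI; decode it through the ContentResolver.
                bitmap = MediaStore.Images.Media.getBitmap(getContentResolver(), data.getData());
            } catch (java.io.IOException e) {
                Toast.makeText(this, e.getMessage(), Toast.LENGTH_LONG).show();
                return;
            }
        } else if (data.getExtras() != null) {
            // The plain ACTION_IMAGE_CAPTURE intent puts a small thumbnail Bitmap under "data".
            bitmap = (Bitmap) data.getExtras().get("data");
        }

        if (bitmap == null) {
            return; // nothing usable came back; avoids passing null to FirebaseVisionImage
        }

        imageView.setImageBitmap(bitmap);
        // ...continue with FirebaseVisionImage.fromBitmap(bitmap) and the recognizer as before.
    }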

equipment protection – Are there any camera safety precautions I should keep in mind when taking long-exposure photographs of the night sky?

I plan to take long-exposure photos of the night sky, anywhere from 1-2 minutes to 8-10 minutes (I will figure out the exact time through trial and error).

Are there any specific precautions I should take purely for the safety of the camera?

I am not asking about tips for good image quality (using a tripod, correct aperture, picking a dark location, a cloudless sky, etc.).

I am also not asking about the physical safety of the equipment (falling over in the wind, getting stolen, etc.) or about carrying spare batteries. I will be accounting for all of those other issues.

I am asking purely about possible damage to the mechanisms or workings of the camera itself: for example, harm from the shutter staying open for a long time, the sensor being exposed for a long time, or some other part, mechanism, or piece of electronics causing a problem (I have no idea what).

I am using a Nikon D5600 with the default kit lens, the Nikkor AF-P 18-55mm 1:3.5-5.6G VR.

canon 700d – Canon 700D: Could I photograph the Milky Way with this camera?


nikon – Is it possible to focus at a fixed distance for a self-portrait using a point-and-shoot camera to get a decent photo?

I have a Nikon B500 point-and-shoot camera and a tripod. It’s a pretty nice camera if you have to take photos of other people, but I want to take self-portraits. There’s no other person available to take the photos, and even if I find one, they’ll get irritated soon.

Now, the camera has many focus settings you can use while taking a photo of a person. All you have to do is focus on the subject (by half-pressing the shutter button) and take the shot. That’s it.

But for a self-portrait, you can’t focus on yourself, because you can’t be behind the camera and in front of it at the same time.

I use the self-timer, which is great, but the problem is that it takes a blurry photo because the camera doesn’t know where to focus. There are many focus settings (you don’t necessarily have to go through them; I’m just providing them in case you need the details):

(screenshots of the camera’s focus settings menus)

For me, the default settings work fine most of the time. My current settings are PRE-AF autofocus and Face Priority.

But the reason I’m getting blurred or poor photos is that face priority determines focus only before the timer starts, and at that moment I’m not in the frame yet, so it just doesn’t work.

I have the SnapBridge app too, for remote photography, but it doesn’t give me the flexibility to change focus once I’m in the frame or after the timer has started. It’s just for triggering the photo, so it’s quite useless for me.

I hope you understand my problem. There isn’t any problem with the camera; I just don’t know what I can do to make it focus at the distance where I’m standing.

So, is there any way to fix the focus at some point and then use the self-timer, so my image is sharp because the focus is right? Somewhat like what can be done on a DSLR with manual focus at a fixed distance? Or any other workaround that fixes this problem?

Please confirm whether the Canon 430EX can be used as an off-camera flash with the Canon EOS 1300D


Can anyone tell what dashcam this could possibly be?

Dashcam image

I’m trying to find out what kind of dashcam this is. It’s probably a tough one to solve.