seo – How to prevent robots from indexing pages of my app through alternate URLs?

I have a web app that is accessible through the following URLs.

URL_1: www.mycustomdomain.com            // THIS IS MY CUSTOM DOMAIN (PUBLIC TO THE USERS)
URL_2: www.my-project.firebaseapp.com    // THIS IS A DEFAULT FIREBASE URL
URL_3: my-project.web.app                // THIS IS A DEFAULT FIREBASE URL

Obviously I want Google to index my pages using my custom domain, and not the Firebase default URLs.

I don’t publicize the default URLs anywhere, but what if Google discovers them at some point? What is the proper way of letting Google know that those pages/URLs should NOT be indexed?

OPTION #1

Should I always add a <link rel="canonical" href="https://www.mycustomdomain.com/some-page" /> to each page?

This way, if Google happens to crawl the default URLs, it should see that canonical pointing to the custom domain, right? Is this the correct approach?
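To make Option #1 concrete, here is a minimal sketch of a helper that always builds the canonical tag against the custom domain, no matter which host served the request. The helper name `canonicalTag` and the constant `CANONICAL_ORIGIN` are illustrative, not from any framework:

```javascript
// Always point the canonical URL at the custom domain, regardless of
// whether the page was served from mycustomdomain.com, *.firebaseapp.com,
// or *.web.app.
const CANONICAL_ORIGIN = "https://www.mycustomdomain.com";

function canonicalTag(pagePath) {
  // Normalize to a leading slash so "some-page" and "/some-page" behave the same.
  const path = pagePath.startsWith("/") ? pagePath : `/${pagePath}`;
  return `<link rel="canonical" href="${CANONICAL_ORIGIN}${path}"/>`;
}

console.log(canonicalTag("some-page"));
// <link rel="canonical" href="https://www.mycustomdomain.com/some-page"/>
```

The key point is that the canonical href is derived from a constant, never from the incoming request's host header.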

OPTION #2

Since I’m dynamically generating the robots.txt, I could detect that the request is coming from URL_2 or URL_3 and block the entire site from being crawled by adding Disallow: /.
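As a sketch of what that dynamic robots.txt could look like, here is a small host-based switch. The function name `robotsTxtFor` is made up for illustration; the hostnames are the ones from the question, and hooking this into an actual request handler (Express, Cloud Functions, etc.) is left out:

```javascript
// Option #2 sketch: choose robots.txt content based on the request host.
const CUSTOM_DOMAIN = "www.mycustomdomain.com";

function robotsTxtFor(host) {
  // Only the custom domain may be crawled; any other host
  // (my-project.firebaseapp.com, my-project.web.app, ...) is fully blocked.
  if (host === CUSTOM_DOMAIN) {
    return "User-agent: *\nAllow: /\n";
  }
  return "User-agent: *\nDisallow: /\n";
}
```

In a real handler you would read the host from the incoming request (e.g. the Host header) and serve the returned string as text/plain at /robots.txt.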

How would you solve this?