Note: this answer only covers Google; I am not aware of how other search engines handle this.
Googlebot can now run JS on crawled pages by using a headless browser (a fully automated browser instance). But since running that browser requires more resources (and therefore money), it will visit your pages less frequently than the regular crawler, which only analyzes the initial HTML.
In any case, if you want better SEO, there are several options available, and it is up to you to decide which ones you are willing to take. Here are some:
Make your website work without JS
In most cases (and your website definitely falls into that category as far as I can see), JS is not really necessary to fill your HTML with data; it is there to add interactivity, or to fetch data from an API, which could just as well be done on the server side. In that situation, it is always a good exercise to turn off JS in your browser and see if everything still works. The content should already be on your page, even if it isn't pretty, and it should work: the links should work, the forms should work. JS should only be there to improve things for the user. If you have buttons that lead to other pages, they should be real links, with a real URL, even if your application is a SPA.
This will help bots understand your pages and the architecture of your site (i.e. its site map). It will also help people with disabilities (blind users, for example) who rely on a screen reader, and Google takes accessibility into account, which is good.
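For example, a button that navigates can be a plain link that works without JS, which your SPA code then enhances. This is just a sketch; the `js-route` class and the client-side rendering step are hypothetical:

```html
<!-- A real link: crawlers and users without JS can follow the href. -->
<a href="/products/42" class="js-route">View product</a>
<script>
  // When JS is available, take over navigation for client-side routing.
  document.querySelectorAll('a.js-route').forEach(function (link) {
    link.addEventListener('click', function (event) {
      event.preventDefault(); // skip the full page load
      history.pushState({}, '', link.getAttribute('href'));
      // ...then render the matching view on the client
    });
  });
</script>
```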
Serving dynamic content
The first obvious option that comes to mind is to process the data server side and render HTML pages on the fly. If you are only comfortable with JS, you can look into NodeJS and create a server using a module such as ExpressJS or HapiJS, for example; there are tons of tutorials out there. Or, if you know other languages, you can do it in Python, PHP, Java…
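As a minimal sketch of that idea, here is an ExpressJS server that builds the HTML before sending it (the route and the data are made up for the example):

```js
const express = require('express');
const app = express();

app.get('/products/:id', (req, res) => {
  // In a real application this would come from a database or an API.
  const product = { id: req.params.id, name: 'Example product' };

  // The HTML already contains the data, so crawlers do not need to run JS.
  res.send(`<!DOCTYPE html>
<html>
  <head><title>${product.name}</title></head>
  <body><h1>${product.name}</h1></body>
</html>`);
});

app.listen(3000);
```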
Another solution, usually for more complex applications, is server-side rendering. You run the page on your server in a browser-like context (it's not really a browser, but your page's JS runs in it), JS fills the DOM, and the resulting HTML is what gets sent to the client. JS can still be used on the client side to do more work afterwards. This is becoming quite popular with existing frameworks. For example, if you write an application using VueJS, you can use NuxtJS or vue-server-renderer to pre-render the HTML into the state that is normally only reached after running JS. If that is not something you want to set up yourself, you can also use a paid service like SEO4Ajax, which will run your pages in a browser and keep a cached version of the final HTML, so that it can be served to bots such as Googlebot. We used that at my company for a while, and it worked very well.
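Here is a minimal sketch of what vue-server-renderer does, with a made-up component:

```js
const Vue = require('vue');
const { createRenderer } = require('vue-server-renderer');

const renderer = createRenderer();

// A made-up component; in a real app this would be one of your pages.
const app = new Vue({
  data: { title: 'Hello from the server' },
  template: '<h1>{{ title }}</h1>',
});

// renderToString runs the component on the server and resolves with the
// final HTML, which you can send to clients and crawlers alike.
renderer.renderToString(app).then((html) => {
  console.log(html); // <h1 data-server-rendered="true">Hello from the server</h1>
});
```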
Leaving a good impression
Finally, if you want a higher ranking, your site must be of high quality. For example, at the time of writing, it is not served over HTTPS. You are losing points there.
It doesn't have much text, which makes it less valuable.
It doesn't have a proper heading, or really any semantic page structure apart from generic elements, which makes it harder for crawlers to understand.
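For reference, here is a sketch of the kind of basic structure crawlers expect (the content is a placeholder):

```html
<!DOCTYPE html>
<html>
  <head>
    <title>What the page is about</title>
  </head>
  <body>
    <h1>Main topic of the page</h1>
    <h2>A subtopic</h2>
    <p>Meaningful text content about the subtopic.</p>
  </body>
</html>
```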
It does not have Open Graph tags, which help display nice thumbnails when your pages are shared on Twitter, Facebook, or other services:
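They are plain meta tags in the head of the page; the values here are placeholders:

```html
<meta property="og:title" content="My page title" />
<meta property="og:description" content="A short description of the page." />
<meta property="og:image" content="https://example.com/thumbnail.jpg" />
<meta property="og:url" content="https://example.com/my-page" />
```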
It has no structured data to help crawlers understand what your page is about, which could also be used to display your website nicely in search results:
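A minimal sketch using the JSON-LD format with a schema.org type; the values are placeholders:

```html
<script type="application/ld+json">
{
  "@context": "https://schema.org",
  "@type": "WebSite",
  "name": "Example site",
  "url": "https://example.com"
}
</script>
```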
There are tons of other things to take care of, but the most important one is to know that crawlers are getting smarter and smarter. So no matter what you do, if your website is low quality and doesn't have great content, people won't like it, and if they don't like it, there's a good chance Google won't either. Build your site with passion, follow what the tools suggest to improve it (PageSpeed Insights and the Google webmaster guidelines, for example), and your website will eventually grow. It should be fast, it should be light, and it should work well on mobile devices.