What would be the best approach to normalize data for an LSTM model (using TensorFlow.js) with this wide range of values?

I am new to machine learning and still getting familiar with the concepts, so please bear with me if my question is not as precise as it should be.

I am building a TensorFlow.js model with LSTM layers (an RNN) for time series prediction.

The data arrives every few hundred milliseconds (at random intervals), but the values span a very wide range: most readings are small, e.g. 20, 40, or 45, yet the value occasionally spikes as high as 75,000.

So the overall range of the data is 1 to 75,000.

When I normalize this data using standard min/max scaling to produce values between 0 and 1, most of the normalized values end up as very small decimals with many significant digits, for example: '0.0038939328722009236'.
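For reference, this is roughly how I am normalizing at the moment (a minimal sketch in plain JS before building tensors; the `values` array is just a stand-in for my real data):

```javascript
// Minimal sketch of the min/max scaling I am using.
// `values` stands in for my real data stream.
const values = [20, 40, 45, 30, 75000, 25, 1];

const min = Math.min(...values);
const max = Math.max(...values);

// Scale every value into the 0-1 range.
const normalized = values.map(v => (v - min) / (max - min));

console.log(normalized[0]); // ≈ 0.000253 — tiny values with many significant decimals
```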

So my questions are:

1) Is min/max scaling the best approach to normalize this type of data?

2) Will the RNN model cope with values carrying so many significant decimal places, i.e. with this level of precision?

3) Should I also normalize the output label? (there will be a single output)