Probability: Approximating the uniform (or an arbitrary) distribution with a Gaussian mixture model?

In his 1986 book, Silverman mentions approximating distributions with Gaussian mixture models, but does not go into the subject very deeply. Suppose I am given an N-dimensional uniform distribution (and, as an extension, any arbitrary distribution) $u(\bar{x})$. Is there a clean way to approximate it with a (truncated) k-component Gaussian mixture model $g(\bar{x}) = \sum_{i} \omega_i \, \mathcal{N}(\mu_i, \sigma^2_i)$?

I tried minimizing some divergence measure between the two, for example the Hellinger distance or the Kullback-Leibler divergence, but analytical solutions do not seem tractable, and for numerical approaches the computational cost rises too quickly as the number of kernels/dimensions increases.
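To make the numerical approach concrete, here is a minimal 1-D sketch of what I mean (all names and choices here are my own illustration, not from any reference): approximate $u = \mathrm{Uniform}(0,1)$ with a k-component mixture by minimizing $\mathrm{KL}(u \| g) = -\int_0^1 \log g(x)\,dx$ (since $\log u = 0$ on the support) via quadrature and a generic optimizer.

```python
# Hypothetical sketch: approximate Uniform(0, 1) with a K-component
# Gaussian mixture by numerically minimizing KL(u || g).
import numpy as np
from scipy.optimize import minimize
from scipy.stats import norm

K = 4                                   # number of mixture components
x = np.linspace(0.0, 1.0, 2000)         # quadrature grid on the support of u
dx = x[1] - x[0]

def mixture_pdf(params):
    """Evaluate g(x) = sum_i w_i N(x; mu_i, sigma_i^2) on the grid."""
    logits, mus, log_sigmas = np.split(params, 3)
    w = np.exp(logits) / np.exp(logits).sum()   # softmax -> weights sum to 1
    sigmas = np.exp(log_sigmas)                 # keeps sigma_i > 0
    return sum(w[i] * norm.pdf(x, mus[i], sigmas[i]) for i in range(K))

def kl_to_uniform(params):
    """KL(u || g) for u = Uniform(0,1); E_u[log u] = 0, so KL = -int log g dx."""
    g = np.maximum(mixture_pdf(params), 1e-300)  # guard against log(0)
    return -np.sum(np.log(g)) * dx

# Initialise components spread evenly across the support.
init = np.concatenate([np.zeros(K),                  # equal weights (logits)
                       np.linspace(0.1, 0.9, K),     # means
                       np.full(K, np.log(0.1))])     # log standard deviations
res = minimize(kl_to_uniform, init, method="Nelder-Mead",
               options={"maxiter": 5000})
print("final KL(u || g) ~", res.fun)
```

Even this toy version illustrates the scaling problem: the parameter count is 3K per dimension, the quadrature grid grows exponentially with dimension, and derivative-free optimizers like Nelder-Mead degrade quickly, which is exactly the cost blow-up I ran into.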

I suspect there have been previous studies on this topic, but I have not succeeded in finding anything useful. If someone could give me some pointers, or refer me to the relevant literature, it would be greatly appreciated!