robots.txt: disallow all query strings except one case


I can’t find a solution for my task.

For example, I have two pages: «/» and «/some-category/». Both accept GET parameters: «page» and several filters.

I need this result: «/», «/?page=x», and «/some-category/» should be allowed, while «/?page=x&» (that is, «page» combined with any other parameter) and «/some-category/?» (any query string on that path) should be disallowed.

I found this rule: «Disallow: /*?*». It prevents crawling of any URL with a query string, which is close, but what should I add to get the desired result?
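
Here is a minimal sketch of what I’m considering, assuming the crawler supports «Allow» rules and resolves conflicts by longest-match precedence (the way Googlebot does); both of those are extensions beyond the original robots exclusion standard, so this is an assumption, not a guaranteed behavior:

```
User-agent: *
# Block every URL that carries a query string (the rule I found)
Disallow: /*?*
# Assumption: re-allow URLs whose query string begins with «page=»;
# the longer pattern outranks «/*?*» under longest-match precedence
Allow: /*?page=
# ...but block «page» combined with any further parameter,
# since this pattern is longer than the Allow rule above
Disallow: /*?page=*&
```

As written, this would also keep «/some-category/?page=x» crawlable; if that should be blocked as well, a longer path-specific rule such as «Disallow: /some-category/?» would take precedence over the «Allow» line under the same longest-match logic. Crawlers that don’t support «Allow» or wildcards may behave differently.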