Google Search Console’s “Core Web Vitals” is showing these two graphs.
Notice that the number of “good” URLs in one graph exactly matches the number of “bad” URLs in the other.
The counts match on every single day, so it is unlikely to be a random coincidence.
Each report provides only one example URL, and it is the same URL in both cases (https://rbutterworth.nfshost.com/Tables/compose/).
The page is static, with no scripts or forms.
The site has hundreds of other pages (all likewise static, without forms). What is so special about these reported pages that every one of them would be “good” in one report and “bad” in the other?