I have a server-client setup where each client has a number of screens attached, and the screens together form one large display. As such, the visuals displayed by each client need to be roughly in sync. Luckily the domain is not high speed, so they don’t all have to be perfectly in sync, but obviously more in sync is better.
I am targeting a 100 ms lead time between receiving states and acting upon them; since I’m operating on a LAN, that is plenty. Packets are sent out at 100 Hz, and interpolation is handled through a buffer which automatically selects the right packets, so I don’t have to worry about that part. Visuals run at 60 FPS, so there is minimal difference frame to frame.
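To make the buffering scheme concrete, here is a minimal sketch of what I mean by "lead time plus a buffer that selects the right packets". The `Snapshot` type and the scalar payload are simplifications for the example; the real state would be richer:

```python
import bisect
from dataclasses import dataclass

@dataclass
class Snapshot:
    t: float      # timestamp of the state (seconds)
    value: float  # state payload, simplified to one scalar here

class InterpBuffer:
    """Render at (now - lead) and blend the two snapshots
    that bracket that point in time."""
    def __init__(self, lead=0.1):
        self.lead = lead   # 100 ms lead time
        self.snaps = []

    def push(self, snap):
        # Packets arrive roughly in order at 100 Hz; keep sorted by time.
        self.snaps.append(snap)
        self.snaps.sort(key=lambda s: s.t)

    def sample(self, now):
        if not self.snaps:
            return None
        t = now - self.lead  # render slightly in the past
        times = [s.t for s in self.snaps]
        i = bisect.bisect_right(times, t)
        if i == 0:
            return self.snaps[0].value    # before first snapshot
        if i == len(self.snaps):
            return self.snaps[-1].value   # past newest snapshot
        a, b = self.snaps[i - 1], self.snaps[i]
        alpha = (t - a.t) / (b.t - a.t)
        return a.value + alpha * (b.value - a.value)
```

The catch, and the reason for this question, is that `now` has to mean the same thing on every client, which brings me to the synchronization options below.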
Right now I see two basic solutions: 1. have each client sync with a time server, either over the internet or locally, or 2. implement the same logic in my own code and do the synchronization as part of establishing the connection. What would be the best way to synchronize time?
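For option 2, this is roughly what I had in mind: an NTP-style round-trip estimate done during the handshake. The `send_request` helper here is a hypothetical stand-in for whatever call pings the server and returns its timestamp:

```python
import time

def estimate_offset(send_request, n_samples=8):
    """Estimate the clock offset between this client and the server
    using round-trip timing (as in NTP / Cristian's algorithm).
    Keeps the sample with the lowest RTT, which has the least
    queueing noise.  `send_request` is assumed to send a ping and
    return the server's timestamp."""
    best = None
    for _ in range(n_samples):
        t0 = time.monotonic()
        server_time = send_request()   # network round trip (assumed helper)
        t1 = time.monotonic()
        rtt = t1 - t0
        # Assume symmetric latency: the server stamped at the midpoint.
        offset = server_time - (t0 + rtt / 2)
        if best is None or rtt < best[0]:
            best = (rtt, offset)
    return best[1]  # add this to the local clock to get server time
```

Each client would then feed `time.monotonic() + offset` into its interpolation logic. The obvious weakness is that the offset drifts over time unless it is re-estimated periodically, which is partly why I am unsure whether rolling this myself beats a proper time server.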
I’m concerned that I’m running into an XY problem here by assuming that synchronizing time this way is a good idea at all.
I have noticed that existing questions on this topic are about one server connecting to several clients, each showing its own independent visuals. I think this problem of requiring sync between clients to maintain visual continuity across screens is a bit different, so maybe different approaches apply. I am aware of the other questions that discuss the more common situation, though.