On my backend Node.js server I need to open and keep open 1-20k websocket connections at the same time. Each connection sends me a small amount of data every few seconds, at random times, for about 4 hours a day. When data arrives I make a normal REST API call, cache the result in Redis, and persist it to MongoDB. How could I achieve this in a scalable way? The only approach I can think of right now is storing all the connections in an array. I am sure an array would work fine for a few connections, but I assume it would give me issues once there are 20k connections in it.
This is an example of how I connect to one of the websockets:
const WebSocket = require('ws') // using the "ws" package

const streamingUrl = `wss://streamingurl/streaming/${uniqueConnectionID}`
const ws = new WebSocket(streamingUrl, {
  perMessageDeflate: false
})

ws.on('message', async function (data) {
  const d = JSON.parse(data)
  if (d.data) {
    // url is my REST API endpoint
    const response = await fetch(url)
    const newData = await response.json()
    addDataToRedisCache(newData)
    addDataToMongoDb(newData)
  }
})
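For context, instead of an array I was also considering tracking the connections in a Map keyed by uniqueConnectionID, so lookup and removal stay O(1) instead of scanning the whole array. This is just a sketch of what I mean, not something I have tested at 20k connections (ConnectionRegistry and its method names are my own placeholder, not from any library):

```javascript
// Sketch: registry of open sockets keyed by connection ID.
// A Map gives O(1) add/get/delete, which matters more than the
// container itself once there are thousands of connections.
class ConnectionRegistry {
  constructor() {
    this.sockets = new Map() // uniqueConnectionID -> WebSocket
  }
  add(id, ws) {
    this.sockets.set(id, ws)
  }
  get(id) {
    return this.sockets.get(id)
  }
  remove(id) {
    this.sockets.delete(id)
  }
  get size() {
    return this.sockets.size
  }
  // Close every tracked socket, e.g. on graceful shutdown.
  closeAll() {
    for (const ws of this.sockets.values()) ws.close()
    this.sockets.clear()
  }
}
```

I would then call registry.remove(id) from each socket's 'close' handler so dead connections don't accumulate.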
I am wondering at a high level how I could design my backend to be able to handle this scenario?
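One idea I had for the write side: since 20k sockets each firing their own MongoDB insert sounds expensive, I could buffer incoming results and persist them in batches. Here is a rough sketch of what I mean (BatchBuffer is my own placeholder; flushFn stands in for something like collection.insertMany(batch) or a Redis pipeline, which I have not wired up here):

```javascript
// Sketch: collect items and flush them in batches, either when the
// batch is full or after a maximum wait, whichever comes first.
class BatchBuffer {
  constructor(flushFn, { maxSize = 500, maxWaitMs = 1000 } = {}) {
    this.flushFn = flushFn   // called with an array of buffered items
    this.maxSize = maxSize
    this.maxWaitMs = maxWaitMs
    this.items = []
    this.timer = null
  }
  push(item) {
    this.items.push(item)
    if (this.items.length >= this.maxSize) {
      this.flush()
    } else if (!this.timer) {
      // First item of a new batch: start the wait timer.
      this.timer = setTimeout(() => this.flush(), this.maxWaitMs)
    }
  }
  flush() {
    if (this.timer) {
      clearTimeout(this.timer)
      this.timer = null
    }
    if (this.items.length === 0) return
    const batch = this.items
    this.items = []
    this.flushFn(batch)
  }
}
```

Each 'message' handler would then call buffer.push(newData) instead of writing to MongoDB directly. Does something like this make sense at this scale, or is there a better pattern?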
Thanks,
R