Awareness updates cause N^2 messages, leading to high CPU load on the server


When an awareness update is sent to the server, the server broadcasts it to all connected users (N). After each user receives the update, they send an awareness update back to the server, which makes the server broadcast again (N x N).

In general, the websocket server load seems to grow as O(N^3): awareness updates themselves occur roughly linearly with the number of users, and each one triggers N x N message transmissions.

This causes high CPU load on the server even with medium-sized rooms (20 users in the same room).
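The arithmetic can be checked with a tiny simulation. This is a hypothetical model of the feedback loop described above, not the actual y-websocket code; it assumes each client re-sends an awareness update exactly once, the first time it applies one from the server:

```javascript
// Count server -> client transmissions caused by ONE awareness update
// in a room of n users, under the echo behavior described above.
function simulate(n) {
  let serverBroadcasts = 0;          // server -> client transmissions
  const pending = [{ fresh: true }]; // the initial update from one client
  while (pending.length > 0) {
    const msg = pending.shift();
    serverBroadcasts += n;           // server relays the message to all n clients
    if (msg.fresh) {
      // every client applies the update, which fires its own 'update'
      // event and sends a (now redundant) message back to the server
      for (let i = 0; i < n; i++) pending.push({ fresh: false });
    }
  }
  return serverBroadcasts;
}

console.log(simulate(20)); // n + n*n = 420 transmissions for a single update
```

For 20 users that is 420 server-to-client sends per update, and since every user produces updates, total load scales roughly with N^3.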


I recently deployed a Yrb-actioncable server with a Yjs-based client. This service targets schools, so a collaborative room holds about 20 users.

I found that the actioncable (websocket) server had unexpectedly high CPU load with just a few classes (3~5).

I checked the logs, and the same message seems to be repeated N times.

Each “receive” log triggers a single broadcast, so the server is actually performing N*N single-message transmissions.
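The fan-out itself has roughly this shape (an illustrative relay-room sketch, not the actual Yrb-actioncable code): every incoming message costs one send per connection, so N incoming echoes cost N*N sends.

```javascript
// Minimal relay room: each received message is broadcast to every connection.
function makeRoom() {
  const connections = new Set();
  return {
    join(conn) { connections.add(conn); },
    receive(message) {
      // one "receive" log line corresponds to this loop: N sends
      for (const conn of connections) conn.send(message);
    },
  };
}
```

With 20 connections, the 20 echoed updates arriving after one original update each trigger this loop, i.e. 400 sends.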

I do not think this is optimal behavior, so I started to investigate.
(I am still not sure that these repeated message transmissions are the main cause of the CPU load.)


I actually deployed Yrb-actioncable, but it is essentially a different implementation of y-websocket, so I will cover only y-websocket here.

When an awareness update occurs via setLocalState, it fires the ‘update’ event on the Awareness object.

The websocket client listens for this ‘update’ event and sends an awareness update message to the websocket server.

The server receives this message and broadcasts it to all connected users (in the same room).

Each client receives the message from the server and applies the update to its Awareness object, which fires the ‘update’ event again.

Now each client sends a new awareness update message to the server (N awareness updates overall), and these messages are broadcast again.

This time there are no changes, so the cycle ends here.
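The steps above can be sketched end-to-end with a toy client/server pair (a hypothetical model; `server`, `makeClient`, and the `applied` flag are all illustrative, not the real y-websocket internals). The client's receive path re-sends unconditionally, which is what creates the echo:

```javascript
const log = [];

const server = {
  clients: [],
  receive(msg) {
    log.push('server receive');            // one log line per received message
    for (const c of this.clients) c.receive(msg); // broadcast to the room
  },
};

function makeClient(id) {
  return {
    applied: false,
    receive(msg) {
      if (this.applied) return;            // second round: no state change, loop ends
      this.applied = true;                 // applying the update fires 'update'...
      server.receive(`echo from ${id}`);   // ...and the handler re-sends to the server
    },
  };
}

for (let i = 0; i < 3; i++) server.clients.push(makeClient(i));
server.receive('initial update');          // one client's setLocalState
console.log(log.length);                   // 1 initial receive + 3 echoes = 4 receives
```

With 3 clients the server logs 4 “receive” lines for a single real update, matching the repeated log lines observed above.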


I think awareness update messages should not be sent when the Awareness object is updated by an external message.
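As a sketch, the ‘update’ handler could check the update's origin and skip re-sending when the change came from the provider itself. This assumes the ‘update’ listener receives an origin argument, as y-protocols' Awareness passes to its listeners; `TinyAwareness` below is a toy stand-in for the real class, used only to make the guard testable:

```javascript
// Toy Awareness with an 'update' event that carries an origin.
class TinyAwareness {
  constructor() { this.handlers = []; }
  on(event, fn) { if (event === 'update') this.handlers.push(fn); }
  emitUpdate(changes, origin) {
    for (const fn of this.handlers) fn(changes, origin);
  }
}

const sent = [];
const awareness = new TinyAwareness();
const provider = Symbol('provider'); // origin used when applying remote updates

awareness.on('update', (changes, origin) => {
  if (origin === provider) return; // came from the server: do not echo it back
  sent.push(changes);              // locally-originated change: send to server
});

awareness.emitUpdate({ clients: [1] }, 'local');  // sent to the server
awareness.emitUpdate({ clients: [2] }, provider); // suppressed
console.log(sent.length); // 1
```

With this guard, applying a server-originated update no longer produces an outgoing message, so the N-fold echo (and the N*N broadcasts it causes) never happens.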