Hi @dmonad – FYI, just published a rough Ace Editor binding for Yjs, y-ace!
I’m not so hip with writing tests etc., so I wasn’t sure whether it made sense to fork/PR against your empty y-ace repo, or to simply offer this as a starting point for you or others to polish the way the other bindings have been. Luckily Federico of room.sh was kind enough to share the binding they had for the very first versions of Yjs v13.0 (the API has changed drastically on the way to the current v13.4+), and it made a great starting point to work from. Over the past few days I’ve managed to sync cursors and selections via a custom technique within the ace binding (probably kinda sloppy). I initially tried to integrate the existing ace-collab-ext project, but just couldn’t get it working… nevertheless, the awareness protocol worked great for my needs.
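For anyone curious, the cursor syncing can be sketched roughly like this – a minimal outline, not the exact y-ace code, assuming an Ace `editor` instance and the `awareness` object from a y-websocket/y-webrtc provider; `renderRemoteCursor` is a hypothetical app-specific rendering callback:

```javascript
// Pure helper: collect the remote cursors out of an awareness state map.
function remoteCursors (states, localClientId) {
  const out = []
  states.forEach((state, clientId) => {
    if (clientId !== localClientId && state.cursor) {
      out.push({ clientId, cursor: state.cursor })
    }
  })
  return out
}

// Sketch of wiring Ace to the Yjs awareness protocol.
// `editor` is an Ace editor, `awareness` comes from the provider,
// `renderRemoteCursor` is a hypothetical rendering function.
function bindCursorAwareness (editor, awareness, renderRemoteCursor) {
  // Publish the local cursor whenever it moves.
  editor.selection.on('changeCursor', () => {
    awareness.setLocalStateField('cursor', editor.getCursorPosition()) // { row, column }
  })
  // Re-render remote cursors whenever any peer's state changes.
  awareness.on('change', () => {
    remoteCursors(awareness.getStates(), awareness.clientID)
      .forEach(({ clientId, cursor }) => renderRemoteCursor(clientId, cursor))
  })
}
```

Selections work the same way – publish the Ace range under another awareness field and draw markers for remote peers.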
The project I’m planning to implement Yjs in is built on socket.io for all its existing websocket needs… is there a documented way to keep using it alongside Yjs? I found your gist on using socket.io, but it wasn’t clear to me how to implement it. Is the simplest approach to leave my user/roles/status/etc. logic in the existing socket.io layer and have only the editor syncing take place over y-websocket? Or is it a bad idea to run those multiple websockets in parallel? With socket.io, I’m using the namespace feature within a class to isolate rooms, variables, etc. (P5LIVE COCODING server). Do you have an example of doing basic emit/broadcast within the y-websocket server, or are most things passed through the awareness states? I rely on socket.io’s ability to broadcast to all clients, to just one user, to everyone but the sender, etc… I’m not sure whether this can be done within y-websocket? Any tips are appreciated.
Lastly, do you have links/documentation for using combinations of WebRTC and websocket providers? I’ve tried searching the forum for ‘toggle, swap, hotswap, switch’, etc… I’m wondering if there’s a technique for using WebRTC with fewer than, say, 5 peers, but switching the provider to websockets the moment the group grows larger. If websockets are more stable for larger group sizes, then I look forward to experimenting with a parallel layer of synced shared values like MIDI data, which would definitely benefit from WebRTC’s low latency.
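To make that concrete, here’s the kind of switching I imagine – a sketch only, assuming both providers can attach to the same Y.Doc (Yjs does support parallel providers, so simply running both at once may be the easier answer); the room name and server URL are placeholders, and the switch here is one-way (WebRTC → websocket):

```javascript
// Pure helper: the threshold rule for which provider to prefer.
function pickProvider (peerCount, threshold = 5) {
  return peerCount < threshold ? 'webrtc' : 'websocket'
}

// Sketch: start on y-webrtc, bring up y-websocket once the room grows.
function setupProviders (ydoc) {
  const { WebrtcProvider } = require('y-webrtc')
  const { WebsocketProvider } = require('y-websocket')
  const webrtc = new WebrtcProvider('my-room', ydoc) // room name is a placeholder
  // connect: false keeps the websocket provider idle until we need it.
  const ws = new WebsocketProvider('wss://example.com', 'my-room', ydoc, { connect: false })

  webrtc.awareness.on('change', () => {
    const peers = webrtc.awareness.getStates().size
    if (pickProvider(peers) === 'websocket' && !ws.wsconnected) {
      ws.connect()        // hand sync over to the server…
      webrtc.disconnect() // …or keep both running; Yjs tolerates parallel providers
    }
  })
  return { webrtc, ws }
}
```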