Sounds like a security nightmare
The approach I'm looking at / developing at the moment: I'll grab the frame buffer of the app and run my own self-developed micro HTTP server. On connect, the HTTP server will stream what it claims is a video file but is in fact a live encode, started at the moment of that connection, of the frame buffers handed to it.
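Roughly the shape I mean, sketched in Node/TypeScript with an external ffmpeg process standing in for the live encoder (the real thing would be the self-written micro server pushing its own frame buffers, so the geometry, port, and ffmpeg use here are just assumptions for illustration):

```typescript
// Sketch: an HTTP endpoint that pretends to serve a video "file" but actually
// pipes a live ffmpeg encode that starts at connect time.
import * as http from "http";
import { spawn } from "child_process";

const WIDTH = 1280, HEIGHT = 720, FPS = 30; // assumed frame buffer geometry

http.createServer((req, res) => {
  // Claim it's an ordinary MP4 so the browser's <video> element will take it.
  res.writeHead(200, { "Content-Type": "video/mp4" });

  // Start a fresh encode for this connection: fragmented MP4 so it can be
  // consumed as it is produced, low-latency x264 settings.
  const ffmpeg = spawn("ffmpeg", [
    "-f", "rawvideo", "-pix_fmt", "rgba",
    "-s", `${WIDTH}x${HEIGHT}`, "-r", String(FPS), "-i", "-",
    "-c:v", "libx264", "-preset", "ultrafast", "-tune", "zerolatency",
    "-f", "mp4", "-movflags", "frag_keyframe+empty_moov",
    "pipe:1",
  ]);

  ffmpeg.stdout.pipe(res);               // encoded bytes go straight to the browser
  req.on("close", () => ffmpeg.kill());  // stop encoding when the viewer leaves

  // In the real setup, captured frame buffers would be pushed here:
  // ffmpeg.stdin.write(frameRgbaBytes);
}).listen(8080);
```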
If that works, then the browser will feed events like mouse movements and key presses back to the HTTP server through a WebSocket, and the HTTP server will pass them on to the app.
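The browser side of that input path could look something like this (a sketch only; the /input path and the JSON message shape are placeholders I made up):

```typescript
// Forward mouse and keyboard events to the HTTP server over a WebSocket.
const ws = new WebSocket(`ws://${location.host}/input`);

function send(msg: object): void {
  if (ws.readyState === WebSocket.OPEN) {
    ws.send(JSON.stringify(msg));
  }
}

document.addEventListener("mousemove", (e) =>
  send({ type: "move", x: e.clientX, y: e.clientY }));

document.addEventListener("mousedown", (e) =>
  send({ type: "click", button: e.button, x: e.clientX, y: e.clientY }));

document.addEventListener("keydown", (e) =>
  send({ type: "key", code: e.code }));
```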
Actually got the custom HTTP server / live encoding approach to work... Had to prebuffer 500 KB of data to get the browser to accept it (wget etc. would work without any prebuffering on the send). Now the browser refuses to play without building its own buffer (even though I already gave it one), so the best I can get so far is about a 3 second lag. Even something like 200 ms is unacceptable.
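The 500 KB hold-back is simple enough; slotted between the encoder output and the response in the earlier sketch it's roughly this (again just a sketch, the 500 KB figure is the one from my testing):

```typescript
import { Readable, Writable } from "stream";

// Buffer the first ~500 KB of encoded output before sending anything,
// then pass everything after that straight through.
const PREBUFFER_BYTES = 500 * 1024;

function prebuffer(src: Readable, dst: Writable, limit = PREBUFFER_BYTES): void {
  const held: Buffer[] = [];
  let heldSize = 0;
  let flushed = false;

  src.on("data", (chunk: Buffer) => {
    if (flushed) { dst.write(chunk); return; }
    held.push(chunk);
    heldSize += chunk.length;
    if (heldSize >= limit) {
      dst.write(Buffer.concat(held)); // browser only accepts the stream after this
      held.length = 0;
      flushed = true;
    }
  });
  src.on("end", () => dst.end());
}

// usage in the earlier sketch: prebuffer(ffmpeg.stdout, res);
```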
The more I get into this, the more I think the browsers are INTENTIONALLY buggy around live streaming to stop it. My guess is it's so big names like YouTube / Twitch can keep the market.
So far it looks like the only real way is to write your own native decoder with WebAssembly (something I had never heard of until a few days ago, even though it's been supported for years) and then decode and render yourself.
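Rough shape of that last route, with the actual WebAssembly decoder left as a hypothetical decodeFrame export (that's the part you'd have to write and compile yourself); fetch() pulls the raw stream and the decoded RGBA pixels get painted straight onto a canvas:

```typescript
const WIDTH = 1280, HEIGHT = 720; // assumed stream geometry

// Placeholder for the hand-written wasm decoder's JS-side interface:
// takes encoded bytes, returns RGBA pixels once a full frame is available.
declare function decodeFrame(encoded: Uint8Array): Uint8ClampedArray | null;

async function play(url: string, canvas: HTMLCanvasElement): Promise<void> {
  const ctx = canvas.getContext("2d")!;
  const reader = (await fetch(url)).body!.getReader();

  for (;;) {
    const { value, done } = await reader.read();
    if (done) break;
    const rgba = decodeFrame(value);   // may return null until a frame completes
    if (rgba) {
      ctx.putImageData(new ImageData(rgba, WIDTH, HEIGHT), 0, 0);
    }
  }
}
```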