How to send stream to Client side ? #19
Comments
Hi @afnan, great question. Though I have not personally tried to stream the camera video, there is certainly no reason why this wouldn't work. Thoughts for socket.io/WebSocket:

Easiest approach - MJPEG

The easiest solution is probably to use an MJPEG stream.

H.264 might work

The above solution is easy but quite likely not very performant. If you can guarantee low latency between your client and server (e.g. both on the same network), it might pass. Otherwise you might look into H.264 streaming and decoding client side via a library like Broadway. This will likely have its own complexities.

WebRTC

Finally, it's worth noting that the industry doesn't (usually) use TCP sockets at all for live video streaming - rather, you send video data over UDP using protocols like RTP and RTCP. This is because TCP guarantees ordered packet delivery whereas UDP does not. If your connection has packet loss, TCP will keep retrying until it is successful (slow), whereas UDP will fire and forget. This is usually preferred for real-time video - if a video frame is lost, you might notice a blip, but your feed continues in real time. With TCP, you could get some significant video delay.

If you want to do this in the browser, you're looking at WebRTC. In essence, your Node server would act as one WebRTC peer, and the client's browser would be the other peer. WebRTC has a ton of complexities you will need to work through, including signalling and session negotiation on top of the RTP/RTCP data. I haven't seen any working examples of streaming video as a Node WebRTC peer yet. I might start by looking into node-webrtc or mediasoup. If anybody has seen or built something like this, please share!
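To make the MJPEG-over-WebSocket idea concrete, here is a minimal sketch. It assumes the pi-camera-connect exports used elsewhere in this thread (StreamCamera, Codec), a socket.io v4 server, and an arbitrary "frame" event name; treat it as one possible shape rather than a tested implementation.

const http = require("http");
const { Server } = require("socket.io");              // socket.io v4 server
const { StreamCamera, Codec } = require("pi-camera-connect");

const server = http.createServer();
const io = new Server(server);

io.on("connection", socket => {
  // One capture per connected socket; the Pi camera can only be opened by one
  // capture at a time, so this sketch effectively serves a single viewer.
  const camera = new StreamCamera({ codec: Codec.MJPEG, width: 640, height: 480 });
  let streaming = true;

  socket.on("disconnect", () => { streaming = false; });

  (async () => {
    await camera.startCapture();
    while (streaming) {
      const frame = await camera.takeImage();  // Buffer containing one JPEG frame
      socket.emit("frame", frame);             // socket.io sends Buffers as binary
    }
    await camera.stopCapture();                // release the camera when the client leaves
  })().catch(console.error);
});

server.listen(3000);

On the browser side, the matching client would listen for the same "frame" event and render each payload, for example by wrapping the bytes in a Blob and pointing an <img> at URL.createObjectURL(...).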
@gregnr Thank you for the reply. I have tried taking a JPEG image from the stream and sending it in the videoStream.on method. However, as mentioned in the documentation, picture capture is a slow process. It works, but the latency is high, especially on the Pi 4. Is there a way to convert the binary data received…
Hi @afnan, can I check with you on how you continued with this? I'm trying to stream video from my Pi to my server as well. Were you able to get this working?
How would you do this with Express? What I have tried only gives me a still from the video stream - at least I assume that is what I am getting, since it does not 'move'! Many thanks for this great package.

UPDATE:

UPDATE 2: Furthermore, I think this approach will be unsuitable for handling multiple requests to the video feed. However, that is a moot point until I can work out how to get data on the 'frame' event...

on the client:
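For the client, a plain <img> pointing at a multipart MJPEG endpoint is usually enough in most browsers. Below is a hedged sketch of such an endpoint with Express, assuming the pi-camera-connect exports used elsewhere in this thread; the route path, boundary string and polling loop are illustrative, not @axwaxw's original code.

const express = require("express");
const { StreamCamera, Codec } = require("pi-camera-connect");

const app = express();
const BOUNDARY = "frame"; // arbitrary multipart boundary string

app.get("/stream.mjpg", async (req, res) => {
  const camera = new StreamCamera({ codec: Codec.MJPEG, width: 640, height: 480 });
  await camera.startCapture();

  res.writeHead(200, {
    "Content-Type": "multipart/x-mixed-replace; boundary=" + BOUNDARY,
    "Cache-Control": "no-cache",
  });

  let open = true;
  req.on("close", () => { open = false; }); // stop the loop when the client leaves

  try {
    while (open) {
      const frame = await camera.takeImage(); // one JPEG per call
      if (!open) break;                       // client may have left during the await
      res.write("--" + BOUNDARY + "\r\n");
      res.write("Content-Type: image/jpeg\r\n");
      res.write("Content-Length: " + frame.length + "\r\n\r\n");
      res.write(frame);
      res.write("\r\n");
    }
  } finally {
    await camera.stopCapture(); // always release the camera
  }
});

app.listen(3000);

Note that this opens the camera per request, so it serves one viewer at a time; the shared-emitter and multi-client variants later in the thread address that.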
Yeah, I have the same interest as the OP, and I tried the approach of @axwaxw with the exact same issue (the stream event not triggering), so I ended up with the workaround below, which polls takeImage() instead:
// --> https://github.com/component/emitter/
let Emitter = require('emitter');
// StreamCamera and Codec are assumed to come from the pi-camera-connect package
const { StreamCamera, Codec } = require('pi-camera-connect');

class FrameEmitter {
  constructor() {
    this.streamCamera = new StreamCamera({
      codec: Codec.MJPEG,
      width: 640,
      height: 480
    });
    this.emitter = new Emitter();
    this.streamCamera.startCapture().then(() => this._takePic());
  }

  // Grab one frame; the next one is requested as soon as this one resolves
  _takePic() {
    this.lastTime = process.hrtime();
    this.streamCamera.takeImage().then((img) => this._onFrame(img));
  }

  // Broadcast the JPEG buffer to every listener, then immediately request another frame
  _onFrame(data) {
    this.emitter.emit("frame", data);
    this._takePic();
  }

  on(fn) {
    this.emitter.on("frame", fn);
  }

  off(fn) {
    this.emitter.off("frame", fn);
  }

  stop() {
    return this.streamCamera.stopCapture(); // returns a promise!
  }
}
// create a singleton emitter
let frameEmitter = new FrameEmitter();

Use it with the code of @axwaxw (ref) by simply replacing:

/* REPLACE THIS CODE */
let frameEmitter = videoStream.on('frame', frameHandler);
req.on('close', () => {
frameEmitter.removeListener('frame', frameHandler);
console.log('Connection terminated: ' + req.hostname);
});

/* WITH THIS CODE */
frameEmitter.on(frameHandler);
req.on("close", () => {
frameEmitter.off(frameHandler);
if (isVerbose) console.log("Connection terminated: " + req.hostname);
});
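Put together, a route built from those two snippets might look roughly like the following. The multipart framing inside frameHandler is an assumption about what the original handler did; the on/off/close wiring follows the replacement snippet above, and the /stream.mjpg path and "frame" boundary are illustrative.

// Assumes an Express app and the frameEmitter singleton defined above
app.get("/stream.mjpg", (req, res) => {
  res.writeHead(200, {
    "Content-Type": "multipart/x-mixed-replace; boundary=frame",
    "Cache-Control": "no-cache",
  });

  // Called once per captured JPEG; writes it out as one multipart part
  const frameHandler = (frame) => {
    res.write("--frame\r\n");
    res.write("Content-Type: image/jpeg\r\n");
    res.write("Content-Length: " + frame.length + "\r\n\r\n");
    res.write(frame);
    res.write("\r\n");
  };

  frameEmitter.on(frameHandler);

  req.on("close", () => {
    frameEmitter.off(frameHandler);
    console.log("Connection terminated: " + req.hostname);
  });
});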
Thanks @digEmAll - please also check out this issue; make sure you stop the stream on request closure.
@axwaxw late to the party - but building on your approach, here's how I stream to multiple clients (~3 simultaneous clients work fine, with some increasing lag as more clients join - of course, lowering width/height helps). TL;DR: keep track of all connecting client IPs in an array, and start up / stop the stream object with reference to it.
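A rough sketch of that idea - one shared capture, started for the first client and stopped when the last one disconnects - might look like this. The Map keyed by client IP follows the comment above, while the route path, resolution and frame loop are illustrative rather than the commenter's exact code.

const express = require("express");
const { StreamCamera, Codec } = require("pi-camera-connect");

const app = express();
const camera = new StreamCamera({ codec: Codec.MJPEG, width: 320, height: 240 });

const clients = new Map(); // client IP -> response object (one stream per IP)
let capturing = false;

async function broadcastLoop() {
  await camera.startCapture();
  while (clients.size > 0) {
    const frame = await camera.takeImage();
    // Write the same JPEG to every connected client as a multipart part
    for (const res of clients.values()) {
      res.write("--frame\r\n");
      res.write("Content-Type: image/jpeg\r\n");
      res.write("Content-Length: " + frame.length + "\r\n\r\n");
      res.write(frame);
      res.write("\r\n");
    }
  }
  await camera.stopCapture(); // last client left - release the camera
  capturing = false;
}

app.get("/stream.mjpg", (req, res) => {
  res.writeHead(200, {
    "Content-Type": "multipart/x-mixed-replace; boundary=frame",
    "Cache-Control": "no-cache",
  });

  clients.set(req.ip, res);
  req.on("close", () => clients.delete(req.ip));

  // Start the shared capture only when the first client connects
  if (!capturing) {
    capturing = true;
    broadcastLoop().catch(console.error);
  }
});

app.listen(3000);

As noted above, lowering width/height reduces the per-frame work and helps as more clients attach.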
I also found that - for reasons I can't quite explain - using the standard header made the stream unviewable on mobile devices (iPhone / iPad). Removing it allowed streaming to both mobile and desktop.
As an alternative - after reading this great explainer - I added a chunking header.
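Assuming the chunking header being referred to is HTTP's Transfer-Encoding: chunked (the comment doesn't name it, so treat this as a guess), the header block might look something like this:

// Hedged guess at the kind of header block being described; the exact header
// that was removed for mobile compatibility is not named in the comment above.
res.writeHead(200, {
  "Content-Type": "multipart/x-mixed-replace; boundary=frame",
  "Cache-Control": "no-cache, no-store, must-revalidate",
  "Transfer-Encoding": "chunked", // the usual "chunking header" in HTTP/1.1
});

In practice, Node already applies chunked transfer encoding when no Content-Length is set, so adding the header explicitly mostly makes the intent visible.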
I am interested in displaying a live video stream to the client. I was thinking of using socket.io to send the stream. Can it be done?