How to send stream to Client side? #19

Open · afnan opened this issue Oct 1, 2020 · 10 comments

afnan commented Oct 1, 2020

I am interested in displaying a live video stream to the client. I was thinking of using socket.io to send the stream. Can it be done?

gregnr (Contributor) commented Oct 14, 2020

Hi @afnan, great question. Though I have not personally tried to stream the camera video, there is certainly no reason why this wouldn't work.

Thoughts for socket.io/WebSocket

  • You will want to ensure your socket (socket.io or WebSocket) is set to binary mode.
  • Official stream support (i.e. .pipe()) may not be possible with socket.io. It should be doable with WebSockets, though, if you use the ws Node lib (e.g. cameraStream.pipe(webSocketStream)).
  • If you can't use .pipe(), you will need to subscribe to the .on('data', ...) or .on('frame', ...) camera stream event and manually emit the binary data to your socket.io or WebSocket channel (see the sketch after this list).
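
For what it's worth, a minimal sketch of the .pipe() route, assuming the ws package and the pi-camera-connect API used elsewhere in this thread (the port and the one-camera-per-connection setup are just placeholders for illustration):

const WebSocket = require("ws");
const { StreamCamera, Codec } = require("pi-camera-connect");

const wss = new WebSocket.Server({ port: 8081 });

wss.on("connection", async ws => {
  // One camera per connection keeps the sketch simple; a real app would share a single capture.
  const camera = new StreamCamera({ codec: Codec.MJPEG });
  await camera.startCapture();

  // Wrap the socket in a duplex stream so the camera stream can be piped straight into it.
  const socketStream = WebSocket.createWebSocketStream(ws);
  camera.createStream().pipe(socketStream);

  ws.on("close", () => camera.stopCapture());
});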

Easiest approach - MJPEG

The easiest solution is probably to use an MJPEG StreamCamera, subscribe to the .on('frame', ...) event to capture each JPEG frame, emit each JPEG image via your socket channel, subscribe to the channel on your client, and render the JPEG image on a canvas every time a frame comes in. You will likely need to convert the JPEG to a Blob or use a third party JPEG decoder in order to apply it to the canvas.
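
A rough sketch of that flow, under a few assumptions (socket.io v2-style setup, the pi-camera-connect API as used elsewhere in this thread, placeholder port/dimensions/IP); note that per the working code later in this thread, the 'frame' event appears to be emitted by the StreamCamera instance itself rather than by the stream returned by createStream():

// Server: broadcast each complete JPEG frame to every connected client
const { StreamCamera, Codec } = require("pi-camera-connect");
const io = require("socket.io")(3000);

const camera = new StreamCamera({ codec: Codec.MJPEG, width: 640, height: 480 });

camera.on("frame", frame => {
  // frame is a Buffer holding one complete JPEG image
  io.emit("frame", frame);
});

camera.startCapture();

And on the client, something along these lines:

<!-- Client: decode each incoming JPEG and paint it onto a canvas (substitute the IP address of the Pi) -->
<canvas id="viewer" width="640" height="480"></canvas>
<script src="http://192.168.0.1:3000/socket.io/socket.io.js"></script>
<script>
  const ctx = document.getElementById("viewer").getContext("2d");
  const socket = io("http://192.168.0.1:3000");
  socket.on("frame", async data => {
    // the frame Buffer arrives as an ArrayBuffer; wrap it in a Blob so the browser can decode the JPEG
    const bitmap = await createImageBitmap(new Blob([data], { type: "image/jpeg" }));
    ctx.drawImage(bitmap, 0, 0);
  });
</script>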

H.264 might work

The above solution is easy but quite likely not very performant. If you can guarantee low latency between your client and server (eg. both on same network), it might pass. Otherwise you might look into H.264 streaming, and decoding client side via a library like Broadway. This will likely have its own complexities.

WebRTC

Finally, it's worth noting that the industry doesn't (usually) use TCP sockets at all for live video streaming - rather, you send video data over UDP using protocols like RTP and RTCP. This is because TCP guarantees ordered packet delivery whereas UDP does not. If your connection has packet loss, TCP will keep retrying until it is successful (slow), whereas UDP will fire and forget. This is usually preferred for real-time video - if a video frame is lost, you might notice a blip, but your feed continues in real time. With TCP, you could get some significant video delay.

If you want to do this in the browser, you're looking at WebRTC. In essence your Node server would act as one WebRTC peer, and the client's browser would be the other peer. WebRTC has a ton of complexities you will need to work through, including signalling and session negotiation on top of the RTP/RTCP data.

I haven't seen any working examples of streaming video as a Node WebRTC peer yet. I might start by looking into node-webrtc or mediasoup. If anybody has seen or built something like this, please share!

afnan (Author) commented Oct 14, 2020

@gregnr Thank you for the reply. I have tried taking a JPEG image from the stream and sending it from the videoStream.on method. However, as mentioned in the documentation, picture capture is a slow process. It works, but the latency is high, especially on the Pi 4.

Is there a way to convert the binary data received in that handler to a picture on the client side? I assume that if I use .on('data', ...) the frame is not complete, hence the image does not render? I have tried parsing the sent data into a Blob and onto a canvas.

videoStream.on("frame", data => console.log("New data", data)); never gets fired

knivore commented Nov 3, 2020

Hi @afnan, can I check how you continued with this? I'm trying to stream video from the Pi to my server as well. Did you manage to get this working?

gregnr (Contributor) commented Nov 4, 2020

@afnan The on('frame') method should fire when a JPEG frame is captured. Can you confirm you are using the MJPEG video encoding?
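
For reference, a minimal sketch (using the pi-camera-connect API as it appears in this thread) under which frames should arrive. Note that in the working code later in this thread the 'frame' handler is attached to the StreamCamera instance itself, which may be why a handler on the stream from createStream() never fires:

const { StreamCamera, Codec } = require("pi-camera-connect");

// 'frame' delivers complete JPEG images, so the MJPEG codec is used here.
const streamCamera = new StreamCamera({ codec: Codec.MJPEG });

streamCamera.on("frame", frame => console.log("Frame received:", frame.length, "bytes"));

streamCamera.startCapture();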

axwaxw commented Dec 2, 2020

How would you do this with express?

I have tried

const express = require('express');
const fs = require('fs');

const app = express();
app.get('/stream', function (req, res) {
  fs.createReadStream('video-stream.mjpeg').pipe(res)
});
app.listen(3000);

but this only gives me a still from the video stream - at least I assume that is what I am getting since it does not 'move'!

Many thanks for this great package.

UPDATE:
I think I need to use the pattern employed in this project (which uses a different camera package):

UPDATE 2:
Using the pattern found in this project, I thought that the code shown below would work. However, I do not seem to be getting any data on the 'frame' event.

Furthermore, I think this approach will be unsuitable for handling multiple requests to the video feed. However, that is a moot point until I can work out how to get data on the 'frame' event...

const { StreamCamera, Codec } = require("pi-camera-connect");
const express = require('express')
const app = express()

app.listen(3000, () => console.log(`Listening on port 3000!`));

const runApp = async () => {
  const streamCamera = new StreamCamera({
    codec: Codec.MJPEG,
    width: 1280,
    height: 720,
    fps: 15,
  });

  await streamCamera.startCapture();

  const videoStream = streamCamera.createStream();

  app.get("/stream.mjpg", (req, res) => {

    res.writeHead(200, {
      'Cache-Control': 'no-store, no-cache, must-revalidate, pre-check=0, post-check=0, max-age=0',
      Pragma: 'no-cache',
      Connection: 'close',
      'Content-Type': 'multipart/x-mixed-replace; boundary=--myboundary'
    });

    console.log('Accepting connection: ' + req.hostname);
    let isReady = true;

    let frameHandler = (frameData) => {
      try {
        if (!isReady) {
          return;
        }
        isReady = false;
        console.log('Writing frame: ' + frameData.length);
        res.write(`--myboundary\nContent-Type: image/jpg\nContent-length: ${frameData.length}\n\n`);
        res.write(frameData, function () {
          isReady = true;
        });
      }
      catch (ex) {
        console.log('Unable to send frame: ' + ex);
      }
    }

    let frameEmitter = videoStream.on('frame', frameHandler);

    req.on('close', () => {
      frameEmitter.removeListener('frame', frameHandler);
      console.log('Connection terminated: ' + req.hostname);
    });
  })
}
runApp();

on the client:

<html>
  <body>
    <img src="http://192.168.0.1:3000/stream.mjpg"> <!-- substitute the IP address of the Pi -->
  </body>
</html>

digEmAll commented Dec 27, 2020

Yeah, I have the same interest as the OP, and I tried the approach of @axwaxw with the exact same issue (the 'frame' event on the stream never triggering).
At the moment I have solved the problem using takeImage() (wrapped in the FrameEmitter class below) instead of using the stream event.
This works fine even with multiple connections, but there are two problems:

  • first, the delay is high (a couple of seconds, but it could be my Wi-Fi latency; can someone confirm?)
  • second, I'm not sure whether this might lead to a stack overflow from too much recursion (see the loop-based sketch after the class below)...
// --> https://github.com/component/emitter/
const Emitter = require('emitter');
const { StreamCamera, Codec } = require('pi-camera-connect');

class FrameEmitter{
    constructor(){
        this.streamCamera = new StreamCamera({
            codec: Codec.MJPEG,
            width: 640,
            height: 480
        });
        this._callbacks = {};
        this.streamCamera.startCapture().then(() => this._takePic());
        this.emitter = new Emitter();
    }
    _takePic(){
        this.lastTime = process.hrtime();
        this.streamCamera.takeImage().then((img) => this._onFrame(img));
    }
    _onFrame(data){
        this.emitter.emit("frame",data);
        this._takePic();
    }
    on(fn){
        this.emitter.on("frame", fn);
    }
    off(fn){
        this.emitter.off("frame", fn);
    }
    stop(){
        return this.streamCamera.stopCapture(); // returns a promise!
    }
}
// create a singleton emitter
let frameEmitter = new FrameEmitter();
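
As an aside on the recursion question: each takeImage() callback runs from the promise microtask queue on a fresh stack, so the chaining above should not overflow the call stack. If you want to avoid it anyway, a loop-based variant of the capture method would look like this (a sketch; the running flag is hypothetical and not part of the class above):

    // Hypothetical alternative to the recursive _takePic()/_onFrame() pair:
    // a single async loop; each `await` resumes on a fresh stack frame.
    async _captureLoop(){
        this.running = true;
        while (this.running) {
            const img = await this.streamCamera.takeImage();
            this.emitter.emit("frame", img);
        }
    }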

Use with the code of @axwaxw (ref) by simply replacing:

    /*REPLACE THIS CODE*/
    let frameEmitter = videoStream.on('frame', frameHandler);

    req.on('close', () => {
      frameEmitter.removeListener('frame', frameHandler);
      console.log('Connection terminated: ' + req.hostname);
    });
    /*WITH THIS CODE*/
    frameEmitter.on(frameHandler);

    req.on("close", () => {
      frameEmitter.off(frameHandler);
      if (isVerbose) console.log("Connection terminated: " + req.hostname);
    });

axwaxw commented Dec 27, 2020

Thanks @digEmAll - please also check out this issue

neonwatty commented Oct 5, 2022

Make sure you stop the stream on request closure:

     const stopStream = async () => {
         await streamCamera.stopCapture();
     }

     req.on("close", () => {
       frameEmitter.off(frameHandler);
       if (isVerbose) console.log("Connection terminated: " + req.hostname);

       // stop stream
       stopStream();
     });

@neonwatty

@axwaxw Late to the party, but building on your approach, here's how I stream to multiple clients (~3 simultaneous clients work fine, with increasing lag as more clients join; lowering width/height helps, of course).

TL;DR: keep track of all connected client IPs in an array, and start/stop the stream object based on whether it is empty.

const streamCamera = new StreamCamera({
  codec: Codec.MJPEG,
  width: 500,
  height: 500,
  fps: 15,
});

const startupStream = async () =>{
  await streamCamera.startCapture();
}

const stopStream = async () =>{
  await streamCamera.stopCapture();
}

let activeIps = [];

app.get("/stream", async (req, res) => {
  // get host and requesting ips
  const hostName = req.hostname;
  const requestingIp = req.ip.toString().replace('::ffff:', '');
  console.log('Accepting connection: to camera ' + hostName + ' from ' + requestingIp);

  // startup camera if not already
  if (activeIps.length == 0)
  {
    startupStream()
  }

  // remove (if present) and re-add the client IP to the current activeIps array
  activeIps = activeIps.filter(x => x !== requestingIp)
  activeIps.push(requestingIp)

  // write data for res
  res.writeHead(200, {
    'Cache-Control': 'no-store, no-cache, must-revalidate, pre-check=0, post-check=0, max-age=0',
    Pragma: 'no-cache',
    Connection: 'close',
    'Content-Type': 'multipart/x-mixed-replace; boundary=--myboundary'
  });

  let isReady = true;
  let frameHandler = (frameData) => {
    try {
      if (!isReady) {
        return;
      }
      isReady = false;
      res.write(`--myboundary\nContent-Type: image/jpg\nContent-length: ${frameData.length}\n\n`);
      res.write(frameData, function () {
        isReady = true;
      });

    }
    catch (ex) {
      console.log('Unable to send frame: ' + ex);
    }
  }

  let frameEmitter = streamCamera.on('frame', frameHandler);

  // on close, update activeIps list and stop streaming object if activeIps empty
  req.on('close', () => {
    frameEmitter.removeListener('frame', frameHandler);
    activeIps = activeIps.filter(x => x !== requestingIp)
    if (activeIps.length==0) {
      stopStream()
    }
    console.log('Connection terminated: ' + requestingIp);
  });
})

@neonwatty

I also found that, for reasons I can't quite explain, using the standard Content-length in the header here

      res.write(`--myboundary\nContent-Type: image/jpg\nContent-length: ${frameData.length}\n\n`);

made the stream unviewable on mobile devices (iPhone / iPad).

Removing it allowed streaming to both mobile and desktop.

      res.write('--myboundary\nContent-Type: image/jpg\n\n');

As an alternative - after reading this great explainer I added a chunking header Transfer-Encoding: chunked as a fallback, and this also works

      res.write('--myboundary\nContent-Type: image/jpg\nTransfer-Encoding: chunked\n\n');
