I'm trying to make a basic online video editor with nodeJS and ffmpeg.

To do this I need 2 steps:

  1. Set the in and out points of the videos on the client, which requires the client to view the video at specific times and jump between positions. That is, if a single video is used as the input and split into smaller parts, playback needs to jump to the start time of the next edited segment.

  2. Send the in/out point data to nodeJS and export it with ffmpeg as a finished video.

At first I wanted to do step 1 purely on the client, then upload the source video(s) to nodeJS, generate the same result with ffmpeg, and send back the result.

But there are many problems with video processing on the client side in HTML at the moment, so now I have a change of plans: to do all of the processing on the nodeJS server, including the video playing.

This is the part I am stuck at now. I'm aware that ffmpeg can be used in many different ways from nodeJS, but I have not found a way to play a .mp4 webm video in realtime with ffmpeg, at a specific timestamp, and send the streaming video (again, at a certain timestamp) to the client.

I've seen the pipe:1 attribute from ffmpeg, but I couldn't find any tutorials to get it working with an mp4 webm video, and to parse the stdout data somehow with nodejs and send it to the client. And even if I could get that part to work, I still have no idea how to play the video, in realtime, at a certain timestamp.

I've also seen ffplay, but that's only for testing as far as I know; I haven't seen any way of getting the video data from it in realtime with nodejs.

So:

How can I play a video in nodeJS at a specific time (preferably with ffmpeg), and send it back to the client in realtime?

What I have already seen:

Best approach to real time http streaming to HTML5 video client

Live streaming using FFMPEG to web audio api

Ffmpeg - How to force MJPEG output of whole frames?

ffmpeg: Render webm from stdin using NodeJS

No data written to stdin or stderr from ffmpeg

node.js live streaming ffmpeg stdout to res

Realtime video conversion using nodejs and ffmpeg

Pipe output of ffmpeg using nodejs stdout

can't re-stream using FFMPEG to MP4 HTML5 video

FFmpeg live streaming webm video to multiple http clients over Nodejs

http://www.mobiuso.com/blog/2018/04/18/video-processing-with-node-ffmpeg-and-gearman/

stream mp4 video with node fluent-ffmpeg

How to get specific start & end time in ffmpeg by Node JS?

Live streaming: node-media-server + Dash.js configured for real-time low latency

Low Latency (50ms) Video Streaming with NODE.JS and html5

Server node.js for livestreaming

HLS Streaming using node JS

Stream part of the video to the client

Video streaming with HTML 5 via node.js

Streaming a video file to an html5 video player with Node.js so that the video controls continue to work?

How to (pseudo) stream H.264 video - in a cross browser and html5 way?

Pseudo Streaming an MP4 file

How to stream video data to a video element?

How do I convert an h.264 stream to MP4 using ffmpeg and pipe the result to the client?

https://medium.com/@brianshaler/on-the-fly-video-rendering-with-node-js-and-ffmpeg-165590314f2


Can Node.js edit video files?

1 Answer


This question is a bit broad, but I've built similar things and will try to answer this in pieces for you:

  1. Set the in and out points of the videos on the client, which requires the client to view the video at specific times and jump between positions. That is, if a single video is used as the input and split into smaller parts, playback needs to jump to the start time of the next edited segment.

Client-side, when you play back, you can simply use multiple HTMLVideoElement instances that reference the same URL.

For the timing, you can manage this yourself using the .currentTime property. However, you'll find that your JavaScript timing isn't going to be perfect. If you know your start/end points at the time of instantiation, you can use Media Fragment URIs:

video.src = 'https://example.com/video.webm#t=5.5,30';

In this example, the video starts at 5.5 seconds, and stops at 30 seconds. You can use the ended event to know when to start playing the next clip. This isn't guaranteed to be perfectly frame-accurate, but is pretty good for something like a live preview.
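Sequencing several fragments could be sketched like this. Note that `playClips` and the shape of the `clips` array are my own inventions for illustration, not a standard API; each clip is assumed to come from your editor UI as { src, start, end }:

```javascript
// Sketch: preview an edit list client-side using Media Fragment URIs.
function fragmentUrl(src, start, end) {
  return `${src}#t=${start},${end}`;
}

function playClips(video, clips) {
  let i = 0;
  const playNext = () => {
    if (i >= clips.length) return;
    const { src, start, end } = clips[i++];
    video.src = fragmentUrl(src, start, end);
    video.play();
  };
  // Per the note above, the ended event tells us a clip's end point was reached
  video.addEventListener('ended', playNext);
  playNext();
}
```

For example, `playClips(videoElement, [{ src: 'video.webm', start: 5.5, end: 30 }, { src: 'video.webm', start: 42, end: 60 }])` would preview two segments of the same source back to back.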

But there are many problems with video processing on the client side in HTML at the moment, so now I have a change of plans: to do all of the processing on the nodeJS server,...

Not a bad plan, if consistency is important.

... including the video playing.

There is a serious tradeoff you're making here, as far as latency to controlling that video, and quality of preview. I'd suggest a hybrid approach where editing is done client-side, but your final bounce/compositing/whatever is done server-side.

This isn't unlike how desktop video editing software works anyway.

This is the part I am stuck at now. I'm aware that ffmpeg can be used in many different ways from nodeJS, but I have not found a way to play a .mp4 webm video in realtime with ffmpeg, at a specific timestamp, and send the streaming video (again, at a certain timestamp) to the client.

Is it MP4, or is it WebM? Those are two distinct container formats. WebM is easily streamable, as piped directly out of FFmpeg. MP4 requires futzing with the MOOV atom (-movflags faststart), and can be a bit of a hassle.

In any case, it sounds like you just need to set timestamps on the input. One caveat: with -ss placed before -i, FFmpeg seeks the input and resets timestamps to zero, so place -to before -i as well (making it an absolute end time in the source), or use -t with a duration after the input instead:

ffmpeg -ss 00:01:23 -to 00:04:56 -i video.mp4 -f webm -

I've seen the pipe:1 attribute from ffmpeg, but I couldn't find any tutorials to get it working with an mp4 webm video, and to parse the stdout data somehow with nodejs and send it to the client.

Just use a hyphen - as the output filename and FFmpeg will output to STDOUT. Then, there's nothing else you need to do in your Node.js application... pipe that output directly to the client. Untested, but you're looking for something like this, assuming a typical Express app:

const child_process = require('child_process');

app.get('/stream', (req, res) => {
  const ffmpeg = child_process.spawn('ffmpeg', [
    '-i', 'video.mp4',
    '-f', 'webm',
    '-' // hyphen: write output to STDOUT
  ]);

  res.set('Content-Type', 'video/webm'); // TODO: Might want to set your codecs here also

  ffmpeg.stdout.pipe(res);

  // Don't leave FFmpeg running if the client disconnects
  res.on('close', () => ffmpeg.kill('SIGKILL'));
});

And even if I could get that part to work, I still have no idea how to play the video, in realtime, at a certain timestamp.

Well, for this, you're just playing a stream so you can just do:

<video src="https://your-nodejs-server.example.com/stream" preload="none"></video>

The preload="none" part is important, to keep it "live".

An alternative to all of this is to set up a GStreamer pipeline, and probably utilize its built-in WebRTC stack. This is not trivial, but has the advantage of potentially lower latency, and automatic handling of "catching up" to live video from the server. If you use the normal video tag, you'll have to handle that yourself by monitoring the buffered data and managing the playback speed.
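If you stick with the normal video tag, that "catching up" logic could look roughly like this; the thresholds are illustrative guesses, not tuned values:

```javascript
// Sketch: keep a live <video> near the stream edge by nudging
// playbackRate when too much data has buffered ahead of the playhead.
function catchUpRate(bufferedAhead) {
  if (bufferedAhead > 3) return 1.1;    // well behind live: speed up
  if (bufferedAhead < 0.5) return 1.0;  // at the edge: normal speed
  return 1.05;                          // mildly behind: gentle catch-up
}

// Browser-side wiring (checks video.buffered once a second):
// setInterval(() => {
//   const b = video.buffered;
//   if (!b.length) return;
//   const ahead = b.end(b.length - 1) - video.currentTime;
//   video.playbackRate = catchUpRate(ahead);
// }, 1000);
```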

I've also seen ffplay...

FFplay isn't relevant to your project.

Hopefully this pile of notes will give you some things to consider and look at.