My Node.js app uses FFmpeg to capture video from a DirectShow device and output segments for live streaming (HLS). At the moment I'm writing the segments to files, but if I could output them via a pipe it would let me efficiently send each segment over a websocket instead of hosting an HTTP server.
I've tried using this command:
ffmpeg -y -f dshow -i video="FFsource":audio="Stereo Mix (Realtek High Definition Audio)" -vcodec libvpx -acodec libvorbis -threads 0 -b:v 3300k -cpu-used 5 -keyint_min 150 -g 150 -map 0 -flags:v +global_header -f segment -
However, it gives the error "Could not write header for output file #0 (incorrect codec parameters ?): Muxer not found". The same command works when outputting to files (replacing '-' with 'seg_%03d.webm').
Does FFmpeg not support pipes for segmented video, or is there something wrong with the command? Thanks.