I'm very new to FFmpeg. Consider the following case:

I have several ONVIF IP cameras connected to a network that also contains an IIS server. I'd like to allow clients to stream from any of the IP cameras on the network, but the traffic must go through the IIS server.

So basically each IP camera will send a single stream to the IIS server, and the IIS server will redistribute it to the many clients who request it. My question is: how do I set up the IIS server to work in this scenario? And what would an example FFmpeg command line look like that reads from an RTSP IP camera and sends it to the IIS server, which then redistributes it to the clients?

1 Answer

You can use HTTP-based streaming for this scenario, either HLS or DASH. HTTP streaming adds some latency, so you will need to do a bit of research on how to tweak the encoding parameters for low latency.

The basic idea is that you need to segment the incoming stream and make those segments and playlist/manifest available via your existing web server infrastructure.
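
On the IIS side nothing streaming-specific is required: the segments and the playlist are just static files under the web root, so IIS only has to serve them. You may need to register the MIME types for the HLS files if they are not already present. A minimal web.config sketch for the site (the live folder name matches the example below; adjust it to your layout):

<configuration>
  <system.webServer>
    <staticContent>
      <!-- MIME types so IIS will serve the HLS playlist and segments -->
      <mimeMap fileExtension=".m3u8" mimeType="application/vnd.apple.mpegurl" />
      <mimeMap fileExtension=".ts" mimeType="video/mp2t" />
    </staticContent>
  </system.webServer>
</configuration>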

Example for FFmpeg and HLS:

ffmpeg -i rtsp://input_stream.sdp -c:v libx264 -r 25 -g 25 -c:a libfdk_aac -hls_time 1 -hls_list_size 4 -hls_wrap 8 /path/to/webroot/live/playlist.m3u8

On the client you will then use the URL http://domain.com/live/playlist.m3u8. HLS is not supported natively on all devices, so use a web player like JW Player or Clappr. The client typically needs 3 segments before it starts playback.
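
For example, a minimal page using Clappr might look like this (the playlist URL matches the example above; the script path and the element id are placeholders):

<div id="player"></div>
<script src="clappr.min.js"></script> <!-- load Clappr from wherever you host it -->
<script>
  // point the player at the playlist that IIS serves
  var player = new Clappr.Player({
    source: "http://domain.com/live/playlist.m3u8",
    parentId: "#player",
    autoPlay: true
  });
</script>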

See the FFmpeg HLS documentation for the full list of options.

For DASH the idea is similar, but you also need to use MP4Box.
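
A rough sketch for the DASH path, assuming you first capture to a file and then segment it (file names, paths and the 60-second duration are placeholders for a test run):

# encode a short test clip from the camera (native aac encoder used here)
ffmpeg -i rtsp://input_stream.sdp -c:v libx264 -r 25 -g 25 -c:a aac -t 60 camera.mp4

# segment it for DASH; MP4Box writes manifest.mpd plus the segment files into the web root
MP4Box -dash 2000 -rap -profile dashavc264:live -out /path/to/webroot/live/manifest.mpd camera.mp4

IIS then serves manifest.mpd and the segments like any other static files, and the client needs a DASH-capable player such as dash.js. For a continuous live stream you would instead run MP4Box in its live dashing mode (-dash-live), which takes a bit more setup.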