
I have completed an RTMP player on iOS, using FFmpeg to decode FLV1 video and Speex audio. Now I want to capture from the iOS camera, encode H.264 video and AAC audio, and publish the streams to an RTMP server (the same Red5 server the player program used before). I know I need to recompile FFmpeg with libx264 and libaacplus to get video and audio encoding on iOS. But then how do I publish the RTMP live stream? Using RTMP_Write()? RTMP_SendPacket()? Please share some thoughts or solutions, or, if you're feeling generous, some code. Thanks!

Reference: capture camera and publish video with librtmp


1 Answer


FFmpeg supports RTMP input and output both through an internal protocol implementation ("rtmp") and through an external library (librtmp). The only reason I know of to choose one over the other is specific server support: one may simply work better than the other with a given server.

In FFmpeg, RTMP video is muxed into FLV, so as long as your output path/URI begins with "rtmp://..." it should just work. Nothing is stopping you from using librtmp directly, of course, but why bother?
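
For illustration, here is a minimal sketch of that route, assuming you already have encoded H.264 packets in hand. The URL, dimensions, and function name are placeholders of mine, and a real implementation must also attach SPS/PPS extradata (AVCC format) to codecpar->extradata and check every return value:

    #include <libavformat/avformat.h>

    /* Send one pre-encoded H.264 packet to an RTMP server; loop the
     * packet-writing part for a live stream, and add an AAC stream the
     * same way. */
    int publish_one_frame(const uint8_t *h264, int size, int64_t pts_ms)
    {
        const char *url = "rtmp://example.com/live/stream"; /* hypothetical */
        AVFormatContext *oc = NULL;

        avformat_network_init();

        /* Ask for the FLV muxer explicitly; the rtmp:// scheme selects
         * the protocol (native or librtmp, whichever your build has). */
        if (avformat_alloc_output_context2(&oc, NULL, "flv", url) < 0)
            return -1;

        AVStream *vst = avformat_new_stream(oc, NULL);
        vst->codecpar->codec_type = AVMEDIA_TYPE_VIDEO;
        vst->codecpar->codec_id   = AV_CODEC_ID_H264;
        vst->codecpar->width      = 640;   /* match your encoder's output */
        vst->codecpar->height     = 480;

        if (avio_open(&oc->pb, url, AVIO_FLAG_WRITE) < 0)
            return -1;
        if (avformat_write_header(oc, NULL) < 0)
            return -1;

        AVPacket pkt;
        av_init_packet(&pkt);
        pkt.data         = (uint8_t *)h264;
        pkt.size         = size;
        pkt.stream_index = vst->index;
        /* FLV timestamps are milliseconds; rescale into the muxer's base. */
        pkt.pts = pkt.dts = av_rescale_q(pts_ms, (AVRational){1, 1000},
                                         vst->time_base);
        av_interleaved_write_frame(oc, &pkt);

        av_write_trailer(oc);
        avio_closep(&oc->pb);
        avformat_free_context(oc);
        return 0;
    }

A nice property of this route is that switching between live RTMP output and a local .flv file for debugging is just a change of URL.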

Configuring your server to accept incoming streams, and figuring out which endpoint to view the stream on, can be its own little adventure.

(Disclaimer: I'm pretty much doing this right now, so I know it's possible and straightforward.)
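
Since you asked about RTMP_Write() specifically: if you do drive librtmp yourself, RTMP_Write() is the convenient call. It consumes FLV data (the same bytes the FLV muxer emits; it even skips a leading FLV file header) and repackages them as RTMP messages, while RTMP_SendPacket() is lower-level and expects you to assemble each RTMPPacket by hand. A minimal publishing sketch, with a placeholder URL and error handling trimmed to the essentials:

    #include <librtmp/rtmp.h>

    /* Publish a buffer of FLV data to an RTMP server. */
    int publish_flv(const char *flv_data, int size)
    {
        char url[] = "rtmp://example.com/live/stream"; /* hypothetical */
        RTMP *r = RTMP_Alloc();
        RTMP_Init(r);

        if (!RTMP_SetupURL(r, url))
            goto fail;

        RTMP_EnableWrite(r);   /* must be set before connecting to publish */

        if (!RTMP_Connect(r, NULL) || !RTMP_ConnectStream(r, 0))
            goto fail;

        /* Call this repeatedly as the encoder hands you more FLV tags. */
        if (RTMP_Write(r, flv_data, size) <= 0)
            goto fail;

        RTMP_Close(r);
        RTMP_Free(r);
        return 0;

    fail:
        RTMP_Close(r);
        RTMP_Free(r);
        return -1;
    }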