I am working on a hobby project whose goal is to develop an Android application capable of streaming live feeds captured by webcams on a LAN, using FFmpeg as the underlying engine. So far, I have done the following -
A. Compiling and generating the FFmpeg libraries for the following configurations -
FFmpeg version: 2.0
NDK version: r8e & r9
Android Platform version: android-16 & android-18
Toolchain version: 4.6 & 4.8
Platform built on: Fedora 18 (x86_64)
B. Creating the files Android.mk & Application.mk in the appropriate paths (sketches of my current versions are below).
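For reference, here is roughly what my Android.mk and Application.mk look like (the paths and module names are placeholders for my local layout; the prebuilt-library setup is what I understood from the NDK documentation, so please correct me if it is wrong):

    # Android.mk - links my JNI glue code against the FFmpeg shared libraries built in step A
    LOCAL_PATH := $(call my-dir)

    # One prebuilt block per FFmpeg library (libavcodec/libavutil are declared the same way)
    include $(CLEAR_VARS)
    LOCAL_MODULE := libavformat
    LOCAL_SRC_FILES := ffmpeg/lib/libavformat.so
    include $(PREBUILT_SHARED_LIBRARY)

    # My native glue code
    include $(CLEAR_VARS)
    LOCAL_MODULE := native-streamer
    LOCAL_SRC_FILES := native_streamer.c
    LOCAL_C_INCLUDES := $(LOCAL_PATH)/ffmpeg/include
    LOCAL_SHARED_LIBRARIES := libavformat libavcodec libavutil
    LOCAL_LDLIBS := -llog
    include $(BUILD_SHARED_LIBRARY)

    # Application.mk
    APP_ABI := armeabi-v7a
    APP_PLATFORM := android-16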
However, when it came to writing the native code for accessing the appropriate FFmpeg functionality from the Java application layer, I got stuck on the following questions -
a) Which of FFmpeg's features do I need to expose from the native layer to the app layer for streaming real-time feeds?
b) To compile FFmpeg for Android, I followed this link. Are the compilation options sufficient for handling *.sdp streams, or do I need to modify them? (My current configure invocation is sketched after this list.)
c) Do I need to make use of live555?
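For context on question b), my configure invocation currently looks roughly like this (the cross-compile flags come from the guide I followed; the network/protocol/demuxer flags at the end are my own guesses at what *.sdp and RTSP handling needs):

    # Build host: Fedora 18 (x86_64); $TOOLCHAIN points at an NDK standalone toolchain
    ./configure \
        --target-os=linux \
        --arch=arm \
        --enable-cross-compile \
        --cross-prefix=$TOOLCHAIN/bin/arm-linux-androideabi- \
        --sysroot=$TOOLCHAIN/sysroot \
        --enable-shared --disable-static --disable-doc \
        --disable-ffmpeg --disable-ffplay --disable-ffprobe --disable-ffserver \
        --enable-network \
        --enable-protocol=udp --enable-protocol=rtp --enable-protocol=tcp \
        --enable-demuxer=rtsp --enable-demuxer=sdp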
I am totally new to FFmpeg and Android application development, and this is going to be my first serious project for the Android platform. I have been searching for tutorials on RTSP streaming with FFmpeg for a while now, without much success. I also tried the latest development build of the VLC player and found it great for streaming real-time feeds. However, it is a complex beast, and the goal of my project is much more limited: mostly learning, within a short time span.
Could you suggest some pointers (e.g. links, documents or sample code) on how I can write the native code to utilize the FFmpeg library, and then use that functionality from the app layer to stream real-time feeds? I would also really appreciate it if you could let me know what background knowledge is necessary for this project from a functional standpoint (in a language-agnostic sense).
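For concreteness, the rough shape of the native entry point I have in mind is shown below (the Java package, class and method names are placeholders I made up; the FFmpeg calls reflect my current understanding of the standard demuxing setup in the 2.0 API):

    #include <jni.h>
    #include <android/log.h>
    #include <libavformat/avformat.h>

    #define LOG(...) __android_log_print(ANDROID_LOG_INFO, "native-streamer", __VA_ARGS__)

    /* Open an RTSP/SDP URL and report whether a video stream was found.
     * Java side (placeholder names): class com.example.streamer.NativeStreamer,
     * declared as: private native int nativeOpen(String url); */
    JNIEXPORT jint JNICALL
    Java_com_example_streamer_NativeStreamer_nativeOpen(JNIEnv *env, jobject thiz, jstring jurl)
    {
        AVFormatContext *fmt = NULL;
        const char *url = (*env)->GetStringUTFChars(env, jurl, NULL);
        int ret;

        av_register_all();       /* register all demuxers/decoders (FFmpeg 2.0 API) */
        avformat_network_init(); /* required for rtsp://, rtp:// and udp:// inputs */

        ret = avformat_open_input(&fmt, url, NULL, NULL);
        if (ret < 0) {
            LOG("avformat_open_input failed: %d", ret);
            goto done;
        }
        ret = avformat_find_stream_info(fmt, NULL);
        if (ret < 0) {
            LOG("avformat_find_stream_info failed: %d", ret);
            goto done;
        }
        /* Locate the video stream; decoding and rendering would start from here. */
        ret = av_find_best_stream(fmt, AVMEDIA_TYPE_VIDEO, -1, -1, NULL, 0);
        LOG("best video stream index: %d", ret);

    done:
        if (fmt)
            avformat_close_input(&fmt);
        (*env)->ReleaseStringUTFChars(env, jurl, url);
        return ret;
    }

On the Java side I would load this with System.loadLibrary("native-streamer") and call nativeOpen() with the rtsp:// URL or the path of the *.sdp file. Is this roughly the right direction?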