I'm trying to implement an efficient audio buffer for radio streaming. Initially I approached the buffering as follows.
I start with an InputStream taken from an HttpUrlConnection. A circular buffer reads data from that input stream and fills up its buffers; in this case I have 3 buffers, but the number and size of the buffers can easily be changed. Once all the buffers are filled, I start writing data from the first buffer to an OutputStream that is connected to a second InputStream, so whatever I write to the OutputStream can then be read from that InputStream. MediaCodec reads the data from this InputStream, decodes it, and passes the decoded audio to AudioTrack.
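A minimal, self-contained sketch of that pipeline (the names, slot layout, and sizes here are my own, and a ByteArrayInputStream stands in for the stream from HttpUrlConnection):

```java
import java.io.*;

public class PipelineSketch {
    static final int BUFFER_COUNT = 3;    // number of slots (easy to change)
    static final int BUFFER_SIZE  = 4096; // bytes per slot (easy to change)

    public static void main(String[] args) throws IOException {
        // Stand-in for the HttpUrlConnection stream.
        InputStream network = new ByteArrayInputStream(new byte[10000]);

        // The pipe pair: MediaCodec would read from decoderInput.
        PipedInputStream decoderInput = new PipedInputStream(BUFFER_COUNT * BUFFER_SIZE);
        PipedOutputStream toDecoder = new PipedOutputStream(decoderInput);

        byte[][] ring = new byte[BUFFER_COUNT][BUFFER_SIZE];
        int[] filled = new int[BUFFER_COUNT];
        int total = 0;

        // Step 1: fill every slot from the network stream.
        for (int i = 0; i < BUFFER_COUNT; i++) {
            int n = 0, r;
            while (n < BUFFER_SIZE && (r = network.read(ring[i], n, BUFFER_SIZE - n)) != -1) {
                n += r;
            }
            filled[i] = n;
            total += n;
        }

        // Step 2: once all slots are full, drain them into the pipe.
        for (int i = 0; i < BUFFER_COUNT; i++) {
            toDecoder.write(ring[i], 0, filled[i]);
        }
        toDecoder.close();

        // Step 3: what the decoder's side of the pipe would see.
        int got = 0, r;
        byte[] out = new byte[total];
        while (got < out.length && (r = decoderInput.read(out, got, out.length - got)) != -1) {
            got += r;
        }
        System.out.println("buffered " + total + " bytes, decoder read " + got);
    }
}
```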
The problem with this setup is that after a while the OutputStream reaches the end of the circular buffer, so there is no "extra buffer" left between "OutputStream.giveMeNextBufferWithData" and "CircularBuffer.readDataFromInputStreamAndCreateBuffer" (pseudo-code names).
I've tried increasing the number and size of the buffers, but that didn't help: it only delays the moment when the OutputStream gets "hungry" for more data. If the decoder consumes bytes faster than the network delivers them, any finite amount of buffering will eventually drain.
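The arithmetic behind that: if the network delivers p bytes/s and playback consumes c > p bytes/s, a buffer of B bytes drains in roughly B / (c - p) seconds no matter how B is split into slots. The rates below are made-up numbers just to illustrate:

```java
public class DrainTime {
    public static void main(String[] args) {
        double p = 16000;  // hypothetical network delivery rate, bytes/s
        double c = 20000;  // hypothetical playback consumption rate, bytes/s
        for (int buffers : new int[] {3, 6, 12}) {
            double bytes = buffers * 4096.0;         // total buffered bytes
            double seconds = bytes / (c - p);        // time until underrun
            System.out.printf("%2d buffers -> underrun after %.2f s%n", buffers, seconds);
        }
    }
}
```

Doubling the buffer count doubles the delay, but the underrun still arrives.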
There is one very good library, https://code.google.com/p/aacdecoder-android/, which handles audio buffering very well, but unfortunately I couldn't figure out how its buffering is done, and it doesn't work that well under some circumstances.
So is there any "perfect" algorithm for such a task? I was thinking about double/triple buffering, but I'm not too sure about it, and after searching the net and implementing different solutions without success for the last few days, I eventually decided to ask here.
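For reference, this is a sketch of the double/triple-buffering idea I had in mind, using a bounded blocking queue so the producer blocks when all slots are full and the consumer blocks when they are empty, instead of ever reading a stale slot. All names here are mine, not a real API, and a ByteArrayInputStream again stands in for the HTTP stream:

```java
import java.io.*;
import java.util.Arrays;
import java.util.concurrent.ArrayBlockingQueue;
import java.util.concurrent.BlockingQueue;

public class BlockingRing {
    public static void main(String[] args) throws Exception {
        final int SLOTS = 3, SLOT_SIZE = 4096;
        BlockingQueue<byte[]> filledSlots = new ArrayBlockingQueue<>(SLOTS);

        // Stand-in for the HttpUrlConnection stream.
        InputStream network = new ByteArrayInputStream(new byte[20000]);

        Thread producer = new Thread(() -> {
            try {
                while (true) {
                    byte[] slot = new byte[SLOT_SIZE];
                    int n = network.read(slot, 0, SLOT_SIZE);
                    if (n == -1) { filledSlots.put(new byte[0]); break; } // empty slot = EOF marker
                    filledSlots.put(Arrays.copyOf(slot, n));              // blocks when all slots are full
                }
            } catch (Exception e) { throw new RuntimeException(e); }
        });
        producer.start();

        long total = 0;
        while (true) {
            byte[] slot = filledSlots.take();  // blocks until data arrives: never reads a stale slot
            if (slot.length == 0) break;       // EOF marker
            total += slot.length;              // here the bytes would go into the decoder's pipe
        }
        producer.join();
        System.out.println("consumed " + total + " bytes without reading a stale slot");
    }
}
```

Whether this is the right direction, or whether there is a better-known algorithm for stream buffering, is exactly what I'm asking.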
I hope I explained everything well.
Thank you all in advance!