The title pretty much sums up what I'm trying to achieve. I am trying to use Michael Tyson's TPCircularBuffer inside a render callback while the buffer is being filled with incoming audio data. I want to send the audio from the render callback to the output element of the RemoteIO audio unit so I can hear it through the device speakers.
The audio is interleaved 16-bit stereo coming in as packets of 2048 frames. Here's how I've set up the audio session and the RemoteIO unit:
#define kInputBus 1
#define kOutputBus 0
NSError *err = nil;
NSTimeInterval ioBufferDuration = 46;
AVAudioSession *session = [AVAudioSession sharedInstance];
[session setCategory:AVAudioSessionCategoryPlayback withOptions:AVAudioSessionCategoryOptionMixWithOthers error:&err];
[session setPreferredIOBufferDuration:ioBufferDuration error:&err];
[session setActive:YES error:&err];
AudioComponentDescription defaultOutputDescription;
defaultOutputDescription.componentType = kAudioUnitType_Output;
defaultOutputDescription.componentSubType = kAudioUnitSubType_RemoteIO;
defaultOutputDescription.componentManufacturer = kAudioUnitManufacturer_Apple;
defaultOutputDescription.componentFlags = 0;
defaultOutputDescription.componentFlagsMask = 0;
AudioComponent defaultOutput = AudioComponentFindNext(NULL, &defaultOutputDescription);
NSAssert(defaultOutput, @"Can't find default output.");
AudioComponentInstanceNew(defaultOutput, &remoteIOUnit);
UInt32 flag = 0;
OSStatus status = AudioUnitSetProperty(remoteIOUnit, kAudioOutputUnitProperty_EnableIO, kAudioUnitScope_Input, kOutputBus, &flag, sizeof(flag));
size_t bytesPerSample = sizeof(AudioUnitSampleType);
AudioStreamBasicDescription streamFormat = {0};
streamFormat.mSampleRate = 44100.00;
streamFormat.mFormatID = kAudioFormatLinearPCM;
streamFormat.mFormatFlags = kAudioFormatFlagsCanonical;
streamFormat.mBytesPerPacket = bytesPerSample;
streamFormat.mFramesPerPacket = 1;
streamFormat.mBytesPerFrame = bytesPerSample;
streamFormat.mChannelsPerFrame = 2;
streamFormat.mBitsPerChannel = bytesPerSample * 8;
streamFormat.mReserved = 0;
status = AudioUnitSetProperty(remoteIOUnit, kAudioUnitProperty_StreamFormat, kAudioUnitScope_Output, kInputBus, &streamFormat, sizeof(streamFormat));
AURenderCallbackStruct callbackStruct;
callbackStruct.inputProc = render;
callbackStruct.inputProcRefCon = (__bridge void *)self;
status = AudioUnitSetProperty(remoteIOUnit, kAudioUnitProperty_SetRenderCallback, kAudioUnitScope_Global, kOutputBus, &callbackStruct, sizeof(callbackStruct));
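Not shown above: after this setup the unit also gets initialized and started. I'm paraphrasing rather than pasting that part from my project, so treat it as a sketch, but it's just the standard calls:

status = AudioUnitInitialize(remoteIOUnit);  //allocate the unit's resources
status = AudioOutputUnitStart(remoteIOUnit); //start I/O so the render callback gets pulled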
And here's where the incoming audio data gets loaded into the circular buffer:
#define kBufferLength 2048
-(void)loadBytes:(Byte *)byteArrPtr {
    TPCircularBufferProduceBytes(&buffer, byteArrPtr, kBufferLength);
}
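The buffer itself gets initialized once before any packets arrive, along these lines (a sketch; the length here is illustrative, not necessarily what's in my project):

TPCircularBufferInit(&buffer, 32768); //ring buffer capacity in bytes; size is illustrative

The render callback then pulls from the buffer and copies it straight into the output: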
OSStatus render(void *inRefCon,
                AudioUnitRenderActionFlags *ioActionFlags,
                const AudioTimeStamp *inTimeStamp,
                UInt32 inBusNumber,
                UInt32 inNumberFrames,
                AudioBufferList *ioData)
{
    AUDIOIO *audio = (__bridge AUDIOIO *)inRefCon;
    AudioSampleType *outSample = (AudioSampleType *)ioData->mBuffers[0].mData;
    //Zero outSample
    memset(outSample, 0, kBufferLength);
    int bytesToCopy = ioData->mBuffers[0].mDataByteSize;
    SInt16 *targetBuffer = (SInt16 *)ioData->mBuffers[0].mData;
    //Pull audio
    int32_t availableBytes;
    SInt16 *buffer = TPCircularBufferTail(&audio->buffer, &availableBytes);
    memcpy(targetBuffer, buffer, MIN(bytesToCopy, availableBytes));
    TPCircularBufferConsume(&audio->buffer, MIN(bytesToCopy, availableBytes));
    return noErr;
}
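For context, loadBytes: gets fed by whatever receives the incoming packets; something like the sketch below, where handleIncomingAudio: is a stand-in name rather than my actual method. (One 2048-frame packet of interleaved 16-bit stereo works out to 2048 frames * 2 channels * 2 bytes = 8192 bytes.)

//Sketch only: handleIncomingAudio: stands in for my real receive path
-(void)handleIncomingAudio:(Byte *)packet {
    [self loadBytes:packet];
}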
There is something wrong with this setup, because I'm not getting any audio through the speakers, but I'm also not getting any errors when I test on the device. As far as I can tell, the TPCircularBuffer is being filled and read from correctly, and I've followed the Apple documentation for setting up the audio session. I'm considering setting up an AUGraph next, but I'd first like to see if anyone can suggest a fix for what I'm trying to do here. Thanks!