
In a nutshell: Is there a way to capture/manipulate all audio produced by an app using RemoteIO?

I can get render callbacks that let me send audio to the speaker by hooking into the input scope of RemoteIO's output bus. But the input buffer in that callback does not contain the audio being produced elsewhere in the app by an AVPlayer. Is manipulating all of the app's audio even possible?

Here is my setup:

-(void)setup
{
    OSStatus status = noErr;

    // Find the RemoteIO audio component.
    // (fillRemoteIODesc is a helper defined elsewhere in the app.)
    AudioComponentDescription remoteIODesc;
    fillRemoteIODesc(&remoteIODesc);
    AudioComponent inputComponent = AudioComponentFindNext(NULL, &remoteIODesc);

    AudioComponentInstance remoteIO;
    status = AudioComponentInstanceNew(inputComponent, &remoteIO);
    assert(status == noErr);

    // 16-bit mono stream format (fillShortMonoASBD is another helper).
    AudioStreamBasicDescription desc = {0};
    fillShortMonoASBD(&desc);

    // Set the format on the input scope of element 0 (the speaker bus).
    status = AudioUnitSetProperty(remoteIO,
                                  kAudioUnitProperty_StreamFormat,
                                  kAudioUnitScope_Input,
                                  0,
                                  &desc,
                                  sizeof(desc));
    assert(status == noErr);

    // outputCallback supplies the audio RemoteIO sends to the speaker;
    // _state is passed through to it as the refCon.
    AURenderCallbackStruct callback;
    callback.inputProc = outputCallback;
    callback.inputProcRefCon = _state;

    status = AudioUnitSetProperty(remoteIO,
                                  kAudioUnitProperty_SetRenderCallback,
                                  kAudioUnitScope_Input,
                                  0,
                                  &callback,
                                  sizeof(callback));
    assert(status == noErr);

    status = AudioUnitInitialize(remoteIO);
    assert(status == noErr);

    status = AudioOutputUnitStart(remoteIO);
    assert(status == noErr);
}

1 Answer


Short answer: no, it doesn't work that way, unfortunately. You won't be able to add arbitrary processing to audio you're producing through AVFoundation (as of iOS 6).

You're misunderstanding the purpose of the RemoteIO unit. The RemoteIO gives you access to 2 things: the audio input hardware and the audio output hardware. As in, you can use the RemoteIO to get audio from the microphone, or send audio to the speakers. The RemoteIO unit won't let you grab audio that other parts of your app (e.g. AVFoundation) are sending to the hardware. Without getting too into it, this is because AVFoundation doesn't use the same audio pathway that you're using with the RemoteIO.

To manipulate audio at the level you want, you're going to have to go deeper than AVFoundation. Audio Queue Services is the next layer down, and gives you access to the audio in the form of an Audio Queue Processing Tap. This is probably the simplest way to start processing audio. There's not much documentation on it yet, though; probably the best source at the moment is the header AudioToolbox.framework/AudioQueue.h. Note that this API was only introduced in iOS 6.
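To give a rough idea of the shape of this API, here is a minimal sketch of attaching a processing tap. The assumptions are loud ones: `queue` is an `AudioQueueRef` your own code created (i.e. you'd play the file yourself through Audio Queue Services rather than AVPlayer), and `MyTapCallback` is a hypothetical name; error handling is omitted.

    // Sketch only: attach a processing tap to an audio queue you own (iOS 6+).
    static void MyTapCallback(void                         *inClientData,
                              AudioQueueProcessingTapRef    inAQTap,
                              UInt32                        inNumberFrames,
                              AudioTimeStamp               *ioTimeStamp,
                              AudioQueueProcessingTapFlags *ioFlags,
                              UInt32                       *outNumberFrames,
                              AudioBufferList              *ioData)
    {
        // Pull the queue's audio into ioData...
        AudioQueueProcessingTapGetSourceAudio(inAQTap, inNumberFrames, ioTimeStamp,
                                              ioFlags, outNumberFrames, ioData);
        // ...then modify the samples in ioData in place here.
    }

    UInt32 maxFrames = 0;
    AudioStreamBasicDescription processingFormat = {0};
    AudioQueueProcessingTapRef tap = NULL;
    OSStatus status = AudioQueueProcessingTapNew(queue,
                                                 MyTapCallback,
                                                 NULL,  // client data
                                                 kAudioQueueProcessingTap_PreEffects,
                                                 &maxFrames,
                                                 &processingFormat,
                                                 &tap);

The callback is invoked on the queue's behalf for each buffer; you call `AudioQueueProcessingTapGetSourceAudio` to fetch the source audio and then process it before it continues to the hardware.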

Deeper than that are Audio Units. This is where the RemoteIO unit lives. You can use the AUFilePlayer to produce sound from an audio file, then feed that audio to other Audio Units to process it (or do it yourself in a render callback). This will be quite a bit trickier and more verbose than AVFoundation (an understatement), but if you've already got a RemoteIO unit set up, you can probably handle it.
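As a sketch of what that wiring looks like, here's an AUGraph connecting an AUFilePlayer node into a RemoteIO node. This is an illustrative skeleton, not a complete player: error checking, and the file-scheduling steps on the player unit (opening the file and setting `kAudioUnitProperty_ScheduledFileIDs` etc.), are omitted.

    // Sketch only: AUFilePlayer -> RemoteIO via an AUGraph.
    AUGraph graph;
    NewAUGraph(&graph);

    AudioComponentDescription playerDesc = {0};
    playerDesc.componentType         = kAudioUnitType_Generator;
    playerDesc.componentSubType      = kAudioUnitSubType_AudioFilePlayer;
    playerDesc.componentManufacturer = kAudioUnitManufacturer_Apple;

    AudioComponentDescription ioDesc = {0};
    ioDesc.componentType         = kAudioUnitType_Output;
    ioDesc.componentSubType      = kAudioUnitSubType_RemoteIO;
    ioDesc.componentManufacturer = kAudioUnitManufacturer_Apple;

    AUNode playerNode, ioNode;
    AUGraphAddNode(graph, &playerDesc, &playerNode);
    AUGraphAddNode(graph, &ioDesc, &ioNode);
    AUGraphOpen(graph);

    // Player output bus 0 -> RemoteIO element 0 (the speaker path).
    AUGraphConnectNodeInput(graph, playerNode, 0, ioNode, 0);

    AUGraphInitialize(graph);
    AUGraphStart(graph);

Because the player's audio now flows through the graph rather than through AVFoundation's private pathway, you can insert an effect unit between the two nodes, or replace the direct connection with a render callback to touch the samples yourself.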