1 vote

I have been struggling with this since yesterday and would really appreciate some help.

I have a multichannel mixer audio unit, and the render callback assigned to each channel fills the requested audio buffer when called. I am trying to record within the same callback by writing the data to a file.

At the moment the audio records as noise if I don't call AudioUnitRender, and if I do call it I get two errors: error 10877 and error 50.

The recording code in the callback looks like this:

if (recordingOn) 
{
    AudioBufferList *bufferList = (AudioBufferList *)malloc(sizeof(AudioBuffer));

    SInt16 samples[inNumberFrames]; 
    memset (&samples, 0, sizeof (samples));

    bufferList->mNumberBuffers = 1;
    bufferList->mBuffers[0].mData = samples;
    bufferList->mBuffers[0].mNumberChannels = 2;
    bufferList->mBuffers[0].mDataByteSize = inNumberFrames*sizeof(SInt16);

    OSStatus status;
    status = AudioUnitRender(audioObject.mixerUnit,     
                             ioActionFlags, 
                             inTimeStamp, 
                             inBusNumber, 
                             inNumberFrames, 
                             bufferList);

    if (noErr != status) {
        printf("AudioUnitRender error: %ld", status); 
        return noErr;
    }

    ExtAudioFileWriteAsync(audioObject.recordingFile, inNumberFrames, bufferList);
}

Is it correct to write the data in each channel's callback, or should I connect it to the Remote I/O unit?

I am using LPCM, and the ASBD for the recording file (.caf) is:

recordingFormat.mFormatID = kAudioFormatLinearPCM;
recordingFormat.mFormatFlags = kAudioFormatFlagIsSignedInteger |
                               kAudioFormatFlagIsBigEndian |
                               kAudioFormatFlagIsPacked;
recordingFormat.mSampleRate = 44100;
recordingFormat.mChannelsPerFrame = 2;
recordingFormat.mFramesPerPacket = 1;
recordingFormat.mBytesPerPacket = recordingFormat.mChannelsPerFrame * sizeof (SInt16);
recordingFormat.mBytesPerFrame = recordingFormat.mChannelsPerFrame * sizeof (SInt16);
recordingFormat.mBitsPerChannel = 16;

I am not really sure what I am doing wrong.

How does stereo affect the way the recorded data must be handled before writing it to the file?


2 Answers

2 votes

There are a couple of issues here. If you are trying to record your final mix, you can add a callback on the I/O unit with AudioUnitAddRenderNotify(ioUnit, callback, file). That callback then simply takes the ioData it receives and passes it to ExtAudioFileWriteAsync(...), so you don't need to create any buffers yourself.

A side note: allocating memory in the render thread is bad. You should avoid all system calls in the render callback, because there is no guarantee they will complete within the very tight deadline the audio thread has. That is exactly why ExtAudioFileWriteAsync exists: it takes this into account and writes to disk on another thread.
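A minimal sketch of that approach, assuming an already-configured I/O unit and an open ExtAudioFileRef; ioUnit and recordingFile are placeholder names for your own objects:

static OSStatus renderNotify(void                        *inRefCon,
                             AudioUnitRenderActionFlags  *ioActionFlags,
                             const AudioTimeStamp        *inTimeStamp,
                             UInt32                       inBusNumber,
                             UInt32                       inNumberFrames,
                             AudioBufferList             *ioData)
{
    // The notify fires both before and after each render cycle;
    // only write once the unit has actually produced audio.
    if (*ioActionFlags & kAudioUnitRenderAction_PostRender) {
        ExtAudioFileRef recordingFile = (ExtAudioFileRef)inRefCon;
        // ExtAudioFileWriteAsync copies the buffers and performs the
        // actual disk write on its own thread, so it is safe to call here.
        ExtAudioFileWriteAsync(recordingFile, inNumberFrames, ioData);
    }
    return noErr;
}

// During setup (not on the render thread):
// prime the async writer once with zero frames so its internal buffers are
// allocated before the first real-time call, then register the notify.
// ExtAudioFileWriteAsync(recordingFile, 0, NULL);
// AudioUnitAddRenderNotify(ioUnit, renderNotify, recordingFile);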

0 votes

I found some demo code that may be useful for you.

Demo URL: https://github.com/JNYJdev/AudioUnit

or the blog post: http://atastypixel.com/blog/using-remoteio-audio-unit/

static OSStatus recordingCallback(void *inRefCon,
                                  AudioUnitRenderActionFlags *ioActionFlags,
                                  const AudioTimeStamp *inTimeStamp,
                                  UInt32 inBusNumber,
                                  UInt32 inNumberFrames,
                                  AudioBufferList *ioData)
{
    // Because of the way our audio format (set up below) is chosen:
    // we only need 1 buffer, since it is mono.
    // Samples are 16 bits = 2 bytes.
    // 1 frame includes only 1 sample.

    AudioBuffer buffer;

    buffer.mNumberChannels = 1;
    buffer.mDataByteSize = inNumberFrames * 2;
    buffer.mData = malloc(inNumberFrames * 2);

    // Put the buffer in an AudioBufferList
    AudioBufferList bufferList;
    bufferList.mNumberBuffers = 1;
    bufferList.mBuffers[0] = buffer;

    // Then: obtain the recorded samples
    OSStatus status;
    status = AudioUnitRender([iosAudio audioUnit],
                             ioActionFlags,
                             inTimeStamp,
                             inBusNumber,
                             inNumberFrames,
                             &bufferList);
    checkStatus(status);

    // Now the samples we just read are sitting in the buffers in bufferList.
    // Process the new data
    [iosAudio processAudio:&bufferList];

    // Release the malloc'ed data in the buffer we created earlier
    free(bufferList.mBuffers[0].mData);

    return noErr;
}
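For context, a recording callback like this is typically registered on the Remote I/O unit's input bus. A rough sketch, assuming audioUnit is your configured Remote I/O instance (the exact setup in the demo and blog linked above may differ):

// Register the callback for the microphone (input) side of the Remote I/O unit.
// Bus 1 is the input bus on the Remote I/O audio unit.
AURenderCallbackStruct callbackStruct;
callbackStruct.inputProc = recordingCallback;
callbackStruct.inputProcRefCon = NULL;   // or a pointer to your own state object

OSStatus status = AudioUnitSetProperty(audioUnit,
                                       kAudioOutputUnitProperty_SetInputCallback,
                                       kAudioUnitScope_Global,
                                       1,                    // input bus
                                       &callbackStruct,
                                       sizeof(callbackStruct));
checkStatus(status);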