2 votes

Simple question: how do I play multi-channel audio files (>2 channels) using AVAudioEngine so that I can hear all channels on the default 2-channel output (headphones/speaker)? The following code (stripped of error checking for presentation) plays the file's first two channels, but I can only hear it when headphones are plugged in.

AVAudioFile *file = [[AVAudioFile alloc] initForReading:[[NSBundle mainBundle] URLForResource:@"nums6ch" withExtension:@"wav"] error:nil];
AVAudioEngine *engine = [[AVAudioEngine alloc] init];
AVAudioPlayerNode *player = [[AVAudioPlayerNode alloc] init];
AVAudioMixerNode *mixer = [[AVAudioMixerNode alloc] init];
[engine attachNode:player];
[engine attachNode:mixer];
AVAudioFormat *processingFormat = [[AVAudioFormat alloc] initWithCommonFormat:AVAudioPCMFormatFloat32 sampleRate:file.processingFormat.streamDescription->mSampleRate channels:2 interleaved:false];
[engine connect:player to:mixer format:processingFormat];
[engine connect:mixer to:engine.outputNode format:processingFormat];
[engine startAndReturnError:nil];
[player scheduleFile:file atTime:nil completionHandler:nil];
[player play];

I tried a lot of combinations of formats for both the player->mixer and mixer->output connections, but they either give the same result as the code above or, more likely, crash with either:

ERROR:     [0x19c392310] AVAudioNode.mm:495: AUGetFormat: required condition is false: nil != channelLayout && channelLayout.channelCount == asbd.mChannelsPerFrame
*** Terminating app due to uncaught exception 'com.apple.coreaudio.avfaudio', reason: 'required condition is false: nil != channelLayout && channelLayout.channelCount == asbd.mChannelsPerFrame'

or AUSetFormat OSStatus code -10868 (which is kAudioUnitErr_FormatNotSupported, if I'm not mistaken). Using file.processingFormat for any connection crashes the app with the first error above. Also, the crash occurs on [player play];

There must be some way of playing it the way I want, because I'm able to do it without problems using AUGraph; however, since AVAudioEngine provides one feature I have to include, I'm stuck with it.

The multi-channel audio files I use to check my code can be found here.

UPDATE: OK, so hearing audio only in headphones was probably due to my forgetting to set the audio session active in my test app. But I still hear only the first two channels...
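For reference, this is roughly what activating the session looks like (a minimal sketch, assuming the plain playback category; error handling omitted):

#import <AVFoundation/AVFoundation.h>

// Configure and activate the shared audio session before starting the engine,
// so output is routed to the speaker/headphones as expected.
NSError *sessionError = nil;
[[AVAudioSession sharedInstance] setCategory:AVAudioSessionCategoryPlayback error:&sessionError];
[[AVAudioSession sharedInstance] setActive:YES error:&sessionError];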


2 Answers

2 votes

I had a similar issue in Swift. The error was 'com.apple.coreaudio.avfaudio', reason: 'required condition is false: !nodeimpl->SslEngineImpl()'.

The task was to play two audio files, one after the other. If I hit stop after playing the first audio file and then played the second audio file, the app crashed.

I found that in a function I created I had audioEngine.attachNode(audioPlayerNode), which meant the audioPlayerNode was only being attached to the audioEngine inside that function, and the setup was lost once it exited. So I moved this attachment to viewDidLoad() so that the node is attached up front and stays valid every time a file is played.
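The same shape in Objective-C, to match the rest of this thread (a sketch only; the engine/playerNode properties and the playFile: helper are illustrative assumptions, not code from my project):

// Attach and wire the player node once, up front.
- (void)viewDidLoad {
    [super viewDidLoad];
    self.engine = [[AVAudioEngine alloc] init];
    self.playerNode = [[AVAudioPlayerNode alloc] init];
    [self.engine attachNode:self.playerNode];
    [self.engine connect:self.playerNode to:self.engine.mainMixerNode format:nil];
    [self.engine startAndReturnError:nil];
}

// Reuse the already-attached node for every file instead of re-attaching it each time.
- (void)playFile:(AVAudioFile *)file {
    [self.playerNode stop];
    [self.playerNode scheduleFile:file atTime:nil completionHandler:nil];
    [self.playerNode play];
}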

1 vote

So here's what I've managed so far. It's far from perfect, but it somewhat works.

To get all channels you need to read the file into an AVAudioPCMBuffer and then copy each pair of channels into its own stereo AVAudioPCMBuffer. For each channel pair you also need a separate AVAudioPlayerNode; then just connect each player to the AVAudioMixerNode and we're done. Some simple code for 6-channel audio:

AVAudioFile *file = [[AVAudioFile alloc] initForReading:[[NSBundle mainBundle] URLForResource:@"nums6ch" withExtension:@"wav"] error:nil];
AVAudioFormat *outputFormat = [[AVAudioFormat alloc] initWithCommonFormat:AVAudioPCMFormatFloat32 sampleRate:file.processingFormat.sampleRate channels:2 interleaved:false];
// Read the whole file into one buffer in its native 6-channel processing format.
AVAudioPCMBuffer *wholeBuffer = [[AVAudioPCMBuffer alloc] initWithPCMFormat:file.processingFormat frameCapacity:(AVAudioFrameCount)file.length];
[file readIntoBuffer:wholeBuffer error:nil];
// One stereo buffer per channel pair.
AVAudioPCMBuffer *buffer1 = [[AVAudioPCMBuffer alloc] initWithPCMFormat:outputFormat frameCapacity:(AVAudioFrameCount)file.length];
AVAudioPCMBuffer *buffer2 = [[AVAudioPCMBuffer alloc] initWithPCMFormat:outputFormat frameCapacity:(AVAudioFrameCount)file.length];
AVAudioPCMBuffer *buffer3 = [[AVAudioPCMBuffer alloc] initWithPCMFormat:outputFormat frameCapacity:(AVAudioFrameCount)file.length];
// Copy channels 0-1, 2-3 and 4-5 into the stereo buffers (non-interleaved Float32, 4 bytes per sample).
memcpy(buffer1.audioBufferList->mBuffers[0].mData, wholeBuffer.audioBufferList->mBuffers[0].mData, wholeBuffer.audioBufferList->mBuffers[0].mDataByteSize);
memcpy(buffer1.audioBufferList->mBuffers[1].mData, wholeBuffer.audioBufferList->mBuffers[1].mData, wholeBuffer.audioBufferList->mBuffers[1].mDataByteSize);
buffer1.frameLength = wholeBuffer.audioBufferList->mBuffers[0].mDataByteSize / sizeof(float);
memcpy(buffer2.audioBufferList->mBuffers[0].mData, wholeBuffer.audioBufferList->mBuffers[2].mData, wholeBuffer.audioBufferList->mBuffers[2].mDataByteSize);
memcpy(buffer2.audioBufferList->mBuffers[1].mData, wholeBuffer.audioBufferList->mBuffers[3].mData, wholeBuffer.audioBufferList->mBuffers[3].mDataByteSize);
buffer2.frameLength = wholeBuffer.audioBufferList->mBuffers[0].mDataByteSize / sizeof(float);
memcpy(buffer3.audioBufferList->mBuffers[0].mData, wholeBuffer.audioBufferList->mBuffers[4].mData, wholeBuffer.audioBufferList->mBuffers[4].mDataByteSize);
memcpy(buffer3.audioBufferList->mBuffers[1].mData, wholeBuffer.audioBufferList->mBuffers[5].mData, wholeBuffer.audioBufferList->mBuffers[5].mDataByteSize);
buffer3.frameLength = wholeBuffer.audioBufferList->mBuffers[0].mDataByteSize / sizeof(float);

AVAudioEngine *engine = [[AVAudioEngine alloc] init];
AVAudioPlayerNode *player1 = [[AVAudioPlayerNode alloc] init];
AVAudioPlayerNode *player2 = [[AVAudioPlayerNode alloc] init];
AVAudioPlayerNode *player3 = [[AVAudioPlayerNode alloc] init];
AVAudioMixerNode *mixer = [[AVAudioMixerNode alloc] init];
[engine attachNode:player1];
[engine attachNode:player2];
[engine attachNode:player3];
[engine attachNode:mixer];
[engine connect:player1 to:mixer format:outputFormat];
[engine connect:player2 to:mixer format:outputFormat];
[engine connect:player3 to:mixer format:outputFormat];
[engine connect:mixer to:engine.outputNode format:outputFormat];
[engine startAndReturnError:nil];

[player1 scheduleBuffer:buffer1 completionHandler:nil];
[player2 scheduleBuffer:buffer2 completionHandler:nil];
[player3 scheduleBuffer:buffer3 completionHandler:nil];
[player1 play];
[player2 play];
[player3 play];

Now this solution is far from perfect, since there will be a delay between pairs of channels caused by calling play on each player at a slightly different time. I also still can't play the 8-channel audio from my test files (see link in OP): its AVAudioFile processing format reports 0 for the channel count, and even if I create my own format with the correct number of channels and layout, I get an error on buffer read. Note that I can play this file perfectly fine using AUGraph.
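Regarding the start-time offset: in principle AVAudioPlayerNode's playAtTime: lets you hand the same future start time to all three players instead of calling play on each one separately. A sketch only; as the EDIT below notes, node syncing turned out to be buggy at the time, so treat this as untested:

// Build a common start time slightly in the future (the engine must already be
// running, otherwise lastRenderTime is nil), then start every player at that time.
uint64_t startHostTime = player1.lastRenderTime.hostTime + [AVAudioTime hostTimeForSeconds:0.1];
AVAudioTime *startTime = [AVAudioTime timeWithHostTime:startHostTime];
[player1 playAtTime:startTime];
[player2 playAtTime:startTime];
[player3 playAtTime:startTime];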

So I will wait before accepting this answer; if you have a better solution, please share.

EDIT

So it appears that both my problem with syncing the nodes and the inability to play this particular 8-channel audio are bugs (confirmed by Apple developer support).

So, a little advice for people meddling with audio on iOS: while AVAudioEngine is fine for simple stuff, you should definitely go for AUGraph for more complicated things, even things that are supposed to work with AVAudioEngine. And if you don't know how to replicate certain AVAudioEngine features in AUGraph (like me), well, tough luck.