
I've got a MOTU UltraLite mk4 USB audio interface attached to a Mac running macOS 10.13.3.

I'm trying to play three stereo sound files, each to its own stereo speaker.

To start with less complexity, I'm just trying to get the stereo sound of one AVAudioPlayerNode to come out of something other than the default output channels 0 and 1.

USB audio interface front

USB audio interface back

I'm hoping to accomplish this with AVAudioEngine plus a bit of low-level Core Audio.

So far, I've successfully managed to play audio to my USB audio interface, by configuring AVAudioEngine's output node:

AudioUnit audioUnit = [[self.avAudioEngine outputNode] audioUnit];
OSStatus error = AudioUnitSetProperty(audioUnit,
    kAudioOutputUnitProperty_CurrentDevice,
    kAudioUnitScope_Global,
    0,
    &deviceID,
    sizeof(deviceID));
if (error != noErr) {
    NSLog(@"Failed to set desired output audio device: %d", (int)error);
}

I've also successfully managed to set up a channel map which lets one player play a stereo file to channels 4+5 of my USB audio interface, as described by theanalogkid in Apple's dev forums.
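For reference, that channel-map step looks roughly like this. This is a hedged sketch: the 20-channel device size comes from my interface, and the Global scope / element 0 follow the linked forum snippet. Each entry in the map names the engine channel that feeds that device output, or -1 for silence.

```objectivec
// Global channel map on the engine's output AudioUnit (per the linked snippet).
// Assumption: the device exposes 20 output channels.
AudioUnit outputUnit = [[self.avAudioEngine outputNode] audioUnit];

SInt32 channelMap[20];
for (int i = 0; i < 20; i++) {
    channelMap[i] = -1;          // mute every device output by default
}
channelMap[4] = 0;               // engine channel 0 -> device output 4
channelMap[5] = 1;               // engine channel 1 -> device output 5

OSStatus mapError = AudioUnitSetProperty(outputUnit,
    kAudioOutputUnitProperty_ChannelMap,
    kAudioUnitScope_Global,
    0,
    channelMap,
    sizeof(channelMap));
if (mapError != noErr) {
    NSLog(@"Failed to set channel map: %d", (int)mapError);
}
```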

The trouble with the channel map, though, is that it applies globally to AVAudioEngine's output, so it affects all AVAudioPlayerNodes. That makes it unsuitable here.

So instead of using that channel map on AVAudioEngine's outputNode's AudioUnit, I tried to create an AVAudioFormat with a custom channel layout that specifies the discrete channel indices 4 and 5:

// Create audio channel layout struct with two channels
int numChannels = 2;
AudioChannelLayout *audioChannelLayout = calloc(1, sizeof(AudioChannelLayout) + (numChannels - 1) * sizeof(AudioChannelDescription));
audioChannelLayout->mNumberChannelDescriptions = numChannels;
audioChannelLayout->mChannelLayoutTag = kAudioChannelLayoutTag_UseChannelDescriptions;
audioChannelLayout->mChannelBitmap = 0;

// Configure channel descriptions
audioChannelLayout->mChannelDescriptions[0].mChannelFlags = kAudioChannelFlags_AllOff;
audioChannelLayout->mChannelDescriptions[0].mChannelLabel = kAudioChannelLabel_Discrete_4;
audioChannelLayout->mChannelDescriptions[0].mCoordinates[0] = 0;
audioChannelLayout->mChannelDescriptions[0].mCoordinates[1] = 0;
audioChannelLayout->mChannelDescriptions[0].mCoordinates[2] = 0;

audioChannelLayout->mChannelDescriptions[1].mChannelFlags = kAudioChannelFlags_AllOff;
audioChannelLayout->mChannelDescriptions[1].mChannelLabel = kAudioChannelLabel_Discrete_5;
audioChannelLayout->mChannelDescriptions[1].mCoordinates[0] = 0;
audioChannelLayout->mChannelDescriptions[1].mCoordinates[1] = 0;
audioChannelLayout->mChannelDescriptions[1].mCoordinates[2] = 0;

// Create AVAudioChannelLayout (the initializer copies the struct, so it can be freed)
AVAudioChannelLayout *avAudioChannelLayout = [[AVAudioChannelLayout alloc] initWithLayout:audioChannelLayout];
free(audioChannelLayout);

// Create AVAudioFormat
AVAudioOutputNode *outputNode = engine.avAudioEngine.outputNode;
AVAudioFormat *outputHWFormat = [outputNode outputFormatForBus:0];
AVAudioFormat *targetFormat = [[AVAudioFormat alloc] initWithStreamDescription:player.avAudioFile.processingFormat.streamDescription
                                                                 channelLayout:avAudioChannelLayout];

Then I connected the player's mixer to the engine's main mixer, using the output hardware format:

[engine connect:speakerMixer to:[engine mainMixerNode] format:outputHWFormat];

And the player to its mixer, using the custom AVAudioFormat with the tweaked channel layout:

[engine connect:player to:speakerMixer format:targetFormat];

Result? Sound plays, but still only out of the default 0 and 1 channels of the USB audio interface.

If I apply the targetFormat to the mixer-to-mainMixer connection, the USB audio interface receives no sound at all.

I've also attempted to apply a channel map to the AVAudioMixerNode's underlying AudioUnit; however, I can't seem to obtain a bare AudioUnit handle from the mixer node in order to apply the map. Perhaps this is not possible with AVAudioEngine. If so, are you aware of any other way to do this?
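As far as I can tell, only nodes descending from AVAudioUnit (effect and instrument nodes, for instance) expose an underlying AudioUnit; AVAudioMixerNode does not, which would explain the dead end. A sketch of the check (someNode is a placeholder for any attached node):

```objectivec
// Only AVAudioUnit subclasses expose an AudioUnit handle;
// mixer and player nodes do not.
if ([someNode isKindOfClass:[AVAudioUnit class]]) {
    AudioUnit au = [(AVAudioUnit *)someNode audioUnit];
    // AudioUnitSetProperty(au, ...) would be possible here
} else {
    NSLog(@"%@ exposes no underlying AudioUnit", someNode);
}
```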

Probably I've overlooked something critical. I've spent many days researching this and feel stuck. I'd appreciate any help I can get.

I think you need to set the bus 0 input-scope kAudioUnitProperty_StreamFormat on the output node's AudioUnit to describe your 6 (right?) channels. Not sure if AVAudioEngine will be happy with that; if not, you could switch to Core Audio. This seems to be at odds with the linked post, although that refers to the iOS multiroute AudioSession category, which probably does this step for you. — Rhythmic Fistman

@RhythmicFistman The output channels are indexed from 0 to 19, and the first 8 are routed to the analog outputs of the USB audio interface in discrete order. When I apply a channel map to the outputNode's underlying AudioUnit, it works. But how would you let individual player nodes or mixer nodes route their audio to specific channels if the format is applied globally? — codingChicken

This I don't know. Could the bus argument be used for that when connecting nodes? — Rhythmic Fistman

1 Answer


Many years ago there was a warning in the documentation of AVAudioEngine that an app should have only one instance.

Apparently this limitation has since disappeared: I've successfully created multiple instances, each with its own channel map applied to that engine's outputNode AudioUnit.

The documentation still claims that the engine's outputNode is a "singleton", though my tests on the Mac revealed that every AVAudioEngine instance has its own output node instance and that it is possible to set a channel map on one instance without affecting the output of another.

This isn't exactly what I had been looking for, though it does seem to work. I'd still prefer a solution where it is possible to route channels from player or mixer nodes to specific output channels of the audio hardware and do it all with one single AVAudioEngine instance. On the other hand, right now I struggle to come up with a good reason why it would be terrible to have multiple AVAudioEngine instances running.
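For anyone trying this, here is a minimal sketch of the per-engine setup. The 20-channel map size, the target outputs 4/5, and the already-obtained deviceID and file are assumptions carried over from the question; error handling is trimmed for brevity.

```objectivec
// One engine per stereo file, each with its own device + channel map.
AVAudioEngine *engine = [[AVAudioEngine alloc] init];
AVAudioPlayerNode *player = [[AVAudioPlayerNode alloc] init];
[engine attachNode:player];
[engine connect:player to:[engine mainMixerNode] format:[file processingFormat]];

AudioUnit outputUnit = [[engine outputNode] audioUnit];

// Route this engine to the USB interface (deviceID obtained elsewhere)
AudioUnitSetProperty(outputUnit, kAudioOutputUnitProperty_CurrentDevice,
                     kAudioUnitScope_Global, 0, &deviceID, sizeof(deviceID));

// Per-engine channel map: stereo content to device outputs 4 and 5
SInt32 channelMap[20];
for (int i = 0; i < 20; i++) channelMap[i] = -1;   // silence by default
channelMap[4] = 0;                                  // left  -> output 4
channelMap[5] = 1;                                  // right -> output 5
AudioUnitSetProperty(outputUnit, kAudioOutputUnitProperty_ChannelMap,
                     kAudioUnitScope_Global, 0, channelMap, sizeof(channelMap));

NSError *error = nil;
if ([engine startAndReturnError:&error]) {
    [player scheduleFile:file atTime:nil completionHandler:nil];
    [player play];
}
```

A second engine with its own map (e.g. outputs 2/3) can run alongside this one without disturbing it.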