
I use an Audio Unit to record the voice and an Audio Queue to play the audio data. When I set:

[session setCategory:AVAudioSessionCategoryPlayAndRecord error:&error];

the volume of the played audio is very low. But when I set:

[session setCategory:AVAudioSessionCategoryMultiRoute error:&error];

the volume is normal, but the audio data output from the iOS microphone no longer has a consistent length. With the former setting, the audio data length is always 4096; the latter outputs lengths of 3760 and 3764. This causes a crash when I encode the audio data.

I found a suggested solution saying that I should open the audio player before opening the audio recorder, and then the problem would be solved. Unfortunately, I must open the audio recorder first. So I don't know how to configure the audio session so that I get both a loud volume and audio data of a consistent length.

When I use an iPad as the test device, I can use PlayAndRecord and get normal volume.

1 Answer


The 3760–3764 lengths are due to resampling from the hardware's 48 kHz rate to your 44.1 kHz sample rate: 4096 × 44100 / 48000 ≈ 3763.2, so the per-buffer sample count jitters around that value.

Use a circular/ring buffer (FIFO) to avoid problems encoding: always take exactly 4096 samples out of the buffer, and only when that much or more is already in it.
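The FIFO idea above can be sketched in plain C (which also compiles as Objective-C). The type, names, and capacity here are illustrative assumptions, not part of any iOS API:

```c
// Sketch of a ring-buffer FIFO that absorbs variable-length input
// buffers (e.g. 3760/3764 samples) and emits fixed 4096-sample blocks.
#include <stdbool.h>
#include <stddef.h>
#include <stdint.h>

#define FIFO_CAPACITY 16384  // comfortably more than 2x the block size

typedef struct {
    int16_t data[FIFO_CAPACITY];
    size_t head;   // read index
    size_t count;  // samples currently stored
} SampleFifo;

// Append n samples (drops samples if the FIFO is full; size it so
// that never happens in practice).
static void fifo_write(SampleFifo *f, const int16_t *in, size_t n) {
    for (size_t i = 0; i < n && f->count < FIFO_CAPACITY; i++) {
        f->data[(f->head + f->count) % FIFO_CAPACITY] = in[i];
        f->count++;
    }
}

// Fill `out` with exactly `block` samples, but only when that many
// are available; otherwise return false and leave the FIFO untouched.
static bool fifo_read_block(SampleFifo *f, int16_t *out, size_t block) {
    if (f->count < block) return false;  // not enough yet: wait
    for (size_t i = 0; i < block; i++) {
        out[i] = f->data[f->head];
        f->head = (f->head + 1) % FIFO_CAPACITY;
    }
    f->count -= block;
    return true;
}
```

In your recording callback you would `fifo_write` whatever length the microphone delivers, and in the encoding path call `fifo_read_block(&fifo, out, 4096)` in a loop, encoding only when it returns true.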

Also, try setting your preferred sample rate before activating your AudioSession.
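A minimal Objective-C sketch of that ordering, with error handling elided. The DefaultToSpeaker option is an assumption on my part: on iPhone, PlayAndRecord routes output to the receiver (earpiece) by default, which may be why the volume sounded low there but was normal on iPad.

```objc
AVAudioSession *session = [AVAudioSession sharedInstance];
NSError *error = nil;

// Configure category and preferred rate BEFORE activating the session.
[session setCategory:AVAudioSessionCategoryPlayAndRecord
         withOptions:AVAudioSessionCategoryOptionDefaultToSpeaker
               error:&error];
[session setPreferredSampleRate:44100.0 error:&error];

[session setActive:YES error:&error];
// Note: the preferred rate is a request, not a guarantee; check
// session.sampleRate afterwards and keep the FIFO as a safety net.
```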