6 votes

In my program I want the user to be able to:

  • record his voice,
  • pause the recording process,
  • listen to what he recorded,
  • and then continue recording.

I have managed to get to the point where I can record and play recordings with AVAudioRecorder and AVAudioPlayer. But whenever I record, pause the recording and then try to play, playback fails with no error.

I can guess that the reason it's not playing is that the audio file hasn't been saved yet and is still in memory or something.

Is there a way I can play paused recordings? If there is, please tell me how.

I'm using Xcode 4.3.2.


3 Answers

10 votes

If you want to play the recording, then yes you have to stop recording before you can load the file into the AVAudioPlayer instance.
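
For the simple stop-then-play case, the flow looks roughly like the sketch below (in Swift for brevity; the `recorder` and retained `player` properties are illustrative names, not code from this answer):

```swift
import AVFoundation

final class RecorderController {
    // Assumed properties -- names are illustrative.
    var recorder: AVAudioRecorder?
    var player: AVAudioPlayer?   // must be retained, or playback stops silently

    func playCurrentRecording() {
        guard let recorder = recorder else { return }
        // Stop (not pause) so the file is finalized on disk first.
        recorder.stop()
        do {
            // Load the finished file from the recorder's own URL.
            player = try AVAudioPlayer(contentsOf: recorder.url)
            player?.prepareToPlay()
            player?.play()
        } catch {
            print("Playback failed: \(error)")
        }
    }
}
```

Keeping the player in a property (rather than a local variable) matters: a local AVAudioPlayer is deallocated as soon as the method returns, which silently cuts playback off.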

If you want to be able to play back some of the recording and then add more to it after listening (or, say, record in the middle), then you're in for some trouble.

You have to create a new audio file and then combine them together.

This was my solution:

// Generate a composition of the two audio assets that will be combined into
// a single track
AVMutableComposition* composition = [AVMutableComposition composition];
AVMutableCompositionTrack* audioTrack = [composition addMutableTrackWithMediaType:AVMediaTypeAudio
                                                                 preferredTrackID:kCMPersistentTrackID_Invalid];

// grab the two audio assets as AVURLAssets according to the file paths
AVURLAsset* masterAsset = [[AVURLAsset alloc] initWithURL:[NSURL fileURLWithPath:self.masterFile] options:nil];
AVURLAsset* activeAsset = [[AVURLAsset alloc] initWithURL:[NSURL fileURLWithPath:self.newRecording] options:nil];

NSError* error = nil;

// grab the portion of interest from the master asset
[audioTrack insertTimeRange:CMTimeRangeMake(kCMTimeZero, masterAsset.duration)
                    ofTrack:[[masterAsset tracksWithMediaType:AVMediaTypeAudio] objectAtIndex:0]
                     atTime:kCMTimeZero
                      error:&error];
if (error)
{
    // report the error
    return;
}

// append the entirety of the active recording
[audioTrack insertTimeRange:CMTimeRangeMake(kCMTimeZero, activeAsset.duration)
                    ofTrack:[[activeAsset tracksWithMediaType:AVMediaTypeAudio] objectAtIndex:0]
                     atTime:masterAsset.duration
                      error:&error];

if (error)
{
    // report the error
    return;
}

// now export the two files
// create the export session
// no need for a retain here, the session will be retained by the
// completion handler since it is referenced there

AVAssetExportSession* exportSession = [AVAssetExportSession
                                       exportSessionWithAsset:composition
                                       presetName:AVAssetExportPresetAppleM4A];
if (nil == exportSession)
{
    // report the error
    return;
}


NSString* combined = @"combined file path";// create a new file for the combined file

// configure export session  output with all our parameters
exportSession.outputURL = [NSURL fileURLWithPath:combined]; // output path
exportSession.outputFileType = AVFileTypeAppleM4A; // output file type

[exportSession exportAsynchronouslyWithCompletionHandler:^{

    // the completion handler fires when the export finishes, fails or is
    // cancelled -- check the status to see which
    switch (exportSession.status)
    {
        case AVAssetExportSessionStatusFailed:
            // report exportSession.error
            break;
        case AVAssetExportSessionStatusCompleted:
            // your code for dealing with the now combined file
            break;
        case AVAssetExportSessionStatusCancelled:
            break;
        default:
            break;
    }
}];

I can't take full credit for this work, but it was pieced together from the input of a couple of others:

AVAudioRecorder / AVAudioPlayer - append recording to file

(I can't find the other link at the moment)

3 votes

We had the same requirements for our app as the OP described, and ran into the same issue (i.e., the recording has to be stopped, instead of paused, if the user wants to listen to what she has recorded up to that point). Our app (project's GitHub repo) uses AVQueuePlayer for playback and a method similar to kermitology's answer to concatenate the partial recordings, with some notable differences:

  • implemented in Swift
  • concatenates multiple recordings into one
  • no messing with tracks

The rationale behind the last item is that simple recordings with AVAudioRecorder will have one track, and the main reason for this whole workaround is to concatenate those single tracks in the assets (see Addendum 3). So why not use AVMutableComposition's insertTimeRange method instead, which takes an AVAsset rather than an AVAssetTrack?

Relevant parts: (full code)

import UIKit
import AVFoundation

class RecordViewController: UIViewController {

    /* App allows volunteers to record newspaper articles for the
       blind and print-impaired, hence the name.
    */
    var articleChunks = [AVURLAsset]()

    func concatChunks() {
        let composition = AVMutableComposition()

        /* `CMTimeRange` to store total duration and know when to
           insert subsequent assets.
        */
        var insertAt = CMTimeRange(start: kCMTimeZero, end: kCMTimeZero)

        repeat {
            let asset = self.articleChunks.removeFirst()

            let assetTimeRange = 
                CMTimeRange(start: kCMTimeZero, end: asset.duration)

            do {
                try composition.insertTimeRange(assetTimeRange, 
                                                of: asset, 
                                                at: insertAt.end)
            } catch {
                NSLog("Unable to compose asset track.")
            }

            let nextDuration = insertAt.duration + assetTimeRange.duration
            insertAt = CMTimeRange(start: kCMTimeZero, duration: nextDuration)
        } while self.articleChunks.count != 0

        let exportSession =
            AVAssetExportSession(
                asset:      composition,
                presetName: AVAssetExportPresetAppleM4A)

        exportSession?.outputFileType = AVFileType.m4a
        exportSession?.outputURL = /* create URL for output */
        // exportSession?.metadata = ...

        exportSession?.exportAsynchronously {

            switch exportSession?.status {
            case .unknown?: break
            case .waiting?: break
            case .exporting?: break
            case .completed?: break
            case .failed?: break
            case .cancelled?: break
            case .none: break
            }
        }

        /* Clean up (delete partial recordings, etc.) */
    }
}

This diagram helped me get my head around what expects what and what inherits from where. (NSObject is the implicit superclass wherever there is no inheritance arrow.)



Addendum 1: I had my reservations regarding the switch part instead of using KVO on AVAssetExportSessionStatus, but the docs are clear that exportAsynchronously's callback block "is invoked when writing is complete or in the event of writing failure".

Addendum 2: Just in case someone has issues with AVQueuePlayer: 'An AVPlayerItem cannot be associated with more than one instance of AVPlayer'
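
One way to avoid that error is to build the queue from freshly created AVPlayerItems each time, instead of reusing items that were already attached to a player. A sketch, with `assets` standing in for the recorded chunks:

```swift
import AVFoundation

// Create a new AVQueuePlayer from fresh AVPlayerItems; reusing items
// that were already attached to another player raises the exception above.
func makeQueuePlayer(from assets: [AVURLAsset]) -> AVQueuePlayer {
    let items = assets.map { AVPlayerItem(asset: $0) }
    return AVQueuePlayer(items: items)
}
```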

Addendum 3: Unless you are recording in stereo, but as far as I know mobile devices have a single input. Also, doing any fancy audio mixing would require the use of AVMutableCompositionTrack as well. A good SO thread: Proper AVAudioRecorder Settings for Recording Voice?
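
For completeness, a single-channel (mono) voice-recorder setup might look like the sketch below; the format, sample rate, and quality values are illustrative choices, not settings taken from that thread:

```swift
import AVFoundation

// Illustrative mono (single-track) recorder; adjust settings to taste.
func makeVoiceRecorder() throws -> AVAudioRecorder {
    let settings: [String: Any] = [
        AVFormatIDKey: Int(kAudioFormatMPEG4AAC),
        AVSampleRateKey: 44_100.0,
        AVNumberOfChannelsKey: 1,   // one track per chunk, as assumed above
        AVEncoderAudioQualityKey: AVAudioQuality.medium.rawValue
    ]
    let url = FileManager.default.temporaryDirectory
        .appendingPathComponent("chunk.m4a")
    let recorder = try AVAudioRecorder(url: url, settings: settings)
    recorder.prepareToRecord()
    return recorder
}
```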

1 vote

RecordAudioViewController.h

#import <UIKit/UIKit.h>
#import <AVFoundation/AVFoundation.h>
#import <CoreAudio/CoreAudioTypes.h>

@interface record_audio_testViewController : UIViewController <AVAudioRecorderDelegate> {

    IBOutlet UIButton *btnStart;
    IBOutlet UIButton *btnPlay;
    IBOutlet UIActivityIndicatorView *actSpinner;
    BOOL toggle;

    // Variables set up for access in the class:
    NSURL *recordedTmpFile;
    AVAudioRecorder *recorder;
    NSError *error;
}

@property (nonatomic, retain) IBOutlet UIActivityIndicatorView *actSpinner;
@property (nonatomic, retain) IBOutlet UIButton *btnStart;
@property (nonatomic, retain) IBOutlet UIButton *btnPlay;

- (IBAction)start_button_pressed;
- (IBAction)play_button_pressed;
@end

RecordAudioViewController.m

@implementation record_audio_testViewController

@synthesize actSpinner, btnStart, btnPlay;

- (void)viewDidLoad {
    [super viewDidLoad];

    // Start the toggle in true mode.
    toggle = YES;
    btnPlay.hidden = YES;

    // Get the shared AVAudioSession instance.
    AVAudioSession *audioSession = [AVAudioSession sharedInstance];
    // Set up the audio session for playback and record.
    // We could just use record and then switch it to playback later, but
    // since we are going to do both, let's set it up once.
    [audioSession setCategory:AVAudioSessionCategoryPlayAndRecord error:&error];
    // Activate the session.
    [audioSession setActive:YES error:&error];
}


- (IBAction)start_button_pressed {

    if (toggle)
    {
        toggle = NO;
        [actSpinner startAnimating];
        [btnStart setTitle:@"Stop Recording" forState:UIControlStateNormal];
        btnPlay.enabled = toggle;
        btnPlay.hidden = !toggle;

        // Begin the recording session.
        // Error handling removed. Please add to your own code.

        // Set up the dictionary object with all the recording settings that this
        // recording session will use.
        // It's not clear to me which of these are required and which are the bare minimum.
        // This is a good resource: http://www.totodotnet.net/tag/avaudiorecorder/
        NSMutableDictionary *recordSetting = [[NSMutableDictionary alloc] init];
        [recordSetting setValue:[NSNumber numberWithInt:kAudioFormatAppleIMA4] forKey:AVFormatIDKey];
        [recordSetting setValue:[NSNumber numberWithFloat:44100.0] forKey:AVSampleRateKey];
        [recordSetting setValue:[NSNumber numberWithInt:2] forKey:AVNumberOfChannelsKey];

        // Now that we have our settings, instantiate the recorder.
        // Generate a temp file for use by the recording.
        // This sample was one I found online and seems to be a good choice for making a tmp file
        // that will not overwrite an existing one.
        // I know this is a mess of collapsed things in one call. I can break it out if need be.
        recordedTmpFile = [NSURL fileURLWithPath:[NSTemporaryDirectory() stringByAppendingPathComponent:[NSString stringWithFormat:@"%.0f.%@", [NSDate timeIntervalSinceReferenceDate] * 1000.0, @"caf"]]];
        NSLog(@"Using File called: %@", recordedTmpFile);
        // Set up the recorder to use this file and record to it.
        recorder = [[AVAudioRecorder alloc] initWithURL:recordedTmpFile settings:recordSetting error:&error];
        // The delegate receives recording callbacks such as
        // audioRecorderDidFinishRecording:successfully:.
        [recorder setDelegate:self];
        // Call this to initialize the recording subsystems so that when we
        // actually say "record" it starts right away.
        [recorder prepareToRecord];
        // Start the actual recording.
        [recorder record];
        // There is an optional method for recording for a limited time, see
        // [recorder recordForDuration:(NSTimeInterval)10]
    }
    else
    {
        toggle = YES;
        [actSpinner stopAnimating];
        [btnStart setTitle:@"Start Recording" forState:UIControlStateNormal];
        btnPlay.enabled = toggle;
        btnPlay.hidden = !toggle;

        NSLog(@"Using File called: %@", recordedTmpFile);
        // Stop the recorder.
        [recorder stop];
    }
}

- (void)didReceiveMemoryWarning {
    // Releases the view if it doesn't have a superview.
    [super didReceiveMemoryWarning];

    // Release any cached data, images, etc. that aren't in use.
}

- (IBAction)play_button_pressed {

    // The play button was pressed...
    // Set up the AVAudioPlayer to play the file that we just recorded.
    AVAudioPlayer *avPlayer = [[AVAudioPlayer alloc] initWithContentsOfURL:recordedTmpFile error:&error];
    [avPlayer prepareToPlay];
    [avPlayer play];
}

- (void)viewDidUnload {
    // Release any retained subviews of the main view.
    // e.g. self.myOutlet = nil;

    // Clean up the temp file.
    NSFileManager *fm = [NSFileManager defaultManager];
    [fm removeItemAtPath:[recordedTmpFile path] error:&error];
    // Release the remaining objects (never call -dealloc directly).
    [recorder release];
    recorder = nil;
    recordedTmpFile = nil;
}


- (void)dealloc {
    [super dealloc];
}

 @end

RecordAudioViewController.xib

Add two buttons: one to begin recording and another to play the recording.