I am trying to implement video capture in my app using AVFoundation. I have the following code in viewDidLoad:
session = [[AVCaptureSession alloc] init];
movieFileOutput = [[AVCaptureMovieFileOutput alloc] init];
videoInputDevice = [[AVCaptureDeviceInput alloc] init];

AVCaptureDevice *videoDevice = [self frontFacingCameraIfAvailable];
if (videoDevice)
{
    NSError *error;
    videoInputDevice = [AVCaptureDeviceInput deviceInputWithDevice:videoDevice error:&error];
    if (!error)
    {
        if ([session canAddInput:videoInputDevice])
            [session addInput:videoInputDevice];
        else
            NSLog(@"Couldn't add input.");
    }
}

AVCaptureDevice *audioCaptureDevice = [AVCaptureDevice defaultDeviceWithMediaType:AVMediaTypeAudio];
NSError *audioError = nil;
AVCaptureDeviceInput *audioInput = [AVCaptureDeviceInput deviceInputWithDevice:audioCaptureDevice error:&audioError];
if (audioInput)
{
    [session addInput:audioInput];
}

movieFileOutput = [[AVCaptureMovieFileOutput alloc] init];

Float64 totalSeconds = 35;       // total seconds
int32_t preferredTimeScale = 30; // frames per second
CMTime maxDuration = CMTimeMakeWithSeconds(totalSeconds, preferredTimeScale);
movieFileOutput.maxRecordedDuration = maxDuration;
movieFileOutput.minFreeDiskSpaceLimit = 1024 * 1024;

if ([session canAddOutput:movieFileOutput])
    [session addOutput:movieFileOutput];

[session setSessionPreset:AVCaptureSessionPresetMedium];
if ([session canSetSessionPreset:AVCaptureSessionPreset640x480]) // check size-based configs are supported before setting them
    [session setSessionPreset:AVCaptureSessionPreset640x480];

[self cameraSetOutputProperties];
[session startRunning];
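(cameraSetOutputProperties isn't shown here to keep this short; assume it just does the remaining preview/output setup. A simplified sketch of that kind of helper, where the preview-layer details are illustrative rather than my exact code:)

- (void)cameraSetOutputProperties
{
    // Attach a preview layer so the camera feed is visible in the view
    AVCaptureVideoPreviewLayer *previewLayer = [AVCaptureVideoPreviewLayer layerWithSession:session];
    previewLayer.frame = self.view.bounds;
    previewLayer.videoGravity = AVLayerVideoGravityResizeAspectFill;
    [self.view.layer addSublayer:previewLayer];
}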
This code is in the action method for the button that starts the capture, among other things:
NSString *outputPath = [[NSString alloc] initWithFormat:@"%@%@", NSTemporaryDirectory(), @"output.mov"];
NSURL *outputURL = [[NSURL alloc] initFileURLWithPath:outputPath];
NSFileManager *fileManager = [NSFileManager defaultManager];
if ([fileManager fileExistsAtPath:outputPath])
{
    NSError *error;
    if ([fileManager removeItemAtPath:outputPath error:&error] == NO)
    {
        // Error - handle if required
    }
}
[outputPath release];

// Start recording
[movieFileOutput startRecordingToOutputFileURL:outputURL recordingDelegate:self];
[outputURL release];
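For the recordingDelegate:self part, the AVCaptureFileOutputRecordingDelegate protocol requires captureOutput:didFinishRecordingToOutputFileURL:fromConnections:error:; a minimal stub of that callback (just logging, nothing else) looks like:

- (void)captureOutput:(AVCaptureFileOutput *)captureOutput
didFinishRecordingToOutputFileURL:(NSURL *)outputFileURL
      fromConnections:(NSArray *)connections
                error:(NSError *)error
{
    // Called when recording stops (max duration reached, stopRecording, or an error)
    if (error)
        NSLog(@"Recording failed: %@", error);
    else
        NSLog(@"Recording finished: %@", outputFileURL);
}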
When I try to run it on a device, it crashes as soon as the view that all of this is supposed to happen on loads. Xcode gives me "Thread 1: EXC_BAD_ACCESS (code=1, address=0x4)" at:

AVFoundation`-[AVCaptureDeviceInput _setDevice:]:
(stuff)
0x3793f608: ldr r0, [r1, r0]

The error is flagged at that last line. I assume this has something to do with AVCaptureDeviceInput somewhere, but I am blanking on what it could be. Does anyone have any idea what I am missing here? Thanks.
Edit: After fiddling with breakpoints, I've figured out that the crash happens at this line:
AVCaptureDevice *videoDevice = [self frontFacingCameraIfAvailable];
So it's something to do with that method? Here's the implementation I have for it; maybe something is wrong there.
- (AVCaptureDevice *)frontFacingCameraIfAvailable
{
    NSArray *videoDevices = [AVCaptureDevice devicesWithMediaType:AVMediaTypeVideo];
    AVCaptureDevice *captureDevice = nil;
    for (AVCaptureDevice *device in videoDevices)
    {
        if (device.position == AVCaptureDevicePositionFront)
        {
            captureDevice = device;
            break;
        }
    }

    // Couldn't find one on the front, so just get the default video device.
    if (!captureDevice)
    {
        captureDevice = [AVCaptureDevice defaultDeviceWithMediaType:AVMediaTypeVideo];
    }

    return captureDevice;
}
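If it helps narrow this down, a quick sanity check would be to log what devicesWithMediaType: actually returns before the loop, something like:

NSArray *videoDevices = [AVCaptureDevice devicesWithMediaType:AVMediaTypeVideo];
NSLog(@"Found %u video devices", (unsigned)[videoDevices count]);
for (AVCaptureDevice *device in videoDevices)
{
    // localizedName and position are standard AVCaptureDevice properties
    NSLog(@"Device: %@ (position %d)", device.localizedName, (int)device.position);
}

If that prints a sane device list, the method itself is presumably fine.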
Edit 2: Could it maybe be that I am using
AVCaptureDevice *videoDevice = [self frontFacingCameraIfAvailable];
and the 'self' is throwing a wrench in it somehow? I know that when creating a CALayer it is possible to do
CALayer *aLayer = [CALayer layer];
but I don't know the AVCaptureDevice equivalent of this, if there is one. I'm not sure what else it could be; by all accounts my code looks fine, and I've tried cleaning the project, restarting Xcode, restarting the computer, and so on.
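The only class-factory equivalents I know of are the ones already used above:

// AVCaptureDevice comes from class factories, not alloc/init:
AVCaptureDevice *device = [AVCaptureDevice defaultDeviceWithMediaType:AVMediaTypeVideo];

// Likewise for AVCaptureDeviceInput; the convenience constructor attaches the device up front:
NSError *error = nil;
AVCaptureDeviceInput *input = [AVCaptureDeviceInput deviceInputWithDevice:device error:&error];

Given that the trace mentions _setDevice:, could the bare [[AVCaptureDeviceInput alloc] init] at the top of viewDidLoad (which never gets a device attached) be involved somehow?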