I am trying to use the new AVFoundation framework to take still pictures on the iPhone.
When a button is pressed, the method below is called. I can hear the shutter sound, but I never see the log output, and if I call the method several times the camera preview freezes.
Is there a tutorial anywhere on how to use captureStillImageAsynchronouslyFromConnection?
[[self stillImageOutput] captureStillImageAsynchronouslyFromConnection:
    [[self stillImageOutput].connections objectAtIndex:0]
    completionHandler:^(CMSampleBufferRef imageDataSampleBuffer, NSError *error) {
        NSLog(@"inside");
    }];
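For comparison, here is a minimal sketch of how the call could look: pick the connection that actually carries video instead of assuming it sits at index 0, and turn the sample buffer into image data inside the completion handler. The AVFoundation calls (inputPorts, jpegStillImageNSDataRepresentation:, etc.) are the standard ones; the surrounding arrangement is just one possible way to do it, not necessarily the fix for the freeze.

AVCaptureConnection *videoConnection = nil;
for (AVCaptureConnection *connection in [[self stillImageOutput] connections]) {
    for (AVCaptureInputPort *port in [connection inputPorts]) {
        if ([[port mediaType] isEqual:AVMediaTypeVideo]) {
            videoConnection = connection;
            break;
        }
    }
    if (videoConnection) { break; }
}

[[self stillImageOutput] captureStillImageAsynchronouslyFromConnection:videoConnection
    completionHandler:^(CMSampleBufferRef imageDataSampleBuffer, NSError *error) {
        if (error) {
            NSLog(@"Still image capture failed: %@", error);
            return;
        }
        // Convert the JPEG sample buffer into NSData / UIImage.
        NSData *jpegData = [AVCaptureStillImageOutput jpegStillImageNSDataRepresentation:imageDataSampleBuffer];
        UIImage *image = [[[UIImage alloc] initWithData:jpegData] autorelease];
        NSLog(@"captured image: %.0f x %.0f", image.size.width, image.size.height);
    }];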
- (void)initCapture {
    AVCaptureDeviceInput *captureInput = [AVCaptureDeviceInput
        deviceInputWithDevice:[AVCaptureDevice defaultDeviceWithMediaType:AVMediaTypeVideo]
                        error:nil];

    AVCaptureVideoDataOutput *captureOutput = [[AVCaptureVideoDataOutput alloc] init];
    captureOutput.alwaysDiscardsLateVideoFrames = YES;

    dispatch_queue_t queue;
    queue = dispatch_queue_create("cameraQueue", NULL);
    [captureOutput setSampleBufferDelegate:self queue:queue];
    dispatch_release(queue);

    NSString *key = (NSString *)kCVPixelBufferPixelFormatTypeKey;
    NSNumber *value = [NSNumber numberWithUnsignedInt:kCVPixelFormatType_32BGRA];
    NSDictionary *videoSettings = [NSDictionary dictionaryWithObject:value forKey:key];
    [captureOutput setVideoSettings:videoSettings];

    self.captureSession = [[AVCaptureSession alloc] init];
    self.captureSession.sessionPreset = AVCaptureSessionPresetLow;

    [self.captureSession addInput:captureInput];
    [self.captureSession addOutput:captureOutput];

    self.prevLayer = [AVCaptureVideoPreviewLayer layerWithSession:self.captureSession];
    [self.prevLayer setOrientation:AVCaptureVideoOrientationLandscapeLeft];
    self.prevLayer.frame = CGRectMake(0.0, 0.0, 480.0, 320.0);
    self.prevLayer.videoGravity = AVLayerVideoGravityResizeAspectFill;
    [self.view.layer addSublayer:self.prevLayer];

    // Setup the default file outputs
    AVCaptureStillImageOutput *_stillImageOutput = [[[AVCaptureStillImageOutput alloc] init] autorelease];
    NSDictionary *outputSettings = [[NSDictionary alloc] initWithObjectsAndKeys:
                                    AVVideoCodecJPEG, AVVideoCodecKey,
                                    nil];
    [_stillImageOutput setOutputSettings:outputSettings];
    [outputSettings release];
    [self setStillImageOutput:_stillImageOutput];

    if ([self.captureSession canAddOutput:stillImageOutput]) {
        [self.captureSession addOutput:stillImageOutput];
    }

    [self.captureSession commitConfiguration];
    [self.captureSession startRunning];
}
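Since initCapture also installs a sample buffer delegate for the video feed, here is a minimal sketch of what the corresponding AVCaptureVideoDataOutputSampleBufferDelegate callback could look like with the 32BGRA pixel format configured above; the processing step is only a placeholder, not part of the original code.

- (void)captureOutput:(AVCaptureOutput *)captureOutput
didOutputSampleBuffer:(CMSampleBufferRef)sampleBuffer
       fromConnection:(AVCaptureConnection *)connection
{
    // Called on the "cameraQueue" dispatch queue for every video frame.
    CVImageBufferRef pixelBuffer = CMSampleBufferGetImageBuffer(sampleBuffer);
    CVPixelBufferLockBaseAddress(pixelBuffer, 0);

    size_t width  = CVPixelBufferGetWidth(pixelBuffer);
    size_t height = CVPixelBufferGetHeight(pixelBuffer);
    // ... read the BGRA bytes via CVPixelBufferGetBaseAddress(pixelBuffer) here ...

    CVPixelBufferUnlockBaseAddress(pixelBuffer, 0);
    NSLog(@"video frame: %zu x %zu", width, height);
}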
I am able to use captureStillImageAsynchronouslyFromConnection at the same time as getting a video feed via AVCaptureVideoDataOutput's delegate (captureOutput:didOutputSampleBuffer:fromConnection:), and while using a previewLayer. I get a 5-megapixel image from the still image output and 852x640 frames for the video feed and previewLayer. – mahboudz
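For what it's worth, a minimal sketch of the session change that would match that setup, assuming the Photo preset is what yields the full-resolution still while the video data output and preview keep getting smaller frames:

// Assumption: switching from AVCaptureSessionPresetLow to the Photo preset is what
// allows a full-resolution (~5 MP) still from AVCaptureStillImageOutput while the
// AVCaptureVideoDataOutput delegate and the preview layer keep receiving smaller
// (roughly 852x640) frames, as described in the comment above.
self.captureSession.sessionPreset = AVCaptureSessionPresetPhoto;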