To specifically answer your question: no, there isn't a way to save an image from an AVCaptureMetadataOutput instance.
However, as codingVoldemort's excellent example shows, you can create an AVCaptureStillImageOutput instance and add it to your AVCaptureSession's outputs. Then, as soon as your app detects some metadata, you can trigger a capture on that AVCaptureStillImageOutput instance.
Here's a more explicit solution, using codingVoldemort's initial code as a base.
First, wherever you establish your AVCaptureSession, add an AVCaptureStillImageOutput to it:
_session = [[AVCaptureSession alloc] init];

// Metadata output drives the detection callbacks.
_output = [[AVCaptureMetadataOutput alloc] init];
[_output setMetadataObjectsDelegate:self queue:dispatch_get_main_queue()];
[_session addOutput:_output];

// Still image output lets us grab a frame once metadata is detected.
_stillImageOutput = [[AVCaptureStillImageOutput alloc] init];
if ([_session canAddOutput:_stillImageOutput]) {
    [_session addOutput:_stillImageOutput];
}
Now, in -captureOutput:didOutputMetadataObjects:fromConnection:, you can capture a still image when the delegate method is triggered:
AVCaptureConnection *stillImageConnection = [_stillImageOutput connectionWithMediaType:AVMediaTypeVideo];
[stillImageConnection setVideoOrientation:AVCaptureVideoOrientationPortrait];
[stillImageConnection setVideoScaleAndCropFactor:1.0f];
_stillImageOutput.outputSettings = @{AVVideoCodecKey: AVVideoCodecJPEG};
[_stillImageOutput captureStillImageAsynchronouslyFromConnection:stillImageConnection
                                                completionHandler:^(CMSampleBufferRef imageDataSampleBuffer, NSError *error) {
    if (error) {
        NSLog(@"error: %@", error);
    }
    else {
        NSData *jpegData = [AVCaptureStillImageOutput jpegStillImageNSDataRepresentation:imageDataSampleBuffer];
        UIImage *image = [UIImage imageWithData:jpegData];
        // The captured frame is available here as a UIImage.
        dispatch_async(dispatch_get_main_queue(), ^(void) {
            // Update UI if necessary.
        });
    }
}];
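One caveat worth adding (this is an assumption about typical usage, not part of the original code): the metadata delegate fires repeatedly for every frame in which the code is visible, so without a guard you'll kick off many overlapping captures. A minimal sketch, assuming you add a hypothetical _isCapturing BOOL ivar yourself:

// Hypothetical guard at the top of -captureOutput:didOutputMetadataObjects:fromConnection:
// _isCapturing is an ivar you would declare; reset it in the completion handler
// (or stop the session entirely) once the capture finishes.
if (_isCapturing) {
    return;
}
_isCapturing = YES;

Alternatively, calling [_session stopRunning] inside the completion handler halts further delegate callbacks once you have your image.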