
I have a barcode scanner I wrote using some of the new AVCapture APIs in iOS 7. Everything works great, but I would love to grab the image after I get the metadata from the capture output. The method below is the delegate where I do my lookup on SKU, etc., and I would like to grab the image as well. Is it possible to do this from this method?

- (void)captureOutput:(AVCaptureOutput *)captureOutput didOutputMetadataObjects:(NSArray *)metadataObjects fromConnection:(AVCaptureConnection *)connection
{
    ...
}

3 Answers


To answer your question specifically: no, there isn't a way to save an image from an AVCaptureMetadataOutput instance.

However, as codingVoldemort's excellent example shows, you can create an AVCaptureStillImageOutput instance and add it to your AVCaptureSession's outputs. Once your app has detected some metadata, you can immediately trigger a capture on that AVCaptureStillImageOutput instance.

Here's a slightly more explicit solution using codingVoldemort's initial code as a base:

First, wherever you establish your AVCaptureSession, add an AVCaptureStillImageOutput to it:

_session = [[AVCaptureSession alloc] init];

_output = [[AVCaptureMetadataOutput alloc] init];
[_output setMetadataObjectsDelegate:self queue:dispatch_get_main_queue()];
[_session addOutput:_output];

_stillImageOutput = [[AVCaptureStillImageOutput alloc] init];
[_session addOutput:_stillImageOutput];

Now, in -captureOutput:didOutputMetadataObjects:fromConnection:, you can capture a still image when the method is triggered:

AVCaptureConnection *stillImageConnection = [_stillImageOutput connectionWithMediaType:AVMediaTypeVideo];
[stillImageConnection setVideoOrientation:AVCaptureVideoOrientationPortrait];
[stillImageConnection setVideoScaleAndCropFactor:1.0f];
_stillImageOutput.outputSettings = @{AVVideoCodecKey: AVVideoCodecJPEG, AVVideoQualityKey: @1};

[_stillImageOutput captureStillImageAsynchronouslyFromConnection:stillImageConnection
                                                completionHandler:^(CMSampleBufferRef imageDataSampleBuffer, NSError *error) {
    if (error) {
        NSLog(@"error: %@", error);
        return;
    }
    NSData *jpegData = [AVCaptureStillImageOutput jpegStillImageNSDataRepresentation:imageDataSampleBuffer];
    UIImage *image = [UIImage imageWithData:jpegData];
    // Grab the image here.
    dispatch_async(dispatch_get_main_queue(), ^{
        // Update the UI if necessary.
    });
}];

(Note that outputSettings only needs to be assigned once; the original snippet set it twice, and the second assignment, using an NSDictionary literal, is the one kept here.)

Try this method:

- (void)captureZoomedImage:(AVCaptureOutput *)captureOutput didOutputSampleBuffer:(CMSampleBufferRef)sampleBuffer fromConnection:(AVCaptureConnection *)connection
{
    // Find out the current orientation and tell the still image output.
    AVCaptureConnection *stillImageConnection = [stillImageOutput connectionWithMediaType:AVMediaTypeVideo];
    UIDeviceOrientation curDeviceOrientation = [[UIDevice currentDevice] orientation];
    AVCaptureVideoOrientation avcaptureOrientation = [self avOrientationForDeviceOrientation:curDeviceOrientation];
    [stillImageConnection setVideoOrientation:avcaptureOrientation];
    [stillImageConnection setVideoScaleAndCropFactor:1.0f];
    stillImageOutput.outputSettings = @{AVVideoCodecKey: AVVideoCodecJPEG, AVVideoQualityKey: @1};

    [stillImageOutput captureStillImageAsynchronouslyFromConnection:stillImageConnection
                                                  completionHandler:^(CMSampleBufferRef imageDataSampleBuffer, NSError *error) {
        if (error) {
            [self displayErrorOnMainQueue:error withMessage:@"Take picture failed"];
            return;
        }
        NSData *jpegData = [AVCaptureStillImageOutput jpegStillImageNSDataRepresentation:imageDataSampleBuffer];
        UIImage *image = [UIImage imageWithData:jpegData];
        // Grab the image here.
        dispatch_async(dispatch_get_main_queue(), ^{
            // Update the UI if necessary.
        });
    }];
}

I would like to translate Tim's answer to Swift =) Here is the first section:

let session = AVCaptureSession()
let metadataOutput = AVCaptureMetadataOutput()
let stillCameraOutput = AVCaptureStillImageOutput()
let sessionQueue = dispatch_get_main_queue()

metadataOutput.setMetadataObjectsDelegate(self, queue: sessionQueue)
if session.canAddOutput(metadataOutput) {
    session.addOutput(metadataOutput)
}
if session.canAddOutput(stillCameraOutput) {
    session.addOutput(stillCameraOutput)
}

(The original declared sessionQueue as the result of dispatch_async(dispatch_get_main_queue(), nil), which does not compile; dispatch_async returns Void and requires a block. Using the main queue directly mirrors the Objective-C answer.)

And here is the second one:

var image = UIImage()
let stillImageConnection = stillCameraOutput.connectionWithMediaType(AVMediaTypeVideo)
stillImageConnection.videoOrientation = .Portrait
stillImageConnection.videoScaleAndCropFactor = 1.0
stillCameraOutput.captureStillImageAsynchronouslyFromConnection(stillImageConnection, completionHandler: { (imageDataSampleBuffer, error) in
    if error != nil {
        print("Error capturing image: \(error)")
    } else {
        let jpegData = AVCaptureStillImageOutput.jpegStillImageNSDataRepresentation(imageDataSampleBuffer)
        image = UIImage(data: jpegData)!
    }
})
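
For completeness, the second snippet would normally run from the metadata delegate callback, just as in the Objective-C answers. A minimal sketch of that wiring (Swift 2 era syntax to match the above; capturePhoto() is a hypothetical helper wrapping the second snippet, not part of AVFoundation):

```
// Assumes the class conforms to AVCaptureMetadataOutputObjectsDelegate
// and holds the stillCameraOutput property from the first section.
func captureOutput(captureOutput: AVCaptureOutput!, didOutputMetadataObjects metadataObjects: [AnyObject]!, fromConnection connection: AVCaptureConnection!) {
    guard let barcode = metadataObjects.first as? AVMetadataMachineReadableCodeObject else {
        return
    }
    print("Scanned value: \(barcode.stringValue)")
    // Trigger the still capture once a barcode has been detected.
    capturePhoto()  // hypothetical helper wrapping the second snippet
}
```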