
My goal is to write a custom camera view controller that:

  1. Can take photos in all four interface orientations with both the back and, when available, front camera.
  2. Properly rotates and scales the preview "video" as well as the full resolution photo.
  3. Allows a (simple) effect to be applied to BOTH the preview "video" and full resolution photo.

My previous effort is documented in this question. My latest attempt was to modify Apple's sample GLVideoFrame (from WWDC 2010). However, I have not been able to get the iPhone 4 to display the preview "video" properly when the session preset is AVCaptureSessionPresetPhoto.

Has anyone tried this or know why the example doesn't work with this preset?

Apple's example uses a preset with 640x480 video dimensions and a default texture size of 1280x720. The iPhone 4 back camera delivers only 852x640 when the preset is AVCaptureSessionPresetPhoto.

iOS device camera video/photo dimensions when the preset is AVCaptureSessionPresetPhoto:

  • iPhone 4 back: video is 852x640 & photos are 2592x1936
  • iPhone 4 front: video & photos are 640x480
  • iPod Touch 4G back: video & photos are 960x720
  • iPod Touch 4G front: video & photos are 640x480
  • iPhone 3GS: video is 512x384 & photos are 2048x1536

Update

I got the same garbled video when I switched Brad Larson's ColorTracking example (blog post) to use AVCaptureSessionPresetPhoto.

I know you were in the class when I showed this off as an example of OpenGL ES 2.0 processing of images: sunsetlakesoftware.com/2010/10/22/… , but have you tried tweaking that for the presets you want? If I recall correctly, AVCaptureSessionPresetHigh returned a 720p video frame for the rear camera. – Brad Larson

I will definitely try that. Modifying the WWDC example code to use the 1280x720 video size caused it to choke, but the Photo preset actually delivers frames at a lower resolution than that. It's all garbled, so it's not clear whether it's processing them at 15-30 FPS or not. – gerry3

Do you have both the front- and back-facing cameras attached to your capture session? Maybe it's using some kind of lowest-common-denominator image size. – Brad Larson

To be clear, I wasn't questioning or confused about the video dimensions, I was just letting everyone know what they were. I will add all of them to the question. – gerry3

@Dex can you repost your answer here, or a summary of it with a link to the full one? I haven't had a chance to try it yet, but it's the best I've seen, so I'd like to have something to accept. – gerry3

2 Answers

1 vote

The issue is that AVCaptureSessionPresetPhoto is now context-aware: it delivers different resolutions depending on whether you are streaming video frames or capturing a still image.

The live preview is different for this preset because it pads each row of the pixel buffer with extra bytes (the bytes-per-row value is larger than width × 4). I'm guessing this is some sort of hardware optimization.

In any case, you can see how I solved the problem here:

iOS CVImageBuffer distorted from AVCaptureSessionDataOutput with AVCaptureSessionPresetPhoto

-2 votes

AVCaptureSessionPresetPhoto is intended for taking still pictures, not for capturing a live video feed. You can read about it here: http://developer.apple.com/library/ios/#documentation/AudioVideo/Conceptual/AVFoundationPG/Articles/03_MediaCapture.html

(My belief is that these are actually two different cameras or sensors, as they behave very differently, and there is a delay of a couple of seconds just to switch between the Photo preset and, say, 640x480.)

You can't even use both presets at the same time, and switching between them is a headache as well - see How to get both the video output and full photo resolution image in AVFoundation Framework.

HTH, although not what you wanted to hear...

Oded.