I have been attempting to do some real-time video image processing in MonoTouch. I'm using an AVCaptureSession to get frames from the camera, which works fine with an AVCaptureVideoPreviewLayer.
I also successfully get the callback method "DidOutputSampleBuffer" in my delegate class. However, every way I have tried to create a UIImage from the resulting CMSampleBuffer fails.
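The delegate itself is nothing special; trimmed down, it is just a subclass of AVCaptureVideoDataOutputSampleBufferDelegate holding a reference back to my controller (the controller type name below is just a placeholder):

public class videoOutputDelegate : AVCaptureVideoDataOutputSampleBufferDelegate
{
    CameraController controller;   // placeholder name for whatever creates the session

    public videoOutputDelegate (CameraController controller)
    {
        this.controller = controller;
    }

    public override void DidOutputSampleBuffer (AVCaptureOutput captureOutput, CMSampleBuffer sampleBuffer, AVCaptureConnection connection)
    {
        // the conversion attempts shown below go here
    }
}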
Here is my code setting up the capture session:
captureSession = new AVCaptureSession ();
captureSession.BeginConfiguration ();

videoCamera = AVCaptureDevice.DefaultDeviceWithMediaType (AVMediaType.Video);

if (videoCamera != null)
{
    captureSession.SessionPreset = AVCaptureSession.Preset1280x720;

    videoInput = AVCaptureDeviceInput.FromDevice (videoCamera);
    if (videoInput != null)
        captureSession.AddInput (videoInput);

    videoCapDelegate = new videoOutputDelegate (this);
    DispatchQueue queue = new DispatchQueue ("videoFrameQueue");

    videoOutput = new AVCaptureVideoDataOutput ();
    videoOutput.SetSampleBufferDelegateAndQueue (videoCapDelegate, queue);
    videoOutput.AlwaysDiscardsLateVideoFrames = true;
    videoOutput.VideoSettings.PixelFormat = CVPixelFormatType.CV24RGB;

    captureSession.AddOutput (videoOutput);
    videoOutput.ConnectionFromMediaType (AVMediaType.Video).VideoOrientation = AVCaptureVideoOrientation.Portrait;

    previewLayer = AVCaptureVideoPreviewLayer.FromSession (captureSession);
    previewLayer.Frame = UIScreen.MainScreen.Bounds;
    previewLayer.AffineTransform = CGAffineTransform.MakeRotation (Convert.DegToRad (-90));
    //this.View.Layer.AddSublayer (previewLayer);

    captureSession.CommitConfiguration ();
    captureSession.StartRunning ();
}
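For debugging, something along these lines should show what the buffer actually contains once frames start arriving (assuming MonoTouch's CVPixelBuffer exposes PixelFormatType the way I expect):

// Inside DidOutputSampleBuffer, before building any context:
var pb = sampleBuffer.GetImageBuffer () as CVPixelBuffer;
Debug.Print ("format={0} size={1}x{2} bytesPerRow={3}",
    pb.PixelFormatType, pb.Width, pb.Height, pb.BytesPerRow);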
I have tried creating a CGBitmapContext from a CVPixelBuffer cast from the sample buffer's image buffer, like so:
public override void DidOutputSampleBuffer (AVCaptureOutput captureOutput, MonoTouch.CoreMedia.CMSampleBuffer sampleBuffer, AVCaptureConnection connection)
{
    CVPixelBuffer pixelBuffer = sampleBuffer.GetImageBuffer () as CVPixelBuffer;

    CVReturn flag = pixelBuffer.Lock (0);
    if (flag == CVReturn.Success)
    {
        CGBitmapContext context = new CGBitmapContext
        (
            pixelBuffer.BaseAddress,
            pixelBuffer.Width,
            pixelBuffer.Height,
            8,
            pixelBuffer.BytesPerRow,
            CGColorSpace.CreateDeviceRGB (),
            CGImageAlphaInfo.PremultipliedFirst
        );

        UIImage image = new UIImage (context.ToImage ());
        ProcessImage (image);

        pixelBuffer.Unlock (0);
    }
    else
        Debug.Print (flag.ToString ());

    sampleBuffer.Dispose ();
}
This results in the following error:
<Error>: CGBitmapContextCreate: invalid data bytes/row: should be at least 2880 for 8 integer bits/component, 3 components, kCGImageAlphaPremultipliedFirst.
Even with some tweaking of the parameters, I either get an invalid handle exception or a segfault in native Objective-C.
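Presumably the 2880 comes from 720 pixels × 4 bytes per pixel, i.e. the context parameters describe a 4-byte-per-pixel ARGB row while the buffer's actual BytesPerRow is smaller. For reference, the variant I would expect to work (going by Apple's AVFoundation samples; I haven't verified it from MonoTouch, and the CGBitmapFlags constructor overload is an assumption on my part) asks the output for 32BGRA and describes the context to match:

// In the session setup: ask for BGRA frames instead of 24-bit RGB
// (4 bytes per pixel, matching the context below).
videoOutput.VideoSettings.PixelFormat = CVPixelFormatType.CV32BGRA;

// In the delegate: describe the buffer to CoreGraphics as little-endian BGRA.
public override void DidOutputSampleBuffer (AVCaptureOutput captureOutput, CMSampleBuffer sampleBuffer, AVCaptureConnection connection)
{
    var pixelBuffer = sampleBuffer.GetImageBuffer () as CVPixelBuffer;
    if (pixelBuffer != null && pixelBuffer.Lock (0) == CVReturn.Success)
    {
        using (var colorSpace = CGColorSpace.CreateDeviceRGB ())
        using (var context = new CGBitmapContext (
            pixelBuffer.BaseAddress,
            pixelBuffer.Width,
            pixelBuffer.Height,
            8,
            pixelBuffer.BytesPerRow,      // should now be at least Width * 4
            colorSpace,
            CGBitmapFlags.PremultipliedFirst | CGBitmapFlags.ByteOrder32Little))
        using (var cgImage = context.ToImage ())
        {
            ProcessImage (new UIImage (cgImage));
        }
        pixelBuffer.Unlock (0);
    }
    sampleBuffer.Dispose ();
}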
I have also tried simply creating a CIImage from the CVImageBuffer and then a UIImage from that, like so:
public override void DidOutputSampleBuffer (AVCaptureOutput captureOutput, MonoTouch.CoreMedia.CMSampleBuffer sampleBuffer, AVCaptureConnection connection)
{
    CIImage cImage = new CIImage (sampleBuffer.GetImageBuffer ());
    UIImage image = new UIImage (cImage);

    ProcessImage (image);

    sampleBuffer.Dispose ();
}
This results in an exception when initializing the CIImage:
NSInvalidArgumentException Reason: -[CIImage initWithCVImageBuffer:]: unrecognized selector sent to instance 0xc821d0
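To check whether the underlying selector even exists on iOS, I'm thinking of probing it roughly like this (assuming CIImage.EmptyImage and RespondsToSelector are usable for this; that's a guess on my part):

using MonoTouch.CoreImage;
using MonoTouch.ObjCRuntime;

// Does an iOS CIImage instance claim to implement initWithCVImageBuffer: at all?
// (The selector exists on the Mac; I suspect it doesn't on iOS.)
bool implemented = CIImage.EmptyImage.RespondsToSelector (
    new Selector ("initWithCVImageBuffer:"));
Debug.Print ("initWithCVImageBuffer: implemented = {0}", implemented);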
This honestly feels like some sort of bug in MonoTouch, but if I'm missing something or am just going about this in a weird way, please let me know of some alternative solutions.
Thanks.