I have 2 portrait videos: one taken with the default iPhone camera, and a second one recorded inside my application using UIImagePickerController.

When I apply a CIFilter to the first video, the filter is applied perfectly. But when I apply it to the second video, the video gets zoomed in, half of the frame is blurred and stretched, and when I export it, the result is rotated.
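(For context, the two clips most likely differ in their metadata rather than their pixels: one stores its frames upright, while the other stores them rotated and records the upright orientation in the video track's preferredTransform. A quick way to compare the two is to log each track's natural size and transform; this is only a diagnostic sketch and assumes asset is already loaded:)

```objectivec
AVAssetTrack *track = [[asset tracksWithMediaType:AVMediaTypeVideo] firstObject];
CGAffineTransform t = track.preferredTransform;
// An identity transform (a=1, b=0, c=0, d=1) means the frames are stored upright;
// a 90-degree rotation (a=0, b=1, c=-1, d=0) is typical of portrait recordings.
NSLog(@"naturalSize: %@, transform: [a=%.1f b=%.1f c=%.1f d=%.1f]",
      NSStringFromCGSize(track.naturalSize), t.a, t.b, t.c, t.d);
```

If one video logs the identity transform and the other logs a 90-degree rotation, that metadata difference is what the filter handler ends up fighting with.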

My Code

AVAssetTrack *FirstAssetTrack = [[asset tracksWithMediaType:AVMediaTypeVideo] objectAtIndex:0];
CIFilter *filter = [CIFilter filterWithName:@"CIPhotoEffectInstant"];

player.currentItem.videoComposition = [AVVideoComposition videoCompositionWithAsset:asset applyingCIFiltersWithHandler:^(AVAsynchronousCIImageFilteringRequest *request) {
    // Clamp to avoid blurring transparent pixels at the image edges
    CIImage *source = [request.sourceImage imageByClampingToExtent];
    source = [source imageByApplyingTransform:FirstAssetTrack.preferredTransform];

    [filter setValue:source forKey:kCIInputImageKey];

    // Crop the filtered output to the bounds of the original image
    CIImage *output = [filter.outputImage imageByCroppingToRect:request.sourceImage.extent];

    // Provide the filter output to the composition
    [request finishWithImage:output context:nil];
}];

This code did not work for the second video, so I made some changes for it. The following is not final code, but I wanted to check the track's size and orientation and adjust for them. After the orientation changes it plays fine in AVPlayer, but when I export it, the video is still rotated:

AVPlayer plays video composition result incorrectly

I checked this link; we are both facing the same issue, so I changed my code accordingly, but it is still not working properly:

AVAssetTrack *FirstAssetTrack = [[asset tracksWithMediaType:AVMediaTypeVideo] objectAtIndex:0];
CIFilter *filter = [CIFilter filterWithName:@"CIPhotoEffectInstant"];

UIImageOrientation FirstAssetOrientation_ = UIImageOrientationUp;
BOOL isFirstAssetPortrait_ = NO;
CGAffineTransform firstTransform = FirstAssetTrack.preferredTransform;
if (firstTransform.a == 0 && firstTransform.b == 1.0 && firstTransform.c == -1.0 && firstTransform.d == 0) {
    FirstAssetOrientation_ = UIImageOrientationRight;
    isFirstAssetPortrait_ = YES;
}
if (firstTransform.a == 0 && firstTransform.b == -1.0 && firstTransform.c == 1.0 && firstTransform.d == 0) {
    FirstAssetOrientation_ = UIImageOrientationLeft;
    isFirstAssetPortrait_ = YES;
}
if (firstTransform.a == 1.0 && firstTransform.b == 0 && firstTransform.c == 0 && firstTransform.d == 1.0) {
    FirstAssetOrientation_ = UIImageOrientationUp;
}
if (firstTransform.a == -1.0 && firstTransform.b == 0 && firstTransform.c == 0 && firstTransform.d == -1.0) {
    FirstAssetOrientation_ = UIImageOrientationDown;
}

player.currentItem.videoComposition = [AVVideoComposition videoCompositionWithAsset:asset applyingCIFiltersWithHandler:^(AVAsynchronousCIImageFilteringRequest * _Nonnull request) {
    // Step 1: get the input frame image (screenshot 1)
    CIImage *sourceImage = request.sourceImage;

    // Step 2: rotate the frame
    CIFilter *transformFilter = [CIFilter filterWithName:@"CIAffineTransform"];
    [transformFilter setValue:sourceImage forKey:kCIInputImageKey];
    [transformFilter setValue:[NSValue valueWithCGAffineTransform:firstTransform] forKey:kCIInputTransformKey];
    sourceImage = transformFilter.outputImage;
    CGRect extent = sourceImage.extent;
    CGAffineTransform translation = CGAffineTransformMakeTranslation(-extent.origin.x, -extent.origin.y);
    [transformFilter setValue:sourceImage forKey:kCIInputImageKey];
    [transformFilter setValue:[NSValue valueWithCGAffineTransform:translation] forKey:kCIInputTransformKey];
    sourceImage = transformFilter.outputImage;

    // Step 3: apply the custom filter chosen by the user
    extent = sourceImage.extent;
    sourceImage = [sourceImage imageByClampingToExtent];
    [filter setValue:sourceImage forKey:kCIInputImageKey];
    sourceImage = filter.outputImage;
    sourceImage = [sourceImage imageByCroppingToRect:extent];

    // make the frame the same aspect ratio as the original input frame
    // by adding empty spaces at the top and the bottom of the extent rectangle
    CGFloat newHeight = 1920 * 1920 / extent.size.height;
    CGFloat inset = (extent.size.height - newHeight) / 2;
    extent = CGRectInset(extent, 0, inset);
    sourceImage = [sourceImage imageByCroppingToRect:extent];

    // scale down to the original frame size
    CGFloat scale = 1920 / newHeight;
    CGAffineTransform scaleTransform = CGAffineTransformMakeScale(scale, scale * 3.2);
    [transformFilter setValue:sourceImage forKey:kCIInputImageKey];
    [transformFilter setValue:[NSValue valueWithCGAffineTransform:scaleTransform] forKey:kCIInputTransformKey];
    sourceImage = transformFilter.outputImage;

    // translate the frame to make its origin start at (0, 0)
    CGAffineTransform translation1 = CGAffineTransformMake(1, 0, 0, 1, 0, 0);
    [transformFilter setValue:sourceImage forKey:kCIInputImageKey];
    [transformFilter setValue:[NSValue valueWithCGAffineTransform:translation1] forKey:kCIInputTransformKey];
    sourceImage = transformFilter.outputImage;

    // Step 4: finish processing the frame (screenshot 2)
    [request finishWithImage:sourceImage context:nil];
}];
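(As an aside, the exact floating-point equality checks above only work when preferredTransform is a pure 0/90/180/270-degree rotation; a slightly more tolerant sketch derives the rotation angle from the transform components instead. The variable names mirror the ones used above:)

```objectivec
CGAffineTransform t = FirstAssetTrack.preferredTransform;
// atan2 of the (b, a) components recovers the rotation the transform encodes
CGFloat angle = atan2(t.b, t.a) * 180.0 / M_PI;
// ~90 -> Right (portrait), ~-90 -> Left, ~0 -> Up, ~+/-180 -> Down
BOOL isPortrait = fabs(fabs(angle) - 90.0) < 1.0;
```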

1 Answer

Just remove your transform line; that is what causes the problem. A composition created with videoCompositionWithAsset:applyingCIFiltersWithHandler: appears to take the track's preferredTransform into account already, so request.sourceImage arrives in the correct orientation, and applying the transform a second time rotates the frame again.

Remove only this one line:

source = [source imageByApplyingTransform:FirstAssetTrack.preferredTransform];

And check. :)

AVAssetTrack *FirstAssetTrack = [[asset tracksWithMediaType:AVMediaTypeVideo] objectAtIndex:0];
CIFilter *filter = [CIFilter filterWithName:@"CIPhotoEffectInstant"];

player.currentItem.videoComposition = [AVVideoComposition videoCompositionWithAsset:asset applyingCIFiltersWithHandler:^(AVAsynchronousCIImageFilteringRequest *request) {
    // Clamp to avoid blurring transparent pixels at the image edges
    CIImage *source = [request.sourceImage imageByClampingToExtent];

    [filter setValue:source forKey:kCIInputImageKey];

    // Crop the filtered output to the bounds of the original image
    CIImage *output = [filter.outputImage imageByCroppingToRect:request.sourceImage.extent];

    // Provide the filter output to the composition
    [request finishWithImage:output context:nil];
}];