8
votes

I want to know if there is any method to render a view to an image, just like a screenshot, but where I can specify any view. I know how to do it in iOS (Saving view content), but how can I do it in macOS?

3 Answers

15
votes

There are several ways, but none of them works perfectly. Many of the problems are connected with layer-backed views.

If you don't have layer-backed or layer-hosting views, you can use this code:

    NSData* data = [mainView dataWithPDFInsideRect:[mainView bounds]];
    NSImage *img = [[NSImage alloc] initWithData:data];
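
If the goal is to save the captured view to disk (as the question mentions), the resulting NSImage can be written out, for example as a PNG. This is only a minimal sketch, not part of the original answer; the output path is just an example, and NSBitmapImageFileTypePNG is the modern name of the constant (NSPNGFileType on older SDKs):

    // Capture the view as PDF data, wrap it in an NSImage, and write a PNG to disk.
    NSData *data = [mainView dataWithPDFInsideRect:[mainView bounds]];
    NSImage *img = [[NSImage alloc] initWithData:data];
    NSBitmapImageRep *rep = [NSBitmapImageRep imageRepWithData:[img TIFFRepresentation]];
    NSData *pngData = [rep representationUsingType:NSBitmapImageFileTypePNG properties:@{}];
    [pngData writeToFile:@"/tmp/view-snapshot.png" atomically:YES]; // example path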

If you work with layer-based views:

Until 10.8, the best way for me was:

    // Create a bitmap rep sized to the view, draw the view into it,
    // and wrap the result in an NSImage.
    NSSize imgSize = mapImage.bounds.size;
    NSBitmapImageRep *bir = [mapImage bitmapImageRepForCachingDisplayInRect:[mapImage bounds]];
    [bir setSize:imgSize];
    [mapImage cacheDisplayInRect:[mapImage bounds] toBitmapImageRep:bir];
    NSImage *cacheImg = [[NSImage alloc] initWithSize:imgSize];
    [cacheImg addRepresentation:bir];

But in 10.8, bitmapImageRepForCachingDisplayInRect: stopped working with layer-backed views.

There is also the option of taking a screenshot of the screen and cutting out your view:

    + (NSImage *)screenCacheImageForView:(NSView *)aView
    {
        // Convert the view's bounds into window (content view) coordinates.
        NSRect originRect = [aView convertRect:[aView bounds] toView:[[aView window] contentView]];

        // Translate into the global, top-left-origin coordinate space that
        // CGWindowListCreateImage expects.
        NSRect rect = originRect;
        rect.origin.y = 0;
        rect.origin.x += [aView window].frame.origin.x;
        rect.origin.y += [[aView window] screen].frame.size.height - [aView window].frame.origin.y - [aView window].frame.size.height;
        rect.origin.y += [aView window].frame.size.height - originRect.origin.y - originRect.size.height;

        // Grab just this window's pixels from the window server.
        CGImageRef cgimg = CGWindowListCreateImage(rect,
                                                   kCGWindowListOptionIncludingWindow,
                                                   (CGWindowID)[[aView window] windowNumber],
                                                   kCGWindowImageDefault);
        NSImage *image = [[NSImage alloc] initWithCGImage:cgimg size:[aView bounds].size];
        CGImageRelease(cgimg);
        return image;
    }

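Note that CGWindowListCreateImage reads pixels from the window server, so the window has to be actually visible on screen for this to capture anything. A minimal call-site sketch (assuming the method above is defined on the calling class; someView and imageView are placeholder names, not from the original answer):

    NSImage *snapshot = [[self class] screenCacheImageForView:someView];
    imageView.image = snapshot; // e.g. show the capture in an NSImageView
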
2
votes

I'm using this code in my app and it works great.

However, if I drag the app to a secondary monitor, the screen capture is thrown off and the wrong rect is captured.

The following code fixes that issue (I've also removed some redundant maths from the calculation, where a value was being subtracted and then added back), in case anyone comes across this in the future:

    NSRect originRect = [aView convertRect:[aView bounds] toView:[[aView window] contentView]];

    NSArray *screens = [NSScreen screens];
    // The first screen in the array is the primary screen; CGWindowListCreateImage
    // uses global display coordinates anchored to it, not to the window's own screen.
    NSScreen *primaryScreen = [screens objectAtIndex:0];

    NSRect rect = originRect;
    rect.origin.y = 0;
    rect.origin.x += [aView window].frame.origin.x;
    rect.origin.y = primaryScreen.frame.size.height - [aView window].frame.origin.y - originRect.origin.y - originRect.size.height;

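For completeness, a sketch of how this corrected rect plugs back into the same capture call used in the accepted answer (assuming it replaces the rect computation inside that screenCacheImageForView: method):

    // Same capture call as in the accepted answer, now fed the corrected rect.
    CGImageRef cgimg = CGWindowListCreateImage(rect,
                                               kCGWindowListOptionIncludingWindow,
                                               (CGWindowID)[[aView window] windowNumber],
                                               kCGWindowImageDefault);
    NSImage *image = [[NSImage alloc] initWithCGImage:cgimg size:[aView bounds].size];
    CGImageRelease(cgimg);
    return image;
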
2
votes

The answer from @Remizorr works in Objective-C as explained, even under macOS High Sierra 10.13.4 / Xcode 9.2.

It also works for layer-backed views (wantsLayer = true).

Here is the updated Swift version:

    extension NSView {

        func snapshotImage() -> NSImage? {

            guard let window = window,
                  let screen = window.screen,
                  let contentView = window.contentView else { return nil }

            let originRect = self.convert(self.bounds, to: contentView)
            var rect = originRect
            rect.origin.x += window.frame.origin.x
            rect.origin.y = 0
            rect.origin.y += screen.frame.size.height - window.frame.origin.y - window.frame.size.height
            rect.origin.y += window.frame.size.height - originRect.origin.y - originRect.size.height

            guard window.windowNumber > 0 else { return nil }
            guard let cgImage = CGWindowListCreateImage(rect, .optionIncludingWindow,
                                                        CGWindowID(window.windowNumber),
                                                        CGWindowImageOption.bestResolution) else { return nil }

            return NSImage(cgImage: cgImage, size: self.bounds.size)
        }
    }

NOTE: This works even for a WKWebView, and it does capture a video playing inside a WKWebView (which WKWebView's takeSnapshot does not) ;).