I'm capturing a 1px x 1px screenshot with the following code, then sampling the image to get an NSColor.

// Capture a 1x1 px screenshot of the pixel at (0, 0) on the given display
CGImageRef image = CGDisplayCreateImageForRect(displayID, CGRectMake(0, 0, 1, 1));

// Wrap the CGImage in a bitmap rep and sample the single pixel
NSBitmapImageRep *rawImage = [[NSBitmapImageRep alloc] initWithCGImage:image];
NSColor *convertedColor = [rawImage colorAtX:0 y:0];
CGImageRelease(image); // the bitmap rep keeps its own copy, so release the CGImage

Everything works correctly except that the color of the captured pixel does not match the color in the source image. For example: I make a red square in Photoshop and set its color to RGB red: 255, green: 0, blue: 0. When I capture a pixel inside that square on the screen, I get RGB red: 255, green: 0, blue: 17. Is there a way to make the captured image match the original color?

Edit: Also, when I take a screenshot of the red square in Photoshop and read the RGB color from the screenshot, the color is also red: 255, green: 0, blue: 17. It seems to be color space behavior in the OS. Would this be the expected behavior? This code is basically a color picker like the one in Photoshop, but I'm assuming Photoshop corrects the color somewhere in the process. Any ideas?
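
For reference, I've also been experimenting with converting the sampled color into an explicit color space before reading its components. This is just a rough sketch, and I'm not sure sRGB is even the right target space; that's part of what I'm asking:

#import <AppKit/AppKit.h>

// Convert the sampled color to an explicit color space (sRGB here, as an assumption)
// before comparing its component values
NSColor *srgbColor = [convertedColor colorUsingColorSpace:[NSColorSpace sRGBColorSpace]];
CGFloat r, g, b, a;
[srgbColor getRed:&r green:&g blue:&b alpha:&a];
NSLog(@"sRGB components: r=%.3f g=%.3f b=%.3f", r, g, b);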

Thanks!


1 Answer


Try sampling a pixel a bit deeper inside the area, rather than the very first pixel, when capturing the 1px screenshot, like this:

CGImageRef image = CGDisplayCreateImageForRect(displayID, CGRectMake(5, 5, 1, 1)); // another pixel than the first pixel
NSBitmapImageRep *rawImage = [[NSBitmapImageRep alloc] initWithCGImage:image];
NSColor *convertedColor = [rawImage colorAtX:0 y:0];
CGImageRelease(image); // release the CGImage created above