I'm writing an application that operates on black & white images. I do this by passing an NSImage object into my method and then making an NSBitmapImageRep from the NSImage. Everything works, but it's quite slow. Here's my code:
- (NSImage *)skeletonization:(NSImage *)image
{
    NSInteger x = 0, y = 0;
    NSUInteger pixelVariable = 0;
    NSBitmapImageRep *bitmapImageRep = [[NSBitmapImageRep alloc] initWithData:[image TIFFRepresentation]];

    [myHelpText setIntValue:(int)[bitmapImageRep pixelsWide]];
    [myHelpText2 setIntValue:(int)[bitmapImageRep pixelsHigh]];

    NSColor *black = [NSColor blackColor];
    NSColor *white = [NSColor whiteColor];
    [myColor set];
    [myColor2 set];

    // Valid coordinates run from 0 to pixelsWide-1 / pixelsHigh-1,
    // so use < rather than <= to stay inside the bitmap.
    for (x = 0; x < [bitmapImageRep pixelsWide]; x++) {
        for (y = 0; y < [bitmapImageRep pixelsHigh]; y++) {
            // This is only to see if it's working
            [bitmapImageRep setColor:myColor atX:x y:y];
        }
    }

    [myColor release];
    [myColor2 release];

    NSImage *producedImage = [[NSImage alloc] init];
    [producedImage addRepresentation:bitmapImageRep];
    [bitmapImageRep release];

    return [producedImage autorelease];
}
So I tried to use CIImage instead, but I don't know how to access each pixel by its (x, y) coordinates, and that is really important for this algorithm.
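To make the requirement concrete, the access pattern my skeletonization pass needs looks roughly like the sketch below. It's only an illustration, not working code: the actual thinning decision is omitted, and I'm using NSBitmapImageRep's getPixel:atX:y: / setPixel:atX:y: just to show the shape of the reads and writes I need.

    // Sketch of the access pattern only: a thinning pass has to read a pixel
    // and some of its neighbours by (x, y) and then write a value back to (x, y).
    // The actual skeletonization decision is omitted here.
    NSUInteger pixel[4];      // up to 4 samples per pixel (e.g. RGBA)
    NSUInteger neighbour[4];
    for (NSInteger px = 1; px < [bitmapImageRep pixelsWide] - 1; px++) {
        for (NSInteger py = 1; py < [bitmapImageRep pixelsHigh] - 1; py++) {
            [bitmapImageRep getPixel:pixel atX:px y:py];            // read (x, y)
            [bitmapImageRep getPixel:neighbour atX:px - 1 y:py];    // read one neighbour
            // ... decide whether (px, py) stays black or becomes white ...
            [bitmapImageRep setPixel:pixel atX:px y:py];            // write the result back
        }
    }

Doing that one pixel at a time through NSColor objects (setColor:atX:y:) is what seems to be so slow, which is why I hoped CIImage would give me a faster way to do the same kind of per-pixel work.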