I have a subclass of NSView where I'm handling the -mouseDown: event to get the position of the click. From this position I define a point that I use to draw a rect in -drawRect:, and it works fine.
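For context, here's a minimal sketch of the kind of handler I mean (class, property, and color names are illustrative, not my exact code):

```objc
// CustomView.m — minimal sketch of the mouseDown/drawRect setup.
#import <Cocoa/Cocoa.h>

@interface CustomView : NSView
@property (nonatomic) NSPoint clickPoint;
@end

@implementation CustomView

- (void)mouseDown:(NSEvent *)theEvent
{
    // Convert from the window's coordinate space to this view's.
    self.clickPoint = [self convertPoint:[theEvent locationInWindow]
                                fromView:nil];
    [self setNeedsDisplay:YES];
}

- (void)drawRect:(NSRect)dirtyRect
{
    [[NSColor redColor] set];
    NSRectFill(NSMakeRect(self.clickPoint.x, self.clickPoint.y, 10.0, 10.0));
}

@end
```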
BUT... when I set wantsLayer to YES, it stops working. When I get the position of the click, the Y coordinate is increased by 20 points and I don't know what's happening. Can anyone explain? How do I fix this?
Simulation:
I click at coordinate x: 100, y: 100, and -drawRect: draws the rect at x: 100, y: 100. That's what I want.
With setWantsLayer:YES
I click at coordinate x: 100, y: 100, and -drawRect: draws the rect at x: 100, y: 120 (or something close to that).
Is it possible to use CALayers without setting -setWantsLayer: to YES? I'm trying to figure this out, but I have no idea what's happening. I need your help.
UPDATE: I've been running a lot of tests. I can now say the problem is with -mouseDown: on NSView: once I call -setWantsLayer: with YES, it no longer works as expected.
My window contains a Custom View, and I created a subclass of NSView and set it as the Custom View's class. The Custom View is at position (0, 20). The coordinate system is not flipped.
I believe that when the NSView wants a layer, -mouseDown: treats the view's frame as if it were at position (0, 0) (in other words, it uses the NSWindow's frame) instead of (0, 20). When that happens, every position from -mouseDown: gets a 20-point increase on the Y axis. I don't know if this explanation is right, but it matches the results of my tests.
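The tests above can be sketched roughly like this (a logging-only handler inside the NSView subclass; names are illustrative):

```objc
// Logging sketch used to compare window vs. view coordinates.
- (void)mouseDown:(NSEvent *)theEvent
{
    NSPoint inWindow = [theEvent locationInWindow];
    NSPoint inView   = [self convertPoint:inWindow fromView:nil];
    // Without wantsLayer, the Y values differ by the view's
    // y-origin (20). With wantsLayer set to YES, the converted
    // point comes back 20 points too high on the Y axis — as if
    // the view's frame were at (0, 0).
    NSLog(@"window: %@  view: %@",
          NSStringFromPoint(inWindow), NSStringFromPoint(inView));
}
```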
Can someone help me figure this out?