2
votes

I'm trying to implement exact pinch zooming in a 3D scene on a touch device. The 3D scene is just a flat plane, and I have a perspective camera aimed diagonally down at it, as in this image, for example:

http://imggle.com/wp-content/uploads/2013/07/google-earth-images-64.jpg

When a user starts a pinch gesture, I project the screen coordinates under his fingers into world coordinates on the plane.
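For reference, here is roughly how I do that projection (a simplified sketch, assuming the plane is at y = 0, simd-style column-major matrices, and made-up function and parameter names):

    import simd

    // Unproject a screen point onto the ground plane (y = 0).
    // Returns nil if the ray misses the plane.
    func unprojectToGroundPlane(screenPoint: SIMD2<Float>,
                                viewportSize: SIMD2<Float>,
                                viewMatrix: float4x4,
                                projectionMatrix: float4x4) -> SIMD3<Float>? {
        // Screen coordinates -> normalized device coordinates (-1...1), y flipped.
        let ndc = SIMD2<Float>(2 * screenPoint.x / viewportSize.x - 1,
                               1 - 2 * screenPoint.y / viewportSize.y)

        let invViewProj = (projectionMatrix * viewMatrix).inverse

        // Unproject one point on the near plane and one on the far plane.
        var near = invViewProj * SIMD4<Float>(ndc.x, ndc.y, -1, 1)
        var far  = invViewProj * SIMD4<Float>(ndc.x, ndc.y,  1, 1)
        near /= near.w
        far  /= far.w

        // Intersect the ray (near -> far) with the plane y = 0.
        let origin = SIMD3<Float>(near.x, near.y, near.z)
        let direction = SIMD3<Float>(far.x, far.y, far.z) - origin
        guard abs(direction.y) > 1e-6 else { return nil }  // ray parallel to the plane
        let t = -origin.y / direction.y
        guard t >= 0 else { return nil }                    // intersection behind the camera
        return origin + t * direction
    }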

Now, how could I solve for the camera position so that, no matter where the user moves his fingers, the same world coordinates would remain "under his fingertips", i.e. projected onto the same points in screen space?

Any thoughts are much appreciated.


1 Answer

0
votes

Having just implemented something very similar in an OpenGL ES app, I would recommend using a pan gesture recogniser and then updating your model-view matrix based on the number of pixels in the pan.
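Roughly like this (a simplified sketch, not a complete implementation; the class, the camera field and the calibration factor are placeholders):

    import UIKit

    class PanController {
        var cameraPosition = SIMD3<Float>(0, 10, 10)
        // Calibrated so that one screen pixel corresponds to roughly one
        // "pixel" worth of movement in the scene at the current zoom level.
        var worldUnitsPerPixel: Float = 0.01

        @objc func handlePan(_ recognizer: UIPanGestureRecognizer) {
            let translation = recognizer.translation(in: recognizer.view)

            // Move the camera opposite to the finger motion so the scene follows the finger.
            cameraPosition.x -= Float(translation.x) * worldUnitsPerPixel
            cameraPosition.z -= Float(translation.y) * worldUnitsPerPixel

            // Reset so each callback delivers only the incremental delta.
            recognizer.setTranslation(.zero, in: recognizer.view)
        }
    }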

You will need to calibrate your pan against the point where one pixel on the screen corresponds to roughly one pixel in your OpenGL ES scene (i.e. you get like-for-like dragging). This is easy to do if you are allowing the user to zoom in and out anyway, but if not you may need to implement this temporarily in your app.
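One way to get that calibration factor (a rough sketch, reusing the unprojection helper from your question) is to unproject two neighbouring screen pixels onto the plane and measure the distance between them:

    import simd

    // Estimate how many world units one screen pixel spans at the current
    // camera position, by unprojecting two adjacent pixels at the screen centre.
    func estimateWorldUnitsPerPixel(viewportSize: SIMD2<Float>,
                                    viewMatrix: float4x4,
                                    projectionMatrix: float4x4) -> Float? {
        let centre = viewportSize / 2
        guard
            let p0 = unprojectToGroundPlane(screenPoint: centre,
                                            viewportSize: viewportSize,
                                            viewMatrix: viewMatrix,
                                            projectionMatrix: projectionMatrix),
            let p1 = unprojectToGroundPlane(screenPoint: centre + SIMD2<Float>(1, 0),
                                            viewportSize: viewportSize,
                                            viewMatrix: viewMatrix,
                                            projectionMatrix: projectionMatrix)
        else { return nil }
        return simd_distance(p0, p1)
    }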