
I am developing an augmented reality app for Project Tango using Unity3d.

Since I want virtual objects to interact with the real world, I used the Meshing with Physics scene from the examples as my basis and placed the Tango AR Camera prefab inside the Tango Delta Camera (at the relative position (0, 0, 0)).

I found that I have to rotate the AR Camera up by about 17° so that the Dynamic Mesh matches the room; however, there is still a significant offset from the camera's live preview.

I was wondering if anyone who has dealt with this before could share their solution for aligning the Dynamic Mesh with the real world.

Screenshot

How can I align the virtual world with the camera image?


1 Answer


I'm having similar issues. It looks like this is related to a couple of previously-answered questions:

Point cloud rendered only partially

Point Cloud Unity example only renders points for the upper half of display

You need to take into account the color camera's offset from the device origin, which requires the color camera's pose relative to the device. You can't query that pose directly, but you can get the device pose in the IMU frame and the color camera pose in the IMU frame, and compose the two to work out the color camera's pose in the device frame. The links above show example code.
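The composition described above amounts to device_T_color = (imu_T_device)⁻¹ · imu_T_color. Here is a minimal sketch of that matrix math in plain Python; the pose values are made-up placeholders, not real Tango calibration data, and in an actual app you would query the two IMU-relative poses from Tango's pose API instead:

```python
import math

# Sketch of composing the device -> color-camera extrinsic:
#   device_T_color = inverse(imu_T_device) * imu_T_color
# All transforms are 4x4 rigid transforms [R | p; 0 0 0 1].

def mat_mul(a, b):
    """Multiply two 4x4 matrices."""
    return [[sum(a[i][k] * b[k][j] for k in range(4)) for j in range(4)]
            for i in range(4)]

def invert_rigid(t):
    """Invert a rigid transform: inverse of [R | p] is [R^T | -R^T p]."""
    r = [[t[j][i] for j in range(3)] for i in range(3)]  # R transposed
    p = [-sum(r[i][j] * t[j][3] for j in range(3)) for i in range(3)]
    return [r[0] + [p[0]], r[1] + [p[1]], r[2] + [p[2]], [0.0, 0.0, 0.0, 1.0]]

def rigid_x(deg, translation):
    """Rigid transform: rotate about the x-axis by `deg`, translate by `translation`."""
    c, s = math.cos(math.radians(deg)), math.sin(math.radians(deg))
    tx, ty, tz = translation
    return [[1.0, 0.0, 0.0, tx],
            [0.0, c,  -s,  ty],
            [0.0, s,   c,  tz],
            [0.0, 0.0, 0.0, 1.0]]

# Placeholder poses (NOT real calibration values) standing in for what
# Tango would report for IMU->device and IMU->color-camera:
imu_T_device = rigid_x(0.0, (0.0, 0.0, 0.0))
imu_T_color = rigid_x(13.0, (0.061, 0.004, -0.001))

# Color camera pose expressed in the device frame:
device_T_color = mat_mul(invert_rigid(imu_T_device), imu_T_color)
```

The resulting device_T_color is the fixed extrinsic you would apply to the AR Camera so that rendered geometry lines up with the color image.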

You should be looking at something like (in Unity coordinates) an offset of (0.061, 0.004, -0.001) and a rotation of about 13 degrees up around the x-axis.

When I try to use the examples, I get broken rotations, so take these numbers with a pinch of salt. I'm also seeing small rotations around the y- and z-axes, which don't match what I'd expect.