5 votes

I have some objects in a scene and I'm using OrbitControls to move the camera around it. I found a camera position/rotation that gives a nice rendering, so I read the camera's position and rotation and tried to enter them directly into Blender to get the same view.

The position pasted from three.js into the Blender camera works well: I get the same position as the orbited camera in three.js. The rotation, however, behaves differently, even keeping in mind that Y and Z are switched. Shouldn't it work the same way (just with Y and Z swapped)?

If not, how do I convert those values so that the Blender camera ends up in the same position as in three.js? Here is a post I wrote on GitHub with some screenshots: https://github.com/mrdoob/three.js/issues/3348 How should I read the camera rotation so that it rotates the Blender camera the same way as in three.js? I am just reading camera.rotation.x, .y and .z. Should I read the rotation from some global/local matrix, or in another way?

Many thanks for any help.

Maciej


1 Answer

0 votes

To go from one piece of software to another, you need to know the coordinate system each one uses. For example, three.js uses a coordinate system where positive X goes to the right, positive Y goes up, and positive Z comes out of the screen. From what I see at http://en.wikibooks.org/wiki/Blender_3D:_Noob_to_Pro/Understanding_Coordinates (I have not used Blender), in Blender positive X goes to the right, positive Y goes into the screen, and positive Z goes up.
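Concretely, that means a point (x, y, z) in three.js coordinates should land at (x, -z, y) in Blender coordinates. A minimal sketch of that mapping:

    def three_to_blender(x, y, z):
        """Map a point from three.js axes (Y up, Z toward the viewer)
        to Blender axes (Z up, Y into the screen)."""
        return (x, -z, y)

    # A point one unit above the ground and three units toward the viewer in three.js:
    print(three_to_blender(1.0, 2.0, 3.0))  # -> (1.0, -3.0, 2.0)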

So they are different, but if you rotate the three.js system by 90 degrees around the X axis you get the Blender coordinate system.
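Building on that, here is a rough sketch of how the conversion could be scripted on the Blender side (Blender 2.8+ Python with mathutils; older versions use * instead of @ for matrix multiplication). The numeric values are hypothetical placeholders standing in for whatever you copy from camera.position and camera.rotation in three.js, and it assumes three.js' default 'XYZ' Euler order plus Blender's default camera object named 'Camera':

    import math

    import bpy
    from mathutils import Matrix, Vector

    # Hypothetical values copied from three.js:
    # camera.position.x/y/z and camera.rotation.x/y/z (radians, default 'XYZ' order).
    px, py, pz = 5.0, 3.0, 8.0
    rx, ry, rz = -0.358, 0.554, 0.192

    # three.js builds its 'XYZ' Euler rotation as Rx * Ry * Rz.
    r_three = (Matrix.Rotation(rx, 3, 'X') @
               Matrix.Rotation(ry, 3, 'Y') @
               Matrix.Rotation(rz, 3, 'Z'))

    # Change of basis: rotating the three.js world +90 degrees about X gives Blender's world.
    c = Matrix.Rotation(math.radians(90.0), 3, 'X')

    # Both cameras look down their local -Z with +Y up, so only the world frame changes.
    r_blender = c @ r_three
    p_blender = c @ Vector((px, py, pz))  # same as (px, -pz, py)

    cam = bpy.data.objects['Camera']      # assumes the default camera object name
    cam.location = p_blender
    cam.rotation_euler = r_blender.to_euler('XYZ')

Note that camera.rotation in three.js is the local rotation; with OrbitControls the camera is normally a direct child of the scene, so local and world rotation coincide. If your camera is nested under a rotated parent, read the world orientation from camera.matrixWorld instead.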