I am getting a rotation matrix and orientation (Euler angles) from a sensor on an Android device. I want to use these in OpenCV for a perspective transformation. cv2.warpPerspective uses a homography matrix to do its job. My question is: how do I convert the rotation matrix or the orientation array into a homography matrix that I can pass to warpPerspective?
Android code to get the rotation matrix and orientation:
final float[] rotationMatrix = new float[9];
SensorManager.getRotationMatrix(rotationMatrix, null, accelerometerReading, magnetometerReading);
final float[] orientationAngles = new float[3];
SensorManager.getOrientation(rotationMatrix, orientationAngles);
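On the Python/OpenCV side, the 9-element array filled by getRotationMatrix is row-major, so it can be reshaped directly into a 3x3 matrix. A minimal sketch, assuming you send the floats over as a plain list (the values below are the sample matrix from this question):

```python
import numpy as np

# Row-major float[9] as filled by SensorManager.getRotationMatrix
# (sample values from the question)
rotation_values = [-0.39098227, -0.24775778, 0.8864249,
                   0.9200034, -0.07699536, 0.38427263,
                   -0.026955934, 0.96575755, 0.2580418]

# Reshape recovers the 3x3 rotation matrix because the array is row-major
R = np.array(rotation_values, dtype=np.float64).reshape(3, 3)
```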
OpenCV code for the perspective transform:
homographyMatrix = ...  # to be computed from the rotation matrix or orientation angles
warped = cv2.warpPerspective(img, homographyMatrix, (cols, 600))
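One standard way to turn a pure camera rotation into a homography is H = K R K⁻¹, where K is the camera intrinsic matrix. A sketch with assumed intrinsics (the focal length f and principal point cx, cy below are placeholders; use your device's real calibration):

```python
import numpy as np

# Sample rotation matrix from the sensor (values from the question)
R = np.array([
    [-0.39098227, -0.24775778, 0.8864249],
    [0.9200034,   -0.07699536, 0.38427263],
    [-0.026955934, 0.96575755, 0.2580418],
], dtype=np.float64)

# Assumed camera intrinsics -- replace f, cx, cy with calibrated values
f, cx, cy = 800.0, 320.0, 240.0
K = np.array([
    [f, 0, cx],
    [0, f, cy],
    [0, 0,  1],
], dtype=np.float64)

# Homography induced by a pure rotation: H = K * R * K^-1
H = K @ R @ np.linalg.inv(K)
H /= H[2, 2]  # normalize so the bottom-right entry is 1

# warped = cv2.warpPerspective(img, H, (cols, 600))
```

Note this only models rotation about the camera center; any translation of the device is not captured by this homography, and you may also need to remap the sensor coordinate frame to the camera frame before applying it.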
Sample rotation matrix:
[
[-0.39098227, -0.24775778, 0.8864249],
[0.9200034, -0.07699536, 0.38427263],
[-0.026955934, 0.96575755, 0.2580418]
]
Sample Euler angles (radians):
[1.3097044, 0.0269592, 1.97264932]
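If you prefer to start from the Euler angles, you can rebuild a rotation matrix from them. SensorManager.getOrientation returns [azimuth, pitch, roll]; the sketch below assumes a Z-X-Y composition with the sign conventions shown, which you should verify against the Android documentation for your setup:

```python
import numpy as np

# Sample values from the question: azimuth, pitch, roll (radians)
azimuth, pitch, roll = 1.3097044, 0.0269592, 1.97264932

def rot_x(a):
    c, s = np.cos(a), np.sin(a)
    return np.array([[1, 0, 0], [0, c, -s], [0, s, c]])

def rot_y(a):
    c, s = np.cos(a), np.sin(a)
    return np.array([[c, 0, s], [0, 1, 0], [-s, 0, c]])

def rot_z(a):
    c, s = np.cos(a), np.sin(a)
    return np.array([[c, -s, 0], [s, c, 0], [0, 0, 1]])

# Assumed axis order and signs -- check SensorManager.getOrientation's docs
R = rot_z(-azimuth) @ rot_x(-pitch) @ rot_y(roll)
```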
Image to be transformed:
Desired transform (the cut-off area on the left doesn't matter, I can fix it):
Afterwards I will tile a floor texture onto a segmented image.