I am trying to convert the five values returned by the Rotation Vector sensor type into azimuth, pitch, and roll. The code I am using to do so is the following:
@Override
public void onSensorChanged(SensorEvent event) {
    double[] g = convertFloatsToDoubles(event.values.clone());

    double norm = Math.sqrt(g[0] * g[0] + g[1] * g[1] + g[2] * g[2] + g[3] * g[3]);
    g[0] /= norm;
    g[1] /= norm;
    g[2] /= norm;
    g[3] /= norm;

    double xAng = (2 * Math.acos(g[0])) * (180 / Math.PI);
    double yAng = (2 * Math.acos(g[1])) * (180 / Math.PI);
    double zAng = (2 * Math.acos(g[2])) * (180 / Math.PI);
}

private double[] convertFloatsToDoubles(float[] input) {
    if (input == null)
        return null;

    double[] output = new double[input.length];
    for (int i = 0; i < input.length; i++)
        output[i] = input[i];
    return output;
}
The issue is that the values returned by xAng and yAng seem to be restricted to the range 80-280. As for zAng (which I think is the azimuth), it works like a compass, but when it reads 0 it appears to be about 12 degrees off magnetic south.
I assume I have done something wrong in the maths, but I am unsure what exactly.
The values for Sensor.TYPE_ROTATION_VECTOR are defined here as:

values[0]: x*sin(θ/2)
values[1]: y*sin(θ/2)
values[2]: z*sin(θ/2)
values[3]: cos(θ/2)
values[4]: estimated heading accuracy (in radians) (-1 if unavailable)
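From those definitions, the four values form a unit quaternion (x, y, z, w) for a rotation of angle θ about a single axis. That means 2*acos(values[3]) recovers the total rotation angle θ, but taking acos of the x, y, or z components individually does not give per-axis angles, which would explain the odd ranges above. As a point of comparison, here is a framework-free sketch (plain Java, hypothetical class name, and assuming a Z-Y-X Euler convention) of a standard quaternion-to-Euler conversion:

```java
public class QuaternionAngles {
    /**
     * Convert a unit quaternion (x, y, z, w) to Euler angles in degrees,
     * returned as {azimuth (about Z), pitch (about Y), roll (about X)}.
     * The Z-Y-X intrinsic convention here is an assumption for illustration.
     */
    public static double[] toEulerDegrees(double x, double y, double z, double w) {
        double rollX = Math.atan2(2 * (w * x + y * z), 1 - 2 * (x * x + y * y));

        double sinPitch = 2 * (w * y - z * x);
        sinPitch = Math.max(-1, Math.min(1, sinPitch)); // clamp for numerical safety
        double pitchY = Math.asin(sinPitch);

        double azimuthZ = Math.atan2(2 * (w * z + x * y), 1 - 2 * (y * y + z * z));

        double deg = 180 / Math.PI;
        return new double[] { azimuthZ * deg, pitchY * deg, rollX * deg };
    }

    public static void main(String[] args) {
        // A 90-degree rotation about Z only:
        // (x, y, z, w) = (0, 0, sin(45 deg), cos(45 deg)).
        double s = Math.sin(Math.PI / 4), c = Math.cos(Math.PI / 4);
        double[] angles = toEulerDegrees(0, 0, s, c);
        System.out.printf("azimuth=%.1f pitch=%.1f roll=%.1f%n",
                angles[0], angles[1], angles[2]);

        // Note: acos of the x component alone is acos(0) = 90 degrees here,
        // even though there is no rotation at all about the X axis.
    }
}
```

On Android itself, the usual route is to skip the manual math entirely: pass event.values to SensorManager.getRotationMatrixFromVector() and then call SensorManager.getOrientation() on the resulting matrix to get azimuth, pitch, and roll in radians.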