This question is a bit specific. Assume for simplicity that I have a unit circle, i.e. a circle centered on the origin (0,0) with radius 1. On this circle I have three points: A, B, and C. Assume that B and C are fixed, and that A can be moved along the circle's circumference.
Question: What conditions can I check efficiently to ensure that A is not moved outside the arc between B and C originally containing it?
To clarify: I do not just want to detect whether the point is on this arc (i.e., get a boolean); I want a quantifiable measure (e.g., a distance) so I can prevent A from being moved outside. You can picture my setup like the figure below, with point A restricted to the red arc:
Here are some ideas I have pondered so far, but none were successful:
- Constrain the polar angle alpha of A = [cos(alpha), sin(alpha)] to lie between the angles beta and gamma of B = [cos(beta), sin(beta)] and C = [cos(gamma), sin(gamma)], all in radians. Unfortunately, this naive approach only works in scenario (a) above: if the arc to which A is restricted crosses the (in my case, western) discontinuity at +pi/-pi, a simple lower and upper bound won't do. (A wraparound-safe variant is sketched after this list.)
- Calculate the lengths of the arcs A-to-B and A-to-C, then ensure that this sum does not change as A is moved. Unfortunately, the only formula I have found for the arc between two points always yields the shorter arc, whereas in scenario (c) the correct arc A-to-B is the larger one. (A directed arc-length formula that avoids this is sketched below as well.)
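For concreteness, here is a minimal Python sketch of the wraparound-safe clamp mentioned in the first idea. All names are mine, and I am assuming the allowed arc runs counter-clockwise (CCW) from B to C; measuring every angle relative to beta turns the arc into the contiguous interval [0, span], which sidesteps the +pi/-pi discontinuity:

```python
import math

TWO_PI = 2.0 * math.pi

def clamp_to_arc(alpha, beta, gamma):
    """Clamp alpha to the counter-clockwise arc from beta to gamma (radians)."""
    span = (gamma - beta) % TWO_PI   # CCW extent of the allowed arc
    a = (alpha - beta) % TWO_PI      # A's position, measured from B
    if a <= span:                    # A is already on the arc
        return alpha
    # A is outside: snap to whichever endpoint is nearer along the circle.
    past_c = a - span                # CCW distance beyond C
    before_b = TWO_PI - a            # CCW distance remaining back to B
    return beta + span if past_c < before_b else beta
```

For example, with beta = 3*pi/4 and gamma = 5*pi/4 (an arc crossing the western discontinuity), clamp_to_arc(math.pi, beta, gamma) leaves alpha = pi untouched, while alpha = 0 gets snapped to an endpoint.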
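The second idea might be rescued with a directed arc length instead of the shorter-arc formula. This is only a sketch under the same CCW convention as above: the modulo keeps the direction of travel, so the result can exceed pi times the radius (the semicircle), which is exactly what scenario (c) needs:

```python
import math

def ccw_arc_length(theta_from, theta_to, radius=1.0):
    """Arc length travelled counter-clockwise from theta_from to theta_to."""
    return ((theta_to - theta_from) % (2.0 * math.pi)) * radius
```

With this, ccw_arc_length(beta, alpha) + ccw_arc_length(alpha, gamma) equals ccw_arc_length(beta, gamma) exactly when A lies on the CCW arc from B to C, and exceeds it by 2*pi*radius otherwise, so the sum is the invariant quantity the second idea was looking for.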