In iOS, I'm trying to determine the point where an imaginary line, drawn from the center of a rectangle at a predetermined angle, intersects the rectangle's perimeter.
Say that I know the center point, the size of the rectangle, and the angle (starting from 0 degrees for East and going counterclockwise: 90 for North, 180 for West, 270 for South, and back to 360 degrees for East). I need the coordinates of the intersection point.
The mathematical (and, to me, somewhat confusing) but presumably accurate answer at Finding points on a rectangle at a given angle led me to try the following code, but it doesn't work properly. My question is similar to that one, but I'm looking for a corrected Objective-C / iOS method rather than a general mathematical answer.
I think part of the problem has to do with the input being a single 0-to-360-degree angle (converted to radians, with no possibility of a negative number), but there are likely other problems as well. The code below mostly uses the notation defined in the answer from belisarius, including my attempt to calculate the intersection point for each of the four regions defined there.
This code is in my UIImageView subclass:
- (CGPoint)startingPointGivenAngleInDegrees:(double)angle {
    double angleInRads = angle / 180.0 * M_PI;
    float height = self.frame.size.height;
    float width = self.frame.size.width;
    float x0 = self.center.x;
    float y0 = self.center.y;
    // region 1
    if (angleInRads >= -atan2(height, width) && angleInRads <= atan2(height, width)) {
        return CGPointMake(x0 + width / 2, y0 + width / 2 * tan(angleInRads));
    }
    // region 2
    if (angleInRads >= atan2(height, width) && angleInRads <= M_PI - atan2(height, width)) {
        return CGPointMake(x0 + height / (2 * tan(angleInRads)), y0 + height / 2);
    }
    // region 3
    if (angleInRads >= M_PI - atan2(height, width) && angleInRads <= M_PI + atan2(height, width)) {
        return CGPointMake(x0 - width / 2, y0 + width / 2 * tan(angleInRads));
    }
    // region 4
    return CGPointMake(x0 + height / (2 * tan(angleInRads)), y0 - height / 2);
}