I want to implement a program that splits a set S of n points in the plane into two sets such that the distance between the convex hulls of the two sets is maximized.
It should run in O(n^3) time.
I have a couple of ideas, but I'm not sure whether they are correct:
1. Idea: I sort the points by x, and for every pair of points I compute their distance, but only if no other point lies between them. Then I take the point closest to the first one on its left and the point closest to the second one on its right (the first point would belong to the first polygon, the second to the second polygon). I then compute the point-to-line, line-to-line, and line-to-point distances (the point-to-point distance I already know). That is how I find the maximal distance, and afterwards I build each convex hull with a Graham scan; the distance primitives are sketched right after this list.
I repeat the whole procedure for y, because the maximal distance could be along y rather than x.
2. Idea: I sort the points by x, then build one convex hull from the first 3 points and another from the remaining points, and find the minimum distance between these two hulls. I save that distance, and if it is greater than the distance I saved before, I overwrite it and remember the two polygons. The last step is when only 3 points are left for the second convex hull. Then I do all of these steps again for y; a rough sketch of this sweep follows below.
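Interpreting the "lines" in Idea 1 as hull edges (segments), here is a minimal Python sketch of the distance primitives it needs. The function names (`dist_point_point`, `dist_point_segment`, `dist_segment_segment`) are my own, and the segment-segment routine assumes the two segments do not cross, which holds for edges of two disjoint hulls.

```python
import math

def dist_point_point(p, q):
    """Euclidean distance between points p and q given as (x, y) tuples."""
    return math.hypot(p[0] - q[0], p[1] - q[1])

def dist_point_segment(p, a, b):
    """Distance from point p to the closed segment a-b."""
    ax, ay = a
    bx, by = b
    px, py = p
    dx, dy = bx - ax, by - ay
    if dx == 0 and dy == 0:                     # degenerate segment: a == b
        return dist_point_point(p, a)
    # Project p onto the line through a and b, then clamp to the segment.
    t = ((px - ax) * dx + (py - ay) * dy) / (dx * dx + dy * dy)
    t = max(0.0, min(1.0, t))
    return dist_point_point(p, (ax + t * dx, ay + t * dy))

def dist_segment_segment(a, b, c, d):
    """Distance between segments a-b and c-d, assuming they do not cross
    (which is the case for edges of two disjoint convex hulls)."""
    return min(dist_point_segment(a, c, d),
               dist_point_segment(b, c, d),
               dist_point_segment(c, a, b),
               dist_point_segment(d, a, b))
```

For instance, `dist_segment_segment((0, 0), (1, 0), (0, 2), (1, 2))` returns 2.0.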
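And here is a rough sketch of the sweep from Idea 2, reusing `dist_segment_segment` from the sketch above and assuming the distance between two hulls means the minimum distance over all pairs of their edges. The names `convex_hull`, `hull_distance`, and `best_split` are illustrative, not a definitive implementation; I use a monotone-chain hull instead of a Graham scan, which changes nothing asymptotically. With O(n) splits per axis and an O(n^2) brute-force edge check per split, this stays within O(n^3).

```python
def convex_hull(points):
    """Monotone-chain convex hull; returns the vertices in CCW order."""
    pts = sorted(set(points))
    if len(pts) <= 2:
        return pts
    def cross(o, a, b):
        return (a[0] - o[0]) * (b[1] - o[1]) - (a[1] - o[1]) * (b[0] - o[0])
    lower, upper = [], []
    for p in pts:
        while len(lower) >= 2 and cross(lower[-2], lower[-1], p) <= 0:
            lower.pop()
        lower.append(p)
    for p in reversed(pts):
        while len(upper) >= 2 and cross(upper[-2], upper[-1], p) <= 0:
            upper.pop()
        upper.append(p)
    return lower[:-1] + upper[:-1]

def hull_distance(h1, h2):
    """Minimum distance between two disjoint hulls: brute force over edge pairs."""
    def edges(h):
        if len(h) == 1:                         # a hull can degenerate to a point
            return [(h[0], h[0])]
        return [(h[i], h[(i + 1) % len(h)]) for i in range(len(h))]
    return min(dist_segment_segment(a, b, c, d)
               for a, b in edges(h1) for c, d in edges(h2))

def best_split(points):
    """Sort by x and by y, try every prefix/suffix split with at least
    3 points on each side, and keep the split whose hulls are farthest apart."""
    best_dist, best_pair = -1.0, None
    for key in (lambda p: (p[0], p[1]), lambda p: (p[1], p[0])):
        pts = sorted(points, key=key)
        for i in range(3, len(pts) - 2):        # first hull gets pts[:i]
            h1, h2 = convex_hull(pts[:i]), convex_hull(pts[i:])
            d = hull_distance(h1, h2)
            if d > best_dist:
                best_dist, best_pair = d, (pts[:i], pts[i:])
    return best_dist, best_pair
```

For example, `best_split([(0, 0), (1, 0), (0, 1), (5, 5), (6, 5), (5, 6)])` should report the obvious split into the lower-left and upper-right triangles.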
What do you guys think about my ideas? Do you have any improvements, or maybe another, better idea?
How is the distance between the convex hulls defined? – MBo