I'm learning greedy algorithms and came across a problem that I'm not sure how to tackle. Given a set of intervals (a,b), with start time a and end time b, give a greedy algorithm that returns the minimum number of intervals that together overlap every interval in the set. So for example, if I had:
(1,4) (2,3) (5,8) (6,9) (7,10)
I would return (2,3) and (7,8) since these two intervals cover every interval in the set. What I have right now is this:
- Sort the intervals by increasing end time
- Push the interval with the smallest end time onto a stack
- If the next interval (a,b) overlaps the interval on top of the stack (c,d) (i.e. a < d): if a <= c, keep (c,d) as is; otherwise replace the top of the stack with (a,d)
- If an interval (a,b) does not overlap the interval on the top of the stack (c,d) then push (a,b) onto the stack
- At the end the stack contains the desired intervals; the scan itself runs in O(n) time, so the whole thing is dominated by the O(n log n) sort (a code sketch follows below)
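Here is a rough Python sketch of that approach (assuming open intervals, so a < d counts as overlap, and taking intervals as (start, end) tuples; the function name is just something I made up):

```python
def min_covering_intervals(intervals):
    # Sort by increasing end time: O(n log n); the scan after that is O(n).
    intervals = sorted(intervals, key=lambda iv: iv[1])
    stack = []
    for a, b in intervals:
        if not stack:
            stack.append((a, b))        # smallest end time goes on first
            continue
        c, d = stack[-1]
        if a < d:                       # (a, b) overlaps the top of the stack
            if a > c:
                stack[-1] = (a, d)      # shrink the top to the common part
            # if a <= c, keep (c, d) unchanged
        else:
            stack.append((a, b))        # no overlap: start a new output interval
    return stack

print(min_covering_intervals([(1, 4), (2, 3), (5, 8), (6, 9), (7, 10)]))
# [(2, 3), (7, 8)]
```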
My question is: how is this algorithm greedy? I'm struggling with the concept. Maybe I have this right and maybe I don't, but if I do, I can't figure out what the greedy rule is or should be.
EDIT: A valid point was made below, which I should have been clearer about. (7,8) works instead of (1,10) (which covers everything) because every time in (7,8) is in (5,8), (6,9), and (7,10). Same with (2,3): every time in it is in both (1,4) and (2,3). The goal is to get a set of intervals such that, if you looked at every possible time in that set of intervals, each time would be in at least one of the original intervals.
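To make that clarification concrete, here is the kind of check I have in mind (the helpers overlaps and is_valid_answer are just made up for illustration; the second condition tests containment in a single original interval, which is enough for this example):

```python
def overlaps(x, y):
    # Open intervals (a, b) and (c, d) overlap when each starts before the other ends.
    return x[0] < y[1] and y[0] < x[1]

def is_valid_answer(answer, original):
    # 1. Every original interval is hit by at least one returned interval.
    hits_all = all(any(overlaps(o, ans) for ans in answer) for o in original)
    # 2. Every time in a returned interval lies inside some original interval
    #    (here each returned interval sits inside a single original one).
    inside = all(any(o[0] <= a and b <= o[1] for o in original) for a, b in answer)
    return hits_all and inside

print(is_valid_answer([(2, 3), (7, 8)],
                      [(1, 4), (2, 3), (5, 8), (6, 9), (7, 10)]))  # True
```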