If an array A[1..N] is represented by a segment tree that stores a set in each node's interval, why does a range query [L..R] return at most ceil(log_2 N) sets (i.e. disjoint intervals)?
I came across this statement while reading this answer.
To quote:
> Find a disjoint coverage of the query range using the standard segment tree query procedure. We get O(log n) disjoint nodes, the union of whose multisets is exactly the multiset of values in the query range. Let's call those multisets s_1, ..., s_m (with m <= ceil(log_2 n)).
I tried searching for a proof but couldn't find one anywhere. Can anyone help me prove it?
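For reference, here is a minimal sketch (my own code, not from the linked answer) of what I understand "the standard segment tree query procedure" to mean: it descends from the root and reports a node as soon as its interval lies fully inside the query, so the reported intervals are disjoint and their union is exactly [L..R]. The function name `collect` and the driver in `main` are just illustrative.

```cpp
#include <cstdio>
#include <vector>
#include <utility>

// Descend from the node covering [lo, hi]; whenever a node's interval is
// fully contained in the query [l, r], report it and stop descending.
void collect(int lo, int hi, int l, int r,
             std::vector<std::pair<int,int>>& out) {
    if (r < lo || hi < l) return;          // no overlap with the query
    if (l <= lo && hi <= r) {              // fully covered: one canonical piece
        out.push_back({lo, hi});
        return;
    }
    int mid = lo + (hi - lo) / 2;          // partial overlap: recurse into children
    collect(lo, mid, l, r, out);
    collect(mid + 1, hi, l, r, out);
}

int main() {
    int n = 16;                            // array A[1..n]
    std::vector<std::pair<int,int>> pieces;
    collect(1, n, 3, 13, pieces);          // example query [3..13]
    for (auto [a, b] : pieces)
        std::printf("[%d..%d]\n", a, b);   // disjoint intervals covering [3..13]
    std::printf("%zu pieces\n", pieces.size());
    return 0;
}
```

My question is how to bound the number of intervals `pieces.size()` produced by this procedure, as claimed in the quote.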