I have a question about the algorithmic complexity of a non-preemptive job scheduling algorithm, used in particular to obtain the optimal solution to the 1||sum(Uj) problem (single machine), where Uj is a unit penalty (that is: equal to 0 if job j completes by its due date, and equal to 1 if it completes late).
I need to minimize the sum of unit penalties (that is: the number of late jobs). The job processing times are given together with their due dates, and there are no release times (so all jobs are available at time t=0). The algorithm works like this:
1- Sequence all the jobs in Earliest Due Date (EDD) order.
2- Identify the first late job Jj in the current sequence. If there are no late jobs, go to step 4.
3- Identify the job with the largest processing time among Jj and its predecessors. Remove that job and return to step 2, considering the new sequence without the removed job.
4- The optimal sequence is given by the current sequence, followed by the jobs removed at step 3, in any order.
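To make the question concrete, here is a naive C++ sketch of how I would implement the steps literally (the Job struct, the function names and the toy instance are mine, just for illustration; I'm not claiming this is the intended implementation):

```cpp
#include <algorithm>
#include <cstdio>
#include <vector>

struct Job {
    int id;        // job index, just for printing
    long long p;   // processing time
    long long d;   // due date
};

// Literal, naive implementation of the four steps above (my own sketch).
std::vector<int> schedule(std::vector<Job> jobs) {
    // Step 1: sort by Earliest Due Date -- O(n log n)
    std::sort(jobs.begin(), jobs.end(),
              [](const Job& a, const Job& b) { return a.d < b.d; });

    std::vector<Job> seq = jobs;   // current sequence
    std::vector<int> removed;      // jobs removed at step 3 (the late ones)

    while (true) {
        // Step 2: find the first late job in the current sequence -- O(n) scan
        long long t = 0;
        std::size_t late = seq.size();
        for (std::size_t i = 0; i < seq.size(); ++i) {
            t += seq[i].p;
            if (t > seq[i].d) { late = i; break; }
        }
        if (late == seq.size()) break;   // no late job: go to step 4

        // Step 3: among positions 0..late, find the job with the largest
        // processing time and remove it -- another O(n) scan plus O(n) erase
        std::size_t k = 0;
        for (std::size_t i = 1; i <= late; ++i)
            if (seq[i].p > seq[k].p) k = i;
        removed.push_back(seq[k].id);
        seq.erase(seq.begin() + k);
    }

    // Step 4: optimal sequence = current sequence followed by the removed jobs
    std::vector<int> order;
    for (const Job& j : seq) order.push_back(j.id);
    for (int id : removed) order.push_back(id);
    return order;
}

int main() {
    // Tiny made-up instance, just to have something to run.
    std::vector<Job> jobs = {{1, 5, 6}, {2, 4, 7}, {3, 3, 8}, {4, 6, 9}, {5, 8, 11}};
    for (int id : schedule(jobs)) std::printf("%d ", id);
    std::printf("\n");
}
```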
I'm asked to compute the worst-case O(...) complexity of this algorithm, assuming a reasonable, well-implemented version (so, for the moment, I have to estimate the complexity "implementation-free", from the algorithm steps alone).
I think step 1 requires O(n log n), since it's just a matter of sorting (e.g. with MergeSort, or QuickSort on average).
But then, what about steps 2-3-4? Is it correct to say that the overall complexity would be O(n^2) if we implement the steps literally, for example in C/C++ as in the sketch above? Or is it O(n log n), thanks to the EDD order "optimizing" the subsequent searches (see the priority-queue sketch below for the kind of idea I mean)? What would a possible proof of its complexity look like?
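For context, this is the priority-queue idea I have in mind when I ask about O(n log n) (again a hypothetical sketch with names of my own; I'm not even sure this eviction rule is equivalent to steps 2-3 above):

```cpp
#include <algorithm>
#include <cstdio>
#include <queue>
#include <vector>

struct Job {
    long long p;   // processing time
    long long d;   // due date
};

// Returns the number of late jobs (sum of the unit penalties Uj),
// assuming this eviction rule really matches steps 2-3 of the algorithm.
long long countLate(std::vector<Job> jobs) {
    // EDD order -- O(n log n)
    std::sort(jobs.begin(), jobs.end(),
              [](const Job& a, const Job& b) { return a.d < b.d; });

    std::priority_queue<long long> onTime;  // processing times of the jobs kept so far
    long long t = 0, late = 0;
    for (const Job& j : jobs) {
        onTime.push(j.p);
        t += j.p;
        if (t > j.d) {             // the current job would finish late:
            t -= onTime.top();     // evict the longest job scheduled so far
            onTime.pop();
            ++late;                // the evicted job is counted as late
        }
    }
    return late;   // n pushes and at most n pops -> O(n log n) overall
}

int main() {
    std::vector<Job> jobs = {{5, 6}, {4, 7}, {3, 8}, {6, 9}, {8, 11}};
    std::printf("%lld late jobs\n", countLate(jobs));
}
```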
Thanks a lot in advance.