I'm studying for an algorithms exam and I can't find a way to solve the following problem:
INPUT: Positive integers $r_1, r_2, \dots, r_n$ and $c_1, c_2, \dots, c_n$.
OUTPUT: An $n \times n$ matrix $A$ with 0/1 entries such that for all $i$ the sum of the $i$-th row of $A$ is $r_i$ and the sum of the $i$-th column of $A$ is $c_i$, if such a matrix exists.

Consider the following greedy algorithm that constructs $A$ row by row. Assume that the first $i-1$ rows have been constructed. Let $a_j$ be the number of 1's in the $j$-th column in the first $i-1$ rows. The $r_i$ columns with maximum $c_j - a_j$ are assigned 1's in row $i$, and the rest of the columns are assigned 0's. That is, the columns that still need the most 1's are given 1's. Formally prove that this algorithm is correct using an exchange argument.
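For concreteness, here is a minimal sketch of the greedy construction in Python (the function name and the tie-break by column index are my own choices, not part of the problem statement):

```python
def greedy_matrix(r, c):
    """Try to build an n x n 0/1 matrix with row sums r and column sums c,
    filling row by row and giving 1's to the columns that still need the most."""
    n = len(r)
    a = [0] * n                       # a[j] = number of 1's placed in column j so far
    A = [[0] * n for _ in range(n)]
    for i in range(n):
        # Pick the r[i] columns with the largest remaining demand c[j] - a[j];
        # ties are broken by column index here, but any tie-break should do.
        cols = sorted(range(n), key=lambda j: c[j] - a[j], reverse=True)[:r[i]]
        for j in cols:
            A[i][j] = 1
            a[j] += 1
    # The claim to prove: if any valid matrix exists, the greedy one is valid.
    if all(a[j] == c[j] for j in range(n)):
        return A
    return None  # greedy failed, so (per the claim) no valid matrix exists
```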
What I tried first, as with most greedy problems I've encountered, is to wrap it in an induction: assume that the rows up to the $i$-th row in the greedy solution and in the given optimal (i.e., valid) solution are the same, then try to change the $(i+1)$-th row of the optimal solution so it matches the greedy one, and prove that the change doesn't yield a worse solution than the optimal one given. Then repeat this for the remaining $n - i$ rows until the two solutions are identical.
After trying that unsuccessfully, I tried the same idea per coordinate: assume the $(i,j)$ entry is the first one that does not match, and again I failed.
Then I tried a different approach: take the sequence of steps the greedy makes and swap a whole step of the algorithm at once (which is basically the same idea as the first one), and still I did not manage to find a way to guarantee that the swap does not create a worse solution.
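In case it helps to see the shape of the swap, here is my sketch of how the single exchange step is usually set up (a sketch, not a verified proof). Suppose $O$ is a valid solution that agrees with the greedy matrix $G$ on the first $i-1$ rows, and let $j$ be a column with $G_{ij} = 1$ but $O_{ij} = 0$. Since row $i$ of $O$ also contains $r_i$ ones, there is a column $j'$ with $O_{ij'} = 1$ and $G_{ij'} = 0$, and the greedy choice gives $c_j - a_j \ge c_{j'} - a_{j'}$. Counting remaining demand, rows $i+1, \dots, n$ of $O$ contain $c_j - a_j$ ones in column $j$ but only $c_{j'} - a_{j'} - 1$ in column $j'$, so some row $i'$ has $O_{i'j} = 1$ and $O_{i'j'} = 0$. The exchange is then the $2 \times 2$ swap

$$\begin{pmatrix} O_{ij} & O_{ij'} \\ O_{i'j} & O_{i'j'} \end{pmatrix} = \begin{pmatrix} 0 & 1 \\ 1 & 0 \end{pmatrix} \;\longrightarrow\; \begin{pmatrix} 1 & 0 \\ 0 & 1 \end{pmatrix},$$

which leaves every row and column sum unchanged while making row $i$ of $O$ agree with $G$ in one more position.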
Thanks in advance for any assistance.