Given a set of denominations and a target amount, find the minimum number of coins needed to make the target exactly.
Constraint: only one coin of each denomination is available.
How does the Greedy approach compare with the Dynamic Programming approach for this problem?
Edit: Example: I have coins of denominations 1, 2, 3, 5, and only one coin of each denomination. I want to make change for 11 using the minimum number of coins. The answer is 4 (1 + 2 + 3 + 5). Had I had an infinite supply of each denomination, the answer would've been 3 (5 + 5 + 1 or 5 + 3 + 3).
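To make the comparison concrete, here is a sketch of how the two variants differ in DP form (my own illustration, not code from the question). With one coin per denomination it becomes a 0/1-knapsack-style DP where each amount is updated in decreasing order, so a coin is used at most once; with unlimited supply the amounts are updated in increasing order. The function names are just placeholders.

```python
def min_coins_single_use(denoms, target):
    """Min coins to make `target` exactly, each denomination usable at most once."""
    INF = float("inf")
    dp = [0] + [INF] * target  # dp[a] = min coins to make amount a
    for c in denoms:
        # Iterate amounts downward so this coin contributes at most once.
        for a in range(target, c - 1, -1):
            if dp[a - c] + 1 < dp[a]:
                dp[a] = dp[a - c] + 1
    return dp[target] if dp[target] != INF else -1  # -1: target unreachable


def min_coins_unlimited(denoms, target):
    """Min coins to make `target` exactly, unlimited supply of each denomination."""
    INF = float("inf")
    dp = [0] + [INF] * target
    for c in denoms:
        # Iterate amounts upward so this coin may be reused.
        for a in range(c, target + 1):
            if dp[a - c] + 1 < dp[a]:
                dp[a] = dp[a - c] + 1
    return dp[target] if dp[target] != INF else -1


print(min_coins_single_use([1, 2, 3, 5], 11))  # 4  (1 + 2 + 3 + 5)
print(min_coins_unlimited([1, 2, 3, 5], 11))   # 3  (5 + 5 + 1)
```

Greedy (always take the largest coin that fits) is not safe in either variant: with single-use coins {3, 4, 5} and target 7, greedy takes 5 and then gets stuck, while the DP finds 4 + 3 = 2 coins; with unlimited coins {1, 3, 4} and target 6, greedy gives 4 + 1 + 1 = 3 coins versus the optimal 3 + 3 = 2.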