Say you are given an array of signed integers called `NUMBERS` of length `r`. Say `NUMBERS` is equal to `{1, 2, 3, 4, 5}`, where `r = 5`.
You want to form a sum X of any `n` elements of `NUMBERS` and multiply it by the sum Y of the remaining `m` elements, for a given `n`. The `NUMBERS` array therefore occupies `(n + m)` words in memory.
This product is of the form:

    (x1 + x2 + x3 + ... + xn) * (y1 + y2 + ... + ym)
For example, with `n = 2`, one possible combination for X is the sum of the two arbitrary elements `1` and `4`, so the sum Y of the remaining `m = 3` elements is `2 + 3 + 5`. You then multiply both sums to get `X * Y = 5 * 10 = 50`.
Formally: given `(n + m)` signed integers, we want to minimize and maximize the result of the algebraic expression

    (x1 + x2 + x3 + ... + xn) * (y1 + y2 + ... + ym)

for a fixed given `n`.
This involves enumerating all `nCr` combinations of the set `NUMBERS` (with `n` given), computing the sum X for each combination, then computing the sum Y of the remaining elements not used in X, and multiplying the two. The `nCr` formula gives the number of possible (X, Y) pairs, so you can brute-force every single multiplication to see which yields the smallest or largest result.
I am not sure how to proceed, especially since recursion and nested for loops are outside the scope of this first project.