If it's possible to "do some initialization steps once", we can simply precompute a two-dimensional matrix (a hash table is overkill for this case) with the minimum index for every possible pair of subarray bounds. Building the table is an O(n^2) operation, and once it's done, retrieving the minimum index of any subarray is O(1). As correctly stated in the comments, this is an instance of the range minimum query problem; here's my proof of concept in Python:
def init_table(array):
    n = len(array)
    # create an NxN matrix
    table = [[0] * n for _ in xrange(n)]
    for i in xrange(0, n):
        minIdx, minVal = i, array[i]
        for j in xrange(i, n):
            if array[j] < minVal:
                minIdx = j
                minVal = array[j]
            table[i][j] = minIdx
    return table
Alternatively, here's an equivalent solution using dynamic programming:
def init_table(array):
    n = len(array)
    # create an NxN matrix, initializing values in the diagonal
    table = [[0 if i != j else i for j in xrange(n)] for i in xrange(n)]
    for i in xrange(0, n):
        for j in xrange(i+1, n):
            # the minimum of array[i..j] is either array[j] itself
            # or the minimum of array[i..j-1], whichever is smaller
            if array[j] < array[table[i][j-1]]:
                table[i][j] = j
            else:
                table[i][j] = table[i][j-1]
    return table
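Both versions build exactly the same table. As a quick sanity check (the brute_force_min_index helper below is just something I'm throwing in for illustration), every entry can be compared against a direct scan of the corresponding slice:

import random

def brute_force_min_index(array, i, j):
    # scan array[i..j] directly and return the index of its minimum
    return min(xrange(i, j + 1), key=lambda k: array[k])

array = [random.randint(0, 100) for _ in xrange(50)]
table = init_table(array)
for i in xrange(len(array)):
    for j in xrange(i, len(array)):
        assert table[i][j] == brute_force_min_index(array, i, j)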
Either way, we start by creating the 2D matrix for a given array:
array = [1, 2, 4, 6, 1, 3, 5, 7]
table = init_table(array)
Let's say that we want to examine the range between i=2 and j=6:
i = 2
j = 6
array[i:j+1]
=> [4, 6, 1, 3, 5]
In the above range, the minimum element is 1, which is at index 4 in the original array. Let's see if our matrix works:
table[i][j]
=> 4
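As a side note, because both versions use a strict comparison, ties resolve to the leftmost occurrence of the minimum; for example, the value 1 appears at indexes 0 and 4, and querying the whole array returns the first one:

table[0][7]
=> 0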
If we need the index relative to the sub-array, simply subtract i from the result:
table[i][j] - i
=> 2
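If you'd rather not repeat that subtraction at every call site, a small wrapper along these lines will do (range_min_index is just an arbitrary name for the sake of the example):

def range_min_index(table, i, j, relative=False):
    # O(1) lookup in the precomputed table;
    # optionally translate to an index within array[i:j+1]
    idx = table[i][j]
    return idx - i if relative else idx

range_min_index(table, 2, 6)
=> 4
range_min_index(table, 2, 6, relative=True)
=> 2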