Given an input array of integers, how can I find the longest convex subsequence, i.e. a subsequence in which every element satisfies

c[i] < (c[i-1] + c[i+1]) / 2

where c[i-1], c[i], and c[i+1] are any three consecutive elements of the subsequence?
For example, if the input array is { 1, 2, -1, 0, 3, 8, 5 }, the longest convex subsequence is { 1, -1, 0, 3, 8 } or { 2, -1, 0, 3, 8 }.
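To make the condition concrete, here is a small checker (not from the original post; the name is_convex is mine) that tests whether a given sequence satisfies the convexity condition at every interior position:

```python
def is_convex(c):
    """True if every interior element satisfies c[i] < (c[i-1] + c[i+1]) / 2.
    Written as 2*c[i] < c[i-1] + c[i+1] to stay in integer arithmetic."""
    return all(2 * c[i] < c[i - 1] + c[i + 1] for i in range(1, len(c) - 1))

print(is_convex([1, -1, 0, 3, 8]))        # both example answers pass
print(is_convex([2, -1, 0, 3, 8]))
print(is_convex([1, 2, -1, 0, 3, 8, 5]))  # the full array does not
```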
I tried to solve this problem with the same dynamic programming idea as in the "Longest Increasing Subsequence" (LIS) problem. But because each element of the subsequence depends on the previous two elements, a one-dimensional DP state like LIS's doesn't carry enough information, and it seems that an O(n^2) solution is impossible. Thank you for your help.
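For reference, the natural baseline I can get to work (a sketch, not claimed optimal) keys the DP state on the last *two* chosen elements instead of one, giving O(n^2) states and an O(n) transition, O(n^3) overall:

```python
def longest_convex_subsequence(a):
    """O(n^3) DP sketch: dp[j][i] is the length of the longest convex
    subsequence whose last two elements are a[j], a[i] (with j < i)."""
    n = len(a)
    if n <= 2:
        return n  # any sequence of up to two elements is trivially convex
    # Any pair of elements forms a valid subsequence of length 2.
    dp = [[2] * n for _ in range(n)]
    best = 2
    for i in range(n):
        for j in range(i):
            for k in range(j):
                # Extending ... a[k], a[j] with a[i] requires the
                # convexity condition 2*a[j] < a[k] + a[i].
                if 2 * a[j] < a[k] + a[i]:
                    dp[j][i] = max(dp[j][i], dp[k][j] + 1)
            best = max(best, dp[j][i])
    return best

print(longest_convex_subsequence([1, 2, -1, 0, 3, 8, 5]))  # 5
```

This returns 5 on the example above (matching { 1, -1, 0, 3, 8 }), so my real question is whether the inner loop over k can be avoided to bring this down to O(n^2).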