The problem is given as follows: given a DAG and a number 0 < p ≤ 1, return a minimum-cardinality set of vertices whose removal disconnects at least a p-fraction of the paths from a source (i.e., a vertex with no incoming arcs) to a sink (i.e., a vertex with no outgoing arcs).
For p = 1, the problem is equivalent to minimum cut. For other values of p, however, I'm not sure what the answer will be.
An algorithm I'm thinking of is to first compute a min-cut set for the DAG and then prune it until the p-fraction criterion is just met. It would be interesting in its own right to know whether the pruned subset is actually a minimum cut set for the specific p that is given. The problem with this algorithm is that it is wasteful: it computes many vertices that we don't need in the final answer, and in fact it solves the "bigger" problem (p = 1) first.
Any pointers to a solution of this problem? Would it be possible, for some of the min-cut algorithms, to add this extra constraint as an early-stopping criterion?
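To make the early-stopping idea concrete, here is a greedy sketch of my own (not a known algorithm, and not guaranteed to return a minimum-cardinality set): repeatedly remove the internal vertex whose removal disconnects the most surviving source-to-sink paths, and stop as soon as at least a p-fraction of the original paths is gone. The names `surviving_paths` and `greedy_p_cut` are made up for illustration, and it assumes only internal (non-source, non-sink) vertices may be removed.

```python
from collections import defaultdict, deque

def surviving_paths(n, edges, removed=frozenset()):
    """Count source-to-sink paths of the original DAG (vertices 0..n-1)
    that avoid every vertex in `removed`."""
    succ, pred = defaultdict(list), defaultdict(list)
    indeg, outdeg = [0] * n, [0] * n
    for u, v in edges:
        succ[u].append(v)
        pred[v].append(u)
        indeg[v] += 1
        outdeg[u] += 1
    # Topological order via Kahn's algorithm.
    deg = indeg[:]
    q = deque(v for v in range(n) if deg[v] == 0)
    topo = []
    while q:
        u = q.popleft()
        topo.append(u)
        for w in succ[u]:
            deg[w] -= 1
            if deg[w] == 0:
                q.append(w)
    # paths[v] = number of surviving paths from any source to v.
    paths = [0] * n
    for v in topo:
        if v in removed:
            continue
        paths[v] = 1 if indeg[v] == 0 else sum(paths[u] for u in pred[v])
    return sum(paths[v] for v in range(n) if outdeg[v] == 0)

def greedy_p_cut(n, edges, p):
    """Greedy heuristic with early stopping: remove the internal vertex
    that kills the most surviving paths, until at least a p-fraction of
    the original paths is disconnected.  Not guaranteed optimal."""
    indeg, outdeg = [0] * n, [0] * n
    for u, v in edges:
        indeg[v] += 1
        outdeg[u] += 1
    internal = [v for v in range(n) if indeg[v] > 0 and outdeg[v] > 0]
    total = surviving_paths(n, edges)
    removed = set()
    while total and surviving_paths(n, edges, removed) > (1 - p) * total:
        if all(v in removed for v in internal):
            break  # nothing left to remove
        # Pick the candidate minimizing the number of surviving paths.
        best = min(internal,
                   key=lambda v: surviving_paths(n, edges, removed | {v})
                   if v not in removed else float('inf'))
        removed.add(best)
    return removed
```

On a diamond DAG (0→1, 0→2, 1→3, 2→3) with p = 0.5 this removes a single internal vertex, while p = 1 forces it to remove both, matching the min-cut case. Recomputing `surviving_paths` from scratch on every candidate is of course expensive; it stands in for the index mentioned below.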
For checking how many paths are removed, suppose we have indexed each vertex (and will keep the index updated if needed) so that we know how many paths get disconnected by its removal; please don't worry about the complexity of maintaining the index. One last thing: there is no constraint on the resulting components in terms of size or anything else.
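For what it's worth, that per-vertex index is cheap to build once on a DAG: the number of source-to-sink paths through v equals (paths from any source to v) × (paths from v to any sink), both computable by one dynamic-programming pass over a topological order. A sketch, with the hypothetical name `paths_through` (note the counts are not additive over a *set* of removals, since a path through two removed vertices is counted twice):

```python
from collections import defaultdict, deque

def paths_through(n, edges):
    """For a DAG on vertices 0..n-1, return a list where entry v is the
    number of source-to-sink paths passing through v."""
    succ, pred = defaultdict(list), defaultdict(list)
    indeg, outdeg = [0] * n, [0] * n
    for u, v in edges:
        succ[u].append(v)
        pred[v].append(u)
        indeg[v] += 1
        outdeg[u] += 1
    # Topological order via Kahn's algorithm.
    deg = indeg[:]
    q = deque(v for v in range(n) if deg[v] == 0)
    topo = []
    while q:
        u = q.popleft()
        topo.append(u)
        for w in succ[u]:
            deg[w] -= 1
            if deg[w] == 0:
                q.append(w)
    down = [0] * n  # down[v]: paths from any source to v
    up = [0] * n    # up[v]:   paths from v to any sink
    for v in topo:
        down[v] = 1 if indeg[v] == 0 else sum(down[u] for u in pred[v])
    for v in reversed(topo):
        up[v] = 1 if outdeg[v] == 0 else sum(up[w] for w in succ[v])
    return [down[v] * up[v] for v in range(n)]
```

On the diamond DAG 0→1, 0→2, 1→3, 2→3 this gives [2, 1, 1, 2]: both paths pass through the source and the sink, and one path through each internal vertex.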