
I have a simple directed graph with no anti-parallel edges. I need an algorithm that determines whether this graph contains a vertex with out-degree |V|-1 and in-degree 0.

The only input to the algorithm is the adjacency matrix of the graph, and using this adjacency matrix we need to solve the problem in O(|V|) time.

Thank you for your help.


1 Answer


Since there can be at most one such node (a vertex with out-degree |V|-1 has an edge to every other vertex, so every other vertex has in-degree at least 1), we can find it by iteratively eliminating candidates.

First, put all nodes onto a stack. We will use this stack to keep our candidates.

As long as we have at least two nodes p and q on the stack, pop both and check whether the edge (p, q) exists; this is a single O(1) lookup in the adjacency matrix. If it exists, then q cannot have in-degree 0, so we discard q and push p back. If it does not exist, then p cannot have out-degree |V|-1, so we discard p and push q back. Hence each check removes exactly one candidate, which lets us arrive at a single candidate after at most |V|-1 checks, i.e. in O(|V|) time.
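A minimal Python sketch of this elimination loop, assuming the adjacency matrix is given as a list of lists of 0/1 entries; `find_candidate` is an illustrative name, not something from the original post:

```python
def find_candidate(adj):
    """Pairwise-eliminate vertices; the survivor is the only vertex
    that could possibly have out-degree |V|-1 and in-degree 0."""
    stack = list(range(len(adj)))  # all vertices start as candidates
    while len(stack) > 1:
        p = stack.pop()
        q = stack.pop()
        if adj[p][q]:
            stack.append(p)  # edge (p, q) exists: q has in-degree > 0, drop q
        else:
            stack.append(q)  # edge missing: p cannot reach every vertex, drop p
    return stack[0] if stack else None  # None only for the empty graph
```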

Now we only need to verify that this remaining candidate actually has the given in- and out-degree: scan its row in the adjacency matrix (it must contain a 1 in every position except the diagonal) and its column (it must be all zeros). Each scan also takes O(|V|), so the total running time is O(|V|).
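A sketch of this verification step, reusing the `find_candidate` helper from above (`has_required_vertex` is likewise an illustrative name):

```python
def has_required_vertex(adj):
    """Return True iff the graph has a vertex with out-degree |V|-1
    and in-degree 0, using O(|V|) matrix lookups in total."""
    n = len(adj)
    if n == 0:
        return False
    v = find_candidate(adj)                                  # O(|V|) eliminations
    out_ok = all(adj[v][u] for u in range(n) if u != v)      # row: out-degree |V|-1
    in_ok = all(not adj[u][v] for u in range(n))             # column: in-degree 0
    return out_ok and in_ok
```

For example, in the matrix below, vertex 0 has edges to both other vertices and no incoming edges, so the check succeeds:

```python
adj = [
    [0, 1, 1],
    [0, 0, 1],
    [0, 0, 0],
]
print(has_required_vertex(adj))  # True: vertex 0 has out-degree 2, in-degree 0
```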