BFS consumes a lot of memory, especially when the branching factor of the tree is huge. DFS, on the other hand, may take a long time to visit neighbouring nodes if the depth of the tree is huge, but it has better space complexity. For example, in BFS with a branching factor of 10 and a depth of 12, the O(b^d) formula gives around 10^12 nodes. If each node takes 10 bytes, that's around 10 TB of memory.
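To sanity-check that back-of-the-envelope estimate, here's a quick Python sketch (the uniform branching factor and the 10-bytes-per-node figure are just the assumptions from my example; real node objects usually cost more):

```python
# Rough memory estimate for storing every node BFS may touch.
# Assumes a complete tree with uniform branching factor and a
# fixed per-node size -- a deliberate simplification.

def bfs_memory_bytes(branching_factor: int, depth: int, bytes_per_node: int) -> int:
    """Total bytes to store all nodes up to `depth` -- O(b^d) of them."""
    total_nodes = sum(branching_factor ** level for level in range(depth + 1))
    return total_nodes * bytes_per_node

b, d, node_size = 10, 12, 10  # the numbers from my example above
total = bfs_memory_bytes(b, d, node_size)
print(f"{total / 10**12:.1f} TB")  # ~11.1 TB, same order as the 10 TB estimate
```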
However, what is considered 'huge' here? Is a 10 TB memory requirement considered huge (I'm pretty sure it is, though)? How do I determine when I should move away from DFS and BFS and look for other algorithms? Are they even still used in modern applications, or are better algorithms applied instead (like IDDFS)?
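For reference, here's a minimal sketch of the IDDFS idea I'm referring to, in Python (the adjacency-dict graph and the names `iddfs`/`dls` are hypothetical, just for illustration):

```python
# Iterative-deepening DFS: run depth-limited DFS with an increasing
# limit. Memory stays O(d) like DFS, while the shallowest goal is
# found first, like BFS. No cycle check, so this assumes a tree.

def iddfs(graph: dict, start, goal, max_depth: int = 50):
    def dls(node, depth):
        if node == goal:
            return [node]
        if depth == 0:
            return None
        for child in graph.get(node, ()):
            path = dls(child, depth - 1)
            if path is not None:
                return [node] + path
        return None

    for limit in range(max_depth + 1):
        path = dls(start, limit)
        if path is not None:
            return path
    return None

# Tiny made-up tree to show the call:
tree = {"A": ["B", "C"], "B": ["D"], "C": ["E"], "D": [], "E": ["F"]}
print(iddfs(tree, "A", "F"))  # -> ['A', 'C', 'E', 'F']
```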