Imagine a perfect tree of height h. When the iterator is at the first node (the leftmost leaf), it must somehow remember through which internal nodes it arrived there, because in later steps it will need to yield the values in those nodes (and in their right subtrees). As there are no parent pointers, this information is not readily available and must be stored somewhere.
No matter how this path is memorised (as a list of internal nodes, or as a list of directions like left-left-left-left, possibly packed compactly into a bit representation), the theoretical size of this storage is O(h).
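To make the "remembered path" concrete, here is a minimal sketch of the standard stack-based in-order iterator (the `Node` class and the generator are illustrative, not from the question): the stack holds exactly the ancestors reached by going left, so it never grows beyond h entries.

```python
class Node:
    def __init__(self, value, left=None, right=None):
        self.value, self.left, self.right = value, left, right

def inorder(root):
    stack = []             # the memorised path: at most h nodes at any time
    node = root
    while stack or node:
        while node:        # descend left, recording each node on the path
            stack.append(node)
            node = node.left
        node = stack.pop() # next value in order
        yield node.value
        node = node.right  # continue with the right subtree
```

Each `next` step pops one node and possibly descends into its right subtree, so the O(h) stack is the only extra storage.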
As noted in comments below, storing only directions will require walking down from the root again at each step of the iteration. Alternatively, you could memorise the last visited value and use binary search to find the next node. If all values in the tree are unique, then the memory needed for one actual value is of at least the same order of magnitude as a compact path representation (bitwise left-left-left...), so either way (storing the path, or the last visited value) the storage requirements are similar.
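The second alternative can be sketched as follows (a hypothetical helper, assuming numeric, unique keys): only the last yielded value is kept, and each step performs a fresh successor search from the root.

```python
class Node:
    def __init__(self, value, left=None, right=None):
        self.value, self.left, self.right = value, left, right

def next_value(root, last):
    """Smallest value strictly greater than `last`, found by a fresh
    walk down from the root: no stack, only one value is remembered."""
    succ = None
    node = root
    while node:
        if node.value > last:
            succ = node.value   # candidate; a smaller one may lie to the left
            node = node.left
        else:
            node = node.right
    return succ

def inorder_by_search(root):
    last = float('-inf')        # sentinel below all keys (assumes numbers)
    while True:
        nxt = next_value(root, last)
        if nxt is None:
            return
        yield nxt
        last = nxt
```

This trades time for space: each step costs O(h) comparisons, while the iterator state is a single value instead of an O(h) path.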
Now, as soon as you put some limit on the size of the tree, it of course no longer makes sense to talk about space (or time) complexity. For instance, if a tree is never going to have a height of more than 64, then the current path can be represented in a 64-bit unsigned integer, with maybe an additional small integer to denote the current length of that path.
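A sketch of that packed representation (the `BitPath` class is illustrative): one bit per step of the path, 0 for left and 1 for right, plus a small counter for the current depth.

```python
class BitPath:
    """Root-to-node path packed as bits of one integer (0=left, 1=right)."""
    def __init__(self):
        self.bits = 0    # fits in one machine word for trees up to depth 64
        self.depth = 0   # how many steps of the path are currently in use

    def push(self, direction):
        """Record one step down the tree."""
        self.bits |= (direction & 1) << self.depth
        self.depth += 1

    def pop(self):
        """Step back up, returning the direction that was taken."""
        self.depth -= 1
        direction = (self.bits >> self.depth) & 1
        self.bits &= ~(1 << self.depth)
        return direction
```

For a bounded height, this is constant-size storage, which is why the complexity discussion stops applying.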
As the number of atoms in the observable universe is estimated at around 2^250, you could work with that bound as well, and use a 256-bit unsigned integer (four 64-bit words) instead. Computer memory will never be capable of storing a perfect tree of that height anyway.