2
votes

So everything I have been reading says that a left recursive rule in a CFG makes the parser recurse infinitely, and then shows a process for converting it to a right recursive rule, using alpha and beta terms to stand for arbitrary strings of grammar symbols (I think I got that part right). So in my mind, left recursive rules = bad when dealing with LL parsers.
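The conversion I keep seeing looks like this, where α and β stand for arbitrary strings of grammar symbols:

A  : A α | β

becomes

A  : β A'
A' : α A' | ε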

So is there a situation where left recursive rules are acceptable or even desirable? If not, why do left recursive rules exist at all, aside from bad design or as a term describing the 'inverse' (term used lightly) of right recursive rules?

This is not technically a homework question, but it is related to my class.


1 Answer

3
votes

The key to your question is that left recursive rules are only bad when dealing with LL parsers. If you're using an LR parser, I often find the grammar becomes much clearer. Take for instance the standard expression grammar

E : E + T | T
T : T * F | F
F : number | ( E )

This is clearly left recursive, but it is easier to read than its right recursive variant. Since the parser in question has no problem dealing with the original grammar, there is no point in factoring it to be right recursive.
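For comparison, here is what the LL-compatible version looks like after eliminating the left recursion (E' and T' are the new non-terminals the transformation introduces):

E  : T E'
E' : + T E' | ε
T  : F T'
T' : * F T' | ε
F  : number | ( E )

It accepts the same language, but the shape of an expression is much harder to see at a glance, and the left associativity of + and * is no longer reflected directly in the parse tree.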

In this case you also get the benefit of a grammar that does not require epsilon rules, which, as you can see above, the corresponding LL compatible grammar does.
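To see concretely why the left recursive form breaks an LL parser, here is a minimal recursive-descent sketch in Python (the function and parameter names are just illustrative):

def parse_E(tokens, pos):
    # E : E + T | T. Trying the first alternative means
    # matching E before consuming any input:
    pos = parse_E(tokens, pos)  # immediate self-call, no token consumed
    # never reached: the line above recurses until the stack overflows
    ...

The first thing parse_E does is call itself without consuming a token, so it can never make progress; that is the infinite descent your reading material describes. An LR parser never runs into this, because it consumes input before deciding which rule it has matched.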