I work with BeautifulSoup using lxml to parse and navigate XML files.
I noticed some strange behaviour: BeautifulSoup suppresses the exceptions thrown by the lxml parser when reading a malformed XML file (e.g. a truncated document or missing closing tags).
Example:
from bs4 import BeautifulSoup
soup = BeautifulSoup("<foo><bar>trololo<", "xml") # this will work
It's even possible to call find() and navigate the resulting broken XML tree without any error.
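For instance, this is roughly what I see (a quick sketch of my session; the exact recovered markup may differ on your setup):

tag = soup.find("bar")     # finds the auto-repaired <bar> element
print(tag)                 # something like <bar>trololo</bar>
print(soup.foo.bar.text)   # "trololo" - navigation works as if the XML were valid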
Let's try reading exactly the same malformed document with pure lxml:
from lxml import etree
root = etree.fromstring("<foo><bar>trololo<") # will throw XMLSyntaxError
Why does this happen? I know BeautifulSoup itself is not doing any parsing; it's just a wrapper library around lxml (or other parsers). But I'd like to actually get an error when the XML is malformed, e.g. when closing tags are missing. I only want basic XML syntax (well-formedness) checking, not XSD schema validation.
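The best workaround I can think of is a separate pre-validation pass with plain lxml before handing the document to BeautifulSoup (a minimal sketch, not necessarily the right way to do it):

from bs4 import BeautifulSoup
from lxml import etree

def strict_soup(xml_text):
    # Let lxml check well-formedness first; this raises XMLSyntaxError on broken input
    etree.fromstring(xml_text.encode("utf-8"))
    # Only build the BeautifulSoup tree once the document has passed the syntax check
    return BeautifulSoup(xml_text, "xml")

strict_soup("<foo><bar>trololo</bar></foo>")  # fine
strict_soup("<foo><bar>trololo<")             # raises lxml.etree.XMLSyntaxError

But that means parsing every document twice. Is there a way to make BeautifulSoup surface these errors directly?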