I have a PHP page that uses a URL parameter to set a variable, which is then displayed within that page. URL: webaddress.com/page.php?id=someCity
We take $_GET['id'], assign it to a variable ($city), and use it throughout the page to make otherwise static text somewhat dynamic.
For instance:
Welcome to our page about someCity. We can help you find products related to someCity because we have vast experience in someCity. Each occurrence would be produced with <?php echo $city; ?>.
My client has been told he is open to a cross-site scripting (XSS) vulnerability. My research shows that an injected script or iframe can then be used to steal cookies and do other malicious things. The recommended solution is the PHP function htmlspecialchars(), which converts special characters to HTML entities. I don't understand how this is more secure than simply removing all the tags with strip_tags().
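To make the comparison concrete, here is a small sketch of what I understand each function does to a hostile value (the payload is just an example I made up):

```php
<?php
// A hypothetical attacker-supplied value in place of a city name.
$payload = '<script>alert(document.cookie)</script>';

// strip_tags() deletes the tags themselves, but the text between
// them survives unchanged.
echo strip_tags($payload), "\n";       // alert(document.cookie)

// htmlspecialchars() keeps every character but converts the ones that
// are meaningful to the HTML parser (< > & "), so the browser renders
// the payload as literal text instead of executing it.
echo htmlspecialchars($payload), "\n"; // &lt;script&gt;alert(document.cookie)&lt;/script&gt;
?>
```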
So I use both, along with a string replace and capitalization, since those are also needed:
```php
$step1 = str_replace('_', ' ', $_GET['id']); // replace underscores with spaces
$step2 = strip_tags($step1);                 // strip tags
$step3 = htmlspecialchars($step2);           // convert remaining special characters to HTML entities
$city  = ucwords($step3);                    // capitalize each word
```
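For reference, here is what I get when I trace those four steps with one benign and one hostile value (both inputs, and the wrapper function name, are just examples for illustration):

```php
<?php
// Hypothetical helper wrapping the four steps above.
function sanitize_city(string $id): string {
    $step1 = str_replace('_', ' ', $id); // replace underscores with spaces
    $step2 = strip_tags($step1);         // strip tags
    $step3 = htmlspecialchars($step2);   // encode remaining special characters
    return ucwords($step3);              // capitalize each word
}

echo sanitize_city('new_york'), "\n";                      // New York
echo sanitize_city('<script>alert("xss")</script>'), "\n"; // Alert(&quot;xss&quot;)
?>
```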
QUESTION: Is this sufficient to prevent XSS, and is it true that htmlspecialchars() offers additional benefit over strip_tags()? I understand the difference from other answers to similar questions, but I would like to know how each function (especially htmlspecialchars()) actually prevents XSS.