Let's say I have a dynamic page that creates URLs from user input.
For example: www.XXXXXXX.com/browse ("browse" being the page)
Every time a user enters a query, it generates another page under it.
For example: www.XXXXXXX.com/browse/abcd ("abcd" being the new page)
Now, I want Google to crawl this "browse" page but not the sub-pages generated under it.
I'm thinking of adding this rule to my robots.txt file: `Disallow: /browse/`
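In other words, the file would look something like this (using the catch-all `*` user-agent as an assumption; it could also target `Googlebot` specifically):

```
# Hypothetical robots.txt sketch
# Intended to block /browse/abcd, /browse/xyz, etc.
# while leaving /browse (no trailing slash) crawlable
User-agent: *
Disallow: /browse/
```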
Would that be the right thing to do, or would it also prevent Googlebot from crawling the "browse" page itself? What should I do to get the optimal result?