shabbir (Go4Expert Founder), Join Date: Jul 2004
Quote:
Originally Posted by Click SSL View Post
I still don't quite understand how it helps with blocking.
If you add rules like
Code:
User-agent: *
Disallow: /
then every compliant crawler is blocked from crawling any page on the site. That is how robots.txt is used for blocking: the `User-agent` line says which bots the rule applies to (`*` means all of them), and each `Disallow` line names a path they must not fetch.
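As a sketch of how a compliant crawler interprets those two lines, Python's standard `urllib.robotparser` module can parse the rules and answer "may this bot fetch this URL?" (the domain `example.com` below is just a placeholder):

```python
from urllib import robotparser

# A robots.txt containing "Disallow: /" under "User-agent: *"
# blocks every compliant crawler from every path on the site.
rp = robotparser.RobotFileParser()
rp.parse([
    "User-agent: *",
    "Disallow: /",
])

# No URL on the site may be fetched by any bot that honors robots.txt.
print(rp.can_fetch("Googlebot", "https://example.com/"))          # False
print(rp.can_fetch("Googlebot", "https://example.com/any/page"))  # False
```

Note that robots.txt is advisory: well-behaved crawlers check it before fetching, but it does not technically prevent access by bots that ignore it.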
shiv2011 (Light Poster), Join Date: Jun 2011
It's very important because it lets us exclude content that we don't want crawled.
linkbuilding5 (Contributor), Join Date: May 2011
The robots.txt file is where you allow or disallow crawlers from accessing individual pages or directories on your site.
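To illustrate per-path allowing and disallowing, here is a minimal sketch using Python's `urllib.robotparser` with a hypothetical rule set that blocks only a `/private/` directory and leaves everything else open (`example.com` and the paths are placeholders):

```python
from urllib import robotparser

# Hypothetical robots.txt: block /private/ for all bots, allow the rest.
rp = robotparser.RobotFileParser()
rp.parse([
    "User-agent: *",
    "Disallow: /private/",
])

# Paths outside /private/ are crawlable; paths inside it are not.
print(rp.can_fetch("*", "https://example.com/public/page.html"))   # True
print(rp.can_fetch("*", "https://example.com/private/page.html"))  # False
```

Anything not matched by a `Disallow` rule is allowed by default, so you only need to list the paths you want to keep out of the index.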