What code stops search engines from crawling a site?

Discussion in 'Search Engine Optimization (SEO)' started by infoway USA, Jul 20, 2010.

  1. infoway USA

    infoway USA New Member

    Joined:
    Jun 14, 2010
    Messages:
    34
    Likes Received:
    0
    Trophy Points:
    0
    Occupation:
    Webmaster
    Location:
    West Hills, CA, USA
    Home Page:
    http://www.infoway.us/
    What is the code through which I can ask any search engine not to crawl my site, or part of it?
     
  2. shabbir

    shabbir Administrator Staff Member

    Joined:
    Jul 12, 2004
    Messages:
    15,335
    Likes Received:
    377
    Trophy Points:
    83
    Multiple ways.

    1. In robots.txt, add the following lines:
    User-agent: *
    Disallow: /
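
    Since the question also asks about blocking only part of a site, robots.txt can disallow specific directories instead of everything; the directory names below are just examples:

    ```
    User-agent: *
    Disallow: /private/
    Disallow: /admin/
    ```

    Everything not listed under a Disallow rule stays crawlable. Note that robots.txt is a request, not access control; well-behaved crawlers honor it, but it does not hide the content.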

    2. In the page's head section, you can add a robots meta tag telling crawlers not to index the page or follow its links:

    <meta name="robots" content="noindex,nofollow" />
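
    For non-HTML files such as PDFs, where a meta tag is not possible, the same directive can be sent as an HTTP response header instead. A sketch using Apache config, assuming mod_headers is enabled:

    ```apache
    # Send a noindex directive for all PDF files
    <FilesMatch "\.pdf$">
        Header set X-Robots-Tag "noindex, nofollow"
    </FilesMatch>
    ```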

    3. Last but by no means least, password-protect the directories using Apache authentication; search engines cannot crawl pages they cannot access.
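
    A minimal .htaccess sketch for option 3, assuming a password file has already been created with the htpasswd tool (the file path is an example):

    ```apache
    # Require a valid username/password to access this directory
    AuthType Basic
    AuthName "Restricted Area"
    AuthUserFile /home/user/.htpasswd
    Require valid-user
    ```

    Unlike robots.txt or a meta tag, this actually blocks access rather than merely asking crawlers to stay away.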
     
  3. nabeelarkisdata

    nabeelarkisdata New Member

    Joined:
    Jun 19, 2010
    Messages:
    11
    Likes Received:
    0
    Trophy Points:
    0
    Occupation:
    Webmaster
    Location:
    Mauritius
    Home Page:
    http://rajeevkistoo.com/
    Disallow: / is the directive. Placed under a User-agent line in robots.txt, it asks search engines not to crawl your site.
     
