
What code stops search engines from crawling a site?

Discussion in 'Search Engine Optimization (SEO)' started by infoway USA, Jul 20, 2010.

  1. infoway USA

    infoway USA New Member

    What is the code through which I can ask any search engine not to crawl my site, or part of it?
     
  2. shabbir

    shabbir Administrator Staff Member

    Multiple ways.

    1. In robots.txt, add the following lines to block all crawlers from the whole site (a partial-site example follows after this list):
    User-agent: *
    Disallow: /

    2. In the page's <head> you can add a robots meta tag that tells crawlers not to index the page or follow its links (placement sketch after this list):

    <meta name="robots" content="noindex,nofollow" />

    3. Last but by no means least, password-protect the directories using Apache (a sketch follows below).
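
    Since the question also covers blocking only part of a site, here is a minimal robots.txt sketch for option 1. The directory names are placeholders, not from the original post, and the file must sit at the site root (e.g. example.com/robots.txt):

    # block every compliant crawler from these directories only
    User-agent: *
    Disallow: /private/
    Disallow: /tmp/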
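
    For option 2, the meta tag goes inside the <head> of each page you want kept out of the index; a minimal sketch (the surrounding page is just an illustration):

    <html>
    <head>
        <!-- tell compliant crawlers not to index this page or follow its links -->
        <meta name="robots" content="noindex,nofollow" />
        <title>Example page</title>
    </head>
    <body>...</body>
    </html>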
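
    For option 3, a common way is HTTP Basic Auth via a .htaccess file placed in the directory you want to protect. The path below is an illustrative assumption, and Apache must permit it (AllowOverride AuthConfig) for that directory:

    # .htaccess in the protected directory
    AuthType Basic
    AuthName "Restricted area"
    # password file created beforehand with: htpasswd -c /path/to/.htpasswd someuser
    AuthUserFile /path/to/.htpasswd
    Require valid-user

    Crawlers cannot fetch pages behind the login, so the content in that directory will not be crawled.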
     
  3. nabeelarkisdata

    nabeelarkisdata New Member

    Disallow: / is the code. Placed in robots.txt under User-agent: *, it prevents search engines from crawling your site.
     
