ozsubasi, Join Date: Jan 2012
Invasive contributor
My understanding is that Google would also see the 403 error page, just as it does for a 404.
The idea is that either the site owner will remove the link (because they will not want broken links), or Google will eventually de-index it, as it does with pages that 404. But as I said earlier, if Google does not crawl the page where the link is located, it really doesn't help from that point of view.
But should the site put up any further links to mine, Google will not be able to follow them and therefore I can at least prevent it from happening again.
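For illustration, the kind of referrer-based blocking being discussed could be sketched as follows. This is a hypothetical sketch, not the poster's actual configuration; the domain name and function are made up for the example:

```python
from urllib.parse import urlparse

# Hypothetical set of referring domains to block (illustrative name).
BLOCKED_REFERRERS = {"spammy-directory.example"}

def status_for_request(referer):
    """Return the HTTP status the server would send for a request,
    given the value of its Referer header (None if absent)."""
    if referer:
        host = urlparse(referer).netloc
        if host in BLOCKED_REFERRERS:
            return 403  # visitor arrived by following the unwanted link
    return 200  # all other requests are served normally
```

Note that under this rule a request carrying no Referer header at all is served normally; only visitors who arrive by clicking the unwanted link receive the 403.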
shabbir, Join Date: Jul 2004
Go4Expert Founder
Google will not see the 403 unless it follows the link and indexes your site that way, and Google never crawls like that, otherwise it would never finish indexing any given page.

So if the bot is on site A and finds links to sites A1, A2 ... A10 on site A, it will not follow each link immediately. It will index site A, add the links A1 to A10 to a queue of pages to be visited, and then visit and index each of those links later. (Assume, for simplicity, that site A and A1 to A10 each have one page.)

So Googlebot will not visit site A1 with a referral from site A or page A.

So Google will always be able to index those pages, but users following the link will not reach them.
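The crawl behaviour described above can be sketched as a breadth-first frontier queue: index a page, queue its links, then fetch each queued URL directly, with no referrer carried along. The link graph and site names here are illustrative, matching the A/A1..A10 example:

```python
from collections import deque

# Illustrative link graph: site A links to A1 and A2.
LINKS = {"A": ["A1", "A2"], "A1": [], "A2": []}

def crawl(start):
    """Breadth-first crawl: index a page, queue its outgoing links,
    then fetch each queued URL directly (no Referer header is sent)."""
    frontier = deque([start])
    seen = {start}
    visited = []
    while frontier:
        page = frontier.popleft()
        visited.append(page)           # "index" the page
        for link in LINKS.get(page, []):
            if link not in seen:
                seen.add(link)
                frontier.append(link)  # fetched later, without a referrer
    return visited
```

Because the queued fetches carry no referrer, a server rule that returns 403 only to requests referred from site A would never fire for the crawler.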
ozsubasi, Join Date: Jan 2012
Invasive contributor
I understand what you are saying: once Google has followed a link, it is difficult to tell it not to follow it any more.
Google will at some point revisit the pages where the links are placed and attempt to follow the links from them. But it is, after all, only a machine; although it may have followed the link before, now that the target returns a 403, it will find the link broken if it tries to follow it again.
I have a number of pages on my site which are now 404 because I have deleted them. After trying the links to them a number of times, Google will eventually de-index them. The same theory applies to a 403, except that instead of de-indexing the page the links point to, it can only de-index the link itself, because it no longer knows where the page is.
This theory may not work, but the point is that we are very limited in what we can try to do to disconnect our sites from links that could potentially be detrimental. I have not found any evidence that this kind of action can cause any harm, so for me it is a case of doing something rather than nothing and if it helps then it is a bonus.

Last edited by ozsubasi; 24 Sep 2012 at 11:36.
shabbir, Join Date: Jul 2004
Go4Expert Founder
Quote:
Originally Posted by ozsubasi
I have not found any evidence that this kind of action can cause any harm, so for me it is a case of doing something rather than nothing and if it helps then it is a bonus.
Doing something is good, but in my view you are only blocking visitors, not Googlebot by any chance. I would love to see the results, because what we are saying right now is what could happen, not what is actually happening.
ozsubasi, Join Date: Jan 2012
Invasive contributor
If a visitor tries to follow the link they will get a 403, and so will Googlebot.
How can Googlebot get a different response? The response comes from my server; it cannot get anything different.
If I make a page return a 404 by deleting it, Google will not remove it from its index for many months, and I believe the same applies to a 403.
What I am talking about in this thread is what sites can possibly do to try to remove potential problems, while accepting that none of us has definite answers; it can only be opinion.
Even if a link does finally get removed from Webmaster Tools, there is no way of knowing for sure what the reason was, i.e. whether it is something we have done.
ozsubasi, Join Date: Jan 2012
Invasive contributor
As the disavow tool has now been launched and there is a new thread about it here:
http://www.go4expert.com/showthread.php?t=29203
this thread is now closed.