If a visitor tries to follow the link they will get a 403, and so will Googlebot.
How could Googlebot get a different response? The response comes from my server, and Googlebot cannot receive anything different from what any other visitor gets.
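To make that concrete, here is a minimal sketch in Python (the /private path and the port are hypothetical, purely for illustration): the handler picks the status code from the request path alone, so a browser and Googlebot requesting the same blocked URL receive the identical 403.

from http.server import BaseHTTPRequestHandler, HTTPServer

BLOCKED_PATHS = {"/private"}  # hypothetical blocked URL for illustration

class Handler(BaseHTTPRequestHandler):
    def do_GET(self):
        # Nothing here inspects the User-Agent header: the status code
        # depends only on the requested path, so every client (human
        # visitor or Googlebot) gets the same answer for the same URL.
        if self.path in BLOCKED_PATHS:
            self.send_response(403)
            self.end_headers()
            self.wfile.write(b"403 Forbidden")
        else:
            self.send_response(200)
            self.end_headers()
            self.wfile.write(b"OK")

if __name__ == "__main__":
    HTTPServer(("", 8080), Handler).serve_forever()

Serving Googlebot something different would require the server to branch explicitly on the User-Agent or IP, which is cloaking, and nothing like that is happening here.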
If I make a page return a 404 by deleting it, Google will not remove it from its index for many months, and I believe the same applies to a 403.
What I am talking about in this thread is what sites could possibly do to try to remove potential problems, while accepting that none of us has definite answers; it can only be opinion.
Even if a link does finally disappear from Webmaster Tools, there is no way of knowing for sure what the reason was, i.e. whether it was something we did.