I know of sites that are still doing very well despite doing things that, according to the most popular theories, should have caused them problems with Penguin.
Perhaps they will be hit in the next update.
What you are saying is fine, but in my opinion simply stopping optimization is not going to be enough; the evidence from the first Penguin update suggests it is necessary to undo what has already been done.
My question is: how do sites go about that?