How to stop old 404s from showing up in your crawl errors report

If your 404 error URL is meant to be long gone, let it die. Just ignore it, as Google recommends. But to stop it from showing up in your crawl errors report, you’ll need to do a few more things.

As yet another indication of the power of links, Google will only show a 404 error in the first place if your site or an external website is linking to the 404 page.

In other words, if I type in your-website-name.com/unicorn-boogers, it won’t show up in your crawl errors dashboard unless I also link to it from my website.

To find the links to your 404 page, go to the Crawl Errors > URL Errors section:

Then click on the URL you want to fix.

Search your page for the link. It’s often faster to view the source code of the page and find the link in question there.
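If you have more than a handful of pages to check, a short script can scan the source for you. This is just a minimal sketch, not an official tool: the page URL and dead URL are hypothetical placeholders to swap for your own, and it assumes the requests and beautifulsoup4 packages are installed.

```python
import requests
from bs4 import BeautifulSoup

# Hypothetical URLs -- replace with your own page and dead 404 URL
PAGE_URL = "https://your-website-name.com/some-page"
DEAD_URL = "https://your-website-name.com/unicorn-boogers"

html = requests.get(PAGE_URL, timeout=10).text
soup = BeautifulSoup(html, "html.parser")

# Flag every anchor tag whose href points at the dead URL
for a in soup.find_all("a", href=True):
    if DEAD_URL in a["href"]:
        print(f"Link to dead URL found: {a['href']} "
              f"(anchor text: {a.get_text(strip=True)!r})")
```

Run it once per page that Search Console lists as linking to the error URL, then delete each link it flags.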

It’s painstaking work, but if you really want to stop old 404s from showing up in your dashboard, you’ll have to remove the links to that page from every page linking to it, even pages on other websites.

What’s really fun (not) is if you’re getting links pointed to your URL from old sitemaps. You’ll have to let those old sitemaps 404 in order to totally remove them. Don’t redirect them to your live sitemap.
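A quick way to confirm an old sitemap really returns a hard 404 (and isn’t quietly redirecting to your live sitemap) is to request it without following redirects. A minimal sketch, assuming the requests package and a hypothetical old sitemap path:

```python
import requests

# Hypothetical old sitemap URL -- replace with yours
OLD_SITEMAP = "https://your-website-name.com/old-sitemap.xml"

# allow_redirects=False lets us see a redirect status instead of its target
resp = requests.get(OLD_SITEMAP, allow_redirects=False, timeout=10)

if resp.status_code == 404:
    print("Good: the old sitemap returns a hard 404.")
elif resp.status_code in (301, 302, 307, 308):
    print(f"Problem: it redirects to {resp.headers.get('Location')} "
          "-- remove that redirect.")
else:
    print(f"Unexpected status: {resp.status_code}")
```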

C) Access denied

Access denied means Googlebot can’t crawl the page. Unlike a 404, where Googlebot can reach the URL and simply finds nothing there, an access denied error means Googlebot is prevented from crawling the page in the first place.

What they mean
Access denied errors commonly occur when Googlebot is blocked in one of these ways:

You require users to log in to see a URL on your site, so Googlebot is blocked
Your robots.txt file blocks Googlebot from individual URLs, whole folders, or your entire site (see the sketch after this list)
Your hosting provider is blocking Googlebot from your site, or the server requires users to authenticate by proxy
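If you suspect robots.txt is the culprit, you can test any URL against your live robots.txt with Python’s standard-library robot parser. A minimal sketch; the domain and URL below are hypothetical placeholders:

```python
from urllib.robotparser import RobotFileParser

# Hypothetical site -- replace with your own domain
rp = RobotFileParser("https://your-website-name.com/robots.txt")
rp.read()  # fetch and parse the live robots.txt

# Hypothetical URL you saw in the access denied report
url = "https://your-website-name.com/members/profile"

# Check both the Googlebot-specific rules and the catch-all rules
for agent in ("Googlebot", "*"):
    allowed = rp.can_fetch(agent, url)
    print(f"{agent}: {'allowed' if allowed else 'BLOCKED'} for {url}")
```

If the script reports BLOCKED for a page you want indexed, loosen or remove the matching Disallow rule in robots.txt.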