Google Search Console Errors and Warnings
One of the best things about Google Search Console is the valuable SEO feedback it provides. New errors and warnings have started to appear, especially since the switch from the old version of the tool to the new one.
In this article, I want to walk through how you can fix Search Console errors and warnings. Let's take an in-depth look at what can go wrong during indexing…
The Index Coverage report lists every issue Google encountered, along with the pages it has crawled and indexed. This feedback is more valuable than anything you will find in paid SEO tools.
In Google Search Console, you will see Index in the left-hand menu, with the Coverage section right below it. Here you can see the status of all your pages. The index status of each page is shown as one of four options:
Valid: The page is indexed.
Error: The page could not be indexed.
Warning: The page is indexed, but there are some problems.
Excluded: The page is not in the Google index, for reasons that are usually outside your direct control (e.g., discovered but not yet crawled, or crawled but not indexed).
If you still see a warning for an issue you have already fixed, check the last crawl date.
Errors
Server Error (5XX)
These pages return a 500-level error code to Google, which indicates a problem on the server side. How to Fix?: Manually check whether the page loads; if it does, the problem will usually resolve itself. Beyond that there is not much the site owner can do directly: check your uptime and report the situation to your hosting provider.
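To see whether the error has cleared, you can request the flagged URL yourself and inspect the status code. A minimal sketch assuming the third-party `requests` package is installed; the URL is hypothetical:

```python
# Check whether a flagged URL still returns a 5xx server error.
# Requires `requests` (pip install requests); the URL is an example.
import requests

resp = requests.get("https://example.com/flagged-page")
if 500 <= resp.status_code < 600:
    print(f"Server error {resp.status_code} - check uptime and contact your host")
else:
    print(f"Page answered with {resp.status_code}; the problem may have resolved itself")
```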
Redirect Error
A redirect error may be due to any of the following:
Very long redirect chains,
Redirect loops,
Redirects pointing to an empty URL.
How to Fix?: Fix your redirect loops: look at the example URLs in the report, check which pages redirect where, and pass these pages on to your development team. A minimal script for tracing a chain by hand follows below.
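As a rough illustration, here is a small Python sketch that traces a redirect chain hop by hop and flags loops, overly long chains, and empty Location headers. It assumes the `requests` package is installed, and the start URL is made up:

```python
# Trace a redirect chain manually; flag loops, long chains, and
# empty Location headers. Requires `requests` (pip install requests).
import requests

url = "https://example.com/old-page"  # hypothetical starting URL
seen = []
while url not in seen and len(seen) < 10:
    seen.append(url)
    resp = requests.get(url, allow_redirects=False)
    if resp.status_code not in (301, 302, 303, 307, 308):
        print(f"Chain ends at {url} with status {resp.status_code}")
        break
    url = resp.headers.get("Location", "")
    # Note: a relative Location would need urllib.parse.urljoin in real tooling.
    if not url:
        print("Redirect with an empty Location header - fix this hop")
        break
else:
    print("Loop or 10+ hop chain:", " -> ".join(seen))
```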
The submitted URL was blocked by robots.txt
This error occurs when you submit a page for indexing but Googlebot cannot access it because of a directive in your robots.txt file. It is quite common on new or recently migrated sites. How to Fix?: Remove the rule that blocks crawling (the Disallow line) from your robots.txt file. To test this, use the robots.txt testing tool included in the older version of Search Console; the new Search Console does not pick up changes to your robots.txt file immediately, so I highly recommend the testing tool.
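If you prefer to test from the command line, Python's standard library can evaluate a live robots.txt against a given user agent. A minimal sketch, with illustrative URLs:

```python
# Ask whether Googlebot may fetch a URL under your live robots.txt,
# using only the Python standard library. URLs are illustrative.
from urllib.robotparser import RobotFileParser

rp = RobotFileParser()
rp.set_url("https://example.com/robots.txt")
rp.read()

url = "https://example.com/submitted-page"
if rp.can_fetch("Googlebot", url):
    print("Googlebot is allowed to crawl", url)
else:
    print("Blocked by robots.txt - remove the matching Disallow rule")
```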
The submitted URL is marked as “noindex”
This error occurs when you submit a page for indexing but Google cannot index it because the page contains a “noindex” meta tag. How to Fix?: Remove the “noindex” meta tag from the pages you want indexed. Additionally, if you use a ready-made CMS such as WordPress with an XML sitemap plugin, uncheck the "Include sitemap in HTML format" option. Leave the tag in place only on pages you do not want indexed.
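To spot the tag quickly, you can scan a page's HTML for a robots meta directive. A small sketch using Python's standard HTML parser plus the `requests` package; the URL is an example:

```python
# Detect a <meta name="robots" content="...noindex..."> tag on a page.
# Requires `requests`; the URL below is hypothetical.
from html.parser import HTMLParser
import requests

class RobotsMetaFinder(HTMLParser):
    def __init__(self):
        super().__init__()
        self.noindex = False
    def handle_starttag(self, tag, attrs):
        a = dict(attrs)
        if tag == "meta" and a.get("name", "").lower() == "robots":
            if "noindex" in a.get("content", "").lower():
                self.noindex = True

resp = requests.get("https://example.com/submitted-page")
finder = RobotsMetaFinder()
finder.feed(resp.text)
print("noindex present" if finder.noindex else "no robots noindex tag found")
```

Note that a noindex can also be sent as an X-Robots-Tag HTTP response header, which this sketch does not cover.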
The submitted URL appears to be a Soft 404
A soft 404 is a page that returns a success code to the user even though it is actually an error page or does not contain the expected content. Google may also view these pages as a waste of crawl budget. How to Fix?: If your pages no longer exist, configure them to return a 404 (not found) or 410 (gone) response code. Consider creating a customized 404 page.
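A quick way to confirm the fix is to request a removed URL and look at the status code it actually returns. A sketch with the `requests` package and a hypothetical URL:

```python
# A removed page should answer 404 or 410, not 200 with error content.
# Requires `requests`; the URL below is hypothetical.
import requests

resp = requests.get("https://example.com/deleted-page", allow_redirects=False)
if resp.status_code == 200:
    print("Possible soft 404: the server says 200 OK for a missing page")
else:
    print(f"Status {resp.status_code} - Google will treat the page as gone")
```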
The submitted URL returns an unauthorized request (401) error
This error usually occurs when Google visits password-protected pages that it is not authorized to crawl. How to Fix?: Remove the password protection or authorization requirement from those pages so that Googlebot can access them.
The submitted URL was not found (404)
This error occurs when you delete a page from the site but do not update your sitemaps. How to Fix?: Keep your sitemaps clean and up to date. A sketch of an automated sitemap check follows below.
The submitted URL has a crawl issue
This means the URL has some other problem that does not fit the categories above. How to Fix?: Use the URL Inspection Tool to test whether Google sees something different from what a normal visitor sees. One potential cause is JavaScript: if you rely heavily on JS to load resources, most search engines may ignore that content. Long loading times are another possible cause.
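Here is the sitemap check referenced above: it fetches an XML sitemap and verifies that every listed URL still answers 200. It assumes `requests` is installed and uses an example sitemap URL; some servers reject HEAD requests, in which case a GET would be needed:

```python
# Fetch an XML sitemap and flag every listed URL that no longer
# answers 200. Requires `requests`; the sitemap URL is an example.
import xml.etree.ElementTree as ET
import requests

SITEMAP = "https://example.com/sitemap.xml"
NS = {"sm": "http://www.sitemaps.org/schemas/sitemap/0.9"}

root = ET.fromstring(requests.get(SITEMAP).content)
for loc in root.findall(".//sm:loc", NS):
    url = loc.text.strip()
    status = requests.head(url, allow_redirects=False).status_code
    if status != 200:
        print(f"{status}  {url}  <- remove or fix this sitemap entry")
```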
Warnings
Indexed, though blocked by robots.txt
The page was blocked by robots.txt but added to the index anyway. Although Google generally respects the robots.txt file, it does not always keep a blocked page out of the index, for example when other sites link to it. Pages blocked before they were ever indexed are usually not a problem; the warning tends to appear for pages blocked only after they had been indexed. How to Fix?: If you want the page removed from the index, first unblock it in robots.txt, then remove it from the index with a "noindex" directive. The biggest mistake is to block the page with robots.txt and wait for it to drop out of the index; since Googlebot cannot access the page, it never sees the "noindex" and the page stays indexed.
Valid
Indexed, not submitted in sitemap
These pages are indexed, but the URLs are not listed in your sitemap. How to Fix?: Add the URLs you consider important to your sitemaps.
Excluded
Pages with Excluded status are not indexed and are generally kept out of the Google index. Pages you want indexed, but that Google deems unsuitable, can also fall into this category.
Excluded by the "noindex" tag
A "noindex" meta tag on the page prevents indexing, so the relevant page(s) are not indexed. How to Fix?: Remove the “noindex” meta tag and wait for the page to be crawled again.
Crawl anomaly
An unspecified error occurred when Google tried to fetch the URL. How to Fix?: This may be caused by your server, or the page may be part of a redirect chain, such as a URL redirecting to a 404 page, or an actual 404 page. You can see the exact response with the URL Inspection Tool.
Crawled – currently not indexed
These pages were crawled by Googlebot without any problems but have not been indexed. How to Fix?: The page may or may not be indexed at a later stage, and you do not need to request indexing again. Unfortunately Google's resources are not unlimited, so this can simply take time, and it may also depend on whether Google considers the page high quality.
Discovered – currently not indexed
Google has found the page but has not crawled it yet, usually because the site was overloaded at that moment and Googlebot postponed the visit. The page may well be important enough to crawl, so make sure Googlebot does not hit the same problem when it returns. How to Fix?: Check your site's bandwidth and make sure the server can handle enough traffic.
Redirected page
The URL could not be indexed because it is a redirect (301, 302, etc.). How to Fix?: Check that it redirects properly to the new URL, and do not include redirecting URLs in your sitemaps.
Duplicate, the submitted URL was not selected as canonical
This status shows URLs that you submitted as the canonical version but that Google did not select as canonical. Google considers these URLs duplicates of other URLs and does not index them; it is especially common on e-commerce sites. How to Fix?: Avoid publishing pages as near copies of each other, and provide as much unique value as possible on every page. If you do publish very similar pages separately, make their relationship explicit with canonical tags. For example, for red, blue and yellow variants of the same sweater, you can point all variants at a single canonical URL.
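To verify this on your own pages, you can extract the rel="canonical" target from each variant and confirm they all agree. A minimal sketch using Python's standard HTML parser and the `requests` package; the domain and product slugs are made up:

```python
# Extract each page's rel="canonical" target; all color variants of
# one product should report the same URL. Requires `requests`.
from html.parser import HTMLParser
import requests

class CanonicalFinder(HTMLParser):
    def __init__(self):
        super().__init__()
        self.href = None
    def handle_starttag(self, tag, attrs):
        a = dict(attrs)
        if tag == "link" and "canonical" in a.get("rel", "").lower().split():
            self.href = a.get("href")

for slug in ("red-sweater", "blue-sweater", "yellow-sweater"):  # example slugs
    resp = requests.get(f"https://example.com/{slug}")
    finder = CanonicalFinder()
    finder.feed(resp.text)
    print(slug, "->", finder.href)  # all three should print one canonical URL
```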
Alternative page with the correct canonical tag
These pages point to the correct canonical URL and are therefore not indexed themselves. How to Fix?: Nothing extra is needed, as long as your canonical tags are used correctly.
Duplicate, Google chose a different canonical than the user
Even though these pages declare a canonical, Google selects a different URL whenever it decides another one is better. How to Fix?: Choose your canonical pages carefully and reduce the number of near-duplicate pages. After correcting the URLs, go back to the Index Coverage report and click the 'Validate Fix' button.
Don't forget to validate your corrections. Google will start verifying your fix and show the status as 'Validation started'. Once Google completes the verification process, the status changes to 'Passed' if the fix was successful; you will see a success message in the report and also receive an e-mail notification. Monitoring the status of your indexed pages should be a regular, even daily, SEO task. Fixing these errors can also pay off in Bing and Yandex.