How Can Developers Use Search Console?

Team TypeStack
Dec 05, 2022 · 7 min read

In this blog, we'll talk about how to use Search Console if you're a developer. Because the tool surfaces so much search-optimization data, developers may assume it isn't meant for them, but that's not the case at all. On the contrary, Search Console is highly recommended for developers.

Specifically, we'll cover the most useful Search Console reports for building healthy, findable websites that are optimized for Google Search. In a nutshell: use the Index Coverage Report to understand sitewide search indexing issues, use the URL Inspection tool to debug page-level search indexing issues, use the Security Issues Report to find and fix issues affecting your site, and use the Core Web Vitals Report to ensure your website provides a great page experience for your users.

The first thing to check is whether Google can find and crawl your pages. Small glitches can have a massive effect on whether Googlebot can read a website. For instance, companies sometimes accidentally add noindex tags to an entire website, or block content from being crawled through an error in their robots.txt file. These issues can be easily uncovered using the Index Coverage Report.
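These mistakes are also easy to catch with a quick script of your own. Here's a minimal sketch in Python that checks both conditions for a single page; the URL is a placeholder, and the noindex check is a crude string match rather than real HTML parsing:

```python
# Minimal sketch: is this page crawlable and indexable?
# Assumes the third-party `requests` library; the URL is a placeholder.
import requests
from urllib import robotparser
from urllib.parse import urlparse

url = "https://www.example.com/some-page"
parsed = urlparse(url)

# 1. Does robots.txt allow Googlebot to crawl the URL?
robots = robotparser.RobotFileParser(f"{parsed.scheme}://{parsed.netloc}/robots.txt")
robots.read()
print("Crawlable by Googlebot:", robots.can_fetch("Googlebot", url))

# 2. Does the page carry a noindex directive (meta tag or HTTP header)?
response = requests.get(url, timeout=10)
meta_noindex = "noindex" in response.text and 'name="robots"' in response.text
header_noindex = "noindex" in response.headers.get("X-Robots-Tag", "").lower()
print("noindex found:", meta_noindex or header_noindex)
```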

When you open the report, the first page you see is the summary page. The default view shows indexing errors on your website, but you can click to also show pages that are valid with warnings, valid, or excluded. Additionally, you'll find a checkbox that overlays on the main chart the number of impressions your pages have received in Search.

Errors prevent pages from being indexed. Pages with errors won't appear on Google, which can mean lost traffic for your website. For instance, a page might be returning a 404 or a 500-level error. You might also get an error if you submit a page through a sitemap but the page contains a noindex directive, which would prevent it from appearing in search results. "Valid with warnings" covers pages that may or may not be shown on Google, depending on the issue, but where Google has flagged a problem you should look into. For example, Google might index pages even though they are blocked by robots.txt.
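You can also run this kind of audit yourself, before the report catches it. Below is a rough sketch that walks a sitemap and flags both problems mentioned above, URLs returning 4xx/5xx errors and URLs submitted for indexing that also serve a noindex header. The sitemap URL is a placeholder, and only the HTTP-header form of noindex is checked:

```python
# Sketch: flag sitemap URLs that return errors or carry a noindex header.
# Assumes the third-party `requests` library; the sitemap URL is a placeholder.
import xml.etree.ElementTree as ET
import requests

SITEMAP_URL = "https://www.example.com/sitemap.xml"
NS = {"sm": "http://www.sitemaps.org/schemas/sitemap/0.9"}

sitemap = requests.get(SITEMAP_URL, timeout=10)
root = ET.fromstring(sitemap.content)

for loc in root.findall(".//sm:loc", NS):
    url = loc.text.strip()
    # HEAD keeps the audit fast; some servers may require GET instead.
    resp = requests.head(url, allow_redirects=True, timeout=10)
    if resp.status_code >= 400:
        print(f"{resp.status_code} error: {url}")
    elif "noindex" in resp.headers.get("X-Robots-Tag", "").lower():
        print(f"Submitted but noindex: {url}")
```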

If you do want to block a page, robots.txt is not the right way to keep it out of the index, because a page Google can't crawl is a page where Google never sees the directive. Instead, you should either serve a noindex directive or require authentication to view the page. Valid pages are indexed and will be shown in Google Search. And excluded pages won't be indexed, meaning they won't appear on Google, either because Google assumes that is your intention (for example, the page has a noindex directive: your choice) or because Google decided it's the right thing to do (for example, the page is a duplicate of another page: Google's choice).
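For illustration, here's a minimal sketch of both ways to serve a noindex directive, using a tiny Flask app. Flask is purely an assumption for the example; any server or framework can set the same header or render the same tag:

```python
# Sketch: two correct ways to keep a page out of Google's index.
# Flask is assumed only for illustration; the routes are placeholders.
from flask import Flask, make_response

app = Flask(__name__)

@app.route("/internal-report")
def internal_report():
    # Option 1: noindex via the X-Robots-Tag HTTP response header.
    resp = make_response("<h1>Internal report</h1>")
    resp.headers["X-Robots-Tag"] = "noindex"
    return resp

@app.route("/draft")
def draft():
    # Option 2: noindex via a robots meta tag in the HTML itself.
    return '<meta name="robots" content="noindex"><h1>Draft</h1>'
```

Either way, the page must remain crawlable so that Google can actually see the directive.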

To debug an issue with a specific page, for example, a page Google is flagging with an error in the Coverage Report, you should use the URL Inspection tool. You can use it to learn the current index status of a page, test the live URL, ask Google to crawl a specific page, and see detailed information about the page's loaded resources. You can access the tool from the top of the sidebar, and you'll also see a little magnifying glass next to URLs in some reports. For instance, if you drill down to a URL from the Index Coverage Report, you can click to inspect it.

Once the page displays the results, you'll see three different sections, all of them presenting information from Google's last crawl or crawl attempt. If you recently made changes to the page, you might want to check whether they're working as intended by clicking Test Live URL and comparing the live version to the indexed one. In the "Presence on Google" section, you'll get a verdict on whether or not the URL can appear in Google search results. There are two important options available for developers:

1. If you changed the page and want Google to re-index it, use Request Indexing.

2. You can click View Crawled Page to check the HTML version that Google indexed, along with more information about the HTTP response and loaded resources.

In the Coverage section, you'll learn where the page was discovered (such as a sitemap or a referring page), when the last crawl happened and by which user agent, and whether the page is included in the Google index or another version of it was chosen as the canonical.
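If you need this information for many URLs, Search Console also exposes the URL Inspection tool programmatically through its API. Here's a hedged sketch using the google-api-python-client library; it assumes a service account that has been granted access to the property, and the property and page URLs are placeholders:

```python
# Sketch: query the Search Console URL Inspection API for one page.
# Assumes google-api-python-client and a service account with access
# to the property; all URLs and file names are placeholders.
from google.oauth2 import service_account
from googleapiclient.discovery import build

creds = service_account.Credentials.from_service_account_file(
    "service-account.json",
    scopes=["https://www.googleapis.com/auth/webmasters.readonly"],
)
service = build("searchconsole", "v1", credentials=creds)

result = service.urlInspection().index().inspect(body={
    "inspectionUrl": "https://www.example.com/some-page",
    "siteUrl": "sc-domain:example.com",  # or a URL-prefix property
}).execute()

status = result["inspectionResult"]["indexStatusResult"]
print(status.get("coverageState"))    # e.g. "Submitted and indexed"
print(status.get("lastCrawlTime"))    # when Google last crawled the page
print(status.get("googleCanonical"))  # the canonical URL Google chose
```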

You'll find structured data details in the Enhancements section, along with AMP and mobile usability warnings and errors. For example, the inspection will return an error about missing or wrong values if your page is not properly marked up with structured data.
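You can also sanity-check that your structured data at least parses as valid JSON-LD before you deploy, so the Enhancements section isn't your first line of defense. A rough sketch follows; the URL is a placeholder, and the regex extraction is deliberately crude (a real check would use an HTML parser or the Rich Results Test):

```python
# Sketch: verify that a page's JSON-LD structured data parses.
# Assumes the third-party `requests` library; the URL is a placeholder.
import json
import re
import requests

html = requests.get("https://www.example.com/product", timeout=10).text

# Crude extraction of <script type="application/ld+json"> blocks.
blocks = re.findall(
    r'<script[^>]*type="application/ld\+json"[^>]*>(.*?)</script>',
    html, re.DOTALL | re.IGNORECASE,
)

for raw in blocks:
    try:
        data = json.loads(raw)
        kind = data.get("@type") if isinstance(data, dict) else "list"
        print("Valid JSON-LD block:", kind)
    except json.JSONDecodeError as err:
        print("Broken JSON-LD block:", err)
```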

Search Console also includes two reports that will help you look after your site's health: Security Issues and Core Web Vitals. The Security Issues Report shows warnings when Google finds that your site might have been hacked or used in ways that could harm a visitor or their device. For example, a hacker might inject malicious code into your pages to redirect your users to another site, or automatically create pages filled with nonsensical, keyword-stuffed sentences. These are examples of website hacking.

An attacker might also trick users into doing something dangerous, such as revealing confidential information or downloading malicious software. That's called social engineering. When you log in to Search Console, you'll already be notified on the Overview page if your site has security issues. Clicking the alert will lead you to the Security Issues Report, where you'll find a list of all security issues Google found on your website.

In the report, you'll find more details about the type of threat, a sample of the pages affected by it, and a process for letting Google know once you've fixed the issues.

Lastly, the Core Web Vitals Report shows how your pages perform based on real-world usage data, sometimes called field data. The report is based on three metrics. LCP, or Largest Contentful Paint, is the amount of time it takes to render the largest content element visible in the viewport, starting from when the user requests the URL. This is important because it tells the user that the URL is actually loading. FID, or First Input Delay, is the time from when a user first interacts with your page, like clicking a link or tapping a button, to the time when the browser responds to that interaction. This is important on pages where the user needs to do something, because this is when the page has become interactive.

CLS, or Cumulative Layout Shift, is the amount that the page layout shifts during the loading phase. The score is rated from 0 to 1, where 0 means no shifting and 1 means the most shifting. This is important because having elements shift while a user is trying to interact with the page is very annoying.

Log in to Search Console and navigate to the Core Web Vitals Report. You'll see that the report is broken down by mobile and desktop. Open one of them to see an aggregate report with more details. By default, the chart shows trends for pages with poor performance, but you can click Need Improvement and Good to check those trends too. Be aware that if a page does not have a minimum amount of reporting data for any of these metrics, it is omitted from the report, so you probably won't see all of your pages. Click an issue in the table to drill down; there you'll find more details about the issue, a chart showing trends, and a table with sample URLs.
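The field data behind this report comes from the Chrome UX Report (CrUX), which has a public API, so you can pull the same three metrics from a script. A minimal sketch, assuming a CrUX API key (the key and URL are placeholders):

```python
# Sketch: fetch real-user Core Web Vitals from the CrUX API.
# Assumes the third-party `requests` library; key and URL are placeholders.
import requests

API_KEY = "YOUR_API_KEY"  # placeholder
endpoint = f"https://chromeuxreport.googleapis.com/v1/records:queryRecord?key={API_KEY}"

resp = requests.post(endpoint, json={
    "url": "https://www.example.com/",
    "metrics": [
        "largest_contentful_paint",
        "first_input_delay",
        "cumulative_layout_shift",
    ],
})
metrics = resp.json()["record"]["metrics"]

# p75 is the 75th-percentile value across real Chrome users.
for name, data in metrics.items():
    print(name, "p75:", data["percentiles"]["p75"])
```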

You might also like to check out tools such as Lighthouse before you deploy changes to production.
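For example, here's a minimal sketch that runs Lighthouse from a script as a pre-deploy check. It assumes the Lighthouse CLI has been installed separately (npm install -g lighthouse), and the staging URL is a placeholder:

```python
# Sketch: run Lighthouse as a pre-deploy check and print category scores.
# Assumes the Lighthouse CLI is installed; the URL is a placeholder.
import json
import subprocess

url = "https://staging.example.com/"

subprocess.run([
    "lighthouse", url,
    "--output=json",
    "--output-path=report.json",
    "--chrome-flags=--headless",
], check=True)

with open("report.json") as fh:
    report = json.load(fh)

# Lighthouse reports category scores on a 0-1 scale.
for name, category in report["categories"].items():
    print(name, category["score"])
```

Lab tools like Lighthouse complement the field data in the Core Web Vitals Report: they can't reproduce real-user conditions, but they catch regressions before users ever see them.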

To summarize: use the Index Coverage Report to understand sitewide search indexing issues. Use the URL Inspection tool to debug page-level search indexing issues. Use the Security Issues Report to find and fix threats affecting your site, and use the Core Web Vitals Report to make sure your website provides a great page experience for your users. With that, we've come to the end of this article; we hope it's helpful for you.
