14 Important terms and definitions of Google Search Console
Google Search Console is a powerful tool offered by Google to webmasters. It lets you analyse your website by checking its index status and optimising its visibility in search. If the data is handled and interpreted correctly, it can bring significant gains to your website. But if you're new to Search Console, the terms and data can be confusing. So I have listed the 14 most important terms and definitions of Search Console, which will help you learn and understand the webmaster tools better.
Note that Search Console now has a beta version. Google is still developing it and adding features, so at present you can access both versions of the Webmaster tools.
Here are the 14 terms and definitions of Search Console:
#1. hreflang tags:
The hreflang tag tells Google which language you are using on a specific page, so the search engine can serve that page to users searching in that language. For example, suppose someone searches for a blog on SEO in French or Spanish. Websites with hreflang tags set up for French/Spanish will show up in the results. This becomes important for users searching in a language different from your website's homepage language. A common example of the hreflang tag is below.
<link rel="alternate" href="http://example.com" hreflang="en-us" />
So what happens when you use this tag on your pages is that it signals Google that a user searching in language "x" will want this result instead of a page with similar content in language "y". So if you have a page in Spanish, you can signal Google to show that page in search results whenever a user searches in Spanish, even though your homepage is in English.
The benefit of using the hreflang tag is that it reduces bounce rate and increases conversions by making sure your target audience lands on the version of your page most appropriate for them.
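For instance, a page available in both English and Spanish might declare both versions plus a default. This is a minimal sketch; the URLs below are hypothetical:

```html
<!-- Placed in the <head> of both language versions; URLs are illustrative -->
<link rel="alternate" href="https://example.com/page" hreflang="en" />
<link rel="alternate" href="https://example.com/es/page" hreflang="es" />
<!-- Fallback for users whose language doesn't match any listed version -->
<link rel="alternate" href="https://example.com/page" hreflang="x-default" />
```

Each language version should list all the alternates, including itself, so the annotations confirm each other.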
#2. Structured data:
If Google understands the markup on your pages, it can use this information to add rich snippets and other features to your search results. For example, the search snippet for a restaurant might show its average review and price range. You can add structured data to your page using the schema.org vocabulary and formats such as Microdata and RDF, alongside other approaches such as Microformats. You can also add structured data by tagging the data on your page using Data Highlighter.
The Structured Data page in Search Console shows the structured information that Google was able to detect on your site. It also provides information about errors in page markup that may prevent rich snippets (or other search features) from being displayed.
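As an illustration, the restaurant example above could be marked up with schema.org in JSON-LD (another format Google accepts). The name, rating, and price range below are made up:

```html
<script type="application/ld+json">
{
  "@context": "https://schema.org",
  "@type": "Restaurant",
  "name": "Example Bistro",
  "priceRange": "$$",
  "aggregateRating": {
    "@type": "AggregateRating",
    "ratingValue": "4.5",
    "reviewCount": "120"
  }
}
</script>
```

If Google detects this markup, the Structured Data page will report it, along with any errors such as missing required fields.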
#3. Data Highlighter:
Data Highlighter is a webmaster tool for teaching Google about the pattern of structured data on your website. You simply use Data Highlighter to tag the data fields on your site with a mouse. Then Google can present your data more attractively — and in new ways — in search results and in other products such as the Google Knowledge Graph.
You can highlight data like the title, author, date published, categories, and reviews. It becomes important when you're selling a product or running an event that you want people to see when they search for your website. Google can then show the highlighted data to users more attractively, which results in more user engagement with your site and more conversions. Read this article on using data highlighter.
#4. HTML improvements:
The HTML Improvements page shows you potential issues Google found when crawling and indexing your site. It’s recommended that you review this report regularly to identify changes that potentially increase your rankings in Google search results pages while providing a better experience for your readers.
These issues don’t prevent your site from being crawled or indexed, but paying attention to them can improve the user experience and even help drive traffic to your site. For example, title and meta description text can appear in search results pages, and useful, descriptive text is more likely to be clicked on by users.
To view data for your site, you need to make sure you've added your site to your account and verified ownership. If Google hasn't crawled or indexed your site yet, it won't be able to display this data.
Data that may be included on this page includes:
- Title problems: Potential problems with the title tag on your pages, such as missing or repeated page titles.
- Meta description problems: Potential problems with duplicate or otherwise problematic meta descriptions.
- Non-indexable content: Pages containing non-indexable content, such as some rich media files, video, or images.
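For example, giving each page a unique, descriptive title tag and meta description helps avoid the first two issues. The text below is illustrative:

```html
<head>
  <!-- Unique, descriptive title (roughly 50–60 characters tends to display fully) -->
  <title>14 Important Terms and Definitions of Google Search Console</title>
  <!-- Unique meta description (roughly 150–160 characters tends to display fully) -->
  <meta name="description"
        content="Learn the 14 most important Search Console terms, from hreflang tags to CTR, and how to use them to improve your site." />
</head>
```

If the same title or description is repeated across many pages, the HTML Improvements report will flag them as duplicates.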
#5. International Targeting:
The International Targeting tab shows hreflang errors, which we discussed in the first term. You can also target a specific country. For example, if you have a generic top-level domain like .com, you can target a country as per your requirement. But if you have a country-code domain like .in, the domain itself specifies that its geotargeted location is India, so just leave that section unlisted.
#6. Mobile Usability:
Global web traffic from mobile devices is on the rise, and studies show that 61% of people say they would quickly move on to another site if they didn't find what they were looking for right away on a mobile site. And while nearly 75% of users prefer a mobile-friendly site, 96% of consumers say they've encountered sites that were clearly not designed for mobile devices. So here is a big opportunity for you to build your brand. Here's the detailed study. The Mobile Usability report identifies pages on your site with usability problems for visitors on mobile devices.
Google provides a free mobile-friendly test tool. Click here to test your site.
The Mobile-friendly test tool is easy to use; simply type in the full URL of the web page that you want to test. The test typically takes less than a minute to run.
Test results include a screenshot of how the page looks to Google on a mobile device, as well as a list of any mobile usability problems that it finds. Mobile usability problems are issues that can affect a user that visits the page on a mobile (small screen) device, including small font sizes (which are hard to read on a small screen) and use of Flash (which isn’t supported by most mobile devices).
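One common fix for these usability problems is declaring a responsive viewport and a readable base font size. This is a minimal sketch, not a complete mobile strategy:

```html
<head>
  <!-- Tell mobile browsers to match the device width instead of a zoomed-out desktop layout -->
  <meta name="viewport" content="width=device-width, initial-scale=1" />
  <style>
    /* A base font size around 16px is generally readable on small screens */
    body { font-size: 16px; }
  </style>
</head>
```

Pages without a viewport declaration are a frequent cause of "text too small to read" errors in the Mobile Usability report.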
Here’s a screenshot of this website.
#7. Index Status:
The basic view of the data records only the total indexed value. The advanced view records additional data.
- Total indexed URLs in your site:
Shows the total URLs available to appear in search results, along with other URLs Google might discover by other means. This number changes over time as you add and remove pages.
- URLs blocked by robots.txt
This is the total number of URLs disallowed from crawling by your robots.txt file. If your site is very big, you might want to hide other data so that the graph is scaled to a readable range.
- URLs removed
The number of URLs you have removed with the URL removal tool. Again, this value should be quite low in comparison to the other URLs in this report, so it’s easier to view this selection by itself rather than in comparison with other URLs.
How to use the Index Status report?
- Look for a steady rise in the graph. A steady increase in the number of crawled and indexed pages indicates that Google can regularly access your content and that your site is being indexed.
- Check into sudden drops. If you see a sudden drop in the number of indexed pages, it might mean that your server is down or overloaded, or that Google is having trouble accessing your content.
- Make note of unusually high index volume for your site. A high number of URLs could mean that your site has problems with canonicalization, duplicate content, or automatically generated pages, or that it has been hacked. In many cases, Google will send you a message when it detects problems with your site, so make sure to set your notification preferences.
- Review sudden changes. Spikes or dips that appear in several charts can indicate problems with site configuration, redirects, or security.
How to read the Blocked Resources report?
This report shows resources used by your site that are blocked to Googlebot. Not all resources are shown, only those that Google thinks might be under your control.
- The report landing page shows a list of hosts that provide resources on your site that are blocked by robots.txt rules. Some resources will be hosted on your own site, and some will be hosted on other sites.
- Click on any host in the table to see a list of blocked resources from that host, with a count of pages on your site affected by each blocked resource.
- Click on any blocked resource in the table for a list of your pages that load the resource.
- Click on any page in the table hosting a blocked resource for instructions on how to unblock that resource.
#8. robots.txt file:
A robots.txt file is a file at the root of your site that indicates those parts of your site you don’t want to be accessed by search engine crawlers. The file uses the Robots Exclusion Standard, which is a protocol with a small set of commands that can be used to indicate access to your site by section and by specific kinds of web crawlers (such as mobile crawlers vs desktop crawlers).
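A minimal robots.txt might look like this. The paths and sitemap URL are illustrative, not a recommendation for any particular site:

```
# Allow all crawlers but keep them out of a private directory
User-agent: *
Disallow: /private/

# Block only Google's image crawler from an images directory
User-agent: Googlebot-Image
Disallow: /images/

Sitemap: https://example.com/sitemap.xml
```

Note that robots.txt only discourages crawling; it does not guarantee a page stays out of the index if other sites link to it.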
#9. Fetch as Google:
Fetch is a Search Console operation that fetches a specified URL on your site and displays the HTTP response. It does not request or run any associated resources (such as images or scripts) on the page. This is a relatively quick operation that you can use to check or debug suspected network connectivity or security issues with your site and see the success or failure of the request. You can also use it to get a page crawled immediately. For example, if you publish a post that you want Google to crawl right away so it can appear in search results, just type in the URL of that page and select Fetch. Your post will be crawled in no time.
#10. Fetch and render:
Fetches a specified URL in your site, displays the HTTP response, and also renders (displays) the page according to a specified platform (desktop or smartphone). This operation requests and runs all resources on the page (such as images and scripts). Use this to detect visual differences between how Googlebot sees your page and how a user sees your page.
- The request will be added to the fetch history table, with a “pending” status. When the request is complete, the row will show the success or failure of the request and some basic information. Click any non-failed fetch row in the table to get additional details about the request, including raw HTTP response headers and data, and (for Fetch and Render) a list of blocked resources and a view of the rendered page.
- If the request succeeded and is less than four hours old, you can tell Google to re-crawl and possibly re-index the fetched page, and optionally any pages that the fetched page links to.
- The robots.txt Tester tool shows you whether your robots.txt file blocks Google web crawlers from specific URLs on your site. For example, you can use this tool to test whether the Googlebot-Image crawler can crawl the URL of an image you wish to block from Google Image Search. You can submit a URL to the robots.txt Tester tool, which operates as Googlebot would: it checks your robots.txt file and verifies that your URL has been blocked properly.
#11. Impressions :
An impression is counted for your website when a user's query causes your page to appear in the search results. Note that this does not mean the user visited your site: when the user clicks your link from the search results, that is counted as a click, not an impression. A linking URL records an impression each time it appears in a search result for a user.
#12. URL Inspection tool:
The URL Inspection tool is a new feature added in the new search console. It provides information about Google’s indexed version of a specific page. The Information includes AMP errors, structured data errors, and indexing issues.
How to use the URL inspection tool?
To inspect a URL in your property: Enter the complete URL in the search bar at the top of any page in Search Console.
Note that you will have to switch to another property to inspect a URL in that property. You can inspect both AMP and non-AMP URLs; in both cases, the tool provides information about the corresponding AMP and non-AMP versions of the page.
Alternate page versions: If the page has alternate versions, for example, if it uses hreflang to point to alternate language versions, or if you (or Google) decide that there are alternate/duplicate versions of the page, the tool also provides information about the canonical version.
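For example, a duplicate or alternate page can point Google to its canonical version with a link tag, which the URL Inspection tool then reports. The URL below is illustrative:

```html
<!-- Placed in the <head> of the duplicate/alternate page -->
<link rel="canonical" href="https://example.com/original-page" />
```

Google treats this as a hint rather than a directive, so the tool may also show a different Google-selected canonical.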
#13. Clicks:
Any click that sends the user to a page outside of Google Search is counted as a click, while clicking a link that stays inside the search results is not. So if a user clicks your link and lands on your site, it is counted as a click.
#14. CTR:
CTR stands for click-through rate. It measures the proportion of impressions that resulted in clicks, and is calculated by dividing the click count by the impression count. For example, 50 clicks from 1,000 impressions gives a CTR of 5%. If a row of data has no impressions, the CTR is shown as a dash (-) because calculating it would mean dividing by zero.
So this was all about Google Search Console. If you haven't signed up for Search Console yet, go do it now. Here's the link.
Put your comment below, and let me know if I have missed something. I will update it soon.
Hey there! I'm Rahul. Thanks for stopping by my site. I am 20, and I am a learning and growing entrepreneur. I run this blog, and my mission is to help people with blogging. Read More…