Search Console: the definitive tutorial for the new Google Webmaster Tools
In this post you will find a complete Google Webmaster Tools tutorial for beginners who want to get started with this tool. It includes a full video tutorial of Google Webmaster Tools in which you will learn to use its most important features.
Google Webmaster Tools is one of those essential services for any webmaster who wants to get organic traffic. Well… no… it’s not Webmaster Tools anymore… it’s now called Search Console.
Some of us are having a hard time getting used to the new name after so much time talking about Google Webmaster Tools (or WMT), so you will excuse me if most of the time I resist referring to this service by the name Google gave it when it was renamed last May: Search Console. Over time I suppose we’ll get used to calling it that, but for now it seems that “Webmaster Tools” still sounds better to all of us (… something even Google is struggling with, since most of the time it refers to the service by its old name 😉)
Why has its name been changed?
According to statements by the company itself, the change was due to the fact that they found the term “webmaster” did not represent all users of these tools. So they decided to change the name so that no one felt excluded and everyone (webmasters, business owners, SEOs, …) felt comfortable using the service.
Ok, sure that’s true … but it seems to me that there is something else. In my opinion the name change gives us very clear clues of the approach that Google intends to give to this set of tools and how it wants us to use them.
It gives the impression that they have tried to move away from the term webmaster (which seems more geared towards the technical maintenance part) and are trying to place more importance on their ability to help you get more (and better) organic traffic from Google.
By using the term “Search” instead of “Webmaster” they enhance the “organic” focus, compared with the previous name that seemed to refer more to technical, on-page SEO improvements.
But surely they are also trying to put everything in its place. By using the term “search” they make clear what the central axis of Search Console is: searches… and thus clearly differentiate it from the central axis of their other flagship analytics tool, Google Analytics, which obviously focuses on visits.
This difference between the two products should seem obvious to any “webmaster”, but the truth is that sometimes there is some confusion about what is used for what. After all, both provide us with analysis of issues relating to our website and help us improve its performance.
That said, I will start this article by showing you the most basic aspects of the product and then go deeper and deeper into this magnificent tool (or rather “set of tools”) that will help you enormously to meet the objectives of your website.
Wait a minute: webmaster tools?… organic traffic?… Maybe we’re getting ahead of ourselves and you’re starting to wonder:
What is Google Webmaster Tools really good for?
If I had to summarize the benefits of using Search Console, I would do it in these three points:
- Diagnose possible problems on my website from Google’s point of view: indexing, crawling, penalties, …
- Know the origin and find the most effective way to expand organic traffic to the web.
- Provide information to Google about our website: how we want our URLs to appear (for example, with or without www), submit our sitemap to facilitate the crawling and indexing of our website, and report the relevance of each page (according to our criteria), among other things.
But maybe this video from Google presenting the service will make it clearer.
After watching the video I have no doubt that Alice will succeed in her custom jewelry online business thanks to Webmaster Tools 😉
Webmaster Tools and the conversion process
Before starting to see how to create and work with a Google Webmaster Tools account, it seems interesting to me to make a situation map that clarifies in a very graphic way which are the phases of the process that these tools monitor.
These are the phases of the process that lead us to the final goal: the conversion of visits from organic traffic.
- Crawling: The first objective of SEO is to get Google to crawl all those pages for which we are interested in obtaining organic traffic.
- Indexing: This will be the second phase of the process. Once we get Google to find our pages, we will have to ensure that it can classify them correctly and include them in its index.
- Positioning: Without a doubt the best known phase of the SEO process. What we want is that our page is the most relevant in the index for all those queries that our target audience will make.
- CTR : What it is about at this point is to make our links in the SERPs more attractive than the rest and that the number of clicks on them increases.
- Retain: Once we get the visit, we will have to offer content appropriate to the query the user made. Only this way will we generate enough interest that the visitor does not leave our website, and we get the chance to show our call to action.
- Generate conversions: If through organic traffic we have managed to attract the right audience to the ideal content, we will get more conversions. (Although in this last phase we would really be talking about CRO (Conversion Rate Optimization))
As we see in this graphic representation, Search Console provides us with data on what happens in the first 4 phases of the process, that is, all those that take place outside our site.
We will obtain information about what happens in the last two phases in the Google Analytics reports.
Now that we know the type of information that we are going to deal with with Google Webmaster Tools and what data we must access from Analytics, we are ready to create and configure our Webmaster Tools account.
How to create an account in Google Webmaster Tools for beginners and advanced
To use Webmaster Tools we only need to log in with our Google username and password… well, nothing more and nothing less… because if we do not have an account we will have to create one.
Add a website
Once we log into our Webmasters Tools account we will have to add a property.
If our account is new, a welcome screen like this will appear in which we will have to add a property.
If we are already Google Webmasters Tools users, we will simply have to use the “Add Property” option on the Main Page and a dialog box will appear (like the one below) in which we must enter the URL of our website.
Before clicking “Continue” I advise you to review the format in which you are writing the domain and make sure that this is how you want the pages of your site to appear in the search results. I mean, you can include the domain with or without the “www” before the domain name and extension.
You can configure this later in Webmaster Tools, but it is advisable to write the URL well from the beginning, to make sure the property we are creating is consistent with what we want to appear in the SERPs and with the preferences we are going to configure next.
Verify a website
You have already told Google Webmaster Tools what the URL of your website is… now you will have to prove that it is yours.
Verifying a website in Webmaster Tools is nothing more than that: showing Google that we are the owners of this site so that Google can give us access to its management.
This verification can be done in any of these ways:
- Upload a file to the server
- Add a meta tag to the website’s HTML code
- Add a new DNS record
- Use a Google Analytics account
- Use Google Tag Manager
In general, Google will recommend one of these methods and show you the others as alternatives. Now, the method it recommends seems to vary depending on the verification history and the characteristics of the website itself. Thus, on one occasion it may recommend the Google Analytics verification method and on another property it may recommend a new DNS record.
The important thing is that you know that, regardless of the method you use to verify the property, you can always add another type of verification later.
Thus, you can have a property verified with Analytics and at the same time with a file that you uploaded via FTP.
Both the method of uploading a file and adding a meta tag on the main page of the site are very easy to implement.
So is the Google Analytics method, for which we only have to keep in mind that the asynchronous tracking code must be placed between the <head> and </head> tags.
To add a new DNS record you will depend a little more on how your hosting provider lets you manage the service, and for verification with Google Tag Manager you will need some knowledge of how this tag management system works.
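As a quick sketch of the meta tag method: Search Console generates a unique verification token for you, and you paste a tag like this inside the <head> of your home page (the content value below is an invented placeholder; always use the exact tag Google gives you):

```html
<head>
  <!-- Verification tag provided by Search Console;
       the content value here is a made-up placeholder -->
  <meta name="google-site-verification" content="AbC123dEf456..." />
</head>
```

Once the tag is in place, you return to Search Console and click “Verify” so Google can fetch the page and confirm it.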
Once the website is verified, we will be able to access the main panel of our Webmaster Tools property.
Setting up Google Webmaster Tools
But before we get down to business and start looking at each Google Webmaster Tools tool, I would like to show you some adjustments that, from my point of view, it will be essential to make (or at least very convenient) in the Webmaster Tools configuration.
These options appear in the menu that is displayed when clicking on the cogwheel icon (in the upper right corner of a property’s control panel)
Receive notifications from Google
It is important that, as soon as you create your Google account, you configure the “Search Console Preferences” section well to be informed as soon as possible of possible problems that may arise on your website.
For that you will have to choose the language and the email in which you want to receive your notifications, but above all you have to make sure to check the box that enables these notifications.
Regarding the type of messages you can receive, I advise you to leave the option “All problems” selected instead of “Main problems”. At the end of the day, it is about having the best website possible and, as Google points out, you will normally not receive more than one email a day.
Link Google Webmaster Tools with Google Analytics
Linking data from Google Webmaster Tools and Google Analytics should always be one of the first actions to take after verifying our ownership.
What are the advantages of linking?
Well, one of the main advantages is that the Webmaster Tools data will be included in Google Analytics under “Acquisition / Search Engine Optimization”. There we will find 3 reports that we will be able to access after linking. These are:
- Queries: which will include CTR, Impressions, Clicks and Average Position.
- Landing pages: which will identify the landing pages accessed from search engines (including the same parameters as in the previous case)
- Geographical summary: Indicating the geographical origin of organic traffic
How do we link both services to share this data?
The link can be made both from Google Analytics and from the Webmaster Tools itself.
If we link them from Google Analytics
In Google Analytics the link is made at the Property level, so to link both services we will have to open our Google Analytics account.
A simple way to make the link is from the message that appears when you try to access any of the “Search Engine Optimization” reports in Analytics. These reports use Search Console data, so if we have not linked both accounts, a message will appear inviting us to do so.
Note that Google Analytics itself still refers to “Search Console” as “Webmaster Tools” (… you can see that I am not the only one who has a hard time changing 😉)
But we can also go directly to the Google Analytics Manager and choose the “Property Configuration” in the corresponding property.
In either case, we will have to scroll to the bottom of the page until we find the “Webmaster Tools Settings” section.
Caramba! Again they call it “Webmaster Tools”. Let’s see if they make up their minds… 😉
Clicking on the “Edit” link, they will take us to the list of verified sites in Google Webmaster Tools so that we can choose the one that corresponds.
If we link from Webmaster Tools
From Webmaster Tools it is even simpler. We just have to click the configuration cogwheel icon (top right) and choose “Google Analytics Property”.
A list of Google Analytics properties will open, from which we will choose the one that corresponds to make the link.
Google Webmaster Tools “memory” problems
Google Webmaster Tools has always had what in my opinion is a “memory problem”. I mean, it doesn’t let you see data that is more than 3 months old.
There is a false belief that by making this link, from that moment on Google Analytics will record its own copy of the Webmaster Tools data and incorporate it into its reports, thus avoiding the problem of not having data beyond those three months.
That is false.
The truth is that I do not understand how you keep reading this over and over again in different places (some even with a respectable reputation)… when it is so easy to verify.
See here the Queries report of a property linked to Google Webmaster Tools. It is clear that it only keeps 90 days, since the first date is 92 days ago (the 90 days it keeps plus the 2-day delay Webmaster Tools needs to process the data).
In the “Site Settings” option of the drop-down menu we will find this very important option.
I think configuring it should be mandatory from minute zero, as soon as we add our property.
What we are telling Google when configuring this section is the way we want our domain to be displayed: with or without “www.” at first.
Make no mistake, it is not just a question of aesthetics. When we tell Google that we prefer, for example, that our domain always be shown with www, if someone links to http://domain.com, Google will interpret the linked address as http://www.domain.com
Choosing the option “Do not establish a preferred domain” (which is the one that is activated by default) can be highly detrimental to our SEO.
If we do not set preferences and Google tracks a link to the url without www and another with www, it will consider that they are different URLs … but identical content!
Wham! A Panda slap across your whole site!
Indeed: you’re risking a penalty from Panda for having duplicate content in two URLs that are actually the same.
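To give an idea of what this canonicalization looks like in practice, the usual companion to the preferred-domain setting is a 301 redirect on the server. This sketch assumes an Apache server with mod_rewrite and a placeholder example.com domain; your host’s setup may differ:

```apache
# .htaccess sketch: send every request for the bare domain
# to the www version with a permanent (301) redirect
RewriteEngine On
RewriteCond %{HTTP_HOST} ^example\.com$ [NC]
RewriteRule ^(.*)$ http://www.example.com/$1 [R=301,L]
```

With both the redirect and the preferred-domain setting in place, Google only ever sees one version of each URL.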
This option, like the previous one, is found under “Site Settings”, although in fact its configuration is not of such vital importance in most cases.
The point here is to limit the frequency with which Google’s spiders crawl, to prevent them from slowing down the server.
In my opinion this option should always be in its default value “Allow Google to optimize for my site (recommended)”.
Google spiders crawl our site by finding new pages on our website and identifying changes to them. This is a necessary step prior to indexing, which on the other hand is the only way to get to the SERPs.
Therefore, if you are interested in appearing in the search results you should NEVER limit the crawl rate of Google’s spiders… and if Google’s spiders slow down your server, what you have is a problem with your server (not with Google).
Add users and owners to Google Webmaster Tools
Each account has at least 1 user: the owner, but more can be added in a simple way from the “Users and property owners” option.
The normal thing is that sooner or later, more people intervene in the management of your website.
Sometimes you will want to appoint users within your own company who use the data that comes from Webmaster Tools in different positions and departments.
At other times you will need to give complete control to someone who is helping you, for example, to develop your online marketing strategy and who needs to make more profound changes.
For each case, you will have to create the appropriate type of user (or owner) with the most appropriate permissions for each one … and you can manage all this from this section of the configuration.
Add new users
To add a new user we will use the “Add new user” button in the upper right part of the option
The additional users we add will have the category of “Full” or “Restricted”, depending on whether we want them to have full access to most of the functions or simply be able to view them. You can learn more about the specific features each type of user can access on this Search Console support page.
Add more owners
To add more owners we will have to access the option “Manage property owners”
Actually, this takes us to the “Verification details” configuration option, so we can use either of the two paths to create a new owner.
Google Webmaster Tools informs us here of the property verification methods used, the verification attempts made and the owners already verified.
Finally, we will find the option to “Add an owner”, with which we can add the email account of the new owner. Keep in mind that this email has to correspond to a Google account to be able to name someone as the owner.
Associated with the property of Webmaster Tools
In addition to users and owners, we can configure associates, which in turn can be users or accounts.
…Yes, I know. Right now you may be wondering: what’s the difference? It is very simple. The users and owners we explained earlier are people who can use the Google Webmaster Tools account and view its reports.
Associates cannot see this information; they can simply act on behalf of the site.
To understand a little better what I mean by acting “on behalf of the site”, let’s look at an example of an associate of a Google Webmaster Tools property.
An example of an associated user is someone who can publish applications to the Chrome Web Store on behalf of the website. This will also have to be specified in the corresponding box when adding the associate.
Unlike with users, associated accounts are not added from this panel, but it will depend in each case on the type of account that is associated.
Thus, a typical case of an associated account is the AdWords account. By associating an AdWords account with our Webmaster Tools property we allow AdWords to use organic search data in its reports.
Another common example is Google+ pages, which can publish on behalf of the site and will also appear here.
Change of address
In this section on the configuration of Google Webmaster Tools I have intentionally left this option for the end.
There are really few occasions when you will have to use it. Only when you move your website to another domain.
In that case you will have to open the property of the new domain, verify both properties (the new and the old one), carry out the corresponding 301 redirects… and then use this tool so that Google reindexes the pages of this site towards the new domain.
What has been said: except for migrations, this option will surely not be touched … but when the time comes, you will be grateful to know that it was there.
We have now seen all the configuration options in the Google Search Console menu. The truth is that some of them will, in most cases, be dispensable. Nevertheless, there are 7 points we should always cover before starting to use Search Console.
Have you forgotten any?… Come on, don’t be shy. Open Webmaster Tools in another window and come back to the article when you finish. We still have an interesting tour ahead of all the tools and utilities Search Console offers us.
Structure of Webmaster Tools
We are now going to see the structure of our property and what we can obtain from each of the sections, its tools and its utilities.
But first, allow me to skip a bit the order in which Search Console presents the different tools, to perform an action that (despite not being included in the configuration menu) I consider basic to start taking advantage of this service.
Add a sitemap
Although creating a sitemap is not mandatory for Google to index our websites, the advantages are so clear that I consider it another priority when launching our Search Console.
By adding a sitemap to our property we are giving Google a roadmap with which to send their spiders to crawl our website.
With this we will achieve two things:
- Faster indexing of our URLs
- We can diagnose the indexing of our site by comparing the number of URLs that we intend to index with the pages that are actually indexed.
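For reference, a minimal sitemap follows the sitemaps.org XML protocol. This sketch uses a placeholder example.com URL; lastmod, changefreq and priority are optional fields:

```xml
<?xml version="1.0" encoding="UTF-8"?>
<urlset xmlns="http://www.sitemaps.org/schemas/sitemap/0.9">
  <url>
    <!-- One <url> entry per page we want crawled and indexed -->
    <loc>http://www.example.com/</loc>
    <lastmod>2015-05-15</lastmod>
    <changefreq>weekly</changefreq>
    <priority>1.0</priority>
  </url>
</urlset>
```

Most CMSs and SEO plugins can generate this file for you automatically; what matters is that it lists exactly the URLs you want indexed.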
To submit a sitemap you simply have to go to “Crawl / Sitemaps” and click “Add/Test Sitemap”.
In the dialog box we fill in the URL of the sitemap and choose “Test sitemap” or “Submit sitemap”.
It’s a really simple process, but it can take up to a couple of days for Google to process it, so the sooner you do it the better.
Come on, now with the structure of the service and the different tools that it offers us.
The control panel is the first thing we will see when we access our Google Webmaster Tools property.
It is nothing more than that: a simple dashboard in which we can check at a glance certain critical aspects of our website’s status.
The dashboard is divided into 2 sections:
1- Important news
The fact that in this section they tell us that there are no recent messages is a good sign, because here we will be informed of possible problems that may arise on the site.
Unfortunately this is not always the case and sometimes the appearance of this section is more similar to this:
… But the important thing in these cases is that they notify you when something happens so that you can correct it as soon as possible.
In any case, if you have listened to me and have configured that all the alerts are sent to you by email, you will live more calmly and without fear of encountering surprises when you enter your Google Webmaster Tools property.
2- Current Status
The current status is nothing more than a graphical summary of 3 aspects which, although they are treated in more depth in other sections, together give us a picture of our website’s health in searches.
On the one hand, the crawl status is shown, checking for possible DNS errors, server errors and the validity of the robots.txt file.
Along with these data we find a summary of errors in URLs that cannot be accessed or cannot be found.
Next, it shows us the evolution over time of the clicks made in the SERPs towards our website.
Finally, we can graphically compare the URLs we have submitted in each sitemap against those Google has indexed.
We will see all these aspects later, when we see in depth the sections in detail to which they lead us by pressing >> to expand the information.
In Search Appearance we will find 4 tools that will help us improve the way our website appears in the SERPs:
Structured data is specific data that we mark up so that Google can interpret it and (when it deems appropriate) display it in the SERPs, enriching the results with additional information extracted from this data. (This is why they are called “rich snippets”)
Having these fragments presented with our website’s result in the SERPs is tremendously interesting: on the one hand we offer more information to the end user, and on the other we gain a competitive advantage over the other results we share the SERP with (a result with rich snippets will always be much more attractive than a plain one).
In this section of Google Webmaster Tools we can perform a quick audit of the structured data we have implemented on our site. We will see how much data we have and of what type and, most importantly, any errors will be shown here.
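To picture what this markup can look like, here is a minimal sketch using the schema.org vocabulary in JSON-LD (the product, URL and values are invented; other formats such as microdata are equally valid):

```html
<script type="application/ld+json">
{
  "@context": "http://schema.org",
  "@type": "Product",
  "name": "Leather ankle boot",
  "image": "http://www.example.com/img/boot.jpg",
  "offers": {
    "@type": "Offer",
    "price": "59.95",
    "priceCurrency": "EUR"
  }
}
</script>
```

With markup like this in place, Google may show the price alongside your result in the SERPs.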
Ok, by now the importance of implementing structured data (especially on certain pages) is surely clear to you, and you even know how to check that it is used correctly and without errors… but… how do I implement it?
Well, Webmaster Tools will help you here too. What’s more, it gives you two options: the simple one (in this section) and the “Structured Data Markup Helper” (included in the extras under “Other Resources”), a little more laborious but also very useful.
Data markup wizard
What the wizard does is guide you through labeling, on a specific page, the data corresponding to a data type. It then returns the code you must implement on the web page so that this specific data is correctly marked up and the search engines can interpret it.
But of course, this means touching HTML code, and many of you will be thinking (not without reason): “Ufff… don’t make me do that, the code is a mess to me”
Ok, that’s normal. Especially if we take into account that, obviously, the code returned by the wizard is for a static page, while the URLs we pass to it will normally be dynamic. That is where things get messy: modifying templates, identifying variables…
Don’t worry, that’s what the Data Highlighter section is for.
The Data Highlighter is the tool that allows us to do all this in a simple and quite intuitive way. So much so that I am convinced that many people began to adore the old Google Webmaster Tools when this utility appeared.
Imagine you have a shoe store. Obviously you will be interested in marking up the same “Product” data on all the product pages (image, price, description, …)
Well, when you start the markup, it will give you the option to extrapolate it to all similar pages. Isn’t that cool?
In this explanatory video, included in the tool itself, you will clearly see how simple the marking process is.
HTML improvements: A little on-page SEO?
With this tool we can check a lot of parameters of our on-page optimization .
Many of us often have the bad habit of starting these checks with other software that normally gives us more information or shows it differently. But it may be a mistake not to start with Google’s own Webmaster Tools. After all, here we are not looking at just any on-page problems: what Google is showing us are the problems it has detected.
Correcting these problems will certainly be more effective, as we are focusing on improvements that we know Google will appreciate.
Sitelinks are results that are displayed in Google search results for certain pages and depending on the keywords that the search engine user is searching for.
Sitelinks are not something we can configure to appear in the SERPs. That is something to be decided by the algorithm itself.
What we can do is ask Google not to show certain pages in the sitelinks by demoting them. But again, we are talking about “suggesting”… in the end you can tell Google what you want, and then it goes and does what it wants.
This section includes those tools that will be more focused on analyzing and improving the ranking of our URLs in the SERPs.
This is one of the tools that I like the most, as it allows us to obtain data on the behavior of our pages in search results, customizing them with metrics and filters.
The report lets us choose which of 4 metrics we want displayed. These are:
- Clicks: the number of times our URL has been clicked on in the SERPs.
- Impressions: the number of times our URL has been shown in the SERPs.
- CTR: which is nothing more than the ratio between the two previous metrics.
- Position: the average of the positions a page occupies in the results for different searches.
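The relationship between the first three metrics is straightforward; as a tiny illustration (with invented figures), CTR is just clicks divided by impressions:

```python
# Hypothetical Search Analytics figures for a single query
clicks = 120
impressions = 3000

# CTR is the ratio of clicks to impressions, expressed as a percentage
ctr = clicks / impressions * 100
print(f"CTR: {ctr:.1f}%")  # prints "CTR: 4.0%"
```

A low CTR with many impressions usually means the snippet (title, description, rich data) needs work even though the ranking is fine.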
The metrics will be shown grouped in different ways depending on whether we choose to show them to:
- Queries: the queries made in which our URLs appear
- Pages: Each of the metrics corresponding to each of the pages of our website will be displayed
- Countries: the different values of each metric chosen are grouped by countries from which the search queries were made
- Devices: It is very interesting to see how certain metrics vary depending on the device from which the query is made. With these filters we can detect search patterns that may only occur from mobile terminals and not from desktop (such as “nearby ATMs”) and on which we can focus to improve the experience of users of these specific devices.
- Search type: This grouping will give us an idea of which types of search Google shows us in the most, that is, for which it considers our page relevant.
- Dates: limiting or comparing intervals in time.
Each of the groupings can in turn be filtered to show only certain values of them.
Thus, for example, on a website we can filter all queries that do not contain our brand to eliminate branding searches.
But in addition to the filters applicable to the groups, we have another that we could call a “direct filter”:
- The type of search: with which we can filter by web, image or video searches (or compare them). We call it a direct filter because you can choose a type of search, but obviously it cannot be grouped.
For more information on search analysis and its possibilities, I recommend that you take a look at Search Console’s own explanation .
Links to your site: a look at backlinks
This section shows 3 types of data related to our inbound links.
Domains with the most inbound links
Actually it is more than that, because we can trace each link completely: from each of the domains you can access the destination pages they link to, and from these, the source pages of each link.
Although there are other much more complete and convenient tools for this task, it is always important to keep an eye on the Search Console data, as it is Google’s own data.
Most linked pages
It is a list of the landing pages on our site, ordered by number of links.
In a similar way to what happened with the domains, we will be able to trace the complete path of these links, but this time in the opposite direction. It shows us the landing pages; from them we can access the domains that link to them and, finally, the source pages of each link.
Linking your data
It is good old anchor text (and I will never understand why they went looking for such a strange name when, as soon as you open the detail, it already says “Anchor Text”)
In short, it is a list of the anchors used for the inbound links that Google knows about.
Monitoring our incoming links is a task that must be carried out regularly, to detect possible toxic links to disavow, or perhaps some jarring anchor text that could mean we are the victims of a negative SEO attack.
What we find here is nothing more than a report of the internal links on our website.
It is important to know which pages we point to the most with internal links, as we are implicitly telling Google that these are important pages within our website.
But the truth is that I have my doubts that Google now pays as much attention to internal links as before. Or perhaps it is that now there are many other more important factors and the relative value of these is diluted.
In any case, I am convinced that linking between our own pages should always be based on adding value to the user, linking content that provides added value, and not on worrying about “what will Google think if I link a lot to my contact page?”… well, just that: that you’re looking for a lead… nothing strange.
This section is where Search Console will warn us when a manual penalty “falls” on us… perhaps the worst thing that can happen to us.
If one of the algorithms hits us algorithmically, climbing out of the hole will be easier (it almost always comes down to correcting what is wrong… and waiting).
However, if a notice of a manual penalty appears on this panel, be prepared to investigate thoroughly, correct quickly and request reconsideration… and hope it is granted.
In truth, on this occasion more than ever the saying “no news is good news” applies.
I sincerely hope that your “Manual Actions” section looks like this for a long time:
This section will only be useful if you have your site in several languages or if you have several similar sites for different languages.
This tool allows us to configure the segmentation of the site by language and by country:
Through hreflang tags we can declare the language (and optionally the region) that each page targets. Google detects this markup and serves each user the appropriate version.
This section will show possible errors on sites that implement these tags.
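As a rough sketch, the hreflang annotations for a page available in Spanish and English (all URLs here are hypothetical) could look like this:

```
<!-- In the <head> of https://example.com/es/pagina/ -->
<link rel="alternate" hreflang="es" href="https://example.com/es/pagina/" />
<link rel="alternate" hreflang="en" href="https://example.com/en/page/" />
<link rel="alternate" hreflang="x-default" href="https://example.com/" />
```

Each language version lists all of its alternates (including itself), and `x-default` marks the fallback for users whose language does not match any version.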
Segmentation by country
The results that Google shows users are not the same if you search, for example, on google.es and on google.co.uk
Here we can configure the country of the users to whom we direct our website.
Imagine you hire hosting in France but your target audience is in Spain... (it is not the ideal setup, and we could talk about that for a while, but that would be WPO and it is not the topic at hand).
If that is your case, you will want to set the target country to Spain so that Google shows your site to that audience.
Since Google launched the Mobile-Friendly algorithm update on April 21, this report should not be overlooked. In fact, even if that change had never launched, it still should not be ignored, since what it flags is a usability issue, and usability must always be a priority. But the truth is that since that day the danger has become much more real.
The fact is that this report shows us the problems mobile users will encounter when accessing the site, along with the number of pages affected by each problem.
By clicking on the >> we can access the detail of the specific URLs that present these problems.
In this case I have 2 pages to review. Let's not forget that Mobilegeddon affects individual pages (not whole sites), so these two pages will likely fail Google's test and stop appearing in mobile SERPs, while my competitors' pages surely will appear.
Under this heading we will find the necessary tools to know everything related to indexing in Google: what we index, what we do not index and what we block.
The objective of the indexing phase is to get as many pages from our sitemap as possible into the Google index.
In this sense, the Indexing Status report is as simple as it is useful.
Cross-referencing the data in this report with the total number of pages in our sitemap will give us a fairly clear idea of how our crawling and indexing are going.
It is normal that not all the pages that we send in our sitemap are indexed, but if the difference between what is sent and what is indexed is very large, we will have to analyze the reason.
Let’s see an example:
We have a site whose sitemap contains about 500 URLs.
However, when we access the Indexing Status we see that there are only about 300 in the Google index.
The difference is worrying and we decided to expand the information.
In the same indexing report we open the "Advanced" tab and also tick "Blocked by robots"... and there it is: we were blocking the crawler from accessing the rest of the pages.
To solve this we will have to analyze the robots.txt file, identify what those URLs are and check if they should be indexed.
This report will give us a clear idea of what Google thinks about our page, as it shows us the different keywords that it detects on our site along with a graphic indicator of their importance on the site.
On March 11, 2015 Google released this Blocked Resources report. It shows us the resources that Googlebot cannot access and therefore cannot use to correctly interpret our website.
There are two issues that are always repeated when we talk about Google changes:
- Google does nothing without a reason.
- With Google you always have to know how to read between the lines.
Why do I say this? Well because in my opinion this report is the necessary predecessor of the Mobile Friendly algorithm.
If Google cannot fetch resources such as our CSS and JavaScript, it cannot verify how a page actually renders (for example, on mobile). Therefore, it is advisable to keep the number of blocked resources as low as possible.
Using the URL removal form we can ask Google to remove a page of our site from the search results.
But these removals are not permanent. A removal lasts a minimum of 90 days; after that, if we do nothing, Google may re-crawl and re-index the page.
To permanently remove the page from the search results, we must carry out one of the following actions during that period:
- Generate a 404 (not found) or 410 (permanently unavailable) status code.
- Use robots.txt to block crawling.
- Use the `<meta name="robots" content="noindex">` tag to tell Google not to index it.
This section includes tools and reports that will allow us to check the accessibility of Google to all the URLs that we want it to index.
The crawl errors presented in this report can be of various types, but we can group them into 5xx errors (codes starting with 5) and 4xx errors.
5xx errors are server-level errors. They typically appear when a URL is requested and the server fails to return it: not because the page does not exist, but because the server does not respond properly.
4xx errors occur when trying to access a page that is not there. These are often called "application" errors.
The most common among 4xx errors are 404 (not found) and 410 (gone, i.e. permanently deleted).
Google Webmaster Tools shows us the errors divided by type of error and separated by type of device.
When we have errors of this kind we must manage them and try to reduce their number, since a page that cannot be accessed creates a bad user experience, and Google penalizes that.
The crawl statistics tell us how much Google crawls each day (pages and kilobytes), as well as the download time per page.
A drastic drop in crawls or a sudden increase in download time can point to a problem that is preventing the search engine from accessing our site optimally.
Fetch as Google
The importance of seeing our website as Google does is that we will be able to detect resources that block the crawl and perhaps cause an incorrect indexing.
To fetch a page as Google, we enter the URL in the box and choose how we want to fetch it. There are two modes:
When we "Fetch", what we get back is the HTTP response of the download, along with the code that Google receives.
This method of exploration is almost immediate.
Fetch and Render
This mode gives us the same information as the previous one, but the page is also rendered, showing it both as the user sees it and as Google interprets it.
The robots.txt file tester is, on the one hand, a tool for checking errors in the file.
On the other, it allows us to edit the file and download it so we can later upload it to the root directory via FTP.
Although having a sitemap is not strictly necessary for Google to crawl and index our URLs, we have already said how important it can be for making crawling easier by providing the information in this file.
In this section we will be able to monitor, on the one hand, the pages that we send in our sitemap (which we can consider indexing suggestions) and, on the other hand, how many of those pages Google finally indexes.
But it will also indicate the errors it finds when trying to access the different URLs of our sitemap.
This is another of those tools that should be configured whenever possible, because if we do not, we can end up generating duplicate content on our website.
Sometimes the same page can be served both with and without certain URL parameters. That is the case, for example, with internal search results on a website. A typical case would be searching for a product in an ecommerce store.
It could be the case that we have a product file with the following URL:
But if instead of accessing the page directly we use the site's internal search engine, we may arrive at the same page, but with a different URL.
As you can see, they are two different URLs with the same content. Ergo... a Panda penalty for duplicate content.
Fixing it is as simple as declaring the ?search parameter in this Search Console tool.
Depending on the characteristics of each website, we can find pages that do not really use any parameters or with others that have a lot of different parameters.
If Google detects any security problem or malicious software on our website, this is where it will inform us about it and how it may affect us.
Google Webmaster Tools has a kind of “mixed bag” where it stores all those utilities with which it experiments or that simply do not have a clear classification in another section or Google service.
Among them we can find the Structured Data Markup Helper, which we talked about previously, but there is also access to tools as diverse as PageSpeed, Google My Business, Webmaster Academy, ...
These are not Webmaster Tools' own utilities, but it makes sense to offer access to them from this service. At the end of the day, they are useful for any webmaster... sorry, user... which is why it is no longer called Webmaster Tools (or not 😉).
Google Webmaster Tools is a very complete service that offers a wide variety of tools that will help us manage the entire process of crawling, indexing and capturing organic traffic.
Many of these utilities will not be needed in most cases, but many others, such as HTML Improvements, Search Analytics, Manual Actions, Index Status, ... will be common to all webmasters (even if Google no longer likes that term).
Can you live without Google Webmaster Tools?
Honestly, yes. There are many other tools (most of them paid) that can, better or worse, cover all of these functions.
However, Google Webmaster Tools has an advantage that no other tool provides: it comes straight from Google, and therefore draws on the data and knowledge of the very search engine in which we are trying to optimize our presence.
Would you add any more tips on Google Webmaster Tools?
Do you use Search Console a lot?
If you liked the tutorial, you can share it with a friend.