Finally, Google gives us some good news. Just as we were approaching the total encryption of organic traffic (on many websites the famous "not provided" already accounts for 80% of organic visits), the leading search engine has at last improved the accuracy of Google Webmaster Tools.
And thank goodness, because the situation was becoming critical. Still, it pays to be cautious: Google Webmaster Tools has never been known for its accuracy, so it is advisable to test it on our own projects rather than believing everything we read.
But it seems to work. This way, both the SEO professional and the web analyst will have better-quality information about the keywords we rank for and that are bringing us traffic.
But we should know that there are additional methods to reinforce the information we have about organic traffic. In this article we therefore propose alternative ways to work around "not provided":
Table of contents
1. Measure the ranking of critical keywords for your business.
Studies show that each search engine position captures a certain percentage of the total traffic for each keyword. Knowing our position, we can therefore infer the traffic we capture for each keyword.
However, this technique has lost much of its accuracy. Search results are increasingly volatile and depend on local factors, our search history… and now also on our activity on Google Plus, with the first page of results showing posts from the contacts we have in our circles on that social network.
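The position-based estimate described above can be sketched as a simple lookup. The CTR percentages below are illustrative assumptions, not figures from the article: published click-through studies differ from one another and change over time.

```python
# Sketch: infer a keyword's organic traffic from its ranking position
# using a CTR curve. These percentages are illustrative assumptions;
# real CTR studies report varying values.
CTR_BY_POSITION = {1: 0.33, 2: 0.16, 3: 0.10, 4: 0.07, 5: 0.05}

def estimate_visits(monthly_searches, position):
    """Estimate monthly organic visits for a keyword at a given rank."""
    ctr = CTR_BY_POSITION.get(position, 0.02)  # assume ~2% beyond the top 5
    return round(monthly_searches * ctr)

print(estimate_visits(5000, 2))  # 5,000 searches at position 2
```

As the article notes, personalized and localized results make any single CTR curve a rough approximation at best.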
2. Use the analysis tools of other search engines.
Although there is speculation that Bing will soon encrypt its search results, it has not yet done so. However, Bing's market share is much lower than Google's and, in addition, its users have their own characteristics, so they will not be representative of the overall audience.
3. We can work with the data that Analytics still offers us.
If our not provided is 90%, we can work with the remaining 10%. For example, if Google Analytics shows 10 visits for the keyword "web analytics" and our not provided is 90%, we can infer that we receive roughly 100 visits for that keyword.
However, the few visits Analytics still shows follow their own patterns and are not representative of the rest of the population: for example, they tend to come from users not logged in to Gmail.
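The extrapolation in the example above amounts to dividing the visible visits by the visible share. A minimal sketch, with the caveat from the article that the visible sample is biased:

```python
def extrapolate_visits(visible_visits, not_provided_share):
    """Scale the visits Analytics still reports up to an estimated total,
    assuming the visible keywords are a uniform sample of all organic
    traffic (they are not quite: they skew toward users not logged in
    to Google, as noted above)."""
    return visible_visits / (1 - not_provided_share)

print(extrapolate_visits(10, 0.9))  # 10 visible visits, 90% not provided
```

With 10 visible visits and a 90% not-provided rate, this yields the article's estimate of about 100 total visits for the keyword.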
4. Base your strategy on a landing-page-centric model.
Optimizing each web page for one keyword (or a small set of keywords) can give us an estimate of the traffic each keyword brings.
Google Analytics lets us filter the organic traffic received by each landing page, and from there we can infer the traffic captured by that keyword or any of its variations.
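The landing-page approach above boils down to aggregating organic sessions by page. A small sketch using hypothetical export rows standing in for a Google Analytics report:

```python
# Sketch: count organic sessions per landing page. If each page is
# optimized for one keyword, its organic total approximates that
# keyword's traffic, "not provided" included. The paths below are
# hypothetical sample data, not real Analytics output.
from collections import Counter

organic_landing_pages = [
    "/web-analytics-guide", "/web-analytics-guide", "/seo-basics",
    "/web-analytics-guide", "/seo-basics",
]

traffic_by_page = Counter(organic_landing_pages)
for page, visits in traffic_by_page.most_common():
    print(page, visits)
```

Here, if "/web-analytics-guide" targets the keyword "web analytics", its 3 organic sessions stand in for that keyword and its variations.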
As you can see, none of the above solutions is 100% reliable. So hopefully this Google Webmaster Tools update will shed some light on the data gap that both SEO professionals and web analysts are dealing with.
Eduardo Martínez Fustero is SEO Executive Manager at IEBS Business School, where he also collaborates on the Master in Web Analytics.