Data anomalies in Search Console

In the past five or six years, many online businesses, especially those hungry for growth, have relied on organic traffic as one of their key sources. Now growth could mean an increase in pure numbers (traffic, sessions, users) … or it could mean an increase in more tangible business parameters (revenues, profits). One of the things I have learnt is that depending on which success metrics we chase, our own identity undergoes a shift.

Search as a major source of traffic

The major contributors to organic traffic are search and social. Wherever a site has loads of great, unique content, there is a chance to drive organic traffic.

At different points in time, I have been skeptical about social media and the me-too posting that most brand pages do on platforms such as Facebook. Search, however, has always fascinated me, and I still have faith in it :).

SEO can’t be a method

Search Engine Optimization (SEO) has evolved over time, and I have blogged about it on multiple occasions. Unfortunately, the frequency of algorithm changes and the pace at which Google (the market leader in this space) redefines what it construes as quality content mean that you can’t have a steady SEO “process”.

Having said that, SEO involves a fair amount of design thinking.

The reason is that the problem behind search visibility (and the factors that control it) keeps changing. It’s a wicked problem, and design thinking can solve such problems because of its test-and-iterate mechanism.

Data to drive Design Thinking

This is where having the correct data to decide on next steps is crucial. A data-driven design thinking approach entails periodic reviews of what data we have available to make the right choices.

Search data has always been plagued with incomplete information, starting with the 2011 encrypted search announcement, after which a bulk of the keyword data in Google Analytics was reported as (not provided). There have been ample approaches to recover this data; unfortunately, as Google Search shifts towards handhelds and as digital privacy increases, the percentage of data with clear visibility will keep going down.

This can’t be helped. What can be done is to take these “anomalies” into account and factor them into your analysis.
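As a rough illustration of what “factoring anomalies in” can mean in practice, the sketch below checks what fraction of your organic sessions still carry a visible keyword. The numbers and keyword names are entirely hypothetical; the shape mimics a Google Analytics keyword export.

```python
# Hypothetical Google Analytics export: keyword -> organic sessions.
# "(not provided)" is how GA reports encrypted searches with no visible keyword.
sessions_by_keyword = {
    "(not provided)": 6200,
    "data anomalies search console": 480,
    "seo starter guide": 350,
    "bing desktop review": 170,
}

total = sum(sessions_by_keyword.values())
hidden = sessions_by_keyword.get("(not provided)", 0)
visible_share = (total - hidden) / total

print(f"Sessions with a visible keyword: {visible_share:.1%}")
```

Knowing that number up front tells you how far any keyword-level conclusion can be trusted before you even start the analysis.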

So what kind of Data anomalies in Search Console do we expect to find?

Google Support has compiled this list. They keep updating their data reporting logic and keep updating this page as well.

One of the major changes is that last month they started reporting more data in Google Webmaster Tools. Please bear in mind that this is only a change in the data being reported, not in the actual search traffic on your site.

The link also explains why there is a data disparity between Google Analytics, Google Webmaster Tools, and any other third-party tool you might be using to generate keyword data.

So, my data is incomplete, what to do?

Don’t panic.

Work with the list of data anomalies and identify which ones impact you the most. Knowing which parts of the data are unavailable to you is better than knowing nothing and assuming that the data you have is complete.

In iterations, the first comparison is always against your previous state. In both cases, the data made available to you is pretty much the same, so a week-on-week comparison report is much more valuable than a comparison report with your closest competitor.

As long as success is measured on the same tool, the data anomalies should cancel out. Please bear in mind that for most of our data work, we do not need precise data; we can work with coarse data.
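The week-on-week idea can be sketched in a few lines of plain Python. The page paths and click counts below are made up; the point is that because both weeks come from the same tool and suffer the same reporting anomalies, the relative change is more trustworthy than the absolute numbers.

```python
# Hypothetical weekly click totals exported from the same tool for two weeks.
clicks_by_page = {
    "/blog/seo-guide": {"last_week": 900, "this_week": 1035},
    "/blog/bing-desktop": {"last_week": 400, "this_week": 380},
}

for page, weeks in clicks_by_page.items():
    # Relative change: the anomaly affects both terms, so it largely cancels.
    change = (weeks["this_week"] - weeks["last_week"]) / weeks["last_week"]
    print(f"{page}: {change:+.1%} week on week")
```

A +15% move on a page is actionable even if both weeks undercount clicks by the same unknown margin.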

A simple way to decide: if you work mostly with charts and graphs, you can work with coarse data and absorb the anomalies. If you work with more than four decimal places, you might want to add three or four lines of disclaimer below your data.

Bing eyes Desktop Search Market

Microsoft’s search engine is one of the lesser-known search engines on the web. By lesser known, I mean that it is far behind in market share with respect to the market leader – Google.

One thing that seems to be working in their favour, though, is that So.cl is doing better than G+. Now, in an attempt to nudge Google off its King of Search Engines perch, Microsoft has rolled out a new addition to their Windows platform: Bing Desktop.

One good feature – and I do recommend this because I like beautiful wallpapers on my desktop :) – is that Bing Desktop auto-changes the wallpaper on my Windows machine. It’s a simple feature, but it’s different from what Google Desktop does. The rest of the features are pretty much the same as Google Desktop (including the search bar when you hit Ctrl twice).

Google grew their business by focusing on their main product – Search. I think Microsoft is taking a page from that strategic move … pushing products via their flagship product – Windows.

Google Search Update

I had earlier posted a starter’s guide to SEO. Back then my understanding of the subject was still developing, and the only way I could add to it was via experiments … that I carried out on this blog, and also at work.

Finally, we did arrive at a scalable solution – an SEO approach that could easily be replicated and scaled for almost all my target keywords. Out of 5000+ target keywords, we managed to get into the top 10 for a decent 3000 of them, and we would have proceeded to cover the rest as well (do remind me to release this too!)

Had it not been for the upcoming Google update, I would have gone ahead with the plan. However, the new update effectively means that all black-hat SEO tactics (which the Indian SEO industry is famous for) will be penalized. So throw your keyword stuffing, badly written English crafted to match your keywords, slightly different page versions to match keyword variants, and link submissions out of the best-practices window.

I wonder what the extent of the penalty levied by the new algorithm will be on sites that are already way ahead on their link submissions. Getting those links off the 1000 or so directories is going to be tough!

Here’s a parting thought: Google Search is embedded in so many digital strategists’ plans that people are not even considering optimizing their pages for other search engines (such as Bing). Isn’t it great how being a market leader can impact an entire industry?

Note to Self – Learn about Bing optimization and work on generating search traffic from Bing.