Webmaster rolls out new Search Console


This has been coming for quite some time. Google is gradually rolling out an update to Search Console. The new Search Console has a much cleaner interface, with most of the reports and insights tucked away under multiple layers.

The Dashboard

The new Search Console dashboard is much simpler now, with just three reports that you can view –

  1. Performance – How your site is performing in the search results. This is similar to the Search Analytics report.
  2. Index Coverage – How your site has been indexed and what errors the search engine detects on the site
  3. AMP – Information about the Accelerated Mobile Pages and how Google detects AMP on your site

The good things

Straight off the bat, this tool provides just the right information to the user. It forces the user to engage with the available reports, and it provides interesting insights that were not available before.

Take a look at this new report, which shows the impact on search impressions when I decided to mark 75-odd pages as noindex.

Index Coverage Reports Insight
The report shows how I have lost impressions because of adding noindex to pages
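A before/after comparison like the one in that report can be done by hand as well. Here is a minimal sketch of that calculation; the dates, impression counts, and the noindex cutoff date are all hypothetical, standing in for a daily export from Search Console's Performance data.

```python
from datetime import date

# Hypothetical daily impression counts (all numbers made up for illustration).
daily_impressions = {
    date(2018, 1, 1): 1200,
    date(2018, 1, 2): 1150,
    date(2018, 1, 3): 1180,
    date(2018, 1, 4): 640,   # noindex applied on Jan 4 (assumed date)
    date(2018, 1, 5): 610,
    date(2018, 1, 6): 590,
}
noindex_date = date(2018, 1, 4)

# Split the series at the day the pages were marked noindex.
before = [v for d, v in daily_impressions.items() if d < noindex_date]
after = [v for d, v in daily_impressions.items() if d >= noindex_date]

avg_before = sum(before) / len(before)
avg_after = sum(after) / len(after)
change_pct = (avg_after - avg_before) / avg_before * 100
print(f"Average impressions: {avg_before:.0f} -> {avg_after:.0f} ({change_pct:+.1f}%)")
```

With real data you would export the daily rows from the Performance report and feed them into the same split-and-average step.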

Overall, most reports, even where the underlying data is the same, are now much easier to understand and interpret.

Another example is the Performance report with the Pages section selected. This feature was available in the older interface as well, but now the data is much clearer, and it tells me which pages I should work on to improve my CTRs.

Search Analytics Report
I need to tweak the description and meta details of some of the top pages

The bad things

There is more still to come, and this is not a complete experience. I cannot rely only on the new interface; some reports simply haven't been incorporated into the new Search Console yet.

Reporting on Structured Data is missing entirely. The diagnostic set of tools and the crawl request tools are not included either.

The update to the Search Console is welcome, and I for one am glad that a much cleaner interface is now available.

18 months down the line, what has Google AMP really achieved?

In early 2016, Google launched Accelerated Mobile Pages (AMP) for publishers who wanted their content to load in a flash on mobile devices.

At the time of writing this article, 65% of all traffic in Asia (and the figure is higher for developing countries) comes from mobile devices. A bulk of these users are on mobile data networks, which may not be as fast as a steady broadband connection.

What Google did, therefore, was launch a series of initiatives, first Weblight and now AMP, that would help the search engine load the publisher's content faster for the user.

Google is focusing on the user

The rationale that Google gave to publishers was that it was focused on the user's experience. If a user does a Google search on a choppy data connection, the search results might be presented in the blink of an eye; however, because the publisher's site takes too long to load, the user gets a bad experience ... or worse, the user concludes that Google is too slow!

With Google Weblight, what the organization did was load the publisher's content on an interim site (Weblight) and display it there. This created two problems –

  1. Publishers lost traffic, and Ad Revenues
  2. Publishers lost control on the format of their content and their style guides

Both reasons were strong enough for a lot of publishers to stay away from Weblight.

AMP gets introduced in the mix

To give publishers back some control over content formats, and also to incorporate both analytics and ad scripts into the publisher's content, Google created another markup language. This is AMP.

AMP allows the publisher to present the content on their own site, in a style that’s acceptable to the publisher. It may not have too much flexibility, but at least the publisher is free to design that style instead of the Weblight approach.

This may not be an ideal situation, but at least it ensures that users are shown the content they are searching for as fast as possible.

Have people embraced AMP?

Well, it’s a bit hazy there. For those of us who were on existing Content Management Systems (CMS) such as WordPress or Joomla it was much easier to transition. It just meant having to install some plugins and do the configuration.

However, the folks who have built their own web apps and products are completely clueless as to how to go about implementing AMP.

The sad part is that a lot of the product developers I have spoken to are of the opinion that AMP is just another new thing that "SEO folks" have to do. Add to that the mental model of SEO as a task much lower down the value chain, and it pretty much means that developers are simply not aware of the benefits of AMP.

What irks me is that people's individual biases are used to mask their ignorance about how to make their products perform better on search.

So, if you are leading a product team or are working on building products, then definitely head over to the Accelerated Mobile Pages project.

As a publisher who has embraced AMP, how does that impact me?

Surprisingly, it does not help me much with acquiring more traffic. The website is shown a bit differently in the search engine results, and that is perhaps getting me slightly higher click-through rates. However, the numbers are not large enough for me to draw conclusions from the data provided.

One major problem with all the new initiatives Google is taking with Search is its stubbornness about keeping things completely opaque.

Not a single publisher is in the loop when it comes to knowing the exact payoff of any of the optimization activities they did. It is left to these teams to dig in and figure it out themselves before they can attribute the success of the activity. I believe that's a major deterrent for a lot of product managers making the choice to embrace AMP.

The web is not Google

Coming back to this post after six months, I found this on the internet – the AMP Letter. It is pretty much what I wanted to say about how this is shaping up.

Data anomalies in Search Console

In the past 5-6 years or so, a lot of online businesses, especially the ones hungry for growth, have relied on organic traffic as one of their key sources. Now, growth could mean an increase in pure numbers (traffic, sessions, users) ... or it could mean an increase in more tangible business parameters (revenues, profits). One of the things I have learnt is that depending on which success metrics we chase, our own identity undergoes a shift.

Search as a major source of traffic

The major contributors to organic traffic are search and social. Wherever there is a site with loads of great, unique content, there is a chance of driving organic traffic.

At different points in time, I have been skeptical about Social Media and me-too posting that most brand pages do on platforms such as Facebook. However, Search for me has always been fascinating and I still have faith in Search :).

SEO can’t be a method

Search Engine Optimization (SEO) has evolved over a period of time, and I have blogged about it on multiple occasions. Unfortunately, the frequency with which the algorithm changes, and the rate at which Google (the market leader in this space) evolves its notion of quality content, ensure that you can't have a steady SEO "process".

Having said that, SEO involves a fair amount of design thinking.

The reason behind this statement is that the problem of search visibility (and the factors that control it) keeps changing. It's a wicked problem. Design thinking can solve such problems because of its test-and-iterate mechanism.

Data to drive Design Thinking

This is where having the correct data to decide on next steps is crucial. A data-driven design thinking approach entails periodic reviews of what kind of data is available to make the right choices.

Search data has always been plagued with incomplete information, starting from the 2011 encrypted search announcement, after which a bulk of the keyword data in Google Analytics was reported as (not provided). There have been ample approaches to clarifying this data; unfortunately, as Google Search moves further towards handhelds and as digital privacy increases, the percentage of data with clear visibility will keep going down.

This can't be helped. What can be done is to take these "anomalies" into account and factor them into your analysis.

So what kind of Data anomalies in Search Console do we expect to find?

Google Support has compiled this list. They keep updating their data reporting logic and keep updating this page as well.

One of the major changes you can see is that last month they started reporting more data in Google Webmaster Tools. Please bear in mind that this is just a change in the data being reported, not in the actual search traffic on your site.

The link also explains why there is data disparity between Google Analytics, Google Webmaster Tools, and any other third-party tool you may be using to generate keyword data.

So, my data is incomplete, what to do?

Don’t panic.

Work with the list of data anomalies and identify which ones impact you the most. Having visibility into which parts of the data are not available to you is far better than knowing nothing and assuming that the data you have is complete.

In iterations, the first comparison is always against your previous state. In both periods the data made available to you is pretty much the same. Hence, a week-on-week comparison report is much more valuable than a comparison report against your closest competitor.

As long as the measures of success come from the same tool, the data anomalies should cancel out. Please bear in mind that for most of our data work we do not need precise data; we can work with coarse data.

A simple approach to deciding this would be: if you mostly work with charts and graphs, you can work with coarse data and absorb the anomalies. If you work with more than four decimal places, then you might want to add 3-4 lines of disclaimer below your data.
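To make the week-on-week idea concrete, here is a minimal sketch. The weekly totals are hypothetical; the point is that both periods come from the same tool, so any reporting anomaly affects both equally and cancels out of the percentage change.

```python
# Hypothetical weekly totals pulled from the same tool (e.g. Search Console).
last_week = {"clicks": 480, "impressions": 21000}
this_week = {"clicks": 530, "impressions": 22500}

def week_on_week(prev, curr):
    """Return the coarse percentage change for each metric."""
    return {
        metric: round((curr[metric] - prev[metric]) / prev[metric] * 100, 1)
        for metric in prev
    }

print(week_on_week(last_week, this_week))
# {'clicks': 10.4, 'impressions': 7.1}
```

A rounded figure like "+10% clicks week on week" is exactly the kind of coarse, same-tool comparison that stays trustworthy despite the anomalies.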

Game Theory and SEO

This blog has been my place to articulate my thoughts, propose experiments, and share my views on multiple topics. This is one such piece.

I would love to hear your views about this and feel free to scroll down to that comment box and leave a line (or two).

What is Game Theory?

Taking the excerpt from Wikipedia –

Game theory is “the study of mathematical models of conflict and cooperation between intelligent rational decision-makers.” Game theory is mainly used in economics, political science, and psychology, as well as logic, computer science and biology.

In this piece, I am proposing that we can use the basic precepts of Game Theory and apply them to SEO strategies as well.

Originally, it addressed zero-sum games, in which one person’s gains result in losses for the other participants. Today, game theory applies to a wide range of behavioral relations, and is now an umbrella term for the science of logical decision making in humans, animals, and computers.

In Search Engine Optimization, for a particular search query, only one site can be at the top – at the cost of the search visibility of all other sites.

Ergo, SEO is clearly a zero-sum scenario.
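The zero-sum claim can be written out as a toy payoff matrix for two sites competing for the top slot. The strategies and payoff values below are hypothetical; the payoff stands for the share of clicks gained or lost relative to the status quo, so whatever one site gains the other loses.

```python
# payoff[(a, b)] = (payoff to site A, payoff to site B)
# for strategies "optimize" (invest in SEO) and "do_nothing".
payoff = {
    ("optimize", "optimize"):     (0, 0),    # both invest, ranking unchanged
    ("optimize", "do_nothing"):   (1, -1),   # A takes the top spot from B
    ("do_nothing", "optimize"):   (-1, 1),   # B takes the top spot from A
    ("do_nothing", "do_nothing"): (0, 0),    # status quo
}

# Zero-sum check: in every outcome the gains and losses cancel.
assert all(a + b == 0 for a, b in payoff.values())
```

The defining property of a zero-sum game is exactly that final assertion: the payoffs in every cell sum to zero.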

Wait, isn’t this between two players?

That's the common conception of Game Theory ... and more specifically of the Prisoner's Dilemma. However, in the real world, and in almost any market-driven environment, there are always multiple players.

Such scenarios are referred to as n-person games, or in gaming parlance, multi-player games. This gives way to something called evolutionary game theory.

What is Evolutionary Game Theory?

Evolutionary game theory considers games involving a population of decision makers, where the frequency with which a particular decision is made can change over time in response to the decisions made by all individuals in the population.

So, in SEO, the strategy that I adopt at any point in time is subject to change, and over a period of time most players working on their SEO will tend to change their strategy and evolve their approach.

In economics, the same theory is intended to capture population changes because people play the game many times within their lifetime, and consciously (and perhaps rationally) switch strategies.
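That population-level switching can be sketched with the standard replicator dynamics from evolutionary game theory: the share of players using a strategy grows when its payoff beats the population average. The strategy names and payoff values here are hypothetical, chosen only to illustrate the mechanism.

```python
# Hypothetical shares of players using each SEO strategy, and the
# (assumed) payoff each strategy earns after, say, an algorithm penalty
# makes black hat tactics unprofitable.
shares = {"content": 0.3, "links": 0.3, "black_hat": 0.4}
payoffs = {"content": 1.2, "links": 1.0, "black_hat": 0.6}

for _ in range(50):  # 50 rounds of the game
    # Population-average payoff under the current strategy mix.
    avg = sum(shares[s] * payoffs[s] for s in shares)
    # Replicator update: above-average strategies gain share.
    shares = {s: shares[s] * payoffs[s] / avg for s in shares}

# The higher-payoff strategy comes to dominate the population.
print({s: round(v, 3) for s, v in shares.items()})
```

After enough rounds, almost the entire population has switched to the best-paying strategy, which is the "evolution of market dynamics" described above: no individual needs to be told the winning strategy; repeated play and switching get the population there.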

Ditto for SEO again. In textbook style, I could say don't do Black Hat. However, you know it and I know it ... at some point in our lives we have done Black Hat. Yes yes yes, it doesn't work and you have to pay the price, but we have still gone ahead, haven't we?

This change in tactics, and the resulting evolution of market dynamics, effectively ends up changing the winning strategies of the game. A research article on how competing strategies change within a network of decision makers is available here.

To read more on Evolutionary Game Theory, here is the wiki link.

Rituals and Evolutionary Game Theory

One more interesting characteristic that mathematical biologist John Maynard Smith noticed when studying the behavior of game theory in biological communities (his research was based on Darwinian concepts and survival of the fittest) was that most players did not treat their strategy as a winning one, but treated it at a ritualistic level.

Ergo, for most members of the population it was not important whether they were engaged in a competitive and winning strategy, but rather that they were engaged in a strategy in the first place.

Wait, what?

Let me rephrase that statement.

For players in a multi-player game, where the game itself was changing constantly, the winning strategy was not what mattered.

What mattered was having a strategy in the first place.

Uh, I thought this was going to be on SEO

It is.

In a game of let's-get-on-top (of Google), all of us marketers are running in circles trying to figure out the best SEO strategy.

We have seen many of the oft-quoted paradigms here –

  1. Content is king
  2. A great link profile
  3. Black Hat

What I am proposing is that it really does not matter which step you take ... as long as you take that step as per a strategy, and then choose to evolve your stance after you find out the result.

 

Google, GoDaddy and the HTTPS Conundrum

I like to stay active on this blog, and I love the constant tinkering on WordPress (right from identifying which plugins to install to customizing the theme).

This is one of the main reasons I am able to blog on a regular-ish basis. Obviously, since it's a content-driven site, the bulk of my traffic comes from Google Search.

So what's wrong with that?

The overdependence on organic search means that for the blog to get more visitors, user engagement, and comments, I have to try and follow the diktats of the market leader in search. That's Google.

I am a fan of most of Google's work. However, the kind of hold they have on the search market means that publishers who want to be found on search have to work towards being search friendly. Google is all about Don't Be Evil, and I respect them for that. However, with the recent HTTPS update to their search algorithm, small-time publishers are forced to relook at their hosting solutions.

GoDaddy and shared hosting

My site doesn't get a lot of traffic – 2k-3k visitors a month. For that kind of traffic, a shared hosting plan is perfect. I have been using GoDaddy for quite some time now, primarily because most of my domains are within this account.

I have been using this account for the past 8 years or so, and I cannot complain about the service. I know it's shared hosting, and it has managed to meet my expectations.

Until now.

GoDaddy and HTTPS

The hosting plan I have is a simple one, and it does not support installing a custom SSL certificate. So much so that even if I wanted to purchase a certificate from GoDaddy, I am not able to do so. Perhaps it's a glitch in their interface.

I could get a Let's Encrypt certificate, and that's what I have done for the 13 Llama Interactive site and for Harshaja's blog. The problem with this approach is that both those sites are hosted on a DigitalOcean instance, where I can easily control the installation of the SSL certificate.

There is no simple option for doing that. GoDaddy support is of no use, and that leads me to a dead end.

Cloudflare can help

This is where a reverse proxy like Cloudflare helped. At least all requests going to the site can be served over an HTTPS version of the site. The lookup itself is done via Cloudflare, and I have updated the website settings in my WordPress to serve from the HTTPS endpoint.

However, this kind of kills the wp-admin section. Thankfully, through the REST API and Jetpack's connection to WordPress.com, I can still manage to post content.

So what can a publisher do?

At this juncture, I could simply shift my hosting and be done with it. It's the easiest option. However, what about all those publishers out there who may not have such an option available to them?

There has to be a simpler solution to this mess.

Food for Thought: Part V


Ever since Google Reader was shut down, I haven't been able to settle on an RSS reader. I did try multiple other services and am currently using Digg's reader. Not having good RSS software puts a huge impediment on one's reading, and this is probably my excuse for not posting more often!
