18 months down the line, what has Google AMP really achieved?

In early 2016, Google launched Accelerated Mobile Pages (AMP) for publishers who wanted their content to load in a flash on mobile devices.

At the time of writing this article, 65% of all traffic in Asia (and the figure is higher for developing countries) is on mobile. The bulk of these users are on mobile data networks, which may not be as fast as a steady broadband connection.

What Google therefore did was launch a series of initiatives, first Weblight and now AMP, to help the search engine load the publisher’s content faster for the user.

Google is focusing on the user

The rationale that Google gave to publishers was that it was focusing on the user’s experience. If a user does a Google search on a choppy data connection, the search results might be presented in the blink of an eye; however, if the publisher’s site takes too long to load, the user gets a bad experience … or worse, the user concludes that Google is too slow!

With Google Weblight, what the organization did was load the publisher’s content onto an interim Google-hosted site (Weblight) and display it there. This created two problems –

  1. Publishers lost traffic, and with it ad revenues
  2. Publishers lost control over the format of their content and their style guides

Both reasons were strong enough for a lot of publishers to stay away from Weblight.

AMP gets introduced in the mix

To give some control over content formats back to the publisher, and also to incorporate both analytics and ad scripts into the publisher’s content, Google created another markup language. This is AMP.

AMP allows the publisher to present the content on their own site, in a style that’s acceptable to the publisher. It may not offer much flexibility, but at least the publisher is free to design that style, unlike the Weblight approach.

This may not be an ideal situation, but at least it ensures that users are shown the content they are searching for as fast as possible.
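For teams who have never seen the format, a minimal AMP page looks something like the sketch below. This is illustrative only: the canonical URL and image are placeholders, and the required `amp-boilerplate` CSS (which the AMP spec mandates verbatim) is abridged here.

```html
<!doctype html>
<html ⚡ lang="en">
  <head>
    <meta charset="utf-8">
    <meta name="viewport" content="width=device-width,minimum-scale=1">
    <!-- The AMP runtime; required on every AMP page -->
    <script async src="https://cdn.ampproject.org/v0.js"></script>
    <!-- Points back to the regular (non-AMP) version of the page -->
    <link rel="canonical" href="https://example.com/article.html">
    <style amp-boilerplate>/* required boilerplate CSS, abridged */</style>
  </head>
  <body>
    <h1>Article headline</h1>
    <!-- amp-img replaces img; explicit dimensions let AMP reserve layout space -->
    <amp-img src="hero.jpg" width="600" height="400" layout="responsive"></amp-img>
  </body>
</html>
```

Note how regular tags like `img` are replaced by AMP components such as `amp-img`, which is how AMP keeps rendering fast and predictable.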

Have people embraced AMP?

Well, it’s a bit hazy there. For those of us on existing Content Management Systems (CMS) such as WordPress or Joomla, it was much easier to transition. It just meant installing some plugins and doing the configuration.

However, the folks who have built their own web apps and products are often completely clueless about how to go about implementing AMP.

The sad part is that a lot of the product developers I have spoken to are of the opinion that AMP is just a new thing that “SEO folks” have to do. Add to that the mental model of SEO being perceived as a task much lower down the value chain, and it pretty much means that developers are simply not aware of the benefits of AMP.

What irks me is that people’s individual biases are used to mask their ignorance about how to make their products perform better in search.

So, if you are leading a product team or are working on building products, then definitely head over to the Accelerated Mobile Pages project.

As a publisher who has embraced AMP, how does that impact me?

Surprisingly, it does not help me much with acquiring more traffic. The website is shown a bit differently in the search engine results, and that is perhaps getting me slightly higher click-through rates. However, the numbers are not large enough for me to draw conclusions from the data provided.

One major problem with all the new initiatives Google is running in Search is its stubbornness about keeping things completely opaque.

Not a single publisher is in the loop when it comes to knowing the exact payoff of any of the optimization activities they did. It is left to these teams to dig in and figure it out themselves before they can attribute success to the activity. I believe that’s a major deterrent for a lot of product managers when deciding whether to embrace AMP.

Easy guide to Keyword Research

Keyword research sounds like such a daunting task. What’s the point of researching a keyword anyway? Shouldn’t your natural marketing instincts tell you which keywords are most relevant to your business. Not quite… Without quality keywords, found through effective keyword research, your search marketing campaigns (paid AND organic) will be completely misguided. If people aren’t…

via The Big, Easy Guide to Keyword Research for Businesses — Wordstream Blog Feed

If you have heard about Unicorns and Start-ups, then you probably know of Larry Kim and Wordstream.

I have linked the above post because it covers why keyword research is important and how a business can go about doing it, both from a Search Campaign point of view and from a Content Marketing point of view.

If you are concerned with search, then this is a must-read.

Data anomalies in Search Console

In the past 5-6 years or so, a lot of online businesses, especially the ones hungry for growth, have relied on organic traffic as one of their key sources. Now, growth could mean an increase in pure numbers (traffic, sessions, users) … or it could mean an increase in more tangible business parameters (revenues, profits). One of the things I have learnt is that depending on which success metrics we chase, our own identity undergoes a shift.

Search as a major source of traffic

The major contributors to organic traffic are search and social. Wherever there is a site with loads of great, unique content, there is a chance to drive organic traffic.

At different points in time, I have been skeptical about Social Media and the me-too posting that most brand pages do on platforms such as Facebook. However, Search has always fascinated me, and I still have faith in it :).

SEO can’t be a method

Search Engine Optimization (SEO) has evolved over a period of time, and I have blogged about it on multiple occasions. Unfortunately, the frequency with which the algorithm changes, and the rate at which what Google (the market leader in this space) construes as quality content evolves, ensure that you can’t have a steady SEO “process”.

Having said that, SEO involves a fair amount of design thinking.

The reason behind this statement is that the problem of search visibility (and the factors that control it) keeps changing. It’s a wicked problem. Design thinking can solve such problems because of its test-and-iterate mechanism.

Data to drive Design Thinking

This is where having the correct data to decide on next steps is crucial. A data-driven design thinking approach entails periodic reviews of what kind of data we have available to make the right choices.

Search data has always been plagued by incomplete information, starting with the 2011 encrypted search announcement, after which the bulk of the keyword data in Google Analytics was reported as (not provided). There have been ample approaches to clarify this data; unfortunately, as Google Search moves further towards handhelds and as digital privacy increases, the percentage of data with clear visibility will keep going down.

This can’t be helped. What can be done is to take these “anomalies” into account and factor them into your analysis.

So what kind of Data anomalies in Search Console do we expect to find?

Google Support has compiled this list. They keep updating their data reporting logic and keep updating this page as well.

One of the major changes you can see is that last month, they started reporting more data in Search Console. Please bear in mind that this is just a change in the data being reported, not in the actual search traffic on your site.

The link also explains why there is a data disparity between Google Analytics, Search Console, and any other third-party tool you may be using to generate keyword data.

So, my data is incomplete, what to do?

Don’t panic.

Work with the list of data anomalies and identify which ones impact you the most. Having visibility into which parts of the data are not available to you is better than knowing nothing and assuming that the data you have is complete.

In iterations, the first comparison is always against your previous state. In both periods, the data being made available to you is pretty much the same. Hence, a week-on-week comparison report is much more valuable than a comparison report against your closest competitor.

As long as the measures of success come from the same tool, the data anomaly should cancel out. Please bear in mind that for most of our data work, we do not need precise data and can work with coarse data.

A simple approach to deciding this would be – if you work mostly with charts and graphs, then you can work with coarse data and absorb the anomalies. If you work with more than 4 decimal places, then you might want to add 3-4 lines of disclaimer below your data.
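To make the week-on-week idea concrete, here is a small sketch. The session numbers and metric name are hypothetical; the point is that because both weeks come from the same tool, the same reporting anomalies sit in both numbers and largely cancel out in the relative change.

```javascript
// Compare this week's metric against last week's, pulled from the same tool.
// Returns the percentage change, or null when there is no baseline.
function weekOnWeekChange(lastWeek, thisWeek) {
  if (lastWeek === 0) return null; // no baseline to compare against
  return ((thisWeek - lastWeek) / lastWeek) * 100;
}

// Hypothetical organic sessions pulled from the same report each week
const organicSessions = { lastWeek: 1200, thisWeek: 1380 };
const change = weekOnWeekChange(organicSessions.lastWeek, organicSessions.thisWeek);

// Coarse data is fine here: one decimal place is plenty for a trend chart
console.log(`Week-on-week change: ${change.toFixed(1)}%`);
```

A competitor’s numbers, by contrast, come from a different tool with different anomalies, so the same subtraction would mix two error profiles.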

ḷ.com, one more shenanigan in Referral Spam

Spamming the analytics data of websites is now an old practice. It’s better known as Referral Spam, and I have written about it multiple times in the past. It is purely a black-hat practice, and I doubt whether it gives great returns.

Yes, it would give traffic to the spammer, but how does that really translate into revenue? Or is the tactic hoping to draw in gullible folks by the horde?

The referral spam industry, for some reason, also loves to report the visitor’s geographical location as Samara, Russia. For those of you noticing this for the first time, here’s how the tactic works.

How Referral Spam works

  1. The bot hits a particular site multiple times a day
  2. The analyst opens his Google Analytics account and is surprised by a spike in traffic. Who wouldn’t mind seeing such a spike :)
  3. The obvious report to check is Source / Medium in the Acquisition section
  4. There, staring at you in all its glory, is the spamming domain
  5. The analyst gets curious, and visits the site

The rest would not be history; it would be a scam.
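As a sketch of how an analyst might triage such a spike offline, here is a minimal filter over exported referral rows. Everything here is hypothetical: the domain list is a made-up example (not a curated blocklist), and the row shape is just an illustration, not a GA API response.

```javascript
// Hypothetical blocklist of known spam referrer domains
const spamDomains = new Set(['free-traffic.example', 'seo-offers.example']);

// Split exported referral rows into spam and clean buckets
function splitSpamSessions(rows) {
  const spam = [];
  const clean = [];
  for (const row of rows) {
    (spamDomains.has(row.referrer) ? spam : clean).push(row);
  }
  return { spam, clean };
}

// Hypothetical Source / Medium export: one row per referrer
const rows = [
  { referrer: 'google.com', sessions: 420 },
  { referrer: 'free-traffic.example', sessions: 310 },
  { referrer: 'twitter.com', sessions: 55 },
];

const { spam, clean } = splitSpamSessions(rows);
console.log(`${spam.length} spam source(s), ${clean.length} clean source(s)`);
```

A triage like this tells you how much of the spike is noise before you touch any view filters.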

How should I combat this?

Raven Tools has a comprehensive article on combating Referral Spam. They have listed multiple methods to ensure that this spammy data is not counted in your analytics.

Personally, I allow the data to reside in my Master Data View. The reason behind that is – since I do not look at aggregate data anyway (I prefer lots and lots of custom segments), I am not too bothered by that data! I do, however, mark it with an annotation in my GA. That’s the advice I would give to anyone.

Is there a point to Social Media Management?

Life is short. It is time to point out an ugly truth, and to be the brave person that you are, the intelligent rational assessor of reality that you are, and kill all the organic social media activity by your company. All of it. Seems radical, but let’s take it one step at a time.…

via Stop All Social Media Activity (Organic) | Solve For A Profitable Reality — Occam’s Razor by Avinash Kaushik

Any Social Media Marketer would take this as an affront, but the wealth of insights based on pure data that Avinash shares in the article above is something to think about.

Social Media Platforms are not to be confused as Owned Platforms

There are platforms which we build, such as our very own discussion forum or a blog. These are Owned Platforms … and then there are platforms where people already exist and we simply establish our brand’s presence on them, such as social media sites, e.g. Facebook and Twitter.

In such cases, your brand’s outreach is subject to the policies dictated by that platform. Zuck’s Death Spiral (ZDS) is one such example that Avinash is talking about.

Shouldn’t brands adopt Social?

By all means adopt social and engage with your customers online. However, keep in mind that when in Rome, you do as the Romans do. That means that on Facebook, you follow the rules that Zuck lays out. Ergo, the same rinse-and-repeat formula of posting 4-5 social media updates a day may not work.

What is required instead is a concerted effort to truly wow your fans. If you do not wish to do that, and want to instead rely on the same well-worn formula of brand selfies, then your social media team is doing you a grave injustice.

A Success/Failure method for Analytics

When identifying the Key Performance Indicators (KPIs) of your business, it makes sense to choose the proper measures of success, something I have written about in the past. Since most of the work I do is in the realm of the web, the principles by which we operate and report are more or less the same.

The only thing that changes is the conversion … or the success metric. In other words, the reason for which the website was built: the purpose of that site. Hence, the measures-of-success approach works.

Designing for new paradigms

However, what would happen if the product being built is not meant for the web, or is not based on the same principles? How would we go about identifying metrics and actionable reports?

For that we would have to go to the very reason why we need analytics.

The Purpose of Analytics

If I were to define the reason why we use analytics in any product, it would be to –

  1. Identify the wins, celebrate them and try to find the rules which get us more wins
  2. Identify the failures, and figure out ways to fix those failures so that we can improve

This view primarily helps us do two things: find and scale the good things in our product, and find and weed out the bad things.

To do this, we would need metrics (or KPIs) that would indicate a success or a failure.

Measures of Success

Measure of success metrics help in identifying clear wins and celebrating them within the team. They also help in figuring out what worked for you in the past and how to re-create those wins. One definitive thing that needs to be done (and I have learnt this the hard way) is that wins, or measure of success metrics, need to be shared with a broader audience to give the entire team a sense of purpose about what they are working on.

Good measures of success are task completion rate, conversion rate, or profitability.

Measures of Failure

Measure of failure metrics help in identifying failures within a certain activity. They also help in identifying opportunities for improvement. Measure of failure metrics should help us root out problems within our current design/product. I say root out because once you identify the failure, you have to act and ensure that the failure does not happen again.

An example of measure of failure could be bounce rate.

Unlike measures of success, measures of failure may not need to be shared with large teams. Rather, I feel (and I want your opinion on this) that they are much more effective when communicated to the right localized teams.
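As a toy illustration of pairing the two kinds of metrics, here is a sketch that computes one success metric and one failure metric from the same session totals. The numbers are made up; the point is simply that both views come from the same underlying data.

```javascript
// Success metric: how often sessions end in the outcome the site exists for
function conversionRate(conversions, sessions) {
  return sessions === 0 ? 0 : (conversions / sessions) * 100;
}

// Failure metric: how often sessions go nowhere past the first page
function bounceRate(singlePageSessions, sessions) {
  return sessions === 0 ? 0 : (singlePageSessions / sessions) * 100;
}

// Hypothetical monthly totals for one site
const totals = { sessions: 2000, conversions: 60, singlePageSessions: 900 };

console.log(`Success: conversion rate ${conversionRate(totals.conversions, totals.sessions)}%`);
console.log(`Failure: bounce rate ${bounceRate(totals.singlePageSessions, totals.sessions)}%`);
```

The success number is the one to broadcast widely; the failure number goes to the localized team that can act on it.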

Importance of Context in Analyzing data

Recently, I was analyzing some user generated data in a mobile app. The app was sending content on specific categories to a niche audience, and at the end of each content piece, there was a simple 5 star rating feedback for users to rate the piece.

The assumption the design team made was that the feedback data was an objective metric.

Objective metric for Subjective behavior

Unfortunately, the behavior of users and how they understood the content piece is a very subjective matter. By subjective, I mean that for two different users, the value they associate with the usefulness of the same piece varies.

We could always say ceteris paribus, but I would say – “Let’s not fool ourselves here”.

In the world of subjectivity, ceteris paribus doesn’t exist

There could be so many factors associated with my giving a piece 5/5 v/s 4/5 that, in the end, I’d be forced to say it depends, and then list out a whole new set of variables.

Slicing the Data with new variables

This is a problem, since my existing data set does not have these new variables. So, from analyzing, I am now back to collecting data. To be frank, there’s no end to this cycle … collect data, realize that you might want more data, rinse, repeat.

Where do we divine the new rules and new variables? We start from the context.

Ergo, the simple and freeing realization: the answer to the questions we are asking of the data sometimes lies partially in the data points, and partially in the context.

Let me illustrate this

Let’s take a fairly popular metric – Bounce rate.

Now, if I were to say that my website’s bounce rate is 100%, what would you say?

Sucks, right??

.

.

.

Now, if I were to tell you that my website is a single-page website where I want my users to watch a product launch video, that bounce rate suddenly pales. Aren’t you itching to ask me about the number of users who played the video up to a certain point?

If you have been working with Google Analytics, then some of you might even suggest adding a non-interaction event in GA when the play button is hit.
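With the classic analytics.js library, that suggestion looks something like the sketch below (newer gtag.js syntax differs). The category, action, and label names are made up for illustration; building the payload in a small helper keeps it inspectable outside the browser.

```javascript
// Build the event payload for a video play.
// nonInteraction: true records the play without counting the hit
// as an interaction, so the single-page bounce rate is unaffected.
function videoPlayEvent(videoTitle) {
  return {
    hitType: 'event',
    eventCategory: 'Video',
    eventAction: 'play',
    eventLabel: videoTitle,
    nonInteraction: true,
  };
}

// In the browser, wired to the play button (assumes the standard GA
// snippet has already defined the global `ga` queue):
// playButton.addEventListener('click', () => ga('send', videoPlayEvent('Launch video')));
```

Flip `nonInteraction` to `false` if you instead want a play to count as engagement and pull the bounce rate down.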

One more example

Let’s take one more metric: Pages/Session, which measures how much content the user is consuming on a site.

.

.

.

Let’s put a different spin on this. A user is on your site, searching for content, unable to find what he wants, and keeps visiting different pages. After going through 8-9 pages, he finally gives up and leaves the site. That 8.5 pages/session doesn’t seem so sexy now, does it?


Understand the context

Therefore, staring at a pure data puke may not help. Understanding the context in which that data was collected is as important as going through Excel sheets or PowerPoint presentations.

TL;DR – Data without context is open to too many interpretations and is a waste of time.