Shiny tools don’t make a purpose

Recently, I bought a Fitbit. It’s a fantastic tool, and I could rave about its features and go on and on. However, a friend and colleague asked me an interesting question.

Has it changed you?
No, it has not.

Before I go on, I have to tell you that I am on the heavier side of the weighing scale. Those of you who know me personally would be surprised by my sudden interest in all things health. Yeah, I roll like that.

It’s not about the Fitbit

Like any other measurement tool, the Fitbit does a marvelous job of letting me know the metrics I need to care about.

They have even gamified steps with cute little badges and built-in peer support (and peer pressure) to keep me motivated. All this is good, as it should be.

At the core of it, it’s a measurement tool. Just like any of the billion other tools we use in Analytics.

Targets and Measurements

Along very similar lines, we as marketers or business people often deploy shiny new tools because we think they will help us do more.

Unfortunately, like me in this case, how many of us forget to define the purpose?

I implicitly assumed that the Fitbit would, by some magic, automatically give me the purpose of losing weight and leading a healthier life. Without this purpose, here’s what would happen:

I would wear it to work and dutifully report the steps taken, and life would go on as usual. Some badges would come in as time went by, and it would not really matter to me whether I took 2,000 steps a day (a walk in the park) or 10,000 steps a day (which I haven’t achieved yet).

How would I change if, let’s say, I gave myself a target of 10,000 steps a day?

Without Purpose, there’s no Change

For one, I would have to make time to walk those 10,000 steps. I could try walking around the office or choosing a more rigorous commute than an Uber. Either way, I would have to commit to making time for those steps.

Thus, this choice of making a change to my routine is what needs to be addressed. At the heart of it, the shiny new tool is not at the center. Yes, you have bought Google Analytics Premium and all of that is great … but that’s not really at the center.

At the center is the purpose. Has it been defined? Has it been clarified and articulated so that the team knows about it?

A tool doesn’t give us Purpose

It does give us a sense of progress towards our purpose. A Measure of Success, if you will. The shiny new tool that we just acquired is useful, but only as long as we keep the purpose at the center.

As people who know how to use a tool, if we do not understand the purpose, the tool will end up regurgitating meaningless data.

TL;DR — When setting up measures, don’t keep the tool at the center. Keep the purpose at the center. The rest should follow.

18 months down the line, what has Google AMP really achieved?

In early 2016, Google launched Accelerated Mobile Pages (AMP) for publishers who wanted their content to load in a flash on mobile devices.

At the time of writing this article, 65% of all traffic in Asia (and the figure is higher for developing countries) is on mobile. A bulk of these users are on mobile data networks, which may not be as fast as a steady broadband connection.

What Google therefore did was launch a series of initiatives, Weblight and now AMP, that help the search engine load the publisher’s content faster for the user.

Google is focusing on the user

The rationale Google gave publishers was that it was focused on the user’s experience. If a user runs a Google search on a choppy data connection, the search results might be presented in the blink of an eye; however, if the publisher’s site takes too long to load, the user gets a bad experience … or worse, the user concludes that Google is too slow!

With Google Weblight, Google loaded the publisher’s content onto an interim site (Weblight itself) and displayed it there. This created two problems:

  1. Publishers lost traffic, and Ad Revenues
  2. Publishers lost control over the format of their content and their style guides

Both reasons were strong enough for a lot of publishers to stay away from Weblight.

AMP gets introduced in the mix

To give publishers back some control over their content formats, and to incorporate both analytics and ad scripts into the publisher’s content, Google created another markup language: AMP.

AMP allows publishers to present the content on their own site, in a style that’s acceptable to them. It may not offer much flexibility, but at least the publisher is free to design that style, unlike the Weblight approach.

This may not be an ideal situation, but at least it ensures that users are shown the content they are searching for as fast as possible.

Have people embraced AMP?

Well, it’s a bit hazy there. For those of us on existing Content Management Systems (CMS) such as WordPress or Joomla, the transition was much easier: it just meant installing some plugins and doing the configuration.

However, the folks who have built their own web apps and products are often completely clueless about how to go about implementing AMP.

The sad part is that a lot of the product developers I have spoken to are of the opinion that AMP is just another new thing that “SEO folks” have to do. Add to that the mental model of SEO being perceived as a task much lower down the value chain, and it pretty much means that developers are simply not aware of the benefits of AMP.

What irks me is that people’s individual biases are used to mask their ignorance about how to make their products perform better on search.

So, if you are leading a product team or are working on building products, then definitely head over to the Accelerated Mobile Pages project.

As a publisher who has embraced AMP, how does that impact me?

Surprisingly, it does not help me much with acquiring more traffic. The website is shown a bit differently in the search engine results, and that perhaps gets me slightly higher click-through rates. However, the numbers are not significant enough for me to make an assessment based on the data provided.

One major problem with all the new Search initiatives from Google is its stubbornness about keeping things completely opaque.

Not a single publisher is in the loop when it comes to knowing the exact payoff of any of the optimization activities they carried out. It is left to these teams to dig in and figure it out themselves before they can attribute the success of the activity. I believe that’s a major deterrent for a lot of product managers when choosing whether to embrace AMP.

The web is not Google

Coming back to this post after six months, I found this on the internet: the AMP Letter. It is pretty much what I wanted to say about how this is shaping up.

Data anomalies in Search Console

In the past 5-6 years or so, a lot of online businesses, especially the ones hungry for growth, have relied on organic traffic as one of their key sources. Now, growth could mean an increase in pure numbers (traffic, sessions, users) … or it could mean an increase in more tangible business parameters (revenues, profits). One of the things I have learnt is that depending on which success metrics we chase, our own identity undergoes a shift.

Search as a major source of traffic

The major contributors to organic traffic are search and social. Wherever a site has loads of great, unique content, there is a chance to drive organic traffic.

At different points in time, I have been skeptical about Social Media and the me-too posting that most brand pages do on platforms such as Facebook. However, Search has always fascinated me, and I still have faith in it :).

SEO can’t be a method

Search Engine Optimization (SEO) has evolved over time, and I have blogged about it on multiple occasions. Unfortunately, the frequency of algorithm changes and the pace at which Google (the market leader in this space) redefines what it construes as quality content ensure that you can’t have a steady SEO “process”.

Having said that, SEO involves a fair amount of design thinking.

The reason behind this statement is that the problem of search visibility (and the factors that control it) keeps changing. It’s a wicked problem. Design thinking can solve such problems because of its test-and-iterate mechanism.

Data to drive Design Thinking

This is where having the correct data to decide on next steps is crucial. A data-driven design thinking approach entails periodic reviews of what data we have available to make the right choices.

Search data has always been plagued by incomplete information, starting with the 2011 encrypted search announcement, after which a bulk of the keyword data in Google Analytics was reported as (not provided). There have been ample approaches to clarify this data; unfortunately, as Google Search moves further onto handhelds and as digital privacy increases, the percentage of data with clear visibility will keep going down.

This can’t be helped. What can be done is to take these “anomalies” into account and factor them in while doing your analysis.

So what kind of Data anomalies in Search Console do we expect to find?

Google Support has compiled this list. They keep updating their data-reporting logic, and they keep this page updated as well.

One of the major changes you can see is that last month, they started reporting more data in Google Webmaster Tools. Please bear in mind that this is just a change in the data being reported, not in the actual search traffic on your site.

The link also explains why there is a data disparity between Google Analytics, Google Webmaster Tools, and any other third-party tool you might be using to generate keyword data.

So, my data is incomplete, what to do?

Don’t panic.

Work with the list of data anomalies and identify which ones impact you the most. Having visibility into which parts of the data are not available to you is far better than knowing nothing and assuming the data you have is complete.

When iterating, the first comparison is always against your previous state. In both periods, the data made available to you is pretty much the same. Hence, a week-on-week comparison report is much more valuable than a comparison report against your closest competitor.

As long as the measures of success come from the same tool, the data anomaly should cancel out. Please bear in mind that for most of our data work, we do not need precise data; we can work with coarse data.

A simple heuristic to identify this: if you mostly work with charts and graphs, you can work with coarse data and absorb the anomalies. If you work with more than 4 decimal places, then you might want to add 3-4 lines of disclaimer below your data.
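
To make the week-on-week idea concrete, here is a minimal sketch in TypeScript. The field names and weekly totals are made-up placeholders, not figures from any real report; the point is that a coarse, whole-number delta computed within the same tool is usually all you need.

```ts
// Minimal sketch: week-on-week comparison from the same tool.
// Field names and weekly totals below are made-up placeholders.
interface WeeklyStats {
  weekStart: string; // ISO date for the start of the week
  clicks: number;
  impressions: number;
}

const lastWeek: WeeklyStats = { weekStart: "2018-01-01", clicks: 1240, impressions: 31000 };
const thisWeek: WeeklyStats = { weekStart: "2018-01-08", clicks: 1391, impressions: 33500 };

// Coarse on purpose: a whole-number percentage is all the precision
// most week-on-week reviews need, and it absorbs small anomalies.
function weekOnWeekChange(previous: number, current: number): string {
  const pct = Math.round(((current - previous) / previous) * 100);
  return `${pct >= 0 ? "+" : ""}${pct}%`;
}

console.log(`Clicks WoW: ${weekOnWeekChange(lastWeek.clicks, thisWeek.clicks)}`); // +12%
console.log(`Impressions WoW: ${weekOnWeekChange(lastWeek.impressions, thisWeek.impressions)}`); // +8%
```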

ḷ.com, one more shenanigan in Referral Spam

Spamming the Analytics data of websites is an old practice by now. It’s better known as Referral Spam, and I have written about it multiple times in the past. It is purely a black hat practice, and I doubt it gives great returns.

Yes, it gives traffic to the spammer, but how does that really translate into revenue? Or is the tactic hoping to draw in gullible folks by the horde?

For some reason, the referral spam industry also loves to report the geographic location as Samara. For those of you noticing this for the first time, here’s how the tactic works.

How Referral Spam works

  1. The bot hits a particular site multiple times a day
  2. The analyst checks the Google Analytics account and is surprised by a spike in traffic. Who wouldn’t mind seeing such a spike? :)
  3. The obvious report to check this out is Source / Medium in the Acquisition section.
  4. There, staring at you in all its glory, is the spamming domain
  5. The analyst gets curious and visits the site

The rest would not be history; it would be a scam.

How should I combat this?

Raven Tools has a comprehensive article on combating referral spam. They list multiple methods to ensure this spammy data does not end up in your analytics.
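
As one illustration of the filtering idea, here is a minimal sketch in TypeScript that drops known spam referrers from an exported Source/Medium report. The rows and blocklist entries are examples I made up; you would maintain your own list.

```ts
// Sketch: drop known spam referrers from an exported Source/Medium report.
// The rows and blocklist entries are illustrative, not a definitive list.
interface ReferralRow {
  source: string; // e.g. "google" or a spammy referrer domain
  medium: string; // e.g. "organic", "referral"
  sessions: number;
}

// Maintain your own blocklist; spammers rotate domains constantly.
const spamReferrers = new Set(["free-share-buttons.com", "buttons-for-website.com"]);

function stripReferralSpam(rows: ReferralRow[]): ReferralRow[] {
  return rows.filter(
    (row) => !(row.medium === "referral" && spamReferrers.has(row.source))
  );
}

const report: ReferralRow[] = [
  { source: "google", medium: "organic", sessions: 4200 },
  { source: "free-share-buttons.com", medium: "referral", sessions: 900 },
];

console.log(stripReferralSpam(report)); // only the organic row remains
```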

Personally, I let the data reside in my Master Data View. The reason: since I do not look at aggregate data anyway (I prefer lots and lots of custom segments), I am not too bothered by it! I do, however, mark it with an annotation in GA. That’s the advice I would give anyone.

Is there a point to Social Media Management?

Life is short. It is time to point out an ugly truth, and to be the brave person that you are, the intelligent rational assessor of reality that you are, and kill all the organic social media activity by your company. All of it. Seems radical, but let’s take it one step at a time.…

via Stop All Social Media Activity (Organic) | Solve For A Profitable Reality — Occam’s Razor by Avinash Kaushik

Any social media marketer would take this as an affront, but the wealth of data-backed insights Avinash shares in the article above is something to think about.

Social Media Platforms are not to be confused with Owned Platforms

There are platforms we build ourselves, such as our very own discussion forum or a blog. These are Owned Platforms … and then there are platforms where people already congregate and we simply establish our brand’s presence there, such as Social Media sites, e.g. Facebook and Twitter.

In such cases, your brand’s outreach is subject to the policies dictated by that platform. Zuck’s Death Spiral (ZDS) is one such example that Avinash talks about.

Shouldn’t brands adopt Social?

By all means, adopt social and engage with your customers online. However, keep in mind that when in Rome, you do as the Romans do. That means on Facebook, you follow the rules that Zuck lays out. Ergo, the same rinse-and-repeat formula of posting 4-5 social media posts a day may not work.

What is required instead is a concerted effort to truly wow your fans. If you do not wish to do that and want to rely instead on the same well-worn formula of posting selfies of your brand, then your social media team is doing you a grave injustice.

A Success/Failure method for Analytics

When identifying the Key Performance Indicators (KPIs) of your business, it makes sense to choose the proper measures of success, something I have written about in the past. Since most of the work I do is in the realm of the web, the principles by which we operate and report are more or less the same.

The only thing that changes is the conversion … or the success metric. In other words, the reason the website was built: the purpose of that site. Hence, the measures-of-success approach works.

Designing for new paradigms

However, what would happen if the product being built were not meant for the web, or were not based on the same principles? How would we go about identifying metrics and actionable reports?

For that, we would have to go back to the very reason we need analytics.

The Purpose of Analytics

If I were to define the reason we use analytics in any product, it would be to:

  1. Identify the wins, celebrate them, and try to find the rules that get us more wins
  2. Identify the failures, and figure out ways to fix them so that we can improve

This view primarily helps us do two things: find and scale the good things in our product, and find and weed out the bad.

To do this, we would need metrics (or KPIs) that would indicate a success or a failure.

Measures of Success

Measures of success help in identifying clear wins and celebrating them within the team. They also help in figuring out what worked for you in the past and how to re-create those wins. One definitive thing that needs to be done (and I have learnt this the hard way) is that wins, or measure-of-success metrics, need to be shared with a broader audience to give the entire team a sense of purpose about what they are working on.

A good measure of success is task completion rate, conversion rate, or profitability.

Measures of Failure

Measures of failure help in identifying failures within a certain activity. These metrics also help in identifying opportunities for improvement, and they should help us root out problems in our current design or product. I say root out because once you identify the failure, you have to act and ensure that the failure does not happen again.

An example of a measure of failure could be bounce rate.

Unlike measures of success, measures of failure may not need to be shared with large teams. Rather, I feel (and I want your opinion on this) that they are much more effective when communicated to the right localized teams.
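
To make the two families of metrics concrete, here is a minimal sketch in TypeScript. The session shape, the sample numbers, and the bounce definition are simplifying assumptions for illustration, not how any particular analytics tool computes these; the idea is that the first number gets shared widely, while the second goes to the team that owns the fix.

```ts
// Minimal sketch: one measure of success (conversion rate) and one
// measure of failure (bounce rate) computed from raw session data.
// The Session shape and the bounce definition are simplifying
// assumptions, not how any specific analytics tool computes them.
interface Session {
  pageviews: number;
  converted: boolean; // did the session complete the site’s purpose?
}

function conversionRate(sessions: Session[]): number {
  const wins = sessions.filter((s) => s.converted).length;
  return (wins / sessions.length) * 100;
}

// Here a bounce is a single-pageview session with no conversion.
function bounceRate(sessions: Session[]): number {
  const bounces = sessions.filter((s) => s.pageviews === 1 && !s.converted).length;
  return (bounces / sessions.length) * 100;
}

const sessions: Session[] = [
  { pageviews: 1, converted: false },
  { pageviews: 5, converted: true },
  { pageviews: 2, converted: false },
  { pageviews: 1, converted: false },
];

console.log(`Conversion rate: ${conversionRate(sessions).toFixed(1)}%`); // 25.0%
console.log(`Bounce rate: ${bounceRate(sessions).toFixed(1)}%`); // 50.0%
```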

Importance of Context in Analyzing data

Recently, I was analyzing some user-generated data in a mobile app. The app was sending content in specific categories to a niche audience, and at the end of each content piece there was a simple 5-star rating widget for users to rate the piece.

The assumption the design team made was that this feedback data was an objective metric.

Objective metric for Subjective behavior

Unfortunately, how users behave and how they understand a content piece is a very subjective matter. By subjective, I mean that two different users will associate different values with the usefulness of the same piece.

We could always say ceteris paribus, but I would say: “Let’s not fool ourselves here”.

In the world of subjectivity, ceteris paribus doesn’t exist

There could be so many factors behind my giving a piece 5/5 vs. 4/5 that, in the end, I’d be forced to say “it depends” and then list out a whole new set of variables.

Slicing the Data with new variables

This is a problem, since my existing data set does not have these new variables. So from analyzing, I am now back to collecting data. To be frank, there’s no end to this cycle … collect data, realize you might want more data, rinse, repeat.

Where do we divine the new rules and new variables? We start from the context.

Ergo, the simple and freeing realization: the answers to the questions we are asking of the data sometimes lie partially in the data points and partially in the context.

Let me illustrate this

Let’s take a fairly popular metric – Bounce rate.

Now, if I were to say that my website’s bounce rate is 100%, what would you say?

Sucks, right?

.

.

.

Now, what if I told you that my website is a single-page site where I want my users to watch a product launch video? That bounce rate suddenly pales, and aren’t you itching to ask me about the number of users who played the video up to a certain point?

If you have been working with Google Analytics, some of you might even suggest adding a non-interaction event in GA when the play button is hit.
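
For reference, here is a minimal sketch of that suggestion using the classic analytics.js event syntax. The element id and the category/action/label values are placeholders I have made up for illustration.

```ts
// The standard GA snippet defines a global `ga` command queue;
// declare it here so TypeScript compiles this standalone.
declare function ga(...args: unknown[]): void;

function onPlayClicked(): void {
  // nonInteraction: true records the play without affecting how GA
  // decides whether the session was a bounce, so the video metric
  // gets measured while the bounce rate keeps its plain meaning.
  ga("send", "event", "Video", "play", "product-launch-video", {
    nonInteraction: true,
  });
}

// "play-button" is a hypothetical element id on the single-page site.
document.getElementById("play-button")?.addEventListener("click", onPlayClicked);
```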

One more example

Let’s take one more metric: Pages/Session, which measures how much content the user is consuming on a site.

.

.

.

Let’s look at this from a different angle. A user is on your site, searching for content, unable to find what he wants, and keeps visiting different pages. After going through 8-9 pages, he finally gives up and leaves the site. That 8.5 pages/session doesn’t seem so sexy now, does it?

Understand the context

Therefore, staring at a pure data puke may not help. Understanding the context in which the data was collected is as important as going through Excel sheets or PowerPoint presentations.

TL;DR – Data without context is open to too many interpretations and is a waste of time.