Reading through the post, I realized that the title is a bit off. It’s not that Social Media is sending some dangerous traffic, but that the traffic it sends is being incorrectly measured as Direct traffic and is therefore difficult to act upon. This misattribution can lead to a lot of tactical mistakes.
What’s more interesting is the story about World War II that Gareth has nicely illustrated. A D-Day rehearsal caused more deaths than D-Day itself, because people drew the wrong conclusions from the data made available to them.
A light skim of this article might put me off Social Media as a marketing channel. As it is, I am a bit biased against it, and this would have put the final nail in the coffin. However … this is the blind spot that I am referring to.
Slight misinformation, and there we go jumping to the wrong conclusions. As an analyst, something you want to keep in mind is the quality and veracity of the data that you analyze.
In early 2016, Google launched Accelerated Mobile Pages (AMP) for publishers who wanted their content to load in a flash on mobile devices.
At the time of writing this article, 65% of all traffic in Asia (and the share is higher in developing countries) is on mobile. A bulk of these users are on mobile data networks, which may not be as fast as a steady broadband connection.
What Google did, therefore, was launch a series of initiatives, first Weblight and now AMP, to help the search engine load the publisher’s content faster for the user.
Google is focusing on the user
The rationale that Google gave to publishers was that it was focusing on the user’s experience. If a user does a Google search on a choppy data connection, the search results might be presented in the blink of an eye; however, if the publisher’s site takes too long to load, the user gets a bad experience … or worse, the user concludes that Google is too slow!
With Google Weblight, Google loaded the publisher’s content on an interim site (Weblight itself) and displayed it there. This created two problems –
Publishers lost traffic, and with it ad revenues
Publishers lost control over the format of their content and their style guides
Both reasons were strong enough for a lot of publishers to stay away from Weblight.
AMP gets introduced in the mix
To give publishers back some control over content formats, and also to incorporate both analytics and ad scripts into the publisher’s content, Google created another markup language: AMP.
AMP allows the publisher to present the content on their own site, in a style that’s acceptable to the publisher. It may not offer much flexibility, but at least the publisher is free to design that style, unlike the Weblight approach.
This may not be an ideal situation, but at least it ensures that users are shown the content they are searching for as quickly as possible.
Have people embraced AMP?
Well, it’s a bit hazy there. For those of us on existing Content Management Systems (CMS) such as WordPress or Joomla, the transition was much easier. It just meant installing some plugins and doing the configuration.
However, folks who have built their own web apps and products are often completely clueless as to how to go about implementing AMP.
The sad part is that a lot of the product developers I have spoken to are of the opinion that AMP is just a new thing that “SEO folks” have to do. Add to that the mental model of SEO being perceived as a task much lower in the value chain, and it pretty much means that developers are simply not aware of the benefits of AMP.
What irks me is that people’s individual bias is used to mask their ignorance about how to make their products perform better on search.
As a publisher who has embraced AMP, how does that impact me?
Surprisingly, it does not help me much with acquiring more traffic. The website is shown a bit differently in the search engine results, and that is perhaps getting me a slightly higher click-through rate. However, the numbers are not high enough for me to draw conclusions from the data provided.
One major problem with all the new initiatives Google is rolling out in Search is its stubbornness about keeping things completely opaque.
Not a single publisher is in the loop about the exact payoff of any of the optimization activities they undertook. These teams are left to dig in and figure it out themselves before they can attribute the success of the activity. I believe that’s a major deterrent for a lot of product managers deciding whether to embrace AMP.
The web is not Google
I am coming back to this post after 6 months, having found this on the internet: the AMP Letter. It says pretty much what I wanted to say about how this is shaping up.
In the past 5-6 years or so, a lot of online businesses, especially the ones hungry for growth, have relied on organic traffic as one of their key sources. Now, growth could mean an increase in pure numbers (traffic, sessions, users) … or it could mean an increase in more tangible business parameters (revenues, profits). One of the things I have learnt is that depending on which success metrics we chase, our own identity undergoes a shift.
Search as a major source of traffic
The major contributors to organic traffic are search and social. Any site with loads of great, unique content has a chance of driving organic traffic.
At different points in time, I have been skeptical about Social Media and me-too posting that most brand pages do on platforms such as Facebook. However, Search for me has always been fascinating and I still have faith in Search :).
SEO can’t be a method
Search Engine Optimization (SEO) has evolved over time, and I have blogged about it on multiple occasions. Unfortunately, the frequency with which the algorithm changes, and the rate at which what Google (the market leader in this space) construes as quality content evolves, ensure that you can’t have a steady SEO “process”.
Having said that, SEO involves a fair amount of design thinking.
The reason behind this statement is that the problem of search visibility (and the factors that control it) keeps changing. It’s a wicked problem. Design thinking can solve such problems because of its test-and-iterate mechanism.
Data to drive Design Thinking
This is where having the correct data to decide on next steps is crucial. A data-driven design thinking approach entails periodic reviews of what kind of data is available for making the right choices.
Search data has always been plagued by incomplete information, starting from the 2011 encrypted-search announcement, after which a bulk of the keyword data in Google Analytics was reported as (not provided). There have been ample approaches to clarifying this data; unfortunately, as Google Search moves further onto handheld devices and digital privacy increases, the percentage of data with clear visibility will keep going down.
This can’t be helped. What can be done is to take these “anomalies” into account and factor them into your analysis.
So what kind of Data anomalies in Search Console do we expect to find?
Google Support has compiled this list. They keep updating their data-reporting logic, and they keep this page updated as well.
One of the major changes you can see is that last month they started reporting more data in Google Webmaster Tools. Please bear in mind that this is just a change in the data being reported, not in the actual search traffic on your site.
The link also explains why there is a data disparity between Google Analytics, Google Webmaster Tools, and any third-party tool you might be using to generate keyword data.
So, my data is incomplete, what to do?
Work through the list of data anomalies and identify which ones impact you the most. Knowing which parts of the data are unavailable to you is far better than knowing nothing and assuming that the data you have is complete.
In iterations, the first comparison is always against your previous state. In both periods, the data made available to you is pretty much the same, so a week-on-week comparison report is much more valuable than a comparison report against your closest competitor.
As long as the measures of success come from the same tool, the data anomalies should cancel out. Please bear in mind that for most of our data work we do not need precise data; we can work with coarse data.
A simple approach to decide: if you mostly work with charts and graphs, you can work with coarse data and absorb the anomalies. If you work with more than 4 decimal places, then you might want to add 3-4 lines of disclaimer below your data.
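To make the week-on-week idea concrete, here is a minimal Python sketch. The click counts are hypothetical, not real Search Console figures; the point is that a systematic reporting anomaly that inflates or deflates both weeks equally largely cancels out of the percentage change.

```python
# A minimal sketch of a week-on-week comparison that tolerates coarse data.
# The click counts below are hypothetical, not real Search Console figures.

def week_on_week_change(previous: float, current: float) -> float:
    """Percentage change versus the previous week, rounded coarsely.

    Both figures come from the same tool over adjacent periods, so a
    systematic reporting anomaly affects both and largely cancels out.
    """
    if previous == 0:
        raise ValueError("previous week has no data to compare against")
    return round((current - previous) / previous * 100, 1)

clicks_last_week = 1240   # hypothetical organic clicks, week 1
clicks_this_week = 1371   # hypothetical organic clicks, week 2

print(week_on_week_change(clicks_last_week, clicks_this_week))  # 10.6
```

Because both numbers come from the same tool over adjacent periods, the comparison stays meaningful even when the absolute values are known to be incomplete.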
Recently, I was analyzing some user-generated data in a mobile app. The app was sending content on specific categories to a niche audience, and at the end of each content piece there was a simple 5-star rating for users to rate the piece.
The assumption of the design team who thought this up was that the rating data was an objective metric.
Objective metric for Subjective behavior
Unfortunately, the behavior of users and how they understood the content piece is a very subjective matter. By subjective, I mean that for two different users, the value they associate with the usefulness of the same piece varies.
We could always say ceteris paribus, but I would say: “Let’s not fool ourselves here”.
In the world of subjectivity, ceteris paribus doesn’t exist
There could be so many factors behind my giving a 5/5 to a piece v/s a 4/5 to the same piece that, in the end, I’d be forced to say it depends, and then list out a whole new set of variables.
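A tiny Python sketch shows how an average star rating hides exactly this subjectivity. The ratings below are invented for illustration: two pieces can share the same mean while one audience agrees and the other is split, and a simple spread measure exposes the difference.

```python
from statistics import mean, stdev

# Hypothetical 5-star ratings for the SAME content piece from two audiences.
consensus = [4, 4, 4, 4, 4]   # everyone broadly agrees
polarised = [5, 5, 4, 3, 3]   # identical mean, but opinions diverge

# The average alone makes both audiences look identical...
assert mean(consensus) == mean(polarised) == 4

# ...while the spread exposes the subjectivity the average hides.
print(stdev(consensus))   # 0.0
print(stdev(polarised))   # 1.0
```

Reporting the spread next to the mean is a cheap first defence before going back to collect entirely new variables.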
Slicing the Data with new variables
This is a problem, since my existing data set does not have these new variables. So, from analyzing, I am now back to collecting data. To be frank, there’s no end to this cycle … collect data, realize that you might want more data, and rinse, repeat.
Where do we divine the new rules and new variables? We start from the context.
Ergo, the simple and freeing realization: the answers we were looking for in the data sometimes lie partially in the data points, and partially in the context.
Let me illustrate this
Let’s take a fairly popular metric – Bounce rate.
Now, if I were to say that my website’s bounce rate is 100%, what would you say?
Now, what if I were to tell you that my website is a single-page site where I want my users to watch a product launch video? That 100% bounce rate suddenly pales, and aren’t you itching to ask me about the number of users who played the video up to a certain point?
If you have been working with Google Analytics, some of you might even suggest adding a non-interaction event in GA when the play button is hit.
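For context, in Google Analytics an event marked non-interaction leaves the bounce rate untouched, whereas a regular interaction event makes a single-page session count as non-bounced. A small Python sketch over hypothetical session records (the field names are made up, not a real GA export schema) shows how the bounce number shifts once a video play is treated as an interaction:

```python
# Hypothetical single-page-site sessions; field names are made up and
# not a real Google Analytics export schema.
sessions = [
    {"pageviews": 1, "played_video": True},
    {"pageviews": 1, "played_video": False},
    {"pageviews": 1, "played_video": True},
    {"pageviews": 1, "played_video": True},
]

# Classic bounce: a single-pageview session with no interaction recorded.
classic_bounce = sum(s["pageviews"] == 1 for s in sessions) / len(sessions)

# Adjusted bounce: a video play counts as an interaction, so those
# sessions are no longer bounces even though only one page was viewed.
adjusted_bounce = sum(
    s["pageviews"] == 1 and not s["played_video"] for s in sessions
) / len(sessions)

print(classic_bounce)   # 1.0  -> the scary 100% bounce rate
print(adjusted_bounce)  # 0.25 -> a far more honest picture
```

Same raw data, same tool; only the context of what counts as engagement changed.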
One more example
Let’s take one more metric: pages/session, used to measure how much content the user is consuming on a site.
Let’s look at it from a different angle. A user is on your site, searching for content, unable to find what he wants, and keeps visiting different pages. After going through 8-9 pages, he finally gives up and leaves the site. That 8.5 pages/session doesn’t seem so sexy now, does it?
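A short Python sketch makes the ambiguity explicit. The session logs and the `found_it` field are invented for illustration: a happy reader and a frustrated one can produce the same pages/session, and only an added context signal separates them.

```python
from statistics import mean

# Hypothetical session logs: pages viewed plus one piece of context
# (did the visitor actually find what they came for?). Field names are
# made up for illustration.
engaged    = [{"pages": 9, "found_it": True},  {"pages": 8, "found_it": True}]
frustrated = [{"pages": 9, "found_it": False}, {"pages": 8, "found_it": False}]

def pages_per_session(sessions):
    return mean(s["pages"] for s in sessions)

def success_rate(sessions):
    return sum(s["found_it"] for s in sessions) / len(sessions)

# Pages/session alone cannot tell the two groups apart...
print(pages_per_session(engaged), pages_per_session(frustrated))  # 8.5 8.5
# ...but the context signal can.
print(success_rate(engaged), success_rate(frustrated))            # 1.0 0.0
```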
Understand the context
Therefore, staring at a pure data puke may not help. Understanding the context in which the data was collected is as important as going through the Excel sheets or PowerPoint presentations.
TL;DR – Data without context is open to too many interpretations and is a waste of time.
In November 2016, Data Studio was made available to all users in India. The product had been launched quite some time back; however, it was only accessible in the US and to premium Google Analytics 360 users.
However, as of today, anyone can use Google Data Studio to create dazzling reports that can be shared with teams and clients.
So how does one go about creating awesome reports?
That’s where Data Studio shines: it allows users to create one template that can be utilized across multiple data sources. I tried to create a quick report using one of the default templates provided; here’s a step-by-step guide on using Data Studio to create reports.
I had earlier written about Moz’s Aleyda and her checklist for International SEO. Last week, Moz had a Whiteboard Friday with Aleyda running us through that checklist. For people wishing to target geographies where they do not have any presence, this is a must-see video.
Avinash Kaushik gives simple and clear answers to most challenges that digital marketers face. The explanation uses Venn diagrams to make strong points. If you are a HiPPO (Highly Paid Person with an Opinion), then do take some time out of your busy schedule and go through this. If your organization is not actively working on any analytics initiatives, it is high time you started!
Annie Cushing shares her presentation from SMX East and drives a point home: for marketers to make sense to top management, data visualization is crucial. People prefer looking at well-presented data instead of just a series of numbers.
I love infographics. The way they break data down into beautiful little pictures and help you understand its impact is excellent. However, it can take a fair bit of effort to create an infographic … believe me, I have tried and used multiple tools to do this. If you are thinking of building one from the ground up, you are faced with challenges such as choosing the colours and typography, and deciding which data to show in what manner.
If you are design-impaired like me, this steep learning curve is bound to turn you off.
This is where visual.ly really shines. It provides you with templates for creating infographics, templates that have been tried and tested and make your job of creating an infographic easy. What’s awesome is that they keep releasing kickass integrations such as this one, wherein you simply give access to your Google Analytics data and it creates a weekly infographic such as the one above.
If you are a data nerd, you may not appreciate the findings of this report, but you should be able to relate to some of these important points. As a webmaster and a data nerd, I am happy that the share of organic search has dropped … since I am now slowly looking at other sources of traffic. The drop in organic traffic has come alongside a decent rise in social traffic, and that makes me a happy webmaster.