Technology is the collection of tools, including machinery, modifications, arrangements and procedures used by humans. Engineering is the discipline that seeks to study and design new technologies. The Big Fat Geek is an Engineer.
I read through the post and realized that the title is a bit off. It’s not that Social Media is sending dangerous traffic, but that the traffic it sends is being incorrectly measured as Direct traffic and is therefore difficult to act upon. This misattribution can lead to a lot of tactical mistakes.
What’s more interesting is the World War II story that Gareth has nicely illustrated. The deaths in a D-Day rehearsal exceeded those on D-Day itself, and the reason was people coming to the wrong conclusions from the data made available to them.
A light skim of this article might put me off Social Media as a marketing channel. As it is I am a bit biased against it, but this would have put the final nail in the coffin. However … this is the blind spot that I am referring to.
Slight misinformation, and there we go jumping to the wrong conclusions. As an analyst, something that you might want to keep in mind is the quality and the veracity of the data that you analyze.
As someone who has led the technology function in multiple start-ups, sometimes as a founder and sometimes as a consultant (a consulting CTO), one key lesson I have learned is this: whenever technology is viewed as a silver bullet for all the problems of the business, that start-up is bound to face a lot of scale-up hiccups.
What is a silver bullet?
A silver bullet was considered the only way to kill a werewolf. The term is therefore used for a magical solution to a difficult problem. In India, we have another such term … रामबाण
So, a solution which takes care of your problems.
What does technology represent?
When I refer to technology, I am not using this as a generic term. This is specifically intended to mean information technology resources … including machines, people, code and systems.
Technology usually represents scale through automation. It does not necessarily mean problem solving. The solution that technology has to implement is usually a process, or a product.
This product or process usually has to be designed, and that design is not necessarily the domain of someone who knows information technology. Usually, a person who knows the business well produces a better design than a person who can merely code.
Design requires engagement
Engagement with the problem so that the solution can be found. Therein lies the problem.
Now if the technology team that is within the organization knows the business well enough and if they are willing to engage with the problem at hand, then a proper solution can be designed.
Technology + Business could possibly do this
More often than not, technology resources are not business-centric. They are “requirements”-centric. I am being a bit harsh, but this is so rampant in India that IT leaders need to start rethinking the way they engage with the business. Perhaps a small business-centric subject should be included in engineering courses.
Some symptoms of this problem
However, until business and technology partner on equal ground, this problem will always be seen. What problem, you may ask … here are some symptoms, followed by what typically happens with such teams.
An entrenched technology team which is in “victim” mode all the time. They have no control over the work they are doing, and no say in the business.
A rigid technology team which raises a mountain of paperwork and bureaucracy for every incoming task. Forms need to be filled in triplicate, multiple documents need to be created, and all of it is then project-managed by a committee. A line of code requires a month’s paperwork.
A technology team that’s viewed as nincompoops or defunct because it is neither effective nor responsive to the business. In spite of having in-house resources, other teams choose to outsource work to their vendors.
An overworked team that’s loaded with so much work that they just don’t care about meeting deadlines or creating something of value. Testing is haphazard, things rarely get documented, and cowboy coding is rampant.
A risk-averse team that lacks the confidence to do great things. In spite of having in-house capabilities, no one is willing to risk their neck, so work gets outsourced to vendors.
When you notice such teams in a start-up, more often than not, that start-up is not going anywhere. Until and unless the team and the business undergo a severe change in attitudes towards each other, the team is not going to achieve shit.
Such a technology team is not a silver bullet; it is a white elephant.
One of the major shifts in online advertising that I have observed recently is the rampant use of remarketing campaigns.
What are remarketing campaigns?
I like to think of remarketing campaigns in the form of a popular ad campaign that Vodafone (then Hutch) ran in India.
This brilliant campaign talks about how the network follows the user and ensures that the telephone network is always available to the end customer. Keep in mind, those were the days when network connectivity was a major issue.
Remarketing campaigns are very similar: instead of the telephone network, it’s the ad network that follows the user, ensuring they are targeted across the different websites running its ad inventory.
If done right, remarketing campaigns can even seem serendipitous.
For example, let’s say I went to Flipkart or Amazon to purchase a particular product and added it to my cart. Because of that particular action, I could be included in a remarketing audience, and this audience is then shown ads across different Display Networks. One of the most popular display networks out there is the Google Display Network (GDN).
However, this is not the only display network, there are multiple networks out there who can provide the same facility to the marketer.
It’s all about the spends for Display Networks
Now, you have to realize that for all Display Networks, and even for Social Media sites, the primary revenue model is advertising. That means they want to grab more and more of the brand’s wallet share. A few years back, Google was ruling the roost in India; now, Facebook is giving Google AdWords a run for its money.
Therefore, whenever a new feature becomes available on one network, the other ad networks simply duplicate it. Did you know that at present, if you wanted to run remarketing campaigns, you could do so on Facebook, Google, LinkedIn, Twitter, Instagram and YouTube? The list goes on, and the ability to create Custom Audiences and Lookalike Audiences is also available across all these channels.
Simply put, all the old and new marketing networks out there are willing to provide the features that marketers need in order to target (and re-target) their customers.
So where does that leave us?
Overzealous remarketing
It leaves us with a whole bunch of overzealous marketers who want to get in front of the user and keep bombarding him/her with their offers. No matter what.
Take this case: I recently visited a website that was being promoted by a known agency. I was doing a routine check of their tag implementation. Satisfied that most of the obvious issues were taken care of, I left the site. Notice, there was no purchase intent.
Now, everywhere I go, I am being bombarded with impressions of this site. On Instagram, on Facebook, on GDN. Cute, but am I going to click on the ad? Not really. Are these impressions wasted? Yes, on me, they are.
I never intended to buy!
Is such a bombarding of the user the only mechanism to deliver results?
So what can be done to make remarketing more effective?
As a marketer in charge of running these remarketing campaigns, there are a couple of things you could do immediately to reduce the spends and thereby increase the efficacy of your campaigns.
Put a frequency cap on each of your creatives. If I haven’t clicked on your ad the last 20 times, there’s a snowflake’s chance in hell that I will click on it the 21st time!
Create remarketing campaigns based on user actions on the site, and not just a blatant site visit. If I have done certain things on the site that indicate my intent, e.g. started filling a form, downloaded a brochure, added a product to my cart, then it makes sense for me to be included in the respective remarketing list.
Exclude users who have already converted from your remarketing lists. If you do not do this, your ads will also be shown to users who have already converted, wasting a lot of impressions. Skipping this step is just plain lazy.
Plan your remarketing campaigns on paper first: think through the entire process and then kick off the campaigns. Most of the time, remarketing campaigns are launched after the first set of campaigns, since you need visitors to populate your remarketing lists. That means you have time to plan and think things through. Don’t waste that time.
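The list-hygiene rules above can be sketched as a simple filter over a site’s event log. This is only an illustrative sketch: the event names, log format and frequency-cap source here are hypothetical, not any ad platform’s actual API.

```python
# Build a remarketing audience: include only users who showed intent,
# drop users who already converted, and respect a frequency cap.
# Event names and data shapes are illustrative assumptions.

INTENT_EVENTS = {"form_start", "brochure_download", "add_to_cart"}
CONVERSION_EVENTS = {"purchase", "signup_complete"}
FREQUENCY_CAP = 20  # stop showing ads once a user has seen this many

def build_audience(events, impressions_served):
    """events: list of (user_id, event_name) tuples.
    impressions_served: dict of user_id -> ad impressions already shown."""
    intent, converted = set(), set()
    for user_id, event in events:
        if event in INTENT_EVENTS:
            intent.add(user_id)
        if event in CONVERSION_EVENTS:
            converted.add(user_id)
    return {
        user for user in intent
        if user not in converted
        and impressions_served.get(user, 0) < FREQUENCY_CAP
    }

events = [
    ("alice", "add_to_cart"),
    ("bob", "page_view"),       # a bare visit is not intent: excluded
    ("carol", "add_to_cart"),
    ("carol", "purchase"),      # already converted: excluded
]
audience = build_audience(events, {"alice": 3, "carol": 0})
print(audience)  # {'alice'}
```

Note how a plain `page_view` never qualifies: intent, not mere presence, gets a user onto the list.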
If after all this, your remarketing campaigns still don’t deliver results, do let me know!
After thought on Remarketing campaigns
In a day and age where individuals online are only slowly waking up to the concept of online privacy, we as marketers often don’t realize that remarketing done to death can turn a meeting of chance into an oh-my-god-the-brand-is-stalking-me kind of feeling.
The next time you are thinking of remarketing, do tone it down a bit please.
Google AdSense has been around for more than a decade and a half now. Together with DoubleClick for Publishers, it allows website owners to monetize their traffic.
One of the key challenges has been figuring out the optimum ad placements without impacting the readability and user experience of the site. The publisher had to decide on the different ad slots to create on the web page, and then balance that trade-off against the Revenue Per Thousand Impressions (RPM) metric that the digital advertising industry is so familiar with.
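The RPM metric itself is simple arithmetic: revenue divided by impressions, scaled to a thousand. A quick sketch (the figures are made up for illustration):

```python
def rpm(ad_revenue, page_impressions):
    """Revenue per thousand impressions (RPM): revenue / impressions * 1000."""
    return ad_revenue / page_impressions * 1000

# e.g. $45 of ad revenue earned across 30,000 page impressions
print(rpm(45.0, 30_000))  # 1.5
```

So a layout change only pays off if the aggregate RPM goes up without the extra ad slots driving readers away.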
In order to help publishers out, AdSense had experiments where you could test different ad layouts and figure out the best layout to monetize the site.
So what has changed now?
This is where Auto ads come in: an application of artificial intelligence that gives programs the ability to discover new rules and learn from experience without additional programming. For newbie publishers, that means instead of having to figure out for yourself which ad formats and placements work, you can let the platform learn on its own.
What that means is that the publisher is now free to focus on content, and let the AdSense platform figure out the best way to monetize that content on the ad network.
With every new feature, comes a series of disclaimers. Machine Learning requires a lot of data to get things right. If you are a small site such as this blog, then it will take a long time for AdSense to optimally figure out the right ad formats and the proper ad placements.
Having said that, here’s a very simple way using which you can get started with Auto Ads in AdSense.
Setting up Auto Ads
In your AdSense console, in the Ads section, you will now find an Auto ads menu item. Click on it and get started with the setup wizard that’s present there. If you want to know how to embed the Auto ads code in your site, Google also has a helpful support article here.
That’s it! Once the code is set up on your website, you choose the formats you want (I chose everything) and let it run.
So far, the results haven’t been that great. However, time will tell if applying machine learning gives great benefits for the publisher.
What benefits should one look at?
Ultimately, it boils down to increasing the aggregate Revenue per thousand impressions (RPM) metric. That’s what I’d look at; I would also want the click-through rates (CTRs) to go up.
I had blogged about getting traffic from bots leaving referral signatures, and it seemed as if the whole internet saw this happening on their sites. After I published that post, Moz.com came out with suggestions on setting up filters in Google Analytics to clean up your analytics data.
I wrote this note for a discussion on Social Media sites and how their relationship with publishers has evolved over time. It goes to show that too much reliance on any one channel may not be such a good thing after all!
Can we as digital marketers and analysts create a measurement model that can reliably help us to identify whether our social media investments are justified?
Social Media and Creators
One of the problems that new Social Media websites face is generating enough content that users want to consume. They do this by welcoming publishers to register on their sites. This is the main fuel for their growth.
The social media site in question (including Facebook) does all it can to attract publishers and creators. The focus is on getting more creators and therefore more users. Users get to follow their favorite brands and celebrities on these sites. Brands and celebrities get a scalable way to engage with their fans. A win-win on paper.
A platform is born
As more users sign up and start using the site, it soon starts being recognized as a platform. The platform is now independently known, and creators are attracted to it not because it’s easy to publish or create their content … but because that platform already has their potential target audience.
So the reason the platform is being used shifts from engagement at scale to reach and discovery. The very publisher who used to have throngs of crowds flocking around them now looks at the platform as the source of that crowd. This shift in behavior, driven by the change in thinking, is not lost on the platform owners.
From Win-Win to Monopoly
The platform owner now knows how dependent the publisher is on the platform. E.g., Facebook single-handedly crippled Zynga’s stock price (Zynga being famous for the FarmVille app on Facebook) by taking it off their Featured apps page.
Take the organic reach that Facebook now provides. Some years back (circa 2012), a single post on your Facebook page would be shown to 10-12% of your followers. This has slowly trickled down to 1% now (3-4% if you have high engagement on the page). The reason is that every brand out there is pushing out far more content than the platform was designed for, and every brand and celebrity wants to create content that goes viral.
Pursuit of Viral
Publishers, in pursuit of this holy grail, tend to create a Sea of Crappy Content: loads and loads of content which does not drive engagement. Platform owners are now wary of the very publishers they used to chase. Not because they don’t need them … but because they are no longer clearly able to differentiate the good ones from the bad ones. The definition of quality becomes more blurred.
Zero Organic Reach
In the end, the platform owner plays the one card that they control: throttle the impressions and reach of publishers. Quality is then replaced with budgets, with the underlying assumption that if you can create great content, you most likely have enough budget to buy the impressions required to go viral.
Another example: look at any Facebook page with over 10,000 likes; the last post on that page won’t even have an engagement rate of 1%. The problem may not lie with the page or the post itself; it stems from the throttling of organic reach.
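The arithmetic behind that sub-1% figure is straightforward: once organic reach is throttled, even a healthy engagement rate among the users actually reached collapses when measured against the full follower base. The reach and engagement percentages below are illustrative assumptions, not platform-reported numbers.

```python
followers = 10_000
organic_reach = 0.01             # platform shows the post to ~1% of followers
engagement_among_reached = 0.10  # assume a healthy 10% of reached users engage

reached = followers * organic_reach            # 100 users actually see the post
engaged = reached * engagement_among_reached   # 10 of them engage
engagement_rate = engaged / followers          # but it's measured vs all followers
print(f"{engagement_rate:.1%}")  # 0.1%
```

So a page can be doing everything right with the audience it reaches and still report a dismal engagement rate, purely because the denominator is the full follower count.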
So what can be done?
Do we pay the piper and buy our followers? Or do we dance to the tune of the platforms and keep pushing more content in the hope of getting that one beautiful post that gets shared by the millions?
Can we instead, arrive at a scientific method of identifying what platform works and what doesn’t in furthering our objectives?
This has been coming for quite some time: Google is slowly rolling out an update to the Search Console. The new Search Console has a much cleaner interface, with most of the reports and insights tucked away under multiple layers.
The new Search Console dashboard is much simpler now, with just three reports that you can view –
Performance – How your site is performing in the search results. This is similar to the Search Analytics report.
Index Coverage – How your site has been indexed and what errors the search engine detects on the site
AMP – Information about the Accelerated Mobile Pages and how Google detects AMP on your site
The good things
Straight off the cuff, this tool provides just the right information to the user. It forces the user to engage with the reports available, and provides interesting insights that were not available before.
Take a look at this new report, which shows the impact on Search Impressions when I decided to mark 75-odd pages as noindex.
Overall, most reports, even when unchanged, are now much easier to understand and interpret.
Another example is the Performance report with the Pages section selected. This feature was available in the older interface as well, but now the data is much clearer, and it tells me which pages I should work on to improve my CTRs.
The bad things
There are more things yet to come, and this is not a complete experience yet; I cannot rely on the new interface alone. Some reports simply haven’t been incorporated into the new Search Console.
Reports on Structured Data are completely missing, and the diagnostic tools and crawl-request tools have not been included either.
The update to the Search Console is welcome, and I for one am glad that a much cleaner interface has been made available.