Nitropack review

Those of you who run some sort of content management system (CMS) for your websites will be familiar with the problem of improving site loading speed. From the age-old caching approach of using the OPcache module, to application-specific solutions such as WP-Supercache for your WordPress installations, the sheer variety of options out there is overwhelming.

For a non-technical webmaster (these days, the term itself seems like a contradiction!), it becomes difficult to choose. At the end of the day, what one ends up caring about is how fast the website loads and, more importantly, how the site scores on web performance.

Let’s take a look at some of the common factors that any webmaster would look at when choosing a caching solution.

Server-side rendering time

This is effectively how fast your server returns a response to the browser. Let’s say you are running a blog on a small instance or a shared hosting plan. This would usually come with limited resources, be it compute or memory. For instance, these pages are currently being served off a 512 MB droplet.

Needless to say, as your traffic increases, these limited resources are no longer enough to handle all of it, and the response time for your visitors starts to climb. A simple solution could be to bump up the hardware and increase the compute and memory available to the server. The compute part is obvious, but why the memory, you might ask? Well, web servers are software running on the machine (for example, Apache and Nginx are the servers most commonly used with WordPress), and each server process occupies memory. The more the traffic, the more the number of processes.
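If you want to see this effect on your own server, a quick way is to add up the memory held by the web server’s worker processes. Here is a minimal sketch using Python’s psutil package (the package and the process name “apache2” are assumptions; substitute “nginx” as appropriate):

```python
# Sketch: sum the resident memory of all apache2 worker processes.
# Assumes the third-party psutil package is installed (pip install psutil).
import psutil

total_rss = 0
workers = 0
for proc in psutil.process_iter(attrs=["name", "memory_info"]):
    if proc.info["name"] == "apache2":  # use "nginx" on Nginx hosts
        total_rss += proc.info["memory_info"].rss
        workers += 1

print(f"{workers} worker processes using {total_rss / (1024 * 1024):.1f} MB")
```

On a 512 MB droplet, watching this number climb as traffic grows makes the memory pressure very tangible.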

If you are running WordPress under heavy traffic, and your database lives on the same server, then you might sometimes see errors like the one below –

MySQL error with WordPress

Seem familiar? A common cause is too many apache2 processes and not enough memory to handle all of them; the operating system promptly kills other processes to free memory, including the MySQL daemon.

Caching to the rescue

This is where server-side caching comes to the rescue. Take this blog post, for instance. How many times a week am I going to edit it? Not many, right?

In which case, instead of the PHP script executing every time, why can I not serve a static (pre-rendered HTML) version of this post?
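Conceptually, the pattern is simple: look for a pre-rendered copy first, and only fall back to the expensive render when there isn’t one. Here is a toy sketch of that idea in Python (the cache path and the render callback are illustrative, not how any particular plugin implements it):

```python
# Toy page-cache: serve a pre-rendered HTML file if one exists,
# otherwise render the page once and store the result for next time.
from pathlib import Path

CACHE_DIR = Path("/var/cache/pages")  # hypothetical cache location
CACHE_DIR.mkdir(parents=True, exist_ok=True)

def serve(slug: str, render) -> str:
    cached = CACHE_DIR / f"{slug}.html"
    if cached.exists():
        return cached.read_text()   # cache hit: no rendering at all
    html = render(slug)             # cache miss: run the expensive render
    cached.write_text(html)         # store it for the next visitor
    return html
```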

WP-Supercache is a plugin that does a good job of this; however, for Supercache itself to run, the WordPress PHP scripts still have to execute. How can we avoid those?

Another option would be to cache at the Apache or Nginx level. This is a much better approach: instead of calling PHP scripts at all, the server serves the last known cached static file. The catch with this approach is cache management and storage.

With a small server, you may not have a lot of storage, and if you have been maintaining a content-heavy site, caching all pages can be a storage-intensive affair. It also places greater demands on your instance’s compute power.

This is where reverse proxy servers shine.

Reverse proxy servers

A reverse proxy server is a server that sits in front of your web servers and forwards client requests to them. One of the older options for PHP-based websites is Varnish; Nginx also offers this, and newer versions of Apache provide the functionality as well.

For each request, the reverse proxy caches the response from the downstream server and serves that cached response to every subsequent request. Think of it as a smart cache manager that sits seamlessly between your CMS and the user.
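To make the idea concrete, here is a toy caching reverse proxy in Python, using only the standard library (the upstream address is an assumption, and a real proxy would also handle headers, cache expiry and non-GET requests):

```python
# Toy caching reverse proxy: forward each GET to the downstream server once,
# then replay the cached body for every subsequent request to the same path.
from http.server import BaseHTTPRequestHandler, HTTPServer
from urllib.request import urlopen

UPSTREAM = "http://127.0.0.1:8080"  # hypothetical downstream CMS
cache = {}  # path -> cached response body

class CachingProxy(BaseHTTPRequestHandler):
    def do_GET(self):
        if self.path not in cache:                       # cache miss
            with urlopen(UPSTREAM + self.path) as resp:  # ask the CMS once
                cache[self.path] = resp.read()
        body = cache[self.path]                          # hit from here on
        self.send_response(200)
        self.send_header("Content-Length", str(len(body)))
        self.end_headers()
        self.wfile.write(body)

HTTPServer(("", 8000), CachingProxy).serve_forever()
```

Production-grade proxies like Varnish do exactly this, just with far more sophistication around invalidation, headers and concurrency.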

Traditionally, these were a bit difficult to set up, and were therefore the domain of only the tech-oriented webmasters. Of late, however, a couple of smart SaaS-based reverse proxies have appeared, and that’s what I wanted to write about.

Cloud-based reverse proxies

A cloud-based reverse proxy is a reverse proxy server that lives not on your own network or server infrastructure, but is hosted as a separate service that you choose to buy.

I had initially tried Cloudflare, but wasn’t really impressed with the results. There were a couple of Indian service providers as well, but the outcomes weren’t that great either.

Then one of my colleagues pointed me to Nitropack. Getting started with Nitropack was a breeze and I could set it up easily. There was a plugin to install in my WordPress setup, and that was about it. Nitropack even has a Cloudflare integration (I manage my DNS on Cloudflare), which made the relevant DNS entries for me, so I was up and running without too much hassle.

I am currently on the free plan, but the immediate impact on my server response times and my web performance has been substantial.

If you are a website owner and have been plagued by web performance issues, do give this solution a try. It makes a noticeable difference to your response times.

Data anomalies in Search Console

In the past 5-6 years or so, a lot of online businesses, especially the ones hungry for growth, have relied on organic traffic as one of their key sources. Now, growth could mean an increase in pure numbers (traffic, sessions, users) … or it could mean an increase in more tangible business parameters (revenues, profits). One of the things I have learnt is that depending on which success metrics we chase, our own identity undergoes a shift.

Search as a major source of traffic

The major contributors to organic traffic are search and social. Wherever there is a site with loads of great, unique content, there is an opportunity to drive organic traffic.

At different points in time, I have been skeptical about social media and the me-too posting that most brand pages do on platforms such as Facebook. Search, however, has always fascinated me, and I still have faith in it :).

SEO can’t be a method

Search Engine Optimization (SEO) has evolved over a period of time, and I have blogged about it on multiple occasions. Unfortunately, the frequency with which the algorithm changes, and the pace at which Google (the market leader in this space) evolves its notion of quality content, ensure that you can’t have a steady SEO “process”.

Having said that, SEO involves a fair amount of design thinking.

The reason is that the problem behind search visibility (and the factors that control it) keeps changing. It’s a wicked problem, and design thinking can solve such problems because of its test-and-iterate mechanism.

Data to drive Design Thinking

This is where having the correct data to decide on next steps is crucial. A data-driven design thinking approach entails periodic reviews of the kind of data available to you, so that you can make the right choices.

Search data has always been plagued by incomplete information, starting with the 2011 encrypted search announcement, after which a bulk of the keyword data in Google Analytics began to be reported as (not provided). There have been ample approaches to recover this data; unfortunately, as Google Search moves further onto handhelds and digital privacy increases, the percentage of data with clear visibility will keep going down.

This can’t be helped. What can be done is to take these “anomalies” into account and factor them into your analysis.

So what kind of Data anomalies in Search Console do we expect to find?

Google Support has compiled this list. They keep updating their data reporting logic, and this page along with it.

One of the major changes you can see is that last month, they started reporting more data in Google Webmaster Tools. Bear in mind that this is just a change in the data being reported, not in the actual search traffic on your site.

The link also explains why there is a data disparity between Google Analytics, Google Webmaster Tools, and any third-party tool you might be using to generate keyword data.

So, my data is incomplete, what to do?

Don’t panic.

Work with the list of data anomalies and identify which ones impact you the most. Having visibility into which parts of the data are not available to you is far better than knowing nothing and assuming that the data you have is complete.

When you iterate, the first point of comparison is always your own previous state. Since the data made available to you is pretty much the same in both periods, a week-on-week comparison report is much more valuable than a comparison report against your closest competitor.
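If you export your data, this kind of comparison takes only a few lines. A sketch with pandas (the CSV file and its ‘date’ and ‘clicks’ columns are assumptions about your export format):

```python
# Sketch: week-on-week change in clicks from an exported report.
import pandas as pd

df = pd.read_csv("search_console_export.csv", parse_dates=["date"])
weekly = df.set_index("date")["clicks"].resample("W").sum()
wow_change = weekly.pct_change() * 100  # % change vs the previous week

print(wow_change.tail(4).round(1))
```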

As long as the measures of success come from the same tool, the data anomalies should cancel out. Bear in mind that for most of our data work, we do not need precise data; we can work with coarse data.

A simple way to decide: if you mostly work with charts and graphs, you can work with coarse data and absorb the anomalies. If you work with more than 4 decimal places, you might want to add 3-4 lines of disclaimer below your data.

Life without Google – Gives me the heebie jeebies!

For all the awesome things that Google does, there are always concerns about privacy, data sharing and access to an insane amount of personal information. So much so that there is an interesting site out there in the blue nothing – One Day without Google.

This got me thinking. As a collective, we criticize the Search Giant so much, but if it were not for Google, what would we be doing now?

How would life be without Google?


Google Webmasters explains Search Appearance


This month, Google rolled out a new help feature in their Webmaster Tools. If you have been working on your site’s SEO, you probably know most of this already; still, it’s good to see how Google has neatly summarised it all on one helpful page.

If you are not completely confident about your SEO fundamentals, then this is one thing that you should do immediately!

Top 5 things you should do after launching a site

So after toiling away on that idea for four or five months, you have finally launched your site … the journey here took blood, sweat and tears. You had to either grapple with the content management system (CMS) or, even worse, convince the developers working on your site that it needs to be exactly the way you want it, and not the way they want it. You managed to stick to the deadline you set for yourself, and the site is finally live!

You feel like celebrating … and you should. Fireworks in the sky, champagne flowing … you did it. It feels good to have reached here, doesn’t it?

Reality bites

Now time for a reality check.

Your site’s journey has just started; launching it is not the end, but just the beginning. You still need traffic to generate revenues, you need that traffic to grow, and you need people talking about your site. Without traffic, and sustainable growth in that traffic, your site is simply going to be a liability. If you are betting on your website to do your sales, then have I got a thing (or two) to say to you!

Once built, a website can only generate huge traffic if you are relying on the secret sauce: that ingredient which will ensure that thousands and thousands of potential customers come flocking to your website looking to buy your product or service.

The Secret Sauce

To generate traffic, you need to be organized. You need to try out different things to see what works, and then do more of that.

Yes. It’s that simple. You need to experiment and see what works for you. This is more work than you actually thought, but trust me … if you like the reason why you started the site, you will love promoting it! You will love it even more when people reach out to you and seek your help.

If you can Measure, you can Scale

The more help you offer to the good folks who visit your website, the more popular it will become, since people will actually start finding your website useful. It’s then a matter of finding out where these people come from and seeing whether you can get more of them to visit your site from there. This is easier said than done … and that’s why you need to follow these tips and get to a point where you can start running these experiments.

So buckle up, Dorothy, ’cause Kansas is going bye-bye!

To correctly measure the traffic on your websites, you need to have a couple of things set up.

1. Google Analytics (GA)


I would suggest GA because of the ease with which you can generate insightful reports, and also because of the enterprise-class features you get to use for free. There are other alternatives to GA, such as Clicky, Omniture and even Alexa for that matter. However, trust me and go with Google! We will save the Alexa hacking for later.

2. Google Webmaster Tools (GWT)


Using GWT and GA in tandem will form the backbone of your Search Engine Optimization (SEO) operations. I have put up a starter’s guide for SEO which you definitely need to go through. If the Bing (Bing Is Not Google) search engine is relatively well known in your country (it has a 16% market share in the United States as we speak), then go ahead and register on Bing Webmaster Tools (BWT) as well.

3. Start re-writing your content


I know. You completed this task just two days ago. Life is unfair; I should have told you about this before you finished the site! But in reality, be prepared to keep fine-tuning the content on your website on a regular basis. I keep doing this for my top content, since the post might be old, but the traffic that comes to it is brand new!! Do not be afraid or too lazy to revisit the content on your site; it’s a given that you will. Embracing this helps you quickly write content that is better for your users and also for the search engines. Here’s a starter’s guide to writing for search engines using schemas.

4. You are your website

Remember: at the start, people do not know of your website. Do not expect too many visitors from the word go. In fact, you will probably have to ask your friends to visit. So start sharing your website’s link … share it with people who you think will benefit from the advice you are giving on the website. Share it on your social networks, asking your friends to visit the site and give you reviews.

You have to pimp out that site, and in style! Use your personal equity and get the traffic rolling in … if the site is good, then soon your friends and contacts will start sharing it without being asked to! The traffic to your site will grow slowly … but at the start, you are your website. It’s a part of you, and you have to ask people to come and visit your digital presence … your website. Do not shy away from this. You would be losing out on a very good initial source of traffic if you shy away from sharing your site on your own social networks.

Surely you are convinced about the content you are dishing out on the site? Then start sharing those URLs. Have a few social sharing widgets on your site, and ask your good friends to share as well!

5. Prepare for the journey ahead


I have already said this before, but I am going to repeat it again and again until it really sinks in.

Driving traffic to your website is a continuous activity.

This does not mean that you need to stop everything else and only keep doing these things. But it does mean that you need to allocate some time in your busy week for measurement and experiments. You need to play with your website and see what different things you can do on your site.

If that means learning HTML or hiring someone smart to do this work, then so be it. It will pay off in the long run. Also, note that if you shy away from this activity and rely on someone else to do it for you, then you are pretty much relying on that person for your traffic, and the traffic will dwindle the minute he or she goes away.

So pull up your socks, and get ready for a wonderful journey of web analysis, traffic generation and customer engagement! You are going to love it!!

Demystifying (not provided) Keywords

Removing (not provided)

Google has started protecting signed-in users by not reporting their search terms; instead, we see (not provided). People have written this off, saying that the number is insignificant; in fact, you can make use of this nifty little Custom Report to check the impact of how many searches are not reported to you. Over a period of time, it has grown to a huge 39% of the total organic results worldwide.

Blog owners have every right to be concerned; they are missing data, and for content-driven blogs this can be painful. I have noticed that for all my games posts, the percentage is quite high … 70%!!!
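You can run the same check by hand on an exported keyword report. A sketch (the file name and its ‘keyword’ and ‘visits’ columns are assumptions about your export):

```python
# Sketch: what share of organic visits is hidden behind (not provided)?
import pandas as pd

kw = pd.read_csv("organic_keywords.csv")
hidden = kw.loc[kw["keyword"] == "(not provided)", "visits"].sum()
share = 100 * hidden / kw["visits"].sum()
print(f"(not provided) accounts for {share:.1f}% of organic visits")
```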

However, there is an alternative. If you are keen on finding keyword data for your site, you can retrieve it from Google Webmaster Tools (GWMT). This, in combination with the Top Content report for the non-paid search traffic segment, should get you any data you require.

Combining the GWMT data with Analytics will help you gain a better understanding of the funnel data for each of your individual keywords … all the more so if you have Goals configured.

Do note that there is a slight discrepancy between the data reported in GA and in WMT: GA reports visits, while WMT reports clicks. GA’s SEO report does not provide data for the past 2 days; WMT does. So if you are just starting out, using WMT will give you more insight into your on-page optimization than simply using Google Analytics.
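If you export both reports, you can line the two up per keyword and see the discrepancy side by side. A sketch (file and column names are assumptions about your exports):

```python
# Sketch: compare GA visits with WMT clicks for each keyword.
import pandas as pd

ga = pd.read_csv("ga_keywords.csv")    # columns: keyword, visits
wmt = pd.read_csv("wmt_queries.csv")   # columns: keyword, clicks
merged = ga.merge(wmt, on="keyword", how="outer").fillna(0)
merged["difference"] = merged["clicks"] - merged["visits"]
print(merged.sort_values("difference", ascending=False).head(10))
```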

Edit: I found this great (but a tad tedious) article on SEOMoz on how you can decode the (not provided) keyword. If you are not well versed with Google Analytics and filters, then I would suggest you skip it.