
An In-Depth Guide to Google SSL Security & SEO



On August 6, 2014, Google announced that SSL would be a ranking signal from then on. As SSL becomes a ranking signal, the web is likely to become more secure, and SEO strategies will change as a result.

That said, there are over 200 ranking factors that determine where your site sits in Google's index.

Google SSL SEO Update

Let’s see if this is true.

In Google’s own words

“Over the past few months we’ve been running tests taking into account whether sites use secure, encrypted connections as a signal in our search-ranking algorithms. We’ve seen positive results, so we’re starting to use HTTPS as a ranking signal.”

Way back in March 2014, Matt Cutts expressed a desire to see HTTPS included as a ranking signal, and come August, Google put it on the table.

How big is the Google SSL SEO update?

According to statements from Google, however, this is not a very big signal when it comes to determining rankings.

It would merely affect 1% of global search queries. There were over 5,922,000,000 searches per day on Google as per data from 2013, which works out to a monthly search volume of roughly 177,660,000,000. One percent of that is about 1,776,600,000 queries a month, which is actually a sizeable number.

Plus, I don't take Google's numbers at face value. When they introduced secure search for certain queries, they said it would affect only 1% of queries, but it actually affected 11% of global queries. Here's the HubSpot article that discusses the same.

The same has been true for the slight tweaks and twists in their algorithms.

It could possibly mean a few more millions for those who do it right.

And as time goes by, this weak signal could well become a strong one.

I could be wrong, though, and only time will tell. When Google introduced authorship, the whole industry went into a frenzy over how it would change the entire SEO landscape.

But in the end, not much came of it.


Let’s move on.

How to move your site from http to https easily?

These are the steps to follow when moving your site:

1. Get the TLS certificates for your site

(This might differ based on which host you are on. Here is specific information for some popular hosts.)

GoDaddy, HostGator, Bluehost, WP Engine.

How to get a free SSL/security certificate


To get the TLS/SSL certificate, log in to the cPanel of the domain and click on the TLS/SSL Manager.
You will see an option called Certificate Signing Requests, and below that a link to view, generate, or delete certificates. Fill in the form for the domain for which you want to generate the SSL certificate.
You will get a CSR code, which you need to copy and submit to your SSL provider.
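If your host gives you shell access instead of cPanel, you can also generate the private key and CSR yourself. Here's a minimal sketch, assuming OpenSSL is installed and using example.com as a stand-in for your own domain:

# Generate a 2048-bit private key and a CSR to submit to your SSL provider
openssl req -new -newkey rsa:2048 -nodes -keyout example.com.key -out example.com.csr

Keep the .key file private on your server; only the .csr goes to the certificate authority.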

2. Include images, videos and other files

Make sure that you include URLs of embedded content in your site move plans: videos, images, JavaScript, and CSS files. These URLs need to be moved in the same way as all other content on your website.
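In practice this mostly means hunting down hard-coded http:// references in your templates and content, since those will trigger mixed-content warnings once the page itself is served over https. A simple before/after sketch, with example.com as a placeholder:

<!-- Before: a hard-coded http:// image on an https page causes a mixed-content warning -->
<img src="http://www.example.com/images/logo.png" alt="logo" />

<!-- After: point embedded files at their https versions (or use relative URLs) -->
<img src="https://www.example.com/images/logo.png" alt="logo" />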

Amit Agarwal of Labnol has created WordPress plugins for generating sitemaps for both images and videos, which can come in handy.

Buying a new site? Things to check before moving over to the secured version

When buying a site, make sure it is currently free of penalties and, ideally, was never penalized in the first place.

  • Manual action
    It's when Google issues a penalty after a member of the webspam team manually checks the site for link spam. Spammy links, participation in link schemes, etc. can warrant a manual action. To find out whether there was a manual action penalty, check the Manual Actions page in Webmaster Tools and make all the necessary changes before submitting a reconsideration request.
  • Panda penalty
    This is generally issued due to low-quality or thin content. You can check the Web Archive to see whether the site historically had thin posts. The Panguin tool should also be helpful in identifying which penalty was applied.
  • Penguin penalty
    An automated penalty issued when the number and kind of links cross a certain threshold. Too many anchor texts with the same keyword, too many dofollow links, or site-wide links placed in headers and footers can bring Penguin home.
  • Removed URLs
    You also need to check the status of any URL removal requests left over from the previous owner.

1. Setting up the new robots.txt file

The robots.txt file controls which parts of your site crawlers may access, and a misconfigured one can block Googlebot entirely. Make sure your robots.txt is properly configured and doesn't block the URLs you want Googlebot to crawl.
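As a rough sketch (example.com is a placeholder), a robots.txt that allows full crawling and points to your sitemap can be as simple as this:

# Allow all crawlers everywhere and advertise the sitemap
User-agent: *
Disallow:
Sitemap: https://www.example.com/sitemap.xml

An empty Disallow line means nothing is blocked; double-check that no leftover Disallow rules from the old site sneak into the new file.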

2. How to determine your current URLs?

For simple site moves, a list of current URLs is not needed.

For complex site moves, however, a list of URLs is essential.

Here’s a list of ways by which you can obtain them:

  • Look in your sitemaps
  • Check your backlink profile; there are hundreds of tools available like Ahrefs, Majestic SEO, etc. that can give you a list of pages with backlinks pointing to them.
  • Look in Webmaster Tools for external and internal links
  • Check server logs

3. Create a mapping of old to new URLs

Once you have the listing of old URLs, decide where each one should redirect to. How you store this mapping depends on your servers and the site move. You might use a database, or configure some URL rewriting rules on your system for common redirect patterns.
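If your server runs Apache, one simple way to express that mapping is a 301 rule per remapped URL in the .htaccess file. This is only a sketch with placeholder paths; adapt it to your own URL structure:

# One permanent redirect per old URL that has a new home
Redirect 301 /old-page.html https://www.example.com/new-page.html
Redirect 301 /blog/old-post/ https://www.example.com/blog/new-post/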

Backlink checkers like Ahrefs can help you in the process.

For example, here’s a list of backlinks for the DailySEOBlog from Ahrefs.

External Backlinks www.dailyseoblog.com  on Ahrefs

4. Update all URL details

Once you have your URL mapping defined, you’ll want to do three things to get the final URL mappings ready for the move.

Each destination URL should have a self-referencing rel="canonical" link tag.

If URL 2 is a duplicate of URL 1, and the canonical tag points to URL 1, then this code will appear in the head tag of URL 2:

<link rel="canonical" href="http://www.example.com/url-1.html" />

If the site you moved has multilingual pages, annotate them using rel-alternate-hreflang annotations.

If the site you moved has a mobile counterpart, it should be annotated using rel-alternate-media annotations.
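For reference, here is a sketch of what those two annotations can look like in the head of a desktop page (example.com, the es/ path and the m. subdomain are placeholders):

<!-- Alternate language version of this page -->
<link rel="alternate" hreflang="es" href="https://www.example.com/es/page.html" />

<!-- Separate mobile version of this page -->
<link rel="alternate" media="only screen and (max-width: 640px)" href="https://m.example.com/page.html" />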

  1. Update all internal links
    All the old URLs need to be changed to the new ones.
  2. For the final move, you need to save these files:
    1. A sitemap file with the new URLs in the mapping
    2. A sitemap file with the old URLs in the mapping
    3. A list of backlinks

Use 301 redirects properly

  • Use HTTP 301 redirects. Google recommends server-side 301 (permanent) redirects for site moves (a minimal example follows this list).
  • Avoid chained redirects. It's highly advisable not to string together multiple redirects; browsers and Googlebot may not follow a long chain from page 1 to page 2 to page 3 and so on.
  • Finally, use the Fetch as Google option in Webmaster Tools to test the redirects.
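If you are on Apache, the whole-site http to https move can be handled with a single 301 rule in .htaccess. A minimal sketch; adjust it to fit any rewrite rules you already have:

# Send every http:// request to its https:// equivalent in one hop
RewriteEngine On
RewriteCond %{HTTPS} off
RewriteRule ^(.*)$ https://%{HTTP_HOST}%{REQUEST_URI} [L,R=301]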

How much would it cost?

Decide the kind of certificate you need: single-domain, multi-domain, or wildcard. GoDaddy charges around $70 for a single domain, $140 for multiple domains, and $280 for a wildcard covering all subdomains.

There are three kinds of SSL certificates:

  • Extended Validation (EV) SSL Certificates:
    Here the Certificate Authority checks both the organization's right to use the domain and conducts a thorough vetting of the organization itself. For many website owners Extended Validation is the ideal choice, as most modern browsers like Chrome, Firefox and IE support it.
  • Organisation Validation (OV) SSL Certificates:
    Here the Certificate Authority checks the applicant's right to use the domain and also does some background checking on the organization. When you click on the security seal, some additional information about the organization is revealed too.
  • Domain Validation (DV) SSL Certificates:
    Here the Certificate Authority checks only the applicant's right to use a specific domain name.

Things to be mindful of on the new site

  1. Remove all robots.txt directives from the source site so that Googlebot can easily discover all the redirects and crawl the site effectively.
  2. For the robots.txt file on the destination site, make sure the instructions allow crawling.
  3. Update Webmaster Tools with a change of address for the old site.
  4. On the destination site, submit the two sitemaps containing the old and new URLs.

The right Google Webmaster Tools settings when making the SSL change

Here are the correct Webmaster Tools settings:

URL parameters:

Any URL parameters used to control crawling on the old site should be applied to the new site as well, if necessary.

Ownership

Verify in Google Webmaster Tools that you own both the source site and the destination site.

Geotargeting

If geotargeting was employed on the source site, keep it for the destination site too. To access this option, go to Search Traffic >> International Targeting and choose the country.

Crawl rate

Don't configure a custom crawl-rate limit, as capping it will only slow Googlebot down.

Disavowed backlinks

If you have used the Disavow tool in the past, re-upload your disavow file for the new site as well.

Sitemap

Also submit your sitemaps to Webmaster Tools.


What happens if you don't move the site over properly? Two examples.

When moving to https, make sure that all the URLs are now https. If everything goes right, you will find only https URLs in the SERPs; if it doesn't, Google may keep both http and https URLs in the index. The presence of these two versions together can trigger a duplicate-content penalty.

I will share with you two instances where this has happened.

Some time back, Harsh Agrawal of ShoutMeLoud noticed a dip in traffic for his blog after Panda update 3.8.

Panda, as you might know, is often triggered by duplicate content, and Harsh had already removed all thin content during the prior updates.

The problem was caused by Google indexing both the http and https versions of the same posts on the blog. The site itself didn't even have an SSL certificate; another site on the same shared hosting server did, and somehow its https goodness spilled over onto his URLs.

As soon as Harsh changed hosts (he was on HostGator at the time), the issues settled down and traffic was back to normal.


Another site that faced similar problems was Kissmetrics. Yes, Kissmetrics.

In an audit of Kissmetrics looking into why the site's traffic dropped, Eli Overbey suggested that the indexation of both http and https versions may be the culprit.

You can find the same article in two versions:


  • http://blog.kissmetrics.com/website-testing-mistakes/
  • https://blog.kissmetrics.com/website-testing-mistakes

That's not all: each post on the Kissmetrics blog uses the rel="canonical" tag. However, in their case both the https and http versions are self-referential, which means each version claims to be the original one.
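One common way out of this situation (a sketch of the general fix, not necessarily what Kissmetrics did) is to pick one preferred version and serve the same canonical tag on both copies of the page:

<!-- Served on BOTH the http and the https copy of the post -->
<link rel="canonical" href="https://blog.kissmetrics.com/website-testing-mistakes/" />

That way, whichever version Google crawls, the signals get consolidated onto the single https URL.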

How does the SSL update impact eCommerce sites?

Usually eCommerce sites employ SSL only on their checkout pages, but if you'd like the ranking boost, you need to implement it site-wide.

For eCommerce sites, introducing https throughout the site has the added advantage of increasing shopper confidence.

Many eCommerce sites host their checkout pages as a subdomain on a hosted provider like BigCommerce. Customers may not like this: they are shopping on site xyz, and when the time comes to check out, they are sent off to a different https:// domain.

Keeping the domain consistent throughout may help conversions. In this industry, a 1% lift may mean additional millions, so everything counts.

Renew SSL certificates on time. If you don't, visitors will be shown a warning page that discourages them from entering your site.

Let's see one instance where non-renewal of an SSL certificate led to traffic loss.

In this blog post, Glenn Gabe writes about how an eCommerce site lost tons of traffic because of an expired SSL certificate.


If the SSL certificate expires, browsers have no option but to display a red warning screen. The eCommerce site owner thought the drop was because of Panda, but sometimes the underlying issue can be this simple.

There are a couple of tools available that can show you whether your SSL certificate is valid, like SSL Shopper and Site Check, to name a few. Just key in your domain in the space provided and you are good to go.
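If you prefer the command line, you can pull the certificate's validity dates yourself. A sketch assuming OpenSSL is installed, with example.com standing in for your domain:

# Print the notBefore/notAfter dates of the certificate served on port 443
openssl s_client -connect www.example.com:443 -servername www.example.com </dev/null 2>/dev/null | openssl x509 -noout -dates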


Does https really keep everyone safe?

Heartbleed is a security bug in the OpenSSL cryptography library that came to light as recently as April 2014.

This bug could have been used to gain access to user sessions, session cookies and login data.

At that time it affected around 17% (roughly half a million) of the web's secure servers worldwide.

So even though https is touted as the new web order, there are things that can go wrong.

Concluding thoughts!

Just because you added SSL, don't expect your site to rank on the first page.

In other words,

SEO industry tweets its reactions to Google's SSL ranking boost

Source

Google introduced the concept of "HTTPS everywhere" at its I/O 2014 conference.

More details on that here:

Another interesting development: in June this year, Google was assigned a patent from AT&T on speeding up SSL networks. You can read the entire article here.

Even prior to that, back in November 2013, Google had initiated and completed its upgrade to 2048-bit RSA keys for web privacy.

Google seems pretty serious about this, and I can only see HTTPS becoming a ranking signal of greater importance.

One of the major hindrances to wider SSL adoption has been the slower loading speed of such sites. And site speed, too, is a ranking signal.

However, as the web becomes more secure in general, SSL speeds will catch up. Also, there's the patent Google acquired for speeding up secure sites.

If you are on the fence about this, don't be: it won't hurt your rankings, it can only benefit them.
