So, I recently noticed that for certain highly competitive keywords, comparatively young webpages were popping up in the top five search results, beating some very strong authority sites.
Obviously, this raised my curiosity.
Why would a young page outrank the established players unless it had some strong mentions/links? What about historical performance?
Was it so easy to get into these keywords (which as far as I know were very competitive)?
And almost every time (excluding the first few times I noticed it), my SERPs weren't customized or personalized (I was browsing incognito), so what was happening here?
Diving in, I also noticed that some of these websites had common features that made the user (organic search traffic) spend time on the website.
Some of them were a) Photo sliders b) Pagination and c) Indexed pages (like an e-book).
In the case of photo sliders, there wasn't much meat on the page; the content was comparatively thin. But it was compelling, so visitors (I assume) were clicking through to deeper pages (next buttons).
In the case of pagination too, the content was usually compelling enough that I was "forced" to click through (enticing, teasing, promising content).
This led me to the assumption that maybe it's the "relevancy factor" that's playing out.
What if searchers found a particular URL interesting for a particular keyword, clicked through, and spent more time on that site than on all the other results on the SERPs?
What message would it give to search engines? (Assuming they have all the data…well GA!)
To illustrate, for a particular keyword (say, Keyword A), let's assume this is how the search results might look…
Rank 1 – Site A – Keyword A – Avg. Time Spent on site – 1 Minute
Rank 2 – Site B – Keyword A – Avg. Time Spent on site – 9 Minutes
Rank 3 – Site C – Keyword A – Avg. Time Spent on site – 3 Minutes
With the above assumption, it would not take much for Site B to overtake Site A, because it has more stickiness, i.e., more time spent on the site relative to its organic traffic compared to the other sites.
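The hypothesis above can be sketched roughly in code. This is purely illustrative, using the made-up numbers from the example; it is in no way a claim about Google's actual ranking algorithm:

```python
# Hypothetical sketch: re-rank search results by "stickiness"
# (average time spent on site per organic visit).
# Numbers mirror the illustration above.

results = [
    {"site": "Site A", "rank": 1, "avg_time_min": 1.0},
    {"site": "Site B", "rank": 2, "avg_time_min": 9.0},
    {"site": "Site C", "rank": 3, "avg_time_min": 3.0},
]

# Sort by average time spent, descending: stickier pages rise.
reranked = sorted(results, key=lambda r: r["avg_time_min"], reverse=True)

for new_rank, r in enumerate(reranked, start=1):
    print(f"Rank {new_rank} - {r['site']} - {r['avg_time_min']:.0f} min")
```

Under this toy model, Site B jumps straight to Rank 1 despite starting below Site A.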
Test & Data
Unfortunately, I do not have the avg. time spent data for the URLs I noticed. All I knew was that they used content formats that would encourage users to stick around longer and click deeper.
So, I tested this out on a website I had data for.
The site targets the financial sector and the keyword I chose was extremely competitive.
Here’s the data.
For privacy reasons, I have masked the keywords (I can provide details in private). But the keyword in question was No. 3 in January 2014.
It was a three word phrase. Ex: “Competitive keyword phrase”
Then, based on the above assumptions, I made some changes to the page that was getting the traffic for this keyword, so that users would "stick around".
Note that the average time spent on the site for this keyword was around 2:29 minutes at this point.
After 4 months of experimentation, here’s the result so far.
Notice how the average session duration went from 2:29 to 5:11?
Also note that the time spent went up for other keywords too. But interestingly, the rank gains for those keywords were not as dramatic as for the keyword in question.
Notice how the traffic for the keyword went from 19 to 132 in 4 months? That is a jump from Rank 16 to Rank 2 on Google.com.
Please also note that the term narrowed from “Competitive keyword phrase” to “Competitive keyword”.
So, essentially, in four months, the relevance for the particular keyword went up significantly due to more "stickiness".
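For perspective, the reported jump works out like this (simple arithmetic on the figures above):

```python
# Monthly organic visits for the keyword, as reported above:
# 19 before the changes, 132 four months later.
before, after = 19, 132

growth = (after - before) / before  # relative growth
print(f"Traffic growth: {growth:.0%}")  # prints "Traffic growth: 595%"
```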
This experiment is far from a rigorous one. But here's what I've learned.
– Improving "stickiness" for a webpage against a particular keyword improves its relevancy on Google.
– Onsite optimization has very little significance compared to "relevance". (The page scored an F grade for the keyword, both in January and four months later, on Moz Grader.)
– Methods to increase time spent on site, such as photo sliders, paginated content, and indexed content, could improve "stickiness" and positively influence rankings for targeted keywords.
– Reducing "bounces" could help improve the relevancy of a page against a particular search term.
– Deeper clicks to more pages within the site from a first touch point page might help improve relevancy against a particular search term.
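The engagement metrics these points lean on (average session duration, bounce rate, pages per session) can be computed from raw session records roughly like so. This is a minimal sketch with invented data, not how Google Analytics computes them internally:

```python
# Hypothetical session records for one landing page.
# A "bounce" is a single-page session; analytics tools typically
# record its duration as 0 since there is no second pageview.
sessions = [
    {"duration_sec": 311, "pages": 4},  # sticky, multi-page visit
    {"duration_sec": 0,   "pages": 1},  # bounce
    {"duration_sec": 149, "pages": 2},
    {"duration_sec": 0,   "pages": 1},  # bounce
]

avg_duration = sum(s["duration_sec"] for s in sessions) / len(sessions)
bounce_rate = sum(1 for s in sessions if s["pages"] == 1) / len(sessions)
pages_per_session = sum(s["pages"] for s in sessions) / len(sessions)

print(f"Avg session duration: {avg_duration:.0f}s")
print(f"Bounce rate: {bounce_rate:.0%}")
print(f"Pages/session: {pages_per_session:.2f}")
```

Anything that lifts the first number and cuts the second (sliders, pagination, deeper internal links) is what the experiment above was trying to do.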
I do not think I have approached this test scientifically.
Some of my observations may be linked to multiple metrics which I might have not noticed.
The site I experimented with is less than a year old, and the keyword is a pretty competitive one.
This experiment was pretty basic. I would love to see someone test this in a more elaborate way.