5 Big SEO Lies Google Wants You To Believe


But for the sake of your rankings, don’t.

By: Masha Maksimava
June 14th, 2016

It’s 2016, and SEO is the farthest from a bed of roses it’s ever been. In fact, most of it has turned into a bed of itchy, sharp, potentially lethal thorns.

In the elaborate metaphor above, thorns stand for the risks SEOs constantly face. The risk of being penalized by Google for a not-so-white-hat tactic. Getting de-ranked because of an algorithm change, when a previously passable strategy suddenly turns into a no-no. Or — worst of all — being outranked by a competitor, only to discover they built a trillion links overnight.

We hear fragments of those SEO horror stories every week, so we end up full of promising ideas we can’t test for fear of doing more harm than good. And indeed, a lot is at stake. But… what if we put those SEO fears to the test? In this article, I’ve compiled the 5 biggest SEO myths that have been empirically proven wrong, along with real-life examples debunking each.

Now, let the mythbusting begin.

1. You shouldn’t build links for the sake of SEO.

The Myth

We’ve heard it a million times: Google’s not too thrilled about link building for SEO. The search engine’s distaste for “unnatural” links starts right in the Webmaster quality guidelines:

“Any links intended to manipulate PageRank or a site’s ranking in Google search results may be considered part of a link scheme and a violation of Google’s Webmaster Guidelines.”

The same approach is reinforced by Googlers whenever they are publicly asked about it. To give you an example, Google’s John Mueller famously said that webmasters should avoid focusing on link building:

“We do use links as part of our algorithm but we use lots and lots of other factors as well. So only focusing on links is probably going to cause more problems for your web site than it actually helps.”

Yup, the idea is that your link profile should grow naturally over time, because you have this great content everyone’s willing to link to. But would you survive in an industry where everyone is building links? And is link building really the Dark Side?

The Evidence

Contrary to what Google may say, links remain the search engine’s strongest ranking signal. Here’s a huge recent example that shows just how true this still is.

A couple of weeks ago, a Reddit user spotted a new site in Google’s results, ranking for some pretty competitive keywords and rising to the top unusually quickly. The site turned out to be a collection of posts about “best” products of all kinds — be it kettles, laptops, or Father’s Day gifts. And yup, each post is full of affiliate “buy now” links.

Who’s that little buddy there? Oh right, an Amazon affiliate link.

The site launched a little over half a year ago, and since then they’ve racked up some pretty impressive rankings in competitive niches. But how did they do that so quickly?

Footer links. No, seriously — footer links. If those aren’t “intended to manipulate a site’s ranking in Google”, I don’t know what is.

Apparently, this site belongs to the same publisher as some of the biggest US magazines, and these magazines’ websites are all religiously linking to it from the footer. That includes esquire.com, elle.com, cosmopolitan.com, marieclaire.com, seventeen.com, popularmechanics.com. Et cetera, et cetera, et cetera. Go on and check those yourself.

But really, sitewide links from the footer? Wasn’t that a bad thing ten years ago already? Well, apparently not. Here’s the website’s summary from SEO SpyGlass:

You can see that almost all of the site’s links are sitewide, and that they only started growing their link profile in January this year. Not the most natural-looking links I’ve seen in my life for sure, but they really are working out nicely for the reviews site.

The Takeaway

Whether Google will admit it or not, link building continues to be the most important part of any SEO strategy. If you choose not to build links, you’ll probably end up far behind your competitors who almost certainly are.

While it is true that you should ideally aim for high quality links from reputable sources, the concept of quality varies from industry to industry. The best actionable takeaway here is to closely examine the link profiles of your best ranking competitors to get an idea of what kind of links work in your industry. You can do it in SEO PowerSuite‘s SEO SpyGlass.

  1. In SEO SpyGlass, create or open a project for your website.
  2. Go to Domain Comparison and type in the URLs of your top ranking competitors’ sites.
  3. In a moment, SEO SpyGlass will collect the backlink data on each of the sites you specified, and put up a comparison table so you can see each of the domain’s core strengths and instantly tell where you are lagging behind.
  4. Navigate to the Domain Intersection submodule to see the domains that link to competitor sites but don’t link to you. This is a perfect place to find the best link opportunities in your niche; specifically, look for sites that link to several of your competitors — these are more likely to link to you, too.
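The domain-intersection logic in step 4 boils down to simple set operations. Here’s a minimal sketch with made-up domains, in case you’d like to run the same check yourself on a raw backlink export:

```python
# Toy sketch of the "domain intersection" idea: find referring domains
# that link to several competitors but not to you. All data is made up.
from collections import Counter

competitor_links = {
    "competitor-a.com": {"blog1.com", "news2.com", "forum3.com"},
    "competitor-b.com": {"blog1.com", "news2.com", "mag4.com"},
    "competitor-c.com": {"news2.com", "dir5.com"},
}
my_links = {"forum3.com"}

# Count how many competitors each referring domain links to.
counts = Counter(d for links in competitor_links.values() for d in links)

# Keep domains that don't link to us yet, best prospects first.
opportunities = sorted(
    (d for d in counts if d not in my_links),
    key=lambda d: -counts[d],
)
print(opportunities)
```

Domains that link to all (or most) of your competitors but not to you bubble up to the top of the list — those are your warmest outreach targets.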

Download SEO PowerSuite

2. Clicks don’t influence rankings.

The Myth

Google’s official position on whether or not SERP click-through rates have an impact on rankings has been inconsistent at best. If they could, they equivocated; but when asked about it upfront, they usually denied the idea, leading SEOs to believe that click data is too noisy and easy to spam to be used as a ranking signal. According to Gary Illyes:

“CTR is too easily manipulated for it to be used for ranking purposes”.

The Evidence

There’s so much evidence against this myth I’m not even sure where to start. First, Google has quite a few patents on using clicks for rankings.

“The general assumption […] is that searching users are often the best judges of relevance, so that if they select a particular search result, it is likely to be relevant, or at least more relevant than the presented alternatives.”

Interestingly, these patents also include methods of getting rid of the noise and spam Google refers to as the reason for not using clicks for rankings.

But of course, just because there is a patent about something doesn’t always mean that something is currently being used. That’s where real-life experiments come in.

Rand Fishkin of Moz has run multiple tests that proved that clicks have a massive impact on rankings. Most of the tests were the same in nature: Rand reached out to his Twitter followers and asked them to run a Google search for a specified term, click on result #1 and bounce back, and then click on another result and stay on that page for a while.

Guess what happened next? That other result quickly rose to the very top.

Interestingly, CTR seems to affect rankings in real time. After Rand’s experiments, the listing users were clicking and dwelling on eventually dropped to about the same position it occupied before. This shows us that a temporary increase in clicks can only result in a temporary ranking improvement.

But here’s a word of caution: while Rand’s tests with human participants have shown impressive results, experiments that used bots to manipulate this data did not show any. Google stores lots of information on every individual searcher, including searching and browsing history, and is well aware of the techniques that can be used to artificially inflate clicks. The search engine’s just too smart to let the feedback of a bot — a searcher with no history, or one with a history that does not look natural — mess with the search results.

The Takeaway

Rand’s experiments clearly show that the more your pages beat the expected organic CTR for a given position, the more likely you are to be upranked. So yup, that’s another reason to do what you should be doing anyway — optimize your Google snippets for clicks.

  1. Check on the CTR of the snippets of your landing pages in Google Search Console to identify the ones you need to focus on first. While CTR values for different positions in Google SERPs can vary depending on the type of the query, you can refer to this study by Chitika for average click-through rates to help you identify the low-hanging fruit.
  2. Got a list of those? Open up SEO PowerSuite‘s WebSite Auditor and create a project for your site.
  3. Go to Content Analysis and select a page to optimize its snippet. Give the tool a moment to analyze the page’s content.
  4. In the Content Editor submodule, switch to the Title & Meta tags tab, and start composing your title and description. Make sure they clearly communicate the value of clicking through to your page to searchers. If appropriate, use a call to action and instead of simply describing what your page is about, address the searcher directly, and inform them about the benefits of navigating to your page, choosing your product, and so on.
    As you do that, remember about your keywords — it is still recommended that you include your target terms both in the title and description, and the closer they are to the beginning, the more prominent they’ll likely appear to Google.
  5. Once you’re happy with your snippet, hit Save page to save the upload-ready HTML file to your hard drive.
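If you’d rather triage step 1 in bulk, the comparison against expected CTR is easy to script from a Search Console export. A sketch — the benchmark percentages below are rough illustrative figures, not the exact numbers from the Chitika study, and the page data is invented:

```python
# Flag snippets whose CTR falls short of a rough per-position benchmark.
# Benchmark values are illustrative placeholders, not real study data.
expected_ctr = {1: 0.32, 2: 0.17, 3: 0.11, 4: 0.08, 5: 0.06}

pages = [  # (url, avg. position, clicks, impressions) — made-up numbers
    ("/pricing", 2, 400, 5000),
    ("/blog/guide", 3, 700, 5000),
    ("/features", 1, 900, 5000),
]

low_hanging = []
for url, pos, clicks, impressions in pages:
    ctr = clicks / impressions
    benchmark = expected_ctr.get(round(pos), 0)
    if ctr < benchmark:  # underperforming for its position
        low_hanging.append((url, ctr, benchmark))

for url, actual, expected in low_hanging:
    print(f"{url}: CTR {actual:.0%} vs ~{expected:.0%} expected")
```

Pages that rank well but get clicked less than their position warrants are exactly the ones where a better title and description pay off fastest.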

Download SEO PowerSuite

Here’s a bit of a pro tip for those of you who do paid search: consider A/B testing the snippets of your AdWords ads, and then incorporating the insights you get from those tests into your organic listings.

3. Keywords are no longer important.

The Myth

In 2013, Google’s Hummingbird update seemed to shatter everything we knew about keywords and on-page SEO. Apparently, keywords were being replaced by concepts and topics, and keyword targeting didn’t make much sense anymore.

“Hummingbird is paying more attention to each word in a query, ensuring that the whole query — the whole sentence or conversation or meaning — is taken into account, rather than particular words.”

And it wasn’t just Hummingbird. The Knowledge Graph and RankBrain were also geared, at least in part, towards understanding the meaning of queries in a more human-like manner and focusing less on individual terms within the query. Here’s what Google’s Amit Singhal said when announcing the Knowledge Graph:

“[Thanks to the Knowledge Graph], your results are more relevant because we understand these entities, and the nuances in their meaning, the way you do.”

To be fair though, Googlers never openly stated that keywords were no longer important. This myth, in fact, was put forward by SEOs themselves, extending what Google actually said about their objective to understand queries as “things, not strings” to a more extreme “concepts and topics are replacing keywords completely”. Some even suggested that marketers switch to ‘theme-based SEO’ and forget keywords for good.

The Evidence

Sure, there’s a lot of cutting-edge technology behind how Google understands queries, but there’s little magic. Let us be honest: strings are still strings, because there’s no such thing as a ‘thing’ in computer language processing — and there never will be.

Of course, this is not to undermine Google’s advances in the field. Hummingbird and, later, RankBrain have undoubtedly changed the search engine into a more understanding creature that considers context and related concepts in addition to the phrase you type into the search bar. All of that changes keyword research immensely, but it doesn’t make it any less important.

OK, let’s put this into practice. Let’s say you have a site about plants. Maybe it sells plants, or maybe it’s a blog for anyone looking for plant care tips. Anyway, you’re writing an article about this little cutie called Aglaonema.

You know it’s called that, because you’re a plant geek. What you don’t know however, is that people who are less plant-savvy would call it a Chinese evergreen.

So ideally, you’d just go ahead and create your page about Aglaonema, and provided it’s a really good page with quality content, it’d naturally attract backlinks and you’d rank in Google for both keywords.

Alas, even though these “strings” are a “thing”, Google doesn’t seem to know about it — yet.

Take a closer look — these are two very different SERPs for two keywords that are absolute synonyms. There is a bit of intersection further down the search results, but it only occurs on pages that mention both words in their content.

Looks like a bit of keyword research wouldn’t hurt after all.

The Takeaway

In the age of semantic search, keyword research has surely gotten less straightforward, but no less important. Its complexity has stretched well beyond something you can do manually. SEO PowerSuite‘s Rank Tracker has 20 keyword tools integrated right into it to help you out.

  1. Launch Rank Tracker and create a project for your site.
  2. Go to the Keyword Research module and click Suggest Keywords.
  3. Pick one of the 20 keyword research methods to use (you can later repeat the process for the rest of the methods).
  4. Type in the seed terms to base your research upon and hang on a moment while the tool looks up suggestions and automatically groups them by topic (so you can target entire groups of related terms with each of your landing pages).
  5. Now, examine the keywords and keyword groups Rank Tracker has found. Look at their competition, search volume, and efficiency, and pick the best terms for targeting by selecting them and hitting the Move to Target Keywords Module button.
  6. Under Target Keywords, you can now analyze the Keyword Difficulty for the terms you white-listed and build your keyword map by assigning the keywords to specific pages on your site.

Download SEO PowerSuite

4. Social signals do not have SEO value.

The Myth

Like click rates, social signals are perhaps among the most controversial factors for SEOs. Back in 2010, Matt Cutts openly said that engagement metrics from social media are used in the ranking algo. But later, Google started to deny the idea. For instance, here’s what Matt Cutts mentioned in one of his videos from 2014:

“To the best of my knowledge, we don’t currently have any signals like that in our web search ranking algorithms.”

The Evidence

To be fair, the statements Google made about not using social mentions for ranking may very well have been true at the time. But things change; and if you’ve been following Google’s comments on the topic, you’ve probably noticed that they have not been denying the impact of social signals on rankings lately. A little over a month ago, when asked about Rand Fishkin’s click experiments in a video Q&A, Google’s Andrey Lipattsev said the rankings of those pages didn’t go up just because of the clicks:

“You generated exactly the sort of signals we are looking out for: mentions, links, tweets and social mentions — which are basically more links to the page.”

And the facts tell the same story.

In a big case study by Branded3, the agency ran an experiment to analyze the effect of tweets on rankings in Google on one of their sites.

The 8,528 pages analyzed were divided into three groups:

  • 1-99 tweets (5,322 pages)
  • 100-499 tweets (1,382 pages)
  • 500+ tweets (1,824 pages)

Then, Branded3 checked the rankings of each page, using the first four words from the page’s title as a keyword.

While there was a bit of correlation between tweets and rankings in all three groups, the pages with 500+ tweets provided the most interesting results. The average ranking positions for the group were as follows:

(Table: number of tweets vs. average Google rank.)

The graph below is particularly interesting: it shows the Google ranking for pages with over 1,000 tweets. As you can see, with 7,500+ tweets, a first page ranking is almost inevitable.
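If you want to run a Branded3-style check on your own data, the grouping itself is a few lines of code. The sample figures here are invented, purely to show the mechanics:

```python
# Bucket pages by tweet count and average the Google rank per bucket,
# mirroring the Branded3 grouping. The sample data is invented.
pages = [  # (tweets, rank)
    (12, 28), (80, 35), (150, 22), (430, 18), (900, 9), (4000, 6),
]

buckets = {"1-99": [], "100-499": [], "500+": []}
for tweets, rank in pages:
    if tweets < 100:
        buckets["1-99"].append(rank)
    elif tweets < 500:
        buckets["100-499"].append(rank)
    else:
        buckets["500+"].append(rank)

# Average rank per bucket (skipping any empty bucket).
averages = {b: sum(r) / len(r) for b, r in buckets.items() if r}
print(averages)
```

Correlation isn’t causation, of course — but if your heavily-shared pages consistently outrank the rest, that’s worth knowing.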

The Takeaway

With the strong connection between social shares and rankings, you can’t afford to overlook these metrics in your strategy. SEO PowerSuite‘s WebSite Auditor can give you a pretty solid idea of where each of your site’s pages currently stands in terms of social media engagement.

  1. Launch WebSite Auditor and create or open a project.
  2. Once the tool has crawled your site, switch to the Pages module and go to the Popularity in social media tab. Here, you’ll find social share counts for every page of your site, including Facebook shares, Google +1s, LinkedIn shares, Pinterest pins, and more.

Download SEO PowerSuite

Now that you can see the status quo, you’ve got to do something to earn more of those social signals. This topic deserves a guide of its own, but your first steps should definitely include monitoring the new mentions of your posts and pages, along with some competitive research to give you an idea of how and where your competitors promote their content. You can do it all with an app for social media and Web listening, like Awario.

5. Keyword-optimized anchor text is bad for your SEO.

The Myth

Ever since the first Penguin update in 2012 (whose main targets were spammy links, over-optimized anchor text, and link relevancy), we’ve been hearing about the dangers of keyword-rich anchor text. In its guidelines, Google tells webmasters explicitly that optimized anchor text can do more harm than good.

“Links with optimized anchor text in articles or press releases distributed on other sites are examples of unnatural links.”

The Evidence

Let’s be clear on this: Penguin’s no joke and can be hard to recover from. Anchor text diversity should be one of your top concerns if you’re building links.

But none of that means your anchors shouldn’t be optimized for your target keywords. In an interesting study on the effects of Penguin, Microsite Masters analyzed thousands of sites to investigate whether websites that saw a ranking drop after Penguin were guilty of over-optimizing too many of their anchors. Interestingly, it turned out that websites that got hit had their target keyword for anchor text in over 65% of their backlinks, while sites that had keyword-optimized anchors for 50% of their links or less were “all but guaranteed” not to be affected by Penguin.
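The 65% and 50% thresholds from the study are easy to check against your own backlink export. A quick sketch with invented anchors (the 50% caution level follows the study, not any official Google number):

```python
# Estimate what share of a link profile uses keyword-optimized anchors.
# Sample anchors are invented; the 50% threshold comes from the
# Microsite Masters study discussed above, not from Google.
target_keyword = "blue widgets"
anchors = [
    "blue widgets", "blue widgets", "click here", "Acme Inc",
    "blue widgets", "https://acme.example", "read more", "blue widgets",
]

optimized = sum(1 for a in anchors if target_keyword in a.lower())
share = optimized / len(anchors)
print(f"keyword-rich anchors: {share:.0%}")
if share > 0.5:
    print("warning: above the ~50% level the study associated with Penguin hits")
```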

But let’s bring this to life. If you do a Google search for “click here”, you might be slightly surprised by the #1 result:

Just in case you haven’t noticed: the top page doesn’t even have the words “click here” in its content.

A quick analysis in SEO SpyGlass shows that 3% of the page’s links have the “click here” anchor text. To be clear, that’s the 5th most used anchor in the page’s link profile, after “adobe reader”, “get adobe reader”, “adobe acrobat reader”, and “acrobat reader”.

That tells us that anchor text is still a strong relevance indicator for Google, and you can (and should) use keywords in your anchors — or, you’ll be outranked by competitors that do.

The Takeaway

Backlinks with exact match anchor text still robustly correlate with rankings, and it’s not optimized anchors that can get you in trouble with Google, it’s lack of anchor text diversity.

As things go in SEO, there’s no universal share for keyword-rich anchors in your link profile that will guarantee you top rankings. The best way to go about it is taking a look at your top competitors for hints on the anchors and distribution that work in your niche. Again, SEO PowerSuite‘s SEO SpyGlass is your best friend here.

  1. Launch SEO SpyGlass and create a project for your top ranking competitor.
  2. Switch to the Summary submodule for a breakdown of the competitor’s link profile. Here, you’ll get a solid idea of the diversity of their anchors and an anchor cloud for their links. Apart from helping you out with diversity, this dashboard can give you a good idea of anchors you may not have thought about before.

Do the same for your other competitors to find out the anchor trends that work in your niche, and… link build away!

Download SEO PowerSuite

To do the search engines justice, none of these myths are in fact their fault. Not entirely, anyway. They are often born and reinforced by SEOs themselves, heated up by fears and the gaps in what we know about Google’s algorithm.

If you think about it, all these myths boil down to one: “the SERP is a democracy”. It’s far from being true, and that’s not because Google’s imperfect. It’s because technology at large is imperfect. Every new cutting-edge system and algorithm is inevitably followed by equally cutting-edge ways of gaming it.

It is especially hard for Google to stay “fair” because it is trying equally hard to stay fresh. If you think about it, it’s a bit of an “either or” situation, and a lot of cheating goes unnoticed as Google tries to deliver content to searchers as it happens. It’s a sacrifice they make for the sake of the real-time Web Google’s — successfully — trying to create.

As always, I’m looking forward to your comments and some great discussion. Any SEO myths you can think of that are not on this list? Please share your thoughts and experiences below.


8 Steps to a Penguin-Proof Link Audit

Get ready for Penguin 4.0: it’s coming any minute now!
By: Masha Maksimava
June 21st, 2016

If Google updates were hurricanes, Penguin would be Patricia. It’s by far the most sweeping of all updates — and perhaps the hardest one to recover from. The first Penguin, released in 2012, made unnatural links (the ones SEOs have been carefully building for ages) a spam signal that removed sites guilty of “link schemes” from the SERPs, and changed the way we do off-page SEO completely.

Today, Penguin 4.0 seems to be just around the corner, and it’s going to be a very different breed of Penguin. Apart from the fact that the mechanisms used to detect spammy, unnatural links will be tweaked, it’s going to be real-time; that means Penguin, instead of being rolled out every year or every quarter, will always be “on” and updating. You read that right — no more calm before (or after) the storm for SEOs.

So when is the scary thing coming? Google has warned SEOs about it “coming soon” a couple of times now; the original release was planned for early 2016, but it looks like the update needed more testing than Google initially thought. In any case, there hasn’t been an official Penguin update for over a year now, and there’s no doubt it’s coming super soon. The scary, we-are-talking-weeks kind of soon. The you’re-totally-getting-hit-if-you-have-not-been-preparing kind.

Insights from the inside: key takeaways from SMX West 2016


Using keyword research for consumer intelligence

We’ve heard a lot about keyword research and its importance for on-page SEO. But have you thought about keyword research being the most trustworthy source of consumer and market insights?

Interestingly, ‘most’ is the key word here. Whatever happened to the good old customer surveys? Isn’t asking your customers what they need the best, easiest, and most effective way to get honest answers?

Well… According to Tony Verre from DreamFire, their answers are only honest half of the time — at best.

Tony Verre @TonyVerre
Founder, DreamFire Digital Marketing
“50 to 84% of consumers don’t provide honest responses about their behaviors in marketing surveys.”

This can be hard to believe, but Tony quotes some amazing examples of how big brands suffered equally big losses when they invested in marketing strategies based on customer feedback:

  • Walmart decluttered its shelves based on consumer surveys, losing $1.85 billion in revenue.
  • McDonald’s spent $300 million to create an “adult burger” in response to consumer feedback, which they eventually discontinued and labeled a failure.

There’s a bunch of reasons why consumers aren’t always completely honest in surveys. First, they might want to sound superior to the guy or girl next to them. Or, on the contrary, be trying to fit into the social norm. They could be too polite to tell you your idea, product, or service sucks. Quite often, they aren’t even being dishonest on purpose; sometimes it’s hard to give a truthful answer when asked about something upfront — because you’ve never really given the subject much thought.

So if we can’t trust consumers, who can we trust? Surprise surprise: search engines! The queries people type into search bars are the best — and extremely honest — examples of what they really want.

The great part is, there’s a bunch of keyword tools available to give you all that information (Tony’s top 3 are Keyword Planner, Google Related Searches, and Google Trends); all you need to do is group and interpret it right. The greatest part? You’re probably doing keyword research for SEO anyway — so you can get actionable insights with barely any extra effort. But before we move on to the how, let’s make things clear with the what. What are those ‘actionable insights’, exactly?

First, they are answers to the most burning questions for every marketer and entrepreneur.

What problems are people facing in your industry?
What needs are they solving with products like yours?
What bothers them most about your competitors?
What, on the contrary, do they like?

Second, they are the phases in the consumer journey that your customers go through. Being aware of those will let you make sure you meet your customers’ needs at each stage, and guide them smoothly on to the next one.

The result? You end up with both an on-page strategy and a holistic marketing strategy, built on the data you can trust.

Actionable tip

You can always do the research manually — but who wants that when there are free tools to save you time? 🙂 You can complete the entire research workflow described by Tony with SEO PowerSuite‘s Rank Tracker. Rank Tracker lets you get keywords from 20 keyword tools, including Tony’s favorites:

  • Google AdWords Keyword Planner
  • Google Trends
  • Google/Bing Related Searches
  • Google/Bing/Yahoo Autocomplete
  • Competitors’ pages
  • Google Search Console & Analytics
  • SEMRush
  • Keyword discovery

OK, I might want to stop here, but there are 20 keyword tools integrated into Rank Tracker in total. To start your keyword research, launch Rank Tracker, go to the Keyword Research module, and click Suggest Keywords. Pick any method you’d like to use (you can repeat the search for as many keyword tools as you like) and give the app a second to dig out the terms. The keywords you get will be readily grouped by topic to make organizing and navigating through them easier. You can sort keywords and groups by search volume to pick the most frequently used ones, and mark the terms that belong to one phase in the consumer journey with tags or color markers.

Oh, and you can do all of it in Rank Tracker’s free-forever version. But if you want to export your data, you’ll need a Professional or Enterprise license.

Download SEO PowerSuite

Voice search and the staggeringly near future of SEO

Google talks a lot about its increasing focus on mobile. We know that in 2015, people conducted more searches on phones and tablets than on desktop. But what exactly does it mean for SEOs? In the long run, it’s much more than going mobile-friendly.

Here’s one thing you may not have thought about yet: in the mobile world, people increasingly use speech. Google reports that 55% of teens and 40% of adults use voice search every day on the Google app. And it’s only logical that the amount of voice search is growing so fast. The concept may seem a little sci-fi-ish at first, but if you think about it, there’s not one reason why search shouldn’t be shifting away from typing. Voice search is quicker, it’s hands-free, and most importantly, it’s working extremely well: the current speech recognition word error rate is 8 percent, and it’s going down steadily.

Behshad Behzadi @GoogleZurich
Principal Engineer, Google Zurich
“The ratio of voice search is growing faster than type search.”

From the SEO standpoint, how is voice search different from typing? First, we must acknowledge that people don’t search the same way with their keyboards as they do with their vocal cords. Voice search is conversational search where people use more natural sentences instead of the odd-sounding query language:

“weather paris” -> “What’s the weather like in Paris?”

For SEOs, that means semantic context and conversational search are getting even bigger than before. We’ve been bidding farewell to short-tail keywords on over-optimized pages since the launch of Hummingbird in 2013, but voice search clearly makes exact-match keywords a thing of the past for good.

Second, it’s crucial to understand that a person using voice search has different intentions. Typically, they’re looking to get quick answers to their questions or fulfill an immediate need.

This means you’ll need to think about the questions your customers may have, and figure out quick, mobile-friendly ways to give them the answers.

Actionable tip

With the growing share of voice queries in search, you need to optimize for context, topics, and conversational keywords, and offer quick answers to fulfill the searchers’ immediate needs:

  • When you do keyword research, put even more focus on long tails, questions, and conversational queries.
  • Consider adding pages (or editing some of the ones you already have) that can give searchers quick information. Think about what questions your customers or audience are asking, and address those on your blog, FAQ pages, and social media posts.

You can find commonly searched-for questions with the keyword research module I already mentioned in SEO PowerSuite‘s Rank Tracker. Simply launch Rank Tracker (the free version is just fine), go to the Keyword Research section in the left-hand menu, and press the Suggest keywords button. Pick the Google Autocomplete method from the list, and type in keywords with wildcards. If, say, you’re running an SEO blog, these could be some good terms to enter:

Why * SEO
How * Google
What * search engines

Give Rank Tracker a moment to complete the search, and you’ll end up with hundreds of questions you can potentially target along with each term’s search volume to let you pick the most commonly asked ones.
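Under the hood, wildcard-based autocomplete mining typically works by substituting characters into the wildcard position and collecting suggestions for each probe. Rank Tracker’s internals aren’t public, so treat this as an illustration of the general technique, not its actual implementation:

```python
import string

def wildcard_probes(pattern):
    """Expand a single-wildcard pattern into per-letter probe queries,
    the way autocomplete miners commonly seed the suggestion box."""
    return [pattern.replace("*", letter, 1)
            for letter in string.ascii_lowercase]

probes = wildcard_probes("Why * SEO")
print(probes[:3])  # ['Why a SEO', 'Why b SEO', 'Why c SEO']
```

Each probe is then sent to the autocomplete source, and the returned suggestions are pooled and de-duplicated into your question list.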

Download SEO PowerSuite

The AI revolution: how RankBrain is shaking up the SERPs

Google’s known for its persistent drive to provide searchers with the best search results possible, and RankBrain is one of the new things that helps them achieve that. RankBrain is a machine learning system that analyzes search results and picks the most relevant ones to display at the top of the SERP based on the data from past searchers. Google did mention that RankBrain is the third most important factor in its ranking algorithm.

SEOs believe that to figure out relevance, RankBrain looks at all kinds of contextual features on pages that come up for any given query. Then, it gives every result a relevance score based on the presence or absence of those features.

It’s important to understand that those features are different for each keyword, so there’s no universal list of things to add to your pages to make RankBrain fall in love with your site. But one thing you can do is look at the pages of your best ranking competitors and try to figure out the common characteristics that make up their relevance score.

Such features can be literally anything on the page that can have a positive effect on user experience. To give you an example, Marcus Tober and the Searchmetrics team found that for ecommerce and health, pages with more content and more interactive elements are more successful.
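To make the presence-or-absence idea concrete, here’s a toy relevance score: the share of features common to all top-ranking pages that your own page also has. This is SEO folk modeling of how such scoring might work, not Google’s actual algorithm, and every feature name below is made up:

```python
# Toy relevance score: fraction of features (common to top-ranking
# competitor pages) that a candidate page also has. Purely illustrative —
# nobody outside Google knows RankBrain's real feature set.
competitor_features = [
    {"comparison_table", "faq", "images", "long_content"},
    {"comparison_table", "faq", "video", "long_content"},
    {"comparison_table", "faq", "images"},
]

# Features present on every top-ranking competitor page.
common = set.intersection(*competitor_features)

my_page = {"faq", "images", "short_content"}
score = len(common & my_page) / len(common)
missing = common - my_page
print(f"relevance score: {score:.0%}, missing: {sorted(missing)}")
```

The "missing" set is the actionable part: it tells you which traits all your top-ranking competitors share that your page lacks.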

To see how RankBrain has already impacted search results, Eric Enge and his colleagues at Stone Temple Consulting analyzed 163 queries from their database of cached Google SERPs. All of the 163 queries met the following criteria:

1. The search results shown indicated that Google didn’t understand the query;
2. There is a reasonable set of results that Google should be able to find for the query.

Then, they compared the cached results from July 2015 (before RankBrain) to the ones Google currently provides for these queries, and found something pretty staggering: 54.6% of the results have improved.

Here’s an interesting example from Stone Temple’s study: only half a year ago, Google provided some pretty confusing results for a keyphrase that, to a human, seems pretty straightforward. Now, they are doing a much better job:

Eric does admit that not all of the improvements they saw may have been due to RankBrain, but he strongly believes that a lot of the changes were RankBrain-related.

Eric Enge @stonetemple
CEO, Stone Temple Consulting
“RankBrain will do a better job of matching user queries with web pages, so you’d be less dependent on having all the words from the query on your page.”
Actionable tip

Relevance is crucial for good rankings, and RankBrain is perhaps Google’s best way of detecting relevance to date. When optimizing for your keywords, it’s more important than ever to think about user intent and fulfilling the need behind the query — even if that does not imply using the exact keywords in your page’s copy.

You can get a good idea of relevance features that may be used as signals by RankBrain if you analyze the pages of your competitors who currently rank in Google’s top for your set of keywords. To get a list of your major competitors, open SEO PowerSuite‘s Rank Tracker, go to Preferences -> Competitors, click Suggest, and enter your keywords (you can and should make the list long, but make sure you only enter terms that belong to the same topic at a time). Rank Tracker will look up all the terms you entered and come up with the 30 sites that rank in the top 30 most often. You can then choose up to 10 of those to add to your project, examine their pages in the browser, and look for relevance features you may want to incorporate on your site.


AMP is Google’s next big thing (here’s why you should care)

AMP (short for Accelerated Mobile Pages) is an open source project by Google that helps webmasters create lightning-fast mobile web pages. AMP includes three parts: AMP HTML (regular HTML with some restrictions and extensions), AMP JS (a library that ensures the fast rendering of AMP pages), and AMP Cache (Google’s cloud cache intended to reduce the time it takes for content to load on a user’s mobile device).

Google officially launched AMP into the search results this February for some queries, but we expect it to affect more searches soon.

With AMP, Google is looking to solve 2 major problems:

1. The poor user experience on mobile due to slow loading pages.
2. The increasing use of adblockers on mobile devices.

Google believes that more and more mobile users opt for adblockers because pages with ads load slower. Consequently, publishers, advertisers, and Google itself aren’t getting the same revenue from ads they used to.

Initially, webmasters were skeptical about AMP as it seemed too restrictive to achieve most websites’ purposes. The biggest concern was that JavaScript basically wasn’t allowed at all. However, AMP now has what Paul Shapiro from Catalyst calls a “magic extended component”, amp-iframe, that lets you do awesome things with your accelerated pages:

  • Web forms, for lead generation or other purposes,
  • Third-party JavaScript,
  • Embedded comment systems,
  • Some unsupported ad formats,
  • Videos via players not supported by AMP,
  • Interactive visualizations/charts (Google Maps, etc.),
  • Pop-ups.

But the main question is still the same: who is AMP right for? Currently, AMP results are only displayed in Google for publishers and news sites. But according to Googler Dave Besbris, it’s “an important time to get your feet wet” for all other industries as well.

Dave Besbris @tweetbez
Vice President of Engineering, Google
“We are going to be rolling out AMP elsewhere, including in search.”
Actionable tip

For non-publishers, going AMP is a matter of choice. Before you make your decision, it’s important to understand whether or not there are speed issues on the mobile versions of your pages. If you detect some problems — and if those are numerous and require a lot of your time and effort — you may consider AMP as an alternative to fixing the issues.

You can run a comprehensive mobile-friendliness test in SEO PowerSuite‘s WebSite Auditor under Content Analysis. Just select the page you’re looking into, click Update Factors, and look at the Page Usability (Mobile) section of the technical factors for any errors or warnings.


The backlinks of the future are… linkless?

For years, links have been the top ranking signal that SEOs tried hardest to optimize (and quite often, manipulate). But times are changing, and search engines are getting smarter.

Today, search engines have already figured out a better way to understand what to trust. In fact, the most important ranking signal has already evolved beyond the technical definition of a ‘link’. You’re in for a surprise: linkless mentions can be just as important.

Duane Forrester, formerly Sr. Product Manager at Bing, points out that unlinked mentions can be just as strong a signal as regular links, confirming that search engines can easily identify mentions, associate them with brands and products, and use those as a signal in determining a site’s quality and authority.

Duane Forrester @DuaneForrester
Vice President of Organic Search Operations, Bruce Clay
“Years ago, Bing figured out context and sentiment of tone, and how to associate mentions without a link. As the volume grows and trustworthiness of this mention is known, you’ll get a bump in rankings as a trial.”

Of course, links continue to be just as important as they’ve always been. What has changed is that now, along with links, you absolutely need to gain reviews and other positive mentions. You may have already been doing this for reputation management and brand awareness; now you have every reason to do it for SEO, too.

Actionable tip

In addition to a backlink checker, you’ll now need a monitoring tool that finds mentions of your brand and product across the Web. Mind that a lot of apps only look for mentions on social media, so make sure the one you choose is good at digging up mentions from regular web resources, too (like review platforms, forums, and industry blogs). AWARIO is really good at this (they have a free trial, too). If you have an SEO PowerSuite license, you can also get AWARIO at half its regular price for 6 months here.

Make sure you watch for and promptly reply to any negative comments about your brand to stop them from spreading further, and follow up on positive ones. If there are few or no mentions of your brand, it’s a good idea to start offering perks and bonuses to your customers in exchange for reviews or blog posts, or to begin partnerships with influencers in your niche.

Google confirms using click data to rank pages (in one way at least)

The debate on whether or not Google uses user behavior factors in its ranking algorithm has been on for ages, and at SMX, Google’s ranking engineer Paul Haahr finally shed some light on the issue.

According to Paul, Google uses a bunch of “metrics” to evaluate and rank search results. Those metrics can come from human quality ratings and, more importantly, live experiments. Thousands of such experiments are run for Google searchers in real time, at any given second.

Paul Haahr @haahr
Ranking Engineer, Google
“When you run a search on Google, you’re in at least one experiment.”

How do those tests work? Google swaps search results returned in response to queries and then looks at how the change affects the click-through rates of those results.

After an experiment is finished, Google will re-rank the search results accordingly. So while Google may not have confirmed using click data to re-rank search results in real time, they are certainly using it in experiments and re-ranking the results once an experiment is complete.

In other words, we’re still not sure if they do this:

Gather click data for search results -> Determine an unusual range of clicks for result A -> Re-rank the result accordingly

But we now know for sure they do this:

Swap results A and B on page 1 of search results -> Gather click data for search results -> Determine an unusual range of clicks for result A -> Re-rank the result accordingly

Tomayto tomahto, if you ask me. For SEOs, this still means that click data is important for rankings — and it’s just another reason to optimize your Google snippets.
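The confirmed experiment flow above can be sketched as a toy model. This is an illustration of the described logic only: the expected-CTR figures and the scoring rule are assumptions, not Google’s actual implementation.

```python
# A toy model of the experiment flow described above: swap two results,
# gather clicks, and re-rank if the observed CTR departs from what the
# position alone would predict. Expected-CTR numbers are illustrative.

EXPECTED_CTR = {1: 0.33, 2: 0.15}   # rough per-position expectations

def rerank_after_experiment(results, clicks, impressions):
    """results: list of ids by position; clicks/impressions: dicts per id."""
    scores = {}
    for pos, rid in enumerate(results, start=1):
        ctr = clicks[rid] / impressions[rid]
        # score = how much the result over- or under-performs its position
        scores[rid] = ctr / EXPECTED_CTR[pos]
    return sorted(results, key=lambda rid: scores[rid], reverse=True)

# Result B, shown at position 2 during the experiment, gets clicked far
# more often than a position-2 result normally would:
order = rerank_after_experiment(
    ["A", "B"],
    clicks={"A": 200, "B": 180},
    impressions={"A": 1000, "B": 1000},
)
print(order)  # ['B', 'A']
```

In this toy run, B’s 18% CTR at position 2 beats its 15% expectation, while A’s 20% falls short of the 33% expected at position 1, so B moves up.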

Actionable tip

The logical prerequisite of a good SERP click-through rate is a click-worthy snippet, so the first thing to do is check on your snippets’ current click-through rates in Google Search Console to identify the ones you need to focus on first.

Consider A/B testing your snippets if you do PPC, and then incorporating the insights you get from those tests into the snippet of your organic listing.

To edit and play around with your snippet with a Google SERP preview, launch SEO PowerSuite‘s WebSite Auditor, go to Content Analysis -> Content Editor, and switch to the Title & Meta tags tab.

As you compose your title and description, make sure they clearly communicate to searchers the value of clicking through to your page. If appropriate, use a call to action; instead of simply describing what your page is about, address the searcher directly and inform them about the benefits of navigating to your page, choosing your product, and so on.

Once you’re happy with your snippet, hit Save page to save the upload-ready HTML file to your hard drive.


In summary, we can see how quickly SEO is changing, often in ways we could not have predicted. In doing so, search engines are keeping the game hard, fair, and, if you think about it, interesting.

Even if user behavior isn’t a ranking factor yet, it will surely be one soon


This article first appeared on SEO PowerSuite.

Whether or not user behavior factors affect rankings is a controversial topic: Google denies it, while many experiments prove the opposite.

Anyhow, even if user signals are not influencing your rankings right now, common sense and logic say they are the ranking factors of tomorrow, so it may be wise to get ready today.

Rand Fishkin presented his two-algorithm concept of SEO, suggesting that search marketers combine classic Google-oriented SEO with the new searcher-oriented SEO.

In a two-algorithm world, we need to focus on 5 new elements of SEO — the so-called searcher outputs.

1. Click-through rates. With behavior factors in play, your CTR will be one of the things that determine how you rank — so revising your titles and descriptions one more time could pay off in even more ways than before.

2. Engagement. Do the searchers find on your page what they are looking for, or do they go back to click other search results? Do the searchers stay on your page, proceed to other pages, or do they bounce in just a second or two? The way users engage with your content is likely to influence your rankings, so Rand suggests a list of things to attend to for better engagement:

  • Content that fulfills the searcher’s conscious and unconscious needs;
  • Speed, speed, and more speed;
  • Delivering the best UX on every browser;
  • Compelling visitors to go deeper into your site;
  • Avoiding features that annoy or dissuade visitors.

3. Information that fills gaps in searchers’ knowledge. With the purpose of delivering a rewarding search experience, Google’s machine learning models could look at search results that people eventually land on when they search for keyword X to identify what those have in common. For example, the presence of certain words could predict more successful searches. Watching users searching for “New York”, Google could conclude that a page about New York that doesn’t mention Brooklyn or Long Island is not offering the information searchers are looking for (and hence should not rank very high).

4. Shares, links, and loyalty per visit. Though social signals aren’t officially a ranking factor (while backlinks are), experiments show that pages with lots of social activity and few links outperform ones with more links and fewer shares — even for insanely competitive keywords.

But it’s not just the bare numbers that count – Google may also be looking at how social activity grows over time, and whether or not social engagement results in loyalty and returning visits.

5. Fulfilling the searcher’s task (not just their query). Google wants searchers to complete their tasks quickly, so it’s quite possible that the ranking results will differ depending on the user intent (for instance, purchase) that Google associates with a particular query.

This article first appeared on SEOPowerSuite’s Web site.
I strongly recommend this site to anyone wanting to better understand SEO. 

Higher Google ranks no longer mean more organic clicks


This article first appeared on SEO PowerSuite. The content was provided by:

Chris Pinkerton
Vice President, Client Development at Mediative North America, @chrispinkerton
“With different layouts of search result pages, where searchers are conditioned to look and click changes”

Ehren Reilly

Director of Product at Glassdoor, @ehrenreilly
“It’s no longer viable to have high-quality non-proprietary content.”

Eric Enge

CEO and Founder at Stone Temple Consulting, @stonetemple
“We ran a test, publishing 5 pieces of content that answered 5 common SEO questions. In 3 days, we saw our content included into Google’s rich answers.”

Barry Adams

Founder at Polemic Digital, @badams
“If you waste crawl budget because of slow page loading, the right pages are unlikely to be crawled & indexed.”

Adam Audette

SVP Organic Search at Merkle, @audette
“If an ecommerce site makes $100k in revenue per day, a 1 second load delay equates to $2.5M in lost revenue annually.”

Marshall Simmonds

Founder of Define Media Group Inc, @mdsimmonds
“Dark traffic is a problem for marketers because we’re not getting credit for the traffic that we’re creating. Be aware and try to analyze it.”

Christine Churchill

President and CEO of KeyRelevance LLC, @chrischurchill
“Research core phrases plus synonyms and words that frequently occur with your main terms. Then build a keyword matrix, assigning each of the keyword groups to different pages.”
Sha Menz
Link Removal Specialist, rmoov.com, @ShahMenz
“You don’t have a link penalty and think you’re safe? Don’t bet on it. You need to actively disavow and remove low quality links that you find during link audits even if you don’t have a penalty yet.”
Rand Fishkin
Founder, MOZ, @randfish
“‘Make pages for people, not engines’ is terrible advice. Engines need the things we’ve always done, and we’d better keep doing that. People need additional things — and we’d better do those, too.”

Are you still assuming the #1 Google ranking guarantees you the most organic clicks?

Well, here comes the bad news… While the click-through rate (the percentage of searchers that click on a site’s listing when they see it in search results) still correlates with position, lots of other factors can influence your organic clicks in Google in 2016.

Paid ads, local packs, carousel results, knowledge graphs and rich snippets — all these attention-grabbing SERP elements can drain away the clicks from your #1 ranking website.

The latest study by Mediative (which tracked searchers’ eye movements and eventual clicks across different SERPs) showed that the way searchers interact with the SERP varies a great deal from query to query.

Here, for example, is how views and clicks are distributed on a SERP with a local pack, a paid ads pack, and a carousel:

For search marketers, this change means two things:

1) Keyword search volume by itself is no longer a reliable metric for predicting how much organic traffic a keyword can generate for your website. Before investing effort into optimizing your site for a keyword, take a look at the search result page. See if there are SERP elements that may be stealing clicks from your organic listing, to get a better idea of that keyword’s traffic potential.

2) Besides trying to get higher search rankings, you need to pay attention to the extra click opportunities the SERPs give you. Can you squeeze into the local pack results? Can you use structured data to get a rich snippet listing? Should you launch a Google AdWords campaign for this particular query?
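Point 1 can be turned into a rough back-of-the-envelope calculation. The per-position CTRs and per-feature discounts below are illustrative assumptions for the sketch, not measured values.

```python
# Hedged sketch: estimate a keyword's organic traffic potential by discounting
# search volume for the CTR of your target position and for click-stealing
# SERP features. All multipliers here are illustrative assumptions.

POSITION_CTR = {1: 0.30, 2: 0.15, 3: 0.10}   # rough organic CTR by rank
FEATURE_DISCOUNT = {                          # assumed click loss per feature
    "paid_ads": 0.15,
    "local_pack": 0.20,
    "knowledge_graph": 0.10,
}

def traffic_potential(volume, target_position, serp_features):
    """Estimated monthly organic clicks for a keyword."""
    clicks = volume * POSITION_CTR[target_position]
    for feature in serp_features:
        clicks *= 1 - FEATURE_DISCOUNT.get(feature, 0)
    return round(clicks)

# 10,000 monthly searches, aiming at #1, but the SERP has ads and a local pack:
print(traffic_potential(10_000, 1, ["paid_ads", "local_pack"]))  # 2040
```

Even with crude multipliers like these, comparing keywords this way gives a more honest picture than raw search volume alone.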

Rich answers are on the increase — that’s both a threat and an opportunity

Another huge trend in search is the rise of Google’s rich answers. According to a study by Stone Temple Consulting, Google returns rich answers to 35% of search queries these days. This is a 38% increase over the past 6 months, and we are clearly in for some further growth.

Quite often, rich answers are built based on public data (like “President of the USA”) or the data licensed to Google (like song lyrics). So, if your SEO strategy was built on public domain data… you better change direction right now, because getting search traffic from Google will only be getting harder for you.

But what if you have high-quality, unique, proprietary content that can help Google answer common searchers’ questions? Well, rich answers are an opportunity for you. According to Eric Enge of Stone Temple Consulting, for 75% of rich answers Google uses external data and includes a link to its source.

Getting your page featured in a rich answer can give you a massive traffic boost — CTR for clickable rich answers is ~2X better than that for the #1 ranking page on a SERP with no rich answer. And the best thing is, getting featured in rich answers is absolutely possible even if your site’s authority is not very high yet.

So, how about pushing your site to direct answers and thus making it rank higher than Her Highness Wikipedia? 🙂

1. Start with long tail keyword research — you need to identify the commonly searched questions in your niche.
2. Create a piece of content that directly answers those questions. Make sure to include the question itself and a direct answer to it — keep in mind that for rich answers, the structure of your answer is more important than your site’s relevance and authority.
3. Make sure your article is truly helpful and provides additional information on the matter. This will not only increase your chances of getting featured as a rich answer, but will also help you entice more clicks.
4. Make your content easy to find for people and search engines (make sure it’s accessible to Google’s bots and easy to reach through your site’s navigation; share links to it on your social accounts; submit it via Google Search Console, etc.).

Page speed is utterly important — optimize it today, don’t put it off till tomorrow

First things first: page speed is a ranking factor. All other things being equal, the site that loads quicker will outrank a competing site, hands down.

Second, slow-loading pages waste your site’s crawl budget (yeah, Google has allocated a specific time for crawling your website, and Google’s bot won’t stay on your website longer than that). For a bigger website, this means that the slower your pages load, the fewer of them get indexed by Google.

Tracking your organic traffic with Google Analytics gets even more difficult

If you’re staring in despair at your Google Analytics traffic report, unable to figure out where all this direct traffic is coming from… Your problem is “Dark traffic”.

According to Marshall Simmonds, when Google Analytics is unable to identify where your site’s visits are coming from, those visits are recorded as direct traffic. And as these unidentifiable visits keep growing in number across the modern Web, your Google Analytics reports get less precise: they report direct traffic growth, while in reality you’re growing your organic, social, and mobile traffic.

This makes tracking your marketing activities even more complicated.

So, first of all, you need to be aware of the cases when your traffic goes dark. According to Marshall, the common cases are:

  • Traffic from a secure site to non-secure;
  • Traffic from image search;
  • Traffic via links in applications;
  • A big portion of traffic from Facebook, SnapChat, WhatsApp;
  • Traffic from the Android search app.

Second, though measuring the precise amount of dark traffic is impossible, you can at least get a better idea of how it affects your site. To do that, Marshall recommends that you:

  • Create a direct traffic report in Google Analytics
  • And then filter out the traffic to pages that are naturally visited “directly” — like your homepage or the front pages of important content sections which users are likely to bookmark.

Keywords are neither dead nor dying — they are still the basis of your SEO campaign

Keywords and keyword targeting are the most basic and longest-running concepts in SEO. And if you’ve been in search for quite some time, you may remember the days when SEO meant just having the right words in your meta keywords tag.

Sure, these times have passed and will never come back: search engines now use much more complicated algorithms to determine webpages’ quality and relevancy. But does this mean keywords are dead? Experts agree — keywords and keyword research should still be the basis of your SEO and content marketing campaigns. However, Google’s Hummingbird update shifts our focus from researching separate keywords to researching groups of related terms and synonyms.

Now that Google is able to recognize the meaning behind a search query, it gives a common answer to a number of “different-in-keywords” but “same-in-meaning” queries. So if you want to grab yourself a place in the post-Hummingbird search results, you need your pages to be relevant not only to the core term, but to a whole group of its synonyms and related terms.

The aim of your keyword research is now to identify not individual keywords, but groups of thematically connected terms for your pages to target.

This article first appeared on SEOPowerSuite’s Web site.
I strongly recommend this site to anyone wanting to better understand SEO. 

User Behavior as a Ranking Signal

How user metrics determine where you rank in Google (and how you can use that to your advantage)

The debate around the correlation between behavioral metrics and rankings has been on for a few years at least. If you think about it, the topic seems to be pretty uncontroversial — after all, that’s a yes or no question. Then why are opinions on it so divided?

The problem is, what Google representatives have said on the issue has been surprisingly inconsistent. They confirmed it, and they denied it, and they confirmed it again, but most often, they equivocated.

Here’s a little thought experiment: for a moment, let us assume that user behavior is not a ranking signal. Can you think of a reason why Google employees would often publicly say the opposite? (You’ll find some pretty straightforward quotes later on in this article.)

Now, let’s assume for a second that Google does consider user metrics in its ranking algorithms. Why wouldn’t they be totally open about it? To answer that question, I’d like to take you back to the days when links largely determined where you ranked. Remember how everyone did little SEO apart from exchanging and buying backlinks in ridiculous amounts? Until Penguin and manual penalties followed, of course.

With that experience in mind, Google probably doesn’t have much faith left in the good intentions of humankind — and it’s quite right not to. Revealing the ins and outs of its ranking algorithms could be an honorable thing to do, but it would also be an incentive for many to use that information in a not-so-white-hat way.

If you think about it, Google’s aim is a very noble one — to deliver a rewarding search experience to its users, ensuring that they can quickly find exactly what they’re looking for. And logically, the implicit feedback of Google searchers is a brilliant way to achieve that. In Google’s own words:

“The general assumption […] is that searching users are often the best judges of relevance, so that if they select a particular search result, it is likely to be relevant, or at least more relevant than the presented alternatives.”

In this article, you’ll find the evidence that Google uses behavioral metrics to rank webpages, the 3 most important metrics that can make a real SEO difference, and real-life how-tos on applying that knowledge to improve your pages’ rankings.

The evidence

Using behavioral metrics as a ranking factor is logical, but it’s not just common sense that’s in favor of the assumption. There’s more tangible proof, and loads of it.

1. Google says so.

While some Google representatives may be reluctant to admit the impact of user metrics on rankings, many aren’t.

In a Federal Trade Commission court case, Google’s former Search Quality chief, Udi Manber, testified the following:

“The ranking itself is affected by the click data. If we discover that, for a particular query, hypothetically, 80 percent of people click on Result No. 2 and only 10 percent click on Result No. 1, after a while we figure probably Result 2 is the one people want. So we’ll switch it.”

Edmond Lau, who also used to work on Google Search Quality, said something similar:

“It’s pretty clear that any reasonable search engine would use click data on their own results to feed back into ranking to improve the quality of search results. Infrequently clicked results should drop toward the bottom because they’re less relevant, and frequently clicked results bubble toward the top.”

Want more? Back in 2011, Amit Singhal, one of Google’s top search engineers, mentioned in an interview with the Wall Street Journal that Google had added numerous “signals,” or factors, into its algorithm for ranking sites: “How users interact with a site is one of those signals.”

2. Google’s patents imply so.

While Google’s employees can equivocate about the correlation between user metrics and rankings in their public talks, Google can’t omit information like this from its patents. It can word it in funny ways, but it can’t omit it.

Google has a whole patent dedicated to modifying search result ranking based on implicit user feedback.

“[…] User reactions to particular search results or search result lists may be gauged, so that results on which users often click will receive a higher ranking.”

There are a bunch of other patents that mention user behavior as a ranking factor. In those, Google clearly states that along with the good old SEO factors, various user data can be used by a so-called rank modifier engine to determine a page’s position in search results.

3. Real-life experiments show so.

A couple of months ago, Rand Fishkin of Moz ran an unusual experiment. Rand reached out to his Twitter followers and asked them to run a Google search for best grilled steak, click on result No.1 and quickly bounce back, and then click on result No. 4 and stay on that page for a while.

Guess what happened next? Result No.4 got to the very top.

Coincidence? Hmm.

The 3 user metrics that make a difference

Okay, we’ve figured out that user behavior can affect your rankings — but what metrics are we talking about exactly?

1. Click-through rate

It is clear from the numerous patents filed by Google that it collects and stores information on the click-through rates of search results. A click-through rate is the ratio of the number of times a given search listing was clicked on to the number of times it was displayed to searchers. There isn’t, of course, such a thing as a universally good or bad CTR, and many factors that affect it are not directly within your control; Google is well aware of those factors.

To start with, there’s presentation bias. Clearly, SERP CTR varies significantly for listings depending on where they rank in search results, with top results getting more clicks.

“The basic rationale embodied by this approach is that, if a result is expected to have a higher click rate due to presentation bias, this result’s click evidence should be discounted; and if the result is expected to have a lower click rate due to presentation bias, this result’s click evidence should be over-counted.”

Secondly, different click-through rates are typical for different types of queries. For every query, Google expects a CTR in a certain range for each of the listings (e.g. for branded keywords, the CTR of No.1 result is around 50%; for non-branded queries, the top result gets around 33% of clicks). As we can see from Rand Fishkin’s test mentioned above, if a given listing gets a CTR that is seriously above (or below) that range, Google can re-rank the result in a surprisingly short time span.

One other thing to remember is that CTR affects rankings in real time. After Rand’s experiment ended, the listing users had been clicking and dwelling on eventually dropped back to position No. 4 (right where it was before). This shows us that a temporary increase in clicks can only result in a temporary ranking improvement.

2. Dwell time

Another thing that Google can use to modify rankings of search results, according to the patents, is dwell time. Simply put, dwell time is the amount of time that a visitor spends on a page after clicking on its listing on a SERP and before coming back to search results.

Clearly, the longer the dwell time the better — both for Google and for yourself.

Google’s patent on modifying search results based on implicit user feedback says the following user information may be used to rank pages:

“The information gathered for each click can include: (1) the query (Q) the user entered, (2) the document result (D) the user clicked on, (3) the time (T) on the document […]. The time (T) can be measured as the time between the initial click through to the document result until the time the user comes back to the main page and clicks on another document result. In general, an assessment is made about the time (T) regarding whether this time indicates a longer view of the document result or a shorter view of the document result, since longer views are generally indicative of quality for the clicked through result.”
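The T-measurement the patent describes can be illustrated with a minimal sketch. The log format here is an invented example; the point is only that dwell time falls out of the gap between consecutive clicks on the SERP.

```python
# Illustrative sketch of the time-on-document measurement described above:
# dwell time as the gap between a click on one result and the next click
# back on the SERP. The log format is an assumption for the example.

def dwell_times(click_log):
    """click_log: list of (timestamp_seconds, result_id) in click order.
    Returns {result_id: seconds before the user clicked another result}."""
    times = {}
    for (t1, rid), (t2, _next) in zip(click_log, click_log[1:]):
        times[rid] = t2 - t1
    return times

log = [(0, "D1"), (8, "D2"), (190, "D3")]
print(dwell_times(log))  # {'D1': 8, 'D2': 182}
# A "short view" like D1's 8 seconds would count against that result, while
# D2's 182-second view is the kind of longer view the patent treats as
# indicative of quality.
```

Note that the last click in such a log has no measurable dwell time: the user never came back to the SERP, which is itself a good sign for that result.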

3. Pogo-sticking

Google wants searchers to be satisfied with the first search result they click on (ideally, the No.1 result). The best search experience is one that immediately lands the searcher on a page that has all the information they are looking for, so that they don’t hit the back button to return to the SERP and look for other alternatives.

Bouncing off pages quickly to return to the SERP and look for another result is called pogo-sticking.

“Additionally, the user can select a first link in a listing of search results, move to a first web page associated with the first link, and then quickly return to the listing of search results and select a second link. The present invention can detect this behavior and determine that the first web page is not relevant to what the user wants. The first web page can be down-ranked, or alternatively, a second web page associated with the second link, which the user views for longer periods of time, can be up-ranked.”
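The behavior the patent passage describes can be sketched as follows. The 30-second "quick return" threshold is an assumption made for illustration; the patent does not specify a number.

```python
# A minimal sketch of pogo-sticking detection as described above: a quick
# return to the SERP followed by another click down-ranks the first page
# and up-ranks the second. The threshold is an illustrative assumption.

QUICK_RETURN_SECONDS = 30

def pogo_stick_adjustments(click_log):
    """click_log: list of (timestamp_seconds, result_id) in click order.
    Returns {result_id: 'down' | 'up'} per the described behavior."""
    adjustments = {}
    for (t1, first), (t2, second) in zip(click_log, click_log[1:]):
        if t2 - t1 < QUICK_RETURN_SECONDS:
            adjustments[first] = "down"   # abandoned quickly: likely irrelevant
            adjustments[second] = "up"    # the result the user settled on
    return adjustments

log = [(0, "page_a"), (12, "page_b")]    # user bounced off page_a in 12 seconds
print(pogo_stick_adjustments(log))       # {'page_a': 'down', 'page_b': 'up'}
```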

All of this affects your website’s overall quality score

OK, this one is really big: good performance in terms of clicks and viewing times isn’t just important for the individual page (and query) you are going after. It can impact the rankings of your site’s other pages, too.

Google has a patent on evaluating website properties by partitioning user feedback. Simply put, for every search somebody runs on Google, a Document-Query pair (D-Q) is created. Such pairs hold the information about the query itself, the search result that was clicked on, and user feedback in relation to that page. Clearly, there are trillions of such D-Qs; Google divides them into groups, or “partitions”. There are thousands of “partition parameters”, e.g. the length of the query, the popularity of the query, its commerciality, the number of search results found in Google in response to that query, etc. Eventually, a single D-Q can end up in thousands of different groups.

“So what?”, you might reasonably ask. The reason Google does all of this mysterious partitioning is to determine a certain property of a website, such as its quality. This means that if some of your pages perform poorly in certain groups compared to other search results, Google could conclude that yours is not a high-quality site overall, and down-rank many — or all — of its pages. The opposite is also true: if some of your pages perform well compared to competitors’, all of your site’s pages can be up-ranked in search results.

“In general, one innovative aspect of the subject matter described in this specification can be embodied in methods that include the actions of: […] evaluating, by a processor, a property parameter of the website based on aggregated user feedback data of the D-Qs included within at least one of the one or more groups; and providing the evaluated property parameter as an input for ranking documents from the website as result documents for searches.”

How-to: Optimize user metrics for better rankings

Tracking and improving on the three metrics above is a must. Offering a great user experience to visitors is a win-win per se; as a sweet bonus, it can also improve your organic rankings.

1. Boost your SERP CTR

The snippet Google displays for search results consists of a title (taken from the page’s title tag), a URL, and a description (the page’s meta description tag).

  • Make sure your title and meta description meet the tech requirements

If you’re just starting to optimize your listings’ organic click-through rate, the first thing to do is make sure all your titles and meta descriptions are in line with SEO best practices.

  • Find pages with poor CTR

Once you’re positive your meta descriptions and titles meet the technical requirements, it’s time to go after the CTR itself. Log in to your Google Search Console account and go to the Search Analytics report. Select Clicks, Impressions, CTR, and Position to be displayed:

While CTR values for different positions in Google SERPs can vary depending on the type of the query, on average, you can expect at least 30% of clicks for a No.1 result, 15% for a No.2 result, and 10% for a No.3 result. Here’s a graph from Chitika’s CTR study:

If the CTR for some of your listings is seriously below these averages, these are the problem listings you’d want to focus on first.

By: Masha Maksimava

This article first appeared on SEOPowerSuite’s Web site.
I strongly recommend this site to anyone wanting to better understand SEO. 

SEO for e-commerce sites



Marketing an e-commerce website comes with very specific challenges. First, e-commerce promotion requires the marketer to balance user experience, conversion optimization, and SEO. Second, online stores are typically large-scale and complicated — so trivial on-page optimization tasks can turn into a never-ending nightmare.

In this guide, you’ll find a set of content and technical issues anyone doing e-commerce SEO will face, and actionable how-tos and time-saving tricks on tackling them.


Keyword research
Forget generic keywords, think smart word combinations

Keyword research for e-commerce websites goes far beyond search volume and competition analysis. The reason for that is the complexity of the buying cycle — each step in it calls for different keywords you’d need to target.

Typically, customers of an online store go through 5 stages before making a purchase; different keywords and keyword phrases are used at each stage.

1. Compile 3 separate keyword lists

As you can see, keywords that customers use at the Research, Comparison, and Purchase stages offer the most value in terms of conversion potential. To get a comprehensive list of those, you’ll need to come up with 3 groups of keywords that you’ll later combine into long tail queries.

But before you start, remember to research the search patterns typical of your target audience: consider gender, age, and social status. For example, if you are a man struggling to get organic traffic for a skin care store, talk to your female colleagues or friends to find out the jargon they use when they discuss these products. Spend some time on relevant social media resources to learn your audience’s language.

When you’re positive you understand how your customers talk and which words they use, get down to building your keyword lists.

  • Prepare a list of action keywords that customers might use at the Comparison and Purchase stages as part of their query. Don’t add the product or category names to these keywords yet.
    E.g. “buy”, “purchase”, “price”, “compare”, “review”.
  • Get a full list of brands available at your store.
    E.g. “Sony”, “Samsung”, “Apple”.
  • Compile a list of categories, product names, and their core properties, like size or color.
    E.g. “TV”, “laptop”, “smartphone”; “iPhone”, “Galaxy Note”, “34-inch display”.

2. Mix the keywords up

Once you’ve got these three lists ready, it’s time to move on to putting together search phrases. Combining generic keywords with product keywords and their properties should give you dozens of long tail keywords — like “buy 42-inch Samsung TV”. It works like a slot machine: you turn the reels and get new keyword phrases.

You can do it manually if you only need to mix up a dozen keywords. However, given the size of the inventory in most online stores, you will likely need software tools to get things done quickly.
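If you’d rather script the mixing step yourself, it is a one-liner with itertools. A minimal Python sketch, with made-up keyword lists (the words below are illustrative, not pulled from any real store):

```python
# "Slot machine" keyword mixing: one word from each list per phrase.
from itertools import product

action_words = ["buy", "price", "review"]
brands = ["samsung", "sony"]
products = ["tv", "smartphone"]

# Every combination of one item from each list, joined into a phrase.
long_tail = [" ".join(combo) for combo in product(action_words, brands, products)]

print(len(long_tail))  # 3 * 2 * 2 = 12 phrases
print(long_tail[0])    # buy samsung tv
```

You can then feed the resulting list into whatever rank-tracking tool you use to pull search volume and competition data.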

Try using Rank Tracker’s Word Combination option to get a full list of possible long tail keywords instantly.

1. Create or open a project in Rank Tracker.
2. Click the Suggest Keywords button.
3. Select Word Combination from the available keyword research methods, and hit Next.
4. Select the number of parts to combine, enter your keywords in the columns, and click Next once more.

(By the way, it looks exactly like a slot machine!)

In an instant, you’ll get plenty of long tail keyword phrases.

Select the keywords to add to your project and hit Update KEI to get their search volume, competition, and Keyword Efficiency Index.

Voila — you’ve just saved yourself a couple of hours!


Keyword matrix
Do smart keyword targeting to avoid cannibalization

Have you heard about keyword cannibalization? In short, if several pages of your website respond to the same search query, these pages will compete with each other for rankings in the SERPs, and the search engines may rank the page that is less suitable or important from your standpoint.

In order to avoid keyword cannibalization, create a keyword matrix. Fill the rows of a spreadsheet with the URLs of your site’s most important pages (most likely product category pages), and create columns for your keywords. Put a mark at the intersection of a row and column to assign a certain keyword to a certain page. This method will help you make sure you don’t target the same important keyword across multiple pages.

E.g., the keyword columns could be “samsung tv”, “toshiba tv”, and “sony tv”.
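For a large matrix, spotting double assignments by eye gets tedious. Here is a minimal Python sketch of the same check, with hypothetical URLs and keywords:

```python
# Keyword matrix: each URL maps to the keywords assigned to it.
# URLs and keywords below are illustrative examples.
keyword_matrix = {
    "/tvs/samsung/": ["samsung tv", "buy samsung tv"],
    "/tvs/sony/": ["sony tv"],
    "/tvs/": ["buy tv online", "samsung tv"],  # "samsung tv" assigned twice!
}

# Collect the pages each keyword is assigned to.
assignments = {}
for url, keywords in keyword_matrix.items():
    for kw in keywords:
        assignments.setdefault(kw, []).append(url)

# Any keyword mapped to more than one URL is a cannibalization risk.
conflicts = {kw: urls for kw, urls in assignments.items() if len(urls) > 1}
print(conflicts)  # {'samsung tv': ['/tvs/samsung/', '/tvs/']}
```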

If the CMS of your online store creates separate pages for product variations such as size and color, it makes sense to restrict such pages from indexing using robots.txt or the <meta name="robots" content="noindex"> tag. Canonicalization is another solution (see Google’s guidelines for detailed instructions).
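As a sketch of the robots.txt route, assuming the CMS exposes variations as query parameters (the paths and parameter names below are purely illustrative):

```text
# robots.txt: keep crawlers away from color/size variation URLs
User-agent: *
Disallow: /products/*?color=
Disallow: /products/*?size=
```

Note that the * wildcard is supported by Google’s crawler but is not part of the original robots.txt standard, so check how other bots you care about handle it.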


On-page optimization
Save time on optimizing thousands of product pages

An e-commerce website typically has a limited set of category pages and thousands of product pages. Everything is more or less clear with category pages (they are usually subject to the traditional on-page SEO approach; if you are new to SEO, check out the A to Z SEO Guide for the steps).

Things get trickier when it comes to product pages. You’ll hardly have the time and resources to create unique titles, H1 tags, and descriptions for each product page.

Luckily, the slot machine approach (see the Keyword research section) can be used for meta tags just as well.

Create title, meta description and H1 templates for your product pages. For example, you may use this template for the title tag: Buy [ProductName] online | Your store name

[ProductName] is a variable that changes for every page depending on the product. If your CMS does not support variables, ask your development team for help.

Do the same for your H1s and descriptions — and remember that titles and meta descriptions are displayed in your listing’s snippet in SERPs, so make sure to use strong calls-to-action to entice clicks from search results.
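If your CMS supports custom code, the templating itself is trivial. A Python sketch, with a hypothetical store name and product fields (the field names are illustrative, not from any particular CMS):

```python
# Title/description templates with a {name} variable per product.
TITLE_TEMPLATE = "Buy {name} online | GadgetStore"
DESCRIPTION_TEMPLATE = "Order the {name} for {price} with free shipping. In stock now!"

product = {"name": "Samsung UE42 TV", "price": "$499"}

# str.format ignores unused keyword arguments, so one product dict
# can feed every template.
title = TITLE_TEMPLATE.format(**product)
description = DESCRIPTION_TEMPLATE.format(**product)

print(title)  # Buy Samsung UE42 TV online | GadgetStore
```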


Make sure every page on your site is unique

Duplicate content issues for e-commerce sites fall into two categories:

  • Off-site — the content appears on many websites across the web.
  • On-site — many pages of the website feature the same content.

1. Fix off-site duplication

Off-site duplication is natural for e-commerce. Online stores often use product descriptions, images, and specifications provided by the manufacturers. This is logical, since you cannot invent new specs for the latest iPhone. Still, there are a number of solutions to the problem.

  • Write unique descriptions for each item. If you have a team of copywriters to get the entire inventory covered — go for it. Just keep in mind that as the inventory scales up, you’ll need to keep up with the copy as well.
  • Leverage user-generated content. Create incentives for visitors to write reviews of the items they purchased. Send follow-up emails and ask for a review nicely, or offer discounts or bonuses to customers who leave a review. On the downside, there’s no guarantee that you will have a steady flow of reviews for all the items being sold. Additionally, reviews should be moderated to avoid spam or obscene language, which requires additional resources.
  • Add a Q&A section for each product. You can make your product descriptions unique by adding a FAQ section with questions customers often have about the product. Again, doing this will require additional human resources.
  • Optimize product category pages only. If you don’t have the time and resources to work on product pages, you can choose to create unique content for category pages only. In this case, it’s necessary to prevent the indexation of the product pages (using robots.txt or meta tags) — this means that the product pages will not appear in the SERPs.

2. Fix on-site duplication

On-site duplication is a frequent problem across the pages of online stores. It can be caused by the e-commerce content management system or an illogical website structure.

There are two typical scenarios. First, a product may belong to several categories: one Samsung TV set, say, could be found in “Home”, “TVs”, and “Samsung”. The CMS may then generate different URLs for the very same product depending on the path a user takes in the product catalog.


Second, the CMS could generate a separate URL and page for each variation of one product (e.g. size, color, or other specifications). This approach wasn’t a problem before Google’s Panda algorithm update; now, Google can penalize websites that duplicate product pages across different URLs.


There are several ways to get around on-site duplication:

  • Master URLs. No matter what path a user takes in the catalog, the CMS must always return only one URL for a particular product. All product variations should be represented on one page reachable via one URL, so that the user is not redirected to other pages. This approach eliminates content duplication and ensures that your site’s Crawl Budget is used wisely.
  • Canonicalization. This technique does solve the duplicate content problem, but it can have drawbacks in terms of user experience and crawl budget. See Google’s Canonicalization guide for detailed info.
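For reference, the canonical tag goes in the <head> of every duplicate URL and points at the one master URL you want indexed. A sketch (the domain and path are examples):

```html
<link rel="canonical" href="https://www.example.com/tvs/samsung-ue42/">
```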


Out of stock and discontinued items
Create search-engine-friendly pages for unavailable products

Clearly, there are times when your store runs out of a certain product — or even discontinues an item completely. These two cases should be handled differently.

1. Create smart pages for temporarily unavailable products

If an item is temporarily unavailable, removing the page is not an option. The page should clearly state that the product is out of stock, and provide all the relevant information the visitor may need to decide whether to wait until the item arrives or order an alternative from you.

  • Include the item’s planned arrival date. This will help the visitors decide whether they’re ready to wait until the item is available, or if they should look for alternatives.
  • Offer an opportunity to get a notification when the item arrives. Even if you don’t know when the item is going to be available, it’s a good idea to give your visitors an option to get notified via email when it’s back in stock.
  • Give visitors a preorder option. If you’re positive the item is going to be available soon, let users preorder it. This will assure your customers that when the product is in stock, they will be the first to receive it.
  • Add a list of similar products. When you can, offer visitors alternative options to make sure they purchase from you and don’t go to competitors instead.

2. Choose how you’ll handle permanently discontinued products

If the item is permanently removed from sale, you have several options to deal with its product page.

  • Return a 404 page. 404 is a natural way to remove pages from the search engine index; the overall rankings of the website will not be affected. Make sure to remove 404 pages from your site’s XML sitemap — this will send a strong signal to the search engines that the page should be removed from the index. This approach is suitable for pages that don’t have a lot of backlinks and don’t serve as an entrance point to the website. If the page ranks well for some queries though, consider other options.
  • Create a 301 redirect to a similar item or relevant product category. The redirect will help you save link juice; on the downside, 301 redirects can increase load time and confuse the visitor.
  • Keep the product page, but state that the item is discontinued and offer an alternative. In this way, you will preserve the link juice and the page’s rankings. However, this option is not recommended if the online store’s inventory changes often — you don’t want to end up with thousands of ghost products wasting your Crawl Budget.
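On an Apache server, for instance, the 301 for a discontinued product can be a single line in .htaccess (the paths below are examples; other web servers have equivalent directives):

```apache
# Permanently redirect the retired product page to its category
Redirect 301 /products/discontinued-tv /tvs/
```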


Use pagination properly to avoid duplication and indexing problems

Pagination is the practice of segmenting a piece of content into multiple pages. On an e-commerce website, pagination can create a series of very similar pages with overlapping content. If the pagination bar on your site only includes a few pages, and each number representing a subsequent page is visible and clickable, this will not usually pose a problem. Here’s an example:

But if the number of pages grows beyond a certain point, the pagination bar will display only a couple of initial pages and a few final pages. The in-between pages won’t be linked to from the main page — as a result, they will be crawled by search engines less often.

This issue may be addressed in two ways:

  • Add a View All option. Consider adding a page that contains the products from all pages. In this scenario, each split page should contain the rel=”canonical” link pointing to the view all page. See Google’s blog post for a detailed how-to.
  • Add rel=”next” and rel=”prev” tags. These tags can be used inside the <head> tag of a page to indicate next and previous pages in a series. The first page will only have a rel=”next” tag, and the last one — just a rel=”prev” tag, while the pages in-between will contain both. These tags give Google a hint to treat the split pages as one. This approach will help you consolidate backlinks, and Google will likely display only the most relevant page (the first one) in SERPs. For more information on rel=”next” and rel=”prev”, see this post on Google Webmaster blog.
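For instance, the <head> of page 2 in a paginated category series would carry both tags (the URLs are examples):

```html
<link rel="prev" href="https://www.example.com/tvs/?page=1">
<link rel="next" href="https://www.example.com/tvs/?page=3">
```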


Site Speed
Optimize your pages’ load time for better rankings and user experience

Site speed is a factor that has a double effect on e-commerce websites. A slow website is poor user experience; poor user experience often translates into lower sales. Site speed is a ranking factor, too; fast loading pages get an advantage over slower ones in search results.

First, you’ll need to test your main landing pages to make sure there are no speed issues. You can do that quickly with WebSite Auditor.

1. Create or open a WebSite Auditor project for your site.
2. Go to the Content Analysis module.
3. Select a page you want to test, enter your keywords, and proceed with the next steps as necessary.

Along with other content and technical info, the software will run a detailed page speed test. See the Page speed (Desktop) section and make sure your page is free from any issues that may be slowing it down.

Here are the 5 top things that affect page speed and are often ignored by e-commerce sites.

  • Eliminate unnecessary redirects. Very often websites redirect visitors from the non-www version to the www version, and then to the mobile version or a user-friendly URL. Eliminate such intermediate redirects whenever you can safely do that.
  • Optimize product images. E-commerce websites usually have a lot of product images, which make up the largest share of the traffic payload. Make sure all the images are optimized and compressed, and consider using smaller images with an option to open a larger version.

  • Enable browser caching. E-commerce website visitors will typically view many pages per session. You do not want them to load the unchanged content again and again, do you?
  • Prioritize the load of visible content for pages that have a scroll bar.
  • Avoid JavaScript that blocks page rendering. It will cause the user’s browser to wait for the script to load before loading the page itself.
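To illustrate browser caching on an Apache server, here is a sketch using mod_expires (the cache lifetimes are just a starting point; tune them to how often your assets actually change):

```apache
<IfModule mod_expires.c>
  ExpiresActive On
  # Static assets rarely change, so let returning visitors reuse them
  ExpiresByType image/jpeg "access plus 1 month"
  ExpiresByType text/css "access plus 1 month"
  ExpiresByType application/javascript "access plus 1 month"
</IfModule>
```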


Deliver a great user experience across devices

50% of Google search traffic is mobile. About 60% of consumers use mobile devices to make purchase decisions. If you are promoting an e-commerce website, you can’t neglect this huge audience.

Just like site speed, a poor user experience on mobile devices may result in lower sales and negatively influence your rankings.

1. Go mobile if you haven’t already

If you haven’t taken your site mobile yet, you’ll need to start with choosing the right technology. There are three major options: dynamic serving, separate mobile pages, or responsive design.

For e-commerce sites, responsive design is perhaps the best way to go. Here are some benefits of this option:

  • Same URL for mobile and desktop versions of pages. Using a single URL for a piece of content makes it easier for users to interact with, share, and link to that content. Such pages are also easier for search engines to discover and index.
  • Content presentation is customizable depending on the type of device it is viewed from.
  • No redirects. Unlike with a separate mobile version of the site, responsive design requires no additional redirects. This makes for a better load time and user experience.

2. Double-check pages of a mobile site

If you aren’t sure if your page is totally mobile friendly, here’s a quick way to check that:

1. Open your WebSite Auditor project.
2. Go to Content Analysis.
3. Select the page to analyze against mobile-friendliness, and proceed with the next steps.

Once the analysis is complete, check the Page usability (Mobile) section to see if your page is fully optimized for mobile devices. Go through the factors under this section to see if you can make any improvements for your mobile visitors.


Create a secure site to win customers’ (and Google’s) trust

Search engines favor websites that securely encrypt the traffic between the site and a user. Going HTTPS is critical for e-commerce websites to protect the customers’ credit card details and other personal information.

You’ll need 2 things to go HTTPS: a dedicated IP and an SSL certificate. To get a dedicated IP, contact your hosting provider. Getting a certificate is no big deal either — there are a lot of SSL certificate providers, Comodo and GeoTrust to name a few. Once you’ve installed the certificate, remember to test whether it’s configured correctly with this tool by SSL Labs.

There are some common pitfalls to avoid when transferring to HTTPS.

  • If your website uses a content delivery network, third-party scripts, or APIs, make sure they support HTTPS. Otherwise, visitors will get errors on page load or notifications that only part of the content is encrypted.
  • Make sure all internal links point to the HTTPS version of the website. If your web developers use absolute links, you’ll definitely have to fix those.
  • Configure redirects from the HTTP to the HTTPS version properly. Poor redirects are a common issue with HTTPS — especially if only some parts of your website are encrypted.
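On nginx, for example, a clean one-hop redirect from HTTP to HTTPS looks like this (the domain is an example):

```nginx
server {
    listen 80;
    server_name www.example.com example.com;
    # Send every HTTP request to the HTTPS version in a single hop
    return 301 https://www.example.com$request_uri;
}
```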


Crawl Budget
Make sure search engines can crawl pages that matter for SEO

Crawl budget is the number of pages of a website that search engines can crawl per day. The value is different for every site, as crawl budget is closely related to the authority of the website. This metric is especially important for e-commerce websites with large inventories. While you cannot make search engines crawl more pages, you may facilitate their work by removing clutter on their way.

  • Get rid of duplicate pages. Sure, you can deal with duplicate content via rel=”canonical”, but crawling duplicate content still wastes your crawl budget and slows down the discovery of fresh content.
  • Prevent indexation of useless (in terms of SEO) content. Privacy policy, terms and conditions, and expired promotions are good candidates for a Disallow rule in robots.txt.
  • Fix broken links. Hitting broken links wastes your crawl budget — and doesn’t take search engine bots anywhere useful.
  • Keep your XML sitemap up to date, and make sure to register XML sitemaps in Google Search Console.
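Putting the robots.txt advice above together, a sketch with example paths (adjust these to your site’s actual URL structure):

```text
User-agent: *
# No SEO value here, so don't spend crawl budget on these pages
Disallow: /privacy-policy/
Disallow: /terms/
Disallow: /promotions/expired/

# Point crawlers at the sitemap registered in Search Console
Sitemap: https://www.example.com/sitemap.xml
```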

This article first appeared on SEOPowerSuite’s Web site.
I strongly recommend this site to anyone wanting to better understand SEO. 

Actionable guide to SEO in 2016

Written by   in 

The rule is simple — search engines won’t rank your site unless they can find it. So, just like before, it is extremely important to make sure search engines are able to discover your site’s content — and that they can do that quickly and easily. And here’s how.

1. Keep a logical site structure

Good practice
  • The important pages are reachable from the homepage.
  • Site pages are arranged in a logical tree-like structure.
  • The names of your URLs (pages, categories, etc.) reflect your site’s structure.
  • Internal links point to relevant pages.
  • You use breadcrumbs to facilitate navigation.
  • There’s a search box on your site to help visitors discover useful content.
  • You use rel=next and rel=prev to convert pages with infinite scrolling into paginated series.
Bad practice
  • Certain important pages can’t be reached via navigational or ordinary links.
  • You cram a huge number of pages into one navigation block — an endless drop-down menu or something like this.
  • You try to link to each & every inner page of your site from your homepage.
  • It is difficult for users to go back and forth between site pages without resorting to Back and Forward browser buttons.
An example of a logical site structure:

An example of a clean URL structure:


2. Make use of the XML sitemap & RSS feeds

The XML sitemap helps search bots discover and index content on your site. This is similar to how a tourist would discover more places in an unfamiliar city if they had a map.

RSS/Atom feeds are a great way to notify search engines about any fresh content you add to the site. In addition, RSS feeds are often used by journalists, content curators and other people interested in getting updates from particular sources.

Google says: “For optimal crawling, we recommend using both XML sitemaps and RSS/Atom feeds. XML sitemaps will give Google information about all of the pages on your site. RSS/Atom feeds will provide all updates on your site, helping Google to keep your content fresher in its index.”

Good practice
  • Your sitemap/feed includes only canonical versions of URLs.
  • While updating your sitemap, you update a page’s modification time only if substantial changes have been made to it.
  • If you use multiple sitemaps, you add a new sitemap only when your current sitemaps have reached the URL limit (50,000 URLs per sitemap).
  • Your RSS/Atom feed includes only recently updated items, making it easier for search engines and visitors to find your fresh content.
Bad practice
  • Your XML sitemap or feed includes the URLs search engines’ robots are not allowed to index, which is specified either in your robots.txt, or in meta tags.
  • Non-canonical URL duplicates are included into your sitemap or feed.
  • In your sitemap, modification time is missing or is updated just to “persuade” search engines that your pages have been brought up to date, while in fact they haven’t.

SEO PowerSuite tip:
Use XML sitemap builder in WebSite Auditor

3. Befriend Schema markup

Schema markup is used to tag entities (people, products, events, etc.) in your pages’ content. Although it does not affect your rankings, it helps search engines better interpret your content.

To put it simply, a Schema template is similar to a doorplate — if it says ‘CEO Larry Page’, you know whom to expect behind the door.

Good practice
  • Review the list of available Schemas and pick the Schemas to apply to your site’s content.
  • If it is difficult for you to edit the code on your own, use Google’s Structured Data Markup Helper.
  • Test the markup using Google’s Structured Data Testing Tool.
Bad practice
  • You use Schemas to trick search engines into believing your page contains the type of info it doesn’t (for example, that it’s a review, while it isn’t) — such behavior can cause a penalty.
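As an illustration, here is Product markup in JSON-LD form, which Google’s Structured Data Testing Tool can validate (all values are examples):

```html
<script type="application/ld+json">
{
  "@context": "https://schema.org",
  "@type": "Product",
  "name": "Samsung UE42 TV",
  "offers": {
    "@type": "Offer",
    "price": "499.00",
    "priceCurrency": "USD"
  }
}
</script>
```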

4. Leverage rich answers

In 2015, we observed growth in the number of rich answers in Google search results. There are various types of rich answers, but basically, a rich answer is a snippet that already contains a brief answer to the search query. It appears above the other organic search results and thus enjoys more exposure.

Any website has a chance to be selected for rich answers. Here are a few things you can do to increase your chances of getting there:

1) Identify simple questions you might answer on your website;
2) Provide a clear direct answer;
3) Provide additional supporting information (like videos, images, charts, etc.).

CHAPTER 2 Master Panda survival basics

“Panda” is a filter in Google’s ranking algorithm that aims to sift out pages with thin, non-authentic, low-quality content. This means getting rid of thin content and duplicate content should be high up on your 2016 to-do list.

1. Improve content quality

Good practice
  • These days, it’s not enough to keep your content unique in a sense that it passes the plagiarism test. You need to create really useful, expert-level content and present it in the most engaging form possible.
  • You block non-unique or unimportant pages (e.g. various policies) from indexing.
Bad practice
  • Your website relies on “scraped” content (content copied from other sites with no extra value added to it). This puts you at risk of getting hit by Panda.
  • You simply “spin” somebody else’s content and repost it to your site.
  • Your website includes too many pages with little textual content.
  • Many of your site’s pages have duplicate or very similar content.
  • You base your SEO strategy around a network of “cookie-cutter” websites (websites built quickly with a widely used template).

2. Make sure you get canonicalization right

Canonicalization is a way of telling search engines which page should be treated as the “standardized” version when several URLs return virtually the same content.

The main purpose of this is to avoid internal content duplication on your site. Although not a huge offense, this makes your site look messy — like a wild forest in comparison to a neatly trimmed garden.

Good practice
  • You mark canonical pages using the rel=”canonical” attribute.
  • Your rel=”canonical” is inserted in either the <head> section or the HTTP header.
  • The canonical page is live (doesn’t return a 404 status code).
  • The canonical page is not restricted from indexing in robots.txt or by other means.
Bad practice
  • You’ve got multiple canonical URLs specified for one page.
  • You’ve got rel=”canonical” inserted into the <body> section of the page.
  • Your pages are in an infinite loop of canonical URLs (page A points to page B, page B points to page A). In this case, search engines will be confused by your canonicalization.

SEO PowerSuite tip:
Use WebSite Auditor to check your pages for duplicate rel=”canonical” code

CHAPTER 3 Learn to combat Penguin

Google’s Penguin filter aims at detecting artificial backlink patterns and penalizing sites that violate its quality guidelines in regards to backlinks. So, keeping your backlink profile looking natural is another key point to focus on in 2016.

Good practice
  • Your website mostly has editorial links, earned due to others quoting, referring to or sharing your content.
  • Backlink anchor texts are as diverse as reasonably possible.
  • Backlinks are being acquired at a moderate pace.
  • Spam, low quality backlinks are either removed or disavowed.
Bad practice
  • Participating in link networks.
  • Having lots of backlinks from irrelevant pages.
  • Insignificant variation in link anchor texts.

SEO PowerSuite tip:
Check backlinks’ relevancy with SEO SpyGlass

SEO PowerSuite tip:
Detect spammy links in your profile

CHAPTER 4 Improve user experience

Quite a few UX-related metrics have made their way into Google’s ranking algorithm over the past years (site speed, mobile-friendliness, the HTTPS protocol). Hence, striving to improve user experience can be a good way to up your search engine rankings.

1. Increase site speed

There are quite a few factors that can affect page loading speed. Statistically, the biggest mistakes site owners make are using huge images and loading large-volume multimedia or other heavy design elements that make the site as slow as a snail.

Use Google’s PageSpeed Insights to test your site speed and to get recommendations on particular issues to fix.

SEO PowerSuite tip:
Optimize your pages’ loading time with WebSite Auditor

2. Improve engagement & click-through rates

The Bing and Yahoo! alliance, as well as Yandex, have officially confirmed they consider click-through rates and user behavior in their ranking algorithms. If you are optimizing for any of these search engines, it’s worth trying to improve these aspects.

While Google is mostly silent on the subject, striving for greater engagement and higher click-through rates tends to bring better rankings as well as indirect SEO results in the form of attracted links, shares, mentions, etc.

3. Consider taking your site HTTPS

In August 2014, Google announced that HTTPS usage is treated as a positive ranking signal.

Currently there is not much evidence that HTTPS-enabled sites outrank non-secure ones. The transition to HTTPS is somewhat controversial, because:

a) Most pages on the Web do not involve the transfer of sensitive information;
b) If performed incorrectly, the transition from HTTP to HTTPS may harm your rankings;
c) Most of your site’s visitors do not know what HTTP is, so transferring to HTTPS is unlikely to give any conversion boost.

4. Get prepared for HTTP/2

HTTP/2 is a new network protocol that should replace the outdated HTTP/1.1. HTTP/2 is substantially faster than its predecessor. In terms of SEO, you would probably be able to gain some ranking boost due to the improved website speed.

On November 6, 2015, John Mueller announced in a G+ hangout that Googlebot would soon be able to crawl HTTP/2 websites. At the time of writing, about 70% of web browsers support HTTP/2. You can keep track of browser support for HTTP/2 on “Can I Use”.

HTTP/2 is likely to become a “must” soon. Thus, keep an eye on the issue and be ready to implement this feature when required.
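
If your server already runs HTTPS, enabling HTTP/2 can be as small as a single directive. A sketch for nginx 1.9.5 or later (certificate paths are placeholders):

```nginx
# Serve the site over TLS with HTTP/2 enabled (requires nginx built with the http_v2 module)
server {
    listen 443 ssl http2;
    server_name example.com;
    ssl_certificate     /etc/ssl/certs/example.com.crt;
    ssl_certificate_key /etc/ssl/private/example.com.key;
}
```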

CHAPTER 5 Be mobile-friendly

The number of mobile searches may soon exceed the number of desktop searches. With this in mind, search engines in general and Google in particular love mobile-friendly websites.

Mobile-friendliness has become a minor ranking factor for the mobile SERPs. You can test if your website is mobile-friendly using Google’s Mobile-Friendly Test.

On October 7, 2015, Google introduced the Accelerated Mobile Pages Project (AMP). As the name implies, it aims to provide a more streamlined experience for mobile users. The technology consists of three elements: special HTML markup, AMP JavaScript, and a content distribution layer (the latter is optional). AMP search is currently available only on mobile devices. You may give it a try at g.co/ampdemo.

Good practice
  • Your page’s content can be read on a mobile device without zooming.
  • You’ve got easy-to-tap navigation and links on your website.
Bad practice
  • You are using non-mobile-friendly technologies like Flash on your webpages.
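
Much of basic mobile-friendliness comes down to a proper viewport declaration plus CSS that adapts to narrow screens. A minimal sketch (the selector and breakpoint are illustrative):

```html
<!-- Tell mobile browsers to render at device width instead of a zoomed-out desktop view -->
<meta name="viewport" content="width=device-width, initial-scale=1">

<style>
  /* Turn navigation links into large, easy-to-tap blocks on narrow screens */
  @media (max-width: 480px) {
    nav a { display: block; padding: 12px; }
  }
</style>
```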

SEO PowerSuite tip:
Use mobile-friendly test in WebSite Auditor

CHAPTER 6 Earn social signals — the right way

Search engines favor websites with a strong social presence. Your Google+ posts can make it into your Google connections’ organic search results, which is a great opportunity to drive extra traffic. Although the likely effect of Twitter or Facebook links on SEO hasn’t been confirmed, Google has said it treats social posts (that are open for indexing) just like any other webpage, so the hint here is clear.

Good practice
  • You attract social links and shares with viral content.
  • You make it easy to share your content: make sure your pages have social buttons, check which image/message is automatically assigned to the post people share.
Bad practice
  • You are wasting your time and money on purchasing ‘Likes’, ‘Shares’ and other sorts of social signals. Both social networks and search engines are able to detect accounts and account networks created for trading social signals.
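
To control which image and message get assigned to a shared post, most social networks read Open Graph meta tags from the page’s head section. A minimal sketch (URLs and text are placeholders):

```html
<!-- Open Graph tags: what social networks display when this page is shared -->
<meta property="og:title" content="Your page title">
<meta property="og:description" content="A one-sentence summary of the page">
<meta property="og:image" content="https://example.com/share-image.jpg">
<meta property="og:url" content="https://example.com/page">
```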

SEO PowerSuite tip:
See your site’s social signals in SEO PowerSuite

CHAPTER 7 Revise your local SEO plan

In August 2015, Google reduced the number of results in the local pack from 7 to 3 and removed addresses and phone numbers. The search engine made it harder for SEOs to get to the local pack; however, a new map view has been added with up to 20 spots for the search results.

What has changed is that local rankings are now more dependent on the user’s IP address. You can read more on how to adjust your local SEO strategy to Google’s update in this guide.

SEO PowerSuite tip:
Check website authority in SEO PowerSuite

What’s coming in SEO in 2016?

Here are the main SEO trends for 2016, as predicted by our in-house SEO team:

SEO remains part of multi-channel marketing
Customers can find your business through social, paid search, offline ads, etc. Organic search is an integral part of a complex path to conversion. Just be aware of these other channels and get savvy in additional spheres, if necessary.

Google now gets searcher intent & context
The keyword is no longer the gravity center of a ranking. Google now also looks at synonyms, your niche connections, location, etc. to see if you fit the bill (=query). The good news is that you don’t need to get too hung up on specific keywords to prove you’re relevant to a query.

The end of search monopoly might be near
According to comScore, both Yahoo! and Bing steadily increased their search market share throughout 2015.

No quick results, no easy SEO
With its latest iterations of Panda & Penguin and the penalties Google dished out to link networks, the search engine widened its net for spam, and became even better at detecting sites with unengaging content or unnatural link patterns.

More and more traffic stays on Google
Google has definitely stepped up its effort to provide immediate answers to people’s searches. And, with the increasing number of rich answers, this tendency for stealing traffic from publishers will likely increase.

Paid search expansion
A few years ago, Google changed Google Shopping’s organic model to pay-per-click. It is possible that Google will make yet another organic vertical paid. Local Search is the best candidate for the change, since Local Search results are dominated by businesses selling a product.


By: Yauhen Khutarniuk
Head of SEO at SEO PowerSuite

This article first appeared on SEO PowerSuite’s website.
I strongly recommend this site to anyone wanting to better understand SEO. 

Quality and Google SERPs ~ The place of quality in Google’s ranking algorithm


What do we know about how Google ranks webpages in its SERPs? A lot. But very little for sure. Still, I guess everyone can agree that there are 2 major factors in play:

Relevance and quality.

To identify relevance, Google looks at how well the page answers the searcher’s question or fulfills the purpose of the query. And while this is an undoubtedly complex process, it’s a comprehensible one: Google will look at your page and your entire website in terms of keyword-related features, like keyword usage and topic relevance. Perhaps they’ll also look for keywords and semantically related concepts in the anchor text of links pointing to your page.

For most queries, this analysis will produce thousands of webpages that meet the relevance criteria, which Google needs to arrange in a certain order before they are displayed to searchers, ensuring that the best results appear at the top. This is where quality comes in.

But what exactly does Google mean by “quality”? The term seems incredibly (perhaps purposely) vague. But if you dig a little beneath the surface, quality becomes interesting. The concept, it turns out, has to do with many things beyond the website itself. And beyond backlinks, too.

Back in November, Google revealed their latest Search Quality Rating Guidelines, a 160-page read of “what Google thinks search users want”. This document is used by Google’s quality evaluators who rate webpages in SERPs; based on their feedback, Google can develop changes to their ranking algorithms.

That’s right. Human beings sit down, type queries into the Google search bar, and rate search results according to these guidelines so that Google can improve the quality of its SERPs.

In this article, we’ll look at factors, or features, that make a site a high quality one, and dive a little deeper to explore how Google may be weighing those — and what you can do to improve on them.

But before we get down to the factors themselves, it’s important to note that there are different standards for different types of pages.

Your money or your life!
Google’s quality standards for different types of pages

There’s one type of page Google holds to extremely high standards. These pages are labeled, perhaps a little too humorously, “Your Money or Your Life” pages; they are the types of webpages that can impact the “future happiness, health, or wealth of users”.

Understandably, YMYL pages are financial, legal, and medical information pages. But also…

“Shopping or financial transaction pages: webpages which allow users to make purchases, transfer money, pay bills, etc. online.”

That’s right: if your site sells anything online, then welcome to the YMYL club. Chances are you’ll need to try hard to prove you’re trustworthy, reputable, and authoritative enough to be displayed within the top search results.

But that doesn’t mean that you can sit back and relax if your site isn’t an online store. While your transactional peers may be judged more strictly, you still have the same criteria to meet to qualify for a high quality resource — only at a different level.

Thankfully, Google does give us a few hints on what it expects from quality sites — and it turns out, there’s a lot you can do to improve your quality score. Let’s get down to the very factors that determine whether your site is deemed high quality or not.

Main content
Write it well, place it right, size it smart

Google divides the content of every webpage into main and supplementary content (and, optionally, ads), main content being the part of the page that “helps it achieve its purpose”. In the guidelines, Google is telling raters what most of us already know. Content is king.

“For all types of webpages, creating high quality MC [main content] takes a significant amount of at least one of the following: time, effort, expertise, and talent/skill.”

According to Google, the way content is placed on a page is also important. The following characteristics are typical of functionally designed pages:

  • The main content should be prominently displayed “front and center.”
  • The main content should be immediately visible when a user opens the page.
  • The design, organization, use of space, choice of font, font size, background, etc., should make the main content very clear.

And it’s not just the quality and placement of the page’s content that matters; its amount also plays a part. And while there’s no universal, one-size-fits-all content length, Google encourages raters to use their judgement to determine whether the content length on a given page is right for the query in question and the purpose of the page.

But let’s dive a little beyond the guidelines. No magic formula for word count will put your site at the top of Google, but… here’s some interesting data from serpIQ’s study of how content length correlates with Google rankings (the experiment analyzed the top 10 search results for over 20,000 queries).

The study found that, on average, Google’s top ranking pages have at least 2,000 words of content. And yet… if you run a few experiments yourself, the high and low points of those averages turn out to be incredibly far apart. If we take quick informational queries (like, say, ‘retention definition’) and broader ones (‘what’s the ideal length of a blog post’) where the searcher, perhaps, is looking for an in-depth article, we’ll end up with very different word count averages within the top 10 listings: 846 and 5,030 words respectively, to be exact.

So how do you determine an ideal content length for a specific page, niche, and the keyword you’re targeting? Hint: you look at your top ranking competitors.
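
One simple way to turn competitor word counts into a target is to aim at a high percentile of the current top 10. A Python sketch (the counts are sample data standing in for what you’d collect from the ranking pages):

```python
def target_word_count(competitor_counts, pct=0.75):
    """Suggest a content length from the word counts of top ranking
    competitors: the length that matches or beats `pct` of them."""
    ranked = sorted(competitor_counts)
    idx = min(int(len(ranked) * pct), len(ranked) - 1)
    return ranked[idx]

# Word counts of the current top 10 for your target query (sample data)
top10 = [640, 820, 910, 1150, 1300, 1420, 1800, 2050, 2300, 3100]
print(target_word_count(top10))  # 2050
```

Aiming at the 75th percentile rather than the maximum keeps the target realistic for the niche.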

By: Masha Maksimava

This article first appeared on SEO PowerSuite’s website.
I strongly recommend this site to anyone wanting to better understand SEO. 

Panda-Proof Content Audit for Your Site ~ 6 steps to make sure your content matches Google’s quality criteria


First, the news broke out that Google Panda (a special “filter” designed to de-rank low quality content) is now a part of the search engine’s core ranking algorithm and Panda updates will probably start to roll out faster and more regularly.

Second, over the past two weeks, we’ve seen some major changes in Google’s search results. And though confirmed to be unrelated to Panda, these massive SERP turbulences also seem to be in connection with content quality.

It looks like in 2016, the “quality” of your content is not just an empty word, but something that you NEED to optimize your site for. And today we’ll dive deeper into what the Panda quality algorithm is and how to run a thorough Panda-proof content audit.


What’s Panda & how does it affect SERPs?

The Panda algorithm (named after Google engineer Navneet Panda) is designed to help Google improve the quality of search results by down-ranking low quality content.

The basic principle here is that Google assigns a particular quality score to each website in its index (the score is assigned site-wide, not to separate pages.)

Initially, Panda functioned as a filter applied to the pack of search results Google considered relevant to a search query. The Panda score re-ordered them, pushing down the low scorers and giving a boost to the highest scored content.

Now that Panda signals are “baked” into Google’s core ranking algorithm, they no longer re-order the results; instead, they shape them together with Google’s other ranking signals.

How can Panda identify high quality content?

Sure thing, there’s no “gut feeling” that helps Panda identify real quality. Panda is just an algorithm that checks your website for a number of factors Google assumes are typical of a high quality website. Then, by applying some math, it gives the site a specific quality score based on the results of this check.

The good news is, if your site’s quality score is based on a number of separate factors, you can influence those factors to improve the score.

The bad news is… Google won’t disclose the exact quality factors it takes into account to calculate the score. So the list of Panda-prone issues below is an educated guess, based on what Google has said on site quality, and what trackable factors it can use to determine it.

What are Panda roll-outs & will they get more frequent?

Sure thing, a quality score is not something you can assign to a website once and for all. Websites change, content gets added and removed, and, clearly, the score needs to be updated every now and then to stay relevant.

Refreshing the score non-stop in real time would take too much computing power; that is why, up till now, Google has launched “Panda updates” once every couple of months, recalculating website quality scores and thus changing the filter it applies to the search results.

Will that change now that the Panda quality score is part of Google’s core ranking algorithm? It doesn’t seem so. The score is still not calculated in real time; the change only means that the algorithm has become more solid and doesn’t need to be modified much further. Since Google won’t have to test and apply new signals (it will only re-apply the existing ones), it will be possible to run Panda updates faster and more frequently.

Step 1. Decide on your clean-up strategy

The optimal way to deal with problematic content largely depends on the size of your site.

  • For a small website (<100 pages), removing low quality content is something you can hardly afford. Your key strategy is to improve every problematic page rather than delete it.
  • For a medium-sized site (100-1,000 pages), removing some of the low quality content is possible, but your main focus should be on improving content for at least the most important pages.
  • For a large website (>1,000 pages), improving all the problematic areas is a huge piece of work, so your focus should be on weeding out and removing unnecessary, low quality content.

Step 2. Check for thin content

Imagine you have a category page with only a few lines of meaningless text and hundreds of links to products. This is what’s generally called thin content.

Search engines use content to determine the relevancy of a page for a query. And if you barely provide any information that’s accessible to them, how are they to understand what the page is about?

Surely, quality is not all about word count, since there are cases when you can deliver value in a few hundred words. That is why there’s no “minimum word count” threshold that triggers a low Panda quality score. What’s more, pages with a little over a hundred words sometimes do exceptionally well on Google and even get included in its rich answers.

Still, having too many thin content pages will very likely get you into trouble, so, on average, a word count under 250 words is a good indicator for locating problematic spots across your site.
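
Locating thin pages is easy to automate. A Python sketch (the sample pages stand in for your crawled site) that counts the visible words on each page and flags those under the threshold:

```python
import re
from html.parser import HTMLParser

class TextExtractor(HTMLParser):
    """Accumulates the visible text of a page, ignoring scripts and styles."""
    def __init__(self):
        super().__init__()
        self.chunks = []
        self.skip = 0

    def handle_starttag(self, tag, attrs):
        if tag in ("script", "style"):
            self.skip += 1

    def handle_endtag(self, tag):
        if tag in ("script", "style") and self.skip:
            self.skip -= 1

    def handle_data(self, data):
        if not self.skip:
            self.chunks.append(data)

def word_count(html):
    parser = TextExtractor()
    parser.feed(html)
    return len(re.findall(r"\w+", " ".join(parser.chunks)))

def thin_pages(pages, threshold=250):
    """Return URLs whose visible word count falls below the threshold."""
    return [url for url, html in pages.items() if word_count(html) < threshold]

# Sample crawl: URL -> page HTML
pages = {
    "/category": "<p>Buy our great products below.</p>",
    "/guide": "<p>" + "word " * 400 + "</p>",
}
print(thin_pages(pages))  # ["/category"]
```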

Step 3. Check for duplicated/very similar content

Another factor that could be a signal of your site’s low quality is duplicated or very similar content across multiple pages.

Very often, bigger sites have to deal with a huge number of pages that need to be filled with content. And many of them resort to an easy way to fill those gaps: writing boilerplate text that’s the same on each page except for a few variables. This is what Google considers automated, low quality content.

So, besides weeding out word-for-word duplicated content, pay attention to similar-looking pieces (say, page titles that are identical in structure and differ only in a product name) that may be a sign of content automation.
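
A quick way to surface such templated, near-identical pieces is to compare page titles (or snippets of body text) pairwise. A Python sketch using the standard library’s difflib (titles are sample data):

```python
from difflib import SequenceMatcher
from itertools import combinations

def similar_pairs(titles, threshold=0.9):
    """Return pairs of titles whose similarity ratio meets the threshold;
    such near-duplicates often indicate boilerplate content."""
    pairs = []
    for a, b in combinations(titles, 2):
        if SequenceMatcher(None, a, b).ratio() >= threshold:
            pairs.append((a, b))
    return pairs

titles = [
    "Buy blue widgets online - Acme Store",
    "Buy red widgets online - Acme Store",
    "How to choose a widget: a buyer's guide",
]
print(similar_pairs(titles))  # the two templated widget titles
```

For a large site, compare hashes or shingles instead; pairwise comparison grows quadratically with page count.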


Step 4. Check for aggregated content/plagiarism

What’s also synonymous with quality in Google’s eyes is the “uniqueness” of your content. As Google wants your content to add value and not simply repeat what’s already been said, having non-unique content on your website (e.g. plagiarized content, or product descriptions duplicated in feeds used for other channels like Amazon, shopping comparison sites, and eBay) is an easy way to get caught by Google’s Panda filter.

If you suspect that some of your pages may be duplicated externally on other online resources, a good idea would be to check them with Copyscape. (http://copyscape.com/)

Copyscape gives some of its data for free (for instance, comparing two specific URLs), but for a comprehensive check you may need a paid Premium account.

Step 5. Check for proper keyword usage

Keywords and keyword targeting are among the most basic and longest-running concepts in SEO. And if you’ve been in the search industry for quite some time, you may remember the days when SEO meant simply having the right words in your meta keywords tag.

Sure, those times have passed: search engines now try to detect and punish websites that deliberately use too many keywords in their content.

However, whether Google admits it or not, its algorithms are still built upon keywords. And having a keyword in your title tag DOES improve your page’s rankings, meaning you simply can’t afford not to optimize pages for keywords.

So, the only ticklish question here is, “How many is too many?” And one of the ways to check is to look at top ranking competitors (because the sites that rank in the top 10 are the sites that pass Google’s quality test with an A+).
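
The competitor check can be as simple as comparing keyword densities. A Python sketch (the texts are toy samples standing in for the pages ranking in the top 10):

```python
import re

def keyword_density(text, keyword):
    """Share of words in `text` that are the (single-word) target keyword."""
    words = re.findall(r"\w+", text.lower())
    return words.count(keyword.lower()) / len(words) if words else 0.0

# Body text of the current top ranking pages for the query (sample data)
competitor_texts = [
    "seo guide covering seo basics and general marketing advice",
    "a marketing guide with one seo chapter and much more",
]
benchmark = sum(keyword_density(t, "seo") for t in competitor_texts) / len(competitor_texts)

my_page = "seo seo seo tips: seo for seo beginners doing seo"
print(keyword_density(my_page, "seo") > benchmark * 2)  # True: likely over-optimized
```

If your page’s density sits far above the competitor benchmark, it probably reads as stuffed to both users and algorithms.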

Remember the Hummingbird algorithm update? The one with which Google learned to recognize the meaning behind a search query and give a common answer to a number of “different-in-keywords” but “same-in-meaning” queries?

This update changed the way SEOs optimize pages — now we no longer think “single keyword optimization”, but try to make our pages relevant for a whole group of synonyms and related terms.

So, adding related keywords and synonyms will help you improve your pages’ rankings and avoid keyword stuffing issues.

Step 6. Check for user engagement metrics

Though Google generally states that user experience signals are not included in its search ranking algorithm, many experiments show the opposite. And one of the metrics SEOs suspect Google of using is bounce rate.

Think about it — as Google tries to bring users the best search experience, it obviously wants them to find what they were looking for with the first search result they click on. The best search experience is one that immediately lands the searcher on a page that has all the information they need, so that they don’t hit the back button to return to the SERP and look for other alternatives.

Bouncing off pages quickly to return to the SERP and look for other results is called pogo-sticking, and it can be easily measured in terms of bounce rates.
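
From your own analytics data, the metric is straightforward to compute. A Python sketch (the sessions are sample data; a “bounce” here is a single-pageview session):

```python
def bounce_rate(sessions):
    """Share of sessions that viewed exactly one page;
    each session is a list of pageviews."""
    if not sessions:
        return 0.0
    bounced = sum(1 for s in sessions if len(s) == 1)
    return bounced / len(sessions)

# Sample session logs: the pages viewed during each visit
sessions = [
    ["/landing"],                     # bounced back to the SERP
    ["/landing", "/pricing"],
    ["/blog/post"],                   # bounced
    ["/landing", "/docs", "/signup"],
]
print(f"{bounce_rate(sessions):.0%}")  # 50%
```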

What else to check?

  1. Check for user-generated content issues

User-generated content and how it affects Panda has been a hot topic recently, and it has gotten to the point where many SEOs are recommending to get rid of all user-generated content, claiming that Google sees it as a signal of poor site quality.

This is far from being true, because we’re still seeing lots of websites based purely on user-generated content (think Quora) that are doing well on Google.

However, user-generated spam — for instance, irrelevant comments on your blog or poorly moderated forum pages — can put your site into trouble.

So if your website features user-generated content, make sure improving your moderation strategy is a priority.

  2. Check for grammar mistakes

Bad spelling and grammar can both impede user experience and lower the trustworthiness of your content in Google’s eyes, so don’t tempt fate by leaving obvious grammar errors on your pages. The easiest way to run a spellcheck on your content is to copy the text and paste it into a word processor, which will highlight the spelling mistakes so you can update the content.
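
A full spellchecker is beyond a few lines, but some frequent slips are easy to catch automatically, for example accidentally doubled words. A minimal Python sketch:

```python
import re

def doubled_words(text):
    """Find accidental word repetitions like 'the the',
    a mistake most word processors also flag."""
    return [m.group(1) for m in re.finditer(r"\b(\w+)\s+\1\b", text, re.IGNORECASE)]

sample = "Make sure that that your content is is proofread before publishing."
print(doubled_words(sample))  # ['that', 'is']
```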

By: Katherine Stepanova
Head of Marketing at SEO PowerSuite

This article first appeared on SEO PowerSuite’s website.
I strongly recommend this site to anyone wanting to better understand SEO.