Even if user behavior isn’t a ranking factor yet, it will surely be one soon


This article first appeared on SEOPowerSuite.

Whether or not user behavior factors affect rankings is a controversial topic: Google denies it, while many experiments prove the opposite.

In any case, even if user signals aren’t influencing your rankings right now, common sense and logic say they are the ranking factors of tomorrow, so it may be wise to get ready today.

Rand Fishkin presented his two-algorithm concept of SEO, suggesting that search marketers combine classic, Google-oriented SEO with the new, searcher-oriented kind.

In a two-algorithm world, we need to focus on 5 new elements of SEO — the so-called searcher outputs.

1. Click-through rates. With behavior factors in play, your CTR will be one of the things that determine how you rank — so revising your titles and descriptions one more time could pay off in even more ways than before.

2. Engagement. Do the searchers find on your page what they are looking for, or do they go back to click other search results? Do the searchers stay on your page, proceed to other pages, or do they bounce in just a second or two? The way users engage with your content is likely to influence your rankings, so Rand suggests a list of things to attend to for better engagement:

  • Content that fulfills the searcher’s conscious and unconscious needs;
  • Speed, speed, and more speed;
  • Delivering the best UX on every browser;
  • Compelling visitors to go deeper into your site;
  • Avoiding features that annoy or dissuade visitors.

3. Information that fills gaps in searchers’ knowledge. With the purpose of delivering a rewarding search experience, Google’s machine learning models could look at search results that people eventually land on when they search for keyword X to identify what those have in common. For example, the presence of certain words could predict more successful searches. Watching users searching for “New York”, Google could conclude that a page about New York that doesn’t mention Brooklyn or Long Island is not offering the information searchers are looking for (and hence should not rank very high).

4. Shares, links, and loyalty per visit. Though social signals aren’t officially a ranking factor (while backlinks are), experiments show that pages with lots of social activity and few links outperform ones with more links and fewer shares — even for insanely competitive keywords.

But it’s not just the bare numbers that count – Google may also be looking at how social activity grows over time, and whether or not social engagement results in loyalty and returning visits.

5. Fulfilling the searcher’s task (not just their query). Google wants searchers to complete their tasks quickly, so it’s quite possible that the ranking results will differ depending on the user intent (for instance, purchase) that Google associates with a particular query.

This article first appeared on SEOPowerSuite’s Web site.
I strongly recommend this site to anyone wanting to better understand SEO. 

Higher Google ranks no longer mean more organic clicks


This article first appeared on SEOPowerSuite. The content was provided by:

Chris Pinkerton
Vice President, Client Development (North America) at Mediative, @chrispinkerton
“With different layouts of search result pages, where searchers are conditioned to look and click changes.”

Ehren Reilly
Director of Product at Glassdoor, @ehrenreilly
“It’s no longer viable to have high-quality non-proprietary content.”

Eric Enge
CEO and Founder at Stone Temple Consulting, @stonetemple
“We ran a test, publishing 5 pieces of content that answered 5 common SEO questions. In 3 days, we saw our content included in Google’s rich answers.”

Barry Adams
Founder at Polemic Digital, @badams
“If you waste crawl budget because of slow page loading, the right pages are unlikely to be crawled & indexed.”

Adam Audette
SVP Organic Search at Merkle, @audette
“If an ecommerce site makes $100k in revenue per day, a 1-second load delay equates to $2.5M in lost revenue annually.”

Marshall Simmonds
Founder of Define Media Group Inc, @mdsimmonds
“Dark traffic is a problem for marketers because we’re not getting credit for the traffic that we’re creating. Be aware and try to analyze it.”

Christine Churchill
President and CEO of KeyRelevance LLC, @chrischurchill
“Research core phrases plus synonyms and words that frequently occur with your main terms. Then build a keyword matrix, assigning each of the keyword groups to different pages.”

Sha Menz
Link Removal Specialist, rmoov.com, @ShahMenz
“You don’t have a link penalty and think you’re safe? Don’t bet on it. You need to actively disavow and remove low-quality links that you find during link audits even if you don’t have a penalty yet.”

Rand Fishkin
Founder, Moz, @randfish
“‘Make pages for people, not engines’ is terrible advice. Engines need the things we’ve always done, and we’d better keep doing them. People need additional things, and we’d better do those too.”

Are you still assuming the #1 Google ranking guarantees you the most organic clicks?

Well, here comes the bad news… While the click-through rate (the percentage of searchers who click on a site’s listing when they see it in search results) is surely still correlated with position, lots of other factors can influence your organic clicks in Google in 2016.

Paid ads, local packs, carousel results, knowledge graphs and rich snippets — all these attention-grabbing SERP elements can drain away the clicks from your #1 ranking website.

The latest study by Mediative (which tracked searchers’ eye movements and eventual clicks across different SERPs) showed that the way searchers interact with the SERP varies a great deal from query to query.

Here’s, for example, how the views and clicks are distributed on a SERP with a local pack, a paid ads pack and a carousel:

For search marketers, this change means two things:

1) A keyword’s search volume alone is no longer a reliable metric for predicting how much organic traffic the keyword can generate for your website. Before investing effort into optimizing your site for a keyword, take a look at the search result page. Check whether SERP elements may be stealing clicks from your organic listing to get a better idea of the keyword’s traffic potential.

2) Besides trying to get higher search rankings, you need to pay attention to the extra click opportunities the SERPs give you. Can you squeeze into the local pack results? Can you use structured data to get a rich snippet listing? Should you launch a Google AdWords campaign for this particular query?

Rich answers are on the increase — that’s both a threat and an opportunity

Another major trend in search is the rise of Google’s rich answers. According to a study by Stone Temple Consulting, Google now returns rich answers for 35% of search queries. That’s a 38% increase over the past 6 months, and we are clearly in for further growth.

Quite often, rich answers are built on public data (like “President of the USA”) or data licensed to Google (like song lyrics). So if your SEO strategy was built on public domain data… you’d better change direction right now, because getting search traffic from Google will only get harder for you.

But what if you have high-quality, unique, proprietary content that can help Google answer common searchers’ questions? Well, rich answers are an opportunity for you. According to Eric Enge of Stone Temple Consulting, for 75% of rich answers Google uses external data and includes a link to its source.

Getting your page featured in a rich answer can give you a massive traffic boost — CTR for clickable rich answers is ~2X better than that for the #1 ranking page on a SERP with no rich answer. And the best thing is, getting featured in rich answers is absolutely possible even if your site’s authority is not very high yet.

So, how about pushing your site to direct answers and thus making it rank higher than Her Highness Wikipedia? 🙂

  • 1. Start with long tail keyword research — you need to identify the commonly searched questions in your niche.
  • 2. Create a piece of content that directly answers these questions. Make sure to include the question itself, and a direct answer to it — keep in mind that for rich answers, the structure of your answer is more important than your site’s relevance and authority.
  • 3. Make sure your article is truly helpful and provides additional information on the matter. This will not only increase your chances of getting featured as a rich answer, but will help you entice more clicks.
  • 4. Make your content easy to find for people and search engines (make sure it’s accessible to Google’s bots and easy to reach through your site’s navigation; share links to it on your social accounts; submit it via Google Search Console, etc.)

Page speed is utterly important — optimize it today, don’t put it off till tomorrow

First things first: page speed is a ranking factor. All other things being equal, the site that loads quicker will outrank a competing site, hands down.

Second, slow-loading pages waste your site’s crawl budget (yes, Google allocates a specific amount of time for crawling your website, and Google’s bot won’t stay on your site longer than that). For a bigger website, this means that the slower your pages load, the fewer of them get indexed by Google.

Tracking your organic traffic with Google Analytics gets even more difficult

If you’re staring in despair at your Google Analytics traffic report, unable to figure out where all this direct traffic is coming from… Your problem is “Dark traffic”.

According to Marshall Simmonds, when Google Analytics is unable to identify where your site’s visits are coming from, it records them as direct traffic. As these unidentifiable visits keep growing in number across the modern Web, your Google Analytics reports get less precise: they report direct traffic growth, while in reality you’re growing your organic, social, and mobile traffic.

This makes tracking your marketing activities even more complicated.

So, first of all, you need to be aware of the cases when your traffic goes dark. According to Marshall, the common cases are:

  • Traffic from a secure site to non-secure;
  • Traffic from image search;
  • Traffic via links in applications;
  • A big portion of traffic from Facebook, SnapChat, WhatsApp;
  • Traffic from the Android search app.

Second, though measuring the precise amount of dark traffic is impossible, you can at least get a better idea of how it affects your site. To do that, Marshall recommends that you:

  • Create a direct traffic report in Google Analytics
  • And then filter out the traffic to pages that are naturally visited “directly” — like your homepage or the front pages of important content sections which users are likely to bookmark.

Keywords are neither dead nor dying — they are still the basis of your SEO campaign

Keywords and keyword targeting are among the most basic and longest-running concepts in SEO. If you’ve been in search for quite some time, you may remember the days when SEO meant simply having the right words in your meta keywords tag.

Sure, these times have passed and will never come back: search engines now use much more complicated algorithms to determine webpages’ quality and relevancy. But does this mean keywords are dead? Experts agree — keywords and keyword research should still be the basis of your SEO and content marketing campaigns. However, Google’s Hummingbird update shifts our focus from researching separate keywords to researching groups of related terms and synonyms.

Now that Google is able to recognize the meaning behind a search query, it gives a common answer to a number of queries that differ in keywords but share the same meaning. So if you want to grab yourself a place in post-Hummingbird search results, your pages need to be relevant not only to the core term, but to a whole group of its synonyms and related terms.

The aim of your keyword research is now to identify not individual keywords, but groups of thematically connected terms for your pages to target.


SEO for e-commerce sites


This article first appeared on SEOPowerSuite.

Marketing an e-commerce website comes with very specific challenges. First, e-commerce promotion requires the marketer to balance user experience, conversion optimization, and SEO. Second, online stores are typically large and complicated, so trivial on-page optimization tasks can turn into a never-ending nightmare.

In this guide, you’ll find a set of content and technical issues anyone doing e-commerce SEO will face, and actionable how-tos and time-saving tricks on tackling them.


Keyword research
Forget generic keywords, think smart word combinations

Keyword research for e-commerce websites goes far beyond search volume and competition analysis. The reason for that is the complexity of the buying cycle — each step in it calls for different keywords you’d need to target.

Typically, customers of an online store go through 5 stages before making a purchase; different keywords and keyword phrases are used at each stage.

1. Compile 3 separate keyword lists

As you can see, keywords that customers use at the Research, Comparison, and Purchase stages offer the most value in terms of conversion potential. To get a comprehensive list of those, you’ll need to come up with 3 groups of keywords that you’ll later combine into long tail queries.

But before you start, remember to research the search patterns typical of your target audience: consider gender, age, and social status. For example, if you’re a man struggling to drive organic traffic to a skin care store, talk to your female colleagues or friends to learn the jargon they use when discussing these products. Spend some time on relevant social media resources to learn your audience’s language.

When you’re positive you understand how your customers talk and which words they use, get down to building your keyword lists.

  • Prepare a list of action keywords that customers might use at the Comparison and Purchase stages as part of their query. Don’t add the product or category names to these keywords yet.
    E.g. “buy”, “purchase”, “price”, “compare”, “review”.
  • Get a full list of brands available at your store.
    E.g. “Sony”, “Samsung”, “Apple”.
  • Compile a list of categories, product names, and their core properties, like size or color.
    E.g. “TV”, “laptop”, “smartphone”; “iPhone”, “Galaxy Note”, “34-inch display”.

2. Mix the keywords up

Once you’ve got these three lists ready, it’s time to move on to putting together search phrases. Combining generic keywords with product keywords and their properties should give you dozens of long tail keywords — like “buy 42-inch Samsung TV”. It works like a slot machine: you turn the reels and get new keyword phrases.

You can do this manually if you only need to mix up a dozen keywords. However, given the size of most online stores’ inventories, you will likely need software tools to get things done quickly.
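The combination step itself is easy to script. Here is a minimal Python sketch using `itertools.product` (the keyword lists are illustrative, not from this article):

```python
from itertools import product

# Illustrative keyword lists; replace them with your own research
actions = ["buy", "price", "review"]      # action keywords
brands = ["Samsung", "Sony"]              # brands carried by the store
products = ["42-inch TV", "laptop"]       # categories / product names

# Every action/brand/product combination becomes a long-tail candidate
long_tail = [" ".join(combo) for combo in product(actions, brands, products)]

print(len(long_tail))   # 3 * 2 * 2 = 12 combinations
print(long_tail[0])
```

Filter the resulting list for phrases that actually read naturally before adding them to your rank tracking.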

Try using Rank Tracker‘s Word Combination option to get a full list of possible long tail keywords instantly.

1. Create or open a project in Rank Tracker.
2. Click the Suggest Keywords button.
3. Select Word Combination from the available keyword research methods, and hit Next.
4. Select the number of parts to combine, enter your keywords in the columns, and click Next once more.

(By the way, it looks exactly like a slot machine!)

In an instant, you’ll get plenty of long tail keyword phrases.

Select the keywords to add to your project and hit Update KEI to get their search volume, competition, and Keyword Efficiency Index.

Voila — you’ve just saved yourself a couple of hours!


Keyword matrix
Do smart keyword targeting to avoid cannibalization

Have you heard about keyword cannibalization? In short, if several pages of your website respond to the same search query, those pages compete with each other in the SERPs. The search engines may end up ranking the page that is less suitable or important from your standpoint.

To avoid keyword cannibalization, create a keyword matrix. Fill the rows of a spreadsheet with the URLs of your site’s most important pages (most likely product category pages), and create columns for your keywords. Put a mark at each row/column intersection to assign a keyword to a page. This method will help you make sure you don’t target the same important keyword across multiple pages.

(Example matrix: keyword columns such as “samsung tv”, “toshiba tv”, and “sony tv”, with each keyword assigned to exactly one page.)

If the CMS of your online store creates separate pages for product variations such as size and color, it makes sense to keep those pages out of the index using robots.txt or a <meta name="robots" content="noindex"> tag. Canonicalization is another solution (see Google’s guidelines for detailed instructions).
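As a sketch, the two options look like this (the canonical URL is hypothetical):

```html
<!-- Option 1: keep a size/color variation page out of the index -->
<meta name="robots" content="noindex">

<!-- Option 2: point the variation page at the main product URL -->
<link rel="canonical" href="https://example.com/tv-sets/samsung-42-inch">
```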


On-page optimization
Save time on optimizing thousands of product pages

An e-commerce website typically has a limited set of category pages and thousands of product pages. Everything is more or less clear with category pages (they are usually subject to the traditional on-page SEO approach; if you are new to SEO, check out the A to Z SEO Guide for the steps).

Things get trickier when it comes to product pages. You’ll hardly have the time and resources to create unique titles, H1 tags, and descriptions for each product page.

Luckily, the slot machine approach (see the Keyword research section) can be used for meta tags just as well.

Create title, meta description and H1 templates for your product pages. For example, you may use this template for the title tag: Buy [ProductName] online | Your store name

[ProductName] is a variable that changes for every page depending on the product. If your CMS does not support variables, ask your development team for help.
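If your platform allows custom code, the substitution itself is trivial to script. A minimal Python sketch (the store name, template, and length limit are assumptions for illustration):

```python
# Hypothetical title template; swap in your own store name
TITLE_TEMPLATE = "Buy {product} online | GadgetStore"

def render_title(product: str) -> str:
    """Fill the template, truncating to the ~60 characters SERPs display."""
    title = TITLE_TEMPLATE.format(product=product)
    return title if len(title) <= 60 else title[:57] + "..."

print(render_title("Samsung 42-inch TV"))
```

The same pattern works for H1 and meta description templates.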

Do the same for your H1s and descriptions — and remember that titles and meta descriptions are displayed in your listing’s snippet in SERPs, so make sure to use strong calls-to-action to entice clicks from search results.


Make sure every page on your site is unique

Duplicate content issues for e-commerce sites fall into two categories:

  • Off-site — the content appears on many websites across the web.
  • On-site — many pages of the website feature the same content.

1. Fix off-site duplication

Off-site duplication is natural for e-commerce. Online stores often use product descriptions, images, and specifications provided by the manufacturers. This is logical, since you cannot invent new specs for the latest iPhone. However, there are several solutions to the problem.

  • Write unique descriptions for each item. If you have a team of copywriters to get the entire inventory covered — go for it. Just keep in mind that as the inventory scales up, you’ll need to keep up with the copy as well.
  • Leverage user-generated content. Create incentives for visitors to write reviews of the items they purchased. Send follow-up emails and ask for a review nicely, or offer discounts or bonuses to customers who leave a review. On the downside, there’s no guarantee that you will have a steady flow of reviews for all the items being sold. Additionally, reviews should be moderated to avoid spam or obscene language, which requires additional resources.
  • Add a Q&A section for each product. You can make your product descriptions unique by adding a FAQ section with questions customers often have about the product. Again, doing this will require additional human resources.
  • Optimize product category pages only. If you don’t have the time and resources to work on product pages, you can choose to create unique content for category pages only. In this case, it’s necessary to prevent the indexation of the product pages (using robots.txt or meta tags) — this means that the product pages will not appear in the SERPs.

2. Fix on-site duplication

On-site duplication is a frequent problem across the pages of online stores. It can be caused by the e-commerce content management system or an illogical website structure.

There are two typical scenarios. First, a product may belong to several categories, e.g. one Samsung TV set could be found in “Home”, “TVs”, and “Samsung”. The CMS may generate different URLs for the very same product depending on the path a user takes in the product catalog. For example:
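For instance, a hypothetical CMS might expose the same TV under several path-dependent URLs like these:

```
https://example.com/home/tv-sets/samsung-42-inch
https://example.com/tv-sets/samsung-42-inch
https://example.com/samsung/samsung-42-inch
```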


Second, the CMS could generate a separate URL and page for variations of one product (e.g. size, color or other specifications). This approach wasn’t a problem before Google’s Panda algorithm update; currently, Google can penalize websites for duplicated product pages across different URLs. For example:
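For instance, hypothetical variation URLs might look like this, with one page per color:

```
https://example.com/tv-sets/samsung-42-inch-black
https://example.com/tv-sets/samsung-42-inch-silver
```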


There are several ways to get around on-site duplication:

  • 1. Master URLs. No matter what path a user takes in the catalogue, the CMS must always return only one URL for a particular product. All product variations should be represented on one page reachable via one URL, so that the user is not redirected to other pages. This approach eliminates content duplication and ensures that your site’s Crawl Budget is used wisely.
  • 2. Canonicalization. This technique does solve the duplicate content problem, but it can have drawbacks in terms of user experience and crawl budget. See Google’s Canonicalization guide for detailed info.


Out of stock and discontinued items
Create search-engine-friendly pages for unavailable products

Clearly, there are times when your store will run out of a certain product, or even discontinue an item completely. These two cases should be handled differently.

1. Create smart pages for temporarily unavailable products

If an item is temporarily unavailable, removing the page is not an option. The page should clearly state that the product is out of stock, and provide all the relevant information the visitor may need to make sure they either wait until the item arrives or order an alternative from you.

  • Include the item’s planned arrival date. This will help the visitors decide whether they’re ready to wait until the item is available, or if they should look for alternatives.
  • Offer an opportunity to get a notification when the item arrives. Even if you don’t know when the item is going to be available, it’s a good idea to give your visitors an option to get notified via email when it’s back in stock.
  • Give visitors a preorder option. If you’re positive the item is going to be available soon, let users preorder it. This will assure your customers that when the product is in stock, they will be the first to receive it.
  • Add a list of similar products. When you can, offer visitors alternative options to make sure they purchase from you and don’t go to competitors instead.

2. Choose how you’ll handle permanently discontinued products

If the item is permanently removed from sale, you have several options to deal with its product page.

  • Return a 404 page. 404 is a natural way to remove pages from the search engine index; the overall rankings of the website will not be affected. Make sure to remove 404 pages from your site’s XML sitemap — this will send a strong signal to the search engines that the page should be removed from the index. This approach is suitable for pages that don’t have a lot of backlinks and don’t serve as an entrance point to the website. If the page ranks well for some queries though, consider other options.
  • Create a 301 redirect to a similar item or relevant product category. The redirect will help you save link juice; on the downside, 301 redirects can increase load time and confuse the visitor.
  • Keep the product page, but state that the item is discontinued and offer an alternative. In this way, you will preserve the link juice and the page’s rankings. However, this option is not recommended if the online store’s inventory changes often — you don’t want to end up with thousands of ghost products wasting your Crawl Budget.
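On an Apache server, for example, the 301 option can be a single rule in .htaccess (the paths here are hypothetical):

```apache
# Permanently redirect a discontinued product page to its category
Redirect 301 /products/discontinued-tv-model /tv-sets/
```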


Use pagination properly to avoid duplication and indexing problems

Pagination is the practice of segmenting a piece of content into multiple pages. On an e-commerce website, pagination can create a series of very similar pages with overlapping content. If the pagination bar on your site only includes a few pages, and each number representing a subsequent page is visible and clickable, this will not usually pose a problem. Here’s an example:

But if the number of pages exceeds a certain amount, the pagination bar will display only a couple of initial pages and a few final pages. The in-between pages won’t be linked to from the main page — as a result, they will be crawled by search engines less often.

This issue may be addressed in two ways:

  • Add a View All option. Consider adding a page that contains the products from all the split pages. In this scenario, each split page should contain a rel="canonical" link pointing to the view-all page. See Google’s blog post for a detailed how-to.
  • Add rel="next" and rel="prev" tags. These tags can be used inside the <head> of a page to indicate the next and previous pages in a series. The first page will only have a rel="next" tag and the last one just a rel="prev" tag, while the pages in between will contain both. These tags give Google a hint to treat the split pages as one. This approach will help you consolidate backlinks, and Google will likely display only the most relevant page (usually the first one) in SERPs. For more information on rel="next" and rel="prev", see this post on the Google Webmaster blog.
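For example, page 2 of a hypothetical three-page category would carry both tags in its <head>:

```html
<link rel="prev" href="https://example.com/tv-sets?page=1">
<link rel="next" href="https://example.com/tv-sets?page=3">
```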


Site Speed
Optimize your pages’ load time for better rankings and user experience

Site speed is a factor that has a double effect on e-commerce websites. A slow website is poor user experience; poor user experience often translates into lower sales. Site speed is a ranking factor, too; fast loading pages get an advantage over slower ones in search results.

First, you’ll need to test your main landing pages to make sure there are no speed issues. You can do that quickly with WebSite Auditor.

1. Create or open a WebSite Auditor project for your site.
2. Go to the Content Analysis module.
3. Select a page you want to test, enter your keywords, and proceed with the next steps as necessary.

Along with other content and technical info, the software will run a detailed page speed test. See the Page speed (Desktop) section and make sure your page is free from any issues that may be slowing it down.

Here are the top 5 things that affect page speed and are often ignored by e-commerce sites.

  • Eliminate unnecessary redirects. Very often websites redirect visitors from the non-www version to the www version, and then to the mobile version or a user-friendly URL. Eliminate such intermediate redirects whenever you can safely do that.
  • Optimize product images. E-commerce websites usually have a lot of product images, which make up the largest share of the traffic payload. Make sure all images are optimized and compressed. Consider using smaller images with an option to open a larger version.
  • Enable browser caching. E-commerce website visitors will typically view many pages per session. You do not want them to load the unchanged content again and again, do you?
  • Prioritize loading visible (above-the-fold) content on pages that require scrolling.
  • Avoid JavaScript that blocks page rendering. It will cause the user’s browser to wait for the script to load before loading the page itself.
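Browser caching, for instance, is typically enabled with a few lines of server configuration. An Apache mod_expires sketch (the cache lifetimes are illustrative, not a recommendation from this article):

```apache
<IfModule mod_expires.c>
  ExpiresActive On
  # Product images rarely change: cache them for a month
  ExpiresByType image/jpeg "access plus 1 month"
  ExpiresByType image/png "access plus 1 month"
  # Stylesheets and scripts: cache for a week
  ExpiresByType text/css "access plus 1 week"
  ExpiresByType application/javascript "access plus 1 week"
</IfModule>
```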


Deliver a great user experience across devices

50% of Google search traffic is mobile. About 60% of consumers use mobile devices to make purchase decisions. If you are promoting an e-commerce website, you can’t neglect this huge audience.

Just like site speed, a poor user experience on mobile devices may result in lower sales and negatively influence your rankings.

1. Go mobile if you haven’t already

If you haven’t taken your site mobile yet, you’ll need to start with choosing the right technology. There are three major options: dynamic serving, separate mobile pages, or responsive design.

For e-commerce sites, responsive design is perhaps the best way to go. Here are some benefits of this option:

  • Same URL for mobile and desktop versions of pages. Using a single URL for a piece of content makes it easier for users to interact with, share, and link to that content. Such pages are also easier for search engines to discover and index.
  • Content presentation is customizable depending on the type of device it is viewed from.
  • No redirects. Unlike with a separate mobile version of the site, responsive design requires no additional redirects. This makes for a better load time and user experience.

2. Double-check pages of a mobile site

If you aren’t sure if your page is totally mobile friendly, here’s a quick way to check that:

1. Open your WebSite Auditor project.
2. Go to Content Analysis.
3. Select the page to analyze against mobile-friendliness, and proceed with the next steps.

Once the analysis is complete, check the Page usability (Mobile) section to see if your page is fully optimized for mobile devices. Go through the factors under this section to see if you can make any improvements for your mobile visitors.


Create a secure site to win customers’ (and Google’s) trust

Search engines favor websites that securely encrypt the traffic between the site and a user. Going HTTPS is critical for e-commerce websites to protect the customers’ credit card details and other personal information.

You’ll need two things to go HTTPS: a dedicated IP and an SSL certificate. To get a dedicated IP, contact your hosting provider. Getting a certificate is no big deal either — there are a lot of SSL certificate providers, like Comodo or GeoTrust, to name a few. Once you’ve installed the certificate, remember to test whether it’s configured correctly with the server test tool by SSL Labs.

There are some common pitfalls to avoid when transferring to HTTPS.

  • If your website uses a content delivery network, third-party scripts, or APIs, make sure they support HTTPS. Otherwise, visitors will get errors on page load or warnings that only part of the content is encrypted.
  • Make sure all internal links point to the HTTPS version of the website. If your web developers use absolute links, you’ll definitely have to fix those.
  • Configure redirects from the HTTP to the HTTPS version properly. Poor redirects are a common issue with HTTPS — especially if only some parts of your website are encrypted.
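On Apache with mod_rewrite, for example, a site-wide HTTP-to-HTTPS redirect can be sketched like this:

```apache
RewriteEngine On
# Send any plain-HTTP request to its HTTPS equivalent with a 301
RewriteCond %{HTTPS} off
RewriteRule ^(.*)$ https://%{HTTP_HOST}%{REQUEST_URI} [L,R=301]
```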


Crawl Budget
Make sure search engines can crawl pages that matter for SEO

Crawl budget is the number of pages of a website that search engines can crawl per day. The value is different for every site, as crawl budget is closely related to the authority of the website. This metric is especially important for e-commerce websites with large inventories. While you cannot make search engines crawl more pages, you may facilitate their work by removing clutter on their way.

  • Get rid of duplicate pages. Sure, you can deal with duplicate content using rel="canonical", but crawling duplicate pages still wastes your crawl budget and slows down the discovery of fresh content.
  • Prevent indexation of useless (in terms of SEO) content. Privacy policy, terms and conditions, and expired promotions are good candidates for a Disallow rule in robots.txt.
  • Fix broken links. Hitting broken links wastes your crawl budget — and doesn’t take search engine bots anywhere useful.
  • Keep your XML sitemap up to date, and make sure to register XML sitemaps in Google Search Console.
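A minimal robots.txt along these lines might look as follows (the paths are assumptions; adjust them to your site’s structure):

```
User-agent: *
Disallow: /privacy-policy
Disallow: /terms-and-conditions
Disallow: /promotions/expired/

Sitemap: https://example.com/sitemap.xml
```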


Quality and Google SERPs ~ The place of quality in Google’s ranking algorithm

What do we know about how Google ranks webpages in its SERPs? A lot, but very little for sure. Still, I guess everyone can agree that there are two major factors in play:

Relevance and quality.

To identify relevance, Google looks at how well the page answers the searcher’s question or fulfills the purpose of the query. Then, Google tries to figure out the degree of relevance of the page to the query. And while this is an undoubtedly complex process, it’s a comprehensible one. Google will look at your page and entire website in terms of keyword-related features, like keyword usage and topic relevance. Perhaps they’ll also look for keywords and semantically related concepts in the anchor text of links pointing to your page.

For most queries, this analysis will produce thousands of webpages that meet the relevance criteria, which Google needs to arrange in a certain order before they are displayed to searchers, ensuring that the best results appear at the top. This is where quality comes in.

But what exactly does Google mean by “quality”? The term seems incredibly (perhaps purposely) vague. But if you dig a little beneath the surface, quality becomes interesting. The concept, it turns out, has to do with many things beyond the website itself. And beyond backlinks, too.

Back in November, Google revealed their latest Search Quality Rating Guidelines, a 160-page read of “what Google thinks search users want”. This document is used by Google’s quality evaluators who rate webpages in SERPs; based on their feedback, Google can develop changes to their ranking algorithms.

That’s right. Human beings sit down, type queries into the Google search bar, and rate search results according to these guidelines so that Google can improve the quality of its SERPs.

In this article, we’ll look at factors, or features, that make a site a high quality one, and dive a little deeper to explore how Google may be weighing those — and what you can do to improve on them.

But before we get down to the factors themselves, it’s important to note that there are different standards for different types of pages.

Your money or your life!
Google’s quality standards for different types of pages

There’s one type of page Google has extremely high standards for. These are labeled, perhaps a little too humorously, “Your Money or Your Life” pages; they are the types of webpages that can impact the “future happiness, health, or wealth of users”.

Understandably, YMYL pages are financial, legal, and medical information pages. But also…

“Shopping or financial transaction pages: webpages which allow users to make purchases, transfer money, pay bills, etc. online.”

That’s right: if your site sells anything online, then welcome to the YMYL club. Chances are you’ll need to try hard to prove you’re trustworthy, reputable, and authoritative enough to be displayed within the top search results.

But that doesn’t mean you can sit back and relax if your site isn’t an online store. While your transactional peers may be judged more strictly, you still have the same criteria to meet to qualify as a high-quality resource, only at a different level.

Thankfully, Google does give us a few hints on what it expects from quality sites — and it turns out, there’s a lot you can do to improve your quality score. Let’s get down to the very factors that determine whether your site is deemed high quality or not.

Main content
Write it well, place it right, size it smart

Google divides the content of every webpage into main and supplementary content (and, optionally, ads), main content being the part of the page that “helps it achieve its purpose”. In the guidelines, Google is telling raters what most of us already know. Content is king.

“For all types of webpages, creating high quality MC [main content] takes a significant amount of at least one of the following: time, effort, expertise, and talent/skill.”

According to Google, the way content is placed on a page is also important. The following characteristics are typical of functionally designed pages:

  • The main content should be prominently displayed “front and center.”
  • The main content should be immediately visible when a user opens the page.
  • The design, organization, use of space, choice of font, font size, background, etc., should make the main content very clear.

And it’s not just the quality and placement of the page’s content that matters; its amount also plays a part. And while there’s no universal, one-size-fits-all content length, Google encourages raters to use their judgement to determine whether the content length on a given page is right for the query in question and the purpose of the page.

But let’s dive a little beyond the guidelines. No magic formula on word count will put your site at the top of Google, but here’s some interesting data from serpIQ’s study of the correlation of content length with Google rankings (the experiment involved analyzing the top 10 search results for over 20,000 queries).

You can see that on average, Google’s top-ranking pages have at least 2,000 words of content. And yet, if you run a few experiments yourself, the high and low points behind those averages will turn out to be incredibly far apart. If we take a quick informational query (say, ‘retention definition’) and a broader one (‘what’s the ideal length of a blog post’) where the searcher is perhaps looking for an in-depth article, we’ll end up with very different word count averages within the top 10 listings: 846 and 5,030 words respectively.

So how do you determine an ideal content length for a specific page, niche, and the keyword you’re targeting? Hint: you look at your top ranking competitors.

By: Masha Maksimava
