SEO for e-commerce sites


The article first appeared on SEO PowerSuite's blog.

Marketing an e-commerce website comes with very specific challenges. First, e-commerce promotion requires the marketer to balance user experience, conversion optimization, and SEO. Second, online stores are typically large and complicated, so trivial on-page optimization tasks can turn into a never-ending nightmare.

In this guide, you’ll find a set of content and technical issues anyone doing e-commerce SEO will face, and actionable how-tos and time-saving tricks on tackling them.

Content:

Keyword research
Forget generic keywords, think smart word combinations

Keyword research for e-commerce websites goes far beyond search volume and competition analysis. The reason for that is the complexity of the buying cycle — each step in it calls for different keywords you’d need to target.

Typically, customers of an online store go through 5 stages before making a purchase; different keywords and keyword phrases are used at each stage.

1. Compile 3 separate keyword lists

Keywords that customers use at the Research, Comparison, and Purchase stages offer the most value in terms of conversion potential. To get a comprehensive list of those, you’ll need to come up with 3 groups of keywords that you’ll later combine into long tail queries.

But before you start, research the search patterns typical of your target audience: consider gender, age, and social status. For example, if you are a man struggling to get organic traffic for a skin care store, talk to your female colleagues or friends to find out the jargon they use when discussing these products. Spend some time on relevant social media resources to learn your audience’s language.

When you’re positive you understand how your customers talk and which words they use, get down to putting together your keyword lists.

  • Prepare a list of action keywords that customers might use at the Comparison and Purchase stages as part of their query. Don’t add the product or category names to these keywords yet.
    E.g. “buy”, “purchase”, “price”, “compare”, “review”.
  • Get a full list of brands available at your store.
    E.g. “Sony”, “Samsung”, “Apple”.
  • Compile a list of categories, product names, and their core properties, like size or color.
    E.g. “TV”, “laptop”, “smartphone”; “iPhone”, “Galaxy Note”, “34-inch display”.

2. Mix the keywords up

Once you’ve got these three lists ready, it’s time to move on to putting together search phrases. Combining generic keywords with product keywords and their properties should give you dozens of long tail keywords — like “buy 42-inch Samsung TV”. It works like a slot machine: you turn the reels and get new keyword phrases.

You can do it manually if you only need to mix up a dozen keywords. However, given the size of the inventory in most online stores, you will likely need software tools to get things done quickly.
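The combination step itself is easy to script. Here is a minimal Python sketch of the "slot machine" idea (the word lists are illustrative, not from any real store):

```python
from itertools import product

# The three keyword lists described above (illustrative examples)
actions = ["buy", "price", "review"]
brands = ["Samsung", "Sony"]
products = ["42-inch TV", "laptop"]

# One phrase per combination of action word + brand + product term
long_tail = [" ".join(combo) for combo in product(actions, brands, products)]

print(long_tail[0])    # buy Samsung 42-inch TV
print(len(long_tail))  # 12 phrases from 3 x 2 x 2 words
```

Tools like Rank Tracker do the same thing at scale and add search-volume data on top.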

Try using Rank Tracker’s Word Combination option to get a full list of possible long tail keywords instantly.

1. Create or open a project in Rank Tracker.
2. Click the Suggest Keywords button.
3. Select Word Combination from the available keyword research methods, and hit Next.
4. Select the number of parts to combine, enter your keywords in the columns, and click Next once more.

(By the way, it looks exactly like a slot machine!)

In an instant, you’ll get plenty of long tail keyword phrases.

Select the keywords to add to your project and hit Update KEI to get their search volume, competition, and Keyword Efficiency Index.

Voila — you’ve just saved yourself a couple of hours!


Content:

Keyword matrix
Do smart keyword targeting to avoid cannibalization

Have you heard about keyword cannibalization? In short, if several pages of your website respond to the same search query, these pages will compete with each other for rankings in SERPs. The search engines may end up ranking the page that is less suitable or important from your standpoint.

In order to avoid keyword cannibalization, create a keyword matrix. Fill the rows of a spreadsheet with the URLs of your site’s most important pages (most likely product category pages), and create columns for your keywords. Put a mark at the intersection of a row and column to assign a certain keyword to a certain page. This method will help you make sure you don’t target the same keyword across multiple pages.

URL                           Samsung tv   Toshiba tv   Sony tv
www.mystore.com/samsung-tv        x
www.mystore.com/sony-tv                                    x
www.mystore.com/toshiba-tv                     x
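If your spreadsheet grows large, the same check can be automated. A hedged Python sketch (the URLs and keywords mirror the sample table above):

```python
# Represent the keyword matrix as a mapping from URL to its assigned
# keywords, then verify no keyword targets two pages (cannibalization).
matrix = {
    "www.mystore.com/samsung-tv": ["samsung tv"],
    "www.mystore.com/sony-tv": ["sony tv"],
    "www.mystore.com/toshiba-tv": ["toshiba tv"],
}

def find_cannibalization(matrix):
    """Return (keyword, first_url, second_url) triples where a keyword is reused."""
    seen = {}
    conflicts = []
    for url, keywords in matrix.items():
        for kw in keywords:
            if kw in seen:
                conflicts.append((kw, seen[kw], url))
            else:
                seen[kw] = url
    return conflicts

conflicts = find_cannibalization(matrix)  # empty: each keyword has one page
```

An empty result means every important keyword maps to exactly one page.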

If the CMS of your online store creates separate pages for variations of a product such as size and color, it makes sense to restrict such pages from indexing using robots.txt or the <meta name="robots" content="noindex"> tag. Canonicalization is another solution (see Google’s guidelines for detailed instructions).


Content:

On-page optimization
Save time on optimizing thousands of product pages

An e-commerce website typically has a limited set of category pages and thousands of product pages. Everything is more or less clear with category pages (they are usually subject to the traditional on-page SEO approach; if you are new to SEO, check out the A to Z SEO Guide for the steps).

Things get trickier when it comes to product pages. You’ll hardly have the time and resources to create unique titles, H1 tags, and descriptions for each product page.

Luckily, the slot machine approach (see the Keyword research section) can be used for meta tags just as well.

Create title, meta description and H1 templates for your product pages. For example, you may use this template for the title tag: Buy [ProductName] online | Your store name

[ProductName] is a variable that changes for every page depending on the product. If your CMS does not support variables, ask your development team for help.

Do the same for your H1s and descriptions — and remember that titles and meta descriptions are displayed in your listing’s snippet in SERPs, so make sure to use strong calls-to-action to entice clicks from search results.
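As a sketch of the template idea, here is how the variable substitution might look in Python (the template wording and store name are made up):

```python
# Template-based meta tags for product pages. In a real setup, the CMS
# would supply the product names; these are illustrative.
TITLE_TMPL = "Buy {product} online | MyStore"
DESC_TMPL = "Order {product} at MyStore with free shipping and a 2-year warranty."

products = ["Samsung UN65HU9000 65-Inch TV", "Galaxy Note 4"]

tags = [
    {"title": TITLE_TMPL.format(product=p),
     "description": DESC_TMPL.format(product=p)}
    for p in products
]

print(tags[0]["title"])  # Buy Samsung UN65HU9000 65-Inch TV online | MyStore
```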


Content:

Duplication
Make sure every page on your site is unique

Duplicate content issues for e-commerce sites fall into two categories:

  • Off-site — the content appears on many websites across the web.
  • On-site — many pages of the website feature the same content.

1. Fix off-site duplication

Off-site duplication is natural for e-commerce. Online stores often use product descriptions, images, and specifications provided by the manufacturers. This is logical, since you cannot invent new specs for the latest iPhone. However, there are a number of solutions to the problem.

  • Write unique descriptions for each item. If you have a team of copywriters to get the entire inventory covered — go for it. Just keep in mind that as the inventory scales up, you’ll need to keep up with the copy as well.
  • Leverage user-generated content. Create incentives for visitors to write reviews of the items they purchased. Send follow-up emails and ask for a review nicely, or offer discounts or bonuses to customers who leave a review. On the downside, there’s no guarantee that you will have a steady flow of reviews for all the items being sold. Additionally, reviews should be moderated to avoid spam or obscene language, which requires additional resources.
  • Add a Q&A section for each product. You can make your product descriptions unique by adding a FAQ section with questions customers often have about the product. Again, doing this will require additional human resources.
  • Optimize product category pages only. If you don’t have the time and resources to work on product pages, you can choose to create unique content for category pages only. In this case, it’s necessary to prevent the indexation of the product pages (using robots.txt or meta tags) — this means that the product pages will not appear in the SERPs.
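For the last option, the robots.txt rule could look like this (the /products/ path is hypothetical; use whatever URL pattern your store generates for product pages):

```
User-agent: *
Disallow: /products/
```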

2. Fix on-site duplication

On-site duplication is a frequent problem across the pages of online stores. It can be caused by the e-commerce content management system or an illogical website structure.

There are two typical scenarios. First, a product may belong to several categories, e.g. one Samsung TV set could be found in “Home”, “TVs”, and “Samsung”. The CMS may generate different URLs for the very same product depending on the path a user takes in the product catalog. For example:

http://mystore.com/tv-sets/samsung-un65hu9000-65-inch.html
http://mystore.com/samsung/samsung-un65hu9000-65-inch.html

Second, the CMS could generate a separate URL and page for variations of one product (e.g. size, color or other specifications). This approach wasn’t a problem before Google’s Panda algorithm update; currently, Google can penalize websites for duplicated product pages across different URLs. For example:

http://mystore.com/women-hoodies/juicy-couture-velour-hoodie-white.html
http://mystore.com/women-hoodies/juicy-couture-velour-hoodie-black.html

There are several ways to get around on-site duplication:

  • Master URLs. No matter what path a user takes in the catalog, the CMS must always return only one URL for a particular product. All product variations should be represented on one page reachable via one URL, so that the user is not redirected to other pages. This approach eliminates content duplication and ensures that your site’s Crawl Budget is used wisely.
  • Canonicalization. This technique does solve the duplicate content problem, but it can have drawbacks in terms of user experience and crawl budget. See Google’s Canonicalization guide for detailed info.
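For the hoodie example above, the canonical approach might look like this on each color variation page (the master URL here is hypothetical):

```html
<!-- In the <head> of every color variation of the hoodie -->
<link rel="canonical"
      href="http://mystore.com/women-hoodies/juicy-couture-velour-hoodie.html">
```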

Content:

Out of stock and discontinued items
Create search-engine-friendly pages for unavailable products

Clearly, there are times when your store runs out of a certain product — or even discontinues an item completely. These two cases should be handled differently.

1. Create smart pages for temporarily unavailable products

If an item is temporarily unavailable, removing the page is not an option. The page should clearly state that the product is out of stock, and provide all the relevant information the visitor may need to make sure they either wait until the item arrives or order an alternative from you.

  • Include the item’s planned arrival date. This will help the visitors decide whether they’re ready to wait until the item is available, or if they should look for alternatives.
  • Offer an opportunity to get a notification when the item arrives. Even if you don’t know when the item is going to be available, it’s a good idea to give your visitors an option to get notified via email when it’s back in stock.
  • Give visitors a preorder option. If you’re positive the item is going to be available soon, let users preorder it. This will assure your customers that when the product is in stock, they will be the first to receive it.
  • Add a list of similar products. When you can, offer visitors alternative options to make sure they purchase from you and don’t go to competitors instead.

2. Choose how you’ll handle permanently discontinued products

If the item is permanently removed from sale, you have several options to deal with its product page.

  • Return a 404 page. 404 is a natural way to remove pages from the search engine index; the overall rankings of the website will not be affected. Make sure to remove 404 pages from your site’s XML sitemap — this will send a strong signal to the search engines that the page should be removed from the index. This approach is suitable for pages that don’t have a lot of backlinks and don’t serve as an entrance point to the website. If the page ranks well for some queries though, consider other options.
  • Create a 301 redirect to a similar item or relevant product category. The redirect will help you save link juice; on the downside, 301 redirects can increase load time and confuse the visitor.
  • Keep the product page, but state that the item is discontinued and offer an alternative. In this way, you will preserve the link juice and the page’s rankings. However, this option is not recommended if the online store’s inventory changes often — you don’t want to end up with thousands of ghost products wasting your Crawl Budget.
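If you go with the 301 option, the redirect can be set up at the server level. A hypothetical nginx sketch (the paths are illustrative):

```nginx
# Permanently redirect a discontinued product page to its category
location = /tv-sets/discontinued-model.html {
    return 301 /tv-sets/;
}
```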

Technical:

Pagination
Use pagination properly to avoid duplication and indexing problems

Pagination is the practice of segmenting a piece of content into multiple pages. On an e-commerce website, pagination can create a series of very similar pages with overlapping content. If the pagination bar on your site only includes a few pages, and each number representing a subsequent page is visible and clickable, this will not usually pose a problem.

But if the number of pages exceeds a certain amount, the pagination bar will display only a couple of initial pages and a few final pages. The in-between pages won’t be linked to from the main page — as a result, they will be crawled by search engines less often.

This issue may be addressed in two ways:

  • Add a View All option. Consider adding a page that contains the products from all pages. In this scenario, each split page should contain a rel="canonical" link pointing to the View All page. See Google’s blog post for a detailed how-to.
  • Add rel="next" and rel="prev" tags. These tags can be used inside the <head> tag of a page to indicate the next and previous pages in a series. The first page will only have a rel="next" tag, and the last one just a rel="prev" tag, while the pages in between will contain both. These tags give Google a hint to treat the split pages as one. This approach will help you consolidate backlinks, and Google will likely display only the most relevant page (the first one) in SERPs. For more information on rel="next" and rel="prev", see this post on the Google Webmaster blog.
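For instance, the <head> of the second page in a paginated series might contain tags like these (the URLs are illustrative):

```html
<!-- Page 2 of a paginated category: point to its neighbors -->
<link rel="prev" href="http://mystore.com/tv-sets/?page=1">
<link rel="next" href="http://mystore.com/tv-sets/?page=3">
```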

Technical:

Site Speed
Optimize your pages’ load time for better rankings and user experience

Site speed is a factor with a double effect on e-commerce websites. A slow website makes for a poor user experience, and poor user experience often translates into lower sales. Site speed is a ranking factor, too; fast-loading pages get an advantage over slower ones in search results.

First, you’ll need to test your main landing pages to make sure there are no speed issues. You can do that quickly with WebSite Auditor.

1. Create or open a WebSite Auditor project for your site.
2. Go to the Content Analysis module.
3. Select a page you want to test, enter your keywords, and proceed with the next steps as necessary.

Along with other content and technical info, the software will run a detailed page speed test. See the Page speed (Desktop) section and make sure your page is free from any issues that may be slowing it down.

Here are the top 5 things that affect page speed and are often ignored by e-commerce sites.

  • Eliminate unnecessary redirects. Very often websites redirect visitors from the non-www version to the www version, and then to the mobile version or a user-friendly URL. Eliminate such intermediate redirects whenever you can safely do that.
  • Optimize product images. E-commerce websites usually have a lot of product images, which make up the largest share of the traffic payload. Make sure that all the images are optimized and compressed. Consider using smaller images with an option to open a larger version.

  • Enable browser caching. E-commerce website visitors will typically view many pages per session. You do not want them to load the unchanged content again and again, do you?
  • Prioritize the loading of above-the-fold content on pages that require scrolling.
  • Avoid JavaScript that blocks page rendering. It will cause the user’s browser to wait for the script to load before loading the page itself.
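As an example of browser caching, here is a hypothetical nginx snippet that tells browsers to keep static assets for 30 days (the extensions and lifetime are illustrative):

```nginx
# Let browsers cache images, styles, and scripts for 30 days
location ~* \.(jpg|jpeg|png|gif|css|js)$ {
    expires 30d;
    add_header Cache-Control "public";
}
```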

Technical:

Mobile
Deliver a great user experience across devices

50% of Google search traffic is mobile. About 60% of consumers use mobile devices to make purchase decisions. If you are promoting an e-commerce website, you can’t neglect this huge audience.

Just like site speed, a poor user experience on mobile devices may result in lower sales and negatively influence your rankings.

1. Go mobile if you haven’t already

If you haven’t taken your site mobile yet, you’ll need to start with choosing the right technology. There are three major options: dynamic serving, separate mobile pages, or responsive design.

For e-commerce sites, responsive design is perhaps the best way to go. Here are some benefits of this option:

  • Same URL for mobile and desktop versions of pages. Using a single URL for a piece of content makes it easier for users to interact with, share, and link to that content. Such pages are also easier for search engines to discover and index.
  • Content presentation is customizable depending on the type of device it is viewed from.
  • No redirects. Unlike with a separate mobile version of the site, responsive design requires no additional redirects. This makes for a better load time and user experience.

2. Double-check pages of a mobile site

If you aren’t sure if your page is totally mobile friendly, here’s a quick way to check that:

1. Open your WebSite Auditor project.
2. Go to Content Analysis.
3. Select the page to analyze against mobile-friendliness, and proceed with the next steps.

Once the analysis is complete, check the Page usability (Mobile) section to see if your page is fully optimized for mobile devices. Go through the factors under this section to see if you can make any improvements for your mobile visitors.


Technical:

HTTPS
Create a secure site to win customers’ (and Google’s) trust

Search engines favor websites that securely encrypt the traffic between the site and a user. Going HTTPS is critical for e-commerce websites to protect the customers’ credit card details and other personal information.

You’ll need 2 things to go HTTPS: a dedicated IP and an SSL certificate. To get a dedicated IP, contact your hosting provider. Getting a certificate is no big deal either — there are plenty of SSL certificate providers, like Comodo or GeoTrust, to name a few. Once you’ve installed the certificate, remember to test whether it’s configured correctly with this tool by SSL Labs.

There are some common pitfalls to avoid when transferring to HTTPS.

  • If your website uses a content delivery network, third-party scripts, or APIs, make sure they support HTTPS. Otherwise, visitors will get errors on page load or notifications that only part of the content is encrypted.
  • Make sure all internal links point to the HTTPS version of the website. If your web developers use absolute links, you’ll definitely have to fix those.
  • Configure redirects from the HTTP to the HTTPS version properly. Poor redirects are a common issue with HTTPS — especially if only some parts of your website are encrypted.
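A common way to configure the HTTP-to-HTTPS redirect is a single server-level rule. A hypothetical nginx sketch (the domain is illustrative):

```nginx
# Send all plain-HTTP requests to the HTTPS version with a 301
server {
    listen 80;
    server_name mystore.com www.mystore.com;
    return 301 https://mystore.com$request_uri;
}
```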

Technical:

Crawl Budget
Make sure search engines can crawl pages that matter for SEO

Crawl budget is the number of pages of a website that search engines can crawl per day. The value is different for every site, as crawl budget is closely related to the authority of the website. This metric is especially important for e-commerce websites with large inventories. While you cannot make search engines crawl more pages, you can facilitate their work by clearing clutter out of their way.

  • Get rid of duplicate pages. You can deal with duplicate content using rel="canonical", but crawling duplicate content still wastes your crawl budget and slows down the discovery of fresh content.
  • Prevent indexation of useless (in terms of SEO) content. Privacy policy, terms and conditions, and expired promotions are good candidates for a Disallow rule in robots.txt.
  • Fix broken links. Hitting broken links wastes your crawl budget — and doesn’t take search engine bots anywhere useful.
  • Keep your XML sitemap up to date, and make sure to register XML sitemaps in Google Search Console.



I strongly recommend SEO PowerSuite's site to anyone wanting to better understand SEO: http://www.link-assistant.com/

Actionable guide to SEO in 2016


The rule is simple — search engines won’t rank your site unless they can find it. So, just like before, it is extremely important to make sure search engines are able to discover your site’s content — and that they can do that quickly and easily. And here’s how.

1. Keep a logical site structure

Good practice
  • The important pages are reachable from the homepage.
  • Site pages are arranged in a logical tree-like structure.
  • The names of your URLs (pages, categories, etc.) reflect your site’s structure.
  • Internal links point to relevant pages.
  • You use breadcrumbs to facilitate navigation.
  • There’s a search box on your site to help visitors discover useful content.
  • You use rel=next and rel=prev to convert pages with infinite scrolling into paginated series.
Bad practice
  • Certain important pages can’t be reached via navigational or ordinary links.
  • You cram a huge number of pages into one navigation block — an endless drop-down menu or something like this.
  • You try to link to each & every inner page of your site from your homepage.
  • It is difficult for users to go back and forth between site pages without resorting to Back and Forward browser buttons.
An example of a logical site structure:

An example of a clean URL structure:
www.mywebsite.com/product-category-1/product-1
www.mywebsite.com/product-category-2/product-3

 

2. Make use of the XML sitemap & RSS feeds

The XML sitemap helps search bots discover and index content on your site. This is similar to how a tourist would discover more places in an unfamiliar city if they had a map.

RSS/Atom feeds are a great way to notify search engines about any fresh content you add to the site. In addition, RSS feeds are often used by journalists, content curators and other people interested in getting updates from particular sources.

Google says: “For optimal crawling, we recommend using both XML sitemaps and RSS/Atom feeds. XML sitemaps will give Google information about all of the pages on your site. RSS/Atom feeds will provide all updates on your site, helping Google to keep your content fresher in its index.”

Good practice
  • Your sitemap/feed includes only canonical versions of URLs.
  • While updating your sitemap, you update a page’s modification time only if substantial changes have been made to it.
  • If you use multiple sitemaps, you add a new sitemap only when your current sitemaps have reached the URL limit (up to 50,000 URLs per sitemap).
  • Your RSS/Atom feed includes only recently updated items, making it easier for search engines and visitors to find your fresh content.
Bad practice
  • Your XML sitemap or feed includes URLs that search engine robots are not allowed to index, as specified either in your robots.txt or in meta tags.
  • Non-canonical URL duplicates are included into your sitemap or feed.
  • In your sitemap, modification time is missing or is updated just to “persuade” search engines that your pages have been brought up to date, while in fact they haven’t.
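For reference, a minimal sitemap entry with a modification date looks like this (the URL reuses the clean-URL sample above; the date is illustrative):

```xml
<?xml version="1.0" encoding="UTF-8"?>
<urlset xmlns="http://www.sitemaps.org/schemas/sitemap/0.9">
  <url>
    <loc>http://www.mywebsite.com/product-category-1/product-1</loc>
    <lastmod>2016-01-15</lastmod>
  </url>
</urlset>
```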

SEO PowerSuite tip:
Use XML sitemap builder in WebSite Auditor

3. Befriend Schema markup

Schema markup is used to tag entities (people, products, events, etc.) in your pages’ content. Although it does not affect your rankings, it helps search engines better interpret your content.

To put it simply, a Schema template is similar to a doorplate — if it says ‘CEO Larry Page’, you know whom to expect behind the door.

Good practice
  • You review the list of available Schemas and pick the ones to apply to your site’s content.
  • If it is difficult for you to edit the code on your own, you use Google’s Structured Data Markup Helper.
  • You test the markup using Google’s Structured Data Testing Tool.
Bad practice
  • You use Schemas to trick search engines into believing your page contains the type of info it doesn’t (for example, that it’s a review, while it isn’t) — such behavior can cause a penalty.
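As an illustration, Schema.org Product markup in JSON-LD might look like this (the product name and price are made up):

```html
<script type="application/ld+json">
{
  "@context": "http://schema.org",
  "@type": "Product",
  "name": "Samsung UN65HU9000 65-Inch TV",
  "offers": {
    "@type": "Offer",
    "price": "1999.99",
    "priceCurrency": "USD",
    "availability": "http://schema.org/InStock"
  }
}
</script>
```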

4. Leverage rich answers

In 2015, we observed growth in the number of rich answers in Google search results. There are various types of rich answers. Basically, a rich answer is a snippet that already contains a brief answer to the search query. It appears above other organic search results and thus enjoys more exposure.

Any website has a chance to be selected for a rich answer. Here are a few things you can do to increase your chances of getting there:

1) Identify simple questions you might answer on your website;
2) Provide a clear direct answer;
3) Provide additional supporting information (like videos, images, charts, etc.).


CHAPTER 2 Master Panda survival basics

“Panda” is a filter in Google’s ranking algorithm that aims to sift out pages with thin, non-authentic, low-quality content. This means getting rid of thin content and duplicate content should be high up on your 2016 to-do list.

1. Improve content quality

Good practice
  • These days, it’s not enough to keep your content unique in a sense that it passes the plagiarism test. You need to create really useful, expert-level content and present it in the most engaging form possible.
  • You block non-unique or unimportant pages (e.g. various policies) from indexing.
Bad practice
  • Your website relies on “scraped” content (content copied from other sites with no extra value added to it). This puts you at risk of getting hit by Panda.
  • You simply “spin” somebody else’s content and repost it to your site.
  • Your website includes too many pages with little textual content.
  • Many of your site’s pages have duplicate or very similar content.
  • You base your SEO strategy around a network of “cookie-cutter” websites (websites built quickly with a widely used template).

2. Make sure you get canonicalization right

Canonicalization is a way of telling search engines which page should be treated as the “standardized” version when several URLs return virtually the same content.

The main purpose of this is to avoid internal content duplication on your site. Although not a huge offense, this makes your site look messy — like a wild forest in comparison to a neatly trimmed garden.

Good practice
  • You mark canonical pages using the rel="canonical" attribute.
  • Your rel="canonical" is inserted in either the <head> section or the HTTP header.
  • The canonical page is live (doesn’t return a 404 status code).
  • The canonical page is not restricted from indexing in robots.txt or by other means.
Bad practice
  • You’ve got multiple canonical URLs specified for one page.
  • You’ve got rel="canonical" inserted into the <body> section of the page.
  • Your pages are in an infinite loop of canonical URLs (Page A points to page B, page B points to page A). In this case, search engines will be confused with your canonicalization.

SEO PowerSuite tip:
Use WebSite Auditor to check your pages for duplicate rel="canonical" code


CHAPTER 3 Learn to combat Penguin

Google’s Penguin filter aims at detecting artificial backlink patterns and penalizing sites that violate its quality guidelines with regard to backlinks. So, keeping your backlink profile looking natural is another key point to focus on in 2016.

Good practice
  • Your website mostly has editorial links, earned due to others quoting, referring to or sharing your content.
  • Backlink anchor texts are as diverse as reasonably possible.
  • Backlinks are being acquired at a moderate pace.
  • Spam, low quality backlinks are either removed or disavowed.
Bad practice
  • Participating in link networks.
  • Having lots of backlinks from irrelevant pages.
  • Insignificant variation in link anchor texts.

SEO PowerSuite tip:
Check backlinks’ relevancy with SEO SpyGlass

SEO PowerSuite tip:
Detect spammy links in your profile


CHAPTER 4 Improve user experience

Quite a few UX-related metrics have made their way into Google’s ranking algorithm over the past years (site speed, mobile-friendliness, the HTTPS protocol). Hence, striving to improve user experience can be a good way to up your search engine rankings.

1. Increase site speed

There are quite a few factors that can affect page loading speed. Statistically, the biggest mistakes site owners make that increase page load time are using huge images and large-volume multimedia or other heavy design elements that make the site as slow as a snail.

Use Google’s PageSpeed Insights to test your site speed and to get recommendations on particular issues to fix.

SEO PowerSuite tip:
Optimize your pages’ loading time with WebSite Auditor

2. Improve engagement & click-through rates

The Bing and Yahoo! alliance, as well as Yandex, have officially confirmed they consider click-through rates and user behavior in their ranking algorithms. If you are optimizing for any of these search engines, it’s worth trying to improve these aspects.

While Google is mostly silent on the subject, striving for greater engagement and higher click-through rates tends to bring better rankings as well as indirect SEO results in the form of attracted links, shares, mentions, etc.

3. Consider taking your site HTTPS

In August 2014, Google announced that HTTPS usage is treated as a positive ranking signal.

Currently there is not much evidence that HTTPS-enabled sites outrank non-secure ones. The transition to HTTPS is somewhat controversial, because

a) Most pages on the Web do not involve the transfer of sensitive information;
b) If performed incorrectly, the transition from HTTP to HTTPS may harm your rankings;
c) Most of your site’s visitors do not know what HTTPS is, so transferring to HTTPS is unlikely to give any conversion boost.

4. Get prepared for HTTP/2

HTTP/2 is a new network protocol that should replace the outdated HTTP/1.1. HTTP/2 is substantially faster than its predecessor. In terms of SEO, you would probably be able to gain some ranking boost due to the improved website speed.

On November 06, 2015, John Mueller announced in a G+ hangout that Googlebot will soon be able to crawl HTTP/2 websites. At the time of writing, about 70% of web browsers support HTTP/2. You can keep track of HTTP/2 browser support on “Can I Use”.

HTTP/2 is likely to become a “must” soon. Thus, keep an eye on the issue and be ready to implement this feature when required.


CHAPTER 5 Be mobile-friendly

The number of mobile searches may soon exceed the number of desktop searches. With this in mind, search engines in general and Google in particular love mobile-friendly websites.

Mobile-friendliness has become a minor ranking factor for the mobile SERPs. You can test if your website is mobile-friendly using Google’s Mobile-Friendly Test.

On October 07, 2015, Google introduced the Accelerated Mobile Pages Project (AMP). As the name implies, it aims to provide a more streamlined experience for mobile users. The technology consists of three elements: special HTML markup, AMP JavaScript, and a content distribution layer (the latter is optional). AMP search is currently available only on mobile devices. You may give it a try at g.co/ampdemo.

Good practice
  • Your page’s content can be read on a mobile device without zooming.
  • You’ve got easy-to-tap navigation and links on your website.
Bad practice
  • You are using non-mobile-friendly technologies like Flash on your webpages.

SEO PowerSuite tip:
Use mobile-friendly test in WebSite Auditor


CHAPTER 6 Earn social signals — the right way

Search engines favor websites with a strong social presence. Your Google+ posts can make it to your Google connections’ organic search results, which is a great opportunity to drive extra traffic. Although the likely effect of Twitter or Facebook links on SEO hasn’t been confirmed, Google said it treats social posts (that are open for indexing) just like any other webpages, so the hint here is clear.

Good practice
  • You attract social links and shares with viral content.
  • You make it easy to share your content: make sure your pages have social buttons, and check which image and message are automatically assigned to the posts people share.
Bad practice
  • You are wasting your time and money on purchasing ‘Likes’, ‘Shares’ and other sorts of social signals. Both social networks and search engines are able to detect accounts and account networks created for trading social signals.
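As an aside not from the original article: the image and message a social network picks up when someone shares a page usually come from Open Graph meta tags, so one quick way to audit them is to parse the page’s head. Below is a minimal Python sketch using only the standard library; the sample HTML and tag values are invented for illustration:

```python
from html.parser import HTMLParser

class OGTagParser(HTMLParser):
    """Collect Open Graph meta tags, which social networks read on share."""
    def __init__(self):
        super().__init__()
        self.og = {}

    def handle_starttag(self, tag, attrs):
        if tag == "meta":
            d = dict(attrs)
            prop = d.get("property", "")
            if prop.startswith("og:"):
                self.og[prop] = d.get("content", "")

# Hypothetical page head:
html = ('<head><meta property="og:title" content="Acme Shoes">'
        '<meta property="og:image" content="/hero.jpg"></head>')
p = OGTagParser()
p.feed(html)
print(p.og)  # → {'og:title': 'Acme Shoes', 'og:image': '/hero.jpg'}
```

If `og:image` or `og:title` is missing, the network guesses, often badly, so checking these tags is a cheap win.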

SEO PowerSuite tip:
See your site’s social signals in SEO PowerSuite


CHAPTER 7 Revise your local SEO plan

In August 2015, Google reduced the number of results in the local pack from 7 to 3 and removed addresses and phone numbers. The search engine made it harder for SEOs to get to the local pack; however, a new map view has been added with up to 20 spots for the search results.

What has changed is that local rankings now depend more on the user’s IP address. You can read more on how to adjust your local SEO strategy to Google’s new update in this guide.

SEO PowerSuite tip:
Check website authority in SEO PowerSuite


What’s coming in SEO in 2016?

Here are the main SEO trends for 2016, as predicted by our in-house SEO team:

SEO remains part of multi-channel marketing
Customers can find your business through social, paid search, offline ads, etc. Organic search is an integral part of a complex path to conversion. Just be aware of these other channels and get savvy in additional spheres, if necessary.

Google now gets searcher intent & context
The keyword is no longer the center of gravity of a ranking. Google now also looks at synonyms, your niche connections, location, and more to see whether you fit the bill (i.e., the query). The good news is that you don’t need to get too hung up on specific keywords to prove you’re relevant to a query.

The end of search monopoly might be near
According to comScore, both Yahoo! and Bing continued to steadily increase their search market share throughout 2015.

No quick results, no easy SEO
With its latest iterations of Panda and Penguin and the penalties it dished out to link networks, Google widened its net for spam and became even better at detecting sites with unengaging content or unnatural link patterns.

Traffic increasingly stays on Google
Google has definitely stepped up its efforts to provide immediate answers to people’s searches. And with the growing number of rich answers, this tendency to divert traffic from publishers will likely increase.

Paid search expansion
A few years ago, Google changed Google Shopping’s organic model to pay-per-click. It is possible that Google will make yet another organic vertical paid. Local Search is the best candidate for the change, since Local Search results are dominated by businesses selling a product.

 

By: Yauhen Khutarniuk
Head of SEO at SEO PowerSuite

This article first appeared on the SEO PowerSuite website.
I strongly recommend this site to anyone wanting to better understand SEO. 
http://www.link-assistant.com/

Panda-Proof Content Audit for Your Site ~ 6 steps to make sure your content matches Google’s quality criteria


First, the news broke that Google Panda (a special “filter” designed to de-rank low-quality content) is now part of the search engine’s core ranking algorithm, and that Panda updates will probably start to roll out faster and more regularly.

Second, over the past two weeks, we’ve seen some major changes in Google’s search results. And though these massive SERP turbulences were confirmed to be unrelated to Panda, they also seem to be connected with content quality.

It looks like in 2016, the “quality” of your content is not just an empty word, but something that you NEED to optimize your site for. And today we’ll dive deeper into what the Panda quality algorithm is and how to run a thorough Panda-proof content audit.

 

What’s Panda & how does it affect SERPs?

The Panda algorithm (named after Google engineer Navneet Panda) is designed to help Google improve the quality of search results by down-ranking low quality content.

The basic principle here is that Google assigns a particular quality score to each website in its index (the score is assigned site-wide, not to separate pages).

Initially, Panda functioned as a filter applied to the pack of search results that Google considered relevant to a search query: the Panda score re-ordered those results, pushing down the low scorers and boosting the highest-scoring content.

Now that Panda signals are “baked” into Google’s core ranking algorithm, they no longer re-order the results but help form them together with Google’s other ranking signals.

How does Panda identify high-quality content?

Of course, there’s no “gut feeling” that helps Panda identify real quality. Panda is just an algorithm that checks your website for a number of factors Google assumes are typical of a high-quality website. Then, by applying some math, it gives the site a specific quality score based on the results of this check.

The good news is, if your site’s quality score is based on a number of separate factors, you can influence those factors to improve the score.

The bad news is… Google won’t disclose the exact quality factors it takes into account to calculate the score. So the list of Panda-prone issues below is an educated guess, based on what Google has said about site quality and on the trackable factors it could use to measure it.

What are Panda roll-outs & will they get more frequent?

Clearly, a quality score is not something you can assign to a website once and for all. Websites change, content gets added and removed, and the score needs to be updated every now and then to stay relevant.

Refreshing the score non-stop in real time would take too much computing power, which is why, up till now, Google has launched “Panda updates” once every couple of months, recalculating website quality scores and thus changing the filter it applies to the search results.

Will that change now that the Panda quality score is part of Google’s core ranking algorithm? It doesn’t seem so. The score is still not calculated in real time; the change only means that the algorithm has matured and no longer needs major modifications. Since Google won’t have to test and apply new signals (it will only re-apply older ones), it will be possible to run Panda updates faster and more frequently.

When you audit your content, keep in mind that the optimal way to deal with problematic pages largely depends on the size of your site.

  • For a small website (<100 pages), removing low-quality content is something you can’t afford. Your key strategy is to improve every problematic page rather than delete it.
  • For a medium-sized site (100–1,000 pages), removing some of the low-quality content is possible, but your main focus should be on improving content, at least on the most important pages.
  • For a large website (>1,000 pages), improving all the problematic areas is a huge piece of work, so your focus should be on “weeding out” and removing the unnecessary, low-quality content.

Step 2. Check for thin content

Imagine you have a category page with only a few lines of meaningless text and hundreds of links to products. This is what’s generally called thin content.

Search engines use content to determine the relevancy of a page for a query. If you barely provide any information that’s accessible to them, how are they to understand what the page is about?

Surely, quality is not all about word count: there are cases when you can deliver value in a few hundred words. That is why there’s no “minimum word count” threshold that triggers a low Panda quality score. What’s more, sometimes pages with little over a hundred words do exceptionally well on Google and even get included in its rich answers.

Still, having too many thin-content pages will very likely get you into trouble, so, on average, a word count under 250 words is a good indicator for locating problematic spots across your site.
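To illustrate that rough threshold (this sketch is mine, not from the original guide), here is how you might flag thin pages once you have each URL’s extracted visible text; the URLs and texts below are invented:

```python
def find_thin_pages(pages, min_words=250):
    """Return URLs whose visible text falls below the word-count threshold.

    `pages` maps URL -> extracted visible text. The 250-word default mirrors
    the rough indicator mentioned above; it is not an official Google limit.
    """
    return [url for url, text in pages.items() if len(text.split()) < min_words]

# Hypothetical crawl output:
pages = {
    "/category/shoes": "Buy shoes online. Great prices.",  # 5 words
    "/guide/running-shoes": " ".join(["word"] * 600),       # 600 words
}
print(find_thin_pages(pages))  # → ['/category/shoes']
```

In practice you would feed this from a crawler’s export, then review the flagged URLs by hand rather than act on the count alone.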

Step 3. Check for duplicated/very similar content

Another factor that could signal low site quality is duplicated or very similar content across multiple pages.

Very often, bigger sites have to deal with a huge number of pages that need to be filled with content, and many resort to an easy way of filling those gaps: boilerplate text that’s the same on each page except for a few variables. This is what Google considers automated, low-quality content.

So, besides weeding out word-for-word duplicate content, pay attention to similar-looking pieces (say, page titles that are absolutely identical in structure and differ only in a product name), as these may be a sign of content automation.
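One way to surface such templated pieces (a rough sketch of my own, not a method from the article) is to compare titles pairwise with Python’s standard `difflib`; the 0.9 threshold and the sample titles are arbitrary illustration values:

```python
from difflib import SequenceMatcher

def looks_templated(a, b, threshold=0.9):
    """True if two strings are so similar they look like boilerplate variants."""
    return SequenceMatcher(None, a, b).ratio() >= threshold

t1 = "Acme Pro 17 Laptop | Example Store - Free Shipping"
t2 = "Acme Pro 15 Laptop | Example Store - Free Shipping"
print(looks_templated(t1, t2))                            # → True
print(looks_templated("Apple pie recipe", "Contact us"))  # → False
```

Pairwise comparison is quadratic in the number of pages, so for a large site you would bucket titles first (e.g. by length or prefix) and compare only within buckets.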

 

Step 4. Check for aggregated content/plagiarism

Uniqueness is also synonymous with quality in Google’s eyes. Google wants your content to add value, not simply repeat what’s already been said, so having non-unique content on your website (e.g., plagiarized text, or product descriptions duplicated in feeds used for other channels like Amazon, eBay, and shopping comparison sites) is an easy way to get under Google’s Panda filter.

If you suspect that some of your pages may be duplicated externally on other online resources, a good idea would be to check them with Copyscape. (http://copyscape.com/)

Copyscape gives some of its data for free (for instance, comparing two specific URLs), but for a comprehensive check you may need a paid Premium account.

Step 5. Check for proper keyword usage

Keywords and keyword targeting are among the most basic and longest-running concepts in SEO. If you’ve been in the search industry for quite some time, you may remember the days when SEO meant simply having the right words in your meta keywords tag.

Sure, these times have passed: search engines now try to detect and punish websites deliberately using too many keywords in their content.

However, whether Google admits it or not, its algorithms are still built upon keywords. And having a keyword in your title tag DOES improve your page’s rankings, meaning you simply can’t afford not to optimize pages for keywords.

So the only ticklish question here is, “How many is too many?” One way to check is to look at your top-ranking competitors (because the sites that rank in the top 10 are the sites that pass Google’s quality test with an A+).

Remember the Hummingbird algorithm update? The one with which Google learned to recognize the meaning behind a search query and give a common answer to a number of “different-in-keywords” but “same-in-meaning” queries?

This update changed the way SEOs optimize pages — now we no longer think “single keyword optimization”, but try to make our pages relevant for a whole group of synonyms and related terms.

So, adding relevant synonyms and related terms will help you improve your pages’ rankings and avoid keyword-stuffing issues.
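A crude way to sanity-check keyword usage on a page (my own illustration, with made-up sample text) is to compute the share of words a given keyword accounts for, and then compare that figure against top-ranking pages rather than chase a fixed number:

```python
import re

def keyword_density(text, keyword):
    """Share of words in `text` matching `keyword` (single word, case-insensitive)."""
    words = re.findall(r"[a-z']+", text.lower())
    return words.count(keyword.lower()) / len(words) if words else 0.0

# Invented sample copy:
sample = "Running shoes for trail running. Our running shoes are light."
print(keyword_density(sample, "running"))  # → 0.3
```

A density far above what the top 10 results show is a red flag for stuffing; a density of zero for your target term means the page may not be seen as relevant at all.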

Step 6. Check for user engagement metrics

Though Google generally states that user experience signals are not part of its search ranking algorithm, many experiments show the opposite. One of the metrics SEOs suspect Google of using is bounce rate.

Think about it — as Google tries to bring users the best search experience, it obviously wants them to find what they were looking for with the first search result they click on. The best search experience is one that immediately lands the searcher on a page that has all the information they need, so that they don’t hit the back button to return to the SERP and look for other alternatives.

Bouncing off pages quickly to return to the SERP and look for other results is called pogo-sticking, and it can be easily measured in terms of bounce rates.
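Bounce rate itself is simple to compute from raw analytics data. Here is a sketch of my own (the numbers are invented) where each session is represented by its pageview count; note this is just the standard one-pageview definition, not Google’s actual pogo-sticking measurement:

```python
def bounce_rate(sessions):
    """Fraction of sessions that viewed exactly one page.

    A one-page session is the usual proxy for a bounce; it only suggests
    pogo-sticking when the visitor also returned to the SERP quickly.
    """
    if not sessions:
        return 0.0
    bounces = sum(1 for pageviews in sessions if pageviews == 1)
    return bounces / len(sessions)

# Pageviews per session, e.g. pulled from an analytics export:
print(bounce_rate([1, 4, 1, 2, 1]))  # → 0.6
```

Comparing this figure per landing page helps you spot the pages that fail to satisfy the query that brought the visitor in.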

What else to check?

  1. Check for user-generated content issues

User-generated content and how it affects Panda has been a hot topic recently. It has gotten to the point where many SEOs recommend getting rid of all user-generated content, claiming that Google sees it as a signal of poor site quality.

This is far from being true, because we’re still seeing lots of websites based purely on user-generated content (think Quora) that are doing well on Google.

However, user-generated spam (for instance, irrelevant comments on your blog or poorly moderated forum pages) can get your site into trouble.

So if your website features user-generated content, make sure improving your moderation strategy is a priority.

  2. Check for grammar mistakes

Bad spelling and grammar can both impede user experience and lower the trustworthiness of your content in Google’s eyes, so don’t tempt fate by leaving obvious errors on your pages. The easiest way to run a spellcheck is to copy the text into a word processor, which will highlight the mistakes so you can fix them.
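If you would rather script the check than paste text into a word processor, a toy version looks like this (mine, with a deliberately tiny dictionary; a real pass needs a full word list or a proper spellchecking library):

```python
import re

def flag_unknown_words(text, dictionary):
    """Return words from `text` missing from `dictionary` (likely typos)."""
    words = re.findall(r"[A-Za-z']+", text)
    return [w for w in words if w.lower() not in dictionary]

# Tiny illustrative dictionary; real checks need tens of thousands of words.
known = {"our", "shoes", "are", "comfortable", "and", "durable"}
print(flag_unknown_words("Our shoes are comftable and durable", known))  # → ['comftable']
```

Run against a real word list, this gives a quick shortlist of pages worth a human proofread.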

By: Katherine Stepanova
Head of Marketing at SEO PowerSuite

This article first appeared on the SEO PowerSuite website.
I strongly recommend this site to anyone wanting to better understand SEO. 
http://www.link-assistant.com/