How to Check Which Links Can Harm Your Site’s Rankings

Matt Cutts’ statement in March 2012 that Google would be rolling out an update against “overoptimised” websites caused great turmoil within the SEO community. A few days later thousands of blogs were removed from Google’s index, and Matt tweeted confirming that Google had started taking action against blog networks.

Even though thousands of low- or average-authority blogs were manually removed from Google’s index, they weren’t the only victims. For instance, www.rachaelwestdesigns.com, a PR7, DA70 domain, was also removed, probably due to the very high number of blogroll (site-wide) backlinks.

These actions indicate that the new update on “overoptimised” websites has already begun to roll out, but it is uncertain how much of it we have seen so far.

At around the same time, Google sent the following message to thousands of webmasters via Google Webmaster Tools:

In the above statement, it is unclear what Google’s further actions will be. In any case, working out the number of “artificial” or “unnatural” links with precision is a laborious, almost impossible task. Some low-quality links may not be reported by third-party link data providers or, even worse, because Google has started deindexing several low-quality domains, the task can end up being a real nightmare, as several domains cannot be found even in Google’s index.

Nevertheless, there are some actions that can help SEOs assess the backlink profile of any website. Because, in theory, any significant number of low-quality links could hurt, it makes sense to gather as much data as possible and not just examine the most recent backlinks. Several thousand domains have already been removed from Google’s index, resulting in millions of links being completely devalued, according to Distilled’s Tom Anthony (2012 Linklove).

Therefore, the impact on the SERPs has already been significant and, as always happens on these occasions, there will be new winners and losers once the dust settles. However, at this stage it is a bit early to draw any conclusions because it is unclear what Google’s next actions are going to be. Nevertheless, getting ready for those changes makes perfect sense, and spotting them as soon as they occur would allow for quicker decision making and immediate action as far as link building strategies are concerned.

As Pedro Dias, an ex-Googler from the search quality/web spam team, tweeted, “Link building, the way we know it, is not going to last until the end of the year” (translated from Portuguese).

The Right Time For a Backlinks Risk Assessment

Carrying out a backlink audit in order to identify the percentage of low-quality backlinks would be a good starting point. A manual, thorough assessment is only practical for relatively small websites, where it is much easier to gather and analyse backlink data – for bigger sites with thousands of backlinks it would be pointless. The following process expands on Richard Baxter’s solution on ‘How to check for low quality links‘, and hopefully makes it more complete.

  1. Identify as many linking root domains as possible using various backlink data sources.
  2. Check the ToolBar PageRank (TBPR) for all linking root domains and pay attention to the TBPR distribution.
  3. Work out the percentage of linking root domains that have been deindexed.
  4. Check the social metrics distribution (optional).
  5. Repeat steps 2, 3 and 4 periodically (e.g. weekly, monthly) and check for the following:
  • A spike towards the low end of the TBPR distribution
  • An increasing number of deindexed linking root domains on a weekly/monthly basis
  • Social metrics that remain unchanged at very low levels

A Few Caveats

The above process does come with some caveats but, on the whole, it should provide some insight and help make a backlink risk assessment in order to work out a short- and long-term action plan. Even though the results may not be 100% accurate, it should be fairly straightforward to spot negative trends over a period of time.

Data from backlink intelligence services have flaws. No matter where you get your data from (e.g. Majestic SEO, Open Site Explorer, Ahrefs, Blekko, Sistrix), there is no way to get the same depth of data Google has. Third-party tools are often not up to date, and in some cases the linking root domains are not even linking back anymore. Therefore, it makes sense to filter all identified linking root domains and keep only those still linking to your website. At iCrossing we use a proprietary tool, but there are commercial link check services available in the market (e.g. Buzzstream, Raven Tools).
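To act on this caveat, a quick script can re-verify which reported linking pages still reference the audited site. Below is a minimal Python sketch; the backlinks.csv file, its linking_url column and the example.com target are all assumptions, so adapt them to whatever your link data provider exports.

```python
# Minimal sketch: re-verify which reported linking pages still link to the audited site.
# "backlinks.csv" and its "linking_url" column are hypothetical; use whatever your
# link data provider exports.
import csv
import requests

TARGET_DOMAIN = "example.com"  # the site being audited (assumption)

def still_links_to_target(linking_url, target_domain, timeout=10):
    """Fetch a linking page and check whether the target domain still appears in it."""
    try:
        response = requests.get(linking_url, timeout=timeout)
    except requests.RequestException:
        return False  # unreachable pages are treated as no longer linking
    return response.status_code == 200 and target_domain.lower() in response.text.lower()

if __name__ == "__main__":
    with open("backlinks.csv", newline="", encoding="utf-8") as f:
        live = [row["linking_url"] for row in csv.DictReader(f)
                if still_links_to_target(row["linking_url"], TARGET_DOMAIN)]
    print(f"{len(live)} linking pages still reference {TARGET_DOMAIN}")
```

A plain substring check is deliberately crude; parsing the HTML and matching anchor href attributes would be more accurate, but the principle is the same.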

ToolBar PageRank gets updated infrequently (roughly 4-5 times a year), so in most cases the returned TBPR values represent the TBPR the linking root domain gained at the last TBPR update. It would therefore be wise to check when TBPR was last updated before drawing any conclusions. Carrying out the above process straight after a TBPR update would probably give more accurate results. However, in some cases Google may instantly drop a site’s TBPR in order to make public that the site violates their quality guidelines and to discourage advertisers. Therefore, low TBPR values such as n/a (greyed out) or 0 can in many cases flag up low-quality linking root domains.

Deindexation may be natural. Even though Google these days is deindexing thousands of low-quality blogs, coming across a website with no indexed pages in Google’s SERPs doesn’t necessarily mean that it has been penalised. It may be an expired domain that no longer exists, an accidental deindexation (e.g. a meta robots noindex on every page of the site), or some other technical glitch. However, deindexed domains that still have a positive TBPR value could flag websites that Google has recently removed from its index due to guidelines violations (e.g. link exchanges, PageRank manipulation).

Required Tools

For large data sets NetPeak Checker performs faster than SEO Tools, where large data sets can make Excel freeze for a while. NetPeak Checker is a free standalone application which provides very useful information for a given list of URLs, such as domain PageRank, page PageRank, Majestic SEO data, OSE data (PA, DA, mozRank, mozTrust etc.), server responses (e.g. 404, 200, 301), number of indexed pages in Google and a lot more. All results can then be exported and processed further in Excel.

1. Collect linking root domains

Identifying as many linking root domains as possible is fundamental, and relying on just one data provider isn’t ideal. Combining data from Webmaster Tools, Majestic SEO and Open Site Explorer may be enough, but the more data, the better, especially if the examined domain has been around for a long time and has received a large number of backlinks over time. Backlinks from the same linking root domain should be deduplicated so we end up with a long list of unique linking root domains. Linking root domains that no longer resolve (404) should also be removed.
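As an illustration of this step, the sketch below merges a combined backlink export into unique linking root domains and drops those that no longer resolve. The all_backlinks.csv file and its linking_url column are assumptions, and the simple root-domain extraction would ideally be replaced by a library such as tldextract.

```python
# Minimal sketch: reduce a merged backlink export to unique, resolvable linking root domains.
# "all_backlinks.csv" and its "linking_url" column are assumptions; in practice this file
# would combine exports from Webmaster Tools, Majestic SEO, Open Site Explorer, etc.
import csv
from urllib.parse import urlparse

import requests

def root_domain(url):
    """Crude root-domain extraction; a library such as tldextract handles edge cases better."""
    host = urlparse(url).netloc.lower()
    return host[4:] if host.startswith("www.") else host

def is_resolvable(domain):
    """Discard domains that no longer respond or return an error status."""
    try:
        response = requests.head(f"http://{domain}", timeout=10, allow_redirects=True)
        return response.status_code < 400
    except requests.RequestException:
        return False

with open("all_backlinks.csv", newline="", encoding="utf-8") as f:
    unique_domains = {root_domain(row["linking_url"]) for row in csv.DictReader(f)}

live_domains = sorted(d for d in unique_domains if d and is_resolvable(d))
print(f"{len(live_domains)} unique, live linking root domains")
```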

2. Check PageRank distribution

Once a good number of unique linking root domains has been identified, the next step is scraping the ToolBar PageRank for each one of them. Ideally, this step should be applied only to those root domains that are still linking to our website; the ones that don’t should be discarded if that isn’t too complicated. Then, using a pivot chart in Excel, we can determine whether the current PageRank distribution should be a concern. A spike towards the lower-end values (such as 0 and n/a) should be treated as a rather negative indication, as in the graph below.
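If you prefer to summarise the distribution outside Excel, a few lines of Python will do once the TBPR values have been collected (for example via NetPeak Checker). The root_domains_tbpr.csv file and its domain/tbpr columns in this sketch are assumptions, with greyed-out values recorded as "n/a".

```python
# Minimal sketch: summarise the TBPR distribution of linking root domains.
# "root_domains_tbpr.csv" with "domain" and "tbpr" columns is an assumed export
# (e.g. from NetPeak Checker); greyed-out values are recorded as "n/a".
import csv
from collections import Counter

distribution = Counter()
with open("root_domains_tbpr.csv", newline="", encoding="utf-8") as f:
    for row in csv.DictReader(f):
        value = row["tbpr"].strip() or "n/a"
        distribution[value] += 1

total = sum(distribution.values())
for bucket in ["n/a"] + [str(i) for i in range(11)]:
    count = distribution.get(bucket, 0)
    print(f"TBPR {bucket:>3}: {count:5d} ({count / total:.1%})")

low_end = distribution.get("n/a", 0) + distribution.get("0", 0)
print(f"Share of low-end domains (n/a or 0): {low_end / total:.1%}")
```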

3. Check for deindexed root domains

Working out the percentage of linking root domains which are not indexed is essential. If deindexed linking root domains still have a positive TBPR value, most likely they have been recently deindexed by Google.
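Assuming the checker export also records the number of pages indexed in Google for each domain, both the deindexed percentage and the “positive TBPR but zero indexed pages” cases can be pulled out with a short script. The root_domains_checked.csv file and its column names below are assumptions.

```python
# Minimal sketch: share of deindexed linking root domains, plus likely recent removals.
# "root_domains_checked.csv" with "domain", "tbpr" and "indexed_pages" columns is an
# assumed export (e.g. from NetPeak Checker).
import csv

with open("root_domains_checked.csv", newline="", encoding="utf-8") as f:
    rows = list(csv.DictReader(f))

deindexed = [r for r in rows if int(r["indexed_pages"] or 0) == 0]
suspicious = [r for r in deindexed if r["tbpr"].isdigit() and int(r["tbpr"]) > 0]

print(f"Deindexed: {len(deindexed)} of {len(rows)} ({len(deindexed) / len(rows):.1%})")
print(f"Deindexed but still showing positive TBPR (possible recent removal): {len(suspicious)}")
for r in suspicious:
    print(f"  {r['domain']} (TBPR {r['tbpr']})")
```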

4. Check social metrics distribution (optional)

Adding the social metrics (e.g. Facebook Likes, Tweets and +1s) of all identified linking root domains into the mix may be useful in some cases. The basic idea here is that low-quality websites would have a very low number of social mentions, as users wouldn’t find them useful. Linking root domains with low or no social mentions at all could possibly point towards low-quality domains.
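If social counts have already been gathered for each linking root domain, flagging the “silent” ones is straightforward. The root_domains_social.csv file and its likes/tweets/plus_ones columns in this sketch are assumptions; build the file from whichever social data source is available to you.

```python
# Minimal sketch: flag linking root domains with no social visibility.
# "root_domains_social.csv" with "domain", "likes", "tweets" and "plus_ones" columns
# is an assumed export built from whichever social data source is available.
import csv

SOCIAL_THRESHOLD = 1  # totals below this count as "no social signal" (arbitrary cut-off)

with open("root_domains_social.csv", newline="", encoding="utf-8") as f:
    quiet_domains = [
        row["domain"]
        for row in csv.DictReader(f)
        if sum(int(row[c] or 0) for c in ("likes", "tweets", "plus_ones")) < SOCIAL_THRESHOLD
    ]

print(f"{len(quiet_domains)} linking root domains with no social mentions")
```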

5. Check periodically

Repeating steps 2, 3 and 4 on a weekly or monthly basis could help identify whether there is a negative trend due to an increasing number of linking root domains being removed. If both the PageRank distribution and the deindexation rate are deteriorating, sooner or later the website will experience ranking drops that result in traffic loss. A weekly deindexation rate graph like the following one could give an indication of the degree of link equity loss:
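One simple way to produce that trend is to keep one snapshot per run and compute the deindexation rate across all of them, as in this sketch (the snapshot_*.csv naming and the indexed_pages column are assumptions):

```python
# Minimal sketch: deindexation rate per weekly snapshot.
# One CSV per run, e.g. "snapshot_2012-04-02.csv", each with "domain" and
# "indexed_pages" columns (the file layout is an assumption).
import csv
import glob

for path in sorted(glob.glob("snapshot_*.csv")):
    with open(path, newline="", encoding="utf-8") as f:
        rows = list(csv.DictReader(f))
    deindexed = sum(1 for r in rows if int(r["indexed_pages"] or 0) == 0)
    print(f"{path}: {deindexed / len(rows):.1%} of linking root domains deindexed")
```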

Note: For more details on how to set up NetPeak Checker and apply the above process using Excel, please refer to my post on Connect.icrossing.co.uk.

Remedies & Actions

So far, several websites have seen ranking drops as a result of some of their linking root domains being removed from Google’s index. Linking root domains with very low PageRank values and low social shares over a period of time should be manually/editorially reviewed in order to assess their quality. Such links are likely to be devalued sooner or later, so a new link building strategy should be devised. Working towards a more balanced PageRank distribution should be the main objective, as links from low-quality websites will naturally keep appearing to some extent.

In general, the more authoritative and trusted a website is, the more low-quality linking root domains can link to it without causing any issues. Big brands’ websites are less likely to be impacted because they are more trusted domains. That means that low-authority/trust websites are more at risk, especially if most of their backlinks come from low-quality domains, if they have a high number of site-wide links, or if their backlink profile has an unnatural anchor text distribution.

Therefore, if any of the above issues have been identified, increasing the website’s trust, reducing the number of unnatural site-wide links and making the anchor text distribution look more natural should be the primary remedies.

Source: http://www.seomoz.org/blog/how-to-check-which-links-can-harm-your-sites-rankings

25 Ways to Decrease Your Rankings

Sure, any old chump can follow SEO best practices and achieve the high SERPs rankings that lead to sustained business growth and profits.  But do you have what it takes to trash years of hard work by flagrantly abusing Google’s policies?

Here’s what you need to do:

Tip #1 – Cloak your site

If you want the Googlebot to swear off your site for good, use cloaking technology that allows you to display an entirely different site from what your readers are seeing.

Tip #2 – Make use of doorway pages

Same deal – if you’re using techniques that pass visitors through one landing page and on to another, you’re violating the Google Terms of Service and guaranteeing that your site will lose rank (if not be removed from the index altogether).

Tip #3 – Scrape articles from other websites

If writing your own original content sounds boring, why not just pull some text off of another website you like and paste it into your own?   Copy too much of the same content from other sites, and you risk running into Google’s duplicate content filters, which will ensure other sites featuring the copied articles rank above yours in the SERPs.

Tip #4 – Use unrevised PLR content

You know those “10,000 PLR article packs” you have sitting on your hard drive?  If you want to decrease your rankings in the search engines, copy and paste them directly to your site without any revisions.  Providing little to no original content (and little to no value to your readers) is a sure-fire way to get your site tanked in the SERPs.

Tip #5 – Keyword stuff your content

Nothing says “effective optimization” like cramming your target keywords into your website as many times as you can.  Consider the following example of good keyword stuffing:

“Take a moment to read my Review of the 2013 Chevrolet Tahoe.  Are You Gonna Buy a 2013 Chevrolet Tahoe?  Watch the Video Reviews of the 2013 Chevrolet Tahoe below to see this vehicle in action!”

If your current website text is more readable than this, you aren’t fitting nearly enough keywords into your content in order to decrease your rankings!

Tip #6 – Keyword stuff your meta tags

Your meta tags represent a valuable opportunity to tell the search engine spiders what your site is about – so why not use this space to stuff in every single keyword you’re targeting?!  Bombing your title, keyword and description meta tags with as many keywords as possible is a surefire way to trigger SERPs ranking penalties.

Tip #7 – Pack your page with SEO copy

Nothing screams, “I’m good at SEO” like a paragraph or two of keyword stuffed text crammed at the bottom of your page where you think no one will see it (I’m looking at you, Red Envelope!).  Since the search engines have definitely caught on to this tactic, it’s another great way to decrease your rankings in the search results.

Tip #8 – Present text in graphics or scripts

So you hired yourself a fancy graphic designer who’s built your entire site out of graphics and scripts?  That’s great!  Since the textual content of your site won’t be indexable by the search engine spiders (unless your designer happened to build an alternative indexable version), you’ll definitely see a drop in the SERPs.

Tip #9 – Over-optimize your images

Want to tip off the search engines to your blatant manipulation of their rankings in order to increase the odds of incurring penalties?  Over-optimize your images by stuffing your file names and ALT tags with every single keyword you’re targeting on your site.

Tip #10 – Ignore keyword research data

The web runs on keywords, so if you really don’t want to get ranked for the terms that readers would be using to find your site, ignore them entirely!

Tip #11 – Hide text on your site

Filling your site with text that’s displayed in the same color as your background has long been acknowledged as a spam technique designed to artificially manipulate rankings.  As such, hiding text on your own site in this way is a great way to incur SERPs penalties!

Tip #12 – “Stack” your title tags

If you really want to irritate the Googlebot, add multiple title tags to each page you create.  After all, if one title tag provides an important SEO benefit, two or more will be even better – right?!

Tip #13 – Use frames in your site’s construction

Using frames in your site’s construction doesn’t just say, “Look at me, I’m stuck in 1996!” – it’s also a good way to make your site difficult to navigate for the search engine spiders.

Tip #14 – Build sites around brand names

Building your entire website around established brand names (or even targeting your competitors’ branded keywords in your meta tags or site content) opens you up to copyright claims.  The further your competitors decide to take these claims, the more likely you are to have your site dropped from the search engine indexes entirely.

Tip #15 – Generate a low clickthrough rate from the SERPs

Your clickthrough rate within the SERPs matters, so if you want to drop your rankings, totally ignore the content that makes up your SERPs snippets.  (Alternatively, if you want to undo the damage done by low value snippets, consider incorporating your PPC or social media data into your meta tags.)

Tip #16 – Buy links

If major retailers like JCPenney are doing it, buying links should work for your site, right?!

Tip #17 – Exchange links

Link exchanges are another great way to get the search engines on your case about artificially inflating your rankings.  Look into both two-way and three-way link exchanges in order to create the obvious link paths that can lead to rankings penalties.

Tip #18 – Build only low quality links

Want to decrease your rankings for sure?  Low quality backlinks are your new best friends, so get out there and start building profile links, links from spam directories, links from FFA sites and other low value pages.

Tip #19 – Build links in bad neighborhoods

If your site’s incoming links are originating entirely from foreign language sites, adult-oriented websites and other “bad neighborhood” sites, you’ve got a great chance of seeing your rankings drop substantially!

Tip #20 – Build only one type of link

Good SEO is built on a natural backlink profile that resembles one that would occur if people found your site and shared it on their own.  To go in the totally opposite direction and drop your rankings, find one type of backlink that you like and focus on building that type of link only.

Tip #21 – Overload your site with plugins

WordPress plugins and other scripted programs can add a tremendous amount of functionality to your site, but adding too many can slow your site down significantly.  And since the search engines use site speed as a ranking factor, add as many plugins as you can if you want your SERPs rankings to crash.

Tip #22 – Ignore your site’s speed

Since site speed now matters more than ever, ignoring this important metric is a good way to drop your rankings.  Don’t bother to run a site speed optimization check and definitely don’t bother to act on the results of this test – doing so could speed up your site and actually *improve* your rankings!

Tip #23 – Ignore errors in your site’s code

Code errors disrupt the indexing process, making them a great way to confuse the search engine spiders and ensure your site isn’t displayed in the right SERPs.  For best results, ignore webmaster best practices that involve checking your code and allow existing errors to remain in place, disrupting your site’s performance and indexing.

Tip #24 – Ignore site uptime

Hey, every site goes down from time to time – right?!  Since site uptime plays a role in the search engine algorithms, you should definitely avoid monitoring your uptime and worrying about whether your pages are live at all times.

Tip #25 – Fail to stay on top of SEO news

SEO is a constantly changing field.  If you ignore major updates, you could tank your search results listings by incurring penalties or missing opportunities to excel over your competitors.  Remember, if your goal is to decrease your rankings, ignorance is bliss!

Source: http://www.searchenginejournal.com/25-ways-to-decrease-your-rankings/42430/