Negative SEO: Myths, Realities, and Precautions – Whiteboard Friday

This week we will be covering a topic not often discussed on Whiteboard Friday: negative SEO tactics and how these practices function. Negative SEO is definitely not something we condone here at SEOmoz, but education around these techniques can be a helpful precautionary measure that could prevent you from becoming the subject of malicious intent.

We hope you enjoy the video and don’t forget to leave your comments below.

Video Transcription

Howdy, SEOmoz fans. Welcome to another edition of Whiteboard Friday. This week we're talking about a very concerning and controversial topic – negative SEO. Now, negative SEO has a number of meanings. I want to walk through them and get to some points. If you've been paying attention to the Twitter-sphere or the SEO blogosphere over the past week or two, there's been a lot of discussion around negative SEO, particularly pointing backlinks at sites to bring them down. I will get to that, but first I want to start with some of the classic ways that negative SEO could potentially hurt you.

The idea behind negative SEO is that rather than doing good, positive things that will promote signals in the search engines that bump up your rankings, there are ways to do bad, terrible, negative things. Now, obviously you could do these on your own sites, but hopefully you're smart enough not to do that. The real concern is other site owners, webmasters, marketers, or black hat SEOs – mostly we're talking about black hat SEOs, spammers, and even people doing very illegal things – working to bring down your website in the rankings or to even take your website offline.

There are classic types of things, like malware, hacks, and injections. So this is the first one I'm going to talk about. Basically, what we're saying here is that you've got your site, it has some pages on it, and hackers may find security vulnerabilities in your site or in your FTP logins. It may be a WordPress install. Earlier this year I had a hacker essentially come in and inject spam and malware onto my personal blog at RandFishkin.com/blog. The idea is that they will inject spam, sometimes links to spam, sometimes very subtly. They will make changes to your site. One of the classic examples of this is someone going and editing your robots.txt file to block Googlebot or to restrict all IPs from a certain range, those kinds of things. Obviously, that's going to take your site out of the search engines. Or they inject viruses or malware that will install itself on computers that visit you.

Unfortunately, I was actually visiting MozCation.com, which Gianluca Fiorelli, one of our Pro members from Spain – he's Italian but lives in Spain – had set up last year to promote MozCation in Barcelona. Unfortunately, it looked like some spammers had injected some malware on that site, and it had been on there a little while. I think he's taken care of it now, but these are the types of problems. What you'll see is a download will go into your cache, and sometimes Microsoft Security Essentials will alert you that that's happened, hopefully if you've got it installed. So this is something to watch out for. You want to close those security holes.

The other kind of thing to watch out for is spam reporting. Unfortunately, a lot of people in the SEO-sphere still do manipulative kinds of link building. Obviously, most of the people who watch Whiteboard Friday are not in that group, but some of you probably are. Maybe you buy a few directory listings. You go on Fiverr and you buy some cheap links. You find some spam through some forums that potentially works. You're doing the sorts of things that are on the grey hat/black hat borderline in terms of link acquisition, and sometimes you will see that your competitors might spam report you. So this guy's going to go over to Google, and maybe he'll leave a thread at the webmaster forums, or he'll send it through a spam report in his Google Webmaster Tools. I think Google said they get tens of thousands of spam reports each month – actually fewer than I'd expect, but a lot of people do report spam to Google. These might be your competitors. These might be other webmasters. They could just be random people on the Internet who are like, "Why is this site ranking here? This looks terrible. I don't like this."

When this happens, Google might take a closer look at your backlinks, and obviously this might bring you down. There are arguments about the ethics inside the search engine industry. Personally, I think that removing low quality crap from the Internet is all of our jobs, and I like to be part of that. I think that it’s a good thing to make the Internet a better place, and if you’re not making the Internet a better place, I hope that you’re not doing web marketing because it makes the rest of our industry look bad.

However, certainly reasonable minds can disagree. Aaron Wall from SEO Book, whom I highly respect, who I grew up with in this industry and think the world of, takes the complete opposite view. He thinks that because I support disclosing spam and manipulation to Google and to search engines, that makes me a bad person. That's too bad. That's frustrating, but I think reasonable people can disagree. Whatever side of this you're on, you should at least be aware that this stuff happens and know that it's a potential risk, particularly if you're doing highly manipulative things.

The last one I want to talk about is actually the biggest one, probably the most important, and the most salient and relevant to what we've been discussing today. That is pointing nasty links at your website. This is something a lot of webmasters have been discussing actively over the last couple of weeks, essentially kicked off by a forum thread on Traffic Power Forum. I haven't previously spent a lot of time there, but it's a very active forum populated by a wide mix of white hat folks, grey hat folks, and some pretty dark black hat folks, which I'll show you in a minute.

Two members there, Jammie and Pixelgrinder, hit two different websites. One is SEOFastStart.com, owned by Dan Thies. Dan, of course, was an early keyword research guru in the SEO space and a big industry mover and shaker who spoke at a lot of the early Search Engine Strategies conferences. I've met him a number of times; really good guy, solid guy. He complimented Matt Cutts, the head of Google's webspam and search quality team, over Twitter for knocking out some spam. Some people on the forum felt that it was, I don't know, in poor taste. Right? Essentially they felt that because he was being complimentary to Google for kicking out webspam, he should then be the target of this negative SEO. The other site was NegativeSEO.me, which was essentially a website offering services to get someone banned from the search indices, and that's a little concerning in and of itself.

Now the thing that's interesting about these sites, and Dan admitted this about SEOFastStart: it's not a very big site. Right? Not a lot of great brand or link signals. Potentially some small amounts of not wholly white hat types of activity already happening around these sites. So we're not talking about (a) big brand sites, or (b) sites that have no idea about the SEO world, aren't doing anything manipulative, and are clean as the driven snow. These are a little off that track. Both were hit by these guys, at least presumably, according to the forum thread, and lost a lot of their rankings.

When I say hit, what I mean is this type of thing happens. So here's your site.com up here. Right? Essentially, you've got some nice white hat, editorially given, earned links, high quality stuff, and that's great. Then there's this dark cloud of black hattery: spammy, manipulative links. They talked about a number of things – XRumer blasts, buying links on Fiverr, buying links from link networks, pointing links at this site that they had seen get other sites penalized – and essentially triggering this loss of rankings. Now, the sites didn't get banned from the index, but they fell. I think Dan Thies' site in particular fell from ranking #1 for his personal name to number 30 or 35, somewhere around there, and hits like that were similar across both these sites.

The second example was another forum thread started by a user with the user name Negative SEO, and that was for the domain JustGoodCars.com. Now again, Just Good Cars unfortunately looks like they were doing a few things that might be construed as manipulative even prior to this attack by the Negative SEO guy – some links of questionable sources or acquisition methods, and a big network of websites all pointing back and forth to each other from many different pages on many different sites. I guess this website had been complaining in the Google webmaster forums about some other sites outranking them, so this person took it upon themselves to do some pretty nasty, evil stuff.

Now I can't support this in any way. I'm frustrated that unfortunately this is a part of our world. But you should be aware of it, because what they did was creative, almost to the point of ingenuity, but definitely dark and evil, maybe even bordering on illegal depending on the jurisdiction. I'm not really sure. Here's what they said they did. Of course, I can't prove that they actually did these things, but here's what they said they did. They did a lot of manipulative, nasty backlinking to the site from a lot of those sources we talked about. They mentioned a few XRumer blasts. They posted a lot of duplicate content. They set up fake WordPress splogs – essentially spam blogs – and then they re-posted the content that existed on JustGoodCars.com on tens of thousands of pages across the Web so that Google might say, "Oh, well why is this duplicate content?" I don't know that that's actually highly concerning in and of itself. A lot of people copy content from all over the Web for both good and bad reasons.

Then they did something that's really nasty. They went to Fiverr and asked people to post fake reviews to Google Reviews to make it look like Just Good Cars was manipulating Google Reviews, and actually got them thrown out of that program. According to the forum post, anyway, that's what happened. They got their stars, their Google Reviews, and their ratings removed, and all that kind of stuff, which – whew, that's really low. That sucks if that's what really happened.

Even more terrifying, they sent fake emails. They set up email addresses that looked like they came from Just Good Cars, and sent fake emails to websites that had posted good, editorial, positive links, saying, "Hey, you should stop linking to this site. There are these problems with it. We're requesting a DMCA takedown action against it. Our attorneys will be in touch if you don't remove your links." Those kinds of things. So really just, oh man, that's really evil. But it's stuff that we definitely need to be aware of in terms of the world of negative SEO and the kind of thing that can happen.

Now, it's very tough to verify anonymous posts on a forum and whether all of this stuff actually happened, but certainly the ideas behind it are very concerning. What I want to express today is that there are some things you can do on your site that will make you higher risk or lower risk for these kinds of attacks.

Higher risk is going to be, like some of these other sites, if you've already done a little bit of manipulative linking. Right? You've already done some spammy stuff. You have manipulative on-site stuff – meaning, for example, like Just Good Cars, there's that footer with all these links pointing to all these other places. This was mentioned in the forum thread, so I'm not giving away new information here, but there's stuff on this site that looks like it might be not wholly kosher, not wholly white hat.

Another higher risk factor: your site has few high quality brand signals. High quality brand signals are things like lots of people searching for your domain name and brand name; lots of mentions of you in high quality news and press outlets; lots of offline signals; lots of user and usage metrics; lots of verification kinds of things; and using high quality providers of everything from the IP address where your website's hosted, to the domain registration, to the services you might have installed on your site – Akamai or any of the CDN networks suggest you're very popular. Any type of signal like this that looks like a highly brand-intense signal.

Lower risk is going to be the opposite. Right? Things like a totally clean backlink profile: you've never done any kind of manipulative linking, at least not intentional outbound backlink building. Don't forget, everyone's going to have some spam links. Even if you've never done any manipulative backlinking, or any backlinking or marketing of any kind, you will have some bad backlinks, because there are all sorts of weird crawlers and bots on the Web that post links all over the place. It's fine. Don't sweat those. That's the normal volume. Things like having a beautiful, elegant, high quality UX. A great UX is a fantastic defense against a lot of spam and manipulation. It's even a great tactic for folks who are trying to do SEO. It's just a great signal in general. Right? Having a great UX is going to get you more conversions and more people using your site. Anyone who is browsing your website – say, from the Google Search Quality team or the webspam team, or the reviewers Google hires, or from Bing – is going to say, "Oh, this is clearly a great site. We want to have this in our index."

If you review some of these other sites – you can take them or leave them – there's a feel of a site that's been heavily SEO'ed. I think you all know what I mean. There's sort of that sixth sense of, boy, they're doing a lot of things on the page and off the site that don't feel natural, don't feel like they're for users. Whenever you have that sixth sense around a site, that's going to put you in a higher danger category. Not doing that, having that very natural sort of site – you can target keywords, do a good job with your titles, your content, and your internal linking, but make it feel very natural. I'll give you good examples. Amazon: very well SEO'ed, but doesn't feel SEO'ed. Zappos: doesn't feel SEO'ed. Even SEOmoz doesn't feel very SEO'ed, but it's doing a good job. TechCrunch doesn't feel SEO'ed, but ranks phenomenally well.

Finally, having those strong brand signals, the branded searches, lots of people searching for your brand name specifically. Good links, good mentions, good press, good user and usage metrics, all these types of things are going to protect you from a lot of these types of spam attacks.

That being said, there's nasty stuff that other people can do. So you want to keep your eyes wide open. Make sure you're registered with Google Webmaster Tools so you can get any of these warnings ahead of time. If you happen to see an influx of really nasty looking links, you might want to send a preemptive reconsideration request to Google saying, "Hey, we don't know where these came from and we have nothing to do with this. We just want you guys to know that this is not our activity. Please feel free to disregard or not count these links." 99% of the time Google is not going to say, "Oh, these bad links that are pointing to you, we're going to count those against you and bring you down in the rankings." They're instead going to say, "Oh well, we're going to ignore these. We're going to remove the value that these pass." They're not going to pass PageRank or anchor text value or link trust, or whatever it is. Google will just count the good stuff.

I remember being in a session, probably five or six years ago, with Matt Cutts, the head of webspam for Google. He was looking at a site on his computer after a person in the audience asked about their website, and he said – I don't remember the exact figure – something like, "I see 14,000-odd links pointing to this site, but Google's actually only counting about 30 of them. That's why you're not ranking very well." Most of those links had had all the value they pass removed. So it's not that those bad links were hurting the site. It's just that Google says, "These are not going to pass any more link value."

Now, what I would suggest is, if you see stuff that looks like manipulative and negative SEO, just be careful. We are trying to do some things here at SEOmoz to help with this. One of those things: our data scientist, Dr. Matt Peters, is working with some folks here at Moz to build a large list of spam so we can do some classification, and eventually, inside the Mozscape index – which appears in Open Site Explorer, shows up in your Pro web app, shows up in the MozBar – we'll try to classify sites and say, "Hey, we're pretty sure this is spam. This looks like the kind of thing where we've pattern matched and seen Google penalize or ban a lot of these sites." We're also trying to build some metrics to show what really good, high quality, editorially given sites look like. Domain authority and page authority already exist to try to do that.

Then, we're also running some experiments where I've offered up my personal blog, RandFishkin.com, which is a relatively small site – it probably has as few links as any of these, probably fewer than Just Good Cars – to see if some of these nasty folks, who are hitting and taking down sites with negative SEO, would like to concentrate their focus on my sites. Two reasons: number one, we'd be very curious to see it happen, and number two, we can certainly afford the hit. We offered up SEOmoz as well, but most people seem to think that SEOmoz is not a good target – it won't actually be taken down.

We're going to run some experiments internally on this front as well, and hopefully we'll be able to disprove that negative SEO is a common thing that works very well. I'd hate to see an industry spring up around this. I think that this type of activity, particularly some of these really nasty things, is just an awful part of being around the black hat spam-sphere. I hope it's something we can defend against, and I hope you'll join me in contributing. I look forward to your comments. If you've seen stuff like this before, please do feel free to talk about it either anonymously or openly in the comments. I will see you again next week for another edition of Whiteboard Friday.

Source: http://www.seomoz.org/blog/negative-seo-myths-realities-and-precautions-whiteboard-friday


New Google Search Algorithm Update Targets Web Spam


Google's long anticipated over-optimization penalty is now live – except Google is calling it an algorithmic update targeting web spam, a.k.a. keyword stuffing and link schemes, and it's causing some big search ranking upheavals in the process.

Webspam Algorithm Update

Google's Distinguished Engineer Matt Cutts, head of the web spam team, yesterday announced that Google had pushed out the new algorithm. Cutts said this "improvement" better identifies websites using "aggressive web spam tactics" (that have been against Google's quality guidelines for years) for the purposes of gaming their way to top spots in Google's rankings.

Cutts specifically noted that websites likely to lose rankings are those that practice keyword stuffing and sites that have “unusual linking patterns,” such as links from spun content with anchor text that is completely unrelated to the actual on-page content.

The web spam algorithm update will affect about 3.1 percent of English Google queries, but Cutts noted it would have a bigger impact in heavily-spammed languages, such as Polish.

Additionally, Cutts emphasized the importance of "white hat" SEO in his post, as well as the importance of creating great websites filled with high-quality, compelling content that provides a good user experience. Google's guidance on high-quality content consists of these 23 questions you should ask yourself when evaluating website content.

SEO by the Sea has a good rundown of Google’s patents related to combating web spam.

Early Assessment of Damages

There are lots of theories floating around at the moment about what types of sites took the biggest hits, but it seems a bit premature to draw conclusions with so many conflicting reports. It seems a few "innocents" may have been caught up in this (though, honestly, it's easy to blame Google for not ranking your site), and some pages that shouldn't be ranking now are, according to various reports in forums since last night.

Some are arguing that Google’s results are worse now. If that sounds familiar, many people were saying the same thing after Panda launched last year. Pretty sure those who saw their search rankings increase aren’t complaining.

[Screenshot: Google search results for "make money online"]

Granted, there are some cringe-worthy search results, such as [make money online], that some are pointing to as proof that Google's latest rollout is a miserable failure. The top organic result is a completely empty blog (that same blog currently ranks third on Bing for the same search). (UPDATE: I'm no longer seeing this result on Google.)

Regardless, Google favors branded websites, and early reports seem to indicate that those with a good link profile have survived this storm. This update shouldn't be too shocking considering Google has been deindexing blog networks and flagging "unnatural" links. And because of these link evaluation changes, negative SEO – where a competitor buys bad links and aims them at a competitor's website to harm it – has become a big concern for many people.

Searchmetrics has released a preliminary analysis of search visibility winners and losers from the update, concluding that aggregators and template-based websites are among the biggest losers. As always, however, it's best not to put too much stock in these lists.

A Lot of SEOs are Freaking Out Because of Over-Optimization

The "over-optimization penalty" became the equivalent of an SEO ghost story over the last several weeks, ever since Cutts made his comment at SXSW and SEOs began echoing the Gospel of Matt, who warned that thou shalt not do "over optimization" or "overly" do SEO.

When Google’s Panda update launched, people were upset, as it inflicted a lot of financial damage by wiping out rankings and traffic. But it seems some people still haven’t learned one of the biggest lessons that came from Panda: you can’t rely on Google as your sole source of traffic and income. That’s a doomed business model. There are plenty of other marketing tactics, including PPC, social media, email, and video.

Reminder: you aren’t guaranteed a number one spot in Google or any search engine. You have to work at it.

Chasing an algorithm isn’t a winning marketing strategy. Stop chasing taillights. Drunks chase taillights.

The below image, from a Warrior Forum thread, sums up the never-ending loop that SEOs can get caught in with this strategy:

[Image: the never-ending algorithm-chasing loop]

Source: http://searchenginewatch.com/article/2170391/New-Google-Search-Algorithm-Update-Targets-Web-Spam

How We Managed to Benefit from the Panda Updates

As I work in the online marketing field, I read a lot about SEO. This is my first post about SEO, so please don't be harsh in the comments. The Panda update made the SEO community roar about how many websites lost rankings and so on. There is very little information about the sites that benefited from the update, and we are one of the winners.

I personally think that the Panda update made SERP quality a lot better and, to some extent, buried the medium to low quality websites deep in the results. Even some of the high-authority websites went down.

I will share some insights from a user-generated moving reviews website, MyMovingReviews.com, and how we were positively impacted by the Panda update. The website features many US and Canadian moving companies and gives people the opportunity to rate them and write moving reviews. In addition, there is a blog/article section with moving tips and info.

Industry specifics that influence the analytics data

Before we begin, you should know that the specifics of the industry add some additional noise to the analytics data. These are the main trends in the moving industry:

  • Weekly trends: People search a lot more for moving services at the beginning of the week, on working days. Mondays are usually the most active days. We assume that people usually search for movers at work, during work hours.
  • Monthly trends: People search for movers more toward the end of the month, and less in the middle of the month and during holidays.
  • Seasonality: People search 30% more for movers in the summer months than during the rest of the year. Nobody wants to move in the winter (especially in the Northern states).

The First Panda update

Since the first Panda update in 2011, we started seeing some increase in rankings. Because of the specifics of user behavior in our industry, the analytics data looks weird, but you can see the pattern.

[Image: MyMovingReviews analytics after the first Panda update]

Further benefits from the Panda update

As we saw a huge opportunity in the Panda update, we tried to adjust the website to better suit visitors, give them alternatives once they arrive, and get them to consume more moving industry related content. The goals were to increase the time on site, reduce the bounce rate, and increase the pages per visit.

What we did to increase rankings/visitors

1. Reducing the bounce rate

We started by working on the high bounce rate pages. We edited some of the content and deleted some of the pages. Some of the highest bounce rate pages were the blog posts. Since we are committed to building only high quality content, we knew that the problem with the blog's high bounce rate was elsewhere: visitors were able to find the information they were searching for, and after that they were leaving the blog. So we added a suggestion fly-box. The box appears on the right side of the page once the visitor scrolls near the end of a post and suggests another random post from the blog. This had a huge impact on the blog bounce rate, lowering it by more than 30%. From the highest bounce rate section of the website, the blog became the lowest one overnight.

2. Creating a mobile website

We have about 11 percent mobile visits (we don't consider iPads to be mobile traffic). We decided to further lower the bounce rate by creating a full-featured mobile website, which of course also brings the benefit of higher conversion rates. We had been postponing the mobile website for some time, and we finally decided to finish it and launch it by December. We kept the same URLs as the desktop version and only changed the templates.

3. More content

Part of what the Panda update evaluates is the amount of content on the page. We didn't want many pages with thin content, so we increased the minimum text required for a moving review to be posted. After reading about how Zappos corrected the spelling mistakes in all their reviews, we also wanted to avoid spelling mistakes as much as possible, so we included a spell checker on the moving review form. We are also planning to correct the mistakes in all old reviews in the future.

To recap, here are the changes we made:

  • Editing some of the content with the highest bounce rate.
  • Adding a spell checker on the write a review page and setting a higher minimum amount of text for the reviews.
  • Giving suggestions to users once they finish reading a blog post to reduce the bounce rate.
  • Starting a mobile website to reduce the bounce rate and increase time on site for mobile visitors.

The results

We had an almost 50% increase in visits over the next one to two months. Please note that we introduced most of the changes in December, so we can't really measure how fast these changes influenced the rankings because of the holidays. Not surprisingly, the largest part of the increase came from the blog, as this is where we managed to reduce the bounce rate the most.

[Image: MyMovingReviews traffic increase]

Conclusion

I can't say that all of the increase in visitors came from the above changes, but of the changes and tactics we applied at the time, these were the most significant. Targeting the visitor and thinking about how to enhance the customer experience results in more visitors. It is as simple as that. Working on the design and thinking of techniques to reduce the bounce rate will result in better rankings, especially if you are a high-traffic website.

Source: http://www.seomoz.org/blog/how-we-managed-to-benefit-from-the-panda-updates

2 Blog Commenting Guidelines – SEO Video Lesson

In order to help your blog comments go live, it’s important that you leave a real response. “Good job!” or “great article” tell the blogger that you are just in it for the link, and don’t really have anything to add to the conversation. It’s also worth incorporating links to your social profiles (in addition to your company website) to help garner more fans and followers.

Watch this week’s SEO video lesson here!

For more link building tips and lessons from Nick Stamoulis, check out the Brick Marketing link building video lesson archive.

Source: http://www.searchengineoptimizationjournal.com/2012/04/23/blog-commenting-guidelines/

Creating an “Opportunity Report” – PPC

Creating an “Opportunity Report” to Show How to Get More Out of Your Accounts

A lot of marketing managers and C-level executives I run across really aren't sure how good their AdWords account is. That's to be expected because, in general, the people asked to evaluate performance aren't experienced enough in PPC to know the ins and outs. Paid search is pretty complex nowadays, and it's simply unfair to expect anyone outside of the account manager to be knowledgeable enough to evaluate performance accurately. So, how does the account manager show they're best of breed during evaluations? Do they depend on Google's overly simplified and vague 'analyze competition' tab? Can they trust Google account reps whose focus is spread across dozens of accounts and who, in all likelihood, have been in PPC only a year or two? Do they bring in a third party that knows nothing about the business for an account audit? What about using something like Compete or Hitwise?

Working at an agency, we’ve learned (the hard way) that none of these are ideal options. So, as in most things related to PPC, I prefer a less passive approach. Here’s how you can construct your own “Opportunity Report” that gives an accurate, detailed view of your account performance – and highlights the quickest ways to improve it.

Step One:

Structure your account in a way that results in the most reliable performance metrics – including ad group impression share.

Google is really doing a bang-up job of giving marketers more data. The biggest leap forward in this area, in my opinion, has been ad group-level impression share data. This information will be a vital component of the account's opportunity report. In order to leverage ad group-level impression share data, however, you need to make sure you're working with a valid metric that can be applied all the way down to the query level. Hence, the following must be true:

  • The ad group must contain only one keyword
  • The keyword in the ad group must be in exact match (i.e. the keyword is also required to be the query)
  • Keyword-to-query mappings must be forced by utilizing negative match (i.e. when a user searches a given query, that query can only be sent to the keyword in your account)

Because of all of these requirements, you should limit yourself to structuring things in this manner for only the high-volume queries/exact match keywords (PPC Associates calls these 'Alphas'). I tend to use some sort of conversion-based cut-off to determine where to draw the line on high-volume queries (for instance, the query must have 10+ all-time conversions in the Search Query Report). For most accounts, you can set up at least 70% of your traffic to go through this keyword setup without the structure becoming too large to work with.

One result of this setup is that you now have query level impression share data.
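As a rough illustration, here's a minimal sketch of the conversion-based cutoff described above, run over an exported Search Query Report. The file name, column names, and 10-conversion threshold are assumptions for illustration; adjust them to match your own export.

```python
import csv

# Hypothetical export columns; rename to match your actual Search Query Report.
QUERY_COL = "Search term"
CONV_COL = "Conversions"
MIN_CONVERSIONS = 10  # the all-time conversion cutoff for 'Alpha' queries

def find_alpha_queries(report_path):
    """Return the high-volume queries that deserve their own
    single-keyword, exact-match ad group."""
    alphas = []
    with open(report_path, newline="") as f:
        for row in csv.DictReader(f):
            if float(row[CONV_COL].replace(",", "")) >= MIN_CONVERSIONS:
                alphas.append(row[QUERY_COL])
    return alphas

if __name__ == "__main__":
    for query in find_alpha_queries("search_query_report.csv"):
        print(query)
```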

Step Two:

Translate the 'Impression Share' report into a 'Lost Impressions Report' for the keywords in the campaigns described in the first step.

I was a high school math teacher in a former life, and I used to tell my students 'fractions are your friends' all the time (this tended to make them rather annoyed, which was a nice bonus). But as it relates to creating an opportunity report, fractions aren't your friends! To quantify and sort by opportunity, you need to know how many impressions were lost, not what percent of impressions were lost (volume matters!). To do this, take your ad group-level impression share (which, if you've set up as described in step one, is query-level impression share at this point), and apply some math.

Here’s how:

  • Google tells you how many impressions you have, your impression share, and your lost impression share
  • So, say you have 80% impression share and 1,000 impressions. This means ‘80% of the total number of impressions is 1,000.’
  • In an expression, that is 0.80X=1000, where X is the total number of impressions
  • X = 1000/0.80 = 1250. This means there were 1,250 impressions available (i.e. 1,250 searches were performed over your report's given time period)
  • The number of lost impressions is your lost impression share times the total number of impressions. In this case, 0.20(1250) = 250
  • Last, divide the number of lost impressions by the number of days in your date range to create a new metric of ‘lost impressions per day.’

There are several other equivalent ways to calculate the number of lost impressions, but however you choose to do it, you should end up with a nice 'lost search query impressions report' for the keywords in campaigns that follow the structure described in step 1. I just used Google.com data here (an available segment) and sorted by 'lost impressions per day on Google.com.'
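If you'd rather script the arithmetic than do it cell by cell in Excel, it translates directly into a few lines. This is just a sketch of the math above; the function name and argument conventions are mine.

```python
def lost_impressions_per_day(impressions, impression_share, days):
    """Back out total available impressions from impression share,
    then compute lost impressions and normalize per day."""
    total_available = impressions / impression_share  # 1000 / 0.80 = 1250
    lost = total_available - impressions              # 1250 - 1000 = 250
    return lost / days

# Worked example from above, assuming a 30-day report:
print(lost_impressions_per_day(1000, 0.80, 30))  # ~8.33 lost impressions per day
```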

From there, it's pretty easy to determine why the impressions were lost. If your PPC account is on par, it'll always be because you weren't able to bid high enough – a higher bid would have resulted in a CPA (or ROAS, ROI, etc.) worse than allowable based on the account's goals.

Step Three:

Create an 'Other Impressions Report' for the keywords in the campaigns described in the first step.

The lost-impressions-per-day report is pretty neat, but it’s only half of the ‘opportunity report.’ The rest can be formed by leveraging ‘top vs. other’ data (if you need a reminder, here’s why getting the top spots is so important).

Here’s how to do this:

  1. Export a keyword (or ad group) report with a 'top vs. other' segmentation for campaigns structured in the manner described in step 1.
  2. Only use ‘Google.com’ data. Partner search data is, unfortunately, going to skew the results in a massive way! (Many partner sites don’t have ‘top’ impressions available.) At the very least, create a different report for ‘Google.com’ and ‘Google search partner’ data.
  3. Divide the number of ‘Google.com other impressions’ by the number of days in your date range

The result should look like this (after a bit of Excel formatting).

Again, at this point you should be able to create a rather clear narrative at the query level as to why you've had the given number of 'other' impressions. Hopefully the reason an ad isn't appearing on 'top' is the inability to raise bids based on the account's current goal set.
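A sketch of step three in the same scripted style. The segment labels and column names here are assumptions; check them against your actual 'top vs. other' export, which (per point 2 above) should already be filtered to Google.com data.

```python
import csv
from collections import defaultdict

DAYS_IN_RANGE = 30  # length of your report's date range

def other_impressions_per_day(report_path):
    """Sum non-'top' impressions per keyword and normalize by the
    number of days in the date range."""
    other = defaultdict(float)
    with open(report_path, newline="") as f:
        for row in csv.DictReader(f):
            # Assumed columns: "Keyword", "Top vs. Other", "Impressions".
            if "other" in row["Top vs. Other"].lower():
                other[row["Keyword"]] += float(row["Impressions"].replace(",", ""))
    return {kw: imps / DAYS_IN_RANGE for kw, imps in other.items()}

if __name__ == "__main__":
    for kw, per_day in other_impressions_per_day("top_vs_other_report.csv").items():
        print(f"{kw}\t{per_day:.1f}")
```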

Step Four:

Create the finished ‘opportunity report’

If you'd like to merge these reports, you'll need to modify your 'lost impressions report' into a 'lost impressions report on Google.com,' like I did (so you're not mixing data that includes partner traffic with data that doesn't). If you do this, you'll end up with a lot of really great data that you can trust! Since along the way you've created a narrative as to why each keyword is in the circumstance it's in, the merged results represent opportunities that could be captured if any of the following happened:

  1. The goals of the account changed, allowing higher bidding
  2. The conversion rate improved, allowing higher bidding
  3. Your CTR improved through better ad copy, which leads to higher ad rank.

Of these, only the CTR component is directly controlled by the account manager – meaning the variables for which you’re responsible have been limited to one.

Here’s an example of how you can merge the data and create a hypothetical output. In this hypothetical, I wanted to know what would happen if we had 100% query-level impression share for high-volume queries, and all these impressions were on the ‘top.’

If you plan on creating a merged report like this, make sure you’re a VLOOKUP master! You need at least an intermediate knowledge of Excel to keep the ‘opportunity report’ from turning into a long project. Additionally, make sure you keep using query-specific metrics, since you’ve got conversion rate and CTR information for each query from the original exported reports.
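If Excel VLOOKUPs get unwieldy, the same keyword-level join is a one-liner in pandas. This is a sketch under assumed file and column names, not a prescribed format:

```python
import pandas as pd

# Hypothetical CSVs produced in steps two and three.
lost = pd.read_csv("lost_impressions_google_com.csv")    # Keyword, Lost impressions/day
other = pd.read_csv("other_impressions_google_com.csv")  # Keyword, Other impressions/day

# The pandas equivalent of a VLOOKUP keyed on the keyword column.
merged = lost.merge(other, on="Keyword", how="outer").fillna(0)

# Hypothetical upside if every lost or 'other' impression became a top impression.
merged["Opportunity/day"] = (
    merged["Lost impressions/day"] + merged["Other impressions/day"]
)
print(merged.sort_values("Opportunity/day", ascending=False).head(20))
```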

What's great about a report like this is the ability to quantify the value of something like improving the conversion rate (this could also help you determine how much to invest in things like funnel optimization). I've run this exercise with one of my clients and concluded that if the client could improve conversion rate by 100%, we could get 300% more conversions at the same margin per sale. The client held up their end of the bargain, and I came back with 310% more conversions! So, while fractions might not be our friends, more data certainly is.

Happy ‘opportunity report’ creating!

Source: http://www.searchenginejournal.com/creating-an-opportunity-report-ppc/42726/

24 Eye-Popping SEO Statistics

SEO isn’t exactly the most understood medium in online marketing. For one thing, people aren’t too sure what SEOs do exactly. Then there is talk about how SEO is converging with social media, content marketing, usability, and more.

It’s not exactly the most well regarded marketing channel either:

Every now and then, you'll hear from jaded webmasters who have invested in SEO a few times and wasted their money. Then you'll hear discussions about black hat versus white hat SEO. As a webmaster, you don't have time to worry about all this chatter.

That’s understandable. You need to get the important things done.

To start things off, you should get your minimum viable SEO right and then move onto building your site.

You can turn to the more advanced SEO tactics to really help make some strides with your traffic. And take the word 'advanced' with a grain of salt; it's not that bad once you get into the flow.

Keep the following statistics in mind as a reminder that investing more into your SEO efforts later will pay off handsomely:

SEO

  1. Content marketing rocks. MarketingSherpa reports that content distribution led to a 2,000% increase in blog traffic and a 40% increase in revenue.
  2. 70% of the links search users click on are organic.
  3. 70-80% of users ignore the paid ads, focusing on the organic results.
  4. 75% of users never scroll past the first page of search results.
  5. GroupM states “when consumers were exposed to both search and social media influenced by a brand that overall search CTR went up by 94 percent.”
  6. Search and e-mail are the top two internet activities.
  7. Companies that blog have 434% more indexed pages. And companies with more indexed pages get far more leads.
  8. Inbound leads cost 61% less than outbound leads. An example of an inbound lead might be one from search engine optimization; an outbound lead might be one from a cold call.
  9. 81% of businesses consider their blogs to be an important asset to their businesses.
  10. A study by Outbrain shows that search is the #1 driver of traffic to content sites, beating social media by more than 300%.
  11. SEO leads have a 14.6% close rate, while outbound leads (such as direct mail or print advertising) have a 1.7% close rate.
  12. For Google, a study from Slingshot SEO shows 18% of organic clicks go to the #1 position, 10% of organic clicks go to the #2 position, and 7% of organic clicks go to the #3 position.
  13. In that same study, tests for Bing show the following: 9.7% of organic clicks go to #1, 5.5% of organic clicks go to #2, and 2.7% of organic clicks go to #3.
  14. 79% of search engine users say they always/frequently click on the natural search results. In contrast, 80% of search engine users say they occasionally/rarely/never click on the sponsored search results. Here’s a look at what the natural (blue) and sponsored search results (red) look like:

Search

  1. Google owns 65-70% of the search engine market share.
  2. 93% of online experiences begin with a search engine.
  3. MarketingCharts reports that over 39% of customers come from search.
  4. The search engine industry is estimated to be worth more than $16 billion.
  5. There are over 100 billion global searches being conducted each month.
  6. 88.1% of US internet users ages 14+ will browse or research products online in 2012.
  7. Search directly drove 25% of all online U.S. device purchases in 2010.
  8. 82.6% of internet users use search.


Source: http://www.searchenginejournal.com/24-eye-popping-seo-statistics/42665/

Linking 2012 – Effective Link Building Techniques in 2012

With Google's constant effort to improve their search results, their algorithm is evolving to better distinguish organic from spammy link building. Since links are still (and will continue to be) a crucial ranking factor for websites, SEOs are focusing efforts on strong organic link building methods, which in the short term are relatively tougher to track but in the long term are extremely valuable to a website's overall domain strength, in turn increasing its SEO market share.

Organic or natural link building is the method of building high quality links irrespective of the anchor text or landing page. When I say it's relatively tougher to track, I mean from a pure ranking perspective – since we're acquiring a large volume of links across the board, it becomes difficult to home in on the effectiveness of the campaign. However, there are domain strength signals that can be tracked which give you an idea of the campaign's overall success.

Before we discuss the tracking methodology, let's dive into the three effective organic link building techniques:

1. Link Reclamation

Link reclamation is the process of winning back links that you once had but that are now broken due to changes in your website, or that point to pages via 301/302 redirects that don't pass the full value of the link authority.

So how does link reclamation work? We follow a scalable process that helps identify and acquire broken links. Here’s how it works:

A). Phase 1: Discovery

We analyze the links from three main tools – OpenSiteExplorer, MajesticSEO, and Google Webmaster Tools. We form one consolidated Excel sheet that has all the links with duplicates removed. Since Google only gives us the URL of the linking page, we need to run our (AdLift's) internal scraper tool to give us more information on the linked page and anchor text. Here's the list of data we need to help with the outreach process:

  1. AC Rank of Linking Page (From MajesticSEO)
  2. No. of External links pointing to Linking Page (From MajesticSEO)
  3. Linked Page (Internal Scraper)
  4. Anchor Text (Internal Scraper)
  5. HTTP Status Code (Screaming Frog SEO Spider)

The first two data sets (AC Rank and no. of external links) help us identify high value sites.

On average, we see that out of all the links pointing back to a particular website, 15-30% of the links are 301/302 and 3-7% are 404s. Depending on the size of your website, this is a substantial link acquisition opportunity. (Data collected through a sample size of 10 websites.)

Next, we append the target linked pages/anchor text for URLs that are either broken (404s) or redirects (301/302). We're now ready for the outreach process.
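As an illustration of the discovery phase, here's a minimal sketch of checking the HTTP status of each linked page from the consolidated sheet and bucketing the reclamation targets. The CSV column name is an assumption, and a real crawl would also want rate limiting and retries:

```python
import csv
import requests

def segment_links(consolidated_csv):
    """Bucket linked pages into redirects (301/302) and broken links (404s),
    the two link reclamation opportunities described above."""
    redirects, broken = [], []
    with open(consolidated_csv, newline="") as f:
        for row in csv.DictReader(f):  # assumed column: "Linked Page"
            url = row["Linked Page"]
            try:
                resp = requests.head(url, allow_redirects=False, timeout=10)
            except requests.RequestException:
                continue  # unreachable hosts are a separate cleanup problem
            if resp.status_code in (301, 302):
                redirects.append(url)
            elif resp.status_code == 404:
                broken.append(url)
    return redirects, broken

if __name__ == "__main__":
    redirects, broken = segment_links("consolidated_links.csv")
    print(f"{len(redirects)} redirecting links, {len(broken)} broken links")
```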

B). Phase 2: Outreach

The outreach process is probably the most time consuming, but that's because it's the most rewarding. Here's where you segment the above data by 301/302 and 404s and reach out to the webmasters linking to broken pages. The conversion rates (20-40%) on fixing broken links are high because it's a win-win for both the website that's currently linking to the broken page and, obviously, the linked website.

C). Phase 3: Tracking

The last piece of this campaign is tracking how effective it has been. There are several ways of doing this, but one that validates the effectiveness is tracking the overall domain and page authority (OpenSiteExplorer) of the pages for which links were reclaimed. For projects we've worked on, we've seen this number move in the positive direction within 30-45 days of all links being reclaimed. Alternately, if you reclaim broken links for specific anchor text, you can track the rank/traffic improvements over time.

Overall, getting into the numbers: if you have a link profile of 10,000 links, it's safe to assume 5% are 404s and 25% are 301/302 – that's 3,000 links that have a high potential of converting.

2. Competitive Link Audit & Acquisition

The competitive link audit and acquisition is similar in some ways to link reclamation in that it involves understanding the link profile of a particular website, and it uncovers unique link targets. There are a number of tools you could use to help you analyze the competition's linking strategies. Here are some important graphs that help analyze the competitive link landscape.

A). Domain and Page Authority Segmentation (Tools used: OpenSiteExplorer or MajesticSEO)

This is a segmentation of all incoming links to a competitor website by domain and page authority.

The above graph shows us that a majority of incoming links have a domain authority between 50 and 60.

The above graph shows us that a majority of incoming links have a page authority of less than 40.

B). Non-Brand Vs. Brand Anchor Text Segmentation (OpenSiteExplorer or MajesticSEO)

C). GEO Linking profile (Tool used: Blekko)

The GEO linking profile is particularly helpful in analyzing the link profile of global brands. It helps uncover unique linking opportunities for international sites, where building quality GEO-specific links becomes tough.

The example below shows Acer Germany's link profile, where a large majority of inbound links are coming from Germany – which is great!

Source: Blekko.com http://www.acer.de/ /seo 

I'll admit that these techniques are very time consuming, but gone are the "get rich quick" days of SEO. To build up your domain strength with quality links and garner a greater share of SEO traffic, this level of effort is a must – and it definitely pays off!

Source: http://www.searchenginejournal.com/linking-2012-effective-link-building-techniques-in-2012/42674/