Wednesday 28 August 2013

2 Million Backlinks and 15 SEO Answers from Google’s Matt Cutts

Build Great Backlinks


[Image: 2 million links]

I’ve covered some pretty controversial topics in recent weeks when it comes to SEO: revealing I’m a scammer, showing how freshness is being abused, and then doing a follow-up blog post to show the same again. Though at times it may seem like I’m being a bit harsh on Google, I have acknowledged that their staff must have one of the hardest jobs in the world: defending their system against thousands (millions?) of people who are focused on nothing more than gaming it on a daily basis.

I still feel, though, that the search results from 2011 and 2012 were just far better than what we’ve seen in 2013. Google should not be getting tricked by people simply changing the date on a blog post so that it suddenly looks fresh and deserves better rankings. There’s also no doubt in my mind that YouTube has a huge algorithmic preference over other video platforms like Vimeo, Wistia and DailyMotion, no matter what Google say about keeping things fair.

My original plan with this blog post was just to share the answers from the Webmaster videos that Google publish on their channel, primarily hosted by their head of web spam, Matt Cutts. Then my brain went off on a bit of a tangent and I wanted to cover how much you can really trust what Google themselves preach. Matt is someone who has no doubt made millions of dollars from his Google stock options (he joined 13 years ago), so my only conclusion is that he genuinely enjoys his job and the intelligent people he gets to work with every day.

[Image: Google in the early days]

On the face of it, you have to give Google credit for making the effort. They don’t really have to give any advice to SEOs or webmasters; they could just ignore the whole lot and I honestly don’t think it would impact their search market share. Good luck convincing your friends outside of search to start using Bing or Yahoo anytime soon. On the other hand, it’s also clear that most of the videos are a PR exercise for Google, encouraging webmasters to build Google’s ideal internet and make their own jobs easier.

The biggest thing that stands out to me from all of these questions and answers on their YouTube channel is just how scared of SEO the average webmaster seems to be. So much talk of penalties, the disavow tool and updates like Penguin and Panda has put the ‘mom and pop’ site owner on the back foot, worried about doing literally anything to their site.

15 SEO Answers Directly from Google’s Matt Cutts

[Image: Google’s Webmaster Help videos]

Whether they give you information you can actually use, or advice you should take with more than a grain of salt, is debatable, but I recommend that every internet marketer who focuses on search at least watch and read the information Google put out there. It can be a little tedious to go through lots of five-minute videos from their Webmaster Help channel, so I did it for you and put the answers together here.

Is load speed a more important factor for mobile? Is it really something that can change your rankings, all other things being equal?

“All things being equal, if your site is really really slow, we do use page speed in our rankings. All things being equal, then yes, a site can rank lower. Look at your neighborhood of websites and if you’re the outlier – your site is very very slow – then you may rank lower.

It’s not that in mobile we apply that any more or any less than for desktop search.”

What should we do with embeddable codes in things like widgets and infographics? Should we include the rel=”nofollow” by default?

“My answer is coloured by the fact that we’ve seen a ton of people trying to abuse widgets and abuse infographics. We’ve seen people with a web counter who don’t realise there are links about mesothelioma in there. I would not rely on infographics and widgets as your primary way to gather links. I would recommend putting a nofollow, especially on widgets. Depending on the scale of what you’re doing with infographics, you might want to put a rel=”nofollow” on those as well.”
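To make that concrete, here’s a hypothetical embed snippet for an infographic with Matt’s recommendation applied – the domain and file names are made up, and the point is the rel=”nofollow” on the link back to the source:

```html
<!-- Hypothetical infographic embed code; example.com is a placeholder.
     The rel="nofollow" on the anchor tells Google not to pass PageRank
     through the link that embedders place on their own sites. -->
<a href="http://example.com/some-infographic" rel="nofollow">
  <img src="http://example.com/images/some-infographic.png"
       alt="Some infographic" width="600" height="1800">
</a>
```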

What can I do if someone – like my competition – is trying to harm me with bad backlinks?

“You’ve done the right thing; you got in touch with site owners and you’ve said look, please don’t link to me I don’t want to have anything to do with your site. If those folks aren’t receptive then just go ahead and disavow those links. As long as you’ve taken those steps you should be in good shape.”
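For reference, the disavow tool Matt mentions takes a plain-text file with one entry per line: a full URL, a whole host via the domain: prefix, or a # comment recording your clean-up efforts. The domains below are made up:

```
# Asked the site owner to remove these links on 1 June 2013, no response
domain:spammy-directory.example
http://another-bad-site.example/page-with-link.html
```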

As memorable .com domains become more expensive, developers are choosing alternate new domains like .IO and .IM which Google geo-targets to small areas. Do you discourage this activity?

“You can pick any domain you want, but if you pick a domain like .ES or .IT because you think you can make a novelty domain like Google.it – “Google it” – or something like that, do be aware that most domains at that specific level do pertain to that specific content. We think that content is going to be mainly intended for that country.

There are a few country code top-level domains that are sort of generic because, for example, .IO stands for something related to the Indian Ocean but there were very few domains that were actually relevant to that. We might go ahead and say, okay, this is a generic country code top-level domain.”

How does Google treat hidden content which becomes visible when clicking a button? For example a page to buy something then a “show details” button which shows more information.

“If you’re using a tiny little button that people can’t see and there’s 6 pages of content buried in there that users can’t see and that’s keyword stuffing, then that is something we could possibly consider hidden text and probably would consider hidden text.

In general if you just have an ‘ajaxy’ sort of site and things get revealed and you’re trying to keep things clean, that’s not the sort of thing that’s going to be on the top of our list to worry about because a lot of different sites do that. It’s pretty common on the web.”
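The ‘ajaxy’ reveal Matt describes is the sort of thing sketched below – my illustration, not his example – where the extra detail is a normal part of the page that users can open on demand, rather than six pages of hidden keywords:

```html
<!-- Hypothetical "show details" toggle. The content is in the page and
     revealed to users on click, which is the benign case described above. -->
<button onclick="document.getElementById('details').style.display = 'block';">
  Show details
</button>
<div id="details" style="display: none;">
  Shipping weight: 1.2kg. Dimensions: 20 x 15 x 5cm. 12-month warranty.
</div>
```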

How does duplicate content that’s legally required (e.g. Terms & Conditions across multiple offers) affect performance in search?

“I wouldn’t stress about this, unless the content you have that’s duplicated is spammy or keyword stuffing or something like that I wouldn’t worry about it. We do understand that various places across the web do need to have disclaimers and various legal information.”

Should a customer with 20 domain names link them all together or not, and if he does, should he add nofollow to the links?

“First off, why do you have 20 domain names? *Giggles* If it’s all cheap online casinos or medical malpractice in Ohio, having 20 domain names there can look pretty spammy and I would probably not link them all together.

On the other hand, if you have 20 domain names and they’re all versions of your domain in different countries then it can make sense to have some way to get from one version of the domain to another version. Even then I wouldn’t link all of the domains in the footer, all by themselves. I would probably have one link to a country locator page on the main .com.”
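Matt doesn’t mention it in this answer, but the standard way to tie country versions together (without stuffing 20 domain links into the footer) is the rel=”alternate” hreflang annotation Google introduced for exactly this situation. The domains below are placeholders:

```html
<!-- Hypothetical hreflang markup in the <head> of each country version,
     telling Google which version targets which locale. -->
<link rel="alternate" hreflang="en-gb" href="http://www.example.co.uk/">
<link rel="alternate" hreflang="de" href="http://www.example.de/">
<link rel="alternate" hreflang="x-default" href="http://www.example.com/">
```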

A client got unnatural link warnings in September 2012 without any example links. 90% of the links were removed and I asked for examples in every reconsideration request. Wouldn’t it be better to have a live/cached list of bad links or penalties in Google Webmaster Tools?

“We’re working on being more transparent and giving as much information in messages as we can. I wouldn’t try to say “hey, give me examples” in a reconsideration request (RR) because while we will read what you say, we can only really give a small number of replies: yes, the RR has been granted, or no, you still have work to do. There’s a very thin middle ground, which is ‘your request has been processed’. That usually only applies if you have multiple web spam actions.”

If my site goes down for a day, does it affect my rankings when this happens?

“If it goes down for a day then you should be in good shape. If it goes down for two weeks then that’s a stronger indicator that your site is actually down, and we don’t want to send users to a site that is down. If it was only a short period of downtime I really wouldn’t worry about it.”

If I write about another article, where should I link to the original source?

“Whichever way you choose to do it will work fine for Google’s rankings because the link – whether it’s at the bottom of the article or in the first paragraph – still flows PageRank either way. Credit will flow to the website that you’re referring to.

As a personal preference, I prefer when a link is relatively close to the top of the article.”

Which aspect of Google updates do you think the SEO industry simply won’t get? Where do you see many SEOs spending too much energy on when they could be taking care of other things?

“One is the difference between an algorithm update versus just a data refresh. When you change an algorithm the signals that you’re using and how you weight those signals are fundamentally changing. When you’re doing just a data refresh, then the way that you run the computer program stays the same but you might have different incoming data or refresh the data that the algorithm is using.

I’ve seen a lot of accusations after Panda and Penguin that Google is just trying to increase its revenue. Let me confront that head on: if you go back and look at Google’s quarterly statements, they actually mention that Panda decreased our revenue.

A lot of people have these conspiracy theories that Google is making these changes to make more money. Not only do we not think that way in the search quality team, we’re more than happy to make changes which are better for the long-term quality of our users.

A lot of people think about “how do I build more links?” and they don’t think about the grander, global picture: how do I make something compelling, and then how do I make sure that I market it well? You get too focused on search engines and totally miss, for example, social media and social media marketing.”

Why does Google continue to present multiple results from one domain on a search result?

“In the past it was the case that you could search for ‘antique green glass’ and all of the results might be from one domain and that was kind of a bad experience. We introduced something called host clustering which means for each hostname – so like per subdomain – you only get two results.

We then saw people – spammers and a bunch of different webmasters – adapt and say “okay, do a bunch of different sub-domains and get two results from one hostname, two results from another hostname”, and they could get back to crowding up the whole results page again.

We changed things again because we do want diversity in our search results and we made it such that you can get results from one domain then another result from one domain then other results from that domain get progressively harder and harder to rank.

We then made another change not too long ago where we say – if someone is searching for rental cabins in Tennessee – and there’s a really good website about that it may be helpful to show more than just a few results from that domain.

Once you’ve seen a cluster of results from one domain, we won’t show you that domain again on subsequent pages, so that should help improve the diversity.”

What does Google think of single-page websites? There are some great websites using only a single page (and lots of CSS and JavaScript) that bring the same user experience as a regular website with many subpages.

“Google has gotten better at handling JavaScript, and a lot of times if you’re doing some different or strange JavaScript interaction, or having things fold in or out, we’re pretty good at being able to process that. In general I wouldn’t bet your SEO legacy on this one single page working well.

If it works for you and users to have that all on one page, for the most part it should work for Google as well.”

Do Google take action on sites that do keyword stuffing (with phone numbers)?

“The answer is yes, we do. We get a lot of complaints about that. When you type in a phone number and just get page after page of those cookie-cutter sites, you get really annoyed, and we hear those complaints internally within Google. We treat it basically as keyword stuffing, as you’re repeating very similar words after each other – just like someone throwing a dictionary up on the web (but with numbers).

We do consider that web spam.”

Matt’s Comments in a Video about Negative SEO

“We try to think about whether there’s a way that person A could hurt person (competitor) B. We try really hard to design algorithms which are robust and resistant to that sort of thing.

At the same time, a lot more people are thinking about their backlinks…what if people try to do negative SEO? Where they point links at a site to try and make that site rank lower. In my view there are very few people who talk about negative SEO and fewer who try it and fewer still who actually succeed.

We’ve just released a Disavow tool which allows people to upload a text file of links and say, “I would like Google to ignore these links to my site.” If you’re someone who wants to do negative SEO, it’s probably a much better use of your time to try and do something productive.”

How Much Can We Take Seriously?

[Image: Google’s answers]

After my last few blog posts, it really shouldn’t come as much of a surprise to you that spamming YouTube to death works pretty well, but it’s a shame that it happens in such prominent industries. And even more of a shame when it comes from networks that Google definitely seems to know about.

First came the tweet from Matt:

[Image: Matt Cutts’ tweet about a Russian link network]

I didn’t actually catch this at the time (hat tip to GOS), but it definitely went viral on the blackhat forums, as you can see with pretty much any Google search on the topic. Many of those who relied on Russian link networks started getting a little worried.

Further confirmation came a month later when the highly respected Barry Schwartz covered the topic on Search Engine Land. SEL is without a doubt the most respected publication when it comes to search engine news, so it wasn’t hard to put two and two together judging by the dates on the tweet and their post.

[Image: Search Engine Land’s coverage of the SAPE network being penalised]

It’s easy to think when you read something like this that you should totally stay away from any of these networks. Especially when they’ve been called out specifically. I mean something so public surely isn’t going to work in Google anymore is it?

How’s this for irony? Let’s look at two popular SEO-related search terms. These are arguably some of the most competitive search terms on the web, not because of their search volume (although they do get 40K+ exact searches per month) but because you’re literally competing against people who consider themselves to be among the best at getting top search engine rankings.

And what do we have ranking #1 for ‘SEO company’?

[Image: Search results for ‘SEO company’]
Note: I did edit this screenshot to remove the AdWords ads but didn’t alter the positioning of the results

An awful YouTube video. I use proxies any time I perform these kinds of searches, and I’m not logged into Google. The comments I’ll show you in a second clearly prove I’m not the only one seeing these results either. They might change in a few days after this post – I don’t know – but the ones I’ve called out in blog posts over the last few months haven’t changed.

Let’s put in another search term like ‘SEO Services’ which gets over 40,000 exact searches per month. It’s a pretty ideal buyer’s keyword if you’re looking for SEO clients:

[Image: Search results for ‘SEO services’]

So where are these links coming from? Well, you can of course start digging into the backlink profile, but other members of the SAPE network pretty much give it away themselves:

[Image: Comments from members of the SAPE network]

And if you’re not convinced, tell me when you last saw a ‘natural’ link profile like the one below: over 2 million backlinks in a very short period of time, more than likely from the SAPE.ru link network.

[Image: A backlink profile showing over 2 million backlinks]

Please be aware that there are a lot of imitators in Google trying to rank for the name of this network to pick up customers, because of how well it works. They are not the service I’m referring to here. We’re focusing solely on the Russian network that was supposedly taken down and ‘dealt with’ already.

I could do this stuff all day long. Oh wait, I do. The list of examples I have is mind-blowing.

As a side note: my ‘private SEO circle’ is opening 20-30 new places on September the 2nd (it will probably max out after an hour or two), so send me an email to HQ @ name of this website if it’s something you might be interested in.

The Google PR Spin

I really wanted to leave some feedback after each of Matt Cutts’ answers, but I didn’t want to ruin the flow of your reading. Comments like “[don’t do negative SEO] it’s a much better use of your time to do something productive” and “just go ahead and disavow those links” aren’t exactly ideal to me when you consider that Google don’t show all of the bad links they supposedly know about pointing to your site. With great timing, Jim Boykin shared some responses from an interview he just did with Google’s John Mueller over Google Hangouts. Here are some of the main takeaways:

  • Google supposedly have technical limitations on how many ‘bad links’ they can show in Webmaster Tools (1,000 links from 1,000 domains) – I find this hard to believe, by the way
  • Don’t think you’ll get help from Google on this when Matt commented above that you shouldn’t expect people who handle reinclusion requests to send you example links
  • It’s up to you to disavow links even if they’re from sites that scraped content from those that already link to you
  • The links in webmaster tools in many cases are years old and rarely get updated (this is from Jim’s experience, not mine) making the job even harder

My thought for a while now – which I’m glad Jim touched upon – has basically been: why can’t Google just give those links no value, rather than relying on us to do the job for them and report links which may have been built unnaturally? John’s response: “Well, if we could recognize all of them I guess that might be a possibility.”

There’s a lot to take away from that statement. It’s near impossible for Google to detect with 100% accuracy that certain links are good while others are bad, so they rely on us to report them. There are exceptions of course, but then again you have to keep in mind that a) something might be negative SEO pointed at a site by a competitor, and b) blanket rules for this in the algorithm would wreak havoc across normal search results.

Recently there was a bit of drama surrounding a Moz.com post which suggested that Google +1s were the number one factor in getting search engine rankings these days. This was quickly debunked by Matt over on Hacker News, and the post on Moz was edited to reflect that they had meant that shares on Google+ were correlated with rankings. I’m not here to bash Moz, but correlation does not equal causation: sites that write good content generally have readers who share it on social networks like Google+, and those sites are actively attracting new backlinks anyway, which helps their rankings.

[Image: Moz’s correlation chart]

I am here to point out how much Matt dodged questions on that thread. He really didn’t want to talk about shares having an effect on rankings (all that juicy PageRank) and just reverted to the good old Google PR angle: “focus on creating great content”. Just like that amazing video that ranks #1 for ‘SEO company’ and top 3 for ‘SEO services’, right? ;)

If you’re new here, I did write a few blog posts recently which show lots of other examples of terrible search results and I tend to get a few dozen more in my inbox from readers every time I write a blog post like this. I really don’t like to be the person to call out too many examples though (I blurred the links in my last post on request from someone I had ‘outed’).

To wrap up this post, there are a few things I want you to keep in mind:

  • Listen to what Google have to say via these channels, but don’t take it too seriously past common sense
  • Do not do crazy SEO tests on your money site (the site you can’t ‘risk’) but do SEO tests
  • Remember that while Google may have around 2.5 million servers, the people who put these algorithms together sit down for staff meetings at a conference table every week to try and deal with the challenges they face just like you or I would (but a little smarter)

Next week I’m sharing a replay of my first ever webinar, which shows you some SEO tactics you can put into place that take effect immediately and will give you an upper hand on people who only read SEO blogs and don’t follow through with their own testing. You’re going to love it (I hope)! Thank you, as always, for reading…



You may view the latest post at
http://www.viperchill.com/2-million-backlinks/

Glen