Category Archives: SEO Crap

User Behavior Isn’t A Ranking Factor? So Sayeth the Google…


Yesterday, in a Google Webmaster Hangout, John Mueller (the guy who’s essentially become the face of Google while Matt Cutts has been away) was asked whether user behavior on your website is a ranking factor. Mueller’s answer was as follows:

So in general, I don’t think we even see what people are doing on your web site. If they are filling out forms or not, if they are converting and actually buying something… So if we can’t see that, then that is something we cannot take into account. So from my point of view, that is not something I’d really treat as a ranking factor.

But of course if people are going to your web site and filling out forms or signing up for your service or newsletter, then generally that is a sign that you are doing the right things. That people are going there and finding it interesting enough to take a step to leave their information as well. So I’d see that as a positive thing in general, but I wouldn’t assume it is something that Google would pick up as a ranking factor and use to kind of promote your web site in search automatically.

I’m calling bullshit on this.  I’m not a tinfoil hatter by any means, but the idea of Google not looking at user behavior just doesn’t ring true to me at all.  And what’s with the “I don’t think” stuff?  Plausible deniability?

Obviously, only the folks at Google know what the ranking factors are, but I find it extremely hard to believe that things like bounce rate, session length, session depth, and positive actions (filling out forms, placing orders, etc.) have no impact on rank.

The whole point of Google constantly changing and updating its algorithm is to keep moving forward and improving, right? So why wouldn’t Google look at user behavior to judge page quality against the term being searched? The whole point of Panda was to reward higher-quality sites with better rankings, and black hatters have proven time and time again that they can rank with shitty spun content. So why wouldn’t Google use the data points it collects to determine relevance?

Now, I know that not everyone has Google Analytics code installed on their site (although most sites do), so I suppose it would be unfair to reward certain actions. But consider this scenario:

  • User does a search
  • User clicks a link from the search results
  • User sees the site and bounces (leaves within seconds of arriving) by hitting the back button
  • User clicks on the next search result
  • User doesn’t go back to the search results

This seems like a fairly clear signal that the first link didn’t have information that matched the search query while the second link did.  If this happens over and over again, are we really expected to believe that Google will choose to ignore something that is so clearly a behavior resulting from a site not returning information that is relevant to the search term?  Didn’t think so.
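The bounce-back pattern above is the kind of signal a search engine could read from nothing but click timing on its own results page, with no access to your analytics at all. As a rough sketch (the threshold, event format and function name here are all invented for illustration, not anything Google has published):

```python
# Hypothetical sketch: spotting quick bounces ("pogo-sticking") from a
# simplified results-page click log. Nothing here reflects a real Google
# system; the 10-second window is an assumed threshold.

BOUNCE_WINDOW = 10  # seconds; assumed cutoff for a "quick" bounce

def quick_bounces(events):
    """Return the URLs the user quickly bounced from.

    `events` is a list of (timestamp, action, url) tuples, in order, where
    action is 'click' (left the results page for a result) or 'return'
    (came back to the results page via the Back button).
    """
    bounced = []
    last_click = None
    for ts, action, url in events:
        if action == "click":
            last_click = (ts, url)
        elif action == "return" and last_click:
            click_ts, click_url = last_click
            if ts - click_ts <= BOUNCE_WINDOW:
                bounced.append(click_url)  # came back within seconds
            last_click = None
    return bounced

session = [
    (0, "click", "site-a.example"),   # first result
    (4, "return", None),              # back within seconds -> quick bounce
    (6, "click", "site-b.example"),   # second result; user never returns
]
print(quick_bounces(session))  # ['site-a.example']
```

Aggregated over millions of sessions, a consistently high quick-bounce rate on one result followed by a satisfied click on the next is exactly the kind of signal the scenario above describes.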

It seems that I’m not the only one with this opinion.

Posted in SEO Crap.

Building Content The Right Way

“Adapt or perish, now as ever, is nature’s inexorable imperative.”

– H.G. Wells, 1945

We all know that well-written, original content is extremely useful from an SEO perspective. It is 100% worth doing and vital to any holistic SEO strategy. That said, if you’re writing your content with the search engines as your primary audience, you’re doing it wrong.

2015 is quickly turning into the year that Google finally forces websites to be customer focused. We all know Google released its “mobile update” in April, and the reason was to push websites to design for the end user. If you have a mobile-friendly site, you get a boost in Google’s mobile rankings. If you don’t, you won’t.

This really started a few years ago, when Google announced that it would stop providing keyword data for organic search. In 2011, Google began withholding that data, and by 2015 pretty much everyone is seeing 90%+ of their organic keyword traffic reported as “not provided”. This isn’t news, of course, but Google’s reasons for doing it are becoming much clearer.

For years, Google has asked SEOs to stop focusing so much on keywords and focus more on the user experience. SEOs continued with brute-force, keyword-first tactics, so Google basically said “you children can’t be responsible with your data, so we’re not giving it to you anymore”. That was the first true shot across the bow in Google’s war on keyword-based SEO. Since 2011, they’ve done the following:

2011
  • Pushed Panda live, punishing sites with “thin content”
  • Removed virtually all organic keyword data via search encryption
  • Rolled out the “Freshness” update
  • Started adding authorship to articles in the form of pictures and bylines

2012
  • Released the page layout update, penalizing sites showing too many ads “above the fold” (or rewarding those that don’t, depending on how you look at it)
  • Rolled out the “Venice” update to improve localized results
  • Rolled out Penguin to further penalize keyword stuffing, article spinning and lots of other no-nos
  • Rolled out the Knowledge Graph, putting more emphasis on semantic search and giving searchers the information they’re looking for more easily and quickly
  • Rolled out the “Pirate” update to better identify copyright violations
  • Rolled out the EMD update, devaluing exact-match domains

2013
  • Replaced the core algorithm with Hummingbird, while keeping several of the key updates mentioned previously (Penguin, Panda, etc.)
  • Started removing authorship due to widespread abuse of the markup

2014
  • Rolled out Pigeon to improve geolocated search results
  • Completed the removal of authorship markup

2015
  • Rolled out the Mobile Friendly update

If you look at these updates, they all have something in common. Google is telling SEOs to focus on user experience instead of rankings, or expect to see their coverage and visibility slide. They’re also telling SEOs to do as they say, not as they do (their Knowledge Graph and answer boxes are nothing but unoriginal, scraped content), but that’s a rant for another day.

There are always going to be sites that play in the grey or the black in order to get ranked. That should not be your focus, as it is 100% beyond your control (unless you enjoy ratting out your competition, and even then there’s no guarantee that anything will happen). Instead, focus on what you can control – content quality, site speed and ease of use.

Good guys don’t always win, but at least they can sleep a little easier knowing that they won’t wake up one day to find their site completely deindexed, with a message sitting in their inbox that reads something like this:

Dear site owner or webmaster,

We’ve detected that some of your site’s pages may be using techniques that are outside Google’s Webmaster Guidelines. Specifically, look for possibly artificial or unnatural links pointing to your site that could be intended to manipulate PageRank. Examples of unnatural linking could include buying links to pass PageRank or participating in link schemes. We encourage you to make changes to your site so that it meets our quality guidelines. Once you’ve made these changes, please submit your site for reconsideration in Google’s search results. If you find unnatural links to your site that you are unable to control or remove, please provide the details in your reconsideration request. If you have any questions about how to resolve this issue, please see our Webmaster Help Forum for support.

Sincerely,
Google Search Quality Team

Posted in SEO Crap.