Summary: 2012 was the year of the Panda and Penguin for SEOs… so what sort of creatures should we expect Google to turn our world upside down with in 2013? Today we take a look at the predictions made by one of the industry’s foremost experts, Rand Fishkin of SEOMoz. We’ll offer our take on Rand’s predictions as well as a few of our own.
Well that didn’t take long… barely 2 days into 2013 and we already had industry heavyweights weighing in on the future of most of the principal disciplines in online marketing, from SEO and local search optimization to PPC advertising. Over the next few days we’ll take a look at how some of these gurus expect the next 12 months to unfold and then I’ll give you my take.
Let’s start with Rand Fishkin of SEOMoz, who decided to score his 2012 predictions before gazing into his 2013 crystal ball. I’d recommend reading the entire post for details, but here’s a quick overview of his scoring method, followed by a summary of his eight predictions from 2012 and how he scored each:
Here’s how scoring works:
- Spot On (+2) - when a prediction hits the nail on the head and the primary criteria are fulfilled
- Partially Accurate (+1) - predictions that are in the area, but are somewhat different than reality
- Not Completely Wrong (-1) - those that landed near the truth, but couldn’t be called “correct” in any real sense
- Off the Mark (-2) - guesses which didn’t come close
The rules state that if the score is lower than +1, I’m not allowed to make predictions for the coming year. Here’s to hoping!
So how did he do?
| # | 2012 Prediction | Score |
|---|-----------------|-------|
| 1 | Bing will have a slight increase in US marketshare, but remain <20% to Google’s 80%+ | +2 |
| 2 | SEO without social media will become a relic of the past | +1 |
| 3 | Google will finally take stronger, Panda-style action against manipulative link spam | +2 |
| 4 | Pinterest will break into the mainstream | +1 |
| 5 | Overly aggressive search ads will result in mainstream backlash against Google | -1 |
| 6 | Keyword (not provided) will rise to 25%+ of web searches | +2 |
| 7 | We’ll see the rise of a serious certification program | -2 |
| 8 | Google will make it very hard to do great SEO without using Google | -1 |
Again, you should read the entire post, but Rand fairly scored himself a +4, which is pretty good and means he was worthy of prognosticating on SEO in 2013. Like his revelations about last year’s predictions, it’s worth checking out the specifics of each of his ten predictions for 2013, but I’ll just include the headlines and my take on each:
#1: None of the potential threats to Google’s domination of search will make even a tiny dent
I completely agree, although at first I was going to take issue with Rand’s conclusion about who at least has the “best chance” against big G in the two zero one three – Amazon. To me it seemed that with Apple’s and Facebook’s audiences, they would be better positioned; however, I have to concede they don’t have the technology. Amazon is much closer. Those of us who shop on Amazon know how intelligent its search engine really is. Still, that’s “product search”, not web search. So Rand is probably right: while no one is giving Google a run for their money, Amazon may siphon off a few “product” searchers who aren’t satisfied with Google’s product search, which, as you’ve probably heard, became an all pay-to-play service for merchants after the transition to Google Shopping last year. Basically your products aren’t listed unless you’re a paid advertiser, which means a lot of merchants may shy away. Amazon has fees as well, but they aren’t the same as paid advertising fees. Anyway, long story longer – I agree with Rand that we’re looking at the status quo – Google’s dominance will probably continue its moderate growth, but it may shed some “shopping” users who seek out the superior UX and product availability at Amazon.
#2: “Inbound marketing” will be in more titles & job profiles as “SEO” becomes too limiting for many professionals
I can’t say I disagree with the prediction here either, but I’m not a fan of the trend. For starters, a few years ago when you heard “Inbound Marketing“, it was probably from someone at Hubspot… or someone who’d just listened to a Hubspot presentation. That’s not a bad thing, I’m just saying that their CEO and founder Brian Halligan coined the term and co-branded it well with Hubspot. But still, “inbound marketing” to me was what I’ve always just called “SEO”. As new things like social media came along, I considered them to be aspects of SEO. To me “Inbound Marketing” equals SEO. So why coin the phrase? Well, I’m sure they would argue about subtle nuances, but I think they read the tea leaves about public perception of SEO. In a lot of people’s minds, SEO means blackhat link-buying scam artists. As honest “white hats”, it’s nothing we’ve ever engaged in, so I’ve always taken offense at being lumped in with them. So while I would congratulate HubSpot for their prescience, I’d also like to state for the record that I’m a stubborn person and I want to hold the line on what an “SEO” is. Don’t let a few bad apples spoil this bunch and don’t let uninformed “reporters” lump us together with them. Unrealistic? Either way, this is probably the least risky prediction of the bunch. No one should be surprised if SEOs don’t want to fall victim to the false smears and start using “inbound marketing” language as a shield.
#3: More websites will move away from Google Analytics as the only provider of web visitor tracking
I have to admit I don’t know much about the GA alternatives Rand mentions, but it would surprise me to see many people jump ship. Keep in mind that more than half of businesses in most states are still without a site and as they catch up with the 21st century, it’s a safe bet that most will get set up with GA, especially with Google being behind the big push to “Get Your Business Online“. We’ll meet back here at this time next year to discuss, but unless somehow these other analytics alternatives were able to account for the “(not provided)” keyword situation (which they can’t), I’m not seeing any sort of mass exodus. Granted, Rand is only predicting the trend will “be small, but measurable”… but then is it even worth counting as one of the top 10 predictions for the year?
#4: Google+ will continue to grow in 2013, but much more slowly than in 2012
This one is a little tricky. First of all, it’s kind of hard for Google+ to grow “much more slowly” than the sluggish pace it’s crept along at so far. Because we all bow at the altar of Google, a lot of people created an account, but then never used it again. How do you convince a billion users to waste their time on G+ instead of on Facebook? At the same time, Google has a few things going for it – dozens of other “free” services that we use every day, like Gmail. The key to Google+ ever being successful is users being logged into their Google accounts. Well, in addition to Gmail, Google has services like YouTube, Blogger, Places/+Local, and Analytics. Given the integration with these existing Google accounts and their strategic roll-out, I haven’t quite written Google+ off yet, assuming they still have something up their sleeve. But if Rand’s prediction is right and growth slows further, I think that could signal the end of another failed venture by Google into the social media world.
#5: App store search will remain largely ignored by marketers (for lots of defensible reasons)
And this prediction will remain largely ignored by me. I’m actually again basically in agreement with Rand, but I’m just not sure why this item even made the list, other than to help it get to the nice round “10″. Using the app store to market something would require that you create an app, and that will only make sense as a strategy for a very limited number of businesses.
#6: Facebook (and maybe Twitter, too) will make substantive efforts to expose new, meaningful data to brands that let them better track the ROI of both advertising and organic participation
FB and Twitter have obviously both been behind the curve here so I’d like to see this happen, but I’m not confident it will. Call me cautiously optimistic for now.
#7: Google will introduce more protocols like the meta keywords for Google News, rel author for publishers, etc.
Not exactly a bold prediction, but I completely agree. Although I would also include expansion of the importance of rich snippets/structured data markup in that list. I would also expect we’ll see increased display of this data in both standard organic results as well as knowledge graph results.
#8: The social media tool market will continue a trend of consolidation and shrinkage
Another safe prediction that a recent/current trend will continue.
#9: Co-occurrence of brands/websites and keyword terms/phrases will be proven to have an impact on search engine rankings through correlation data, specific experiments, and/or both
As you know, honest SEOs have, in recent years, been moving more toward content as a long-term strategy and away from traditional link building methods Google has devalued (trading links, etc.). But a website’s inbound link profile can still have a profound impact on its rankings. This isn’t an argument against content, however, because a large reason we create quality content is so that others will find it and link to it naturally, if they find it relevant and useful enough. What Rand is talking about with “co-occurrence of brands and phrases” (or co-citations) is “mentions” on the web that don’t necessarily include a link. The principle is still the same, but the goal is to get other websites talking about your company website, even if they aren’t linking to it. It’s the same principle behind “citations” in Local SEO. I don’t disagree that this is happening, but I’m on the fence about how heavily Google will weigh these “mentions” when determining rankings. Rand’s logic is sound, but I just think that blackhat/webspammers will take advantage like they always do (remember, once upon a time “meta keywords” mattered), possibly forcing Google to reduce the impact on rankings. Then again, over the last year we’ve seen them answer low quality content and spammy links with Panda and Penguin, so perhaps they’re in the lab tinkering with a mutant Pandguin… or Pengda, that might help them combat the inevitable abuse of co-occurrence “mentions”.
#10: We’ll witness a major transaction (or two) in the inbound marketing field, potentially rivaling the iCrossing acquisition in size and scope
Sure, maybe, but so what? Pardon the apathy on this one, but it doesn’t really impact “SEO” (in terms of strategy or how we do our work) unless you’re part of the company being acquired or acquiring. Any major acquisition would certainly be “industry news” worth talking about when it happens… but this isn’t a “Buy Out Prediction” list, it’s an SEO prediction list.
To summarize, I don’t really disagree with much of what Rand is predicting, but I think some items are rather inconsequential while some others are simply safe bets about continuing trends. As a whole it’s a good list, but I just don’t think the individual items measure up against his 2012 predictions in boldness. Using the same scoring system, it would be hard to imagine he could “lose”.
My SEO Predictions for 2013
As I said, I basically agree with most of Rand Fishkin’s predictions, but I’d like to go out on a limb a little further in mine.
Randomized Search Results
Using the word “randomized” may be overstating it a bit, but that doesn’t change the point – search rankings will increasingly change to the point where focusing on specific rankings of specific keywords will become a fool’s errand. We’re kind of at that point already, but I expect it to get worse, in the sense that you’re going to struggle if you’re still chasing individual rankings. SEOMoz recently published a post by Dr. Pete about how they’ve observed and tracked this phenomenon for 3 specific keywords. Essentially the conclusion was, after watching rankings fluctuate several times, even within a single day, that Google isn’t doing this just to make it harder for SEOs to figure out what makes a website rank. Rather, Google is reacting to the organic nature of the internet with constantly changing results. Here’s Dr. Pete’s wrap-up and interpretation of the data:
We can’t conclusively prove if something is in a black box, but I feel comfortable saying that Google isn’t simply injecting noise into the system every time we run a query. The large variations across the three keywords suggest that it’s the inherent nature of the queries themselves that matter. Google isn’t moving the target so much as the entire world is moving around the target.
The data center question is much more difficult. It’s possible that the two data centers were just a few minutes out of sync, but there’s no clear evidence of that in the data (there are significant differences across hours). So, I’m left to conclude two things – the large amount of flux we see is a byproduct of both the nature of the keywords and the data centers. Worse yet, it’s not just a matter of the data centers being static but different – they’re all changing constantly within their own universe of data.
The broader lesson is clear – don’t over-interpret one change in one ranking over one time period. Change is the norm, and may indicate nothing at all about your success. We have to look at consistent patterns of change over time, especially across broad sets of keywords and secondary indicators (like organic traffic). Rankings are still important, but they live in a world that is constantly in motion, and none of us can afford to stand still.
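Dr. Pete’s advice – judge patterns of change over time rather than any one ranking snapshot – can be sketched in a few lines of code. This is a hypothetical illustration, not anything from the SEOMoz post: the keywords and daily positions below are invented, and a real rank tracker would pull this data from an API rather than a hard-coded dictionary.

```python
from statistics import pstdev

# Invented daily rank snapshots (position in the SERP) for a few keywords.
rank_history = {
    "chinese food burlington": [3, 5, 2, 6, 4, 3, 7],
    "winooski restaurant":     [1, 1, 2, 1, 1, 2, 1],
    "vermont web design":      [9, 4, 11, 6, 14, 8, 5],
}

def rank_flux(history):
    """Summarize average position and volatility per keyword, so a
    one-day jump can be judged against that keyword's normal churn."""
    summary = {}
    for keyword, ranks in history.items():
        avg = sum(ranks) / len(ranks)
        summary[keyword] = {"avg": round(avg, 1), "stdev": round(pstdev(ranks), 1)}
    return summary

for kw, stats in rank_flux(rank_history).items():
    print(f"{kw}: average position {stats['avg']}, flux ±{stats['stdev']}")
```

A keyword with high flux (like the third one above) shouldn’t trigger panic over a single bad day, which is exactly the “world moving around the target” point.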
Now, the reason I think the randomness (as I’m calling it, even though Google would surely say there is science behind the organized confusion) will increase is because there will be increased competition. As I mentioned earlier, more than half of businesses still don’t have websites, but the number that do is growing all of the time. Google’s stated goal has always been to provide the most relevant results. Certainly they know that there are businesses that are just as “relevant” as others, but have a much smaller/newer footprint on the web. Wouldn’t it stand to reason they start to rotate search results in and out in the name of fairness? I’m not saying that it’s the right approach, because it actually punishes the success of others who might have worked hard to obtain high rankings – but Google would say they aren’t interested in being manipulated. Again, if relevance is the goal, how can Google select just 10 results out of potentially hundreds that may be just as “relevant”?
Basically I’m saying the same things I’ve said for a while about the “local pack” that appears within search results, but the same principles apply to organics. Let’s review it in the context of the “local pack” though to better illustrate the point:
If you have 20 Chinese restaurants in your town, how does Google determine which 7 are the most relevant and worthy of being in the local pack? I’m not asking how they determine relevancy right now – we have a pretty good idea about many of the local ranking factors. But are you going to tell me someone couldn’t argue that at least one of the other 13 restaurants is equally as “relevant” as the other 7? I wouldn’t want to be in the position of being the arbiter of “fairness”, but the subjective nature of “relevance” would seem to put Google in that role, and you would think they would want to give everyone an equal shot. I suppose one could always argue that Google still has their own secret “relevancy” factors that dictate which 7 are the “most” relevant, but why 7? What if Google deemed a total of 8 to be equally relevant? How much potential business will #8 miss out on? And again, the point is the same for the 10 organic results as well. Think about poor #11.
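To make the “rotation” hypothesis concrete, here’s a toy sketch of what cycling equally relevant results through a fixed number of slots might look like. To be clear, this is pure speculation on my part – Google has described no such mechanism – and every name and number below is invented for illustration.

```python
import random

# Hypothetical: 20 local restaurants Google deems equally "relevant".
relevant_restaurants = [f"restaurant_{i}" for i in range(1, 21)]

def local_pack(pool, slots=7, seed=None):
    """Fill a 7-slot local pack with a different sample of the equally
    relevant pool each time, so every business gets exposure over time."""
    rng = random.Random(seed)
    return rng.sample(pool, slots)

# Two queries at different times could surface two different sevens:
print(local_pack(relevant_restaurants, seed=1))
print(local_pack(relevant_restaurants, seed=2))
```

Under a scheme like this, “poor #8” (or #11 in the organics) would at least see daylight some of the time instead of never.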
Think about a brand new restaurant that has a brand new website, if they have one at all. Even if they create a Google+ Local listing, it doesn’t have any history, support from outside citations or reviews to bump it up in terms of “relevancy”, but it happened to be a block away from the searcher. Proximity is a factor, but it’s understandably not always a major factor because it shouldn’t be in certain industries – in restaurants it should probably play a pretty big role. Now, imagine this brand new restaurant has a lot of buzz around it and everyone around town wants to check it out, but a lot obviously don’t know the name.
If the hypothetical company described above was here in Winooski, a lot of people would obviously search for things like ‘winooski restaurant‘, right? Take a look at those results and tell me, where’s Donny’s Pizza, Pho Pasteur, or the newest addition, Misery Loves Company? (To name just a few.) I can tell you just from living and working in the area that MLC is the most buzzed about and probably the most active on the web (daily menus posted on site, shared on Twitter and Facebook). Now I know that beyond the 7 restaurants in the local pack and the corresponding pins on the map, there are tiny pink dots that represent other restaurants. But who notices those? Besides, things don’t get much better when you click the map in those results, which brings you to a page with 10 business/map listings at a time. That’s right, Subway, a half-mile or so outside of the city center, is featured ahead of several other businesses right in the clump with the others that are featured. I’ve got nothing against Jared or $5 foot-longs, but come on! Which is more “relevant” to the searcher looking for a “winooski restaurant”, Subway or Misery Loves Company? Need more? The Windjammer and Pulcinella’s are in the top 10, but they’re in South Burlington!
Obviously Google isn’t going to ever be able to keep up with the constantly changing landscape of every industry in every region, so what choice do they have but to mix things up to ensure all potentially “relevant” results are being served? I’m not sure if that’s the ideal solution, but clearly there’s a problem.
Keep in mind, I’m not advocating randomizing search results. In fact it makes the job of an SEO more difficult when we can’t accurately measure and track progress. I’m just pointing out what only seems logical and “fair” if you’re in the “relevancy” business. Google has no legal or moral obligation to equal opportunity here, but if there are 20 search results that are just as “relevant”, how can the same 10 always be the ones that show up on page 1? Perhaps I’m oversimplifying a bit (I know location can impact results, etc.), but I think you get the point. Google became a success by providing the best search results/user experience, so it’s hard to say they should change things to “give a shot” to low visibility websites. And by no means should they be forced to randomize for “fairness”. Really it’s on the “low visibility” companies to figure out other ways to drum up business, rather than rely on free Google search results and complain that they aren’t ranking. But it just seems like Google is also not in the business of drumming up massive business for half a dozen companies in each industry (and region in many cases), while dozens to hundreds of others get virtually no attention.
By the way, if you’re concerned that I’m unnecessarily obsessing over the first page of search results, it’s time you took a look at one of the many “heat maps“, like the one on this BI post, that depict user behavior. Depending on the source, these maps show you where users click, mouse-over or “look”. Obviously you can see that not only is it important to get on the first page, but you need to be in the top few results on the page if you want your site to garner any significant percentage of user interaction.
While it certainly hasn’t and can’t fully resolve the issue I’ve described above, search results influenced by auto-detected user location do factor in and became increasingly common in 2012. And I’m not talking about the “local pack” here, but rather the results that look like regular organic results, while clearly being local/location influenced, whether geographic terms were used in a searched phrase or not – what I call “local organics“. Obviously this is another effort by Google to serve “relevant” results, but it still doesn’t address the fundamental question about competition I raised above. Is it “fair” (in Google’s own mind) to show just 7 of 20 local Chinese restaurants in the local pack and then 2 or 3 others in the organics?
Local organics only compound the problem described above. At least with the “local pack” Google can claim that they have an algorithm that determines the top 7 local/map/Places results and say “that’s that” for now, but that’s separate and apart from determining organic rankings. And again, we’re not talking about purely organic rankings here either. For example, Google throws up a 7-pack when your location is set to Burlington, VT and you search “chinese food”… but the top (seemingly) organic result is actually a local organic for North Garden Chinese in South Burlington. Fine, but what about others in the area that are at least as “relevant”?
Whatever Google does to resolve this issue, I believe “local organics” will be a part of the answer. Over the last year I’ve noticed that Google has gotten better at pulling a site from organic listings if the company is already among those listed in the local pack, so they clearly have their eye on these types of searches and results pages. As an SEO, it used to be great when I could get a client’s site ranking both organically and in local results, giving them 2 great pieces of real estate on page 1 of search results. Those days are all but gone at this point.
You may notice Yelp and other directories listed among the “local organics” quite often. Now it’s no secret that Google and Yelp have butted heads over the years, but you still regularly see Yelp among Google’s search results. Still, Google would obviously prefer that users stay within one of their own properties where they stand a chance of earning revenue if you click on a PPC ad. I think perhaps they’ll move directories, like Yelp, into another area in the search results page when specific local intent is clear, and free up some of the “local organics” space for individual restaurants. We watched them test out various new search result navigation layouts throughout 2012 before making the change official toward the end of the year. We’ve all wondered what they were up to with this change, which actually saw the results area move further to the left as search filter options were moved above. Could this be paving the way for a new, more “fair” layout for local results?
Google Shopping Will Become Free Again
I know that Google Shopping only transitioned to an all paid service for retailers last summer, but I think the lack of variety and competitive pricing that this will lead to, as well as Amazon’s superior user experience, will cause Google to rethink their decision. I’m definitely going out on a limb here, because Google knows they can get bigger advertisers to continue to pay to play, but this all goes back to relevance and user experience. I think they run the risk of turning off shoppers and if the shoppers aren’t there, the advertisers will eventually leave too. I can personally say I’m someone who stopped using Google product search, in favor of Amazon, over the past several months. From time to time I try to look at Google Shopping again, but I’m almost always disappointed.
Webspam Will Live On
This obviously isn’t as much a prediction as it is common sense, but I “predict” we’ll all continue to be inundated with spam. And I’m not talking about us as individuals with email accounts – I’m talking about businesses with websites and webspam. For those of us who don’t get involved with shady blackhat SEO tactics, it would be easy to forget that this garbage is still going on, if it weren’t for the fact that we run into it all of the time. As someone who’s been in this industry for over a decade, I feel like the spammers should have been stomped out about 5 years ago… but it seems like it gets worse every day. But why? Competition. As more websites come online, it becomes tougher and tougher to rank. Desperate people do desperate things, like buying links, even against their own better judgement.
What’s even scarier than desperate people keeping spammers in business? The number of business owners who simply don’t know any better. I can’t count how many times a client has called us and said some variation of the following: “we just got a call (or email) from Google that said we need to buy links to improve our rankings”. The worst part? We’ve had these same scam artists leave these same types of pitches on our voicemail, and I can confirm that they do deceptively attempt to say they “work with Google, Yahoo and Bing”. And I know they read from a script because I’ve gotten the exact same, somewhat lengthy message from two different guys in the last year. Their wording is potentially ambiguous enough to keep them out of legal hot water, but the intention is clearly to make gullible business owners believe that they either work directly for the search engines or at least partner with them. Oh, and the best part? Because they are so tight with Google et al, they’ll sell you top ranking positions that they currently have “available”.
As I said, this isn’t much of a prediction because web spammers are like weeds you just can’t seem to kill, but what amazes me is how brilliant they really are. And given that desperate and low information website owners aren’t going anywhere, there will always be a pool of gullible people willing to purchase some snake oil. What will be interesting is watching to see whether the spammers even feel the need to adapt given that 2012 was the year of the Panda and the Penguin on Google. Basically Google dropped a nuke on article submissions and link buying and did so very publicly. The problem I see is that there will always be a crop of new people who don’t know better and will be sucked in. But can we reasonably assume that this group will shrink over time to the point where spammers are forced to change strategies? That seems unlikely, but we’ll see.
Tablet Detection – Desktop Sites
Do you own a tablet, or a smartphone for that matter? And do you get as frustrated as I do when you’re forced into a dumbed down, feature reduced mobile site because the site detected that you were browsing on a mobile device? Fortunately some of these sites at least offer you the option to click a “full site” link in the footer, but 90% of the time I wish the desktop site was the default and the “mobile version” was optional (maybe a pop-up on page load?). And while I can at least understand the argument for defaulting to mobile sites on phones, why would I want to see a crappier version of a site on my 10 inch Nexus 10? And yes, I’m aware that there are settings I can tweak on my own… but do you think most users will ever know or do this?
So I expect that more specific mobile detection will become necessary. How is this SEO related? Well, it’s not related in terms of search engine rankings, but it’s part of online marketing in that it plays a big role in converting visitors. How many weak mobile sites are people backing out of on their tablets, when more precise device detection could have brought them to the desktop site – the site the user is often familiar with after having viewed it on their computer? So this one is really half prediction and half wishful thinking, but I’m sticking with it given the tablet surge that we saw in 2012 (and the expected continuation this year).
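The kind of “more specific” detection I have in mind could be sketched like this. A rough illustration only: the user-agent substrings below are assumptions that roughly cover common 2013-era devices (real detection libraries maintain far larger pattern lists), and the point is simply that tablets can be steered to the desktop site while phones still get the mobile version.

```python
import re

# Assumed patterns: iPads and Android tablets (which typically omit the
# "Mobile" token from their user-agent) get the desktop site.
TABLET_HINTS = re.compile(r"iPad|Nexus (7|10)|Kindle|Tablet", re.I)
MOBILE_HINTS = re.compile(r"iPhone|iPod|Android.*Mobile|Windows Phone", re.I)

def site_version(user_agent: str) -> str:
    """Return which version of the site to serve for a given user-agent."""
    if TABLET_HINTS.search(user_agent):
        return "desktop"   # big screens handle the full site fine
    if MOBILE_HINTS.search(user_agent):
        return "mobile"    # phones still get the slimmed-down site
    return "desktop"

print(site_version("Mozilla/5.0 (Linux; Android 4.2; Nexus 10 Build/JOP40C)"))
print(site_version("Mozilla/5.0 (iPhone; CPU iPhone OS 6_0 like Mac OS X) Mobile/10A403"))
```

The ordering matters: tablet checks run first, precisely because many tablet user-agents would otherwise match a generic mobile pattern.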
Google Panda and Penguin updates will undoubtedly continue, but those names don’t strike fear in the hearts of spammers like they did when they were first released. I’m going to stop short of “predicting”, and just say I wouldn’t be surprised if 1-2 new Google animals are added to the zoo in 2013.
Don’t get caught up chasing a ghost, desperately trying to follow the “latest” trend in SEO. Just keep doing what you’re doing (assuming you’ve been creating content, engaging on social networks, etc. all along). Stick to whitehat tactics, avoid spam at all costs and supplement efforts with PPC advertising when necessary. Most of what has changed is relatively minor, but the picture remains the same and for the foreseeable future regular content and legitimate links and “mentions” from other sites will be key to your website’s success.
Do you have any predictions about SEO in 2013? Thoughts about Rand Fishkin’s predictions or mine? Let me know in the comments below.