The message from Google is clear: it’s time to play by the rules.
The problem is that it’s not always easy to know and understand what the rules are. Or it gets downright frustrating to play by the rules and get nowhere while watching our competitors with half-baked content take over Google’s first-page rankings.
I’ve found myself covering a lot of SEO/link building news in my Weekly Marketing Skinnies – so much so that the Skinnies were getting fatter and fatter by the week; sort of defeats the purpose of getting a quick overview of the week’s happenings in one convenient place.
So I decided to start writing this special SEO/link building Skinny to keep track of the new Google changes, how they might affect our websites, and what we need to do about them (or what we need to stop doing, for that matter).
I intend to keep adding more links, updates, studies, etc. to this SEO Skinny as they come out.
Also, if you know of any interesting posts/case studies/notes/developments etc. that would benefit the readers, feel free to mention them in the comments.
TL;DR If you want to stay updated on SEO and Google, but don’t want to sift through hundreds of publications, bookmark this post and come back often to keep up.
Side note: since I started writing this post on August 15, many of the previous updates of the year might not make it into this post, but I will try to keep you as up to speed as possible.
Section 1: Google Latest Updates
Google Wants To Know Which Small Sites Should Rank Better
August 28, 2013
Google’s head of search spam, Matt Cutts, posted a survey on Google Docs asking you to tell them which small websites aren’t getting a fair shot in Google and should rank higher.
Matt announced it in this tweet:
If there’s a small website that you think should be doing better in Google, tell us more here: https://t.co/s80BibIBhN
— Matt Cutts (@mattcutts) August 28, 2013
The survey itself makes it clear that no rankings will be impacted directly from this survey:
Google would like to hear feedback about small but high-quality websites that could do better in our search results. To be clear, we’re just collecting feedback at this point; for example, don’t expect this survey to affect any site’s ranking.
Hat tip: Barry Schwartz.
Official: Google’s Keyword Tool Replaced
August 29, 2013
It took some time, but now the legacy external keyword tool is gone. If you try to access it at adwords.google.com/select/KeywordToolExternal you will see the following message:
This Keyword Planner is what replaced the old Keyword Tool.
Downside: you need to login to your AdWords account to use it. If you do not have an AdWords account, you will need to sign up for one to get this data.
We’ve talked about this in the previous Marketing Skinny, but here’s a refresher.
WordTracker released a major update to their Free Keyword Tool – their free tool now works more like the paid version of WordTracker.
You can get instant access to the free Wordtracker by just entering your email and choosing a password. It’s as simple as that.
If you are wondering what I personally use for all my keyword research, rank tracking, link tracking, etc., it’s SEMRush – the tool the paid version of Wordtracker uses. Can’t live a day without it; it’s that good.
Matt Cutts Is Working on His Spam Network Jokes
August 29, 2013
Matt Cutts tweeted that he is thinking up “ghost-related puns for a spam network.”
Thinking of ghost-related puns for a spam network. “They try to look super natural, but using them will dampen your spirits.”
— Matt Cutts (@mattcutts) August 29, 2013
Is this a sign that Google is about to take aim at yet another spam network? Could be.
Find Previous Google Updates
Just click on the sections to expand/collapse.
[expand title=”Google Webmaster Videos with Matt Cutts”]
Latest Google Webmaster Videos with Matt Cutts
Most of the time, the best way to learn something is to go to the source. One of the best resources on Google SEO is Matt Cutts and his Google Webmaster Help videos.
If you want to know first-hand what Google says without me or any other blogger chewing it up and spitting out our own conclusions on the topic, these are some of the most relevant recent videos to watch.
- Should I add rel=”nofollow” to links that are included with my widget? August 12, 2013
- Should I be worried if a couple of sites that I don’t want to be associated with are linking to me? August 7, 2013
- Will Webmaster Tools give examples of which links or pages caused a manual spam action? July 31, 2013
- If I have 20 domains, should I link them all together? July 17, 2013
[expand title=”Confusion About Manual & Algorithmic Penalties – August 21″]
Confusion About Manual & Algorithmic Penalties
Google is very much aware that there’s a lot of confusion between manual and algorithmic actions.
John Mueller, Google’s Webmaster Trends Analyst, said in multiple Google Webmaster Help threads in the past twenty-four hours that they are aware and working to make it easier.
But for now, here’s your best guide:
When a site is algorithmically found to have been compromised, a reconsideration request is unnecessary — it’ll be updated automatically as we recrawl & reindex the content from there, and see that it’s no longer compromised. The normal crawling & indexing can take a bit of time, so unfortunately you’ll need to be a bit patient.
In cases where the site was manually found to be compromised, you can still submit a reconsideration request to have it reviewed. You’ll see the difference by checking the manual action feature in Webmaster Tools.
We’re looking into ways to make this process a bit clearer, more straightforward & consistent, I realize it’s a bit confusing at the moment.
~ John Mueller
[expand title=”Rel author Frequently Asked (advanced) Questions – August 21″]
Rel=”author” Frequently Asked (advanced) Questions
Google Authorship helps searchers better identify content from a specific author.
Additionally, searchers can click the byline to see more articles you’ve authored or to follow you on Google+.
Google has just posted answers to several advanced questions about Google Authorship:
1. What kind of pages can be used with authorship?
You can increase the likelihood that we show authorship for your site by only using authorship markup on pages that meet these criteria:
- The URL/page contains a single article (or subsequent versions of the article) or single piece of content, by the same author. This means that the page isn’t a list of articles or an updating feed. If the author frequently switches on the page, then the annotation is no longer helpful to searchers and is less likely to be featured.
- The URL/page consists primarily of content written by the author.
- The page shows a clear byline stating that the author wrote the article, using the same name as on their Google+ profile.
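In markup terms, those criteria boil down to a simple pattern. Here’s a minimal sketch of a qualifying byline; the article title, author name, and Google+ profile ID are all made-up placeholders:

```html
<!-- One article, one author, with a clear byline linking to the
     author's Google+ profile. The profile ID is a placeholder. -->
<article>
  <h1>How to Brew Better Coffee</h1>
  <p>By <a rel="author" href="https://plus.google.com/112233445566778899000/">Jane Doe</a></p>
  <p>The article body, written entirely by the same author...</p>
</article>
```

The key pieces are the visible byline name matching the Google+ profile name and the rel="author" link pointing to that profile.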
2. Can I use a company mascot as an author and have authorship annotation in search results? For my pest control business, I’d like to write as the “Pied Piper.”
You’re free to write articles in the manner you prefer — your users may really like the Pied Piper idea. However, for authorship annotation in search results, Google prefers to feature a human who wrote the content. By doing so, authorship annotation better indicates that a search result is the perspective of a person, and this helps add credibility for searchers.
Again, because currently we want to feature people, link authorship markup to an individual’s profile rather than linking to a company’s Google+ Page.
3. If I use authorship on articles available in different languages, such as
example.com/en/article1.html for English and
example.com/fr/article1.html for the French translation, should I link to two separate author/Google+ profiles written in each language?
In your scenario, both articles:
example.com/en/article1.html and example.com/fr/article1.html should link to the same Google+ profile in the author’s language of choice.
4. Is it possible to add two authors for one article?
In the current search user interface, we only support one author per article, blog post, etc. We’re still experimenting to find the optimal outcome for searchers when more than one author is specified.
5. How can I prevent Google from showing authorship?
The fastest way to prevent authorship annotation is to make the author’s Google+ profile not discoverable in search results. Otherwise, if you still want to keep your profile in search results, then you can remove any profile or contributor links to the website, or remove the markup so that it no longer connects with your profile.
6. What’s the difference between rel=author and rel=publisher?
rel=publisher helps a business create a shared identity by linking the business’ website (often from the homepage) to the business’ Google+ Page. rel=author helps individuals (authors!) associate their individual articles from a URL or website to their Google+ profile. While rel=author and rel=publisher are both link relationships, they’re actually completely independent of one another.
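For illustration, here’s roughly what the two relationships look like in markup; the Google+ URLs are made-up placeholders:

```html
<!-- On the business homepage: rel=publisher links the site
     to its Google+ Page (placeholder URL) -->
<link rel="publisher" href="https://plus.google.com/+ExampleBusiness/">

<!-- On an individual article: rel=author links the byline
     to the writer's personal Google+ profile (placeholder ID) -->
<p>By <a rel="author" href="https://plus.google.com/112233445566778899000/">Jane Doe</a></p>
```

One is a site-to-Page link, the other a page-to-person link, which is why the two can coexist on the same site without conflicting.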
7. Can I use authorship on my site’s property listings or product pages since one of my employees has customized the description?
Authorship annotation is useful to searchers because it signals that a page conveys a real person’s perspective or analysis on a topic. Since property listings and product pages are less perspective/analysis oriented, we discourage using authorship in these cases. However, an article about products that provides helpful commentary, such as, “Camera X vs. Camera Y: Faceoff in the Arizona Desert” could have authorship.
Learn more about Google Authorship and Google Author Rank.
[expand title=”Google Quietly Updates the Link Schemes Document – July”]
Google Quietly Updates the Link Schemes Document
Back in July, Google quietly updated the Link Schemes/Unnatural Links document inside the Webmaster Tools section of their site.
Barry Schwartz seemed to be the first one to notice/report the changes.
Link Building Takeaways from the document
Of course, there’s a lot of ambiguity in this document. Who’s to know what my intention is or what “excessive” means, right? Google intentionally left itself a lot of wiggle room here, and that’s understandable.
If you want to read a good post mulling over the language in the Link Scheme document, read Eric Ward’s Understanding Google’s Latest Assault On Unnatural Links.
Meanwhile, here are my main link building takeaways:
When you build a link – any link: blog comment, guest post, social media promotion – ask yourself what your main intention is.
If it is for the sake of link building, don’t do it. Provide links if and when they are valuable to the community or to support the point you are making.
Don’t buy/sell links – that one is obvious.
What might not be obvious is what’s considered buying/selling links.
It’s not just monetary exchange.
This includes exchanging money for links, or posts that contain links; exchanging goods or services for links; or sending someone a “free” product in exchange for them writing about it and including a link…
That includes affiliate links, sponsored reviews, any kinds of advertisements, link exchanges, etc.
If you do any of them, make sure you use a “nofollow” tag on them – whether they are inbound or outbound.
Google isn’t saying “Don’t buy links”.
It’s OK to buy links to get you click traffic. Just make sure there’s no PageRank passed through those links, i.e. use “nofollow”.
If Matt Cutts were looking over your shoulder as you purchased the link, could you explain to him why it has nothing to do with rank and everything to do with audience relevance?
That should be your guide.
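In markup terms, “use nofollow” is a single attribute on the link. A quick sketch (the URLs are placeholders):

```html
<!-- A paid or sponsored link: rel="nofollow" tells Google
     not to pass PageRank through it -->
<a href="http://example.com/product" rel="nofollow">Check out this product</a>

<!-- Contrast: a plain editorial link, which does pass PageRank -->
<a href="http://example.com/great-resource">A genuinely useful resource</a>
```

The paid link still sends you click traffic; it just takes itself out of the PageRank equation, which is exactly what Google asks for.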
Don’t do link exchanges for the sake of link building.
Don’t do “I link to you and you link to me” with intention of building reciprocal links.
Do it because it makes sense to promote each other’s businesses and your readers benefit from it (i.e. a party planner linking to a local catering site, a flower shop, a local band, etc).
Reserve article marketing and guest posting for traffic.
Don’t do it for link building. And don’t do it en masse; it doesn’t work like that any longer anyway.
As far as guest blogging is concerned, the reputation of both the guest blogger and the receiving blog are extremely important.
As Eric Ward says:
The words “Blog For Us” has become a red flag for me. I’d look at the caliber of all the contributors, the topics of their content. You could write the most elegant guest post in history, but if that same blog follows up your guest post with one about cheap Canadian pharmacies, well, oops.
Don’t use any automated link building software or hire anyone who does.
There are plenty of tools out there that will blast your site to hundreds of website directories or post hundreds of spam comments on your behalf in a blink of an eye.
Stay away from them.
If anyone says it’s currently working, politely turn them down and watch their rankings drop, because sooner or later they will.
That’s exactly what’s happening right now.
Link building tactics that worked even a year or two ago, like blog network link building, are exactly what many site owners are now paying the price for.
Remember the words:
- May be considered
- Widely distributed
If anything you do might be interpreted as such, don’t do it.
[expand title=”In-depth Articles on Google Introduction – August 6″]
In-depth Articles on Google
Last week, Google’s introduction of in-depth articles was all the rage.
So much so that I postponed publishing Ahmed Safwan’s Viral Content guest post at Traffic Generation Café to write an update about it and a follow-up on the implementation of Schema markup for “Article”.
Of course, Schema markup for Articles was nothing new. In fact, the guys behind WPSocial SEO Booster, the plugin I recommended for all Schema markups as well as social media markups, incorporated it into their plugin almost a year ago.
The only difference between THEN and NOW is the fact that Google acknowledged the importance of Schema markup by rewarding the sites that implement it with first-page rankings.
Google’s In-depth Articles Takeaways
I won’t rehash it here since there’s a perfectly great post published on the topic here at Traffic Generation Café.
Just one thing to mention: Schema markup, although it might sound unfamiliar and intimidating to some of you, is in fact fairly easy to apply to your site with the tools I suggested here, and there’s no excuse for not doing it.
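If you’re curious what the markup looks like under the hood (a plugin adds this for you), here’s a minimal sketch using schema.org’s Article vocabulary; the title, date, and author are made-up placeholders:

```html
<!-- Minimal schema.org "Article" microdata; all values are placeholders -->
<article itemscope itemtype="http://schema.org/Article">
  <h1 itemprop="headline">Google In-Depth Articles Explained</h1>
  <meta itemprop="datePublished" content="2013-08-15">
  <span itemprop="author">Jane Doe</span>
  <div itemprop="articleBody">
    <p>The full text of the article goes here...</p>
  </div>
</article>
```

The itemscope/itemtype pair declares the page section as an Article, and each itemprop labels one piece of it for the crawler.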
As Pete Meyers from Moz.com said in a comment on his recent recap of in-depth listings on Google:
Honestly, I’m kind of reading Google’s advice as “If you’re not a big media site, and you want a chance to get listed, then do all this stuff.” They’ll make sure the big sites show up, with or without the markup, because people expect the big sites to show up.
Now go do all this stuff.[/expand]
[expand title=”View Manual Webspam Actions In Webmaster Tools – August 8″]
View Manual Webspam Actions In Webmaster Tools
Google rolled out a new Webmaster Tools feature that shows information about actions taken by the manual webspam team that directly affect your site’s ranking in Google’s web search results. To try it out, go to Webmaster Tools and click on the “Manual Actions” link under “Search Traffic.”
Lo and behold, went to check out the tool and this is what I saw:
Considering I’ve never gotten a single “Unnatural link warning” message or intentionally built a link in over a year and a half, that was surprising.
I’ve got my work cut out for me.
Google has also released eleven documents (with seven videos) on these manual spam actions:
- Manual Actions
- Cloaking and/or sneaky redirects
- Hacked site
- Hidden text and/or keyword stuffing
- Pure spam
- Spammy freehosts
- Thin content with little or no added value
- Unnatural links from your site
- Unnatural links to your site
- Unnatural links to your site—impacts links
- User-generated spam
Have you checked your new Manual Actions tab recently? Any warnings in there?
I checked with a couple of other bloggers as well as my followers on Google+ and it looks like many blogs were affected. YOURS?[/expand]
Section 2: SEO Community Reacts
Google Plus Impact on SEO
Learn what leading search industry experts think about the impact of Google Plus on SEO.
+Eric Enge (Stone Temple Consulting) leads a discussion with +Mark Traphagen (Virante.org) and +Joshua Berg (REALSMO) about correlation studies, passing of PageRank, and whether or not PageRank from Google Plus shares impacts ranking of web pages outside of Google Plus.
To-the-point (Eric Enge made sure of that, lol) actionable information you need to know about Google+ and SEO – worth every minute of it.
And here’s a practical takeaway from me: 3 Steps to Increase your Google+ profile authority/PR.
In short, you can/need to:
- Start tweeting/sharing your Google+ updates instead of the original articles.
- Link from your blog posts to your Google+ posts (just like I will do right below).
- Link from your blog posts to Google+ posts of other people (for good measure and other more selfish benefits).
To learn more about these 3 steps with screenshots and examples of how to put them into practice, read this Google+ post.
Find Previous SEO Discussions
[expand title=”Is the Fear of Linking Killing Natural Links? – August 13″]
Is the Fear of Linking Killing Natural Links?
Following Matt Cutts’ latest video on how we should “nofollow” links in widgets as well as infographics, many webmasters are taking it a step further to “should we link out at all?” for fear of eventually being penalized by Google.
This unhealthy fear of Google is blinding us to the only entity we should really concern ourselves with – our reader / customer.
Imagine me writing this post without linking out to any sources. What would that look like? Unconfirmed, unfounded suspicions at best.
We always need to do what’s best for our readers AND, coincidentally, that’s precisely what Google wants us to do as well.
So link out generously, my friend. Don’t be afraid of missing out on search engine traffic you aren’t getting to begin with; instead, provide enormous value to your readers, build relationships with the bloggers you link out to, and get traffic from them in return.
No External Links? Are You Mad?
Michael Martinez wrote an excellent and sobering post on external linking, What To Do When Using External Links.
It almost feels like the 1990s again. No one seems to know what to do with their Websites any more, or how to link out…
That wild, crazy link-sotted party that has been raging for ten years almost seems to be winding down. Whatever will Web marketers do now that they cannot spam Google’s index so easily any more?
While you should definitely read that post for yourself, here are my quick takeaways:
- Matt Cutts never said “Google doesn’t want you to link to other Websites”. Quite the contrary, he says that Google encourages people to link out to each other.
- Matt Cutts never said to add “rel=’nofollow’” attribute on all outbound links you choose to give to other Websites. All they ask is that you use “rel=’nofollow’” for your marketing links.
- Selling links is fine; Google sells links itself (AdWords). What Google is not OK with is selling links that pass PageRank.
- User-generated links (from comments, for example) are NOT bad — bad links are generated by bad users (in volume). Remove such links from your site even if the links are automatically nofollowed.
Empower other sites with YOUR outbound links. Make sure they are good sites. Even better, make sure they are sites more likely to reward you.
One more quote for the road:
…if you stop obsessing over links then you really should not care what the search engines think of links. In fact, you should just pretend the search engines don’t exist and are not following your links.
Section 3: Your Practical SEO & Link Building Takeaways
Unnatural Links to Your Site: What to Do
If you did get an unnatural link warning, or suspect that you’ve been penalized for your past link building practices, start by reviewing the Manual Actions page in Webmaster Tools to get more information about the action applied to your site.
Links from Scraped Content: Can They Hurt You?
You publish a post. An unscrupulous blogger republishes your post on their blog – with or without a link back, makes no difference. Now you are getting links from them.
Another scenario: you get mentioned on another blog. That post gets scraped. YOU get links back from the scraped content.
What do you do? Will you get into trouble with Google for all these unnatural-looking links?
A while ago, I would’ve said “Hey, it’s free link building!” I even wrote a contest-winning blog post on the topic: 2535 Words on How to Turn sCRAPed Content into Link Building Goldmine.
Where in the past my thinking was “Content scraping is inevitable; might as well make the best of it,” I am now going on the offensive: disavowing links from scraped content, as well as filing DMCA complaints against repeat scrapers.
Example of Scraped Content in Action
Last week, my Google In-Depth Articles: How to Rank for Them In Google Search Results post was mentioned in a post at Moz.com.
Side note: it certainly pays to be one of the first ones to write a definitive post about a new development – that’s what happened after I wrote that post.
The reason I even knew about the mention was because my comment moderation queue started to look like this (ironically, I’ve never actually gotten the trackback from the original Moz post):
Naturally, I thanked Pete Meyers for the mention, but also asked for his input on scraped links and their potential negative impact on sites.
Haha – sorry about that – we are quite the scraper magnet 🙂 These days, we ignore it, but it’s a lot more dangerous for smaller sites, IMO, and Google still isn’t doing a great job of sorting the problem out.
Enjoyed your article – plus, it saved me from explaining all of those bullet points.
Couldn’t agree more. Scraped content links can become very dangerous very quickly, especially to newer sites.
How to Deal with Scrapers
1. Implement rel=”author” markup – first and foremost.
As a matter of fact, this is as good a place as any to say this:
EVERY SITE must implement Google Authorship rel=”author” and, ideally, Google Publisher rel=”publisher” markups.
Used to be somewhat tricky to do, but it’s a breeze now; you can set up your Google Authorship here within minutes.
As of today, this is the most reliable way to establish the authorship of your content and beat scrapers to search engine rankings as the original content source.
Here’s an interesting case study at SearchEngineLand.com, Is Google Authorship Affecting Rankings Today?, which talks about a scraper site managing to claim authorship of content they didn’t write because they had Google Authorship implemented on their site and the originating site didn’t.
2. Use Excerpts in RSS Feed
A lot of scraping is done through RSS feeds and not your actual blog.
If you publish only excerpts in your RSS feed, you’ll automatically turn away a lot of scrapers since they want long-form content and not short excerpts.
3. File a DMCA complaint with the search engine
When you find content scrapers stealing your stuff, you can file a DMCA complaint with Google.
To learn more about defending yourself from scraped content, read:
- How to Keep Scrapers from Ruining Your Content Strategy – QuickSprout.com
- How to Put the Kibosh on Content Scrapers and Thieves – FamousBloggers.net
- The Definitive Guide to Blog Content Scraping & How to Stop It! – HyperArts.com
“State of SEO” Marketing Takeaway
Google is a tough nut to crack. And just as you think you put a dent in it, they grow another skin.
But that’s OK. I like Google. They mean well.
Let’s end this SEO and Link Building Skinny on a high note:
Every day is a party when you love what you do (even though Google might put a damper on it every once in a while)!