The Art Of SEO - The Science Of PPC


January 2, 2014 by Jonah Stein

Building Your Hummingbird Feeder


Google announced in September 2013 the release of Hummingbird, its first algorithm replacement in 15 years. This was not a tweak or a new signal, but the first complete overhaul of Google’s algorithm since BackRub. Hummingbird is billed as a response to the changing nature of search: the trend toward natural language queries and the growth in the length of queries.

SEO pundits were quick to offer their take on what this change meant and how it would affect us all; mostly they missed the mark, although some gave good advice in the process.

Google Jeopardy: Put all of your content in the form of a question, Eric Ward suggested, and you will suddenly rank at the top of the first page. It is true that Q&A sites like Quora and Stack Overflow have been all the rage lately. It is also true that a well-executed FAQ strategy can be the foundation of simple and effective content marketing. If someone actually asks you a question via email, live chat, a forum, etc., then hundreds of other users are likely asking the same question via search. Mining user questions and publishing the answers is a great strategy that could yield big gains in traffic, but it probably has little to do with Hummingbird.
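
If you want to put that mining idea into practice, a minimal sketch follows; the question-detecting regex and the sample messages are deliberately naive, invented placeholders, not a production pipeline.

```python
# Minimal sketch: mine recurring customer questions from support text
# (email, live chat, forum exports) and rank them as FAQ candidates.
# The regex and sample data are hypothetical placeholders.
import re
from collections import Counter

QUESTION_RE = re.compile(r"[A-Z][^.?!]*\?")  # naive: a sentence ending in "?"

def extract_questions(text: str) -> list[str]:
    """Pull question-like sentences out of a free-text message."""
    return [q.strip().lower() for q in QUESTION_RE.findall(text)]

def rank_faq_candidates(messages: list[str], top_n: int = 20) -> list[tuple[str, int]]:
    """Count recurring questions; the most frequent ones are the best
    candidates for public FAQ/answer pages."""
    counts: Counter = Counter()
    for msg in messages:
        counts.update(extract_questions(msg))
    return counts.most_common(top_n)

if __name__ == "__main__":
    sample = [
        "How do I export my logo as a vector file?",
        "How do I export my logo as a vector file? Thanks!",
        "Can I change the font after purchase?",
    ]
    for question, count in rank_faq_candidates(sample):
        print(count, question)
```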

Topics and Sets: AJ Kohn has a great piece about Google Now and Sets which got me thinking about the role of search history and personalization in Hummingbird. It is well worth a read. Basically, AJ points out how Google maps words (queries) and the sites you click on to build Google Now topics. In the process, they create a small data leak, giving us a peek into the personalization/disambiguation engine Google is using to discover true user intent.

My analysis of the Hummingbird update focused largely on Google’s ability to improve topic modeling through a combination of traditional text analysis, natural language processing, and entity detection. Google Now Topics looks like a Hummingbird learning lab.

Watching how queries and click behavior turn into topics (there’s that word again) and what types of content are displayed for each topic is a window into Google’s evolving abilities and application of entities in search results. It may not be the full picture of what’s going on, but there’s enough here to put a lot of paint on the canvas.

This insight is great, as far as it goes, but in my opinion entities and topics are important but not at the core of Hummingbird.

Hummingbird is about disambiguation of search intent based on the user’s search history. The missing clues to the true nature of Hummingbird are readily available.

The Big Brand Bailout: A few years ago, big brands started magically dominating search results for highly competitive short-tail queries. Displaced site owners (many with lead-gen sites) screamed in protest. Google called this update Vince. @stuntdubl called it the Big Brand Bailout, and that is the name that stuck. Hundreds of theories suggest how and why this happened. Eventually, a Google engineer who was not trained in the @MattCutts school of answering-questions-without-saying-anything-meaningful slipped up and revealed that Google was relying on users’ subsequent query behavior to improve the SERP for the initial query, elevating sites (brands) that showed up later in the click stream. Brands that were included in as little as 1-2% of subsequent queries got pushed onto the first page of results. This was the first direct evidence we had of Google using user behavior to disambiguate intent and influence rankings on the original query.
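
As a toy illustration (emphatically not Google’s code), you can think of that signal as the share of sessions that begin with a head query and later mention a brand. Everything below, including the session log format and the numbers, is invented.

```python
# Toy model of the "Vince" signal described above: the share of search
# sessions starting with a head query whose later queries mention a brand.
# Session logs are hypothetical, illustration only.
def refinement_share(sessions: list[list[str]], head_query: str, brand: str) -> float:
    """Fraction of sessions beginning with `head_query` whose subsequent
    queries mention `brand` (the "1-2% of subsequent queries" idea)."""
    starts = [s for s in sessions if s and s[0] == head_query]
    if not starts:
        return 0.0
    hits = sum(any(brand in q for q in s[1:]) for s in starts)
    return hits / len(starts)

sessions = [
    ["logo design", "logogarden logo design"],
    ["logo design", "cheap logo design"],
    ["logo design", "free logo maker", "logogarden"],
]
print(refinement_share(sessions, "logo design", "logogarden"))  # ~0.67
```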

Panda: Many parts of the Panda update remain opaque, and the classifier has evolved significantly since its first release. Google characterized it as a machine learning algorithm and hence a black box that wouldn’t allow for manual intervention. We later learned that some sites were subsequently added to the training set as quality sites, causing them to recover and be locked in as “good sites.” This makes it especially hard to compare winners and losers to reverse engineer best practices. What most SEO practitioners agree upon is that user behavior and engagement play a large role in a site’s Panda score. If users quickly return to the search engine and click on the next result or refine their query, that can’t be a good signal for site quality.
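
A back-of-the-envelope version of that quick-return (“pogo stick”) signal might look like the sketch below. The event format is invented, and real measurement would be far more involved.

```python
# Sketch of a pogo-stick metric: how often a click on a result is
# followed by a quick return to the SERP. Event shape is hypothetical.
def short_click_rate(events: list[dict], site: str, threshold_secs: float = 10.0) -> float:
    """Share of clicks to `site` followed by a return to the results
    page within `threshold_secs`; a `None` return means the user stayed."""
    clicks = [e for e in events if e["type"] == "click" and e["site"] == site]
    if not clicks:
        return 0.0
    pogos = sum(1 for e in clicks
                if e["return_after_secs"] is not None
                and e["return_after_secs"] < threshold_secs)
    return pogos / len(clicks)

events = [
    {"type": "click", "site": "example.com", "return_after_secs": 4.0},
    {"type": "click", "site": "example.com", "return_after_secs": None},
    {"type": "click", "site": "example.com", "return_after_secs": 120.0},
]
print(short_click_rate(events, "example.com"))  # ~0.33
```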

Personalization: If you are an SEO, you likely spend a lot of time in incognito/private browsing mode and turn off search history, location tracking, and other features that allow Google to track your intent, your movements, and your online behavior. If you are not an SEO, you likely surf logged into your Google+/Gmail/Gchat/YouTube/Borg account with your history enabled. If you browsed more like a normal person, you would notice how dramatically Google SERPs change based on previous queries. Notice how the search URL has evolved to include your query sequence.

Conversational Voice Search: Google demonstrated what they mean by “conversational search” at the I/O conference in May of 2013. Danny Sullivan provided his usual excellent coverage, which focused on Google’s ability to remember your previous queries and provide context for your question. This is a significant advance in the user experience and something that differentiates Google from Siri. More importantly, it is an unambiguous statement about what Google means by “conversational search”: the ability to use previous interactions to disambiguate user intent for subsequent queries.

Google’s announcements are often aspirational and seemingly lack nuance. They tell us that they have solved a problem or devalued a tactic, and we all point to the exceptions before dismissing the announcement as hype or FUD; years later we look around, and that tactic is all but dead and its practitioners are toast (unless the company is big enough to earn Google immunity). These pronouncements feel false and misleading because they are made several iterations before the goal is accomplished. The key to understanding where Google is now is to look at what they told us they were doing a year ago.

In the case of Hummingbird, what they told us is that the search quality team has been renamed the Knowledge Team; they want to answer people’s search intent instead of always pushing users off to other (our) websites. Google proudly proclaims that they do over 500 algorithm updates per year and that they are constantly testing refinements, new layouts, and features. They also allude to the progress they are making with machine learning and their advancing ability to make connections based on the enormous amount of data they accumulate every day. Hummingbird marks a sea change in our understanding of the algorithm. Instead of the Knowledge Team, they should have renamed it the Measurement Team, because Google is measuring everything we do and trying to mine that data to understand intent and provide users with the variation they are looking for.

What does this mean to site owners?

This is the $64 billion question, and one without a simple answer. Matt Cutts told us at SMX Advanced in 2013 that only 15% of queries are of interest to any webmaster/SEO anywhere; 85% of what Google worries about, we pay no attention to. That means an update affecting 1.5% of all queries could affect 10% of the queries some SEO somewhere cares about (1.5% concentrated inside that 15% slice is 10% of it), and 50% of the “money terms” on Google.

Simultaneously, Google tends to roll out changes and then iterate on them. The lack of screaming protests or volatile weather reports suggests that very few results actually changed when Hummingbird was released, at least results you can view in a ranking scraper. Instead, Google rolled out the tools they need to make the next leap in personalization, which will gradually pick winners and losers.

The good news is that Hummingbird provides a significant chance for onsite SEO to improve performance and generate strong ROI. Machine learning is data driven and by nature the product of objective, measurable user actions. Site owners who embrace user-focused optimization (not narrowly defined conversion goals) and build out robust topics, with segmentation driven by mapping related queries to content that honestly addresses user intent, can significantly improve engagement.

That is Hummingbird nectar.
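
To make that query-to-content mapping concrete, here is a minimal sketch of grouping related queries (say, from your analytics export) into candidate topics. The stopword list and sample queries are invented, and real topic modeling would go much further.

```python
# Minimal sketch: group related queries into candidate topics by their
# content-bearing terms, as a first step toward mapping queries to pages.
# Stopwords and sample queries are hypothetical.
from collections import defaultdict

STOPWORDS = {"a", "the", "for", "to", "how", "do", "does", "i", "my", "much"}

def topic_key(query: str) -> frozenset:
    """Reduce a query to the set of terms that carry its intent."""
    return frozenset(w for w in query.lower().split() if w not in STOPWORDS)

def group_queries(queries: list[str]) -> dict[frozenset, list[str]]:
    """Queries sharing the same content terms land in one topic bucket;
    each bucket should map to one page that honestly answers the intent."""
    groups: dict[frozenset, list[str]] = defaultdict(list)
    for q in queries:
        groups[topic_key(q)].append(q)
    return groups

queries = [
    "how do I design a logo",
    "design a logo",
    "logo design cost",
    "how much does logo design cost",
]
for key, qs in group_queries(queries).items():
    print(sorted(key), "->", qs)
```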


Filed Under: Google

August 13, 2012 by Jonah Stein

July Search Quality Updates



Google has taken to releasing lists of its algorithm changes, and this month they announced a total of 87, affecting everything from answers to images to spelling. Here are the ones specifically identified as search-quality related.

  • uefa-euro1 (Answers): Addition of a live result showing schedule and scores of the EURO 2012 games (European championship of national soccer teams).
  • Bamse (Page Quality): This launch helps you find more high-quality content from trusted sources.
  • Bamse-17L (Page Quality): This launch helps you find more high-quality content from trusted sources.
  • GreenLandII (Page Quality): We’ve incorporated new data into the Panda algorithm to better detect high-quality sites and pages.
  • #82353 (Page Quality): This change refreshes data for the Panda high-quality sites algorithm.
  • #82367 (Ranking Component): This launch helps you find more high-quality content from trusted sources.
  • #82666 (Page Quality): This launch helps you find more high-quality content from trusted sources.
  • PandaMay (Search Quality): We launched a data refresh for our Panda high-quality sites algorithm.
  • Hamel (Page Quality): This change updates a model we use to help you find high-quality pages with unique content.
  • Panda JK (Page Quality): We launched Panda on google.co.jp and google.co.kr to promote more high-quality sites for users in Japan and Korea.
  • JnBamboo (Page Quality): We’ve updated data for our Panda high-quality sites algorithm.

Not very helpful but better than nothing!

Filed Under: Google

August 10, 2012 by Jonah Stein

FAIL – Victim of Negative SEO Still Suffering


I just received a form-letter response to a reconsideration request. A generic response to a letter submitted after hundreds of hours of effort certainly doesn’t come as news to anyone who has been penalized by Google or who works with penalized clients, but if I may mix a few metaphors, this is salt poured into an open wound by a vengeful and arbitrary god.

I should start with a little background. In April of this year I was engaged by LogoGarden.com to determine why their Google traffic had suddenly fallen. Within a few seconds it was obvious that the site had been penalized, and before I accepted the project I asked the client about their link building history. I was assured that while they had engaged two highly respected SEO firms to build links to their site, the firms had promised that no links would be purchased on their behalf and that only “white hat techniques” would be employed to garner high-quality links from relevant sites. Here is John Williams’ summary of their link building:

 In 2010 we had a link building campaign for logogarden.com conducted by Stone Temple Consulting which is owned by Eric Enge.  He assured me at the time that they would not buy any links.  The other link building activity for LogoGarden was conducted by SlingShot SEO from November 2011 to February 2012. They were recommended to us by Jillian Muessig of SEOMoz as a “white hat link builder”. Again, they were only asked to build links to LogoGarden.com and they specifically were instructed not to buy links.

After an aggressive round of onsite de-optimization failed to produce any results, I turned to analyzing their backlink profile. It should come as no surprise by this point in the narrative that what I found was not pretty: over 20,000 links from about 1,000 linking domains. Most of the links were from sites that were obviously spam, and about half used the exact-match phrase “logo design”. In short, just what you would expect to see if you bought a bunch of crappy links from Fiverr.com or engaged an offshore link builder in response to an email, and exactly what the client swore he never did.

Digging further, I found a couple of obvious patterns.

  1. The vast majority of these links were added from October 2011 to January 2012.
  2. Links were pointed at both LogoGarden.com and LogoGarden.co.uk

The importance of these facts is quite simple: LogoGarden did not have ANY link builders engaged during October and most of November, AND neither of the link building firms had been asked to target the UK site, making it extremely unlikely that the damage was done by their SEO firms. In addition, the start of this link surge corresponds with a copyright dispute that LogoGarden had with a couple of prominent designers.

Analyzing the facts, I became convinced that LogoGarden.com is the victim of a prolonged and effective negative SEO attack. But knowing you are a negative SEO victim and proving it are two different things.

No link profile is pristine. Even a site that has never engaged in any link building has been scraped, spun, rehashed, and mashed up in enough places that some bad links will exist. That is one of the challenges faced by search engines, and in the era of Penguin and other link penalties it makes it especially hard for a site owner or SEO consultant to know what is going on. It is indisputable that a couple of spammy, pay-per-post type links to LogoGarden.co.uk date all the way back to 2010. For a skeptic (or a search engineer), these posts provide evidence of original sin and can be used to discredit the claim that someone else is responsible.

The only recommendation I could make was for the client to embark on a link cleanup campaign precisely as if they were responsible for the links in the first place. We pulled the historical data out of MajesticSEO.com and combined it with reports from Ahrefs.com and OpenSiteExplorer.org. We then processed the links into the following categories and attempted to contact every webmaster we could from the spam category (a minimal sketch of this kind of merge-and-categorize step follows the list):

  1. Spam – spam sites
  2. Offline – site not live
  3. Offline – domain expired
  4. Guest post submitted by John Williams
  5. Blog/articles about LG symbols – these are comments and/or articles about the copyright issues with some of our symbols.
  6. Not a junk link – industry site – these are sites that contain our link but look like legitimate industry websites that are discussing graphic design, web design, etc.
  7. No link – could not find the LG link on the site

Details can be found here: https://docs.google.com/spreadsheet/pub?key=0An8EJTH0YMtkdHFNeGVtWncza2tKZU1fUHdWOXVEYmc&output=html
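
Here is that sketch, assuming CSV exports with `source_url` and `anchor` columns. The filenames, column names, and the single spam heuristic are hypothetical placeholders; the real classification was largely manual review.

```python
# Sketch: merge backlink exports from several tools, de-duplicate by
# linking domain, and apply a rough first-pass bucket. Filenames, column
# names, and the spam heuristic are hypothetical placeholders.
import csv
from urllib.parse import urlparse

SPAM_ANCHORS = {"logo design"}  # exact-match anchors that raise a flag

def load_links(paths: list[str]) -> dict[str, dict]:
    """Read CSV exports (expects full URLs in a 'source_url' column) and
    keep one representative row per linking domain."""
    by_domain: dict[str, dict] = {}
    for path in paths:
        with open(path, newline="") as fh:
            for row in csv.DictReader(fh):
                domain = urlparse(row["source_url"]).netloc
                by_domain.setdefault(domain, row)
    return by_domain

def categorize(row: dict) -> str:
    """Rough first pass; every bucket still needs human review."""
    if row["anchor"].strip().lower() in SPAM_ANCHORS:
        return "spam-suspect"
    return "needs-review"

if __name__ == "__main__":
    for domain, row in load_links(["majestic.csv", "ahrefs.csv", "ose.csv"]).items():
        print(domain, categorize(row))
```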
In addition, my client was justifiably angry to be suffering tremendous losses because of the actions of some unknown individual. In response, he decided to offer a $10,000 reward for proof of who was responsible for this campaign. Not only did he blog about this offer, he also included it in every email he sent to site owners asking them to remove links to LogoGarden.com. Finally, after hundreds of hours of effort to classify sites, find contact information, and reach out to the sites that had been used to attack us, John Williams wrote the following reconsideration request:

I am writing to request reconsideration and redress to lift the algorithmic and manual actions taken against LogoGarden.com. It is our strong belief that LogoGarden.com and LogoGarden.co.uk are the victims of a negative SEO attack resulting in a ranking penalty.

We clearly see hundreds of links from spammy networks aimed at both the UK and US (.com) domains. A large volume of low-quality links on expired/repurposed domains began to appear in October of 2011, aimed at both the US and the UK site, during a time period when we had no active SEO engagements. This time period also corresponds with a copyright issue we had with a handful of graphic designers in the US. When LogoGarden was made aware of the issue, we quickly rectified it. While we have no evidence of who is behind the negative SEO campaign, the timing is suspicious.

We may have created some of the issue ourselves with the work done by one of our previous SEO consultants. In 2010 we had a link building campaign for logogarden.com conducted by Stone Temple Consulting, which is owned by Eric Enge. Eric’s reputation is excellent and he is outspoken against paid links. He assured me at the time that they would not buy any links, and he has subsequently confirmed this with Jonah Stein, who is currently advising us on this matter. We do see that a few links built in 2010 aimed at LogoGarden.co.uk are on splog sites with targeted anchor text. All evidence would suggest these are paid links, but we explicitly NEVER contracted anyone to build links to the UK site. The other link building activity for LogoGarden was conducted by SlingShot SEO from November 2011 to February 2012. They were recommended to us by Jillian Muessig of SEOMoz as a “white hat link builder”. Again, they were only asked to build links to LogoGarden.com and they specifically were instructed not to buy links.

Their subsequent refusal to provide a list of links they built resulted in their termination but they still insist that they did not buy any links on these networks. Our research indicates that they mostly created forum links. They have refused to take down these links or participate in our reconsideration request.

I understand that it is difficult to prove a third party’s actions are responsible for these links as opposed to our own actions, but here are the steps we have taken to address the issue:

1. We have attempted to contact the owner of every blog site/network that is linking to LogoGarden.co.uk with targeted anchor text to ask them to remove links. Details, on a site by site basis, are included here: https://docs.google.com/spreadsheet/pub?key=0An8EJTH0YMtkdHFNeGVtWncza2tKZU1fUHdWOXVEYmc&output=html

2. We have attempted to contact the owner of every blog site/network that is linking to LogoGarden.com with targeted anchor text to ask them to remove links. Details, on a site by site basis, are included here:

3. We have offered a $10,000 reward for proof of who is paying for them to be built in the first place and have been actively promoting this reward to every site we contact to request a link removal. Here is a link to the article: http://www.logogarden.com/blog/branding/negative-seo-victim-strikes-back/

Here is the actual letter, with the subject “$10,000 reward”, that we have either sent to or entered into the contact forms of hundreds of sites:

My company, LogoGarden.com, has been scammed by malicious SEO behavior. I’m offering USD $10,000.00 for information leading to the identification of the individual(s) targeting LogoGarden.com with junk backlinks. Someone has paid blogs like yours to post links to LogoGarden.com. Just to be clear, your business practices are not in question. You offer a service (backlinks), and it is up to companies whether they choose to use your blog or not.

In February 2012, LogoGarden had page-one organic search rankings for “Logo Design”. Overnight, LogoGarden’s top term “Logo Design” vanished from Google. After intense analysis, our SEO firm ruled out all possibilities but one: SEO sabotage. Someone is paying blogs with the intent of costing LogoGarden market share. Also, PLEASE REMOVE ANY OF YOUR BLOG LINKS THAT LINK TO LOGOGARDEN.COM.

Please contact me at John@LogoGarden.com.


It is not clear what else LogoGarden could have done. What is clear is that even an enormous effort to contact webmasters cannot begin to undo the damage that can be inflicted by a negative SEO attack. A majority of these sites have no contact information, and it is doubtful that most of the rest actually reach a human being who can be convinced to take action. In fact, one of the biggest things spam sites have in common is that ownership is deliberately obfuscated and contacting the owner is virtually impossible.

Filed Under: Google

March 21, 2012 by Jonah Stein

Fixing Panda Problems – SEOBook Interview


It’s been a long time since I updated my blog, but I was recently interviewed by Aaron Wall of SEO Book about diagnosing and solving Panda problems. The interview is long and hopefully helpful. The feedback has been gratifying, especially since it seems no one is talking about fixing Panda problems anymore. Here are a couple of quotes that may make you want to read the rest:

My takeaways from Panda are that this is not an individual change or something with a magic bullet solution. Panda is clearly based on data about the user interacting with the SERP (Bounce, Pogo Sticking), time on site, page views, etc., but it is not something you can easily reduce to 1 number or a short set of recommendations. To address a site that has been Pandalized requires you to isolate the “best content” based on your user engagement and try to improve that.

Diversify your traffic!

Last year Google made a huge stink about MSN “stealing” results because they were sniffing traffic streams and crawling queries on Google. The truth is that Google has so many data sources and so many signals to analyze that they don’t need to crawl Facebook or index links on Twitter. They know where traffic is coming from and where it is going, and if you are getting traffic from social, they know it.

Engagement is not conversion or time on site, it is honoring search intent.

More importantly, however, is that they are going to focus on meeting the needs of the user as opposed to simply converting them during that visit. To use a baseball analogy, we have spent 15 years keeping score of home runs while the companies that are winning the game have been tracking walks, singles, doubles and outs.

Filed Under: Google

May 18, 2011 by Jonah Stein

When A##hat Hackers Attack


Today brings yet another story of a “high profile” hack, this time targeting Ronaldinho, http://www.ronaldinhogaucho.com/, one of the best soccer players in the world and a man loved and hated by millions. No one should be surprised by the lack of security for a celebrity website, or that the webmaster managed to “restore” the site while leaving the hidden links in the code. Likely as not, those links were not even left by the hacker Terrorist_MC who defaced the site, since if one person can find an exploit, many others likely can as well.

Terrorist_MC, Konut Projeleri, and Gebze Evden Eve Nakliyat: three more reasons to sign up for free website backup using the “coupon code” itstheroi.

What really pisses me off is that someone is paying hackers to “build links” for their sites so they can try to rank for “turkish web design” or “housing projects”… or perhaps sabotage their competitors. This type of “link building” destroys the reputation of our industry and makes consumers justifiably nervous when visiting sites that are not from “big brands” (although Ronaldinho is certainly a big brand). More importantly, just as AdSense monetizes spam and scraper sites, this type of “SEO” monetizes the script kiddies and hackers who devote themselves to making the web a more dangerous place.

Just as upsetting is how bad the “backup” and monitoring systems of web hosting providers are. These backups sound good in theory: you are assured that your site is backed up on a system that is completely separate from the main one and that you’ll be able to access it whenever you need it. When you actually need them, like when your site has been hacked or your developers screw up, you often discover that your backup is hard to access, out of date, or has been affected by the same event.

I first came face to face with this combination of blame avoidance and finger pointing about two years ago, when a customer who had been hacked called me praying I had a backup of his website. The only version I had was about three months old, which was actually newer than the most recent version his ISP was able to recover. The Indian developers had a newer version, one that had never gone live because it was so full of bugs. The upshot of that experience was the seed that grew into a little startup called CodeGuard.

CodeGuard is unlike ISP-based backups. Instead of a static snapshot of your site kept by your ISP (hopefully), we use a file integrity monitoring system built on a version control system and store site data in the cloud. CodeGuard backups are stored as the differential between each daily scan of the site, allowing users visibility into what has changed, along with the ability to “undo” changes on their site and restore to a past version in real time (minus the time it takes to push the files over FTP), much like Apple’s Time Machine does for your laptop.
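
A minimal sketch of the differential idea, not CodeGuard’s actual implementation: hash every file on each scan and record only what changed since the previous snapshot.

```python
# Sketch of a differential backup: hash every file per scan and record
# only additions, modifications, and deletions between snapshots.
# Paths are hypothetical; this is not CodeGuard's production code.
import hashlib
from pathlib import Path

def snapshot(root: str) -> dict[str, str]:
    """Map each file's relative path to a SHA-256 digest of its contents."""
    return {
        str(p.relative_to(root)): hashlib.sha256(p.read_bytes()).hexdigest()
        for p in Path(root).rglob("*") if p.is_file()
    }

def diff(prev: dict[str, str], curr: dict[str, str]) -> dict[str, list[str]]:
    """The 'differential' stored instead of a full copy each day."""
    return {
        "added":    [f for f in curr if f not in prev],
        "modified": [f for f in curr if f in prev and curr[f] != prev[f]],
        "deleted":  [f for f in prev if f not in curr],
    }

yesterday = snapshot("/var/www/site")  # hypothetical docroot
today = snapshot("/var/www/site")
print(diff(yesterday, today))
```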

If the current backup solutions for webmasters are lackluster, then the systems that alert webmasters when their site has been hacked are criminal. Webmasters discover they have been hacked because traffic suddenly disappears, they see a warning message on Google when searching for themselves, or they get an email from a customer complaining about strange behavior.

CodeGuard’s differential backup is a game changer for hack detection and remediation. In addition to pinging the Google Safe Browsing API for our clients, we also scan files that have been modified since our last backup. This allows us to identify hacking and alert the site owner before the site can spread malware, have its links pirated, or act as a parasitic host for spammers, hopefully before a Safe Browsing alert quarantines the site and kills all of the traffic.
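
Building on the snapshot sketch above, the detection step could scan just the changed files for telltale injection patterns. The regex here is a simplistic example, and real malware scanning is far more involved.

```python
# Sketch: scan only the files that changed since the last snapshot for
# common injection patterns. The regex is a simplistic example, not a
# real malware signature database.
import re
from pathlib import Path

SUSPICIOUS = re.compile(
    r"eval\s*\(\s*base64_decode|<iframe[^>]+display\s*:\s*none",
    re.IGNORECASE,
)

def scan_changed_files(root: str, changed: list[str]) -> list[str]:
    """Return changed files whose contents match a known-bad pattern, so
    the owner can be alerted before Safe Browsing quarantines the site."""
    return [
        rel for rel in changed
        if SUSPICIOUS.search((Path(root) / rel).read_text(errors="ignore"))
    ]

# usage (with the diff() sketch above):
#   delta = diff(yesterday, today)
#   flagged = scan_changed_files("/var/www/site", delta["added"] + delta["modified"])
```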

In the event a hack or an unauthorized change is detected, webmasters can quickly revert to the last known good version and have their site working in minutes without engaging a developer to remediate the issue. CodeGuard can then be set to automatically revert the site to that version until the owner is able to patch the vulnerability in the site.

You can sign up for free website backup using the “coupon code” of “itstheroi”.

Filed Under: Google, Punditry
