The Art Of SEO - The Science Of PPC

October 27, 2008 by Jonah Stein

Steve Huffman, Matt Cutts Defending Web 2.0 From Virtual Blight

I am happy to announce that Reddit founder Steve Huffman and Google’s Matt Cutts are joining my Web 2.0 Summit session, Defend Web 2.0 From Virtual Blight. They will be joined by Jonathan Hochman, who will discuss the strategies Wikipedia uses to address blight, and Guru Rajan, who will present a case study about HumanPresent, a new technology from Pramana that offers a less obtrusive (and currently more effective) alternative to CAPTCHA.

Both Steve and Matt spend most of their day in a cat-and-mouse game against spammers and others who seek to game their systems for personal gain. Wikipedia has hundreds of thousands of contributors with a wide variety of agendas; to keep pace, it uses a mix of bots and human editorial strategies. These panelists have a tremendous amount of experience and will share some powerful strategies for addressing blight. I have been talking about Virtual Blight as a construct for understanding and addressing many of the issues facing site operators for over a year now, and it is really great to get a chance to broaden the audience. My pitch for the session is below.

The success stories of Web 2.0 are the so-called “Social Web”: sites built on crowdsourcing, user participation, user-generated content and user voting/rating systems. Sites such as YouTube, Digg, Yelp and Facebook provide mashups of content sources along with a platform for interaction and participation. Inherent in this model is the assumption that each “user” is an individual participating in a community. The reality is that many “users” are avatars, bots and sock puppets created to spread spam and disinformation, attack individuals, organizations and companies, or manipulate rating systems to promote a private agenda that is not in keeping with the spirit and intent of the community.

The success of Web 2.0 has made it a prime target for spammers, vandals and hackers who want to exploit the trust implicit in this ecosystem. The May 2008 headlines about Craigslist’s ongoing battle with spammers highlight the problem. Web 2.0 companies need to recognize this type of manipulation as a fatal cancer and develop strategies to aggressively defend themselves against the ravages of blight that can devastate their communities.

We are all too familiar with community blight in the physical world: the downward spiral afflicting many of our urban and suburban neighborhoods. Blight is marked by abandoned and foreclosed buildings, liquor stores and payday lenders on alternating corners, trash-strewn lots and front yards, graffiti-covered buildings, broken sidewalks, broken glass and billboards everywhere you look. Prostitutes, drug dealers and scam artists haunt the shadows.

Domains and web properties afflicted with Virtual Blight are like neighborhoods suffering from urban blight. Billboards advertise payday loans, pornography and offshore pharmaceuticals; street-corner hustlers offer knock-off watches, get-rich-quick schemes, pirated movies and software, and other products that are suspect and often illegal. Kids aren’t safe to roam around, and people move out. Online neighborhoods begin as attractive destinations, but often they turn into vacant, desolate ruins. Hotmail and Geocities are two prime examples of Web neighborhoods that have been impacted by virtual blight, destroying billions of dollars’ worth of brand equity in the process.

Filed Under: Google, Punditry, Speaking

September 12, 2008 by Jonah Stein

Pay The Link Love Forward

Aaron Wall had a great post today about Black Hole SEO in which he called out The New York Times, among others, for hoarding PageRank by refusing to give links.

The idea may appeal to some, but it is a recipe for failure. PageRank hoarding is an illusion: linking out to an authority site actually helps your site be authoritative. My take is that Google probably manages internal PageRank flow and external flow separately, so linking out to an external page has nothing to do with how much juice you can pass internally.

The willingness of bloggers to link out liberally whenever they write a post has a lot to do with how much easier it is for a blog to rank than a typical commercial site with the same amount of content and link juice. This generous link profile also explains why so many conspiracy-minded SEOs think Google prefers .org and .edu sites: nonprofits, and particularly academic sites, tend to include a lot of citations. Only commercial sites fail to link out or put all of their outbound links on a single page.

Black hole linking profiles are pretty easy for Google to spot. A big site without outbound links probably sticks out like the Great Wall of China: search engines can see it from outer space!
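To make the hoarding argument concrete, here is a toy sketch of the classic random-surfer PageRank model, in which a page splits its link equity evenly across every link it makes. The pages, weights and scores are made up purely for illustration; the whole point of my speculation above is that Google’s real handling of internal versus external flow is almost certainly more nuanced than this naive split.

```python
# Toy PageRank (classic random-surfer model), just to show how a page's
# link equity gets split across its outbound links. This is NOT how Google
# actually separates internal and external flow; that part is speculation.

DAMPING = 0.85
ITERATIONS = 50

def pagerank(graph):
    """graph: dict mapping each page to the list of pages it links to."""
    pages = list(graph)
    rank = {p: 1.0 / len(pages) for p in pages}
    for _ in range(ITERATIONS):
        new_rank = {p: (1.0 - DAMPING) / len(pages) for p in pages}
        for page, outlinks in graph.items():
            if outlinks:  # equity is divided evenly among all outbound links
                share = DAMPING * rank[page] / len(outlinks)
                for target in outlinks:
                    new_rank[target] += share
            else:  # dangling page: spread its equity evenly over the graph
                for p in pages:
                    new_rank[p] += DAMPING * rank[page] / len(pages)
        rank = new_rank
    return rank

# Hypothetical pages: a blog that links out freely vs. a "black hole" site
# that only links to itself.
graph = {
    "blog/post":         ["blog/about", "nytimes.com/story"],  # links out
    "blog/about":        ["blog/post"],
    "blackhole/home":    ["blackhole/page"],                   # hoards links
    "blackhole/page":    ["blackhole/home"],
    "nytimes.com/story": [],
}

for page, score in sorted(pagerank(graph).items(), key=lambda kv: -kv[1]):
    print(f"{page:20s} {score:.3f}")
```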

So, pay the link love forward. The world will reciprocate.

Filed Under: Google, Search Engine Marketing

August 25, 2008 by Jonah Stein

Google Page One Rankings Get Harder

Anyone else out there seeing Google changing from 10 results per page to 7? This appears to be a UI experiment rather than a permanent change. While only a few people have seen this tweak, it may be a sign of the biggest change to hit search engine results in years.
Google SERP Of Seven
The idea of 10 organic results seemed sacred, surviving the launch of blended/universal results and the integration of local/maps, One Box and Sitelinks, and holding across all the major and not-so-major engines. Google may have been preparing for this for some time with changes such as reducing the number of ad units in the right rail.

It will be very interesting to see how this change influences user behavior. Will it increase the percentage of people who go to page 2? Will it increase CTR for positions 4 through 7? Will it increase the number of people who reformulate the query?

Implicit in this decision seems to be Google’s belief that the top seven results contain all that is relevant.  Hubris is the first step towards defeat.

Filed Under: Google, Search Engine Marketing

May 13, 2008 by Jonah Stein

Friend Connect A Slap In The Face Book

I was at the Googleplex for the Campfire One event last night to mingle with the blogarazzi, drink hot chocolate and be regaled with guacamole recipes and tales of how Google’s new Friend Connect will allow anyone to add social features to their website with only rudimentary HTML experience. Google has jumped from gadgets to widgets by leveraging all the work done to support OpenSocial and OAuth to make it easy for webmasters and web-apprentices to go social.

It’s not the quality of the widgets that bothers me, or even the trade-offs webmasters make when they join a platform. I don’t mind that Google is leveraging open source to meet Facebook’s platform play; Facebook has lots of dough and lots of hubris.

I am bothered by the sudden lack of oxygen in the room as Google inhales whatever momentum social networking has left and makes sure that the next phase of social networking companies will suddenly need to be Google-centric. Microsoft used vaporware and some very predatory contracts to drive competitors out of business. OpenSocial, Google App Engine and now Friend Connect demonstrate that Google has perfected the art of co-opting everyone else’s efforts by offering them an infrastructure umbrella.

Google has absorbed the best ideas of the emerging social web for pennies on the dollar while making sure they have the inside track on monetization. They have done it with a team of hard working, dedicated engineers who want to make the web a better place. Unless you are a cynical search marketer or your start-up is suddenly having trouble breathing, you won’t even notice.

Filed Under: Google, Punditry

April 30, 2008 by Jonah Stein

Google Using Location To Rank Sites?

Google has been displaying local results for queries that contain a local component for some time now. A common query like san francisco restaurant displays 10 local search results along with a map. We now expect this type of local targeting because the query contains a geographic component, which Google interprets as geographic intent.

Many search pundits have been calling recent changes in the SERPs another Google dance. Evidence suggests that Google may be adding geographic information about brick-and-mortar stores and service companies to the algorithm, along with reverse IP lookup, to help boost site rankings in regular results. Take the query engagement ring. I conducted this query in San Francisco and Los Angeles and got very different results.

Google results for engagement rings in Los Angeles

Here is the same query in San Francisco.

Google results for engagement rings in San Francisco

Just to be sure I wasn’t seeing results from different data centers, I went ahead and queried multiple data centers using the SeoLog.com Datacenter Ranking Tool and discovered that those results didn’t match the LA results or the San Francisco results. In fact, some sites rank significantly better in both San Francisco and Los Angeles than they do on any of the data centers the SEOLog tool checks.

We know Google has been collecting service-area information from webmasters through Webmaster Central. We know it has extensive data from Google Maps and third-party vendors about where businesses are located. It makes sense that Google would add relevance to a website that represents a nearby brick-and-mortar store or service, and it appears they are starting to do so.
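For what it’s worth, here is a hypothetical sketch of how a proximity signal derived from a searcher’s IP-based location could be blended into an ordinary relevance score. The weights, coordinates and decay distance are invented for illustration; nobody outside Google knows how (or whether) these signals are actually combined.

```python
from math import radians, sin, cos, asin, sqrt

# A hypothetical illustration (not Google's actual ranking code) of how a
# location signal derived from a searcher's IP could be blended into an
# organic relevance score. All weights and coordinates are made up.

def haversine_miles(lat1, lon1, lat2, lon2):
    """Great-circle distance between two lat/lon points, in miles."""
    lat1, lon1, lat2, lon2 = map(radians, (lat1, lon1, lat2, lon2))
    a = sin((lat2 - lat1) / 2) ** 2 + cos(lat1) * cos(lat2) * sin((lon2 - lon1) / 2) ** 2
    return 3959 * 2 * asin(sqrt(a))

def blended_score(relevance, business, searcher, geo_weight=0.3, half_distance=25):
    """Mix on-page relevance with a proximity boost that decays with distance."""
    distance = haversine_miles(*business, *searcher)
    proximity = 1.0 / (1.0 + distance / half_distance)  # 1.0 next door, near 0 far away
    return (1 - geo_weight) * relevance + geo_weight * proximity

# Two hypothetical jewelers with identical on-page relevance for "engagement ring".
sf_jeweler = (37.79, -122.41)  # San Francisco storefront
la_jeweler = (34.06, -118.25)  # Los Angeles storefront

for label, searcher in (("searcher in SF:", (37.77, -122.42)),
                        ("searcher in LA:", (34.05, -118.24))):
    print(label,
          "SF jeweler", round(blended_score(0.8, sf_jeweler, searcher), 3),
          "| LA jeweler", round(blended_score(0.8, la_jeweler, searcher), 3))
```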

Updated 10-20-2009: In a Webmaster video today, Matt Cutts acknowledged that Google “may” some day use location/IP data to personalize search results.

Filed Under: Google, Search Engine Marketing Tagged With: Google Algorithm, Google SEO
