Top 10 Reasons Why We Cloaked (Google Spoof)

To continue on with the entry below, Busted: Google And Cloaking Scandal, let’s have some fun and list a few excuses Google could use:

Top 10 Reasons Why We Cloaked (Google Spoof)

10. Allegra went horribly, horribly wrong
9. Was that the title tag? We thought it was the keyword tag!
8. We knew that deal was too good to be true when we outsourced!
7. Who let the dogs out?
6. It was ah, er, um, a hacker! A hacker did it! Big, ugly, hairy hacker!
5. Someone pushed the shiny red button
4. The Google logo is possessed – it did it
3. Do No Evil applies to you, not us, suckaaaah
2. Martha Stewart, Michael Jackson, Mark Jen did it (take your pick)

And the #1 Reason Google Will Give For Cloaking:


You tell me.


[disclaimer: This is all in good fun – none of these reasons has actually been given by Google :lol:]

Busted: Google And Cloaking Scandal

Oh boy.

2005 is not shaping up to be the Year Of The Google now, is it? First the AutoLink scandal and now this flat-out, jaw-dropping, eye-popping newsflash:

From ThreadWatch: Google Caught Cloaking And Keyword Stuffing?

What’s all the fuss about? Cloaking is a method of manipulating search engines: you show one page to your visitors and a different one to the search bots. One way to catch it is to monitor the pages Google has cached for your competitor. What’s shown in the cache is the page Google sees, not what the human visitor sees. Sometimes webmasters use cloaking to stuff keywords, remove outbound links, and pull other tricks so that Google will rank their page more favorably.
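As a minimal sketch of the technique (illustration only – the titles below are made up, and this is not any real site’s code), a cloaking server simply branches on the User-Agent header:

```python
# Minimal sketch of user-agent cloaking (illustration only).
# A cloaking server inspects the User-Agent header and serves a
# keyword-stuffed page to crawlers and a clean page to humans.

HUMAN_TITLE = "Why do traffic estimates differ?"
STUFFED_TITLE = "traffic estimator, traffic estimates, traffic tool"

def choose_title(user_agent: str) -> str:
    """Return the title a cloaking server would serve to this client."""
    if "googlebot" in user_agent.lower():
        return STUFFED_TITLE  # what the search engine (and its cache) sees
    return HUMAN_TITLE        # what a regular visitor sees

print(choose_title("Mozilla/5.0 (compatible; Googlebot/2.1)"))
print(choose_title("Mozilla/5.0 (Windows; U; Firefox/1.0)"))
```

This is exactly why the cache check works: the cached copy records whatever the server chose to send the bot, not what it sends you.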

Is cloaking fair play? Legal? An approved method of SEO? Nope. Not according to Google:

Cloaking – Google FAQ

What is cloaking?

The term “cloaking” is used to describe a website that returns altered webpages to search engines crawling the site. In other words, the webserver is programmed to return different content to Google than it returns to regular users, usually in an attempt to distort search engine rankings. This can mislead users about what they’ll find when they click on a search result. To preserve the accuracy and quality of our search results, Google may permanently ban from our index any sites or site authors that engage in cloaking to distort their search rankings.

Question is: Will Google ban its bad self now? 😕

So who got caught doing what where?

Look at the Title (top of browser) on Google’s page here:

Google Adwords Support

Google AdWords Support: Why do traffic estimates for my Ad Group differ from those given by the standalone tool?

Now visit that same page in the Firefox browser with the user agent switched to Googlebot (the extension can be downloaded here: User Agent Switcher – Firefox Extension).

Since you’re visiting the page with the Googlebot user agent, you’ll see that the Title is now different:

traffic estimator, traffic estimates, traffic tool, estimate traffic
Google AdWords Support: Why do traffic estimates for my Ad Group differ from those given by the standalone tool?

The keywords – traffic estimator, traffic estimates, traffic tool, estimate traffic – were stuffed in.
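You don’t even need Firefox for this check – any HTTP client that lets you set the User-Agent header will do. A rough sketch using Python’s standard library (the URL is a placeholder, and the UA string is one Googlebot is known to have used; spoofing it only changes what a cloaking server decides to send back):

```python
import urllib.request

# A Googlebot user-agent string (spoofing it just asks a cloaking
# server to show us its "bot" version of the page).
GOOGLEBOT_UA = "Googlebot/2.1 (+http://www.google.com/bot.html)"

def as_googlebot(url: str) -> urllib.request.Request:
    """Build a request whose User-Agent header claims to be Googlebot."""
    return urllib.request.Request(url, headers={"User-Agent": GOOGLEBOT_UA})

# Placeholder URL - substitute the page you want to check.
req = as_googlebot("https://example.com/")
# html = urllib.request.urlopen(req).read()  # then compare <title> against a normal fetch
print(req.get_header("User-agent"))
```

Fetch the page once normally and once through `as_googlebot`, and diff the titles – if they differ, the server is cloaking.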

You can view screen caps here:

Google Cloaking Pics

Wonder how Google is going to explain this?

Web Host Cloaking Scam

Heads up and check your pages:

Obnoxious Cloaking Scam

The scam is perpetrated by a hosting company on its hosting clients. When a search engine spider, such as Googlebot, requests a client’s page, the host adds a bunch of links to the page that is returned. The client has no idea that it happens. The links are mostly to the host site’s pages, and are added for the link text benefit.

You can monitor for this by searching your domain name in Google, then viewing the cached copy. Look for any added links. Why is this so serious? Your website could be penalized or banned in Google for cloaking pages. Plus, who knows where they’re linking you to.
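That check can be automated. A rough sketch (assuming you have saved both the live copy of your page and the copy from Google’s cache as HTML; the sample markup below is invented for illustration) that diffs the links in the two versions:

```python
import re

def extract_links(html: str) -> set:
    """Collect href values from anchor tags (crude regex, fine for a spot check)."""
    return set(re.findall(r'<a\s+[^>]*href="([^"]+)"', html, re.IGNORECASE))

def injected_links(human_html: str, bot_html: str) -> set:
    """Links present in the page served to the bot but absent for humans."""
    return extract_links(bot_html) - extract_links(human_html)

# Invented sample pages: the "bot" copy has one link the host slipped in.
human = '<p>Hi</p><a href="/about">About</a>'
bot = '<p>Hi</p><a href="/about">About</a><a href="http://host.example/promo">cheap hosting</a>'
print(injected_links(human, bot))  # the host's sneaky addition
```

Any link that shows up only in the bot copy is a candidate for the kind of injection described above.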

Not a good thing.

Found the news at ThreadWatch: Scamming Clients – Nasty Host Practices For Search Rankings

Better Bad News Google Video

The movement against Google’s new autolink feature is gaining momentum with the just released video from BBN (Better Bad News) …

Google Pollutes Links Stream With Evil Precedent For Market Censorship

In this 12 minute Video Blogmash the Better Bad News panel re-mixes commentary and analysis of a pending threat to online free speech drawn from several sources.

Includes a big push to sign and promote:

Google, Let Our Pages Go! Petition

We, the people who care about the future of the Web, ask Google to provide publishers with an opt-out to their new Toolbar Autolink feature. We feel that publishers should have the ultimate control of any links that are placed in their content. If Google fails to step up to the plate it will set a poor precedent that others will follow and perhaps with less ethics.

I read some of the petition comments, these caught my eye:

If I spray paint a billboard it is vandalism… why can Google deface my website?

Google needs to be aware of how this feature could set a worrying precedent that others would not hesitate to exploit.

If I wanted my website to be a Wiki, I would have made a Wiki…

It’s one thing to allow my visitors to change the way they see my site. However, my visitors won’t be making decisions about how to change it, Google is.

And many more – there are currently 197 signatures in total. Let’s get this out there and shut down this menace before we lose all control over our webpages to the Googles, the MSNs, the big suits.

Again thanks to ThreadWatch for the newsflash.

Yes…The Google ToolBar Again

Danny Sullivan has written an in-depth article here Google Toolbar’s AutoLink & The Need For Opt-Out. It’s a long piece, but well worth the read and includes a few comments from Google’s Director of Consumer Web Products – Marissa Mayer.

I found the article to be fair and informative. I especially connected to Danny’s comments here:

My response to the “protect the user experience” argument is pretty blunt. Too bad if it is harmed in this case, from Google’s perspective.

They may be Google’s users, but they are also my users as a publisher as well. If my visitors are upset that my site prevents them from using Google AutoLink, they can tell and lobby me directly. I don’t need Google deciding for me what my users want on my web site.

Google would gain on the public relations front from offering an opt-out. Even better, I’d encourage them to lobby for a single standard type of opt-out that other publishers could support such as through a robots.txt file extension that works for everyone. That would be real leadership in the industry and in line with the software principles statement it started last year.

I asked myself: Who has more entitlement to the ‘User’? Is it the ISP who provides an internet connection to the user, or is it the Browser which provides the tool to view online content? Is it the BHO (Browser Helper Object) or Toolbar that is added to the Browser? Or is it the Publisher who provides the content to the visitor?

Who holds claim to the ‘User’? Whose User is it? The visitor is using more than one tool to go from point A to point B:

ISP == Browser == BHO == Published Content

We each play a role in helping the visitor to the information that interests them, we each have our jobs. However this BHO (the Google Toolbar) is overstepping from BHO and into Published Content by sabotaging the intent of the Publisher, and *will continue to overstep*. Today it’s directing traffic to Amazon. What will tomorrow bring? (And btw – how *did* Amazon get to be the lucky *chosen* one?)

I too second Danny’s suggestion:

Even better, I’d encourage them to lobby for a single standard type of opt-out that other publishers could support such as through a robots.txt file extension that works for everyone.

That way *both* the Publisher and the User have a choice.
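No such directive exists – the robots.txt format has no AutoLink opt-out, and none was ever standardized – but purely as a sketch of what Danny’s suggested publisher-side opt-out might look like, imagine something along these lines:

```
# HYPOTHETICAL - not part of any real robots.txt specification.
# A publisher-side opt-out in the spirit of Danny's suggestion:
User-agent: *
Disallow-Rewrite: autolink
```

A single standard like that would let every toolbar vendor, not just Google, honor the publisher’s choice with one file.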

Thanks to ThreadWatch for the heads up on Danny’s article.

Killing Google’s Auto Link

Earlier I posted about Google’s new toolbar using our own content to whisk our visitors away from our sites and drive the traffic to their own ‘features’:

Now Google’s Messing With Our Content

Well here’s the good news – a fix has been found over at threadwatch:

Code For Fighting Google AutoLink

What you need to do:

– Download the script
– Insert the script anywhere in the HTML
– Or include it with a <script> tag in the <head> of your pages
– It will then rewrite the URLs AutoLink rewrote, killing the AutoLink functionality

Also more discussion and code work found in this thread at SearchGuild:

Remember Smart Tags?

A new article from the BBC regarding Google’s AutoLink Feature:

Google’s toolbar sparks concern

Search engine firm Google has released a trial tool which is concerning some net users because it directs people to pre-selected commercial websites.

I wonder why Google thought it could pull off what MSN couldn’t?