Search Engine Optimization and Marketing for E-commerce

Google May Penalize Ad-Heavy Pages

by akagan 11. November 2011 08:11

Google’s anti-spammeister Matt Cutts announced Nov. 9 at Pubcon that the search giant is testing new algorithms to measure how much of a page’s content appears “above the fold” (visible without scrolling) versus how much of that prime screen space is taken up by advertising that may be obscuring the actual content.

“If you have ads obscuring your content, you might want to think about it,” suggested Cutts in his presentation. “Do they see content or something else that’s distracting or annoying?” While Google’s Panda update already penalizes web pages with little original content, Cutts’ remarks indicate Google is further trying to separate the wheat from the chaff in terms of relevant content.

A quandary for web revenue models

While more testing will reveal “how much is too much”, Google’s impending actions would appear to make it more difficult for content creators to generate advertising revenue from high-ranking pages. Of course, Google itself derives the overwhelming majority of its revenue from advertising, so any tweaks to content-ranking algorithms will likely be exceedingly subtle at first. Third-party advertisers and content aggregators will need to watch these developments carefully, though.


Google AdWords to break out top vs. side ad positioning for text advertising metrics

by Andrew Kagan 14. July 2011 11:37

In response to concerns about where paid advertising appears on its search engine results pages, Google announced that it would fully disclose whether AdWords ads were appearing above or beside its organic search results.

Beginning this week, Google’s AdWords reporting interface allows you to segment advertising performance data using “top vs. side” metrics to better visualize where ads are appearing for specific keywords. Prior to this update, reports showed only an “average position” for each keyword, without indicating whether the ad appeared beside or above the organic search results.
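To make the value of the new segment concrete, here is a minimal sketch of computing CTR per position segment from rows of performance data. The tuple layout is a hypothetical illustration, not an actual AdWords report or API schema:

```python
def ctr_by_position(rows):
    """rows: iterable of (segment, impressions, clicks) tuples,
    e.g. ("top", 1000, 80), mirroring the new 'top vs. side' split.
    Returns {segment: CTR as a fraction}. Illustrative only."""
    totals = {}
    for segment, impressions, clicks in rows:
        imp, clk = totals.get(segment, (0, 0))
        totals[segment] = (imp + impressions, clk + clicks)
    # Aggregate first, then divide, so CTR is impression-weighted.
    return {seg: (clk / imp if imp else 0.0)
            for seg, (imp, clk) in totals.items()}
```

With segmented data like this, the gap Varian describes below becomes obvious at a glance, instead of being averaged away into a single “average position” number.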

It’s good CTR to be Number 1!

The change is significant because Google has publicly admitted that there is a tremendous difference in click-through rates (CTR) between the larger ads appearing above organic search results and the smaller ads appearing to the right. In addition to their larger size, top-position ads closely mimic the layout of organic search results, and many users may confuse them with organic results, further increasing CTR. According to Google Chief Economist Hal Varian:

This distinction is important, since, on average, ads that appear above the search results tend to get substantially more clicks than ads that appear on the right-hand side.

As Google blurs the line between paid and organic search results, advertisers need to be even more careful about ad quality and landing-page relevance in order to secure top positioning. But organic results still enjoy a better reputation and higher CTR than paid advertising, regardless of position, so SEO is as important as ever.



Will Google Index .XXX domains and what does it mean for SEO?

by Andrew Kagan 12. July 2011 11:51

The future of vanity TLDs (top level domains) and their effect on SEO grew murkier at a recent adult-entertainment industry conference sponsored by YNOT, where a panel convened on the upcoming rollout of the .XXX TLD. Like other “vanity” TLDs, this new adult-oriented TLD (launching at the end of 2011) raises more questions than it answers about the future of search engines and SEO.


The panel, featuring Connor Young (President and CTO of YNOT Group), Vaughn Liley (Director of Sales for ICM Registry), Tom Hymes (AVN) and Eric M. Bernstein (Attorney at Law), revealed the industry’s fear that a new TLD “just for them” would force companies to proactively purchase domains merely to protect copyrights and trademarks. ICM’s Liley offered only weak assurances that companies and individuals could protect a copyright, trademark or name, and only by exercising a right of first refusal to register the corresponding domains at $60-70 per domain. Anyone not pre-registering their domain would have to fight any interlopers legally through ICANN’s standard domain-name dispute resolution process, which commonly costs around $5,000 for multiple domains involving a single trademark.

What is of greater concern are individual-named URLs. A registered trademark will obviously be protected, but what if you (or your 8-year-old child) share your name with a legitimate adult-industry actor? The skies cloud over further with the imminent release of a slew of vanity TLDs targeting commercial trademarks.

The vibe from conference attendees was mostly negative, with many seeing it as a “shakedown” of legitimate companies that will have to defensively register many dozens of XXX domains per brand. Brand protection may become much more complicated with unlimited vanity domains on the way.


There has been no official word from the major search engines on whether they will index vanity domains. ICM’s Liley mentioned that the registry may create its own search engine for the .XXX domain, leading many to suspect Google, Bing and Yahoo will not index this TLD, but it is hard to believe they will completely turn their backs on the opportunity to grab some advertising dollars from this multi-billion-dollar industry. This SEO expects to see an opt-in option on search results in the near future, just as you can opt into adult content on Google image search now.

Vanity TLDs and SEO

While the XXX domain may only concern adult-industry SEOs, the imminent onslaught of commercial TLDs (e.g. “.APPLE”, “.COKE”, etc.) throws a wrench in conventional SEO strategies, because search engines are most certainly going to index the new TLDs.

Since the TLD is already a known signal for weighting search results (.EDU and .GOV hold special relevance in search rankings, and .COM still takes precedence over geographic TLDs), it remains to be seen how search engines intend to weight the vanity TLDs. Will a brand’s vanity-TLD site have greater relevance than its established .COM site? Will companies adopt a protect-and-hold strategy (i.e., register the domains but not use them), effectively defeating the purpose of vanity TLDs, or will this usher in a new era of usability issues and URL confusion?

Assuming the standard rules of SEO apply, companies will have to decide whether to consolidate their brand around a .COM identity or a vanity TLD, or perhaps split the difference: a public vanity TLD, with the .COM registration reserved for corporate use. The former can be handled properly through the use of redirects, though it remains to be seen how search engines will weight the new TLDs. The latter is worse: it splits relevance between two websites and will negatively affect search rank as a result.
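As a sketch of the consolidation approach, the redirect side can be as simple as answering every request on the secondary domain with a permanent (301) redirect to the canonical site. The hostname below is a hypothetical example, and this minimal WSGI app is only one of many ways to do it (a web-server rewrite rule is more common in practice):

```python
# Minimal WSGI app that 301-redirects every request on a secondary
# domain (e.g. a vanity TLD) to the same path on the canonical site.
# CANONICAL_HOST is a placeholder, not a real deployment.
CANONICAL_HOST = "www.example.com"

def redirect_app(environ, start_response):
    path = environ.get("PATH_INFO", "/")
    query = environ.get("QUERY_STRING", "")
    location = "http://%s%s" % (CANONICAL_HOST, path)
    if query:
        location += "?" + query
    # A 301 (not 302) tells search engines the move is permanent,
    # so ranking signals should transfer to the target URL.
    start_response("301 Moved Permanently", [("Location", location)])
    return [b""]
```

The key point for SEO is the 301 status: a temporary (302) redirect would leave relevance split between the two domains.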

Bottom line: you’re still going to have to have a single point of presence for maximum relevance, and for the foreseeable future the current TLD weighting still applies.


Google+ not ready for business

by Andrew Kagan 7. July 2011 10:27

Google announced that its new social media platform, Google+, is not quite ready for prime time, asking businesses to hold off on creating profiles until the service is opened to the general public.

Google+ has been in a limited rollout to individuals only for the past couple of months. Much as they have with Facebook, some businesses had tried to create business profiles using personal Google+ accounts.

Google+ is a new “social media” service from the search giant that will compete head-to-head with the current king of all social media, Facebook. Google+ groups its social functionality into “Circles” (individual associations, like Facebook’s “friends”), “Streams” (a type of public newsfeed, like Facebook’s “wall”), “Sparks” (a feed of shareable content on topics you follow), “Hangouts” (group video chat), and “Chat”.

Combined with Google’s other services, these new “social” features place Google in direct competition not only with Facebook, but also with communication suites such as AIM, Skype, WebEx and Microsoft Live. Google has already tackled other media-sharing giants such as Flickr (with Google’s Picasa services) and YouTube (with Google Video, before ultimately acquiring YouTube outright), with limited success.

Success or failure of Google+ will hinge on how fast this new service is adopted by both individuals and businesses, and any delays may result in a fall-off of interest by users who have already invested significant effort in Facebook and other brand-aware networks.


Google restores J.C. Penney’s rankings after link-bait scam

by Andrew Kagan 7. July 2011 09:32

After removing most of J.C. Penney’s pages from search results when it found Penney engaging in a link-bait scheme to improve SEO, Google has finally restored much of the company’s ranking following a roughly 90-day penalty.

Recent tests of keywords J.C. Penney had previously ranked well for show Penney’s positions in the SERPs creeping back up, for both long-tail and short keyphrases. The penalty, imposed just after the all-important holiday shopping season, had significantly impacted Penney’s online sales.

Penney proclaimed ignorance of the tactics it had empowered its search marketing firm, Searchdex, to use to put it on the first page of Google’s organic SERPs for high-value keywords such as “furniture”, “skinny jeans” and “comforter sets”. Searchdex abused a weakness in Google’s ranking algorithm that placed a great deal of weight on inbound links, regardless of their source. By seeding tens of thousands of websites with links back to Penney’s product pages, Searchdex pushed those pages to first position in Google’s results.

A report on the successful grey-hat manipulation in The New York Times on Feb. 12 of this year led Google to accelerate its latest series of algorithmic changes, commonly known as “Panda”, which better filters irrelevant links from skewing SERPs. Google also imposed a steep penalty on Penney, suppressing its pages in the search results for hundreds of keywords.

Penney claims to have fired Searchdex, and has recently been revising its website pages, most likely in an effort to better meet Google’s webmaster guidelines. Google, however, is still returning the old page links that Penney previously ranked for, not the new ones, indicating that Google has restored some of the company’s rankings without having updated its own indexes.

Clearly it was in both companies’ best interest to restore Penney’s rankings. Penney had been running a multi-million dollar AdWords campaign prior to being penalized, and had likely suspended that campaign until Google lifted the penalty.


Google+ vs. Facebook Social Media War Heats Up

by Andrew Kagan 5. July 2011 10:12

Google's push to make its ubiquitous search engine more social with its "+1" tool is heating up the war with Facebook over social recommendations, leading Facebook to restrict users’ access to their own data when they try to export friends’ info to Google.

Google has long had personalization built into its search-related apps, and its Local search services are already competing with destination-based review services like Yelp, FourSquare and HeadsUp. Google’s +1 service, which relies on a new Google+ friends database, is now moving recommendations into its main search results, creating direct competition with Facebook for user ratings.

The struggle reached a flashpoint with the introduction of Mohamed Mansour’s “Facebook Friend Exporter” extension for Google’s Chrome web browser, which exports all of your Facebook friends’ contact info, including email addresses, to a CSV file. Early adopters of Google’s +1 service have started using this tool to conveniently migrate Facebook friend data into the Google+ friends database. As use of the tool skyrocketed in recent weeks, Facebook began to block it, claiming it violated their terms of use.

Privacy advocates immediately cried foul, criticizing Facebook for preventing users from easily accessing their own information. As a result, new solutions for getting your friends’ data out of Facebook are popping up all over the place. Facebook will undoubtedly suffer a black eye if it tries to escalate and prevent “friend seepage”.

Thumbs Up vs. Like

Google’s +1 service allows logged-in users to give individual webpages a “thumbs up” recommendation, which other Google users in their Google+ friends lists will see in the search results. Website admins need to add the “+1” snippet to any pages they want to make available for rating. As with Facebook’s “Like” button, users can only register their approval of a page…there is no “-1” or disapproval of pages.

Google will also weight the search engine results for pages recommended by your friends, creating a powerful SEO signal whose success will depend on how many Google users log into Google and use its contacts database (all Gmail users are automatically opted in to this system when logged in and using Google as their search tool).

While Google has tried to soft-launch the entire Google Plus system, it is undoubtedly a direct attack against Facebook’s primacy in social media, and fully integrates social relevance with search ratings.


Favorite spam comment of the week

by Andrew Kagan 18. April 2011 12:25

Comment spam is an ongoing problem, and it’s very difficult to eliminate completely. Putting up spambot barriers is effective, but still some spam slips through, especially human-generated trackback attempts, such as this one, masquerading as an anti-spam comment!

Hello, i read your blog occasionally and i own a similar one and i was just curious if you get a lot of spam remarks? If so how do you prevent it, any plugin or anything you can suggest? I get so much lately it's driving me mad so any help is very much appreciated.

Of course, this was posted using a bogus email and a trackback to a spam domain, and I wasn’t lured in to enabling it. But unless you are moderating your comments, this one would likely escape attention.

Search engines such as Google, Yahoo and Bing continue to try to separate the “wheat from the chaff” when it comes to figuring out which backlinks are relevant to search, and which are just spammers trying to seed backlinks across thousands of unsuspecting blogs and message boards. The practice has accelerated even as search engines have become better at devaluing such links, leading to more headaches for moderators and admins.

As noted, comment spam can be completely prevented by moderating comments, but this requires the blog or message-board admin to manually evaluate each comment. Even then, spambots and trackback services using live individuals create a tidal wave of comment spam, so a third-party filter like Akismet is a necessity. Google’s free reCAPTCHA service is another useful tool, but it is vulnerable to brute-force attacks by trackback services that use humans to solve the reCAPTCHAs.

Implementing both of these tools in concert will cut down on comment spam dramatically (usually by 95-98%), but comment moderation is still required to effectively block the 2-5% that gets through. The spam comment above got through both filters on the Search Partner Pro blog.
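For the reCAPTCHA side of that setup, the server-side check amounts to POSTing the user’s answer to the verify endpoint and parsing a tiny two-line reply. This is a hedged sketch of the classic reCAPTCHA verify flow; the endpoint and field names reflect the original API of this era, and the parsing is split into its own function so it can be exercised without a network call:

```python
import urllib.parse
import urllib.request

# Classic reCAPTCHA verify endpoint (the API current at the time).
VERIFY_URL = "http://www.google.com/recaptcha/api/verify"

def parse_verify_response(body):
    """The classic API replies with 'true' on the first line, or
    'false' followed by an error code on the second line."""
    lines = body.strip().split("\n")
    ok = lines[0] == "true"
    error = None if ok else (lines[1] if len(lines) > 1 else "unknown")
    return ok, error

def verify_recaptcha(private_key, remote_ip, challenge, response_text):
    """Submit the visitor's answer for verification. Reject the
    comment (or hold it for moderation) unless ok is True."""
    data = urllib.parse.urlencode({
        "privatekey": private_key,
        "remoteip": remote_ip,
        "challenge": challenge,
        "response": response_text,
    }).encode("ascii")
    with urllib.request.urlopen(VERIFY_URL, data) as resp:
        return parse_verify_response(resp.read().decode("utf-8"))
```

Even when this check passes, the comment should still go through an Akismet-style content filter and, ideally, human moderation, since (as the example above shows) paid humans solve CAPTCHAs all day long.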



Google Expands Content-Farm Filtering Internationally

by Andrew Kagan 11. April 2011 12:54

Google announced today that it had expanded the content filters it put in place this past February in the U.S. to all its English-language search engines internationally. The algorithmic change (code named “Panda”) was an effort to improve search engine results by filtering out “low quality” pages and websites, typically content farms and link farms that tried to boost relevance by generating millions of links to website pages. While the move will likely improve the ranking of white-hat SEO websites, it does create new challenges for optimizing webpages.

Google has estimated that the original “tweak” to its ranking algorithm affected about 12% of all search queries, and its implementation led to a new “Google Dance” as keyword rankings oscillated wildly before settling down. Google expects the additional rollout will broadly affect rankings for many websites.

Trying to leverage link popularity

Since the Caffeine update early last year, Google has steadily expanded its attempts to incorporate “link popularity” into its rankings, in an effort to make its results more timely, even reflecting up-to-the-minute changes. An important technique for this was monitoring the number of “inbound” links to a webpage (other websites linking to a given page). This provides a “relevance boost” when many people appear to be interested in a news event, story or blog post of import.

As Google came to give more weight to referral links, the balance slowly shifted away from the content on the webpage and toward its popularity. Thus during the Caffeine rollout, many webpages with high relevance and solid content suddenly dropped in the search rankings behind websites with many inbound links.

The problem got worse when people began abusing this algorithmic sensitivity by “seeding” millions of links all over the web to a particular webpage, forcing it to the top of the search results for a particular keyword. These auto-generated pages generally had little information on them other than the keywords and the link they were trying to promote.
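To see why this signal is so easy to game, consider a toy version of inbound-link counting. This is purely illustrative, not Google’s actual algorithm, which blends many signals:

```python
from collections import Counter

def inbound_link_counts(link_graph):
    """link_graph maps each page URL to the list of URLs it links to.
    Returns a Counter of inbound links per target page -- the naive
    'popularity' signal that link farms exploit by mass-producing
    thin pages whose only content is a keyword and a link."""
    counts = Counter()
    for source, targets in link_graph.items():
        for target in set(targets):  # count each source page once
            if target != source:     # ignore self-links
                counts[target] += 1
    return counts
```

A page seeded with links from thousands of auto-generated pages dominates this metric regardless of its own content, which is exactly the weakness Panda addresses by discounting links that come from low-quality sources rather than counting every inbound link equally.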

It has proven particularly difficult for Google to weed out these forced links from natural ones, as witnessed in the J.C. Penney link-bait scandal over the Christmas holidays last year. The problems are magnified in the “long tail” of search results, where very specific search phrases return fewer results and there is more opportunity to manipulate those results on a page-by-page basis.

The new changes are designed to take link popularity into account, but devalue links coming from low-quality pages. Unfortunately for the rest of the universe, we don’t know exactly how Google defines “pages of low quality”, but we certainly know what Google’s goals are in estimating page quality…relevant content, and more than just a paragraph of bogus copy and a link.

User Data Further Contributes to Rankings

Google also revealed that it is now incorporating user actions to block sites into its ranking calculations. Google had initially used blocked-site data (reported back to Google by the Google Toolbar extension) only to corroborate its own data, but the correlation was so high (>84%) that the search engine will now use this user data as a secondary factor (also known as a “signal”) in search ranking.

As reported by Vanessa Fox, a contributing editor at Search Engine Land, Google’s Amit Singhal cautioned that “high-quality” websites should not be affected by the algorithm changes, but encouraged SEOs to use Google’s Webmaster Forums to alert Google to any ranking problems created by the rollout of the new algorithm.

A renewed focus on content

Google’s judgment of link quality is likely to affect the relevance of any pages with limited content. For SEOs and their clients, it underscores more heavily than ever the need to develop quality, relevant content for a website’s particular market focus, and to do a better job of isolating long-tail keywords and targeting content specifically at them.

As Google juggles page relevance with popularity, rankings will continue to shift. Knowing your most valuable keywords and targeting them in your webpages is the only strategy that makes sense moving forward.



April Fools! Google unveils Gmail Motion

by Andrew Kagan 1. April 2011 09:23

Google today announced Gmail Motion: a new innovation in email technology, designed to use motion-capture technology to speed the composition and manipulation of emails. Using easy-to-learn, simple and intuitive gestures, Gmail Motion improves productivity and has the additional benefit of increased physical activity, as you jump out of your task chair and start gesticulating wildly at the screen.


Okay, not really…the April Fools’ Day jest was launched on the Google homepage, with a link to an official-looking presentation of the new technology. The whimsical take on using motion capture to control opening, sending and editing emails may be a subtle kick at Microsoft’s Kinect motion-capture gaming system, which is flying off the shelves and likely leading a revolution in interactive and immersive gaming.

Google has often been criticized for being a late adopter of user-interface improvements in search products, and this sideways look at improved human-computer interaction may point to real software innovations just around the corner. More likely than not, these will be in the realm of voice recognition and voice-to-text, which is already firmly embedded in Google’s Android OS for mobile devices (coming to a PC near you!).

In the past, Google’s Easter eggs have usually looked backward, such as the “Plumbing Net” take on sending messages through the sewer system. It’s always a risk to make fun of UI innovations that seem novel at first but spawn practical uses we can’t imagine at the time. Clearly, gesture and facial recognition is an evolving technology that could revolutionize UI interaction…let’s hope Google stays at the forefront of this innovation.



New Features in Google Analytics 5.0 enhance usability

by Andrew Kagan 20. March 2011 09:18

Google announced March 17 that it is rolling out a major revision of its free web analytics software, Google Analytics, moving it from version 4 to version 5. The enhancements are essentially to the user interface, in an attempt to make it more customizable, more user-friendly and better organized, as well as more competitive with other analytics packages.

The major UI change is the addition of "widgets" that can be positioned on the screen to create a custom view comprising different data points. AJAX-based screen layouts have become a standard feature in other analytics packages such as Omniture SiteCatalyst and WebTrends, so Google is playing catch-up to a degree, but clearly it was a necessary (and welcome) feature as more and more reports have been added to GA's interface.

The ubiquitous line chart that tracks activity over time has been augmented with a motion chart (it existed before, but was hard to find), which shows activity trending using colored points instead of continuous lines.


Some traffic-source and content reports can now be organized to display a “term cloud”, visually differentiating keywords by size instead of displaying them in rows.


This feature will be useful for quickly assessing the most effective keywords on your website, as well as which have the highest bounce rates.

Though most of the content and features are still there, reports deemed less useful or popular by the GA team have been removed, and many of the sections have been renamed. “Visitors” has been split into “Demographic Reports” and “Technology Reports”, separating user data from underlying tech data like browsers and ISPs. Google notes that this is to make the interface more “user friendly” to non-technical users.

Major changes have been made to the custom reports section: you can now apply filters directly to reports instead of having to first create advanced segments, and you can create “flat” reports with multiple dimensions displayed in columns of data, a feature that has been a component of other analytics packages for some time.

Google Analytics version 5 is still in beta, but like the Caffeine update to the search engine, it will be rolled out in different markets in the near future. We’ll post more info on this new GUI as it becomes available.


Powered by BlogEngine.NET
Theme by Mads Kristensen updated by Search Partner Pro