Search Engine Optimization and Marketing for E-commerce

Cleaning up bad links with Google’s Disavow Tool

by akagan 30. October 2012 11:05

Google’s October 16 announcement and release of its new link-disavow tool is a welcome addition to the webmaster’s arsenal, allowing you to ask Google to ignore inbound links (“backlinks”) from poorly ranked or penalized websites. The disavow-links tool joins a similar tool released by Bing earlier in the year.

Inbound links (links from other websites to pages on your website) are normally a good thing, helping Google and other search engines gauge how much interest there is in your webpages. Widespread abuse of this metric, however, primarily through link farms and paid-link schemes, has led to significant and frequent changes in Google’s ranking algorithms, creating churn in search engine results, especially for webpages with thousands of inbound links. The problem has been compounded by “grey hat” SEOs manipulating search results by deliberately pointing negatively weighted links (“link spam”) at their competitors.

Google already alerts webmasters (through its Webmaster Tools website) when questionable or “unnatural” links are pointing at their website, links which will likely hurt the affected pages’ search rank. Until now, however, there was little you could do to remove those links yourself, assuming you weren’t responsible for them in the first place.

Google’s new disavow-link tool is an effort to address negative links by allowing users of Webmaster Tools to upload a text file listing the links Google should ignore. As most SEOs will realize, if Google is alerting you to “unnatural” links pointing at your website, then you are likely already being penalized for them. The best defense is to monitor inbound links aggressively in Webmaster Tools and to disavow bad ones promptly with the new tool. Note that Google does not guarantee it will honor disavow requests in all situations.

Google’s spam czar, Matt Cutts, posted a video explaining the process in detail. The text file has a simple format, with one URL per line:

http://badsite.com/questionableBackLink.htm

Fortunately, you can also list entire websites using the shorthand format:

domain:badsite.com
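
Putting the two formats together, the disavow file is just a plain UTF-8 text file, and lines beginning with “#” are treated as comments, which are handy for recording when and why a link was disavowed. A hypothetical example (the URLs and domains are invented for illustration):

# Paid-link network; contacted the site owner, no response
http://badsite.com/questionableBackLink.htm
http://badsite.com/links/page2.htm

# Entire link-farm domains
domain:badsite.com
domain:spammy-linkfarm.example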

Each time you add to this list, you must upload the complete file again; the new upload replaces the old one rather than appending to it. Cutts cautions that extra care should be taken with the disavow tool, to avoid inadvertently disavowing valuable links and hurting your own rankings.
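
Since each upload replaces the previous file, it is easiest to keep a single master list and regenerate the full file whenever you find new bad links. Below is a minimal sketch of that bookkeeping in Python; the file names (disavow.txt, new_bad_links.txt) and the script itself are my own convention, not part of Google’s tooling:

# merge_disavow.py: keep one complete disavow list and regenerate it each time
# (hypothetical helper; Google only consumes the finished text file you upload)

def load_entries(path):
    """Read disavow entries from a file, skipping blank lines and # comments."""
    try:
        with open(path, encoding="utf-8") as f:
            stripped = (line.strip() for line in f)
            return {line for line in stripped if line and not line.startswith("#")}
    except FileNotFoundError:
        return set()

existing = load_entries("disavow.txt")          # the file you last uploaded
new_links = load_entries("new_bad_links.txt")   # links flagged since then

merged = sorted(existing | new_links)
with open("disavow.txt", "w", encoding="utf-8") as f:
    f.write("# Complete disavow list: re-upload this entire file\n")
    f.write("\n".join(merged) + "\n")

print(f"{len(merged)} entries written ({len(merged) - len(existing)} new)")

Keeping the master list in one place (ideally under version control) also gives you a record of what was disavowed and when, which is useful if you later file a reconsideration request.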

Unsure about how or when to use this tool? Then don’t…Google has provided it as an advanced method of correcting problems with backlinks, and it is designed for SEO professionals.


SEO Advice at Google I/O reveals webmaster weaknesses

by Andrew Kagan 8. June 2010 10:48

 

Google’s Matt Cutts posted an hour-long video from Google I/O 2010, in which he and three SEO experts performed live reviews of websites submitted by webmasters. What was striking was how poorly many of the websites had been optimized, given that the basic rules are public and easy to follow.

The first website, “Phoenician Stone”, a manufacturer of stone mantels, tiles, etc., had no text on the homepage at all and only a poorly descriptive two-word title tag (“Phoenician Stone”). The only significant amount of text was in the meta keywords tag, about which Matt made sure to mention “Google doesn’t index that text”. He went on to emphasize “We [Google] don’t trust the meta keywords tag”.

SEO Tips To Take To Heart

Tip #1—Put text on your page

Tip #2—Think about what users will type when searching for your services, and put those words on the page.

Cutts recommended using any free keyword research tool to find actual search phrases people use on search engines.
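
As a rough way to act on Tips #1 and #2, you can check whether the phrases you found in a keyword tool actually appear in your page’s visible text. A quick sketch in Python (the URL and phrases are placeholders, and the tag stripping is deliberately crude):

# check_phrases.py: does the page's visible text contain my target phrases?
import re
import urllib.request

url = "http://www.example.com/"   # placeholder: your homepage
phrases = ["stone mantels", "limestone tile", "custom stone carving"]  # placeholder phrases

html = urllib.request.urlopen(url).read().decode("utf-8", errors="ignore")
# Crudely drop scripts, styles, and tags to approximate what a visitor reads
text = re.sub(r"<(script|style)[^>]*>.*?</\1>", " ", html, flags=re.S | re.I)
text = re.sub(r"<[^>]+>", " ", text).lower()

for phrase in phrases:
    status = "found" if phrase.lower() in text else "MISSING"
    print(f"{status:8} {phrase}")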

The second example was “rodsbot.com”…as Matt noted, the domain name is not particularly descriptive or intuitive (the site displays weird Google Earth images). Like the first website, there was virtually no text on the homepage, but since this was a “community” site where individuals posted images, an easy way to generate lots of search-relevant text would be to include users’ reviews and comments. “Why do the work when you can get someone else to do the work for you, right?” mused Cutts rhetorically. Another point Cutts made was that the owner of the website had six other websites, and clearly wasn’t devoting enough attention to each site for any of them to rank well.

What’s in a (domain) name?

The next site profiled was a news site about Google’s Android operating system, “androidandme.com”. The homepage was top-loaded with ads and a large logo area, to the point that most of the actual content was pushed below the bottom of the screen. Search engines may still return the website because the content is there, but the drop-out rate on the page will probably be higher than it should be, because the content is too hard to find. On the positive side, the website was running the latest version of WordPress and was configured to use descriptive names in its URLs.
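
For WordPress users, “descriptive names in the URLs” comes down to the permalink settings: the default “?p=123” style says nothing about the page, while a post-name-based structure does. A hypothetical example (the slug is invented):

Default:      http://androidandme.com/?p=123
Descriptive:  http://androidandme.com/2010/06/android-2-2-froyo-review/

In WordPress this corresponds to choosing a custom permalink structure such as /%year%/%monthnum%/%postname%/ under Settings > Permalinks.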

But how do you differentiate your website from others covering the same industry or products? Cutts pointed out that branding “outside the box” would help differentiate your website from the rest of the pack, using as an example the mobile phone website “Boy Genius Report”…the name has nothing to do with mobile phones per se, but it does have a lot of resonance with gadget-hungry geeks and nerds, and it certainly “stands out from the crowd”.

Mal-de-Ware

One of the sites submitted for review had actually been hacked with malware scripts, and the owner evidently was unaware of it. Vanessa Fox pointed out that if you register your website with Google’s Webmaster Central (and who doesn’t?), you will be notified of any malware detected on your website. Panelist Greg Grothaus added that Google has a new open-source tool called “Skipfish” that will allow you to test your development website for security problems and malware before you release your code to your live site.

The Mayday Update

Cutts admitted that the radical shift in search rankings around the beginning of May was a deliberate algorithmic change that is here to stay. The shift caught many SEOs off guard and caused much misery as sites with long-standing rankings saw huge swings in their SERP positions. Google claims the update will return “high quality sites” above sites it evaluates as lower quality.

TLDs Don’t Matter (Maybe)

During the open Q&A that closed the session, an attendee asked whether the TLD (what your website’s address ends with: .COM, .INFO, etc.) affects search rankings. Cutts was emphatic that there is no ranking preference based on TLD, although he added parenthetically that other websites tend to aggregate links to .GOV, .EDU, etc. Certain TLDs also have a bad reputation that can hurt click-through rates, so I would still recommend staying away from .BIZ, .INFO and other spam-heavy TLDs.

The H1 Tag…Still No Consensus

Many SEOs insist that you have to use the <h1> tag on your pages for the headline content…Cutts mentioned that Google will index the page regardless, and that what’s more important is that the page validate as HTML. He did not say that leaving out the H1 tag will penalize your search rank, so use it if you want, but don’t obsess over it!

 

