Search Engine Optimization and Marketing for E-commerce

JC Penney Linkbait Scam Exposed, Penalized by Google

by Andrew Kagan 14. February 2011 04:08

The New York Times reported on Sunday that J.C. Penney had been exposed for implementing link-bait on an unprecedented scale, skewing search results and leading Google to levy severe ranking penalties against the company for important keywords.

The article, titled "The Dirty Little Secrets of Search," detailed how Penney had enjoyed 1st-position results for highly competitive keywords like "dresses", "bedding", and "area rugs", and more valuable terms like "skinny jeans", "home decor" and "comforter sets". What was revealed was a widespread campaign of seeding thousands of links to J.C. Penney on largely irrelevant, unrelated, even obscure websites. This process, commonly known as "link-farming", is a well-known black-hat technique for gaming Google's search results, even though Google publicly announced several years ago that it was preventing this technique from skewing its search results.

Apparently not so...while Penney feigned ignorance about the use of link-farming, it summarily fired its search firm, and Google proceeded to take punitive action by de-ranking the company for various keywords.

While Google insists that external links matter less to a web page's ranking than its content, links remain a component Google can't ignore, especially as a gauge of momentary popularity. And while Google claims to monitor search results for evidence of link-farming, identifying social-media link abuse is a larger problem, as those signals are critical to Google's "real-time search" rank, which takes into account social-media links from Facebook, Twitter et al.

What's chilling for most white-hat SEOs is that black-hat techniques are alive and well, and put white hats at a competitive disadvantage. A black hat interviewed for the article implied that "S.E.O. is a game, and if you’re not paying black hats, you are losing to rivals with fewer compunctions." Even Matt Cutts, Google's top search-spam cop, noted that it's impossible for Google to police every link scam, although they do red-flag suspicious signals like rapid growth of inbound links. This shows, however, that any proactive action on Google's part requires manual intervention by an employee; there is no automated process in place yet to deal with this type of exploit.

 

Tags: ,

SEO

SEO Improvements in ASP.NET 4

by Andrew Kagan 15. November 2010 02:47

Microsoft’s latest iteration of its server-side development framework, .NET Framework 4, has new features designed specifically to address SEO shortcomings in previous versions of the Framework. A recent whitepaper on Microsoft’s ASP.NET website details a number of these features:

Permanent (“301”) Redirects

Permanent redirects send a visitor from one page to another, but their function in SEO is critical because they tell search engine crawlers requesting the old page to transfer all its accumulated rank to the new page, and simultaneously drop the old URL from the search engine’s results.

Returning a “301” response in the header tells the requestor that this redirection is permanent (a “302” response tells the requestor that the move is only temporary, which transfers no rank and does not help SEO).

Prior to Framework 4, permanent “301” redirects were accomplished in one of two ways: either injecting the status code and location header into the Response object before the page is sent back to the requestor (using Response.StatusCode and Response.AddHeader), or using IIS 7’s optional URLRewrite module (prior to IIS 7, rewriting was usually performed by a 3rd-party ISAPI module). Now permanent redirects can be handled in a single line of code:

Response.RedirectPermanent("/newpath/foroldcontent.aspx");

If your website uses dynamic pages built from a database, this new syntax makes it incredibly simple to manage site upgrades and URL changes. If you’re trying to permanently redirect static content to new pages, URLRewrite maps are still your best friend (now directly supported by .NET 4’s routing engine).
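As a sketch of the database-driven case (the page class and lookup helper below are hypothetical, not from Microsoft's whitepaper), a code-behind for a retired page might look up its replacement URL and issue the 301 in a single call:

```csharp
using System;
using System.Web.UI;

// Hypothetical code-behind for a retired page.
// Response.RedirectPermanent() sends a 301 status and ends the response.
public partial class OldProduct : Page
{
    protected void Page_Load(object sender, EventArgs e)
    {
        // LookupNewUrl is a placeholder for your own mapping,
        // e.g. a database table of old-to-new paths.
        string newUrl = LookupNewUrl(Request.RawUrl);
        if (newUrl != null)
        {
            Response.RedirectPermanent(newUrl);
        }
    }

    private string LookupNewUrl(string oldPath)
    {
        // Stub: replace with a real database or dictionary lookup.
        return oldPath.StartsWith("/oldpath/")
            ? "/newpath/foroldcontent.aspx"
            : null;
    }
}
```

Because the lookup runs before any markup is rendered, crawlers hitting retired URLs get the 301 immediately, with no wasted page compilation.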

Setting Keywords and Description META tags

While the keywords META tag has pretty much fallen out of use on major search engines, the META description tag is still one of the most important elements of SEO, since its content is both used by search engines and displayed in search results. It plays a critical role in click-through rates and needs to correspond closely with the page content. It also needs to be unique to each page it appears on, or it will negatively impact search rank.

.NET Framework 4 provides two new ways to add META tags at runtime (before the page has been sent to the requestor). You can now add them directly to the “@Page” directive at the top of every ASPX page, along with the page TITLE:

<%@ Page Language="C#" AutoEventWireup="true" CodeFile="Default.aspx.cs" Inherits="_Default" Keywords="These, are, my, keywords"  
Description="This is a description" %> 

When the page is rendered and sent back to the browser, these elements will be correctly rendered:

<head id="Head1"> 
  <title>Untitled Page</title> 
  <meta name="keywords" content="These, are, my, keywords" /> 
  <meta name="description" content="This is the description of my page" /> 
</head> 

Prior to Framework 4, programmers commonly inserted this content manually using placeholders, or overloaded the “header” declaration in the page object. Now the page object itself has been extended with specific properties, making for a one-line solution:

Page.MetaKeywords = "My, keywords";
Page.MetaDescription = "My Description";

For dynamically generated content, this is a tighter and less error-prone method, and allows for a complete separation of programming and page design.
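For example (a sketch; the product object and its fields here are hypothetical stand-ins for your own data access code), a dynamic product page can set a unique, content-matched title and description at load time:

```csharp
using System;
using System.Web.UI;

// Hypothetical code-behind for a database-driven product page.
public partial class ProductPage : Page
{
    protected void Page_Load(object sender, EventArgs e)
    {
        // Stand-in for a real database lookup of the requested product.
        var product = new { Name = "Widget", Summary = "A sample widget." };

        // One place to set unique, per-page SEO elements.
        Page.Title = product.Name;
        Page.MetaKeywords = product.Name;
        Page.MetaDescription = product.Summary;
    }
}
```

Since the description tracks the record being displayed, every generated page gets the unique META description that search engines reward.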

Improved Browser Capability Providers

Another frustration for .NET developers has been browser compatibility. If you’re designing pages to display properly on mobile devices as well as desktop computers, the “.browser” file tells .NET how to render pages for a specific browser. .NET 4 streamlines this process with the new, cacheable and extensible HttpCapabilitiesProvider.

Replacing URLRewrite with Routing

Most web developers have a love-hate relationship with Microsoft’s URLRewrite Module, which allows rewriting and redirection rules to be applied before pages are compiled. While IIS 7 provided a simple rewrite-rule generator, it was accessed through the IIS Admin interface, leaving developers on shared hosting environments without access.

Routing support existed in .NET 3.5 sp1, but it has been simplified and improved in .NET 4 with the following features:

  • The PageRouteHandler class, which is a simple HTTP handler that you use when you define routes. The class passes data to the page that the request is routed to.
  • The new properties HttpRequest.RequestContext and Page.RouteData (which is a proxy for the HttpRequest.RequestContext.RouteData object). These properties make it easier to access information that is passed from the route.
  • The following new expression builders, which are defined in System.Web.Compilation.RouteUrlExpressionBuilder and System.Web.Compilation.RouteValueExpressionBuilder:
      • RouteUrl, which provides a simple way to create a URL that corresponds to a route URL within an ASP.NET server control.
      • RouteValue, which provides a simple way to extract information from the RouteContext object.
  • The RouteParameter class, which makes it easier to pass data contained in a RouteContext object to a query for a data source control (similar to FormParameter).

.NET 4’s “MapPageRoute” method simplifies the syntax back down to a one-line statement:

RouteTable.Routes.MapPageRoute("SearchRoute", "search/{searchterm}", "~/search.aspx");

This example sets up a route mapping an SEO-friendly search URL (e.g. “mySite.com/search/widget”) to a physical page, and captures the search term as a route parameter. On your search page, you can retrieve the requested term in a single line as well, either in your code-behind:

string searchterm = Page.RouteData.Values["searchterm"] as string;
Label1.Text = searchterm;

or in the page directly:

<asp:Label ID="Label1" runat="server" Text="<%$ RouteValue:searchterm %>" />

With the new routing syntax, friendly URLs have never been easier!
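For reference, here is a sketch of where that route registration typically lives: a minimal Global.asax code-behind (the class name follows ASP.NET convention; the route itself matches the search example in this post):

```csharp
using System;
using System.Web;
using System.Web.Routing;

// Minimal Global.asax code-behind for an ASP.NET 4 Web Forms site.
public class Global : HttpApplication
{
    void Application_Start(object sender, EventArgs e)
    {
        // Register routes once, at application startup.
        // MapPageRoute maps a friendly URL pattern to a physical page.
        RouteTable.Routes.MapPageRoute(
            "SearchRoute",           // route name
            "search/{searchterm}",   // friendly URL pattern
            "~/search.aspx");        // physical file that handles it
    }
}
```

Registering in Application_Start ensures the route table is built before the first request is processed, so friendly URLs work from the moment the application spins up.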

Tags:

SEO

Google's 200 Search Engine Ranking Factors

by Andrew Kagan 9. October 2010 01:57

Google has mentioned in the past that it has 200 separate ranking factors for evaluating the relevance of a web page to a keyword (the "Google Algorithm"). But SEOs trying to reverse-engineer ranking factors is like trying to peek at the man behind the curtain with tweezers...through multivariate testing and other empirical methods, we can make changes to websites and monitor the search results, but only broadly guess at which changes carry the most weight or have any effect at all.

Darren Revell of Recruitwise Technology threw down a challenge on the Search Engine Land discussion group on LinkedIn, challenging its members to identify all 200 ranking factors. A heated discussion followed, which identified most, if not all, commonly accepted factors. After a week of discussion, Darren summarized the results:

1 Search terms in the HTML title tag? 

2 Search terms in the HTML body copy? 

3 Search terms in bold typeface? 

4 Search terms in header tags? 

5 Search term in anchor text in links to a page? 

6 PageRank of a page (the actual PageRank, not the toolbar PageRank)? 

7 The PageRank of the entire domain? 

8 Quality of link partners? 

9 Type of back links that bring anchor text juice for search terms? 

10 The speed of the web site 

11 Search terms in the URL - main URL and page URLs ? 

12 Search term density through body copy (About 3 - 5%?) ? 

13 Fresh content ? 

14 Good internal linking structure ? 

15 Age of the domain ? 

16 Links from good directories ? 

17 Image names ? 

18 Image ALTs ? 

19 Reputable hosting company 

20 Diversity of link partners 

21 Geo located results 

22 Rate of new inbound links to your site ? 

23 Relevance of inbound links - subject-specific relationship with target page 

Negative factors too: 

24 Pages 404s, 414s etc ? 

25 Duplicate title/keywords 

26 Participation in link schemes 

27 Search Terms the First Words of the Title Tag ? 

28 Search Terms in the Root Domain Name (searchterm.com) ? 

29 Search Terms in the Page Name URL (e.g. acme.co.uk/folder/searchterm.html) ? 

30 Search Terms in the Page Folder URL (e.g. acme.co.uk/searchterm/page.html) ? 

31 Search Terms in the First Words in the H1 Tag ? 

32 Search Terms in other Headline (H) Tags ? 

33 Search Terms in Internal Link Anchor Text on the Page ? 

34 Search Terms in External Link Anchor Text on the Page ? 

35 Search Terms in the First 50-100 Words in HTML on the Page 

36 Search Terms in the Page’s Query Parameters (e.g. acme.co.uk/page.html?searchterm) ? 

37 Search Terms in the Meta Description Tag 

38 Social graph fans 

39 Social graph fans earned impressions 

40 Social graph fans earned impressions with links 

41 Secondary fan connection citations earned impressions 

42 Otherme citation (social media linking) ? 

43 Rich snippet geo-reference ?

44 Rich snippet UGC rating ? 

45 Placement of backlinks in page ? 

46 Quantity of backlinks ? 

47 Quantity of linking root domains ? 

48 Quality of linking root domains ? 

49 Link distance from higher authority sites ? 

50 Outgoing followed links from back linked pages

51 Country specific domain ? 

52 Domain classification of linking domains ? 

53 Domain sculpting 

54 Redirect permanent (not 302) ? 

55 Page accessible ? 

56 Sitemap_Index/Sitemap limit 10K ?

57 Sitemap folder geotargeting 

58 Index/Follow ? 

And the more controversial: 

59 Bounce rate (personalization) ? 

60 Visits (personalization) ? 

61 Visits (scraped from Alexa) ? 

62 Semantic relevance (synonym for matching term) ? 

63 Reputation/advocacy (positive chatter) 

64 URL length ? 

65 Frequency of Updates ? 

66 Domain Name Extension / Top Level Domain (TLD) ? 

Some negative factors: 

67 Link to a bad neighborhood ? 

68 Redirect thru refresh metatags ? 

69 Poison words ? 

70 Keyword stuffing threshold ? 

71 Keyword dilution ? 

72 Dynamic Pages ? 

73 Use of Frames ? 

74 Gateway, doorway page 

75 Keyword saturation (Saturation levels do play a crucial role in the ranking of your pages) ? 

76 Traffic buying (affects adversely) 

77 Link buying (affects adversely) 

78 Over optimization - There is a penalty for over-compliance with well-established, accepted web optimization practices. ? 

79 Excessive cross-linking (affects adversely) ? 

80 Linking between all the domains hosted on the same IP (this one may be debatable) ? 

81 Hidden content (affects adversely) ? 

82 Cloaking (affects adversely) ? 

83 Excessive use of graphics (affects adversely) ? 

84 JavaScript (affects adversely) ? 

85 Comment spamming (affects adversely) ? 

86 Title attribute of link ? 

87 Sitemaps: XML, text, HTML (an XML sitemap aids the crawler but doesn’t help rankings) ? 

88 W3C compliant html coding 

89 Duplicate content on site (affects adversely) ? 

90 Duplicate tags on site (affects adversely) ? 

91 Page file size/load time ? 

92 Number of links on page (too many will affect adversely) ? 

93 Video header and descriptions ? 

94 Video sitemap 

95 Quality content 

96 <noscript> tags (even though I don't know anyone who doesn't have JavaScript enabled) ? 

97 IP address range (many are blacklisted for spamming) ? 

98 Whether the site has been previously de-indexed due to malpractice 

99 Relevance of title tag to page content ? 

100 Relevance of META Description to page content

101 Code-to-text ratio 

102 Canonical URL 

103 Directory depth 

104 Querystring param count 

105 An active AdSense campaign. We have noticed our pages rank higher when we are also running an active AdSense campaign. 

106 Server calls, Images, JavaScript, Database calls (affects speed of website) 

107 keyword spamming 

108 multiple domains to same website 

109 link structure - do you link to '/', 'index.htm' 

110 SERPs 

111 Quality & Number of Blogs 

112 Authority ranking of subject matter 

113 Link attributes - like rel=nofollow 

114 Use of WebmasterTools 

115 Popularity (no one seems to have mentioned this one before?) 

116 Brand recognition 

117 Linear Distribution of Search Terms on the html 

118 Varied IP addresses for inbound links (links all from the same server carry less link juice) 

119 Standard Deviation of Search Terms in the Population of pages containing Search Terms 

120 URL shortener 

121 Snippet 

122 Microformats 

123 Mobile accessibility 

124 Synonyms , language, query terms 

125 Page category 

126 SERP click-through rate. Say your website ranked #1 for the "bike shoes" keyword phrase but 60% of the traffic went to the website at #2; I guarantee you Google will notice and make an adjustment. 

127 Relevance. (to searched phrase) 

128 Comprehensiveness: word count and pages of content on the site's top keyword, along with relevantly named images, videos and news 

129 Freshness: the latest page updates with an accurate sitemap, so Googlebot re-checks your site frequently. 

 

Domain / server factors 

130 Domain age; 

131 Length of domain registration; 

132 Domain registration information hidden/anonymous; 

133 Site top level domain (geographical focus, e.g. com versus co.uk); 

134 Site top level domain (e.g. .com versus .info); 

135 Sub domain or root domain? 

136 Domain past records (how often it changed IP); 

137 Domain past owners (how often the owner was changed) 

138 Keywords in the domain; 

139 Domain IP; 

140 Domain IP neighbors; 

141 Domain external mentions (non-linked) 

142 Geo-targeting settings in Google Webmaster Tools 

143 Domain registration with Google Webmaster Tools; 

144 Domain presence in Google News; 

145 Domain presence in Google Blog Search; 

146 Use of the domain in Google Analytics; 

147 Server geographical location; 

148 Server reliability / uptime 

 

Website elements: 

149 Page internal popularity (how many internal links it has); 

150 Page external popularity (how many external links it has relative to other pages of this site); 

 

Visits: 

151 Number of visits; 

152 Visitors’ demographics; 

153 Bounce rate; 

154 Visitors’ browsing habits (which other sites they tend to visit) 

155 Visiting trends and patterns (like sudden spikes in incoming traffic); 

156 How often the listing is clicked within the SERPs (relative to other listings) 

157 Use of Google Check out on your site 

158 A domain name relevant to its sector is one of the important factors for good ranking in that sector. 

159 Compression for size: eliminate white space, use shorthand notation, and combine multiple CSS files where appropriate; GZIP can be used. 

160 You can use CSS sprites to help to consolidate decorative images. 

161 No redirection to other URLs on the same server through Flash banner images

[Factors marked with a "?" are subject to dispute as to their importance]

The sheer number of identified factors is overwhelming at first, but it also shows how interrelated many ranking factors are. Improving your title tags, for example, will address many individual ranking factors at once, as would targeting inbound links from reputable websites with content relevant to your own. Publishing content for syndication, when done properly, will create many positive factors as well. And of course there are negative factors to avoid, which would reduce relevance.

All in all we came up with approximately 80% of Google's ranking factors...but like the recipe for Coca Cola, the rest are some of the search industry's most tightly guarded secrets. And one can assume many unidentified factors are weightings between groups of individual factors. Will we ever know Google's secrets? All we can continue to do is test and measure, and slowly build up the empirical evidence that will point to the rest.

Thanks everyone for the inspiration and collaboration...knowledge is search-rank power!

Tags:

SEO

Yahoo! faces irrelevance in search wars

by Andrew Kagan 5. October 2010 03:37

With Yahoo close to ceding second place to Microsoft Bing, and now sharing ad revenue with its former competitor, many search analysts predict a swift decline for the former web-content powerhouse. In the waning days of September, both Nielsen Media and comScore reported that Bing had surpassed Yahoo in search share, and it will likely continue to take over Yahoo's share as more Windows 7 computers come online (Bing is the default search engine in IE8, hastening the process).

Bing introduced many innovations in search results last year, placing Google in the improbable position of playing catch-up in its search results pages. Bing's consolidating results onto a single page was quickly copied by Google for its image search, and Google's latest innovation, Google Instant predictive search results, is receiving mixed reviews after producing incoherent, and sometimes offensive, results. The AJAX-enabled search results are also reducing the number of clickthroughs on Google itself, although this has little impact on PPC revenue for Google.

With ad revenue spiraling downward at Yahoo, however, the company has positioned its portal as more community-oriented than Google or Bing, fighting it out with AOL and other classic content portals providing "unique" content. But web communities are coalescing around social networks, not search portals. My prediction: Yahoo will be fully absorbed by Microsoft in less than two years.

Tags: , ,

SEM

SEO Advice at Google I/O reveals webmaster weaknesses

by Andrew Kagan 8. June 2010 05:48

 

Google’s Matt Cutts posted an hour-long video from Google I/O 2010, where he and three SEO experts performed live reviews of websites submitted by webmasters. What was striking was how poorly many websites have been optimized, when the basic rules are public and easy to meet.

The first website, “Phoenician Stone”, a manufacturer of stone mantels, tiles, etc., had no text on the homepage at all, and a poorly descriptive two-word title tag (“Phoenician Stone”). The only significant amount of text was in the meta keywords tag, about which Matt made sure to mention “Google doesn’t index that text”. He went on to emphasize, “We [Google] don’t trust the meta keywords tag”.

SEO Tips To Take To Heart

Tip #1—Put text on your page

Tip #2—Think about what users will type when searching for your services, and put those words on the page.

Cutts recommended using any free keyword research tool to find actual search phrases people use on search engines.

The second example was “rodsbot.com”…as Matt noted, the domain name is not particularly descriptive or intuitive (it displays weird Google Earth images). Like the first website, there was virtually no text on the homepage, but since this was a “community” site where individuals posted images, an easy way to generate lots of search-relevant text would be to include users’ reviews and comments. “Why do the work when you can get someone else to do the work for you, right?” mused Cutts, rhetorically. Another point Cutts made was that the owner of the website had six other websites, and clearly wasn’t devoting enough attention to each site for any of them to rank well.

What’s in a (domain) name?

The next site profiled was a news site about Google’s Android operating system called “androidandme.com”.  The homepage was top-loaded with ads and a large logo area, to the point that most of the actual content rolled off the bottom of the screen. While search engines may return the website because the content’s there, the drop-out rate on the page will probably be higher than it should be, because the content is too hard to find. On the positive side, the website was running the latest version of WordPress, and was configured to use descriptive names in the URLs.

But how do you differentiate your website from others covering the same industry or products? Cutts pointed out that branding “outside the box” would help differentiate your website from the rest of the pack, using as an example the mobile phone website “Boy Genius Report”…the name has nothing to do with mobile phones per se, but it does have a lot of resonance with gadget-hungry geeks and nerds, and it certainly “stands out from the crowd”.

Mal-de-Ware

One of the sites submitted for review had actually been hacked with malware scripts, and the owner evidently was unaware of it. Vanessa Fox pointed out that if you register your website with Google’s Webmaster Central (and who doesn’t?), you will be notified of any malware detected on your website. Panelist Greg Grothaus added that Google has a new tool called “Skipfish”, an automated security scanner that lets you test your development website for vulnerabilities before you’ve released your code to your live site.

The Mayday Update

Cutts admitted that the radical shift in search rankings around the beginning of May was a deliberate algorithmic change that is here to stay. The ranking algorithm shift caught many SEOs off guard and caused much misery as sites with long-time ranking saw huge shifts in their SERPs. Google claims this update will return “high quality sites” above sites they evaluate as having lower quality.

TLDs Don’t Matter (Maybe)

During the final open Q&A that closed the session, someone asked whether the TLD (what your website’s address ends with—.COM, .INFO, etc.) affected search rankings. Cutts was emphatic that there is no ranking preference based on TLD, although he added parenthetically that other websites tend to aggregate links to .GOV, .EDU, etc. Certain TLDs may have a bad reputation that can impact click-through rates, so I would still recommend staying away from .BIZ, .INFO and other spam-centric TLDs.

The H1 Tag…Still No Consensus

Many SEOs insist that you have to use the <h1> tag on your pages for the headline content…Cutts mentioned that Google will index the page regardless, and that what’s more important is that the page validates as HTML. He did not say that leaving out the H1 tag will penalize your search rank, so use it if you want, but don’t obsess over it!

 

Tags: ,

SEO

Click Fraud Hits 35% on some U.S. PPC Networks

by Andrew Kagan 11. April 2010 07:25

Anchor Intelligence, in its quarterly traffic report, says that click fraud reached an all-time high in the first quarter of 2010 for affiliate marketers using its services.

Unsurprisingly, the highest click-fraud rates are coming from countries with historically lax controls on internet traffic and PC security. Vietnam has the highest rate of detected click-fraud (mainly through botnets installed on trojaned computers), at 35.4% of all measured clickthroughs.

What is surprising, however, is that click-fraud in the U.S. is running at 35.0%, which represents the lion’s share of all click traffic by volume, followed closely by Australia, Canada and the UK. Click-fraud in the U.S. is predominantly committed by sophisticated organizations usually hired by competitors to increase a company’s PPC advertising costs.

What is most disturbing is that major PPC providers such as Google and Yahoo clearly have the means to identify concerted click-fraud attacks, which have obvious signatures such as automated high-volume traffic from distinct IP ranges, yet they have little incentive to address it, as cracking down on click fraud just takes money from their pockets. While Google some time ago commissioned an independent report on its fraud-detection techniques, the researcher’s conclusion was that Google’s effort to filter invalid clicks was “reasonable”; however, he added that the CPC model “is inherently vulnerable to click fraud.”

What to do if you suspect Click Fraud

It’s up to the advertiser to track the clicks, identify fraudulent behavior, and then petition the network to adjust the billing. The only way to do this is to monitor your server logs to identify PPC traffic, and then look for patterns in the originating IPs. For websites with high traffic and PPC volume, it can be difficult to separate valid traffic from fraudulent clicks, and even more difficult to “prove” fraud to the network.

Tags:

SEM

Windows Live Writer makes you a better blogger

by Andrew Kagan 26. March 2010 00:42

I’ve started using Windows Live Writer, Microsoft’s free offline blog editor which it distributes under its “Live Essentials” toolkit. What’s nice about this software is that you can compose blog posts in a WYSIWYG environment with a full set of editing and format tools, save and organize your posts before publishing, and much more.

The software mimics what most blogs already have in terms of editing functionality, but provides a more complete feature set and, most importantly, a common editing environment that allows you to manage an unlimited number of blogs simultaneously…a godsend for SEOs working on multiple client accounts. Unlike the latest versions of Microsoft Word, which also allow publishing directly to blogs, Live Writer is free for anyone to use, and works with virtually any blog. The GUI includes useful image editing tools for which you’d normally need additional software, as well as table and charting functions, photo albums, video, and it also supports an extensible plug-in architecture for developers to add their own features.

I will post some how-tos and screen shots to help anyone interested in setting up this software. Whether you’re editing a single blog or manage dozens of different websites, this offline blog-editor really makes for better blogging.


I recommend you try Live Writer out…and start blogging better.

Tags:

General

Yahoo and Microsoft To Merge Search Marketing

by Andrew Kagan 19. February 2010 00:21

In an effort to compete with Google's dominance in search marketing, Yahoo! and Microsoft announced today that they have received regulatory clearance from the Federal Trade Commission to form a search alliance. The announcement means that the entire webiverse of PPC marketing will now be controlled by two large players, but for advertisers it will be somewhat easier to manage PPC budgeting across two major services instead of three.

Yahoo's revenue from, and share of, the PPC market has been steadily eroding over the past year, mostly due to inroads by Microsoft with the introduction of its new search engine, Bing. Yahoo cited increasing losses from its affiliate-marketing sites, from which it generates a great deal of its search-marketing revenue, in seeking the merger.
Google's share of the PPC market has been flat at about 80% of all PPC revenue over the past year. For Microsoft, the merger will provide a big boost to its relevance in the paid-search market, although the combined PPC revenue for both companies is still only about a fifth of the market.
Yahoo says there will be no immediate changes for its Yahoo! Search Marketing advertisers, and it will probably wait until the all-important 2010 holiday season is over before migrating to a shared platform with Microsoft.

Tags:

SEM

Video of Luge Accident Pulled from Web

by Andrew Kagan 12. February 2010 23:38

The fatal luge accident that killed Georgian Olympic hopeful Nodar Kumaritashvili during a training run in Vancouver Feb. 12 was widely reported within hours of its occurrence. Initially, videos were posted on liveleak.com that showed the entire crash, which then exploded across YouTube and other media-sharing websites. In the wake of complaints about the visceral footage, news organizations hastily edited out the final milliseconds showing Kumaritashvili's impact against the steel stanchion that killed him, and then, citing copyright, had all instances of the footage pulled from the web within 24 hours.

But was this a copyright issue, or are there other legal issues afoot? Reports are circulating in the blogosphere that the IOC asked that the video be withdrawn pending an internal investigation, amid a flood of negative reporting that warnings from teams, coaches and competitors about the dangers of the track design went unheeded before and during the trials leading up to Kumaritashvili's death. At a hastily convened press conference Friday night, "Olympic officials" (not officially representing the IOC) announced that they would make "minor changes" to the "track configuration" and "ice profile" prior to the first official runs this weekend.

But what is more likely is that the video has been pulled pending legal action brought against the IOC by Kumaritashvili's family, in an attempt to limit the negative publicity it has engendered for the IOC. The only thing one can be sure of is that the video will be reproduced, ad infinitum, during the trial.

Tags: , , , ,

General

Google vs. China: Lax security led to hacks

by Andrew Kagan 13. January 2010 22:12

As more information was released about the nature of the attacks by Chinese cyberwarriors against U.S. companies, the "smoking gun" appears to be lax security procedures on the part of Google, but more importantly on the part of the companies that were successfully attacked (only two are known at this time).

Google quietly started forcing all access to Gmail to be rerouted over secure (SSL) connections recently...as it became apparent that Gmail accounts were compromised in order to discover users' corporate account information. As is all too often the case with webmail accounts, users fail to realize that the entire contents of an email viewed over webmail are easy to intercept at any point ("hop") between the webmail server and the client. The convenience of webmail far outweighed the security concerns...until now. For the record, most corporate webmail systems, e.g. Outlook Web Access, use secure communications between the server and client for this reason.

Encrypted connections between the Gmail server and client provide a much higher level of protection for the data in the emails, but they take a performance toll on Gmail servers that Google probably wanted to avoid. Gmail users had the option of securing communications between themselves and Gmail's servers, but few took advantage of it.

Of greater concern are the actual cyberattacks, which used a vulnerability in Adobe Reader (a "zero-day" vulnerability) to embed a trojan in a PDF which, once downloaded to a user's computer, was activated when the PDF was scanned by Windows' Indexing Service. Apparently this vulnerability was used to compromise corporate computers, leading to the security breaches cited recently. Google admitted that at least 32 companies had been attacked...but the numbers are likely much higher.

But the tragedy here is that the vulnerability was announced 9 months ago, prompting both Microsoft and Adobe to release security patches shortly thereafter. It is likely that the companies were attacked more recently, having left these vulnerabilities unpatched, as is so often the case...pity the IT directors who will soon be posting their resumes on LinkedIn.

Tags:

Security

Powered by BlogEngine.NET 1.6.0.1
Theme by Mads Kristensen updated by Search Partner Pro