Firms at risk from Web 2.0 sites

User generated content could contain malicious code, say experts

Web 2.0-type sites such as Wikipedia and YouTube may contain malicious code in their pages that could put enterprise systems at risk if firms do not have sophisticated content security solutions, according to experts.

"These sites would not be anything without user-generated content, but they can't check all aspects of that content [for malicious code]," warned product manager at SmoothWall, Tom Newton. " You might be prevented from uploading executables but they will let you put in some benign HTML that can produce known exploits for Internet Explorer, for example."

These kinds of passive attacks render security tools that rely solely on URL filtering ineffective, because users can "no longer trust a site by what's in the address bar", so firms need to employ more comprehensive, real-time content scanning technologies, he added.
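To illustrate the distinction Newton draws, the following is a minimal sketch of content-based scanning, assuming a simple signature approach: rather than trusting the URL, the fetched page body itself is checked against patterns associated with known drive-by exploits. The patterns and function names here are illustrative assumptions, not any vendor's actual engine, which would use far larger signature sets and behavioural analysis.

```python
import re

# Minimal sketch of content-based scanning (hypothetical patterns):
# inspect the fetched page body rather than trusting the address bar.
SUSPICIOUS_PATTERNS = [
    # Heavily encoded inline script, a common obfuscation technique
    re.compile(r"<script[^>]*>.*?unescape\s*\(", re.IGNORECASE | re.DOTALL),
    # Invisible iframe pointing at another page, typical of drive-by attacks
    re.compile(r"<iframe[^>]+(?:width|height)\s*=\s*[\"']?0", re.IGNORECASE),
]

def scan_html(body: str) -> list[str]:
    """Return the patterns that matched the page body, if any."""
    return [p.pattern for p in SUSPICIOUS_PATTERNS if p.search(body)]

if __name__ == "__main__":
    page = '<iframe src="http://badhost.example/x" width=0 height=0></iframe>'
    hits = scan_html(page)
    print("block" if hits else "allow", hits)
```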

"People say we need better user education but if that was possible it would have happened by now," argued Newton. "Firms need perimeter antivirus, internal firewalls, content filters and limited user privileges [in place]."

Web security specialist Finjan also highlighted these vulnerabilities in its latest quarterly Web Security Trends Report. The firm's Limor Elbaz argued that the sites should be scanning all uploaded content at the gateway to mitigate such threats.
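A gateway check of the sort Finjan recommends can be sketched under the assumption of a simple allowlist: any uploaded HTML fragment using tags or attributes outside a small approved set is rejected before it is published. The tag and attribute sets below are hypothetical and purely illustrative.

```python
from html.parser import HTMLParser

# Hypothetical allowlist for user-uploaded HTML fragments.
ALLOWED_TAGS = {"b", "i", "em", "strong", "p", "br", "a", "img"}
ALLOWED_ATTRS = {"href", "src", "alt", "title"}

class UploadChecker(HTMLParser):
    """Collects allowlist violations found in a submitted HTML fragment."""
    def __init__(self):
        super().__init__()
        self.violations = []

    def handle_starttag(self, tag, attrs):
        if tag not in ALLOWED_TAGS:
            self.violations.append(f"disallowed tag <{tag}>")
        for name, _value in attrs:
            if name not in ALLOWED_ATTRS:
                self.violations.append(f"disallowed attribute '{name}' on <{tag}>")

def check_upload(fragment: str) -> list[str]:
    checker = UploadChecker()
    checker.feed(fragment)
    return checker.violations

if __name__ == "__main__":
    # An inline event handler and a script tag both trip the check.
    print(check_upload('<p onclick="evil()">hi</p><script>alert(1)</script>'))
```

The allowlist design is deliberately conservative: rather than trying to enumerate every dangerous construct, it rejects anything not explicitly approved, which is why gateway scanning of this kind can catch content that looks benign to a URL filter.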

Nigel Stanley of Bloor Research agreed that "behavioural-based anti-malware with smart algorithms" is the best way to detect and block such attacks, and warned that enterprise systems could be affected if users are allowed to visit potentially infected sites.

"The people looking to distribute malware are being increasingly creative about how they do it; businesses should block access to these sites," he argued. "And if your site is hosting malware you have a responsibility to mitigate that, but whether you can is another matter."

Chris Seth, UK managing director of social networking site Piczo, said that the firm has "various layers" in place to mitigate these risks, including automatic checks to protect against malicious code and a member services team that scans pages manually and responds to any alerts flagged up by users.

"It's very much our responsibility to [deal with any security] issues; ensuring the site functions as it should is important from a business point of view because users drive the business," he explained. "A larger part of our business is devoted to member services and engineering than to the commercial [side]," he added.

But Gerhard Eschelbeck, chief technology officer at Webroot, argued that this is "a very difficult challenge to solve on the operator side" because of the problems in defining what constitutes malicious code and what is acceptable. He said the decision to block certain pages or sites has to be made "at the end node" by an individual or organisation.