William Jackson | Web gets even more untrustworthy
Cybereye commentary: The risk of visiting even a legitimate and trusted Web site has increased, and cannot be countered by blacklists or Internet usage policies.
First, signatures for malicious code were used to block viruses from our computers. As hackers became more proficient at hiding and changing the code to avoid signatures, other defensive techniques were developed.
Global network monitoring could spot malicious or suspicious activity by some servers or addresses and give some warning and protection, but attacks have gone stealthy and now often fly under our radar. URL blacklisting and reputation filtering are used to block known bad players, but hackers apparently have found ways to beat those, too, by carefully hiding their tracks.
"They are looking at the techniques of the security companies and coming up with ways to overcome them," said Yuval Ben-Itzhak, chief technical officer at Finjan Software.
Finjan today released details of an apparently new crimeware toolkit it has dubbed random js because it serves random JavaScript code through trusted Web sites to avoid signature detection while delivering a malicious Trojan to visiting PCs. Even more insidiously, it also hides from prying eyes by keeping records of the IP addresses it has already infected and of the Web crawlers working for antivirus reputation engines. When the server gets a request from one of these addresses, it serves up only legitimate code, not the malicious code.
"We call it anti-forensic," Ben-Itzhak said of this behavior, because it makes it difficult if not impossible to find where an infection came from. It also keeps the URL off blacklists and preserves its ranking by maintaining a trusted reputation.
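To make the cloaking concrete, here is a minimal, hypothetical sketch of how a compromised server could combine randomized JavaScript with a blocklist of crawler addresses and prior victims. This is not the toolkit's actual code; the addresses, function names and obfuscation step are assumptions made for illustration only.

```python
# Hypothetical sketch of the kind of server-side cloaking described above.
# Nothing here is the actual toolkit; names, lists and logic are illustrative.
import random
import string

# Addresses the attacker wants to avoid: security-vendor crawlers and
# machines that have already been served the exploit once.
CRAWLER_ADDRESSES = {"203.0.113.10", "203.0.113.11"}   # placeholder IPs
already_infected = set()

LEGITIMATE_PAGE = "<html><body>Normal site content</body></html>"

def random_identifier(length=8):
    """Generate a random name so each served payload looks different."""
    return "".join(random.choice(string.ascii_letters) for _ in range(length))

def build_random_payload():
    """Wrap the same loader logic in freshly randomized JavaScript,
    defeating simple signature matching on the script text."""
    var = random_identifier()
    return (f"<script>var {var}='{random_identifier(16)}';"
            f"/* loader code would go here */</script>")

def handle_request(client_ip):
    """Serve clean content to crawlers and repeat visitors,
    randomized malicious content to everyone else."""
    if client_ip in CRAWLER_ADDRESSES or client_ip in already_infected:
        return LEGITIMATE_PAGE                 # look clean to inspectors
    already_infected.add(client_ip)            # never hit the same victim twice
    return LEGITIMATE_PAGE + build_random_payload()
```

The point of the sketch is that a reputation engine's crawler only ever sees clean HTML, so nothing incriminating reaches a blacklist, while each real victim receives a payload with a unique appearance that a fixed signature will not match.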
Once the Trojan has been downloaded, it executes, retrieves commands from its controller and installs the standard malicious programs to pull the victim into a botnet and steal data.
That Finjan was able to detect this attack with the company's content inspection tools, by examining the code being served up by Web sites, shows that the toolkit's defenses are not perfect. Nothing is. But they are pretty good. "They are infecting a lot of machines," Ben-Itzhak said. Since its discovery in December, random js appears to have infected as many as 10,000 Web servers in a single day, each server hosting hundreds or even thousands of domains.
All of this adds another layer of threat to the World Wide Web, one that is difficult for users to counter on their own. It means the risk of visiting even a legitimate and trusted Web site has increased, and this cannot be countered by blacklists or Internet usage policies.
The response to this threat requires another layer of defense, in your enterprise or hosted by your Internet service provider. It is becoming more important to have inspection at a level where high-throughput appliances can identify and block malicious traffic based on its content and behavior, rather than relying on a signature or on where the traffic came from, without significantly slowing performance.
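To illustrate what content- and behavior-based inspection means in practice, the sketch below scores a script on generic signs of obfuscation (suspicious calls, very long encoded literals, high entropy) instead of matching a fixed signature. The indicators and thresholds are assumptions for the example, not any vendor's detection logic.

```python
# Minimal sketch of content-based script inspection at a gateway.
# Indicators and thresholds are illustrative assumptions, not a product's rules.
import math
import re
from collections import Counter

SUSPICIOUS_CALLS = ("eval(", "unescape(", "document.write(", "fromCharCode(")

def shannon_entropy(text):
    """High entropy is one hint that a script body is packed or encoded."""
    if not text:
        return 0.0
    counts = Counter(text)
    total = len(text)
    return -sum((c / total) * math.log2(c / total) for c in counts.values())

def suspicion_score(script):
    """Score a script on generic traits of obfuscation instead of a signature."""
    score = 0
    score += sum(script.count(call) for call in SUSPICIOUS_CALLS)
    # Very long hexadecimal or base64-looking string literals
    score += len(re.findall(r"['\"][A-Za-z0-9+/%\\x]{200,}['\"]", script))
    if shannon_entropy(script) > 5.2:          # assumed threshold
        score += 2
    return score

def should_block(script, threshold=3):
    """Block the response if the combined score crosses the threshold."""
    return suspicion_score(script) >= threshold
```

A filter of this general kind does not care which server the code came from or whether the exact bytes have been seen before, which is what lets it catch payloads that are randomized on every request.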
Sadly, this does not mean we can do away with the more traditional signature, content and blacklist filters now used on our networks and desktop PCs, or that we should abandon appropriate-use policies. Adding new layers of threat and defense does not remove the existing threats or the need for defenses against them. So for the foreseeable future, the cat-and-mouse game we have been forced into will continue, and advances in the speed and richness of content and functionality on the Web will, to an extent, be offset by the growing layers of defense needed to insulate us from the evil intent of others.