CA issues false warning on JavaScript apps

This story was updated with a statement from CA on 2 January.

A mis-firing anti-virus update from CA issued on Monday wrongly identified legitimate JavaScript files as a virus. The eTrust signature update flagged jQuery (a JavaScript AJAX library) and MooTools (a JavaScript web 2.0 library) as being contaminated …


This topic is closed for new posts.
  1. John Macintyre

    not the first time...

    ajax has been identified as a virus/bug....

  2. The Reg-ular

    Don't use packers

    This is because CA has decided to block the use of the Dean Edwards JavaScript "packer" code. JavaScript that uses the wrapper code:


    ... is being blocked and reported as JS/Snz.A. A lot of sites have packed versions of JavaScript libraries such as MooTools, jQuery, and so on. Many sites use Dean Edwards' own "IE7 patch" JavaScript which is, of course, packed using his tool.
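    For context, a packed script's payload is wrapped in an unpacker stub that runs before any of the real code does. The following is a toy sketch of that idea only, not Dean Edwards' actual code; the strings and dictionary here are invented for illustration:

```javascript
// Sketch of the packing idea: tokens in the payload are replaced by short
// numeric keys, and an unpacker stub restores them at runtime before
// eval'ing the result. (Illustrative toy, not the real packer.)
var packed = "1('0 2!')";                      // payload with tokens keyed out
var words = ["Hello", "console.log", "world"]; // the token dictionary
var unpacked = packed.replace(/\d+/g, function (k) { return words[k]; });
eval(unpacked); // prints "Hello world!"
```

    The real stub works the same way at heart: a dictionary, a substitution pass, and a final eval, which is also why a scanner only ever sees the stub, never the payload.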

    Following its use in a successful XSS worm that infected 600,000 users of a social networking site, many other hacker groups have begun using it, too. They are taking advantage of the fact that it has been whitelisted as a "legitimate" (non-underground) tool and, until now, not blocked.

    There is simply no good way to tell if the packed code is benign or malicious. Given recent events, there is a much larger chance today than in the past of it being malicious. Therefore, more anti-virus, web filtering, IDS/IPS, and firewall vendors will begin blocking it.
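    You can see why benign and malicious packed code are indistinguishable from how a header-based signature would have to work. A hypothetical sketch (the regex and both sample strings are invented; the real signature's logic isn't public):

```javascript
// A naive signature that matches the unpacker stub's header rather than
// any actual behaviour. Both samples below trigger it identically.
var signature = /^eval\(function\(p,a,c,k,e,d\)/;
var packedLibrary = "eval(function(p,a,c,k,e,d){/* packed MooTools */})";
var packedMalware = "eval(function(p,a,c,k,e,d){/* packed downloader */})";
signature.test(packedLibrary); // true: false positive on a legitimate library
signature.test(packedMalware); // true: the signature can't tell the difference
```

    Because only the stub is visible to a scanner, the signature fires on the wrapper, and everything wrapped in it, good or bad, gets the same verdict.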

    Have the bad guys won, then? No!

    Research has shown that these packers are not effective in doing what they are designed to do, which is to reduce page load times. For most JavaScript "in the wild" (as some like to say), their space-saving advantages are offset by the addition of the unpacker stub code. Readability and auditability (Firefox says that's not a word) are sacrificed, too -- is that packed file named mootools.js really MooTools, or is it a downloader for the latest Storm Worm EXE? It's not easy to tell. Besides, the load time is usually only impacted once in a long while, when it's first loaded into the browser's cache.

    The biggest drawback is execution time. By a wide margin, whatever gains are made in load time are lost in execution time. Some benchmarks I ran show these packers adding significant overhead to the code -- enough to impact the user experience negatively. On my test systems, four different publicly available packers added an average of 600 ms to execution time for each script. IE7 is by far the worst. The String manipulation done by the unpacking code, for some reason, executes very slowly in IE7, adding between 3 and 12 seconds!

    This happens every time, because the packed version is what's stored in the cache.
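    The per-execution cost those benchmarks point at is the unpack step itself, which the browser repeats on every run because only the packed form is cached. A rough micro-benchmark sketch (the dictionary size and workload are arbitrary; absolute numbers vary wildly by engine):

```javascript
// Measure just the dictionary-substitution work a packed script performs
// before any of its real code runs. (Toy workload, illustrative only.)
var words = [];
for (var i = 0; i < 5000; i++) words.push("identifier" + i);
var packed = words.map(function (w, i) { return i; }).join(";");
var t0 = Date.now();
var unpacked = packed.replace(/\d+/g, function (k) { return words[k]; });
var elapsed = Date.now() - t0;
// 'elapsed' is paid on every single execution; the String-heavy nature of
// this work is what the commenter blames for IE7's 3-12 second stalls.
```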

    While use of these packers is usually well-intentioned, it doesn't generally have the desired benefits for end users.

    With no real-world advantages today, these packers are primarily used to prevent casual ripping of copyrighted script, offer a layer of security through obscurity, and provide camouflage for hackers' exploits and malware.

    I think we should urge developers away from the use of packers. I think more security companies should proactively protect their clients from packed scripts instead of waiting to write a signature after every attack is already underway.

    Bravo, CA, for taking the initiative. Clients of companies that are too conservative in blocking packed scripts, just because some people use them with good intentions, are sitting ducks for the next XSS worm or 0-day exploit and whatever payload it delivers.

    Oh, happy new year!

  3. Anonymous Coward

    Blocking Packers

    For JavaScript there is very little reason to use such tools: if it saves 1K on the file size that is literally nothing, it loads no faster; and if it saves 100K, perhaps your JavaScript could be optimized a little better?

    However, for executables there are several packers, and I personally use them for every program I release. Reducing a 500K file down to 50K doesn't help for small-time downloads, but for several million downloads it can make the difference between one web server handling it easily and having to get 2-3 servers to host it.

    However, when it comes to AVs detecting packers as malware, I can only assume it is through laziness: they see some malware, see five variations that all have the same header on the file, and add that header to the detection rule without bothering to check that it comes from a widely used packer (or widely used installer... I had the same AV first report a program of mine as a virus for using a common packer, then a little later it started reporting the commonly used installer as a virus!)

  4. tony trolle
    Thumb Down

    good read

    nice one The 'Reg-ular'

    I find it very odd that a lot of CA stuff costs very little after you get your rebate back (if you're lucky enough) in the USA.

    They have their own site.

    Just checked Fry's: no rebates running at the moment, but Norton and Kaspersky are rebated down to just paying sales tax. WTF

  5. Rich

    False positives will be big in 2008

    I predict we'll be reading false positive stories in the non-IT press before long.

    AVG baulked at one of my VS files the other week.

    I'm thinking that the size of a typical virus "signature" string was set at a reasonable level some years ago, based on the number of viruses and the number of distinct files in the world. We may have reached the point where this is too small, and the antivirus firms would be advised to change their applications.
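    A back-of-envelope model of that point: the expected number of benign files that coincidentally contain a random k-byte signature grows with the size of the world's file corpus. The corpus numbers below are invented assumptions, and real file bytes aren't uniformly random, so treat this as a shape, not a measurement:

```javascript
// Expected coincidental matches of a random k-byte signature against an
// assumed corpus of benign files. (All inputs are illustrative guesses.)
var benignFiles = 1e9;   // assumed number of distinct files in the world
var avgFileSize = 1e6;   // assumed average file size in bytes
function expectedFalsePositives(sigLenBytes) {
  // roughly one chance in 256^k per byte position, summed over the corpus
  return benignFiles * avgFileSize / Math.pow(256, sigLenBytes);
}
expectedFalsePositives(6);  // > 1: a short signature already risks collisions
expectedFalsePositives(16); // vanishingly small
```

    Under this toy model, as the corpus grows, signatures have to get longer (or smarter) just to keep the false-positive rate where it was.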
