"only three of 33 antivirus programs detected the malware"
Care to divulge which three?
Security maven Mary Landesman is in the midst of piecing together a who-done-it involving the infection of hundreds of websites that are generating an enormous amount of traffic. Or maybe it's a how-done-it. Either way, she's mostly drawing blanks. Landesman is a researcher for ScanSafe, a company that monitors the web surfing …
I was worried for a moment then realized that this primarily only affects those poor saps that insist on running mission critical stuff on Windows/IIS. Have fun! It's getting boring keeping these Linux/Apache/MySQL/PostgreSQL servers humming along (months without a reboot, yawn).
"I was worried for a moment then realized that this primarily only affects those poor saps that insist on running mission critical stuff on Windows/IIS."
Um, as much fun as Windows bashing is, make sure you at least sound credible:
"They don't use the same web host, and while most use web serving software from Apache, the versions vary widely, making it unlikely that attackers are exploiting a vulnerability in that program."
The only really common server-platform vulnerability I can think of that may not be patched, for reasons of backwards compatibility, belongs to Python. That's not to say there aren't others, but that one stands out, as many popular scripts simply will not run on the latest version. Don't go by what I say, I am often wrong. Off to check the sites then.
Checking random sites from the list on uptime.netcraft.net, it looks like they're all running Apache 1.3 and PHP 4.4 (although these aren't technically EOL). I didn't see any with 4.4.8 (the latest version). Who knows what other unpatched PHP software/modules are on there. Even Linux needs security updates once in a while :P
I know I just went through a critical Perl update a couple of months ago on my webserver; without it, cPanel and other associated server-side software wouldn't update either.
"Mom and Pop" websites? Does this perhaps mean they are using cheap shared hosting or webhost reseller packages without proper WebHostManagement licenses, and are therefore being updated rarely or not at all... plus the server ops not giving a rat's ass?
I was there years ago; there's a lot of crud webhosting and reseller packages out there. I once found some very bad stuff in the files of one site (of about 25 I had hosted on a cheap reseller package out of Canada). Very poor administration and security... cheap is cheap might be the common thread.
just another thought
Of the hosts that responded with a "Server" HTTP header, all of them had mod_bwlimited/1.4 installed. Versions of Apache, PHP, etc. varied. It looks like most of them are old cPanel installations (mod_bwlimited was widely included with that).
My suspicion is that someone broke in via SSH (probably using brute force) and then built a new mod_bwlimited module after gaining root (via an old exploit, as these systems all seem to be quite old). All of the hosts seem to have SSH and just about every other service imaginable open to the world.
As mailed to the author of the article, perhaps the solution is far simpler than a mysterious cross-platform exploit.
The infection occurs on multiple server platforms, but all on small-to-medium-sized business sites. The kind a web developer would create on their desktop PC and then FTP up to a shared hosting server.
Could the infection method be via stealing cached FTP passwords from (easy to compromise) desktop systems and then FTPing the infection code up to the site? Not too hard to search for index.html on a server and insert a <script> tag in the <head> block.
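If that were the vector, a site owner could at least check for the symptom: a script tag injected into the head block of their pages. A minimal sketch of such a check (the helper name is mine; the filename bjkwq.js is the one quoted later in this thread):

```python
import re

def find_injected_scripts(html):
    """Return the src of any <script> tags inside the <head> block."""
    head = re.search(r"<head[^>]*>(.*?)</head>", html, re.I | re.S)
    if not head:
        return []
    return re.findall(r"""<script[^>]*\bsrc=['"]([^'"]+)['"]""", head.group(1), re.I)

page = ("<html><head><title>Shop</title>"
        "<script src='bjkwq.js'></script></head><body>...</body></html>")
print(find_injected_scripts(page))  # ['bjkwq.js']
```

Of course this only catches what's in the file on disk; if the injection happens at serve time (as later posts suggest), you'd have to fetch the page over HTTP and check the response instead.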
Lots of small UK sites -- online florists, specialist travel agents, that sort of thing -- over the last month being detected by our Bluecoat ICAP with Sophos signatures as Mal/ObjJS. But the reported URLs don't end in .js.
I don't really pay enough attention to what gets stopped -- makes me a bad citizen I suppose.
But it's nice to have figures for my "no site is really safe" educational campaign....
I am the owner of this site and noticed it mentioned here, and found there was indeed some fishy activity, although my own AVG and Antivir software didn't detect anything. However, I spoke to the hosting provider, and it seems it was a bug in Apache which has now been resolved, at least on our website.
However I've just found some more intriguing behaviour; on the second wget to the same site (having picked another one at random) the .htm file doesn't contain the link to the .js file. Followed those two with a wget to get the .js, but found that a second wget to fetch the .js got 404'd.
I assume from this that it's keeping track of IP addresses and making sure that only one copy of the .js gets delivered per machine.
First of all, Jamal, your site is still infected:
body onload="initDate()";>------- language='Java------' type='text/java------' src='bjkwq.js'>--/------->
------------Dubai hotels, Tours, Desert Safaris
and online hotel reservations. Lowest Rates Guaranteed------
( Sorry I have to sanitize severely, else it won't post )
For info, the js attempts 8 different exploits (which can probably work only on Windows desktops).
The virus is smart enough to attempt the exploit only once per ip address.
Next, all these sites run Linux. Many posts here clearly show the real problem with security on Linux: too many users deny the problem even exists. No matter the facts, all the blame gets pushed onto admins, Windows, "end of life" software, specific distros... They boast about "no reboot for so long"...
Linux users should accept that security is a problem on this platform too, and because of the success of this platform they have to deal with users like Jamal, who search for worms on their server with AVG on their Windows desktop.
So now I've got two copies of the .js; they differ in one line:
< var arg="qgenahfr";
> var arg="dqwejbdj";
arg is appended in the script to the hostname thus:
and again it's a one-shot download - the second GET is a 404.
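For anyone who wants to repeat the comparison, a quick sketch with Python's difflib (the file contents here are stand-ins; only the arg lines are real, taken from the diff above):

```python
import difflib

# Stand-in contents for the two captured copies; per the diff above,
# they differ only in the value assigned to `arg`.
copy_a = 'var arg="qgenahfr";\nvar payload="...";'
copy_b = 'var arg="dqwejbdj";\nvar payload="...";'

changed = [line for line in difflib.unified_diff(
               copy_a.splitlines(), copy_b.splitlines(), lineterm="")
           if line.startswith(("-var", "+var"))]
print(changed)  # ['-var arg="qgenahfr";', '+var arg="dqwejbdj";']
```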
The download appears to be a Windows binary - I ran "strings" on it and it's full of this sort of thing:
A) Client machines have Apple Updater installed (usually via QT)
B) Infected servers are small-time servers that probably aren't quite up to date on SSL libraries
C) Infected servers have mod_bwlimited/1.4 installed.
Methinks the infection started with someone engineering a .mov that scripts the download of the server-infector and backdooring code.
Solution: avoid QuickTime (and its evil twin iTunes, since the latter gets force-downloaded as soon as Apple Updater has a chance to spoonfeed it to you!)
(who really enjoys not having slowtime installed)
This is certainly an interesting one. Initially I thought it must be an Apache module that had been installed or doctored which was inserting the code into the pages; I've certainly not seen anything like this working at kernel level before.
This doesn't appear to be that difficult to write signatures (or heuristic rules) for, though: the exploit xxxxx.js files are all the same across domains, other than the filenames and the very first line of the file, 'var arg = '.
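That observation suggests a cheap signature: hash everything after the first line, so the varying 'var arg' line and filename don't matter. A sketch, with made-up file contents:

```python
import hashlib

def body_fingerprint(js_source):
    """Hash everything after the first line, since only the leading
    'var arg = ...' line varies between copies of the exploit script."""
    _, _, body = js_source.partition("\n")
    return hashlib.sha256(body.encode()).hexdigest()

# Made-up stand-ins for two captured copies of the exploit script.
a = 'var arg="qgenahfr";\n<identical obfuscated payload>'
b = 'var arg="dqwejbdj";\n<identical obfuscated payload>'
print(body_fingerprint(a) == body_fingerprint(b))  # True
```

A heuristic rule in an AV or proxy could match on that body hash, or simply on the 'var arg=' first-line pattern plus the five-random-letters filename shape.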
Looks like a bit of a pain to clean up your server though - especially given that chkrootkit doesn't appear to find it according to that WHT page.
This has gotten me thinking now. Following a bit of Googling for 'trojan kmem kernel' I've found a number of forum posts reporting very similar issues:
Here for instance: http://www.gossamer-threads.com/lists/fulldisc/full-disclosure/27857
As one of the posts I've read [somewhere] says, inserting the malicious code through an Apache module would be extremely straightforward to do. Having a kernel module do the same thing (although much more complicated to write) would be much harder to detect (it's got us talking!) and much more difficult to remove, half the problem being knowing what you're looking for.
It looks like an evolution of the code on that xpire.info link I posted above, which inserted an iframe into the page; now the compromised server is hosting the whole shebang. It also shows the hallmarks of modern web attacks:
1) Hiding from sys-admins trying to remove it in order to remain active as long as possible
2) Giving researchers the run-around by only exploiting once per IP
3) Using a number of published exploits in order to get a binary onto the target machine
4) Obfuscating the actual exploit code through various means to try to prevent static/automated analysis
It's certainly not 'randomly' inserting the code into the page / serving the exploit. It's doing it once per IP; once you've had your fill of exploit, there's no coming back for seconds. Randomly inserting the code would be pretty silly: some people (potentially AV researchers) would get multiple copies, whilst others wouldn't get it at all. Some people may visit multiple pages within the same site, thereby getting further chances of randomly encountering the exploit, whereas others may visit the homepage then move on. Serving the malicious code once per IP gives everybody a fair shot at getting infected whilst slowing researchers down a little.
Unlike Storm and similar, the server isn't re-obfuscating the exploit code on the fly. The server contains a static copy of the trojan and the obfuscated exploit, with probably a simple string replace on the "var arg = xxx" line. This means that every copy of the xxxxx.js file is identical across all servers and all domains. As posted on the WHT forum page, there are no traces of the .js files on the server's disk, so the responses themselves must be generated on the fly from that static copy. There is obviously quite a bit of state being kept on the server though: who has downloaded the files already? What filename did I tell this IP address to use (unless it's a hash of some sort)? I would guess this is stored in a file somewhere on the system (whose existence the rootkit is then denying).
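A toy model of the state the server would need to keep, assuming the one-shot-per-IP behaviour some posters observed and a hash-derived filename. Both the scheme and the names here are guesses for illustration, not anything recovered from the malware:

```python
import hashlib

class InfectionState:
    """Toy model of the per-IP state a compromised server would need."""

    def __init__(self):
        self.served = {}  # ip -> filename already handed out

    def filename_for(self, ip):
        # One plausible scheme: derive a stable 5-letter name from a hash
        # of the IP (the thread guesses the names might be a hash of some sort).
        digest = hashlib.md5(ip.encode()).hexdigest()
        return "".join(chr(ord("a") + int(c, 16) % 26) for c in digest[:5]) + ".js"

    def should_serve(self, ip):
        if ip in self.served:
            return False  # one shot per IP: later requests get a 404
        self.served[ip] = self.filename_for(ip)
        return True

state = InfectionState()
print(state.should_serve("203.0.113.5"))  # True: first visit gets the exploit
print(state.should_serve("203.0.113.5"))  # False: second visit gets nothing
```

If the filename really is a deterministic hash of the IP, the server wouldn't even need to store it, which would fit with nothing being found on disk.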
It would be interesting to know how the trojan infected the servers in the first place, given that installing either an Apache module or a kernel rootkit would require root privileges. Could this be a buggy PHP script plus some fancy privilege escalation, or has the attacker somehow SSH'd into the box? I guess the only way to answer these questions would be to get hold of a compromised box and hope they didn't clear the log files out.
It would be interesting to look at the rootkit itself though - modifying Apache replies on the fly can't be the easiest thing in the world to achieve.
An exploit in APACHE? Good grief Charlie Brown, does this mean LINUX MIGHT NOT BE SO SECURE AFTER ALL? Sorry for the caps, but I really had to yell that.
And the exploit hook on Windows machines (well, you want to attack the majority, that's Statistics 101) involves APPLE SOFTWARE? NO! THE MIGHTY APPLE IS INFALLIBLE! Well, that's what Mac fanbois always yell...
All computers are exploitable, so long as most of the people writing software keep looking for the shortest, easiest, quickest, dodgiest ways to do things; lucky we don't build physical infrastructure with such easy exploits, like tram systems!
So now, how will the "community" react?
Those who have the chance to find someone like "Scott.MC" from WHT will have their servers cleaned, one by one, manually. Those who get the help of the many "linux techs" from WHT will just reboot/reinstall endlessly, keep on serving viruses, and complain about "frontpage virii"...
Will Red Hat release "critical updates" like the "Malicious Software Removal Tool", or will they just say customers should migrate to RHEL5 with SELinux enforcing (OK, "targeted" is enough for protecting /dev/mem, I think)? What will happen for the other (non-commercial) distributions? How many Linux admins will be able to fix the systems they manage?
> An exploit in APACHE? Good grief Charlie Brown, does this mean LINUX MIGHT NOT BE SO SECURE AFTER ALL?
Yes, an exploit in Apache... no, it doesn't mean that it's fundamentally more secure than Windows... after all, there will always be Linux exploits, but there will never be Linux viruses... let's not go through all the FUD again; let's just agree that no one has yet, in the last decade, claimed the thousands of pounds that have been on offer from Netproject for anyone able to infect one of their properly configured Linux machines with a "virus".
Get a real multi-user operating system.
I see some of you trying to find commonalities.
Doesn't it occur to you that if an exploit checks the OS to see what it is vulnerable to, an exploit of webserver software could, in the same vein, have a similar plan? In that case the only commonality would be that someone is looking to hack them for some reason.
(Most likely one of two things)
1) Create a new bot army for childish reasons.
2) Cause general chaos, regardless of whether it be a PO'd individual, group, government, or company looking to make the competition look bad.
Yawn, same story different day, only thing different this time is more forethought before the attack. IOW, you don't put all your apples in one basket if you want to survive someone else shutting things down.
I don't care if I look dozy because operating systems and desktop software are not my field at all.
On my home PC, I always refrain from surfing with administrative privileges. I only do it if I'm trying to use WindowsUpdate.com or something like that. I also allow IE7 to block add-ons and do not run them unless really necessary. Does this mean my PC is unlikely to be infected, or does it make no difference?
Secondly, I have a small website hosted on which I update from the same desktop using WS_Ftp. How can I be sure whether it is infected or not?
Sorry but this is a release from a company that makes money scaring people into buying their product.
Couldn't it be that, at any given time, it's perfectly normal for 15% to come from a few hundred sites?
Seems like: criminal packages client exploit, criminal picks server vulnerability to exploit (usually old software for which a security update has existed for months), criminal infects several hundred sites......
Outbreaks like this could be the norm.
The only thing new or clever is maybe randomising the name of the .js file, so you can't just find all the sites with evil.js and then know they were infected as part of the same attack.
That'll teach me not to get up until lunch :o/ Seems all the domains listed have been scrubbed and are (with the exception of reallybored.com which prolly actually gets some traffic) showing cPanel holding pages.
I mean Jesus - web hosting support staff working on a weekend?! Who'd a thought it?
The domains seem to be spread across a number of hosts, although it is of course possible that some of those hosts are resellers and it's one parent host that has been exploited. Probably not tho - I would bet 50p on it being an automated cPanel exploit.
Oh well, no reversing on a quiet Sunday afternoon for me :o(
It doesn't serve the JS once per IP. I automated 100 requests to three of the sites listed (whilst they were still running) and the JS was inserted between 3 and 10 times on each. Interestingly, the frequency with which it was present declined as the number of requests increased (i.e. it was always there on the first, then usually the third, then the tenth, then maybe around 20-30, etc...).
I agree that there will be some kind of hash table storing information about recent visits, but I imagine that it's probably an in-memory table, and not likely one that you'll find on disk anywhere.
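A toy model of that declining frequency, using an invented serving schedule (1st, 3rd, 10th and 30th request per IP, then nothing) that roughly matches the pattern reported above; the real malware's rule is unknown:

```python
# Toy model of the declining serving frequency: hand out the exploit on the
# 1st, 3rd, 10th and 30th request from a given IP, then stop. The schedule
# is invented to roughly match the observations above, nothing more.
SCHEDULE = {1, 3, 10, 30}
hits = {}

def serve_exploit(ip):
    """Count requests per IP and serve only on scheduled request numbers."""
    hits[ip] = hits.get(ip, 0) + 1
    return hits[ip] in SCHEDULE

results = [serve_exploit("198.51.100.7") for _ in range(30)]
print(results.count(True))                          # 4
print([i + 1 for i, r in enumerate(results) if r])  # [1, 3, 10, 30]
```

Whatever the actual rule, this kind of counter fits the "3 to 10 insertions per 100 requests" numbers better than a strict one-shot, and it could live entirely in an in-memory table as suggested above.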
I too would be interested in having access to a compromised server (not that I'm volunteering one of my servers!!)
Ok - I am no good on these things - i just keep my av running and pray.
BUT - How long has it been since an attack came out on the source of software.
There were supposedly two bad-code problems that came out of suppliers on original CDs, and this was before the WWW. One got onto a machine because the programmer took his code to a trade show, it was infected there, and it was then put into production.
What I am getting at is a contaminated source for the download of Apache or an associated program. Mom and Pop sites could have gotten downloads instead of original CDs. Only one alternate download site need have been contaminated or rerouted to China or something.
Just a thought - paranoid but it matches the criteria - different hosts etc
AC says: > Why oh why does everyone still keep linking Apache/PHP as "Linux".
I agree, but let me fix your phrase, which clearly comes from the heart:
"Why oh why do people that can't distinguish an operating system from a webserver spend their time posting here instead of reading Computers For Dummies?" To say nothing of cretins who in some incomprehensible manner suddenly bring up Red Hat (?!??!).
Token effort at saying something intelligent:
The article says the infected sites are "generating an enormous amount of traffic". I imagine that this is not only due to visitors visiting said sites. Does this mean that the "sites" in question carry out lateral attacks, i.e. that the hosting machine has become a general attack platform (aka a roach motel)?
(Image of His Holy Jobsiness just for the hell of it)
I ran some scripts.
Run 1: 9 js
Run 2: 4 js
Run 3: 6 js
Run 4: 1 js
Run 5: 2 js
Run 6: 3 js
(Average chance of catching the js file: 25 hits over 6 runs of 35 requests = 25/(6×35) ≈ 12%)
cgolu.js czynd.js eenom.js eqfps.js erztp.js frpmg.js iggmy.js jiodm.js khkev.js kksyr.js kobgw.js kolqj.js lvmlt.js nrvaj.js oalhi.js pcqab.js tezam.js tfxep.js unolc.js vduoz.js vjytq.js wdnfn.js xihrj.js yrslu.js zouoq.js
Then I wget'ed www.peshawarjob.com/index.html 52 times.
46 times the filesize was 20834
6 times it was 20917, the extra bytes being the js reference:
eeeoc.js fsqnp.js fxpui.js ibumz.js qfkjh.js rajuw.js
6/52 = 11.5% by the way
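For what it's worth, the two rates work out consistently, assuming each of the six scripted runs was 35 requests (which is what makes the quoted 12% come out):

```python
# Sanity-checking the two detection rates quoted above, assuming each of the
# six scripted runs was 35 requests.
hits = 9 + 4 + 6 + 1 + 2 + 3           # js captures across the six runs
rate_runs = hits / (6 * 35)            # scripted runs
rate_wget = 6 / 52                     # manual wget run
print(f"{rate_runs:.1%} vs {rate_wget:.1%}")  # 11.9% vs 11.5%
```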
wget -S showed a diversity of servers. * means "latest version"
1 WebServerX (ie Apache)
27 UNIX & Linux with 7 Undefined
7 (Red Hat)
13 PHP of 28 reporting
10 OpenSSL of 28 reporting
10 mod_ssl of 28 reporting
Others (of 28 reporting)
(While doing this I got more js names:
arqmi.js bdjpm.js gljkm.js gtrmu.js ietnf.js lgvte.js oavnn.js pmglm.js qriox.js tdhse.js tzkuo.js urofu.js uvbvk.js wpjph.js wrfbn.js xcats.js )
All the names returned today are unique in 46 attempts.
I am not a security guru and this is less than scientific, but it doesn't seem to point the finger at any single module. However, all instances are Apache, so my guess is something is going on in Apache or lower down.
By using wget I was able to catch the js file 12% of the time, about 1/8th. It was happy to serve me multiple js scripts using wget.
When you get the js, it is basically 31K of escaped hex. When you run it in a safe place (i.e. an unconnected, sacrificial Ubuntu box with Firefox) you see references to ActiveXObjects, AJAX, and something called "mosvs8.exe" - the JS_IESLICE. Heh. ActiveX. Again. Switch it OFF!
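The escaped-hex obfuscation itself is trivial to undo offline without executing anything: classic malware JS wraps the payload in unescape("%xx%xx..."), which Python's unquote reverses. A sketch with an invented payload string (a real sample is ~31K; decode it somewhere safe, never in a browser):

```python
# Undoing the kind of "%xx" escaped-hex obfuscation described above.
# The payload string here is invented for illustration.
from urllib.parse import unquote

obfuscated = "%76%61%72%20%61%72%67%3d%22%71%67%65%6e%61%68%66%72%22%3b"
print(unquote(obfuscated))  # var arg="qgenahfr";
```

Real samples usually layer several rounds of this plus string splitting, so you may need to repeat the step on the decoded output.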
I think the .htaccess buffer overflow was fixed in Apache 2.0.51.
Perhaps asking the site owners for an inventory of all software / patch levels from the boxes serving this content would allow for some cross reference? Perhaps Mary should consider this?
Personally I suspect some unpatched CMS may be responsible for this.
Many of these sites seem to be using SEF URLs based on the category/title of the content you click through to, i.e. "homes-for-sale.htm" etc., and this is classic CMS behaviour. This would also account for the fact that there is no obvious pattern. Most CMSes based on PHP/Apache will run cross-platform, and admins usually try to conceal the fact it's a CMS from the client browser, making it quite difficult to spot a pattern.
Hope this helps
I wonder if these servers have had their Cpanels patched in the last 18 months? I wonder if perhaps somebody's found another hole...
(Steve icon seems appropriate, seeing as he's been implicated in this too.)
Biting the hand that feeds IT © 1998–2020