Boffins defeat all ransomware: by installing Linux.
Florida U boffins think they've defeated all ransomware
Researchers from the University of Florida and Villanova University reckon ransomware can be stopped by watching what it's doing to the target's files. Taking a “save what you can” approach, the authors of this PDF reckon in their tests they were able to lower the boom on ransomware when it had encrypted just 0.2 per cent of …
COMMENTS
-
-
Tuesday 12th July 2016 04:10 GMT Flocke Kroes
For the time being, just about ...
There have been several attempts at ransomware for Linux. Some actually encrypted files. Last time I looked, the encryption keys were recoverable for free.
Offline backups still have additional value as a ransomware recovery strategy. Backups should not be considered successful without a restore. As the restore is required anyway, I use a non-networked machine that is not x86 or ARM, and check for some canary files.
I would be grateful if other Linux users were at least as paranoid, so Linux ransomware does not become a multimillion-dollar-per-month enemy.
-
-
Tuesday 12th July 2016 09:41 GMT Dave 126
Some ideas, feedback appreciated:
- Backups would benefit from running a different OS to the user's machine. The chance of a single piece of ransomware encrypting both, say, a Windows workstation and a Linux server at the same time is lower than the chance of it encrypting either system alone.
- A router with OpenWRT can be configured so that the backup machine is only accessible at certain times of the day or week, to coincide with scheduled backups. This means that the user's machine, including any nasties, can't reach the backups the rest of the time. It will never be as secure as physically unplugging the backup server, but removing the plugging/unplugging chore from the user might be worth it. The router could be configured so that server A is accessible on Monday, server B on Tuesday, and so on - something like the sketch below.
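A rough, untested sketch of that in the router's crontab (addresses and times invented):

# /etc/crontabs/root on the OpenWRT router
0 2 * * 1 iptables -D FORWARD -d 192.168.1.50 -j DROP   # Mon 02:00: unblock backup server A
0 4 * * 1 iptables -I FORWARD -d 192.168.1.50 -j DROP   # Mon 04:00: block it again
0 2 * * 2 iptables -D FORWARD -d 192.168.1.51 -j DROP   # Tue: the same window for server B
0 4 * * 2 iptables -I FORWARD -d 192.168.1.51 -j DROP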
Thoughts?
-
Tuesday 12th July 2016 11:21 GMT Sitaram Chamarty
my backup strategy
(I know you didn't ask me, but still...)
I have a simple strategy that consists of actually reviewing the files that my incremental backup program reports as having changed. (The backup program itself is "borgbackup" -- awesome stuff; look it up. Unix only though).
A modification of this could be to keep a trend of the number of files changed per day in each top-level directory, and alert someone if something unusual happens.
An even simpler way that often works (for single desktops) is to count how many files changed today, and alert if that is at least 1.5x the maximum number of files changed in any of the last N days (adjust N to taste). The alert should list the actual files that were changed, so someone can quickly determine whether there was a problem or "oh yeah, those files, we know what all those changes are".
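As a rough, untested sketch of the counting idea (paths, threshold and the alert address are invented, and it keeps a running maximum rather than a true N-day window):

#!/bin/sh
TODAY=$(find "$HOME" -type f -mtime -1 | tee /tmp/changed_today | wc -l)
MAX=$(cat /var/tmp/max_changed 2>/dev/null || echo 0)
if [ "$TODAY" -gt $((MAX * 3 / 2)) ]; then
    # list the changed files so a human can decide "problem" or "oh yeah, those"
    mail -s "Unusual: $TODAY files changed today" admin@example.com < /tmp/changed_today
fi
[ "$TODAY" -gt "$MAX" ] && echo "$TODAY" > /var/tmp/max_changed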
The assumption is that the malware (if any) has not borked my borgbackup software to produce false reports of what it is seeing. I suppose in theory that could happen with a more popular backup tool so YMMV.
-
Tuesday 12th July 2016 16:00 GMT Flocke Kroes
@WebLogons
This is my home system. Computers are mostly Raspberry Pis and some similar devices predating the Pi. This applies to home directories. When I modify system config files, tested and working versions get copied to a directory in /home/ so everything is in one place. Video files are dealt with separately because they are too big and new ones do not appear every week.
Out of laziness, nightly backups are cron+rsync over the network to a single point of failure. Weekly(ish) backups use tar+bzip2 to one of two USB disks. The USB disk gets moved to an old MIPS board and is checked with a script that includes: bzip2 -cd /mnt/backup/latest.tar.bz2 | tar -xf - -T path/to/canary_list
bzip2 will go through the entire file and complain to stderr if it is not valid. The canary files are checked with cmp. Remember to delete the canaries before extracting them.
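The whole check boils down to something like this sketch (paths are illustrative; the pristine canary copies live on the checking machine, not in the backup):

#!/bin/sh
cd /tmp/restore || exit 1
rm -rf ./*    # delete the old canaries before extracting fresh ones
bzip2 -cd /mnt/backup/latest.tar.bz2 | tar -xf - -T /root/canary_list || exit 1
while read f; do
    cmp "/root/canaries/$f" "./$f" || echo "CANARY MISMATCH: $f"
done < /root/canary_list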
This strategy depends too much on human discipline and does not scale. The advantages are that it was made mostly out of kit I had lying around gathering dust, that it is sufficiently odd that mass-market malware will not understand it, and that the data protected is not valuable enough to be worth a competent cracker's time.
I hope there are some ideas you find helpful. If you need an old MIPS device, look for a dusty box full of old routers and see if any of them are on OpenWrt's supported devices page. Linux's supported architectures are x86_64, i686, ARM, MIPS (often) and various problems I hope I never have to deal with again. MIPS should be sufficiently odd to annoy crackers. If you can find fewer than three working i686 laptops, one would be a good choice.
-
-
-
-
Tuesday 12th July 2016 21:33 GMT Fatman
RE: WinTrolls vote-flame logical post!
<quote>WinTrolls vote-flame logical post!
They need to go out and play Pokeball and let us real geeks talk Linux ;-)</quote>
Or the latest craze:
http://wfla.com/2016/07/11/pokemon-go-is-addictive-spreading-fast-in-tampa-bay-area/
There has been at least one local media report of some IDIOT being robbed while playing this STUPID GAME.
-
-
-
Tuesday 12th July 2016 03:54 GMT Flocke Kroes
Next gen ransomware
Disks are way too big these days, so there is room to add plenty of extra data that is not required for decryption. For each old file, create a new one of the same type. Each new file contains chunks from other files of the same type in a random order, an encrypted map of where the chunks came from, and repeats of sections of the encrypted map to reduce entropy. When most of an old file's contents are stored in new files, modify the old file preserving its type, including the remaining chunks of original data in a random order, and add the final encrypted map.
This will not trigger any of the indicators mentioned in the article. It costs some code for each file type, but even with just jpeg and docx people will have plenty of incentive to pay up. Ransomware distributors are well funded, so I am sure they can afford the development effort more than Florida University boffins can afford to counter it.
-
-
Tuesday 12th July 2016 09:34 GMT Mayhem
Re: Next gen ransomware
How about a process that iterates down through the folder tree, spawning a separate encrypting process for each individual file, each of which exits when its file is finished? Keep a watchdog so that only a certain number of child processes are running at once, so that the system doesn't slow down too much.
Effectively bulk changing files one file at a time, in a slower but less detectable fashion.
-
Tuesday 12th July 2016 14:41 GMT Flocke Kroes
Re: Next gen ransomware
DavCrav: The article had "Bulk modification of file types", so my plan was not to change the file type, just the data after the header identifying the file type. The bulk of the data in a file does get modified. If the defenders try to detect that, then I would make a small modification to a bunch of source files, then go through them again and again until thoroughly trashed, then pick another bunch.
Mayhem: I like the idea of splitting the work among child processes. I thought the defenders were looking at the file system, not the activity of individual tasks, but might as well burn that bridge before someone tries to cross it.
Dr Syntax: "if the file changes look OK" is the tricky bit. Attacker and defender can both arrange that files with properly documented formats are valid. If the file format is documented except for some secret binary blobs, then attackers cannot create valid files and defenders cannot check them.
-
-
Tuesday 12th July 2016 10:45 GMT Doctor Syntax
Re: Next gen ransomware
"Disks are way too big these days"
So there's the basis for a defence strategy - copy on write. Then apply the sort of approach these guys are taking to detect file changes - if the file changes look OK surplus old copies can be quietly garbage collected.
However, I still think the best approach is one where user programs don't get direct access to the files: they make requests to a server, and there is then a means for the server to verify the requesting program. Cryptowhatnot doesn't get the ability to read and write your spreadsheet, because it's not the recognised client the server will do that for.
-
Wednesday 13th July 2016 19:50 GMT Charles 9
Re: Next gen ransomware
"However, I still think the best approach is one where user programs don't get direct access to the files, they request a server and there should then be a means for the server to verify the requesting program - cryptowhatnot doesn't get the ability to read and write your spreadsheet and it's not the recognised client to get the server to do that for it."
All you do then is switch the target from the program to the server. What man can create on a computer, man can usurp.
-
-
-
Tuesday 12th July 2016 09:46 GMT Dale 3
Rate detection
Our IT guys stopped a ransomware infection in the act (by pulling the network cable on the victim machine). They detected it by one of their monitoring systems noticing high volumes of files being changed on the network by the same PC in a short space of time. Network files were restored from backup within a couple of hours. The victim PC got re-imaged and lost everything local, but we're encouraged to avoid keeping things locally for just this sort of reason. I suppose it might have gone unnoticed longer if the malware had trickled its network activity. It's always going to be an arms race between malware and anti-malware, but being able to analyse and monitor the types of changes being made to files sounds like a good supplement to volume and rate statistics when defending against attacks.
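A crude version of that volume trigger can be lashed up with inotify-tools (share path and threshold invented; a real monitoring system is rather more subtle):

#!/bin/sh
# count change events per minute on the share; shout if the rate looks like bulk encryption
while true; do
    N=$(timeout 60 inotifywait -m -r -q -e modify,create,moved_to /srv/share | wc -l)
    [ "$N" -gt 500 ] && logger -p user.alert "possible ransomware: $N changes/min on /srv/share"
done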
-
Tuesday 12th July 2016 10:15 GMT Somone Unimportant
...or use honeypots
Our file server is pretty much open to write by all (loooong story, but we are moving to a new filestore system), and CryptoLocker has hit us hard in the past.
However, I've found that placing honeypot files around the file system, checking their integrity every minute or so against a known secure copy, and flagging an immediate alert on any difference does the trick pretty well.
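The check itself can be as dumb as a cron job every minute, along these lines (the list and paths are invented):

#!/bin/sh
# compare each honeypot against its pristine copy; any difference means trouble
while read f; do
    cmp -s "/srv/share/$f" "/secure/copies/$f" ||
        mail -s "HONEYPOT MODIFIED: $f - possible ransomware" it-alerts@example.com < /dev/null
done < /etc/honeypot_files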
Some of the cryptolockers are getting smarter, randomly targeting files instead of iterating through a filesystem, or encrypting two or three files, sleeping for an hour, then waking up and repeating, so this method is becoming less useful.
But if you had enough disk storage I'm sure that you could do something like a disconnected RAID-1 and watch for weirdo changes like the authors propose.
-
Wednesday 13th July 2016 11:10 GMT Paul Crawford
Re: ...or use honeypots
Use a server with something like ZFS that supports snapshots and is copy-on-write. Then seeing massive disk use between snapshots is a clear sign of bulk modification, plus you can go back to previous snapshots to recover the data quickly.
Try FreeNAS on, say, a bottom-end HP Microserver with 4 x 6TB disks or similar and 12GB or 16GB of RAM. Under a grand for a system with 12TB of well-protected storage. OK, you need to make damn sure that snapshots are on and *WORKING* (hint: make sure 'recursive' is ticked) and that control over the NAS is secured so malware can't go in and disable stuff or simply wipe it. But that is kind of basics anyway.
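At the command line the idea looks roughly like this (pool and dataset names invented):

zfs snapshot -r tank/share@$(date +%Y%m%d-%H%M)   # scheduled from cron, e.g. hourly
zfs list -t snapshot -o name,used -r tank/share   # a suddenly huge 'used' means bulk rewrites
zfs diff tank/share@0100 tank/share@0200          # see exactly which files changed
zfs rollback tank/share@0100                      # recover to the last clean snapshot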
-
-
Tuesday 12th July 2016 11:57 GMT Jeff 11
Detection could be a moot point if a successive generation of ransomware works silently as a rootkit, encrypting the disk gradually in the background and intercepting filesystem calls to provide the plaintext version until everything is locked up. Then it uploads the keys to a box on the net somewhere, removes them from the machine, and sticks up the user with the 'pay up' dialog box.
If I were a malware writer, that's how I'd do it!
-
-
Tuesday 12th July 2016 15:41 GMT Dave 126
Re: For what it's worth
For sure, if in doubt, keep it simple. And you can add redundancy to your regime by having a few external drives on rotation.
However, it does require downtime, and user interaction. These aren't deal-breakers for many users, but some people will want an automated backup solution - every hour, perhaps.
Speaking as someone who has been called upon to fix friends' PCs, I sometimes think it would be nice if every consumer PC sold came with an external HDD and an image backup system by default. :)
-
Wednesday 13th July 2016 21:12 GMT Charles 9
Re: For what it's worth
"Speaking as someone who has been called upon to fix friend's PCs, I sometime think it would be nice if every consumer PC sold came with external HDDs and an image back up system by default. :)"
Two problems with that approach.
One, sleeper infections exist that stay quiet for a while so as to get themselves INTO backups, meaning restoring the backup just gets you infected again (since you probably won't know which files contain the payload and a smart one will hide in multiple locations, including WITHIN legitimate programs).
Two, you overestimate the intelligence of the average computer user. Given an external hard drive, they'll probably find some way to break or usurp it. Didn't the late Terry Pratchett write once that if there was an End of the World button, the paint wouldn't even have time to dry?
-
-
-
Wednesday 13th July 2016 12:53 GMT Richard 12
Re: Defeated ALL Ransomware?
Chances are good that the 0.2% are files that haven't changed at all since the last backup, so you're unlikely to lose any data.
0.2% is also much better than the 50-75% or more before a user spots an encrypted file, or the 10% or similar before an alert sysadmin spots an unexpected traffic spike or hears fans running more than usual.
-
-
Thursday 14th July 2016 08:48 GMT joliver
Hi
The conference in Japan was in June 2016, but security researchers and security companies have been doing this for a while:
https://zeltser.com/detect-impede-ransomware/
http://esupport.trendmicro.com.au/solution/en-US/1111377.aspx
https://www.extrahop.com/community/blog/2016/ransomware-detection-ransomware-prevention-methods/
They report it is very effective - but part of the problem with ransomware is that once your solution is out there, it will be attacked. The ransomware guys will inject into whitelisted applications that are allowed to encrypt. They will obfuscate and work their way round your solution. So this is a good thing to do - but not a panacea.
-
Thursday 14th July 2016 10:55 GMT Christian Berger
Like with all those classification problems there is a blurry line
I mean sure, current ransomware is easy to defeat that way. After all it tries to encrypt all files as fast as it can.
Now imagine it encrypts one file an hour, or even less. Of course with some randomness, and with transparent decryption for userspace applications. Even if your software would detect that, it couldn't distinguish it from normal behaviour.
The obvious solution is to lower your attack surface. Make it hard for the user to install software from random sources, make sure you always use a minimal amount of code so you minimize the chance of getting compromised via a bug... and so on. You know, normal best practices security.
-
Thursday 14th July 2016 17:30 GMT Charles 9
Re: Like with all those classification problems there is a blurry line
"The obvious solution is to lower your attack surface. Make it hard for the user to install software from random sources, make sure you always use a minimal amount of code so you minimize the chance of getting compromised via a bug... and so on. You know, normal best practices security."
But that doesn't work well against the average user (who BTW can't be educated). How do you deal with people unwilling and unable to protect themselves (and by extension, everyone around them)?
-