Best practice?
I was reading that it's better to route blocked domains to 0.0.0.0 than to 127.0.0.1; something to do with 0.0.0.0 failing immediately rather than waiting for a timeout. It probably makes little difference in practice, but I'm curious to know the "right" way :]
Attempting to prevent malware from infecting computers is an important duty of a systems administrator. If you are trying to secure systems, then anti-malware applications and restricting the use of vulnerable third-party applications and browser extensions are all important. But attempting to prevent – or at least contain – …
I get the impression that this may well make a significant difference if you're running your own locally hosted server or, based on anecdotes I've seen on various forums, if you've got an unusual OS. But at least according to MVPS, any difference is a myth:
...and despite considerable Googling on my part, I haven't found anyone who has conducted a proper test, only anecdotes. On the other hand, if you have an absolutely gigantic hosts file, using 0.0.0.0 does technically reduce its size a bit.
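The size point, at least, is easy to see mechanically. Here is a minimal sketch of building blackhole entries from a plain one-domain-per-line blocklist; `blocked-domains.txt` and `hosts.blackhole` are hypothetical file names, and the sample input is made up for illustration:

```shell
# Sketch, assuming a plain-text blocklist with one domain per line.
# blocked-domains.txt and hosts.blackhole are hypothetical file names.
printf 'ads.example\ntracker.example\n' > blocked-domains.txt   # sample input

# 0.0.0.0 is not a routable destination, so connection attempts tend to
# fail immediately on most stacks; it is also six bytes shorter per line
# than 127.0.0.1, which adds up in a gigantic hosts file.
while read -r domain; do
  printf '0.0.0.0 %s\n' "$domain"
done < blocked-domains.txt > hosts.blackhole
```

The output would then be appended to the real hosts file by an administrator, not by ordinary users.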
AdBlock Plus addon for Firefox:
Blocking malicious sites with Adblock Plus
"... another layer of protection..."
IPCop + URL Filter + Adv Proxy
This setup handles it.
IPCop | Services | URL Filter | Custom blacklist | (Remove all the crap except first column) paste in list | Save and Restart
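The "remove all the crap except the first column" step can be scripted rather than done by hand. A sketch, assuming the source list is in hosts format ("0.0.0.0 example.com" or "127.0.0.1 example.com" per line); `hosts.txt` and `domains.txt` are hypothetical file names and the sample input is invented:

```shell
# Sketch of the cleanup before pasting into the IPCop custom blacklist,
# assuming a hosts-format source list. hosts.txt and domains.txt are
# hypothetical file names.
printf '# comment line\n0.0.0.0 ads.example\n127.0.0.1 bad.example\n' > hosts.txt  # sample input

# Keep only the domain field, drop comments, de-duplicate.
awk '$1 == "0.0.0.0" || $1 == "127.0.0.1" { print $2 }' hosts.txt | sort -u > domains.txt
```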
URL Filter other settings:
Category checkboxes:
ads, adult, adv, aggressive, agressif, alcohol, audio-video, automobile/bikes, automobile/boats, automobile/cars, automobile/planes, chat, cleaning, costtraps, dangerous_material, dating, downloads, drogue, drugs, dynamic, education/schools, finance/banking, finance/insurance, finance/moneylending, finance/other, finance/realestate, fortunetelling, forum, forums, gamble, gambling, games, government, hacking, hobby/cooking, hobby/games, hobby/games-misc, hobby/games-online, hobby/gardening, hobby/pets, homestyle, hospitals, imagehosting, isp, jobsearch, leo, library, liste_bu, mail, military, mixed_adult, mobile-phone, models, movies, music, news, phishing, podcasts, politics, porn, proxy, publicite, radio, radiotv, reaffected, recreation/humor, recreation/martialarts, recreation/restaurants, recreation/sports, recreation/travel, recreation/wellness, redirector, religion, remotecontrol, ringtones, science/astronomy, science/chemistry, searchengines, sex/lingerie, sexual_education, shopping, socialnet, spyware, strict_redirector, strong_redirector, suspect, tracker, tricheur, updatesites, violence, warez, weapons, webmail, webphone, webradio, webtv
Blocked domains (one per line) *
Blocked URLs (one per line) *
Allowed domains (one per line) *
Allowed URLs (one per line) *
Custom expression list
Blocked expressions (as regular expressions) *
File extension blocking
Block executable files:
Block audio/video files:
Block compressed archive files:
Local file redirection
Enable local file redirection:
Network based access control
Unfiltered IP addresses (one per line) *
Banned IP addresses (one per line) *
Time based access control
Block page settings
Show category on block page:
Show URL on block page:
Show IP on block page:
Use "DNS Error" to block URLs:
Redirect to this URL: *
Message line 1: *
Message line 2: *
Message line 3: *
Enable background image:
To use a custom background image for the block page upload the .jpg file below:
Enable expression lists:
Enable SafeSearch:
Block "ads" with empty window:
Block sites accessed by its IP address:
Block all URLs not explicitly allowed:
Enable log:
Log username:
Split log by categories:
Number of filter processes:
Allow custom whitelist for banned clients:
URL filter maintenance:
The new blacklist will be automatically compiled to prebuilt databases. Depending on the size of the blacklist, this may take several minutes. Please wait for this task to be finished before restarting the URL filter.
To install an updated blacklist upload the .tar.gz file below:
Create and edit your own blacklist files
Backup URL filter settings
Include complete blacklist:
Restore URL filter settings
To restore a previously saved configuration upload the .tar.gz backup file below:
Stupid computers running Windows 2000 that can't be upgraded, on which nearly everything must run as Administrator. I wrote an article about it a while back, and there was much discussion and debate in the comments. I've since taken further precautions, but let's be honest here: how many folks (especially at home) do you know who not only run as administrator, but click "yes" every time the "would you like to run this app" box comes up?
I agree that in an even halfway-well-run and up-to-date corporate network it’s not a practical threat…but not everyone gets to work in those environments. So many networks I know are band-aids on top of band-aids on top of other band-aids held together with tape.
Still, as people move away from the 2000/XP era into a world where running as a limited user becomes more common and practical, DNS blackholing becomes a more valid defence.
Pint because it's Friday.
Baulked at the idea of getting a list of bad domains onto my ISA server with the budget available (i.e. none). However, I found a program named on the site that does it, and our network is now a bit more secure.
I like simple clear advice like these articles - IT management is not my main job.
I've little knowledge of windows admin but the hosts file should be by default read-only, absolutely non writeable, for users for exactly this reason. Just checked and it is so on my Win2K8 (real machine), win2k (a VM, not that that makes any difference) and Mint linux (also a VM).
Unless you + users are running as admin/root, altering hosts shouldn't be possible. So what's happening??
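The check the commenter describes can be scripted. A quick POSIX-shell sketch; the `HOSTS` variable and its default are my own convention, and on Windows the file lives at %SystemRoot%\System32\drivers\etc\hosts rather than /etc/hosts:

```shell
# Quick sanity check: the hosts file should not be writable by an
# ordinary user. HOSTS defaults to /etc/hosts; override it to test
# another path.
HOSTS="${HOSTS:-/etc/hosts}"
if [ -w "$HOSTS" ]; then
  echo "WARNING: $HOSTS is writable by $(id -un)"
else
  echo "OK: $HOSTS is read-only for $(id -un)"
fi
```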
Wow, we've been doing this for years... Our firewall gets a package sent about once a week updating both known safe and known unsafe domains, and we outright block the unsafe and limit access to unknown (not safe). We also add to the white and black lists regularly, and choose filters based on OU.
I read about similar practices years ago and even tried it for a while. It was high maintenance, and unconvincing.
Personally I'm not fond of a large hosts file; I'd prefer it to be empty, for performance, maintenance and security reasons. Sadly, there are two mandatory applications in our org that require entries in the hosts file on all clients. The programmers are assholes about it to boot, so no change is forthcoming yet.
I prefer to block before it enters the network with Untangle.com and OpenDNS.com combined.
The problem with blackholing DNS is that many cyber-crooks know about it and they therefore change the domain/subdomain they use frequently. Thus if you just block certain domains - even if you update the domains from malwaredomains.com frequently - you will fail to block the malware for long. A far better approach is to block the IP addresses of the malware providing hosts because typically the crooks use the same host with the same ip address, they just change/add new dns links to it.
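A sketch of that IP-based approach, generating firewall rules from a blocklist rather than applying them directly; `bad-ips.txt` and `block-rules.sh` are hypothetical names, the sample addresses are from documentation ranges, and the generated script would need review and root on the gateway:

```shell
# Sketch: turn a one-IP-per-line blocklist into iptables DROP rules.
# bad-ips.txt and block-rules.sh are hypothetical file names.
printf '198.51.100.7\n203.0.113.9\n' > bad-ips.txt   # sample input

# Emit one DROP rule per bad IP for later review and execution.
while read -r ip; do
  printf 'iptables -A FORWARD -d %s -j DROP\n' "$ip"
done < bad-ips.txt > block-rules.sh
```

Regenerating the rules from a frequently updated feed (flushing the old rules first) is what keeps pace with crooks who rotate hosts.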
As we mentioned on our blog a few months back (er, yes, this is a commercial plug) - http://threatstop.wordpress.com/2010/05/10/iframe-droppers-and-other-drive-bys-how-threatstop-protects-you/ - we provide our subscribers with frequently updated lists of known-bad IP addresses that can be quickly and automatically plugged into the firewall, blocking many malware sources. I'd love to say we block them all, but then you'd know I was a lying marketing droid. I believe we stop most of them, though since the crooks unaccountably refuse to give us a list of compromised hosts to check against, I can't prove it.
MichaelC above would certainly benefit from our system, since stats we have analysed from DShield indicate that about a third of all threat sources change within a week (and about a quarter within 24 hours). By uploading new data only once a week, he will be missing a significant portion of the threats he thinks he is protecting against.
Don't get me started. That was /not/ my idea, and it has taken me four solid years of fighting tooth and nail to be allowed the opportunity to replace it. There are things which make me rage. There are things which make me cry. Then there are things which make me experience desires to commit war crimes. Actually, only one thing has ever fallen into that last category, and that is ISA.
While I'm sure that malwaredomains do an admirable job, it's pretty certain that there's no way they can capture all of the fast-flux domains used by modern botnets. When you have 20 million domains like dlxfrglh.com and orutyerou.com and so on, the blacklist becomes huge, unwieldy and seriously impacts network performance.
I know, because I tried this a couple of years ago, and Internet access slowed to a crawl. In the end, I simply ended up bit-bucketing anything to do with China, Russia, and most of Eastern Europe - because on the odd occasion when we did get infected, it nearly always came from, and reported to, one of those places. While I acknowledge that this is not a workable solution for many enterprise-level networks, for SMEs whose business is largely local (and whose networks aren't exactly high-powered) it takes a huge amount off the blacklist, leaving only the US and Netherlands as the main offenders, and that is easily dealt with using a much smaller blacklist. It doesn't eliminate every possibility, but good security practices and proper system maintenance should cover the rest of it.
Oh, and @PC Tech: while I'm as big a fan of Firefox, AdBlock and NoScript as anyone, they are not really a good defence in a network context (no client-controllable solution is), simply because users can disable AdBlock and NoScript, or in NoScript's case simply allow scripts from a suspect domain. I actually caught a few users in my workplace running NoScript in "Allow Scripts Globally" mode because they complained it was "too annoying" to keep clicking "Allow" for each new site they visited! So while client-side defences are a reasonable supporting measure, it's a very bad idea to leave security in the hands of your users!
They've done a good job with the Zeus botnet, and there are commercial alternatives coming on-stream to handle it. Again, Malwaredomains.com isn't the One True Solution. It is part of what should be layered defence in depth.
As to no-script, the debate was had in the comments section of my previous article: