Clusters f**ked: Insecure Hadoop file systems wiped by miscreants

Administrators of Hadoop Distributed File System (HDFS) clusters have evidently not heeded warnings that surfaced last month about securing software with insecure default settings. Attacks on Hadoop clusters have wiped the data of at least 165 installations, according to GDI Foundation security researchers Victor Gevers, Niall …

  1. James 132

    Better defaults would be nice

    It's obviously important that administrators familiarise themselves with the ports that services open, but these things are also set up by developers and students who have no idea.

    I think it's infuriating that secure settings aren't enabled by default (just switch them on and tell the user where to change them). So many big products get this wrong, too. I'm looking at you, MongoDB.
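    A quick way to check what you've accidentally left open (a minimal sketch; the port list is illustrative and reflects Hadoop 2.x web UI defaults, so adjust for your own deployment):

```python
import socket

def port_open(host: str, port: int, timeout: float = 2.0) -> bool:
    """Return True if a TCP connection to host:port succeeds."""
    try:
        with socket.create_connection((host, port), timeout=timeout):
            return True
    except OSError:
        return False

# Illustrative Hadoop 2.x default web/service ports; not an exhaustive list.
DEFAULT_PORTS = {
    "NameNode web UI": 50070,
    "DataNode web UI": 50075,
    "YARN ResourceManager UI": 8088,
}

def scan(host: str) -> dict:
    """Report which of the well-known ports answer on the given host."""
    return {name: port_open(host, port) for name, port in DEFAULT_PORTS.items()}
```

    If any of these answer from the open internet without authentication, you have the problem the article describes.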

    1. This post has been deleted by its author

    2. Anonymous Coward

      Re: Better defaults would be nice

      "Better defaults would be nice"

      You are absolutely right. Unfortunately in Hadoop land it's challenging for a number of reasons:

      0) Hadoop is inherently a distributed system made of dozens of bits of software with no real single-node analogue, so you can't just lock everything down to a single interface and be done with it.

      1) Security was bolted on long after the core product, and has taken something of a Unix approach of letting other (older/better-written) systems do the heavy lifting. Authentication requires a Kerberos environment, proper authorization requires a user/group resolution system (e.g. SSSD to AD), encryption is arcane black magic configured through impenetrable XML (backed by an external keystore), and TLS is a clusterfuck of settings for the 20+ pieces of software and thousands of daemons that constitute a Hadoop deployment. You can't just "switch it on". There are efforts underway in all the major distros to kerberise by default (which would eliminate most of these attacks) but it's nontrivial, and ...

      2) There's little commercial drive for the vendors to fix this. As with ES and Mongo, there's little interest in one-man-and-his-dog 1-3 node setups on AWS for as-good-as-personal use, and everyone who is or could be a customer will be doing this properly already.

      On the plus side, because the data in HDFS are immutable, and the encryption system requires additional daemons and utterly incomprehensible configuration, there's no chance of this evolving past vandalism into ransomware.
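      To give a flavour of what "switching it on" involves: authentication alone starts with something like the snippet below in core-site.xml (the property names are the real Hadoop ones; everything else here is illustrative, and this is only the first of many steps, since every daemon also needs Kerberos principals and keytabs).

```xml
<!-- core-site.xml: turn on Kerberos authentication and service-level
     authorization. On its own this is NOT enough: you still need a KDC,
     per-daemon principals and keytabs, and matching settings in
     hdfs-site.xml, yarn-site.xml, etc. -->
<configuration>
  <property>
    <name>hadoop.security.authentication</name>
    <!-- default is "simple", i.e. no authentication at all -->
    <value>kerberos</value>
  </property>
  <property>
    <name>hadoop.security.authorization</name>
    <value>true</value>
  </property>
</configuration>
```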

  2. BlackKnight(markb)

    Doh!

    so the recommendation is to follow basic security principals and actually protect your infrastructure from unauthorised access?

    1. Dabooka

      Re: Doh!

      Astonishing isn't it?

      Other security advice includes locking your house and not leaving your wallet on the bar when going out.

    2. Anonymous Coward

      Re: Doh!

      Yes, but any half-decent system wouldn't let you keep "dangerous" default settings on install.

    3. Doctor_Wibble

      Re: Doh!

      I tried following the security principal but got arrested for stalking.

  3. shrdlu

    Only 165 sites hacked?

    OK, so 165 sites had their data erased. How many had their data altered?
