Shadow over Fedora 34 as maintainer of Java packages quits with some choice words for Red Hat and Eclipse

Fedora 34, a feature-packed new release of Red Hat's leading edge Linux distribution, was released today, though the main Java package maintainer has quit, urging "affected maintainers to drop dependencies on Java." Fedora 34 is used by Red Hat to try out new features that are likely to end up in first CentOS Stream and then …

  1. Disgusted Of Tunbridge Wells Silver badge

    What's the point of a distro if it isn't to abstract things like that?

    It's too hard for us, but don't worry, here's some documentation.

    1. Anonymous Coward
      Anonymous Coward

      Not just Java

      It's all the modern languages/applications that do their own package management/dependency resolution/automatic updates outside the distribution scaffolding.

      1. b0llchit Silver badge

        Re: Not just Java

        It's a nightmare what everybody is using as a "package manager", when none of them deserve the title. You have the Python universe, the PHP universe, the JavaScript/Node universe. Then you have Ruby's way, and another, and another. Let's not talk about the "where must it be installed" dependencies, the "I cannot move it to another directory" cases, or only one single version of package X being allowed on one machine or else doom happens.

        A nightmare.

        One thing most linux distros did well previously was the central package management. That is a very important feature which has been consistently undermined by the WCDIB(*) crew because they often cannot see beyond their own little universe. Sigh.

        (*) WCDIB: We Can Do It Better

        1. Anonymous Coward
          Anonymous Coward

          Re: Not just Java

          I agree with this. It could be to do with the fact that these languages all need many bindings against C (this isn't just a language, it is the *entire* computing platform).

          So because of all the bindings dependencies, the individual language developers come up with their own solution to generate them (and automate the system C compiler to hide this messy business from the user). And then because this is in place, they start to leverage it more and more, until it is just a dependency collection cluster fsck.

          1. teknopaul Silver badge

            Re: Not just Java

            I don't think dropping rpm for Java libraries will affect anyone much.

            I have never seen a Java app that uses that.

            In theory you can, but because it's nobody's primary use case I'm not surprised package maintainers care little about rpm builds in Fedora compared to the versions in Maven Central.

            Java has always been, and always will be, resource intensive: a bit of HD space in ./libs is the least of your worries.

        2. Short Fat Bald Hairy Man

          Re: Not just Java

          With python I have not had a problem, ever since I started spinning up throwaway virtual environments. Expecting the OS packagers to maintain all sorts of esoteric packages seems to be harsh.

          Yes, I am told it has problems, but my fairly naive use seems to pose no problems.
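          For anyone who hasn't tried it, the throwaway-venv workflow above can be sketched with nothing but Python's standard library; "scratch-env" is just an illustrative directory name, not anything standard:

```python
# A minimal sketch of the throwaway-venv workflow, using only the standard
# library. "scratch-env" is a hypothetical directory name; pass
# with_pip=True if you also want pip bootstrapped into the environment.
import venv

venv.create("scratch-env", with_pip=False)
# The environment is fully disposable: deleting the directory throws the
# whole thing (and everything installed into it) away.
```

          Packages installed with the environment's own pip stay isolated inside that directory, which is exactly why each project can have its own esoteric set.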

          Could never wrap my head around Java, though.

          1. rcxb Silver badge

            Re: Not just Java

            After a few 1TB venvs for a bunch of simple apps, you might start to question the logic of this deployment method. Not to mention the logistics of updating package X in every venv when you find out there's a vulnerability that needs patching.
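            Those update logistics can be sketched in a few lines of Python; everything here (the package name, the version, the directory layout) is invented for illustration:

```python
# Hypothetical sketch of the maintenance problem: find every venv under a
# directory tree that still contains a vulnerable release of some package,
# so each one can be patched individually. The distribution name and
# version below are made up.
from pathlib import Path

VULNERABLE_DIST = "requests-2.25.0.dist-info"  # assumed vulnerable release

def venvs_needing_patch(root):
    """Return venv directories (marked by pyvenv.cfg) containing the dist."""
    hits = []
    for cfg in root.rglob("pyvenv.cfg"):
        env = cfg.parent
        # pip records each installed distribution in a *.dist-info directory.
        if any(env.rglob(VULNERABLE_DIST)):
            hits.append(env)
    return hits
```

            With system packages you patch once; with per-project venvs you get to repeat this dance for every hit the scan returns.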

            I guess it's better than a docker container for /bin/true, but it's still pretty inefficient.

          2. b0llchit Silver badge

            Re: Not just Java

            For your little local problem and throw-away environment, well, yes, you may get away with it. In a production environment you will pay because you are accumulating tech-debt at a high rate.

            The real problem is when you are doing large-scale projects that are supposed to be flexible, maintainable, buildable and runnable on a diverse set of environments. It is better to look for a supportable solution from vetted system and stable repositories than an "oh, that is a nice library" approach that often ends up duplicating something.

            The next problem is the concept of code/data/config separation. Supporting multiple instances of something requires a lot of thought. That is something the (real) system packagers normally support and think about. Many other package systems thrive on duplication. Yes, they say, storage and memory are cheap... But good design is always cheaper and better in the long run. But then you'd have to plan for more than a hacked-together application.

        3. FlamingDeath Silver badge

          Re: Not just Java

          Humans, they really are self-interested wankers

          1. ICL1900-G3

            Re: Not just Java

            Oh, so very true.

    2. Anonymous Coward
      Anonymous Coward

      To be fair, Java developers are very similar to Python, Perl, Javascript and Rust developers. They drag in far too many dependencies for trivial things.

      They rack up technical debt that simply isn't worth the effort. It seems to be a concerning culture for "non-C" languages.

      And I personally blame these language-based package managers. CPAN, NPM, etc. all add a big mess to projects.

      1. J27

        It's not the language, or even the design of packages. What you're describing is just poor or overworked developers. You can misuse even the best tools.

        1. Anonymous Coward
          Anonymous Coward

          I am still inclined to believe it is a culture of these languages. For example, you see it with open-source projects too.

          Typically these passion projects result in better quality output or at least demonstrate a little more thought than your average internal business glue app.

          1. teknopaul Silver badge

            The fact that so many issues can be shaken out at compile time in Java, and that compiling is fast, is a big factor. You can break APIs freely in Java if you know the consequences will be compile-time errors or caught in JUnit tests.

            As a Java dev I do that all the time with my own code. Eclipse supports this as a development technique: you can sometimes break APIs and fix all the use cases in a single operation. IntelliJ is not as good at it, but you're never more than one compile run away from it highlighting what needs to be done.

            C requires more diligence in maintaining API compatibility.

      2. FlamingDeath Silver badge

        I’m of the belief that no problem should be solved twice

        With that said, importing crap you've got no idea about isn't really solving anything.

        It’s a bit like saying look, I solved this puzzle while blindfolded, ain’t I smart

  2. keithpeter Silver badge

    Is this a case of xkcd 2347?

    I'll chuck R in as well (although they are working on a way of pulling out dependencies required for a given script as a snapshot).

    Icon: It's 2037 and Mildred is doing her PhD, trying to make sense of a mix of Python and R scripts used to process data back in the '20s...

    1. Anonymous Coward
      Anonymous Coward

      Re: Is this a case of xkcd 2347?

      For R you can use Microsoft Open R, which they finally updated to v4 after looking like they abandoned it at v3.5, and make use of MRAN (their CRAN snapshot). This ensures that anyone using version X of Microsoft Open R will get the same version of package Y when they install it.

      Obviously that's all here and now and not much use in 20 years time after MS have abandoned it. Not sure Docker would be any better in that regard.

      1. Korev Silver badge

        Re: Is this a case of xkcd 2347?

        You could also use RStudio Package Manager, which supports snapshots.

  3. Doctor Syntax Silver badge

    "the first time since GNOME 3.0 came out that there's a real rethinking of the basic desktop experience."

    As an onlooker from a safe distance I wonder to what effect? That last real rethink prompted two new desktop projects to reinstate the original desktop experience. Would it actually be something that would lure me away from KDE?

    At least in the Linux/BSD world we don't have to just get on with what a vendor chooses to inflict on us.

    1. Anonymous Coward
      Anonymous Coward

      Did you ever use KDE 3.5?

      If not, it is likely you would much prefer that to the current state of KDE.

      Every open-source desktop has regressed. It is actually becoming a real problem.

      The fact that a single person can't maintain Xfce, compared to, say, CDE, also suggests that modern approaches to packages and dependency management are broken.

      And now no-one dares risk starting a new replacement, because Wayland is still too young and Xorg is perceived to be too old. So we are at an absolute stalemate until Wayland finally disappears.

      1. Citizen99

        KDE 3.5 is still available and maintained as TDE, the Trinity Desktop Environment.

        1. Doctor Syntax Silver badge

          I used that for a while. Eventually I was concerned that as other stuff moved on there might be a problem with backward compatibility. Specifically stuff I expected to work in Lazarus didn't seem to work on TDE so eventually I moved on.

      2. Doctor Syntax Silver badge

        "If not, it is likely you would much prefer that to the current state of KDE."

        First question, yes.

        The second is more complex. The one thing I missed in 4 and still miss in 5 is the ability to confine unhiding an auto-hidden panel to a corner rather than the whole edge. And 5 certainly wasn't ready for the big time when incorporated into Debian, and via that into Devuan (and it wasn't an LTS version either). The current version (as in Mint), however, seems fine.

        An exception is that Gwenview seems to have acquired some misfeature that I take to be an attempt to respond to gestures: when scrolling through an image it will suddenly decide that what I really intended was to switch to the next image, even though I hadn't used the specific button provided for that. However, that's not been enough to prompt me to look very hard to see if it can be turned off. On the whole it's still a better option for me than, say, Cinnamon, which would be my second choice. And "choice" really is the relevant word here.

    2. DS999 Silver badge

      Rethinking of the desktop experience

      Is it going back more towards what it was in GNOME 2.x or even more towards assuming touch than 3.x? The only thing GNOME 3.x did for me was force me to search out Cinnamon so I could keep a mouse oriented desktop.

      1. RegGuy1 Silver badge

        Re: Rethinking of the desktop experience

        * No top bar.

        * No hot corners.

        * No auto-maximise.

        * No stupid thing on the side.

        * No buttons in the window menu bar.

        * No removing perfectly good menus with stupid random buttons everywhere (see: gedit).


        * Make the slide-bar wider.

        * Put the small arrows back on the slide-bar (have you tried viewing a 30,000 line terminal output with these stupid slide-bars?).


        Do I need to go on? Why, when something has worked perfectly well for 20 years does some smart arse choose to fuck it up?

    3. Anonymous Coward
      Anonymous Coward

      The screenshot in the article shows that gnome windows still look no better than old "graphical" MS-DOS programs.

  4. boblongii

    Gaze on your future, Python devs

    A core language which changes too frequently, for reasons other than bug fixes, is on a one-way road to entropic death. It means everyone is building their "ecosystem" on shifting sands and patching like mad to keep up.

    1. Anonymous Coward
      Anonymous Coward

      Re: Gaze on your future, Python devs

      Too true. I use a little Python in my current job, and decided a while ago I just didn't have the will or capacity to keep up with all the (sometimes breaking) changes between versions (and not just 2 to 3).

  5. Claptrap314 Silver badge

    Limitations of distributions

    This isn't just a Linux thing--Apple has the same issue. And if you look closely, so does u$.

    In my mind, the real issue is that every distribution has at least two user groups: the end users, and the developers. To the extent that these groups are identifiable, there will be differences, and these differences mean conflict.

    Even better, to both of these groups, the distribution is substrate. This came as a shock to me when I realized this at AMD. Here we were, developing astounding technology, and all the user cared about was their apps. Everything else--the OS & all of the hardware--is just overhead for the app. I still remember how angered I was when I switched over to Ubuntu and found out I had to install dev-essentials.

    So end users care about their apps running. To ensure that, the OS that the apps run on needs stability. Meanwhile, the developers of the new apps want the new shiny. The only solution is that the devs end up with a system with up-to-date versions of their tools, and distribution-set versions of everything else.

    The maze of rats' nests mentioned above is the inevitable result of this situation. Feel free to suggest a better way, but I'm not hopeful.

  6. Anonymous Coward
    Anonymous Coward

    When you leave because you're ***** off...

    ...never send that email. You'll only regret it later.

    1. Anonymous Coward
      Anonymous Coward

      Re: When you leave because you're ***** off...

      Fucking send it. Life's too short to forego satisfaction.

  7. Chewi

    He's right

    Former Gentoo Java lead here. I totally feel his pain as I burned out some years ago. It's even worse when you're trying to allow users to build the stuff from source. I'm familiar with many languages, including those that are known to be particularly troublesome for distros (like Rust and Ruby) but trust me, Java is the worst of all worlds.

    One reason for this is its approach to optional dependencies. Take log4j 2, for instance. When I last looked, it had about two mandatory dependencies but tens of optional ones, most of which hardly anyone would care about. That's fine if you're grabbing the precompiled jars with Maven or Gradle or whatever. Grab just the ones you need. Or hey, just grab them all, it's only a few more KB to download. If you need to build from source though, as Fedora policy dictates, you're screwed. Although it's possible, no one uses a preprocessor with Java so all those dependencies that are optional at runtime suddenly become mandatory at build time. And guess what, those dependencies have more dependencies and so on and so on, and before you know it, you've had to package and build half the Internet. Maintaining a single distro package, particularly in Gentoo, carries significant overhead that just doesn't scale in the context of the Java ecosystem.
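    The way that closure balloons can be shown with a toy sketch in Python (every package name below is invented; real graphs like log4j's are much bigger):

```python
# Toy illustration of the point above: a dependency that is optional at
# runtime still has to be present at build time, so the build-from-source
# closure explodes. Each entry maps a package to (mandatory, optional) deps.
DEPS = {
    "logging-lib": ({"logging-api"}, {"kafka-appender", "smtp-appender"}),
    "logging-api": (set(), set()),
    "kafka-appender": ({"zookeeper"}, set()),
    "smtp-appender": (set(), {"tls-helper"}),
    "zookeeper": (set(), set()),
    "tls-helper": (set(), set()),
}

def closure(pkg, include_optional):
    """Transitive dependencies of pkg; optional ones only if requested."""
    seen, stack = set(), [pkg]
    while stack:
        p = stack.pop()
        if p in seen:
            continue
        seen.add(p)
        mandatory, optional = DEPS[p]
        stack.extend(mandatory)
        if include_optional:
            stack.extend(optional)
    return seen - {pkg}

# At runtime you need one package; building everything from source, as
# distro policy demands, drags in the whole graph, transitively.
runtime = closure("logging-lib", include_optional=False)
build = closure("logging-lib", include_optional=True)
```

    In this toy graph the runtime closure is a single package while the build closure is the entire graph, and every one of those extra nodes is another distro package someone has to maintain.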

    Gentoo is not as strict about building from source as Fedora is so I considered just using precompiled jars where possible. You then have to ask what the point of packaging Java stuff is at all though. There are some small benefits but I didn't feel it was worth my time so I moved onto other things. I'm now the Gentoo Games lead. That's much more fun!

    1. This post has been deleted by its author

    2. Anonymous Coward
      Anonymous Coward

      Re: He's right

      Absolutely feel your pain here.

      I've never - ever - used the libraries packaged by the OS in a project, for the same reason. Most API developers are extremely careless when it comes to backwards compat, which makes versioning a nightmare. I say this with some authority as we've been shipping our own API for 20 years, so I'm familiar with my own errors from the past, and how I need to think when designing new features to ensure they can evolve.

      If you're building on others' APIs, outsourcing the version management of those APIs to anyone else is just asking for trouble. And yes, I suspect it's particularly troublesome for Java, which sits in that awkward spot between "application failed to load because it couldn't link" and "language is dynamically typed, so I can duck-type missing features".

  8. Shadow Systems


    When did Zombie Sexually Transmitted Diseases enter into any of this?


    I'll get my coat, it's the one with the copy of the "Necronomicon For BOFH's" in the pocket. =-)p

  9. Henry Wertz 1 Gold badge

    btrfs?

    Hmm... I tried btrfs years back; it seemed unstable and I suffered some data loss from it. I tried it again about a year ago, as people report it's nice and all that. I ran a dedpluicater on some of my stored stuff, and compression on some stuff. It would run fine as long as you had 100% uptime. Oh, you had a power cut or something? btrfs still has no strategy to recover from that kind of thing; it'll detect issues and go read-only. If you're lucky enough not to lose access to any files at that point (I had files or directories just go at that point), it can tell you that a generation of stuff is corrupted... OK, it tells you the most recent generation. fsck doesn't help. Rolling back a generation fixes problems there, then it goes read-only again because the generation count doesn't line up somewhere else in the directory tree. Seriously, it was crap unless you have flawless hardware. rsync and VirtualBox both seemed to have remarkably poor performance on btrfs.

    Plain ext4, never a problem -- worst case if you have a poweroff is an empty file if you were in the middle of copying over a file. But, no deduplciation, no compression.

    s3qlfs lets you have a filesystem mount with deduplication and compression, with the actual data stored on your ext4 filesystem. I had a dodgy USB drive for a while, so I can tell you it's pretty fault tolerant. It has a proper fsck command that usually worked; once or twice it complained about the database being corrupt (which it backs up regularly, so you don't have a total loss if it's irrecoverable; you use one of the dozen or so backup copies). I was able to run a sqlite3 .recover on it, and with an fsck it had everything but whatever I had copied in within the last minute or so (which it stuck in lost+found). The performance is quite good: I back up a bunch of junk into s3ql and can also run VirtualBox out of it (I doubt the .ova files shrink much, since they're probably already compressed, but the live .vdis sure do).

    1. Alumoi Silver badge

      Re: btrfs?

      Who would have guessed deduplicat* is such a hard word to spell? Although 'dedpluicater' sounds like fun :D

      Grammer nazi and proud of it!

      1. Huw D

        Re: btrfs?

        Grammar. Or a case of Muphry's Law?

  10. Anonymous Coward

    Hell has frozen over!

    > For audio, PipeWire will be used instead of PulseAudio [ ... ]

    This is monumental news.

    Lennartware has been removed from Fedora? I never thought this could possibly happen!

    1. Doctor Syntax Silver badge

      Re: Hell has frozen over!

      It's a start.

    2. Steve Graham

      Re: Hell has frozen over!

      And there's a systemd module to mitigate out-of-memory conditions? What does it do - kill systemd?

    3. Anonymous Coward
      Anonymous Coward

      Re: Hell has frozen over!

      Just more proof that ALSA has failed.

      Why don't Linux people swallow their pride and finally implement OSS properly? It's the nearest thing to a standard Unix audio system out there, and implemented properly, as in the BSDs, it doesn't require all sorts of sound-server hacks.

  11. Claverhouse Silver badge

    Not As Other Gnomes Are...

    ...since GNOME 3.0 came out


    I entirely misread that.

  12. arachnoid2

    Bowie for the hell of it...........

    Ha, ha, ha, hee, hee, hee

    I'm a laughing gnome and you can't catch me

    Ha, ha, ha, hee, hee, hee

    I'm a laughing gnome and you can't catch me

    Said the laughing gnome
