Catch Java 22, available from Oracle for a limited time

Oracle released Java 22 (JDK 22) on Tuesday, sporting a dozen new features for Java developers. Based on OpenJDK, the open source reference implementation of the Java Platform, Standard Edition (Java SE), the commercial Oracle JDK is offered under a subscription license, while OpenJDK is offered under a GPL license. JDK 22 is a Feature release …

  1. BinkyTheMagicPaperclip Silver badge

    What's the point?

    You could buy the Oracle JVM based on the open source code, and pay $180 for each and every employee, every year

    orrrrr you could pay *nothing* and use the open source version that everyone else will be using. Certainly the (fairly large) company I work for stripped Java from practically all systems as the licensing cost was significant. Am I missing something amazing ('support'?) about the Oracle release?

    On a vaguely related point I looked at CLion recently, which is also 'subscription based', but if a year's subscription is purchased you retain a 'perpetual fallback licence'. That's the way to do it.

    Haven't used Java for years. When it came out it was useful, and cross-platform. Used it in Jython, a few Java-only programs, and created a moderately complex Java object called by ColdFusion. The Enterprise APIs were interesting, if a bit slow.

    Now there are a lot more cross-platform frameworks, web development options, Python, PowerShell, and others.

    1. BinkyTheMagicPaperclip Silver badge

      Re: What's the point?

      (and before anyone points it out - yes I know CLion needs (Open) java, and things such as LibreOffice also use Java, or at least they used to). However, my first choice these days would either be Python or .NET/Mono for pretty much everything from web development to scripting. If I had to do cross platform desktop client development, I doubt I'd look at Java first, but (Open) JDK would be on the list.

    2. Steve Channell
      Facepalm

      Re: What's the point?

      The paid version includes GraalVM, which "promises" higher performance as long as you don't use any of the stupid APIs that can't work

      Java 22 still includes RMIClassLoader, 22 years after it was near-universally agreed to be a bad idea. RMIClassLoader is responsible for most security violations (including Log4j and Struts)
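      For what it's worth, remote codebase loading can at least be switched off. A minimal sketch, assuming you're stuck with RMI at all; the java.rmi.server.useCodebaseOnly property is real and has defaulted to true since JDK 7u21, so setting it explicitly mostly documents the intent:

      // Tell the RMI runtime to ignore remote codebase annotations and load
      // classes only from the local classpath.
      public final class RmiHardening {
          public static void main(String[] args) {
              System.setProperty("java.rmi.server.useCodebaseOnly", "true");
              // ... start the registry / export remote objects here ...
          }
      }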

  2. katrinab Silver badge
    Alert

    Who is doing AI development in Java?

    I thought Python was by far the most popular language for that, and the second-place language has so small a market share as to be insignificant. I'm guessing it would be something like Go, Rust, C++, maybe even Swift.

    1. Paul Herber Silver badge

      I'm sure Python is the Major Major Major language for AI.

    2. Anonymous Coward

      Well, yes. Java is the worst programming language ever invented, and its use in CS programs (to teach programming skills) has been one of the worst disasters in the history of software engineering education -- students ended up "learning" to import library on top of library to solve even the simplest of coding problems, ones that could have taken just 3 lines of code in a proper language. On top of that, Java is slow: as molasses, treacle, high-fructose corn syrup, international snail mail, etc...

      Python, on the other hand, is worse than Java (even for AI). It uses the same horripilation of pulling in libraries on top of libraries for everything and nothing, and is even slower than Java. It is the QWERTY keyboard of program development, aimed at curtailing student imagination, incarcerating their code into indentation-sensitive silos, and imposing a cult-like Duncan MacLeod Highlander pseudo Zen that mandates "there should be only one [...] way to do it". Its head should roll!

      The better way to do AI is obviously in JavaScript (cousin of Scheme, pal to Haskell), or even Hoot, running on top of WebAssembly. No catch-22 Hotel California lock-in here, just pure unadulterated enjoyment of first-class closures, and continuations (in Hoot). Your brains will thank you for it!

      IMHO

      1. Anonymous Coward

        Java is hot garbage

        But it is just the latest in a long series of languages that had their day in the sun and outlived their glory days.

        Java is a hot mess, but it was replacing Visual Basic in many of the places it was being deployed in businesses. Visual Basic was pushing out old COBOL code. Each was deeply flawed in some ways, but those flaws were things we didn't intend, or didn't know we needed to fix, most of the time.

        COBOL took over the business world because it was vaguely intelligible, and reasonably portable across (radically) different hardware architectures. VB used more modern programming and syntactic structures, and brought deep Windows integration during the age when Windows was the center of the universe. It was also a dumpster fire, so Java seemed an improvement at the time for enterprise software, and under Sun's stewardship it was at least fairly cross-platform.

        It was also deeply wacky, and always came off like something that crawled out from under the sofa of an alternative lifestyle club. Strict and scolding, delighting in tying the hands of its developers. It was also a dumpster fire of software quality and reliability. Its strict garbage collection meant that performance-intensive code was crippled, hugely bloated and too often unstable.

        So now we see that, like so many others, it is in the twilight of its era, as large-scale projects move to platforms that actually deliver memory safety, instead of tying programmers' hands while constantly blowing its security promises through JVM bugs the average Java programmer had no control over, bugs that often gave up root on the underlying machine, not just the data the programmer had access to.

        Python will be next on the list, displacing Java not because it fixed Java's failings, but because it was the center of the universe for big data and early machine-learning research. It will last long after the AI bubble bursts, and by then Rust may take and hold the crown for a while, or maybe something else will. But the research bears out that the success and popularity of computer languages is only loosely coupled to their technical merits. They are driven by broader external trends as much as anything.

        They also tend to last decades past their prime, because refactoring code is expensive and provides no new functionality while introducing bugs. For many companies it's too tempting to pile on technical debt instead of re-architecting until the system is on the verge of collapse.

      2. Blank Reg

        I can see why you went anonymous, that's a load of BS. You complain about Python being slow and then you go for JavaScript? It's also dog slow and a complete abomination of a language.

        1. Androgynous Cupboard Silver badge

          I’m just tickled that “Java was replacing Visual Basic”. In much the same way that the motorcar replaced the pineapple.

          1. Anonymous Coward

            ... which was a right step up from when the pineapple initially replaced the suppository.

            1. An_Old_Dog Silver badge
              Thumb Up

              Better than a Pineapple

              Too many computing fads and silver bullets are pineapples, which once inserted, are quite painful to remove.

      3. ldo

        Re: Python, on the other hand, is worse than Java

        Python has Jupyter.

        You remember the old 8-bit micros with BASIC in ROM? Switch them on, and be instantly at a prompt where you can type lines of code? Python+Jupyter is a worthy successor to that. You can keep a history of all your input with autosave, and you can get rich output (e.g. 2D/3D graphics, audio, interaction) from your code cells.

      4. swm

        I taught C++ at a university and one of my previous students said that that class was the best class to prepare him for the real world.

        1. Bebu
          Windows

          Reading between the lines...

          《C++ .... one of my ... students ... said that was the best class* to prepare him for the real world.》

          The real world being what it is I am not entirely sure it amounts to an accolade. ;)

          The STL pretty much did it for me. The C++ standards remind me more of Gormenghast than Byzantium.

          Decades ago at uni I had a programming/SE course that included Simula 67, a language for which I recently retrieved the language reference and guide - it wasn't a half-bad language if you were familiar with Algol. (I was trying to recall how Simula handled references.)

          *A potential pun there. ;)

        2. druck Silver badge

          I agree. I was on one of the first C++ courses in the UK, at UCL in 1987. It was very primitive C++ back then and had to be compiled into C by Cfront. There weren't many textbooks at the time, so our lecturer was writing one; he handed out a draft to everyone and offered to buy a pint for anyone that spotted a mistake. The next day I came in with a list of 10. Did I get a single pint? Hell no.

          But learning C++ before even C worked out really well. I was able to drop down to C and do embedded stuff for a number of years, before coming back to C++ in Visual Studio for Windows applications. Going to C# was easy (on Linux with Mono rather than Windows); it had the clean syntax that C++ should have had all along, and is still trying to achieve with everything after C++11. I'd even say it helped with Python, which, although different in many ways, I still come at from the OO perspective rather than the purely functional.

          1. ldo

            Re: from the OO perspective rather than purely functional.

            In Python, you can do both at once, since functions are first-class objects.

      5. RAMChYLD Bronze badge
        WTF?

        Having developed in both Java and Python, I was about to agree with you.

        Then you brought up JavaScript and claimed it superior, and lost all credibility.

        I went through college being taught that Java was the future. And then I came out to find that not only had Oracle bought Sun, but the hot programming language on the block was now C# and ASP.NET. I was just lucky it was easy enough to learn that I could pick it up in 6 months and secure a job with it, even though all I'd been taught at college was Java.

    3. LateAgain

      Jython :-)

      Write it in Python. Compile it to Java byte code.

      1. Michael Wojcik Silver badge

        Of course lots of languages have back ends that target JVM. You can compile COBOL — traditional procedural COBOL, ISO OO COBOL, or enhanced-syntax managed OO COBOL — to JVM bytecode (or to .NET IL). So you can probably have all the advantages of Java with all those of your favorite source language.

    4. Kevin McMurtrie Silver badge

      Python got off to a good start because it can do DSL (domain specific language) reasonably well. It can handle the special operations of AI.

      DSL in Java is hideous. C++ is far better looking but it is extremely technical. Golang is always hideous for high level operations.

      So, AI is stuck with a language that is easy but incredibly slow. Everything runs great as long as you're in a native library but grinds to a halt in interpreted control code. There's also the mess of getting all the native libraries using the same number formats so they play well together in one app.

      If Oracle was smart they'd come up with good DSL extensions for the paid Java so they can jump on the AI bandwagon.

      OpenJDK could also cheat and import more Scala features. There be dragons, for sure.

      1. ldo

        Re: because it can do DSL (domain specific language) reasonably well

        One of the things that helps with that is being able to define custom overloads of standard operators and other constructs (e.g. array indexing) on user-defined types, à la C++. Python is only a fraction of the complexity of Java, yet it managed to include something as significant as that in its core feature set, which Java could not.
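        For contrast, a hedged sketch of the Java side, where user-defined operator overloading simply doesn't exist (BigDecimal here is just an arbitrary example type):

        import java.math.BigDecimal;

        public class DslContrast {
            public static void main(String[] args) {
                BigDecimal a = new BigDecimal("1.50");
                BigDecimal b = new BigDecimal("2.25");
                BigDecimal c = new BigDecimal("4.00");
                // Java: every operation is a named method call.
                BigDecimal r = a.add(b).multiply(c);
                // Python or C++ with overloaded operators would read: r = (a + b) * c
                System.out.println(r); // 15.0000
            }
        }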

      2. Michael Wojcik Silver badge

        It can handle the special operations of AI.

        What "special operations"? People have been writing ANNs in various languages since the 1960s. (Perceptron networks date back to the '40s, but I think the first real software implementations didn't come until the '60s.)

        Python is popular for "AI" because 1) it's already popular for scientific number-crunching, thanks to NumPy and Jupyter and the usual faddish luck; and 2) it was the preferred language for TensorFlow 1.0. People already using Python were presented with a Python library and a bunch of Python examples for doing DL-style ANN processing, and, shockingly, that's what they kept using.

        Maybe if Google had made TF 1.0 a Julia project we'd be in a different (better) position. Though the attraction of NumPy seems to be hard to overcome. It's the BLAS of the 21st century. (Yes, yes, I know NumPy and BLAS are not directly comparable.)

    5. Steve Channell
      Windows

      Java is rubbish for AI workloads

      Back when Generic Java (GJ) was mooted as a successor to Java 2, there was much debate about whether to bite the bullet and amend the Java bytecode to support runtime generics; instead they stuck with GJ type erasure: at runtime, ArrayList<T> is the same as the Java 2 non-generic ArrayList. .NET 2 did bite the bullet and include runtime generics, because it was thought that you could never replace C++ with arrays scattered across the heap.

      Net result: betting that "everything will eventually be re-written in Java" was a mistake, because there is no equivalent of native arrays packed in contiguous memory, and no support for primitive data types that were not understood three decades ago.
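      A minimal sketch of both halves of that complaint (nothing here is JDK 22-specific):

      import java.util.ArrayList;
      import java.util.List;

      public class ErasureDemo {
          public static void main(String[] args) {
              List<String> strings = new ArrayList<>();
              List<Integer> ints = new ArrayList<>();
              // Type erasure: the type parameter is gone at runtime.
              System.out.println(strings.getClass() == ints.getClass()); // true
              // And generics work only over boxed objects: List<int> won't compile,
              // so every element of a List<Integer> is a separate heap object.
          }
      }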

  3. trevorde Silver badge

    Stop right now!

    Larry has enough private jets, super yachts and Hawaiian islands

    1. Phil O'Sophical Silver badge

      Re: Stop right now!

      Larry has enough private jets

      He has a MiG-29, but the US government won't let him fly it.

  4. ldo

    Java Is UnbeluckingFievable

    I currently have to use Java for a project that involves connecting to an in-house server via HTTPS (modifying an existing app). If there is one word that sums up the Java language and APIs, it is “bureaucracy”.

    Consider setting up a TLS/SSL context. In Python, creating the context object is one call:

    sctx = ssl.SSLContext(protocol = ssl.PROTOCOL_TLS_CLIENT)

    Loading a private CA cert is one call:

    sctx.load_verify_locations(ca_cert_file)

    Loading a cert and key for client-side authentication is one call:

    sctx.load_cert_chain(client_cert_file, client_key_file)

    But in Java, you first need to create a KeyStore. Load the CA cert via a CertificateFactory, creating a TrustedCertificateEntry. Call the factory’s generateCertificate routine to load the client cert. Then go through a whole lot more contortions to load the client key file.

    And then, you have to set up a TrustManagerFactory to create a TrustManager, and a KeyManagerFactory to create a KeyManager, so that you can load the contents of the KeyStore, via those two separate intermediary objects, into an actual SSLContext for use in HTTPS connections.
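    For the record, here is roughly what that dance looks like end to end. A hedged sketch only, assuming PEM-encoded files and an unencrypted PKCS#8 RSA client key; the file names are made up for illustration:

    import java.io.FileInputStream;
    import java.nio.charset.StandardCharsets;
    import java.nio.file.Files;
    import java.nio.file.Path;
    import java.security.KeyFactory;
    import java.security.KeyStore;
    import java.security.PrivateKey;
    import java.security.cert.Certificate;
    import java.security.cert.CertificateFactory;
    import java.security.spec.PKCS8EncodedKeySpec;
    import java.util.Base64;
    import javax.net.ssl.KeyManagerFactory;
    import javax.net.ssl.SSLContext;
    import javax.net.ssl.TrustManagerFactory;

    public class TlsSetup {
        public static SSLContext build() throws Exception {
            CertificateFactory cf = CertificateFactory.getInstance("X.509");

            // KeyStore number one: the private CA cert, for the trust side.
            KeyStore trustStore = KeyStore.getInstance(KeyStore.getDefaultType());
            trustStore.load(null, null); // empty, in-memory
            try (FileInputStream in = new FileInputStream("ca_cert.pem")) {
                trustStore.setCertificateEntry("ca", cf.generateCertificate(in));
            }

            // KeyStore number two: the client cert and key, for authentication.
            Certificate clientCert;
            try (FileInputStream in = new FileInputStream("client_cert.pem")) {
                clientCert = cf.generateCertificate(in);
            }
            // The key-file contortions: strip the PEM armour, base64-decode, PKCS#8-decode.
            String pem = Files.readString(Path.of("client_key.pem"), StandardCharsets.US_ASCII)
                    .replaceAll("-----(BEGIN|END) PRIVATE KEY-----", "")
                    .replaceAll("\\s", "");
            PrivateKey key = KeyFactory.getInstance("RSA")
                    .generatePrivate(new PKCS8EncodedKeySpec(Base64.getDecoder().decode(pem)));
            KeyStore keyStore = KeyStore.getInstance(KeyStore.getDefaultType());
            keyStore.load(null, null);
            keyStore.setKeyEntry("client", key, new char[0], new Certificate[] { clientCert });

            // The two intermediary factories...
            TrustManagerFactory tmf = TrustManagerFactory.getInstance(TrustManagerFactory.getDefaultAlgorithm());
            tmf.init(trustStore);
            KeyManagerFactory kmf = KeyManagerFactory.getInstance(KeyManagerFactory.getDefaultAlgorithm());
            kmf.init(keyStore, new char[0]);

            // ...and finally the actual SSLContext.
            SSLContext ctx = SSLContext.getInstance("TLS");
            ctx.init(kmf.getKeyManagers(), tmf.getTrustManagers(), null);
            return ctx;
        }
    }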

    1. Blank Reg

      Re: Java Is UnbeluckingFievable

      The “bureaucracy” is what allows you to write millions of lines of code and not have things fall apart down the line when, in some long-forgotten code path, somebody passes the wrong type to a method.

      Python is an awful language, who in their right mind thought spaces should be important? For anything of meaningful size you want a strongly typed language; it will save many hours of time when learning someone else's code or debugging failures.

      1. Anonymous Coward

        Re: Java Is UnbeluckingFievable

        Yep. Java has many issues and networking is certainly one of them - the URL class is a well-known abomination, and a lot of the SSL code is showing its age. You are absolutely correct about the steps required to customise an SSL connection; it’s a pig.

        But I am running Java code I wrote 25 years ago without change, whereas Python couldn’t get from 2 to 3 without breaking compatibility. Boilerplate code is acceptable if I only have to write it once.

        1. ldo

          Re: whereas Python couldn’t get from 2 to 3 without breaking compatibility

          The whole point of Python 3 was learning from one’s mistakes. The Unicode handling in Python 2 was bad, and couldn’t be fixed without breaking backward compatibility. So they thought: while we’re about it, why not fix a few other sore points and leave the legacy baggage behind as well?

          Compare Java, which is still saddled with that UTF-16 albatross in its Unicode handling. And the weird way they added generics—for some odd reason they decided they needed binary compatibility with old code as well, hence the whole “type erasure” and “reified” and “raw” types rigmarole.

          What took the cake for me was doing some date/time conversion work in an Android app some years ago: I found this Java class which seemed like the logical starting point, then noticed that every single method (including its constructors) was deprecated. So I thought, OK, this class is obsolete, I should use something else. But all the other date/time APIs kept referring back to this class! Then it hit me: all the old methods of dealing with that class were deprecated, but the class itself was not!
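          That description matches java.util.Date, so assuming that was the class, a minimal sketch of the oddity:

          import java.util.Date;

          public class DeprecatedButAlive {
              @SuppressWarnings("deprecation")
              public static void main(String[] args) {
                  Date d = new Date(124, 2, 19);   // deprecated constructor (year - 1900, 0-based month)
                  System.out.println(d.getYear()); // deprecated method, prints 124
                  // Yet java.util.Date itself is not deprecated, and newer APIs
                  // still hand it back and forth, e.g. Date.from(java.time.Instant.now()).
              }
          }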

          1. Anonymous Coward

            Re: whereas Python couldn’t get from 2 to 3 without breaking compatibility

            Yup! And then there's that GIL of doom ... soon to be excruciatingly optional, at a slow, slow, slithering pace ... moving ever so slowly, yet breaking things ... (like a snake out-of-water, in a china shop!)

            1. ldo

              Re: then there's that GIL of doom

              The GIL was there to support the use of reference-counting as a first resort for disposing of discarded objects, instead of having to run a garbage collector all the time, like Java does. This makes Python better-behaved than Java for memory usage in many common situations. But the downside is the impact on threading performance.

              Rather than take the easy way out and go full-on GC like Java does, they have come up with a scheme for managing “nondeterministic” reference counts. This should allow for an improvement in threading performance, without giving up the well-behaved memory usage. But the implementation is going to be quite complicated, and will likely take time to get right. As JFK said: “We choose to [do these things] not because they are easy, but because they are hard”.
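              The contrast is easy to show from the Java side. A small sketch (the timing below is deliberately not guaranteed, which is rather the point):

              import java.lang.ref.WeakReference;

              public class GcTiming {
                  public static void main(String[] args) {
                      Object o = new Object();
                      WeakReference<Object> ref = new WeakReference<>(o);
                      o = null; // no strong references remain...
                      // ...but Java reclaims the object only when the GC runs:
                      System.out.println(ref.get()); // usually still non-null here
                      System.gc(); // merely a hint; reclamation is still not guaranteed
                      System.out.println(ref.get()); // often null now, but no promise
                      // CPython, by contrast, frees an object the instant its
                      // reference count reaches zero.
                  }
              }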

              1. Blank Reg

                Re: then there's that GIL of doom

                Calling a full-on gc the easy way out tells me that you've never written one. While it's not that hard to write a bad one, writing a good one that is fast and efficient is pretty damn hard.

                1. ldo

                  Re: writing a good GC that is fast and efficient is pretty damn hard.

                  Is it even possible? “Fast and efficient GC” sounds like an oxymoron.

          2. Androgynous Cupboard Silver badge

            Re: whereas Python couldn’t get from 2 to 3 without breaking compatibility

            OK, OK. URLs, most crypto stuff, XML processing, most aspects of the AWT, Dates, anything coming from Java EE. Those all suck massively (although no deprecated date methods in core Java, so that blame goes elsewhere). Generics, yes they suck too although you get used to it after about a decade. Pretty much all the language additions after Java 8 just adding syntax sugar? Suck suck suck. I would rather learn one syntax than two, no matter how short the second. The streaming API is an obtuse abomination.

            The UTF16 thing is a tradeoff - variable-length characters have their own issues, and Strings do let you iterate by codepoint if you want to. The Collections, Threads, the concurrency package and the InputStream/OutputStream classes are pretty damn good when wielded effectively. The strict typing model they've chosen turns out to be one of the most effective for large software projects, although that's not unique to Java. Finally, all this runs on the JVM. I started with Java in about 1998, and filed a lot of JVM bugs (I once persuaded the IBM JIT compiler to multiply two integers incorrectly), and I used to have to test on Windows and Linux because results differed. I haven't had to do that for a very long time, and the performance improvements over the last 20 years have been staggering. Things like the ZGC don't happen overnight.
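            On the codepoint iteration point, a minimal sketch (U+1F603 below is one code point but two UTF-16 chars):

            public class CodePoints {
                public static void main(String[] args) {
                    String s = "catch-22 \uD83D\uDE03";
                    System.out.println(s.length());                      // 11 UTF-16 units
                    System.out.println(s.codePointCount(0, s.length())); // 10 code points
                    // Iterate by code point rather than by char:
                    s.codePoints().forEach(cp -> System.out.println(Integer.toHexString(cp)));
                }
            }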

            1. ldo

              Re: The UTF16 thing is a tradeoff - variable length characters have their own issues

              UTF-16 is an attempt to make the best of a bad job.

              Remember, the original promise of Unicode 1.x was that it was going to be a fixed-length 16-bit code (“UCS-2”), supposedly sufficient to cover all the world’s extant writing systems.

              Then in 2.0 they thought: why don’t we include historical systems like hieroglyphics, cuneiform and so on? And these cool things called “emojis” as well? So “UCS-2” was replaced with “UCS-4”, while platforms like Java and Windows NT had their “UCS-2” coding rebranded as “UTF-16”, with two ranges of 1024 code points each, called the “high surrogates” and the “low surrogates”, defined as a hacky way of representing the extra characters.

              Meanwhile, platforms (like Linux) that had been lazy about confronting the Unicode issue discovered this thing called “UTF-8”, which turned out to be a better way of coping with things: if you need a variable-length code (and you do), then UTF-8 makes more sense than UTF-16.
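              The tradeoff is easy to measure. A small sketch (the sample strings are arbitrary; note that UTF-16 actually wins for CJK text):

              import java.nio.charset.StandardCharsets;

              public class EncodingSizes {
                  public static void main(String[] args) {
                      for (String s : new String[] { "hello", "héllo", "日本語", "\uD83D\uDE03" }) {
                          System.out.printf("%s -> UTF-8: %d bytes, UTF-16: %d bytes%n", s,
                                  s.getBytes(StandardCharsets.UTF_8).length,
                                  s.getBytes(StandardCharsets.UTF_16LE).length);
                      }
                  }
              }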

              1. Anonymous Coward

                Re: The UTF16 thing is a tradeoff - variable length characters have their own issues

                "Discovery" of UTF-8 goes to Ken Thompson, but I absolutely agree with your point - this stuff was still evolving when Java et al were trying to build on it. There's a reason HTML has a default media type of ISO8859-1.

            2. runt row raggy

              Re: whereas Python couldn’t get from 2 to 3 without breaking compatibility

              with surrogate codepoints, utf-16 is also variable length.

              from experience, it turns out that knowing character boundaries isn't as helpful as you'd think. Ken Thompson reimplemented grep for utf-8 and the matching engine only matches bytes using byte classes (think [a-z]) and |.

      2. Dan 55 Silver badge

        Re: Java Is UnbeluckingFievable

        The “bureaucracy” is what allows you to write millions of lines of code

        Sorry, is there any alternative to writing millions of lines of code in Java? There are strongly typed languages and then there's 75% more boilerplate on top of the real code because it's Java.
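        To be fair, modern Java has been chipping away at this; a hedged sketch of the before-and-after (records arrived in Java 16):

        // The classic boilerplate: constructor, accessors, and you'd still owe
        // equals(), hashCode() and toString() for a correct value class.
        final class PointClassic {
            private final int x, y;
            PointClassic(int x, int y) { this.x = x; this.y = y; }
            int x() { return x; }
            int y() { return y; }
        }

        // A record generates all of the above in one line:
        record Point(int x, int y) {}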

        1. Blank Reg

          Re: Java Is UnbeluckingFievable

          Sure, write millions of lines of code in C#, C++, etc. I don't care; we just need to stop using scripting languages where they don't belong, it's stupid

          1. ldo

            Re: scripting languages

            What is a “scripting language”? Is it one that compiles to bytecode? That is true of Java and C♯, as well as Python and Perl.

      3. druck Silver badge

        Re: Java Is UnbeluckingFievable

        Python is an awful language, who in their right mind thought spaces should be important?

        So you don't think spaces are important? You write your Java/C/C++/C# as one great long line, or lines with no indentation whatsoever, or maybe random indentation just for the hell of it?

        Of course you don't (unless you are a complete and utter knob). If you write properly laid-out code in other languages, following any sensible coding standard, then Python is the same, just without the curly brackets.

        1. Blank Reg

          Re: Java Is UnbeluckingFievable

          No it's not the same at all. Accidentally changing the indentation of any other language only changes how the code looks, not how it works.

          And what is gained by this stupidity of making indentation meaningful? You get to skip typing curly braces. If typing curly braces is what is slowing you down then you are writing some extremely simple code.

          1. druck Silver badge

            Re: Java Is UnbeluckingFievable

            Well if you do delete a space Python will tell you as soon as you run it. C/C++/C# will compile happily, but you'll damn well be putting that space back in as the result of a code review, so what have you gained apart from wasting everyone's time?

  5. Roger Greenwood

    I came here for the catch 22 puns and quotes

    and have been disappointed so far.

    'That's some catch, that Catch-22,' he observed. 'It's the best there is,' Doc Daneeka agreed.

    1. Anonymous Coward

      Re: I came here for the catch 22 puns and quotes

      Not really a pun indeed, but the Java-Python AI duel is highly reflective of the paradoxical double bind facing Captain Yossarian. One can't be discharged from the ignominy of these coding aberrations without demonstrating the disassociativity required to adopt them in the first place. Doc Daneeka would surely agree that the one sane programmer in that squadron is doomed! (eh-eh-eh! 8^O)

    2. Michael Wojcik Silver badge

      Re: I came here for the catch 22 puns and quotes

      The question is, do you want to be Orr, or Snowden?

      (I hope those are right. Haven't read the book in ... let's see ... 40 years?)

  6. Bebu
    Windows

    An extraneous 'K'?

    《"Prior to the macOS 14.4 update, in certain circumstances, the macOS kernel would respond to these protected memory accesses by sending a signal, SIGBUS or SIGSEGV, to the process," explained Aurelio Garcia-Ribeyro, senior director of product management at Oracle.

    "The process could then choose to handle the signal and continue execution. With macOS 14.4, when a thread is operating in the write mode, if a memory access to a protected memory region is attempted, macOS will send the signal SIGKILL instead."》

    In general I would have thought you wouldn't send SIGKILL to a thread or process because of the mess it's likely to leave behind.

    Have to wonder whether it was SIGILL (illegal instruction) that was intended - not that that makes a lot of sense either.

    It's not clear whether the Java VM and/or runtime and class libraries are responsible for accessing the macOS protected memory, or the application developer's code.

    If it's the former, then it's Oracle's problem.

    If it's the latter, then Java etc. could be changed to preventively detect such accesses and translate them into SIGSEGV etc., which the developer's code could handle (or not).

    1. Michael Wojcik Silver badge

      Re: An extraneous 'K'?

      If the kernel is raising SIGKILL, the JVM can't prophylactically detect the problem. SIGKILL can't be caught; that's the whole point of it.

      Since bytecode runs in a managed environment, it's either the JVM or some native code running alongside it — a JVM dependency or something invoked through JNI — that's hitting the invalid address. How would the JVM determine ahead of time that an address is invalid? Keep a list of all mapped pages and check every pointer to see if it lands on one of those pages before dereferencing?

      Frankly, if the MacOS kernel is raising SIGKILL for an invalid memory access, then the kernel is wrong. This looks like Apple once again deciding that their OS should be something almost, but not quite, entirely unlike UNIX.
