We're hiring. He may need to relocate to Bangalore
Oracle's chief technology officer James Gosling, inherited from the takeover of Sun, is leaving the company. In a cryptic blog post Gosling said: "The rumours are true: I resigned from Oracle a week ago (April 2nd)." The blog is mostly down at the time of writing. He said: "As to why I left, it's difficult to answer: Just …
".....Dead, hopefully...." Well, I hope not! There are far too many cross-platform apps written in the stuff which we're forced to use. I'd like it if they made it a darn sight faster.... and a lot less bloated... and not needing updating every other week. But not dead. I was kinda hoping Oracle would give the whole Java mess some direction, so I suppose that will mean upsetting some of the old hands.
.... and whatever everyone else wants when it's great. And that generates millions and billions and trillions.
"Gosling did not reveal what his next job or project would be, but said he would take some time off before he began job hunting."
Can he not start up something fabulous of his own with other like-minded people and do something amazing, or is he just a desk-jockey type who does as he's told rather than as he wants?
Maybe he panicked after Oracle management asked him to be productive.
At Sun all he ever did was sit in his ivory tower and try to replicate every popular application written in C into another one of his badly written Java knock-offs.
Every so often management would box him up and send him to a conference as a geek curiosity.
"Well, I hope not! There's far too many cross-platform apps written in the stuff which we're forced to use."
Good platform-neutral programs are written with platform-independent GUI libraries like Qt, GTK, or many others.
There is also Mozilla's XULRunner framework, which is the foundation of Firefox.
Java is a huge failure and not an innovation. Everything large&serious must still be done in C++, Pascal, Fortran or some other ALGOL-like, compiled language. Forget GC & VM - a waste of memory and time.
@jlocke: "Java is a huge failure and not an innovation. Everything large&serious must still be done in C++, Pascal, Fortran or some other ALGOL-like, compiled language. Forget GC & VM - a waste of memory and time."
I thought this might be a serious post, until you got to Pascal. No one writes anything in Pascal anymore. And Fortran isn't far behind, but some Fortran programs are still used in niches.
"Java is a huge failure and not an innovation. Everything large&serious must still be done in C++, Pascal, Fortran or some other ALGOL-like, compiled language"
You obviously have never done any large and serious coding to make such a sweeping statement.
I wonder how this will affect the open-source Java community. I get the impression that the reins of future Java development will fall, by default, into Oracle's hands after the buyout, now that Java's chiefs are hopping off the Oracle/Sun ship.
I just hope they keep it professional enough for it to stay afloat.
I am guessing who the mother was.
This one ?
Nah, too pretty.
or probably this one:
Or was it actually Paris Hilton's invention ? That would explain a lot.
What qualifies as "serious"? A compiler/translator of 12K lines, maybe?
If yes, look at
It's very, very fast and done in C++. I have serious doubts you can do the same in Java or .Net.
All the Java stuff I must use is sllllooowwww and memory-devouring. VS2010 apparently consumes RAM like crazy, as it is done in .Net.
I bet all the stuff you use to surf the internet, watch videos, and write office documents is done in, ta-da, C or C++.
When you have finished that Photoshop clone in Java, please call back. And waiting a minute for a rescale that takes two seconds for the C version does not qualify.
Regarding Pascal and Fortran: they ARE still in many ways better than C++, and certainly than Java; namely, they are small, fast, and easy to optimize automatically. Delphi (the best Pascal compiler) compiles huge programs in seconds, and the resulting executables are also very fast.
There was a time the majority of the world's population adopted Communism until they realized it was not a good idea. Same with Java.
Java has a fundamental limit to its speed, even when compiled: it doesn't have stack/automatic/scope allocation; all objects live on the heap. That also means it can't support RAII (Google it).
ActionScript in Flash is based on the same core (ECMAScript) as Java*Script* - not related to Java in any way but name.
of speed, but then speed is not always needed. Very often programs spend most of their time waiting for user responses. Java is OK then. Just do not start teaching in Java, as the kids never learn how to TIDY UP THEIR MESS!!! Once they can program properly in C (or even Pascal), and know how linked lists work from the inside out, they can be allowed to use the prefab code available in Java.
When processing gigapixel and terapixel images, I really need to go to C(++) or similar. Java actually is not the worst: scripting languages are the real killers. I have seen some dismal attempts at processing SERIOUS amounts of text in Python. What Python does in days, C does in minutes. Result: a 16-core machine is constantly chugging away with 16 Python jobs which could have been finished WEEKS ago, meaning I cannot test my efficient parallel C code on this machine (BLEH).
Python and the like are great for prototyping and stringing together bits of compiled C code in a flexible way; just do not process ARRAYs of real data with them.
That is just one important aspect. Another is that heap-allocated objects sit dormant, eating RAM, until the next GC run. My guess is that most Java programs have 70% of their RAM consumed by discarded String objects.
A C++ program uses smart pointers for that, and as soon as the string is no longer needed, the memory is reclaimed.
Without anyone to "execute" what will he be? If he were really any good, he wouldn't need Oracle to prop up his position.
If he doesn't get his own company going to produce something worthwhile, he will be just another has-been in the history of dead (programming) languages.
Biting the hand that feeds IT © 1998–2020