
I suspect there are quite a few Java devs out there
The question is how many of them are good?
Among developers, Python is the most popular programming language, followed by C, Java, C++, and JavaScript; among employers, Java is the most sought after, followed by C, Python, C++, and JavaScript. Or so says the 2017 IEEE Spectrum ranking, published this week. IEEE Spectrum, a publication of The Institute of …
I have ~20 odd years of Java experience.
I try and implement something in it. It's shit, it's slow, I have to install 20MB of JVM and crap.
I pull it and re-write the stuff in C + python.
This is just for internal tools. A quick read of the Java license scared me off doing any billable product in Java.
I struggle to remember what problem Java is the solution to.
"So after 20 years you still suck, basically?"
He prefers other tools - I happen to share his viewpoint. I write Java where folks require it, but most new stuff is in Python by popular choice.
I undertook a fairly small greenfield project that demanded minimal runtime footprint & cost; JVMs were far more costly at runtime in time & space than the equivalent C++ code - and I did actually code up the core loops & profile them in Python/C++ & Java and measure the cost as honestly as possible. Also, the C++ solution was easier to validate from the security point of view - because we didn't have to delve into a bunch of opaque third party binaries - like the JVM for example. There was an absolute minimum of third party code - and the finished beast ran in a privilege-separation format - again much more natural to code in C/C++ than a JVM hosted language + runtime.
"Java isn't slow and the JVM allows you to write code in many languages not just Java."
It's a lot better than it was - but it's not the quickest either - the memory footprint alone means that the electrons are putting in more miles across the various databuses & memory arrays. When push comes to shove Physics will always be on the side of compact run-times.
Firing up a JVM is *slow* in comparison to a comparable bit of C/C++, regardless of the merits of whatever you've compiled into bytecode.
Having defended C/C++ a bit I have to point out that I write the majority of code in Python followed by Java - they have their strengths too, although I'd say Java is actually pretty much lumbering on by convention, toolchain & framework inertia. New stuff does tend to be Python for better or worse.
As my career started in resource constrained times, I was formed as a C/Assembler then C++ guy.
When Java appeared in the 90s I made the effort to learn it but soon became disillusioned with it as a fat, lumbering sloth of a thing that would use many times the memory and cycles to do the same thing as I could with my old comfortable friend languages.
I still don't see any reason to use so much resource just because it's available.
Python, great and I got into it helping my son through school with it, it does some exceptional things.
But I still like to be as close to the metal as possible. If something doesn't need real programming then Javascript will do it.
Python will take over the world though, it's the new BASIC and the enthusiasts/makers love it.
Java has 2 key features: GC and not too much power. You can turn loose a bunch of average devs and get them to churn out a pile of Java code that meets the requirements. If they get into a tight spot, the staff bright spark can fix things without too much trouble because Java doesn't provide any sharp things that destroy maintainability when used badly--macros, decorators, metaclasses, etc.
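To be concrete about what I mean by "sharp" (an illustrative Python sketch of my own, not code from anywhere real): a decorator can quietly change a function's behaviour, which is exactly the kind of power an average team can hurt itself with.

import functools

def ignore_errors(func):
    # A "sharp tool": failures silently become None instead of being reported.
    @functools.wraps(func)
    def wrapper(*args, **kwargs):
        try:
            return func(*args, **kwargs)
        except Exception:
            return None
    return wrapper

@ignore_errors
def parse_port(value):
    return int(value)

print(parse_port("8080"))  # 8080
print(parse_port("oops"))  # None - the bug is hidden, not reported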
Indeed.
My favorite job interview was a second one (after HR had confirmed I didn't drool on the carpet or attack people at random).
"Here's your login details, the machine, the language manuals and your programs function description. We'll be back in two hours. If it runs you get the job."
Only time I'd ever seen such an interview technique. Yes, it took some effort to set up, but the employer soon gets to separate the workers from the BS merchants.
"Here's your login details, the machine, the language manuals and your programs function description. We'll be back in two hours. If it runs you get the job."
I had a similar 'interview' with the last on-site gig I did. It was a phone interview followed by an e-mail, with a request to take a particular data format and do something with it. "Any language" and it was timed.
I did it in about an hour or two, with a nice robust C++ application. But one guy did it in 5 minutes (using Perl). If I'd known BSD/Linux as well as I do now, I'd have done it in about that much time using 'awk'.
(after that it was 'meet everyone' so they could figure out if they could get along with me, get a tour of the place, and so on - small startup company)
Java programmers should learn C first, THEN C++, and THEN Java.
That would help build some proper coding discipline, so they don't start out every function/process using "ginormous collection object", and instead use some "non-insider" readable code that looks a bit more like C or C++.
THAT, and the discipline of explicitly cleaning up your objects when they're no longer needed...
If it's their first and only language then most probably (but not certainly) not many. Languages are very closely tied to methodologies and seem to tie people into a certain way of thinking about how to solve the problem. People solve problems with the tools they are familiar with - that bloke solved Fermi's last theorem in 100 odd pages where Fermi couldn't quite fit his answer in the margin of his book. Different languages tend to take different approaches to things: some will go from A to D via B and C, and others via F, G and H. My memories of Java were that it went from A to B via A1, A2, A3 and A4 a lot of the time, but that may have been just that my A and B were not in Java at the time.
If you try and learn other languages you will move into other problem spaces the languages were designed to solve, and realise that problem space is not the same as language space and no language fits all. More importantly, you'll learn there are many ways to skin a cat, and that when it boils down to it, when you have a job, the best way to do it is the way the local cat skinners do it for the local cats. And even after 50 years in the job you will still be learning shit your computer language has trouble with, because it was designed by some anal retentive genius who had a problem with brackets or camel case or something else really irrelevant to a CPU or GPU - or, in the case of Java, by a committee of them.
All computer languages are shit but some people can at least drive them off road for a bit before they need a sky hook to get them out of trouble.
Fermat not Fermi.
Fermat is what I have at home - just inside the cat door. For them to wipe their feet[1] rather than carefully preserving the cold-and-dampness so that they can walk all over me in the night[2]..
[1] Sadly, this little exercise in cat-training hasn't really worked. Probably because I struggled to think of how to reward them for doing it. And, in general, cats are mercenary little blighters.
[2] In oh, so many ways. After all, how many people would get up at 3am to let out the youngest cat[3] who really, really can't be bothered to use the cat door to go out. She's happy to use it to come in though. Probably because of the lack of suitable servants waiting outside to let her in.
[3] She of the multiple paranoia-syndrome. Even her paranoia is paranoid about the other paranoias she has..
Whilst you could not computationally prove the theorem using infinite sets, it's worth mentioning that {C#, C++, Python} together with every functional language (except Scala) supports infinite sets; Java does not... which is good... Java (lacking tail call optimisation) would fall over randomly with a stack overflow (differently on every machine - develop anywhere, debug everywhere).
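By "infinite sets" I mean lazily evaluated sequences that are only computed on demand - a minimal Python sketch of the idea:

from itertools import count, islice

# A lazy "infinite set": the squares of every positive integer,
# produced on demand rather than stored.
squares = (n * n for n in count(1))
print(list(islice(squares, 10)))  # [1, 4, 9, 16, 25, 36, 49, 64, 81, 100]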
Given that the PYPL measure is for google searches (heavily skewed by JavaDoc), it is not unreasonable to conclude that a large number of Java developers need help getting through the day.
The Redmonk score looks more interesting, though I'm not sure c# "async execution of a Linq closure" compared to Java's "which Date class is best" or Python's "why is 5 + 1 sometimes 6 or 51" is a fair comparison.
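(The "6 or 51" bit being the usual string-versus-number confusion beginners hit - input() hands back strings:)

print(5 + 1)      # 6  - integers add
print("5" + "1")  # 51 - strings concatenate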
".... but they write C programs." Yeah, guilty! Nowadays I write code once in a blue moon, but it's usually C code morphed into whatever wrapper is required.
One strange practice I commonly run into is customers that have picked the coding language for a problem long before they have determined even what the problem is. I try to at least steer the project to a thorough pseudo-coding before going to language selection. Agilistas really hate that, they seem to think it insults their skills.
... I have used a lot of criteria when choosing a language to utilize for a given project, or a new language to learn for my own edification. In all that time, I have never, not once, picked a language due to its popularity.
I have never really thought about it before, but I suspect that choosing anything purely due to its popularity is a mug's game. Or rather, popularity brings out the lowest common denominator, making popularity quite the opposite of elegant. See music for a rather egregious example ...
Although probably a good idea to avoid "M," the language formerly known as mumps.
A friend of mine made a very good living doing M[1] just after leaving University.
We'd taken the piss all the way through Uni because she was reading Philosophy. Then she got a job in my field on more cash than I was getting...
She's an undertaker now...
Vic.
[1] Yes, it was still Mumps at the time.
"She's an undertaker now..."
That's sort of my point.
IIRC it was Forth based and allows abbreviations of commands. IOW it's for those who find C a bit too verbose.
I think it can legitimately be said that after you've used it you won't want to use another programming language.
Because you won't want to do programming ever again.
Ooops.
MUMPS was not Forth based, as it preceded Forth by about 5 years (1966 Vs 1971).
Although its terseness and design of breaking code into 2KB blocks is very Forth like.
OTOH, the fact that variables <==> files <==> b-trees - meaning anything can be made persistent across all instances (i.e. a file) just by putting "^" in front of the name - is not very Forth like.
And then there is the command abbreviation, combined with number of spaces between some of them being significant. That could make for a complete mindf**k when reading through old code, to the point of writing a tool to expand such abbreviations to make the whole thing more readable.
"Surely these languages are popular for a reason?"
Yes, but that reason, more often than not, is hype surrounding them. You have professors who grew up in the pseudo OOP-Hype of the 1980s and 1990s and think C++ and Java are the ultimate languages as they are so OOP.
What he said.
Whenever I learn a language it's because I've found a new (or even old) language that can do something I need my software to do better than the languages I already know. They are all tools, you just pick the best one for the job. Yes you can use a screwdriver to hammer a nail, but using a hammer is just easier.
They are all tools, you just pick the best one for the job
Many (subjective) eons ago, I was a mainframe assembler programmer, writing (bad)[1] code for a system running TPF.
One of my siblings, having obtained various degrees and doctorates, was musing why we still bothered using such archaic languages when they had so many better ones in University..
I managed to restrain myself from beating him to death with a POPS manual[2] and suggested that the 40+ years-worth of code we were maintaining couldn't be replaced in a hurry, especially with languages where very few programmers existed and where there was no long-term commercial experience.
[1] One of the many, many reasons why I stopped[3] being a programmer and went into support.
[2] We chucked ours[4] away some time ago. Then, a few weeks later, discovered how much they were worth online. Doh!
[3] Some might claim I never really started. YMMV.
[4] Senior Controller was also a programmer, in the same company (we were married before we went there). She stuck at it considerably longer, being considerably better at it than I was. I was more of a 'hack it together and then fix it in testing' sort. She is one of those tedious^W meticulous types that actually preferred to design things first.
"if you are choosing a language in a commercial situation, a ready supply of people who know it will aid success of the project and reduce its future support costs. Hence, choosing on popularity makes sense."
Isn't that exactly why so many organisations continue to use Windows, in spite of all the grief they incur by doing so?
And does it really, honestly, cost less in the long run?
I loved COBOL, at least the later COBOLs that weren't so restrictive over line lengths.
A lovely, verbose language and you can make it very modular.
I've used dozens of languages over the years, COBOL has a soft spot, as does Z80 and 6502 Assembler. 68K Assembler was also nice, but x86 Assembler was a nightmare in comparison, assembler equivalent of VHS, compared to 68K laser disc...
Z80 and 6502 Assembler
Indeed. My first (real) computers were a Nascom 1 followed by a BBC Micro.
I did write a bit of X86 assembler during my (short) programmer phase even though I was (nominally) a mainframe programmer. But it was more fun to write an assembler utility that went round the (token-ring) LAN looking for OS/2 print servers and then enumerating all the stuff people had stashed on the file shares on the server.
It was *fairly* network intensive, which is why I only ran it in the evening. Found some fairly 'interesting' stuff as well as quite an amount of warez.
This was sometime in the early 90's. When I was young and foolish.
During the Y2K bonanza/hysteria, I was amazed at the amount of COBOL code that was expensively edited to get round the Y2K issue, rather than rewritten in a more modern language. Seeing as the majority of those COBOL coders from 2000 are dead from old age, any young whippersnapper with COBOL in their skill set will probably make serious dough when the next Y2K-like issue arrives and all that code needs to be edited again.
Excuse me, Matt. The reports of my death are greatly exaggerated.
Still making money quietly coding COBOL, and still recommending it as a language for kids to learn if they want a guaranteed income into the foreseeable future. I know lots of Java, Python, C# etc. coders who are out of work, but the COBOL folks are all gainfully employed. Can say the same for Fortran.
COBOL is dead! Long live COBOL!
Overall popularity is meaningless as it depends on use-case. Chinese is a very popular language but it's of little use if you plan to live and work in France.
Programming language popularity is heavily skewed by web and mobile app development. If you want to work in financial services or machine learning though you'd want to research what those industries need rather than look at the overall top 10 languages.
> You can spot a Java programmer even when they write in any other language.
Change Java to anything, still true. One of my "favorite" examples is a guy who does assembler style optimization in Mathematica.
Regarding Python, people seem to assume that just because they use Python, their code is good, even when it is actively terrible.
Regarding Java, I once reviewed a paper where the author had put something like "Java is the highest form of programming" and gave one of the most eye-watering pieces of shit code I have ever seen. He seemed to come from a (bad) C-background and managed to make about every mistake you can make in both languages in just one (printed) page. Strong rejection.
For everyone's entertainment I'll mention Fizz buzz.
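For anyone who hasn't met it, the whole exercise in Python amounts to roughly this:

# FizzBuzz: multiples of 3 print Fizz, of 5 print Buzz, of both print FizzBuzz.
for i in range(1, 101):
    word = ("Fizz" if i % 3 == 0 else "") + ("Buzz" if i % 5 == 0 else "")
    print(word or i)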
"Regarding Python, people seem to assume that just because they use Python, their code is good, even when it is actively terrible."
Not true. The community seeks "pythonic" code showing Python good practice. "The Zen of Python" asks new users to think more deeply about what constitutes good code, (import this). PEP-8 is a style *guide* for readability.
You can write bad code in any language, but blog it for comment and the Python community usually give helpful and constructive criticism. :-)
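A tiny example of the kind of nudge you get (my own illustration): the same loop, once with manual index bookkeeping and once the "pythonic" way.

names = ["ada", "grace", "linus"]

# Un-pythonic: index bookkeeping by hand
for i in range(len(names)):
    print(i, names[i])

# Pythonic: let the language carry the index for you
for i, name in enumerate(names):
    print(i, name)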
I remember one of my first projects, I had to maintain an ancient corporate accounting data collection system, written in MS BASIC running on CP/M80 and MS-DOS.
It wouldn't have been so bad, but it had been written by FORTRAN programmers and maintained by COBOL programmers (I kid ye not). Unfortunately, neither group had ever read the BASIC manual further than IF and GO TO. Every loop in the program was performed by "IF A > 0 THEN GO TO 100". They had never heard of FOR...NEXT or WHILE...WEND, let alone a REPEAT.
There were also dozens of computed GO TOs in the code! There were reams of code that were commented out and we were running into size restrictions, so I tried deleting old, commented-out code, only for the thing to fall over, because it was computing a jump into the middle of a commented-out section of code!
I managed to tidy up the code somewhat and optimize it. I got the data collection, preparation and transmission down from over 4 hours to under 20 minutes!
Or for most Java accessors, use project Lombok for much shorter code; it also covers lots of other common boilerplate code, including constructors, and common logging declarations.
Explicit accessors are sometimes compulsory, for validation and security-copying (to prevent mutable object exploits), and trace logging.
A lot of Python frankly looks like write-only code, because it has never required type declarations in method/function declarations, and I also suspect a lot of security/performance issues given how many easy, but dangerous, assumptions it makes! I also view the Python API docs web pages as quite primitive and fugly compared to other languages' API docs like JavaDocs.
Ahhh. They just don't write code like that any more.
One of the big crash-landings[1] we had when I was a programmer was writing self-modifying code. Since we were writing for an environment where a single code segment couldn't exceed 4K (and that had been raised from the original 1K), some of the previous generations had done some fairly aggressive things to keep their code small..
Like having self-modifying code. Which is fine[2] when, in the old days, you only had a single thread to worry about and nothing would grab the CPU while your code was running, but by the time I got there, we had to code stuff so that it was re-entrant and could be used by multiple CPUs at once.
Which, of course, negated the advantage of self-modifying code since you could never guarantee how many CPUs were running your (single instance) code.
[1] Crash-landing was the term we came up with for "if you do this it's an instant P45". Stuff like telling the CEO that he was an idiot..
[2] For a particularly difficult to maintain version of "fine". And trying to debug a core dump where the bit of code you are looking at doesn't match the source code isn't fun.
I was taught about this in High School. Mostly that it could be done, but it was a Very Bad Idea to do it.
It took me years to find any actual cases of it being used.
They were:
a) The Apollo Guidance Computer
b) The Bell Labs "Blit" bit mapped terminal.
Both of which had (for different reasons) severe resource constraints.
So I'm curious, what was your hardware environment?
PDP-11 in effect needed you to be able to write to where the code was running in order to efficiently pass parameters.
JSR PC, MYFUNC
ARG1 .WORD
ARG2 .WORD
etc.
and MYFUNC would use the old 'PC' value as a frame pointer (from the stack), and do a kind of 'PC cleanup' on the program counter so that you returned to the correct address. Or you could call with 'JSR Rx, MYFUNC' and put the old PC into 'Rx' and use it as a frame pointer. I forget the details, but that's kinda how it worked.
And so, you needed to write the arguments to ARG1 and ARG2 (etc) before doing the function call. They might even be general use memory variables if you're really clever with the design. You could even implement the subroutine call by referencing the actual address you call (the last word in the instruction, I think) as 'ARG1 - 2' and poke that before doing the call (making it a dynamic function call of some sort).
Anyway, this was common in the PDP-11 world. I think DEC was kinda proud you COULD do this. But single-thread only, no recursion...
From what I remember, the original platform that COBOL was developed for had no stack, so the "perform start through end" was implemented by overwriting the end label with a jump instruction to return to just after the original perform.
The other major gotcha was a sort of self-modifying code: the jump into an overlay when the wrong overlay was loaded.
> I understood it to mean that the code calculated a variable line number to GOTO ...
I suspect that it was much simpler. There were GOTOs to numbered lines that were comments. This would then drop down to the next executable line. When the commented lines were deleted there was then no target for the GOTO.
In Basic - there are no labels.
10 IF x = 5 GOTO 50
20 REM THIS IS A COMMENT AT LINE 20
30 REM THIS IS A COMMENT AT LINE 30
40 REM THIS IS A COMMENT AT LINE 40
50 PRINT "X = 5"
60 REM THIS IS A COMMENT AT LINE 60
70 REM THIS IS A COMMENT AT LINE 70
Now, if you delete comments at line 20 and 30:
10 IF x = 5 GOTO 50
20 REM THIS IS A COMMENT AT LINE 40
30 PRINT "X = 5"
40 REM THIS IS A COMMENT AT LINE 60
50 REM THIS IS A COMMENT AT LINE 70
(edited for missing rem statements)
> In Basic - there are no labels.
BASIC is not _a_ language, it is a large group of approximately similar, or not so similar, languages. Some do allow labels, even named subroutines.
> Now, if you delete comments at line 20 and 30:
No, no, no, not for any variation of 'BASIC' that I am aware of. For the BASICs that only use line numbers there is _NO_ automatic line renumbering, that would be a complete fail. Deleting lines 20 and 30 would leave lines 40 and beyond with their original line numbers. The whole point of numbering by 10s is so that lines can be inserted, such as 51, 52, etc.
The problem described would arise if the original line 10 had GOTO 30 (which would work correctly) and then lines 20 and 30 were deleted because they were 'merely comments'.
@disgruntled yank
Something like:
10 print "Enter option number: "
20 input a
30 go to a*1000
...
1000 rem b=56*c
...
2000 print "sub menu"
...
3000 rem input c$
Only it wasn't in multiples of 1000. You see the commented out code (rem statement) and think it is no longer needed, so you can delete it to save space and make room for new code... Only to find out later that the code falls over when run.
In my experience, most people can make PERL look like chicken scratchings.
I find it more remarkable that some people can make PERL not look like chicken scratchings ... and, indeed, can write useful, constructive, and efficient programs in that unlovely language.
Why they don't apply their undeniable talents to something else, instead, remains a mystery, though.
I use Java EE and other languages, having learnt with sequential languages, but I don't regard these criticisms as valid reasons for disliking the language. OO programming in Java isn't so different from other OO languages. If someone you know is overly applying OO concepts / design patterns then that's just their convoluted programming style rather than a fault with the language. When applied effectively, those concepts benefit large applications that are maintained over a long lifespan. Hence why OO features were added to languages like C and PHP.
I love Python. It’s a great language - the new ‘Basic’. It’s great for teaching kids how to program, and it’s great for doing real work in as well but…
…for me my one true love is C. It’s powerful (and, yes, dangerous if abused). It doesn’t hide anything or do anything automagically. Memory is yours to play with as you will. Even my C++ looks like C (which I realise makes it bad C++ - except, sometimes, to other C programmers).
I quite like Objective C and Swift. I’ve been paid to develop in Pascal (which was my favourite teaching-kids-to-code language until I discovered Python) and APL (which was a vile experience). But, in my experience, if you can do C then you can pick up most modern programming languages quite easily. If you can do C well then even Assembly comes fairly naturally.
Same here, Python is really handy for many tasks that otherwise would mean something like MATLAB or worse.
"If you can do C well then even Assembly comes fairly naturally" may be true, but even truer is that C is really a universal assembler - there are very VERY few cases when assembly is justified, and even in those cases the fact that it can be in-lined in many C compiler's extensions is good.
@HmmmYes
I like Perl for short bits of text processing. I use it like a more readable version of sed when I need to share code with a non-programmer. Anything more than that and Perl falls down badly - I had to maintain an application written in tens of thousands of lines of (badly written) Perl code. The original developer had left out the "use strict" pragma because in his words "it didn't run when he put that in". I fixed that, and improved overall reliability somewhat - but it still wasn't as good, or as fast, as it could have been if it had been written in a language which was up to the task in the first place.
As for Java, that's a sad tale. So much potential - and ruined by Oracle. You have to admit* though that Microsoft really ran with it and has, latterly at least, come up with a real gem in C#.
*you don't have to admit of course. You could spew coffee over your keyboard and disagree vehemently. There are some strange idioms in English.
I too am an ardent C programmer. It's been a superbly useful tool over the decades. I have done some pretty big systems very successfully with C.
However, I am intrigued by Rust. If they standardise that, there's a very good chance that I'll convert. It's usable as a systems language, it doesn't need a runtime, but it has some nice high-level language ideas, and does Communicating Sequential Processes too. There's lots to like!
I’ve been paid to develop in Pascal
My Polytechnic code assignment was to write a stock-control system in Pascal. In a dialect that had no random access file handling..
So I gave that up as a bad job and just wrote a reasonable demo instead. I got marked down a bit for not sticking to the brief and then marked up for my creative approach :-)
APL was my first language. I still have a soft spot for it, having written my own interpreter years ago that I still use as a desk calculator from time to time. But it has two serious flaws. One, it's truly write-only, as even your own code becomes incomprehensible in record time given its tendency to inspire complex one-liners. And two, it's optimized for the 2741 Selectric printing terminal with an APL typeball and keyboard, which no one has anymore. It is the only language I know of, however, that uses real mathematical multiply and divide symbols for the math operations rather than repurposing asterisk and slash. A lovely language, if you ask me, but I'd never advise anyone learn it.
@kdd
I know others who like it too. Maybe if it had been my first I’d feel the same way, but I was a C programmer, and I got tasked with working on an APL system because of my aptitude for quickly picking up new languages. I might be good at learning new languages - doesn’t necessarily mean that I enjoy using them!
APL isn’t the only language, incidentally, that can use real mathematical divide symbol for the maths operations. AppleScript (and IIRC HyperTalk) can too - but only because it’s very flexible as to the syntax (which can, in fairness, be A Bad Thing, if only because no two developers will write code in the same way)
For example, in AppleScript, for this sum, these are synonymous:
display dialog 10 ÷ 2
display dialog 10 / 2
display dialog 10 div 2 (div is integer only)
"....Pascal...." ah, yes, that was a fun starter language, but most learning establishments seem to have just treated it as an intro to Modula 2 and/or C, and never as a viable language in its own right.
I like intro'ing kids to code with HTML, especially as they all use websites every day, and you show them structuring and files calls etc. with quick and easy results. From there it's easy to get them into a backend in C or whatever language you like.
I'm not going to get involved in a 'my language is better than your language' discussion; I've got the battle scars from too many of those already.
What did strike me as a bit strange was characterising a professional institute with a royal charter as "a technical advocacy organization". I wonder if that doesn't sell them a little short. Mad Bob the technology yogi is a technical advocacy organisation, albeit not for any technology that exists, and I'm not sure that's really comparable.
Rosie
Or even the IET as it's now called... My charter certificate is old enough to say IEE.
I once had a very confusing conversation with someone who hadn't heard of the Institute of Electrical Engineers, but had heard of the Institute of Explosives Engineers. Now like every decent electronics engineer I have blown up various things in my time, but only tantalum capacitors or power supplies. This was an aspect of the conversation that took a while to resolve, via puzzled enquiries about really being allowed to do such things in one's bedroom at University...
"I dislike awk intensely. But for some tasks, there is no sane substitute..."
One can use sed as well for quite a lot. Even hairier than using Awk.
Done quite a lot in "sh" as well. (Think this was before Bash, so was more limited. Bourne Shell, as opposed to Bourne Again Shell, if my memory serves me?)
You can tell it to rely on parentheses rather than indentation - comes in handy when doing things like keyboard/mouse handlers and other complicated things that just won't be broken down into smaller functions in a desperate attempt to fit in 80 chars. I can only thank god that no-one has written a python ASP thing like PHP - imagine following indentations down through that code!
I wish I could upvote you a million times. Significant whitespace has to go down in history as one of the dumbest decisions ever (along with the GIL).
The reason is simple. Your code becomes nightmarishly difficult to refactor. Refactoring is one of the most important tasks in large scale software development since it allows your design to develop with the changing use-cases. But when you have to be ridiculously careful about making sure that things end up at the correct indent level it becomes a nightmare problem. The number of times I've fixed someone's bug where they accidentally changed the indentation of the last line of a loop or similar.
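A contrived sketch of the kind of dedent bug I mean (illustrative only):

items = [3, 5, 7]

# Intended: count each item as it is summed.
total, count = 0, 0
for x in items:
    total += x
    count += 1        # last line of the loop
print(total, count)   # 15 3

# One accidental dedent later it still runs, it just means something else:
total, count = 0, 0
for x in items:
    total += x
count += 1            # silently moved outside the loop
print(total, count)   # 15 1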
Personally, I would have had a terminating token for the end of functions/loops, but made it a syntax error to have invalid indentation. That way after a refactor you can use your editor/IDE to fix the indentation.
I have a few other Python gripes - one of the biggest being the absence of a perl-like use strict type construct. If you have ever had to debug a problem where a thread dies due to a syntax error in a little used code path, you will totally get my annoyance.
Incidentally, and without wishing to start a flame war, my favourite language is C++, and I think Julia is the one to watch for.
> c and Perl have always made the most sense to me.
Python 3 does have differences from Python 2, but Perl has been through several rewrites, each of which was incompatible with previous versions' source code. If Perl makes "most sense" then you obviously never used Perl4 and haven't looked at Perl6, because these are quite different languages.
https://docs.perl6.org/language/5to6-nutshell
>> What's your point? There is Java [1], Java 2, Java 3, ..., Java 8; C++ 3, C++ 11, C++ 14, C++ 17.
Yes, but by & large they have backward compatibility. At least Java does. I haven't touched C++ in years, but as has been amply pointed out in this thread (I'm paraphrasing a bit), "You can write K&R C in any language" (including C++, last time I checked), at least with a few gcc flags and ignoring the warnings. This is *not* true of Python (or, for that matter, my own favorite scripting language Perl, though in fairness I'm guessing that the vast majority of all Perl ever run in production anywhere was Perl 5.x).
My $0.02.
> Yes, but by & large they have backward compatibility.
Yes, but it is only "by and large". When moving from one version of C++ or Java to the next there will always be some issues which need resolving, except in trivial code.
Python3 is a new version of the language designed to be a significant improvement. Python2 is still developed and supported and has 'futures' and other tools to ease the transition to the new language. This has been done by numerous languages; extreme examples are Pascal to Modula2, and VisualBasic - numerous times.
Python3 vs. Python2 should be compared to Kotlin vs. Java. Kotlin is designed to make Java into a modern language and drop 22 years of baggage that it still carries. C++ has 36 years of baggage.
> "You can write K&R C in any language" (including C++ last time I checked)
Actually you can't. K&R C (edition 1) was replaced by ANSI C and few modern C/C++ compilers support the original K&R (though gcc may still do so). And that is hardly "any language".
And I don't know that anyone said that; what they did say was "You can write FORTRAN programs in any language", which is quite a different thing.
> But when you have to be ridiculously careful about making sure that things end up at the correct indent level it becomes a nightmare problem.
I don't have problems with that, but then I have chosen tools, and configurations of those, that would seem to be more appropriate than the ones that you are using.
> If you have ever had to debug a problem where a thread dies due to a syntax error in a little used code path, you will totally get my annoyance.
Syntax errors are discovered during the load/compile phase so you are probably referring to something different. There is usually an exception trace produced unless you deliberately ignore exceptions.
and there are a few in the present - like "C-pound" which relies on ".Not". Both equally shitty.
I think Python has its uses, but is ripe for ABuse and I see this in poorly written DJango code (and imported objects) INCLUDING the DJango implementation itself.
And too many people say "Write that in Python" or "I can write that in Python" when it SHOULD be done as a C utility, at least for efficiency. [converting binary data in python is the *WORST* possible implementation I have *EVAR* seen, because Python is afraid of pointers and C-style structures, apparently, and YES, I'm currently tasked with maintaining code that actually *DOES* this, because the python 'expert' did a rage-quit].
> "I can write that in Python" when it SHOULD be done as a C utility, at least for efficiency.
Not all C programs are efficient. I wrote a text merge program in C. It was quite slow due to the str..() library, in particular strcat() having to scan along the strings to get the length. A rewrite in Python was 10 times faster.
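For what it's worth, the Python idiom that avoids the repeated-scan cost is to collect the pieces and join them once - a sketch of roughly what the rewrite did (the details of the merge program elided):

# Build the result in a single pass, instead of re-scanning the
# ever-growing string the way repeated strcat() does.
def merge(parts):
    return "\n".join(parts)

print(merge(["alpha", "beta", "gamma"]))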
Simple. Let them take a pointer to a procedure (or function) and put it in an array in that language.
Which (IIRC) even Ada allows (called "reference" variables), but strongly discourages.
Give them that, and at least 8 character variable names and they're yours.
They'll even tolerate garbage collected memory
void *my_functions[100];
or in C++
std::array<std::any, 100> my_functions;
if you want to be particularly smart, wrap it in a simple class that has a std::enable_if that only allows you to store things for which std::is_invocable is true...
'std' class template-based implementations are HIGHLY overrated. They try to be too much, are sometimes collection (instead of array) based, have some cryptic built-in requirements for memory manager objects and other irritating things, and can be best re-implemented in only a few lines of code by someone who knows what he is doing (like me).
But the C++ language doesn't require 'std' usage so it's all good. I think that the 'std' class templates were written by "Academic Arrogance" types that haven't coded in production EVAR in their entire lives, nor had to MAINTAIN someone else's crap-code. So they're clueless about the real world. And it's reflected in the design.
pirate icon, just because I'm a rebel
return_type (*array_name[])(parameter0_type, parameter1_type, ...) = {function0, function1};
Works fine. Eve's C compiler was limited to 8 letter variable names because of the limited storage capacity of flint chips. Ancient Greek clockwork compilers allowed arbitrary length identifiers, but only the first 63 bytes were significant.
I've decided to start learning Xcode / Swift, only as a hobby mind. If that leads to the ability to work from home and be on better wages than I'm on now, perhaps I need to ramp up my speed of learning it! Would welcome comments from anyone with knowledge about this language and career prospects.
> Any language where whitespace dictates what is inside a conditional and what isn't (eg Python) needs to die a slow and painful death IMHO.
No one cares if you don't use Python (unless your managers do). If you are forced to use it, then get better tools and learn how to use it better.
In what way is 'what is inside a conditional' not determined by the colon that terminates it? Perhaps you are thinking of some other language.
I meant what is inside the code block, not what is part of the condition.
numbers = [2, 4, 6, 8]
product = 1
for number in numbers:
....product = product * number
product = product * number is only part of the loop because of its indentation. This is what I was referring to.
Try to post some Python code somewhere that does not allow indenting (I cannot figure out how to do it on this site, even pre blocks strip out leading whitespace) and you simply cannot post valid code.
What happens if somehow you end up with source that contains spaces and tabs? The code could then surely APPEAR to mean one thing, but in fact means something completely different.
> Trying to post some Python code somewhere that does not allow indenting ...
That is not the fault of the language, but of the site. This site recognises <code> and <pre> tags but fails to implement them in a useful way.
> What happens if somehow you end up with source that contains spaces and tabs?
You fire the programmer and/or get better tools and/or learn how to configure them.
> The code could then surely APPEAR to mean one thing, but in fact means something completely different.
Like in C:
total=0;
j=0;
for (int i=0; i<10; i++)
....total+=i;
....j++;
>> The code could then surely APPEAR to mean one thing, but in fact means something completely different.
> With C, the rule is that if the conditional has no braces, then the first line after the condition is in the block and nothing else.
Exactly. That is why my example in C was illustrating that "the code could then surely APPEAR to mean one thing [according to the indent], but in fact means something completely different."
"Any language where whitespace dictates what is inside a conditional and what isn't (eg Python) needs to die a slow and painful death IMHO."
I just think of it as 'Allman Style' without the curly braces
https://en.wikipedia.org/wiki/Indent_style#Allman_style
And if you put the ':' right after the control statement, it doesn't look a THING like K&R style (which I HATE)
(and I also dislike hard-tabs, so multiple spaces are fine, and pluma does auto-indent, and highlights things in a readable manner)
Write it out in mnemonics, translate, by hand, from memory to opcodes (hex notation) and send it directly into unalterable masked read only memory. Should work first time, right?
Anything else is for wannabes.
ps: use lots of absolute jumps (GOTO) and globals with hardcoded addresses just to piss off the modular/reusable crowd (because such programs run much faster since they don't have to waste CPU cycles pushing and popping stuff onto/off the stack and moving data)
quote: use lots of absolute jumps (GOTO) and globals with hardcoded addresses just to piss off the modular/reusable crowd (because such programs run much faster since they don't have to waste CPU cycles pushing and popping stuff onto/off the stack and moving data)
Heh. Exactly what I am doing. Ruby has crappy garbage collection. Unwrapping some of the loops can also work well.
Started when I was 13 on a Bendix G-15. A machine with little glass tubes filled with nothing. Writing code that resembles BrainFuck. Grew up spitting distance from the Valley. Moved on to Fortran at UC Berkeley and also studied psychology since I wanted to get into AI. AI still doesn't exist.
Moved on to BASIC but quickly dropped into machine code for the speed. Then figured out how to overclock everything. Designed and built bit mapped video cards before the concept existed. Rewrote the BASIC interpreter. Wrote a little program called DamBusters for the PET as well as some others. Published in Transactor. Worked on the computer side of Xerox and also met with Gates and others at the first Computer Fair in San Francisco. Most likely was watching Jobs helping himself to everything he could at PARC.
Since then have worked with various languages but am mostly interested in anything to do with graphics. Python works well in that respect. Have helped to develop the AutoCAD 3D turbulence modeling. C is OK but don't need the speed for what I am now doing. Have figured out how to do brain mapping using Ruby along with some cool sonification using Sonic-Pi.
The thing that gripes me the most is the FLAT everything. If it is a button it should look like a button. It is all about users. Thinking like a user isn't easy. I have been teaching users for many years. I developed the very first computer science course for Thompson River University back in the early 80's.
Python will be the language of choice in almost every domain in IT, including Cloud Computing, Artificial Intelligence, Machine Learning, Data Analytics, IoT and DevOps. In fact, Python is the most preferred language in Software Test Automation, Mobile Test Automation, application development and IT infrastructure.
Java has seen heavy use building enterprise applications; as such, there are a multitude of mature frameworks (Grails, Spring etc.) and experienced developers to make use of. This means that Java can be used for a large variety of use cases, from small applications and single page sites, all the way through to large enterprise applications with lots and lots of data.
PHP has seen use in a large variety of applications as well; it is well known that PHP powers a large amount of applications on the web. Facebook used it in its earlier days, as do Wikipedia and WordPress, among many others. More recently, we've seen a lot of nice looking frameworks become available, Laravel for one.
Personally, I'm a big fan of Java and Ruby for web development. They provide the syntactical consistency that I enjoy working with, along with language features and frameworks that make web development a breeze. While PHP isn't a bad option for web development, whether it's better or not is going to come down to the developers you have on hand and the type of application you intend to build.
If you're interested, check out this comparison of the two languages; I hope it helps.
A quick search on Dice.com shows that Java work is plentiful. Where iOS has about 2,500 offers, Java has more than 17,000. Of course, one cannot completely rely on these numbers. But the fact that the market for Java on Dice.com is potentially seven times larger than for the most fashionable iOS suggests that "old Java" is doing pretty well.
Java certainly has its own problems. Java haters will continue to sputter and hammer at the keyboard, posting malicious comments on the Internet. The garbage collector can cause hiccups and jitter. Typing is a chore and cannot reject really bad code. Annotations are too complex. New Java features aren't evolving as fast as they were in the past. Braces add some confusion. The list goes on and on.
However, none of the competing technologies has landed so widely and deeply on the shores of the IT industry, although some of the problems in Java are fairly easy to fix.
Another question is where and how to learn Java?
Java is the primary language for Advanced Placement Computer Science (Advanced Placement (AP) - curriculum and exams for high school students in the US). This means that Java is often a student's first programming language. Thus, Java stays with them "both in sorrow and in joy".
Let's talk about ways of learning Java. Is anyone here learning or studying it? I'd be very interested to hear. Maybe I can give a couple of practical training tips myself.
> If for iOS there are about 2500 offers, for Java it is more than 17000.
You are basing the 'popularity' of a language on the number of empty desks ?
It may be that an iOS offer is filled quickly and thus the offer is taken down while the Java offer stays up for months and thus there are more of these at any one time.
Just like any religious dogma, support is sought while counter-arguments are ignored.
These are actual offers. I think it makes no sense to argue about the superiority of Java over the iOS platform.
Did you not consider that the complexity of implementing Java requires more skill than iOS? Did you not consider that Java covers a wide range of tasks? iOS is one platform. Java is multi-platform. Hence the greater number of offers, and I don't need to prove here that the iOS offers fly off like hotcakes while the Java ones go stale on the table.
After initial excitement and motivation, you gradually hit a brick wall (figuratively speaking) from time to time, which can be quite demotivating.
Once I started Multithreading, several times I caught myself thinking "Why am I doing this?!" But you get through it, gradually grinding it out.
I personally think motivation is not really the key for any beginner. You need discipline, more than anything, and as you get used to gradually getting through the hard topics, you gain more experience and when it finally clicks you feel on top of the world...and recharge your motivation in the process.
I strongly believe it is also very important to have the right tools and information.
So, let's talk about the books.
My First one (not surprising) was "Head First Java".
This book is for fans of the informal presentation of material. When you read this book you get the impression that you are not learning, but just talking with friends.
The next one was "Java: A Beginner's Guide" by H. Schildt.
This is the best book for newbies. I could not get through Head First for a long time; it is not my kind of book. I reached the halfway point and realised I could go no further.
In this book everything is structured, everything in its place. The author managed to write a book that is neither dry nor padded with filler. This is the best book for newbies!
Legendary "Thinking in Java". B. Eckel.
I recommend this book after reading Head First or after reading Schildt’s book.
If you are starting to learn Java, but you have experience in other programming languages, such as C++, then you can safely take up this book.
This is a book that can be read in a couple of nights, and you will know the Java core at a good level.
I also have a personal shortlist of online resources, which I hope helps beginners:
CodeGym.cc
+ : free, good design, a lot of practical tasks, game-like course geared for complete beginners, quick switch between light and dark themes.
- : Java only website.
Edabit.com
+ : free, interesting concept with a lot of “challenges” of various complexity, can add your own theory resources to each challenge.
- : not for beginners, no theory apart from links from users to outside sources.
Mooc.fi
+ : free, includes exercises/tasks, examples of code included in the theory, more advanced topics also included.
- : reads a bit like a very long manual with no ‘back to top button’, too much white on the page so hard on the eyes after a while, not much theory.
SoloLearn.com
+ : free, good design, step-by-step process, and test questions.
- : very little theory, no proper tasks to cement the knowledge.
Good luck, guys! I hope it will be helpful.