Hoovering?
Being a product from the vacuum cleaners of Redmond, does it hoover up everything you do to the great dust-bin in the Microsoft cloud? Kept securely, and perused at their leisure?
Microsoft has brought a bunch of enhancements with version 1.36 of Visual Studio Code, its popular open-source editor. VS Code's updated Java installer will now set up a Java development environment for you, using a JDK (Java Development Kit) from AdoptOpenJDK rather than Oracle's more encumbered version. It will also install …
why do we need VS Code for that when it can ALREADY BE DONE using standard Xorg capabilities?
It's all very simple. For example, if you have any X11 system running (FreeBSD, Linux, 32-bit or 64-bit, does not matter) and you started Xorg with the appropriate parameter (usually -listen tcp on newer Xorg, or just making sure the default -nolisten tcp isn't passed on older ones; see the .xserverrc file, or the 'slim' startup file [the one that invokes the X server], or the startup file of whatever OTHER display manager you're running) then you can use a remote TCP/IP connection on TCP port 6000 plus the display number (so 6000 for display :0) for that.
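For the record, a minimal sketch of what "the appropriate parameter" looks like, assuming a startx/xinit setup that honours ~/.xserverrc (display managers like slim keep the equivalent line in their own config file instead; paths vary by system):

```shell
# ~/.xserverrc -- start the X server so it accepts remote TCP clients.
# Newer Xorg (roughly 1.17 and later) refuses TCP by default and needs
# an explicit flag:
exec /usr/bin/X -listen tcp "$@"
# On older Xorg the default was the other way around: TCP was on unless
# something in the startup files passed "-nolisten tcp", so there the
# fix is removing that flag rather than adding one.
```

Check `man Xserver` on the box in question for the exact spelling your version wants.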
It's *kinda* built into X11 already. No need to re-invent it "the Microshaft Way".
THEN of course you ssh in, do an "export DISPLAY=othermachine:0.0" in the session, and every X11 application you run will display and interact on THE DESKTOP of 'othermachine'. I run 'pluma' like this ALL of the time. I suppose SlickEdit, IntelliJ, Eclipse, and other 'big tools' will work, though I've noticed that these *kinds* of tools [which are usually piggy Java applications] don't perform quite as well as a native X11 application would [I've tried it with IntelliJ for Android stuff, with Linux in a VM on FreeBSD].
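Spelled out, the round trip looks roughly like this. The hostnames `mydesktop` and `remotebox` are placeholders, and this assumes the desktop's X server is already listening on TCP; `xhost` access control is coarse, so on a network you don't trust, tunnelled `ssh -X` is the safer variant of the same trick:

```shell
# On the desktop (the machine whose screen you want the app drawn on):
xhost +remotebox                 # let remotebox connect to this X server

# Then log in to the remote machine and aim its X clients back here:
ssh user@remotebox
export DISPLAY=mydesktop:0.0     # display :0 -> TCP port 6000+0 on mydesktop
pluma &                          # runs on remotebox, draws on mydesktop
```

The X server for display :N listens on TCP port 6000+N, which is why display :0 means port 6000.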
And of course if you have NATIVE TOOLS available, why do you need some HACK in a JAVASCRIPT APPLICATION (aka VS Code, written in, of all things, NodeJS) to do what is _ALREADY_ _POSSIBLE_ _AND_ _CONVENIENT_ ???
Microshaft, here's a thought: Why don't you re-write VS Code using C++ and either GTK or Qt, and *THEN* make it TRULY cross-platform!!! Then have it run NATIVELY [not using NodeJS or some OTHER stupid javascript pile of excrement] and make sure it works correctly using the method I just described.
wouldn't THAT be worth something?
and of course I've been using editors like pluma on RPi to develop stuff for quite a while, on headless systems that display the GUI on my FreeBSD Desktop...
this stuff is NOT ROCKET SURGERY, but if you ARE a rocket surgeon you've probably been doing this already.
Now imagine, if you would, you've got a Windows computer with pretty terrible NFS support which craps out every so often, non-existent SSHFS support, no cross-compiling to ARM on VS, no remote debugging.
And the same goes if you develop Linux software but your workplace forces you to use a Windows PC.
So this is why this is a big deal, although MS are fixing OS problems, such as poor remote filesystem support, in VSCodium.
"Isn't the X11 protocol in the middle of being replaced by Wayland?"
yeah SOME ARROGANT Linux people *FELT* this would happen, and yet it has NOT, and the OBVIOUS deficiency of NOT being able to support "remote GUI" is keeping Wayland off of MANY desktops!
Poettering *FELT* as if systemd was better. A _LOT_ of people became ANGRY at the "mandated change", and some WONDERFUL DEVS forked Debian into Devuan [which is what I use for Linux] so we could keep SysV init on Linux.
The Gnome 3 devs *FELT* as if their feature creep was BETTER, and a *LOT* of people (including Linus himself) became ANGRY and some WONDERFUL DEVS forked gnome 2 into Mate so we could KEEP OUR DESKTOP AS-IS without the unnecessary and unwanted FEATURE CREEP (which was more like 'take away what we like and replace it with something we do not').
And a LOT of us STILL USE WINDOWS 7 because the FEATURE CREEP of 8-10 is NOT wanted, either.
SO I'd say *NO* - Wayland is *NOT* going to replace X11. Not NOW, not ANY TIME SOON. It's not even really ready for 'prime time' in and of itself...
Arrogant developers and their SHOVING FEATURE CREEP INTO OUR BODY ORIFICES... and TAKING AWAY WHAT WE WANT TO KEEP just to get security updates and fixes... THAT needs to STOP.
"If you're running 32bit Linux on your development machines at this point you're doing something wrong."
*cough* Raspberry Pi *cough* still runs 32-bit Linux
And 32-bit Linux works pretty well IN A VM
And 32-bit Linux is actually *slightly* *faster* if you're not running some PIGGY THING that WASTES RAM BY ITS VERY EXISTENCE [like maybe IntelliJ or chromium or anything written by Micro-shaft].
Or the worst: If it has NodeJS in it. Like "guess what" does!
If you have less than 4G of RAM, it's smarter to run 32-bit than 64-bit. And last I checked a LOT of things run really well on a system with less than 4G RAM, _INCLUDING_ developer tools. No need to spend $$$$ on bleeding edge hardware when "what you already have" can easily do the job... for those of us with budgets, whose mommy and daddy aren't paying for it, and when money is better spent elsewhere.
He was wrong and should've said x86, not simply all 32-bit arches.
I agree with this corrected statement. Nobody is using the 32-bit Intel arch anymore. Nobody sane, at least; it has been dead for a long time.
I hate that the writers here keep going on about it as if dropping a dead arch is something bad.
strangely enough, the "dead arch" makes sense in too many cases:
a) very low-end priced portable computers [let's say with Atom processors on them and limited RAM], sorta like 'netbooks' even. Usually those are ARM-based, but not always
b) embedded boards. often useful to do development on them directly if you can plug in a monitor, mouse, and keyboard. Some are still x86 arch (but most, like RPi, seem to be ARM based these days)
c) >15 year old (aka ancient) computers being recycled for new purposes. I've got a few of these lying about that can easily run an old version of Linux [and in some cases, do, and some have XP on them].
Some time ago I brought a 17 year old laptop on site for contract work, because I needed a Linux box to do embedded kinds of stuff with an RPi and they didn't have one (yet). I got MORE WORK done on that old thing, with 512M RAM and a 20G hard drive in it, than I could have on a BRAND NEW WIN-10-NIC box with the usual hardware load.
32-bit is NOT DEAD, just not all that popular right now, in places where people have plenty of money to spend on new stuff. But amazingly enough, with the developer tools that _I_ use, that 17 year old Toshiba laptop worked EXTREMELY WELL for the purpose that I needed it for. I could not run Firefox on it (with a responsive user interface, anyway), but I could do just about everything else I needed to do with it.
not so amazing, really, when you consider that Linux was designed for old klunky hardware, that it does NOT require the latest hardware and GIGABYTES of RAM and TERABYTES of DISK "just to load".
Keeping first-class support for older, "less cool" architectures is actually a good way of discovering bugs and other issues.
It is why the OpenBSD team support a massive number of architectures including VAX. Yes, no-one really uses VAX but it is a way of keeping quality up.
Also... developing countries get all our old PCs and thus they are generally x86. I don't particularly want to exclude any talented developers from these parts of the world and instead have to work with greedy fat kids because they are the only ones who can run the newest stuff.