Your wget is broken and should DIE, dev tells Microsoft
Well, that didn't take long: within a week of applause for Microsoft's decision to open-source PowerShell, a comment-war has broken out over curl and wget. For those not familiar with these commands: they're open source command line tools for fetching internet content without a browser. Apart from obvious applications like …
COMMENTS
-
Tuesday 23rd August 2016 01:57 GMT thames
Taking over the wget and curl names to provide something incompatible and far less capable was incredibly stupid. That in itself is a breaking change.
It's as if Microsoft kept substituting MS Paint every time you tried to run Adobe Photoshop, and then refused to stop doing it because now some people might be used to clicking on the Photoshop icon to run MS Paint.
The user response isn't a factor of open source. It's a factor of people having a communication platform to respond to problems which isn't controlled by Microsoft and can't be silenced by their PR people.
-
Tuesday 23rd August 2016 04:05 GMT Notas Badoff
Relating to the outside world
It is when Microsoft's world and the real world interact that we find out how narrow and small is Microsoft's understanding of, or experience with, the real world. Way back when it was considered by them a negative if you had any experience with other operating systems or non-native toolsets. I think that must still be true. This is self-inflicted, just like stack ranking. Very sad. Very predictable.
-
Tuesday 23rd August 2016 08:14 GMT Mage
Re: Applause and wget
I've had the windows build of wget for ages on XP, without ANY *nix shell or Powershell thing.
What applause? I never noticed any. This is a half-baked project.
I use Linux now for everyday netbook & workstation. XP, Win98 etc on some old HW for legacy HW I/O to program old gear via serial or parallel ports.
-
Tuesday 23rd August 2016 12:03 GMT John Sanders
When Microsoft's world and the real world interact...
It is not even that. PowerShell solved (more or less badly, and for the second time: wscript, anyone?) an issue that existed on the Windows side of things.
PowerShell does not solve any problem in Linux, this is not a question of PowerShell being better or worse, PowerShell does not provide anything on Linux that's not there already in a better more refined fashion.
-
-
Tuesday 23rd August 2016 04:59 GMT Robert Helpmann??
Too little, too late
"It's a factor of people having a communication platform to respond to problems which isn't controlled by Microsoft and can't be silenced by their PR people."
Such a shame it wasn't in place prior to Windows 8. Perhaps criticism of PS will convince them to scrap it and start from scratch. I mean, whatever they came up with would have to be better, right? Right?
-
Tuesday 23rd August 2016 08:00 GMT Sirius Lee
This seems like just an anti-Microsoft gripe from the Linux fundamentalists.
#1 PowerShell aliases are not commands on the command line except in the PowerShell command environment, so don't use it.
#2 PowerShell command aliases can be changed.
Don't want wget to do what it currently does? Then change the alias, which is a reference to a .NET assembly entry point.
There is an opportunity here for an enterprising developer to provide a PowerShell module to change the aliases so they more closely meet the expectations of Linux users.
Or on the other hand, we can be subjected to the usual whining expected from Linux users when confronted by anything from Microsoft.
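For what it's worth, a profile fragment along those lines might look like this. A sketch only: it assumes GNU wget.exe and curl.exe are already installed somewhere on PATH, which PowerShell does not ship.

```powershell
# Sketch for $PROFILE: drop the built-in aliases so external tools
# win name resolution.  Assumes wget.exe / curl.exe are on PATH.
foreach ($name in 'wget', 'curl') {
    if (Test-Path "alias:$name") {
        Remove-Item "alias:$name" -Force
    }
}
```

With the aliases gone, PowerShell falls back to resolving wget and curl as external commands, the same way cmd.exe would.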
-
Tuesday 23rd August 2016 09:46 GMT Roo
"This seems like just an anti-Microsoft gripe from the Linux fundamentalists."
We see the same complaints from MS Office lovers every time folks suggest LibreOffice can be used in place of MS Office. Plus in this case MS are intentionally ripping off brand names with the intention of fooling people into thinking they are using the real deal. I'm pretty sure the MS community at large wouldn't react any better to LibreOffice renaming their products Excel, Word, Access and Powerpoint.
-
Tuesday 23rd August 2016 12:11 GMT John Sanders
Wintards think everything must work like it does in Windows...
No, and a thousand times NO!
In a Unix box (not just Linux) the distinction between the shell and the utilities is: NONE. (The system's design makes automation-shell functionality an inherent quality of the OS)
The integration is provided by the environment itself, there is no requirement for a program to do anything to be integrated in the shell, and the shell doesn't need to do anything special to interact with a program and automate things.
To achieve object-level functionality as you do in PowerMeh! all that is required is to have something in the environment that the shell can use and there you go.
We call that Perl/PHP/Python.
I haven't bothered to check PowerMeh! for Linux, but if MS have done things correctly in the Unix style I should be able to leverage PowerMeh! and .net scripting from bash or ksh without having to do anything special.
However, I seriously doubt it (Microsoft doing things correctly): every piece of Unix-related stuff that I have seen from Microsoft has been a pile of crap.
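The integration-for-free argument is easy to illustrate in plain shell: any program that reads stdin and writes stdout composes with any other through a pipe, with no object model or registration step. A trivial sketch using only standard tools:

```shell
#!/bin/sh
# Find the most frequent word in a stream: five unrelated programs
# cooperate purely through text pipes, nothing shell-specific required.
printf 'the quick brown fox jumps over the lazy dog the end\n' \
  | tr ' ' '\n' \
  | sort \
  | uniq -c \
  | sort -rn \
  | head -n 1
```

Here "the" wins with a count of 3; swapping any stage for a Perl, PHP or Python one-liner works just as well, which is the commenter's point.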
-
Tuesday 23rd August 2016 15:28 GMT Ken Hagan
"Don't want wget to do what it currently does? Then change the alias which is a reference to .NET assembly entry point."
And then convince all of your customers to do the same.
You sound like all those people who say that <insert offensive desktop feature here> isn't a problem because I can change it. Yeah, but we aren't *all* hobbyists playing in our bedrooms, so the out-of-the-box behaviour matters. It is what our customers will be using whether we like it or not.
-
Tuesday 23rd August 2016 20:36 GMT Lars
"the usual whining expected from Linux users when confronted by anything from Microsoft.".
So how usual do you think this "confrontation" between Linux users and Microsoft is? Is it not rather Microsoft that is confronted with something they so desperately tried to kill? It must be at least twenty years since I read about Microsoft guys who claimed the shell was "killed" on Windows because people high up in the organization did not understand why a user would ever need it, and because it looked so old-fashioned.
The name PowerShell tells a lot about Microsoft, does it not. You might also notice that apparently some Linux users have voluntarily decided to try it out.
-
Wednesday 24th August 2016 00:18 GMT Trixr
I love PowerShell and use it hourly at work, but I think it's a bloody stupid idea from MS.
It's bad enough they aliased "ls" to Get-ChildItem without (again) the same functionality, but creating default aliases that mask well-known tools that exist outside the shell is stupid.
If you want to create your own aliases inside PowerShell, that's all good, and if you want to use a well-known name for whatever you've rolled, that's up to you. But introducing it as a default, stupid.
Thanks for giving the rest of us grown-up and real-world Windows admins a bad name (18+ years Windows experience, but I'm also a RHCT).
-
Tuesday 23rd August 2016 08:09 GMT Dazed and Confused
provide something incompatible and far less capable
MS don't need to break their broken SW here; they could use an environment variable to specify whether users want the "old" broken MS curl & wget or the real curl and wget. This is how Unix and Linux systems have dealt with these kinds of problems for years.
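Sketching that suggestion in shell terms — the variable name and both target commands below are invented for illustration, not anything MS actually ships:

```shell
#!/bin/sh
# Hypothetical dispatcher: pick a wget implementation via an env var,
# the same way EDITOR/PAGER let users choose tools on Unix.
# WGET_IMPL and both target commands are illustrative only.
run_wget() {
    case "${WGET_IMPL:-gnu}" in
        gnu) echo "exec /usr/bin/wget $*" ;;       # the real GNU wget
        ms)  echo "exec Invoke-WebRequest $*" ;;   # the PowerShell alias
        *)   echo "unknown WGET_IMPL: $WGET_IMPL" >&2; exit 1 ;;
    esac
}
WGET_IMPL=ms
run_wget -O page.html http://example.com/
```

Defaulting to the GNU tool and opting in to the alias (rather than the reverse) is the choice Unix convention would suggest.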
-
-
-
-
Tuesday 23rd August 2016 08:20 GMT bombastic bob
Re: Nothing new
"People still used FTP?"
well windows didn't have an ssh implementation for using scp until very recently. And I have to wonder whether or not any of the "standard ssh features" are BROKE-DICK in the *new* PowerSmell
except, of course, cygwin. scp works fine there. so does rsync. yeah.
-
Tuesday 23rd August 2016 14:51 GMT Anonymous Coward
Re: Nothing new
"well windows didn't have an ssh implementation for using scp until very recently. "
But it supports https which is more firewall friendly...You can use Powershell to copy files from the command line - for instance:
Invoke-Command -ComputerName MyServerName -UseSSL -Credential $cred -ScriptBlock { Get-Content -Encoding Byte -ReadCount 0 "C:\windows\system32\notepad.exe" } | Set-Content -Encoding Byte c:\temp\foooooooo.exe
-
-
Tuesday 23rd August 2016 15:38 GMT MacroRodent
Re: Nothing new
>People still used FTP?
I still often find it to be the only common way to move files between unlike systems. Even if a better alternative is available for some OS, it may not have been installed by whoever is in charge of the system I need to communicate with. Or there is a stupidly configured firewall blocking the way for other methods. I don't think FTP is going away any time soon...
-
Wednesday 24th August 2016 13:25 GMT Seajay#
Re: Nothing new
People still used FTP?
Maybe I'm feeding the troll here but I'm going to do it anyway. FTP is great because it's an old universally established standard, and that trumps almost all other considerations. As an example, after spending a long time trying various cloud file sync services for home use I've discovered that what I really wanted all along was just an ftp server running on my main machine. Because lots of applications support ftp they can do the sync, merge and conflict resolution stuff at application level which works enormously better than a sync client which presents applications with what appears to be a local file system then tries to sort out the conflicts itself (which is fundamentally impossible if I have made changes to, for example, a keepass file on two separate machines).
Integration is one of the most irritating bits of IT. 90% of the time, what you lose to network inefficiency by using FTP over something like rsync is insignificant compared to the hassle of getting more sophisticated systems to actually bloody talk to each other.
-
-
-
Tuesday 23rd August 2016 05:52 GMT Flocke Kroes
If only ...
If only there were implementations of curl and wget available in source code form on the internet that could be compiled for Windows and distributed for free. Microsoft would be able to ship software that has survived decades of testing by demanding techies without having to go to the expense of creating, testing and debugging their own versions.
-
-
Tuesday 23rd August 2016 16:21 GMT Carlie J. Coats, Jr.
Re: If only ...
Back many years ago, at the press conference where they introduced Windows NT, the Microsoft marketroids doing the presentation claimed, "It complies with POSIX. In particular, it has the Korn shell, 'ksh'."
When one of the audience claimed, "No, you've got ksh semantics wrong", the marketroid replied, "No I assure you that it is correct."
"No, it isn't."
"Sir, I assure you that our experts have looked at it, and..."
Interruption from another member of the audience, "Look, asshole -- that's David Korn himself!"
So this broken wget is typical Microsoft.
-
-
-
Tuesday 23rd August 2016 05:57 GMT Steve Davies 3
They are quick to shutter services
when they "embrace" a company (absorb them into the MS-Borg) yet when they clearly make a balls up they say that it will have to go out to RFC because someone might have used their shite in a week?
And because they want to play by the rules....
After a while the RFC will be forgotten and people will accept the MS versions of wget and curl as the de facto ones.
I just get a feeling this is the first attack in their plan to subvert the FOSS movement from the inside.
If that is the taste of MS to come then watch out FOSS.
come on Redmond, just pull the PowerShell release from GitHub until you fix it properly. Otherwise folks, use at your own risk, wearing a full Hazmat suit and wielding a 40ft barge pole.
-
Tuesday 23rd August 2016 12:22 GMT John Sanders
Re: They are quick to shutter services
>> Otherwise folks, use at your own risk with wearing a full Hazmat suit and a 40ft barge pole.
No one other than testimonials will use PowerMeh! in Linux because there is no god-damn point to it in the first place!
This is like trying to come out with a device for voice communication over wires with a limit of 100m and call it "talkphone" when the world already has "telephones" that go over any distance and can even be wireless.
-
Tuesday 23rd August 2016 14:53 GMT springsmarty
Re: They are quick to shutter services
> No one other than testimonials will use PowerMeh! in Linux because there is no god-damn point to it in the first place!
Disagree. From time to time I need to maintain some stuff in Azure via Powershell, and up until now I had to keep a Windows VM instance just for that.
-
-
Tuesday 23rd August 2016 06:10 GMT Neoc
Usual Microsoft behaviour
Take a well-known non-Microsoft item (curl, wget, Kerberos), make it available in Windows, change it so the Windows version is not compliant with the non-Microsoft version, refuse to make the Windows version compliant because "it might break our shit", push the Windows version of said item, call the non-Microsoft version "broken" because it doesn't play well with Windows' version.
-
Wednesday 24th August 2016 08:59 GMT oldcoder
Re: Usual Microsoft behaviour
The order is a bit off:
1. change it so the Windows version is not compliant with the non-microsoft version
2. make it available in Windows
3. claim the Windows version is compliant
4. refuse to make the Windows version compliant
5. push windows version of said item
6. call non-Microsoft version "broken"
Hey - works for AD, being broken with LDAP/Kerberos/Bind.
Microsoft doesn't even know how its own software works as MS had to get help from the Samba project for the EU mandated documentation.
-
Tuesday 23rd August 2016 06:23 GMT Ralph B
Embrace, Extend, Extinguish
This is a classic Microsoft tactic.
The strategy's three phases are:
- Embrace: Development of software substantially compatible with a competing product, or implementing a public standard.
- Extend: Addition and promotion of features not supported by the competing product or part of the standard, creating interoperability problems for customers who try to use the 'simple' standard.
- Extinguish: When extensions become a de facto standard because of their dominant market share, they marginalize competitors that do not or cannot support the new extensions.
It looks like nothing has changed under Nadella. Zero, zip, zilch, nada. (I guess the clue was in the name.)
-
Tuesday 23rd August 2016 08:36 GMT Lee D
I assigned my tech a task of turning a Server Core installation into a Server GUI installation remotely using nothing but PowerShell. This involves getting the original Server ISO onto the machine somehow and then running commands that add on the GUI features from the files on that ISO.
They weren't allowed to use any file they couldn't download from the PowerShell interface itself, though, so it became a real chore.
The "fake" wget command on PowerShell craps out on anything unusual. You can't get large files. You can't get secure files properly. You can't resume. You can't do anything with it.
In the end, the documentation started with a step that used PowerShell "wget" to download the REAL wget (from a plain HTTP website), and then carry on from there. Anything else was just too much messing about with strange errors and failed downloads.
-
Tuesday 23rd August 2016 09:46 GMT Naselus
"I assigned my tech a task of turning a Server Core installation into a Server GUI installation remotely using nothing but PowerShell. This involves getting the original Server ISO onto the machine somehow and then running commands that add on the GUI features from the files on that ISO."
Since this task can be done with a single novice-level line in Powershell, it appears more likely that either 'your tech' doesn't know how PS works, or else you don't and made this whole story up to win anti-MS street cred. Your 7 upvoters are likely in the same boat.
-
Tuesday 23rd August 2016 12:24 GMT Lee D
It's a single command. When you have the Windows Server DVD present and/or the massive wim file nearby. How do you do that on a remote server with no physical access? Download the iso? How? Wget? OH LOOK ... THAT'S WHAT WE'RE TALKING ABOUT!!!
Before showing yourself to be a know-it-all idiot, read the full post at least.
-
-
-
Tuesday 23rd August 2016 10:50 GMT phuzz
Re: Another example
The alias of 'wget' for 'Invoke-WebRequest' has been around for years (it was added in Powershell 3.0).
This has only become a problem since MS added a bash shell last month and the 'real' versions of wget and curl became available, and they all have different syntaxes.
The simple solution is to remove the alias*. MS is hesitant to do that in case someone has a script using wget as an alias. Although in that case, maybe you should just find/replace on your script and fix it your own damn self.
* "remove-item alias:wget -force" if you want to do it yourself.
-
Wednesday 24th August 2016 21:33 GMT John Brown (no body)
Re: Another example
"The alias of 'wget' for 'Invoke-WebRequest' has been around for years (it was added in Powershell 3.0)."
I suppose the real question is, why the hell did MS use a unixoid alias for an existing powershell command in the first place? Consider the MS attitude to Unix in general and Linux in particular, especially back when PS3.0 was introduced.
-
-
-
-
Tuesday 23rd August 2016 09:58 GMT Roo
"but when scripting shouldn't one test and declare exactly which executable you want to be running as opposed to relying on a user shell environment to be set up correctly?"
In most cases I would say "no" because the users may well have their shell env setup with the intention of using non-standard executables (eg: if they are cross compiling) and that kind of environment testing code renders scripts pretty much unreadable. If you really want that kind of thing I think it should be put into a dedicated environment setup+validation script.
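A minimal version of that dedicated setup-and-validation script might look like this (the tool list is obviously per-project):

```shell
#!/bin/sh
# Validate the environment up front rather than scattering checks
# through every script: report everything missing, then fail fast.
require_tools() {
    missing=0
    for tool in "$@"; do
        if ! command -v "$tool" >/dev/null 2>&1; then
            echo "missing required tool: $tool" >&2
            missing=1
        fi
    done
    return $missing
}
require_tools sed sort tr || exit 1
echo "environment OK"
```

Sourcing something like this at the top of a build keeps the main scripts readable, which is the trade-off being argued for above.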
-
Tuesday 23rd August 2016 19:28 GMT Destroy All Monsters
That's the idea. You want to leave the user the possibility to soft-replace an executable using PATH if he is in the mood of pretending to know what he's doing.
Except for the shebang line, where you want to (actually, must) use #!/bin/bash or something (some people use "#!/usr/bin/env bash", which is completely bonkers in a pretend-more-flexible-than-you way).
-
-
Wednesday 24th August 2016 10:15 GMT Gerhard Mack
I have had executable locations change based on age of the Linux Distro, and then there are things like difference in the way Linux/FreeBSD/Darwin organises things. I have found that hard coding the shell's location breaks far more often than breakage caused by someone doing strange things on their system. If I write the script conservatively and don't use bleeding edge features, I can count on it functioning on most of the systems people try to run it on.
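The portability argument in runnable form — a sketch that writes a throwaway script to /tmp and assumes bash is installed somewhere on PATH:

```shell
#!/bin/sh
# "#!/usr/bin/env bash" asks env to find bash via PATH, so the script
# runs whether bash lives in /bin, /usr/bin, or /usr/local/bin (as on
# the BSDs), without editing the shebang per system.
cat > /tmp/portable_demo.sh <<'EOF'
#!/usr/bin/env bash
echo "running under bash $BASH_VERSION"
EOF
chmod +x /tmp/portable_demo.sh
/tmp/portable_demo.sh
```

The cost, as the reply below notes, is trusting whatever bash PATH happens to find first.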
-
Wednesday 24th August 2016 20:12 GMT Destroy All Monsters
1) How often did the shell move from /usr/bin/bash and /bin/bash lately?
2) Code "conservatively" is actually an alias for "not using features that make shell scripting less horribad". Are you still coding in original Perl? Why suffer needlessly for a couple of survivalist stuckists loudly proclaiming their insanity on the interwupps, who insist on "original sh code" and who are unlikely to ever encounter your script in any case? It's rank masochism. And you don't even get a woman traipsing all over you on the plus side.
-
Thursday 25th August 2016 21:51 GMT Roo
"1) How often did the shell move from /usr/bin/bash and /bin/bash lately ?"
I'll grant you that one, and even if the shell does move it's not a show stopper - easily fixed/worked around/bodged etc.
"2) Code "conservatively" is actually an alias "not using features that make shell scripting less horribad"
I think you're being a bit hard on folks here. I use ksh & bash on a daily basis, so I tend to restrict myself to using features common to both simply because I c.b.a with writing a script twice. Besides if I need the stuff bash brings to the table (as handy as they may be) the chances are I should be working in Python instead. :)
-
-
-
Tuesday 23rd August 2016 10:14 GMT Cem Ayin
Cool down
I know I'll get massively downvoted for this, but command name conflicts have been around in unixoid operating systems for ages...
Are you old enough to remember the conflict between 'rsh' (restricted Bourne shell) and 'rsh' (remote shell)? Some vendors solved this by putting the two binaries in different directories, but HP did rename the remote shell binary to 'remsh' (at least in the versions of HP-UX I have known, to wit 7.x - 9.x) which was not good for cross platform script compatibility.
Even worse for cross-platform scripting was their decision to embed the functionality of 'nawk' into 'awk' and do away with the 'nawk'-command altogether.
Another blunder of the sort that comes to mind is the decision of Mr. Thorvalds himself to name the Linux system call tracer after the SysV STREAMS tracer thus giving 'strace' the functionality of 'truss'...
The GNU project has been providing incompatible versions of POSIX commands under the original names for as long as it exists.
And let's not talk about the variety of shells that you find under /bin/sh in different versions of Unix or Linux (these days, many script writers naively assume that /bin/sh is always linked to /bin/bash only to find their scripts fail on non-Linux or Debian-based systems). Solaris even has two distinct versions of the Bourne shell (in /usr/bin and /usr/xpg4/bin, respectively) in order to be compatible with its early versions as well as with the official standard.
These are just the examples that I can think of OTTOMH.
As the saying goes: "The wonderful thing about Unix standards is that there are so many to choose from". Feel free to blame Microsoft, but don't forget to add at least AT&T, UCB, Sun, HP, IBM, SGI and Linus Thorvalds to the list...
-
Tuesday 23rd August 2016 15:29 GMT Lennart Sorensen
Re: Cool down
As clearly documented at the bottom of the strace man page, strace was written for SunOS and inspired by the trace tool. It was ported to Linux later, and then many features of truss were added to it. It did not start out with all the truss features at all.
Mr. Torvalds had NOTHING to do with that. Neither did anything Linux-related, for that matter.
As for GNU versions of commands, at least they do implement the required POSIX features, and then add to them. The aliases in this case have hardly any of the functionality of the tools they claim to be. You can treat the GNU tools as POSIX and they will do what you wanted; you can't do that with the PowerShell aliases.
-
-
Tuesday 23rd August 2016 12:02 GMT SImon Hobson
Isn't it rather quaint that MS says it can't change something without consulting users. What f****ing hypocrites. They go out of their way to change sh*t all the time regardless of how much users complain.
People were mostly happy with the XP UI - so they changed it to something else. People were mostly happy with the Win7 UI, so they changed that. OK, people really loathed the Win8 UI, and they did change that - just not for anything much better.
Then there's the ribbon bar in Office. And just don't get me started on that pile of steaming manure that is their online Sharepoint offering where they keep fooking about with the UI - usually for the worse.
-
Tuesday 23rd August 2016 12:08 GMT Swarthy
Given the second part of your handle
You should understand that MS gives all of their users a choice - Hobson's choice.
-
-
Tuesday 23rd August 2016 15:25 GMT GrumpyOldMan
Reminds me of a very very old joke from the 90's...
How many Bill Gates does it take to change a lightbulb?
None - he just redefines 'Dark' as a new standard.
So - 2 new 'Standards' coming up then ...
(Note - there's an alternative answer: One - he holds the bulb and the world turns around him)
-
-
Wednesday 24th August 2016 20:05 GMT Ken Hagan
Re: Reminds me of a very very old joke from the 90's...
To be honest, I'm not sure that the C standard at the time (C89) *did* allow it. There was a pretty strong presumption that "long" was the longest integer type. Since ptrdiff_t and size_t had to be 64-bit, that meant inventing a long long that could be used for them, thereby breaking an assumption that pretty much all C programmers had made for the previous quarter century.
The tragedy is that it was so unnecessary. Porting from Win32 to Win64 was going to be a line-by-line re-write no matter what you did. (MS introduced a plethora of COBOL-esque typedefs to help, but none of them really helped you any more than size_t and ptrdiff_t.) Keeping long as 32-bit merely forced you to re-write for Win64 differently from Unix64.
Perhaps that was the plan. Assume that Win32 shops everywhere would have the porting resources for just one re-write and then fix the rules so that this re-write only targets one 64-bit platform. Then sit back and hope that everyone chooses Win64.
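The model split under discussion is visible from the shell on any POSIX system; this shows the local answer only, since the whole point is that Win64 chose differently:

```shell
#!/bin/sh
# getconf LONG_BIT reports the width of the C "long" type.
# An LP64 Unix prints 64; Win64's LLP64 model keeps long at 32 bits
# while pointers, size_t and long long are all 64-bit.
echo "bits in a C long here: $(getconf LONG_BIT)"
```

That one number is why the same C source needed different re-writes for Unix64 and Win64.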
-
Thursday 25th August 2016 22:06 GMT Roo
Re: Reminds me of a very very old joke from the 90's...
"My favorite "Dark" by Microsoft: letting the type "long" be 32 bit on 64 bit systems.
Yes, the C-standard allows it. It's still insane for a general purpose computer."
I can understand the antipathy - the 32-bit long/64-bit pointer model was actually employed on 64-bit RISC boxes before MS had got around to using 32 bits properly. The aim was to reduce the memory footprint of apps - and thereby get fewer cache misses and more performance. Believe it or not, it did actually work in some cases. Personally I found the existence of long long more irritating, and refused to play the game, using things like int64_t instead. :)
-
-