A lucky 10 %
As of last week, a lucky 10 per cent .......
Lucky? Are we sure about that?
Feeling a little befuddled and out of sorts as your summer holiday comes to an end? That's nothing compared to confusion spilling from the Windows Insider team in this week's roundup of Microsoft news. Throttling 19H2 The merry-go-round of version numbering continued as Microsoft pushed out 19H2 (October's Windows 10) to the …
I knew it. The latest and greatest Microsoft OS of the 3rd Millennium is still completely incapable of handling user data.
Why can Microsoft not understand that user data needs to be on its own partition?
I think, at this point, it has to be genetic. There's some sort of biological block in the brains of people working in Redmond. They just can't fathom the idea of having one partition for the OS and one for the data. Kind of like asking a cockroach to conceive the notion of flight.
It's all the more incredible given that today's hard disks and SSDs are thousands of times larger than what existed when Windows first saw the light of day and, on top of that, Windows has a built-in partition manager that, while basic, is still quite functional. So it's not like Redmond doesn't know that partitions exist. Hell, Windows will create two of them for you right on install. So why not add a third one?
Nope, they can't handle that.
Oh, how quaint. IIRC, I first came upon dynamically allocating space to a filesystem around 15 years ago with DEC's AdvFS. Saved me a good number of times back then.
Linux can do it as well. Poor Windows (as installed on PCs) still lagging behind, then.
They really do need to get with the times, but it will probably cost you $49.99/month just to have the feature when they get their finger out.
Linux can [expand filesystems] as well. Poor Windows (as installed on PCs) still lagging behind then.
IIRC the filesystem support has been in NTFS since day one (all FS structures are stored in files, so they can be extended), and you've been able to grow partitions to fit free space since at least Windows 7 via Disk Management.
It's not ZFS/BTRFS/APFS-style volume growing, but then neither is extending an ext2/ffs partition.
(Personally, I wouldn't be happy extending any FS that doesn't natively support it, but then I wish most FSs were ZFS these days. :) )
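For anyone who wants the command-line route rather than the Disk Management GUI, the same grow-into-adjacent-free-space operation is a few lines in diskpart. A rough sketch only: the volume number below is hypothetical, use whatever `list volume` actually shows for your data partition.

```
DISKPART> list volume
DISKPART> select volume 2
DISKPART> extend
```

The bare `extend` consumes all contiguous free space immediately after the selected volume; it won't touch anything else on the disk.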
Don't forget that repository where all sorts of crap gets put and never deleted. Is it any wonder that users complain about lack of disk space?
I'm talking about AppData.
How many GB of shite is in yours eh?
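If you'd rather have an actual number than a guess, a few lines of plain Python will total a directory tree. A sketch, not gospel: the AppData path at the bottom is the usual Windows location and is an assumption about your setup, but the function itself works on any folder.

```python
import os

def dir_size_bytes(root):
    """Walk a directory tree and total the size of every regular file."""
    total = 0
    for dirpath, _dirnames, filenames in os.walk(root):
        for name in filenames:
            path = os.path.join(dirpath, name)
            try:
                total += os.path.getsize(path)
            except OSError:
                pass  # broken links, files locked by other processes, etc.
    return total

if __name__ == "__main__":
    # Assumed Windows layout: AppData under the user profile directory
    appdata = os.path.join(os.path.expanduser("~"), "AppData")
    print(f"{dir_size_bytes(appdata) / 1024**3:.1f} GiB")
```

Prepare to be horrified by the result.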
MS, by not making the move to a separate partition, is hammering it home that USERS DON'T MATTER in their world. The user is only there to provide data for them to slurp.
Counting the days until I get laid off (job outsourced to India and me made redundant) and I can retire from using Windows once and for all. I won't regret it one little bit.
My Linux system (running Arch) does everything I want from a computer and more.
Don't even get me started on the WinSxS directory: on a well-patched machine that's gone three years between nuke-and-reloads, that folder will outweigh every other folder in size. And you can't really get rid of anything in there, because 'backwards compatibility'...
Hmm, just checked mine - it had 9GB in it (W10 reinstalled in June). Used the Disk Clean-Up method shown here:
https://www.laptopmag.com/articles/clean-winsxs-folder-to-save-space
Result afterwards was 6.4GB - so it IS possible to reduce the size of that monster a little bit.
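For what it's worth, the same component-store cleanup can be driven from an elevated command prompt with DISM; these switches are the documented ones, though how much space you actually get back varies from machine to machine.

```
Dism.exe /Online /Cleanup-Image /AnalyzeComponentStore
Dism.exe /Online /Cleanup-Image /StartComponentCleanup
```

The first just reports whether cleanup is worthwhile; the second does the deed.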
"They don't want you to put your data on the HDD but in their cloud."
This is shown by how hard it is to network Windows 10 compared to Windows 7.
1. Windows 7 had something called HomeGroup; not something I use, but it was there to assist normal users with file sharing over the LAN. They've got rid of it in 10. I don't even think WORKGROUPs work any more.
2. If a Windows 10 machine crashes or is turned off while running, then on the next boot it has a major fit: it will attempt to repair and then fail, losing the user's data and the OS. This was a problem in the days of DOS but was fixed with NTFS, so why is it back again?
I've increasingly had the feeling over the last decade or so that the (few) low-level OS devs who knew what they were doing at MS have left and been replaced by over-promoted web monkeys. Unfortunately the same seems to be happening at Apple too these days. I guess the old school is losing talent to places like Google and Amazon now.
"Why can Microsoft not understand that user data needs to be on its own partition ?"
(I put my user data on /home and depending on the system, it very well COULD be on a different partition)
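In case it helps anyone setting up the same arrangement: a separate /home is one line in /etc/fstab. A hypothetical fragment only; the UUID placeholder stands in for whatever `blkid` reports for your actual partition, and ext4 is just an assumed filesystem.

```
# /home on its own partition; survives OS reinstalls untouched
UUID=<uuid-of-home-partition>  /home  ext4  defaults  0  2
```

Most distro installers will also do this for you if you pick manual partitioning and assign a partition the /home mount point.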
For Micro-shat, you have to remember, it's not YOUR computer, it's THEIR computer, over which they now have complete control for updating and installing things [even if you do not want it].
So by THEIR standards, you use that computer by THEIR grace, at THEIR whim, and from THEIR blessings upon you. You should bow down and worship, sing praises to their names, grovel at their feet, accept their ads without question, and buy their advertised merchandise. Or at least, that's how it appears (to me) for what they want and expect...
There is so much that Microsoft does that has never made any sense. One of them is the mixing of user data with the operating system. On top of that, you have that stupid registry for the operating system, program configurations, and even some user data.
In the early days of Windows (circa 3.x), the programs I had the least problems with were those $5 to $10 applications that came on a floppy, whose programmers just ignored the Microsoft recommendation of putting their DLLs into the same location that Microsoft put theirs. There was no Microsoft clearing house that I was aware of for programmers to avoid conflicts in the naming of DLLs. The developers of the cheap software often just put their DLLs into the same directory tree as their program, along with all the configuration files. If you wanted to move the program to a new computer, you just copied the directory across. Or, if you had to reinstall the operating system, you could just copy back the program directory and all your configurations were retained.
I have been using various flavors of Linux for almost 20 years now (I still use Windows for some tasks), and it is so nice to be able to install a completely different flavor/version/distribution of Linux and still have my data there (even though I do a backup just before doing that, for safety reasons).
I do have a Linux computer set up at home as a home server, so most of my data is kept there and all of my computers can access it.
No. Not on the machines I build. One HDD for the (usually Linux) OS and software, and a second, separate HDD for user data. In case the system HDD crashes.
If you are really nervous about data security, you can even leave the data drive unmounted except when you actually want to save fresh data to it. But never forget to backup, backup, backup.....
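A sketch of that "mounted only on demand" setup on Linux, with a hypothetical device name and mount point; the noauto option is what keeps the drive unmounted at boot.

```
# /etc/fstab: data drive known to the system but not mounted at boot
/dev/sdb1  /mnt/data  ext4  noauto,defaults  0  0
```

Then it's `mount /mnt/data` before saving fresh data and `umount /mnt/data` when you're done.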
"Kind of like asking a cockroach to conceive the notion of flight".
Believe me, there are places in this world where giant cockroaches HAVE come to terms with the notion of flight in a rather dramatic way. I used to live in one of these places.
But Windows Explorer still doesn't know how to properly deal with folder-mounted partitions. If you have a 1 GB D: and a 1 TB partition mounted on D:\Data\, using Explorer to copy a file larger than 1 GB from elsewhere to D:\Data\ fails with an insufficient-disk-space error.
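The odd part is that asking the OS directly gets the right answer; Explorer appears to check the drive root instead of the destination folder. A small Python sketch of the correct check follows; the D:\Data scenario in the docstring is hypothetical, and the function works for any path.

```python
import shutil

def enough_space(dest_dir, needed_bytes):
    """Report whether the filesystem actually backing dest_dir has room.

    shutil.disk_usage() asks the OS about the path it is given, so a
    volume mounted on a folder (e.g. a large partition at D:\\Data)
    reports its own free space, not that of the small D: drive above it.
    """
    return shutil.disk_usage(dest_dir).free >= needed_bytes
```

With the setup described above, `enough_space(r"D:\Data", 2 * 1024**3)` would come back True thanks to the 1 TB mount, even though D: itself is only 1 GB.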
I know it's not exactly on topic, but SWMBO has been tearing her hair out recently because an upgrade of Office at work (and we're talking probably thousands of users) means that files created in older versions of Word/PowerPoint/whatever don't load properly in the new version (missing master-page elements, for example, even in compatibility mode), and files created or updated in new versions of those apps won't load at all on machines that haven't yet been upgraded.
They have still to complete a W7-to-W10 upgrade, and I hate to think how that's going.
What is it with MS? This sort of shenanigans was solved years ago for most other OSes...
Given how many Windows updates get pushed to the whole wide world of Windows users and then cause problems that have to be fixed or rolled back, it could be argued that their system for testing updates doesn't work well enough. I mean, if everyone who lost time and/or data to a pushed update could invoice Microsoft for their losses, I guess it would soon get changed. But even so, every failed update released to the public lowers Microsoft's reputation. So why don't they find another way?
"So why don't they find another way?"
Pick one or more:
1) Profit is more important than delivering a properly tested OS.
2) They are the only game in town for most businesses. Linux just isn't there yet for most of them.
3) Having users do the testing (forced, obviously) is cheaper than doing it themselves.
"2) They are the only game in town for most businesses. Linux just isn't there yet for most of them."
It's a case of weighing up the costs and benefits. For some people Windows is the only choice, but even they may not have to use Win 10. For others, it may be that they think it will be too hard to move to Linux when it would actually be worth it.
When it comes to working with other people then it depends if you're calling the shots. If you can say "We're using Libre Office, get used to it" then other people can fairly easily switch to Libre whilst staying on Windows.
Effort needs to be put into WINE to make important programs work perfectly. The developers themselves could do this more easily than a third party. However, I suspect there is some kind of Microsoft developer lock-in that prevents them from making their DLLs available under WINE on Linux.
Rather than having to fumble for media or hope that no super-secret restore partitions have been deleted in the quest for more disk space, Windows 10 will now download and install a fresh copy of the currently installed version of the OS.
So, how does that work, then? If I've deleted the super-secret restore partition, where does the code to do the download come from? Won't this require firmware support?? (As I believe Apple does*)
* aside: Oh what fun if you've bought a second hand mac and you don't have the version of the OS it ships with tied to your iCloud account.
I know that this is off topic for this thread, but this is wrong:
"Oh what fun if you've bought a second hand mac and you don't have the version of the OS it ships with tied to your iCloud account."
You can download a .dmg file that contains the whole macOS installer. There are plenty of instructions on the Internet about how to convert that to a .iso, which can be burnt to a DVD and installed without going anywhere near an iCloud account.
How else do you think that people build Hackintoshes?
Oh, and you can de-authorise a computer from iCloud. Again, the instructions are out there if you care to look. I sent an old MacBook (2009) to recycling and did just that before wiping it.
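For reference, the conversion mentioned above is typically done with Apple's own hdiutil. Filenames here are hypothetical; the UDTO format writes a .cdr file, which you simply rename.

```
hdiutil convert InstallMacOS.dmg -format UDTO -o installer
mv installer.cdr installer.iso
```

From there it's a standard burn-to-DVD or write-to-USB job.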
"Oh what fun if you've bought a second hand mac and you don't have the version of the OS it ships with tied to your iCloud account."
You can download a .dmg file that contains the whole MacOS install. There are plenty of instructions on the Internet about how to convert that to a .iso which can be burnt to a DVD and installed without going anywhere near an iCloud account.
Sorry, I wasn't clear; what I meant was... 'Oh what fun it is trying to use internet restore if it tries to restore a version of the OS you don't have tied to your iCloud account'
I have a 2012 Air, and if completely hosed it will try to download the version of OS X it shipped with, which I don't have a licence for, so it fails.
Downloading the dmg and building a USB/SD bootable media is indeed the solution. :)
I thought the software entitlement followed the hardware in this case? I've re-imaged several second hand macs from network recovery and had nary an issue. In fact, I am not even asked for an iCloud account until after the OS has installed, so I don't really understand your issue.
"I thought the software entitlement followed the hardware in this case? I've re-imaged several second hand macs from network recovery and had nary an issue. In fact, I am not even asked for an iCloud account until after the OS has installed, so I don't really understand your issue."
Well, colour me embarrassed; just sacrificed the MB Air to test it, and you're correct.
Although it also downloaded Mojave, which wasn't out when I last tried it.
But it's also possible I'm an idiot too. :D
* aside: Oh what fun if you've bought a second hand mac and you don't have the version of the OS it ships with tied to your iCloud account.
I don't know, I did exactly that. Picked up a used mid-2010 MBP and immediately wiped the HDD because I wanted to start the system out fresh (and whatever the former owners had on it was none of MY business). First tested it out with a Mint live DVD to see that it was functional (and that was when I wiped the disk). Figured I'd have to order a reinstall disk, or have a colleague with a Mac pull down a reinstall image. Then I found out about the internet reinstall option. It *did* take 3 or 4 retries until it started doing the install, but surprisingly it worked.
Now if only Apple's hardware hadn't turned to shit in the past 8 years.
I bought a Vive on spec and was happy with my decision. Would not buy a hololens without getting my hands on one.
Also, by the way: when I found out Microsoft was killing HL1 updates, I didn't even know it was officially for sale, and I stay up to date with this stuff, so not terrifying in any way, Microsoft. There's a company that should know how to do hardware, has demonstrated it knows how to do hardware, and is then utterly tone-deaf once they've sold it to you. It may or may not be up to standards, but that's no excuse. Who are you, Apple?
The previous upgrade was held back for testing (still in testing, for that matter), as the upgrade before fouled up the authentication-device drivers, resulting in the baseline machine configuration for all users failing to be allowed to load drivers for their authentication token.
And it wasn't fixed in their newer release either, as the most current drivers aren't used until they've been vetted, and they're still being vetted.
Perhaps it's time to retrain the users on a new OS and platform, as most of our utilities and primary applications are either web-based or terminal-based.
Leaving Microsoft minus its largest government contract.
It'll all depend upon how the release works in the testing environment. The entire leadership team isn't very fond of deleterious effects to the enterprise.
This kinda tears the ring out of it, just a bit, dontcha think? A lot of mileage from very little material, is what this article is.
It's not like Apple is any better at avoiding confusion, either, now is it? Version numbers, animals and place names, wines - or not, who knows - and keys that are labelled one thing and called something else...
I have a relatively new computer, and every few weeks, it seems, the thing updates itself peremptorily, and every time I shudder about what the consequences will be for the usability of my computer and my applications.
What will it do this time? I ask myself. The last update, a few weeks ago, took smooth scrolling away from my Word for Windows 2003 (which I keep using because of a massive investment in VBA applications that would take me hundreds of hours to convert). It used to scroll properly; now it's all jittery and almost unusable in this respect.
It also has a tendency to restart of its own volition "for updating", sometimes while I am in the middle of a session or have popped to the kitchen or bathroom, and I lose my data.
Every time I back up onto hard drives, it crashes with a BSOD, and now it doesn't just crash right after the backup: it *chooses* when to do so, e.g. four hours later, so that I am completely unprepared and lose more work.
This is a terrible OS, and, frankly, the attitude behind it feels like an insult.