Microsoft has opened the order books on the fourth generation of its Surface Laptop, replete with Intel-baiting AMD chippery in the line-up. Blessedly free of an overexcited Microsoft bigwig describing himself as "pumped" at the sight of some relatively pedestrian hardware, Microsoft's Surface Laptop 4 has arrived in 13.5 and …
So you mean I might actually be able to get stuff done. That doesn't sound like much of a deal breaker to me.
And before the downvotes flood in, let me tell you my recent experience of "running" Linux. Which happened yesterday.
I have an Ubuntu machine provided by work. I needed to compare performance of one of our applications between Windows and Linux. Since working from home this last year, I tend to have the work machines (two Windows, one Linux) stuffed in another room and remote desktop into them from the main Windows machine in my office as and when I need to.
So I boot up the Linux box, remote in, pull and build the latest code. The application doesn't work. There are at least two obvious problems. 1) OpenGL is being initialised with an emulated GPU which only supports an older version of OpenGL. 2) The CUDA drivers are out of date.
Number 2 is easy to solve. (For a given value of easy.) I jump on the nvidia website and download the latest drivers. They come as a .run script. Which opens by default in gedit. Switch to a terminal, cd to the download directory, and try to run the script. Auto-complete doesn't find the file, so I end up copying and pasting the filename. It doesn't run. Oh, I need to make the file executable. Of course, how silly of me, expecting to be able to run an install script. Change the permissions. Run the script. It throws an error saying the nvidia drivers are in use and can't be updated. It mentions that if I'm aware of the risks, I can ignore this error and run the installer anyway, but it doesn't tell me how to actually do that. I dick around trying to figure it out (maybe if I remote in using SSH instead of remote desktop it'll let me run it), but after half an hour or so I give up.
I open the Software & Updates control panel and see that there's actually a newer version of the nvidia drivers available "officially". So I check that option and click apply. Enter my password (twice for some reason). The install fails. Missing dependencies that won't be installed. I go online and search for remedies for more time than should be necessary. No solutions present themselves. I wonder whether I can update the drivers manually through apt. Search for the package name, and try. Same error. Can't install because of missing dependencies. Waste another half an hour or so trying to figure out why not. Eventually I discover that it might be due to corruption. "sudo apt --fix-broken install" followed by "sudo apt autoremove" followed by the actual install command finally works.
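For anyone hitting the same wall, the recovery sequence described above boils down to a short shell sketch. The package name below is a placeholder (the post doesn't name the actual package); substitute whatever driver version `apt search nvidia-driver` reports on your system:

```shell
# Repair an apt/dpkg state with unresolved dependencies, then retry the install.
sudo apt --fix-broken install       # complete or roll back any half-installed packages
sudo apt autoremove                 # clear orphaned dependencies left behind
sudo apt install nvidia-driver-470  # placeholder: use the driver package for your release
```

The order matters: until `--fix-broken install` has cleaned up the dependency state, apt refuses to install anything new, which is why the direct install attempts kept failing with the same error.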
Two hours (ish) later.
After that I gave up trying to figure out why remote desktop can only use an emulated video driver. My motivation and determination had been crushed so I did other stuff for the rest of the day. Today I will try and find room in here to use the Linux machine locally to debug shit.
Compare that to Windows. Firstly, DirectX has just worked over remote desktop for at least 15 years. Secondly, when it comes to updating the nvidia drivers, I download the installer, and run it. I may need to restart Windows a few minutes later, but after that I can actually do real work for the rest of the day.
Downvote me for saying Linux sucks if you like (I don't give a shit) but you will never convince me Linux is a suitable desktop alternative to Windows. In my opinion, the average Linux user experience is at least 20 years behind Windows.
I've been saying this to FOSS supporters for years, but they don't want to hear it, and they'll downvote me again anyway.
Linux doesn't suck. What is the real issue here??
It's not the OS, it's the APPS, stupid.
Linux's problem on the desktop is... it isn't a desktop OS. It's a server OS with a windowing server patched on top - the kind of layered design we left behind decades ago with DOS 6 and Win 3.11. Monolithic desktop OSes are a bane to programmers' sensibilities, but for average Joe users it is the de facto design that works.
Average Joes do NOT want to learn how to manually administer systems through an antique command line. They do NOT want to edit config files under /etc, download tarballs, nor search high and low for working drivers that are compatible with specific hardware revision levels. They just want an OS that works, so that they can open and run the apps they need to get work done.
The OS is the tool, not the solution. But FOSS members never, ever want to hear that during their great debate about which OS is better. News flash guys: average users spend very little time "interfacing" to the OS, that is doing things on the OS level. Most of their time is spent inside an application running on top of the OS - yes, yes, that means the OS is there to run the app in the first place, but average, non-tech Joes don't see it that way. They see Excel, taking up their whole screen, and THAT'S the experience their mind registers. So the OS is simply the carrier to whatever app they really "use" to get their work done.
So FOSS needs to STOP concentrating only on the power and stability of their OS designs and bring the REST of their ecosystem up to the level that average Joe users have become accustomed to.
But there's no glory in that. OS programming gets the kudos, app development gets a "meh", especially once you consider that UI design must play a very important part and many programmers hate doing that end of the design work.
The Ryzens actually mop the floor with the M1, except on GPU performance and power efficiency. Multithreaded performance is way higher, and single-core is a bit less powerful.
I agree that the MacBook Air is better value, as it is fanless, has a better GPU and has integrated hardware support for many programs and codecs, plus amazingly low power use.
I would be worried about SSD wear on the Mac, as they have a known issue with that, and the SSD is soldered on to prevent users fitting their own, plus there's a lack of spare ICs available due to Apple requiring the parts not to be sold to third parties.
I find it *hugely* amusing that worshippers at the Church of Jobs have been telling us for years that the hardware specs don't matter, it's the user experience that is key.
Now the M1 chip has come along, and all we have heard for the last six months is hardware specs, quoted endlessly as if they are the only thing that matters.
Here's a thought for you. It's going to come as quite a shock, you might want to sit down. Here goes:
Most users don't need high performance in a laptop. They run Office applications, and do a bit of web browsing. All these Apple weenies showing how fast they can edit 4K videos and manipulate 3D models? We're just not interested.
My Lenovo Yoga C940 has a big advert on the front page of the website advertising 18 hours battery life .....
At best it gets 3-1/2 hours .... and it even runs down with the laptop completely switched off, so if you've not used it in a week you have to charge it again.
Large pinch of salt and all that.
Oh, and no Thunderbolt 3. Really ...
Like all the top-end laptops it has a squarer screen and no numberpad, to give a central touchpad. Why don't cheaper brands do this*? Dropping a numberpad should save money, and it's hardly like 13"/15" panels need to be 16:9 cos they're also used in TVs.
Now if they'd just use a matte screen it'd be perfect. Losing the touchscreen is no loss IMHO, my work laptop has it, it's a gimmick.
*Edit: 13-14" laptops have no numberpad, but at 15"+ it's hard to find a sub-£1k laptop without one.
but.... haven't we been told for years that:-
- Matte screens are crap for watching movies
- Matte screens are more expensive than glossy ones.
Laptop makers think that everyone ONLY watches movies (hence the widescreen aspect ratio) and does nothing else.
Using your laptop with a glossy screen for business?
Cue most laptop makers laughing at you. Then they ignore you and produce more of the same crap.
Isn't that why we had years of 1366x768 screens on laptops?
Biting the hand that feeds IT © 1998–2022