Coming soon at the price of what used to be a normal PC
You too can have a thin client and subscription to a cloud OS.
Thanks AI!
Microsoft has found some friends to make desktop devices that boot into its Windows 365 cloud PCs. The software giant was previously the only vendor to make such machines, in the form of the Windows 365 Link machines it debuted in 2024 and delivered the next year. Dell and ASUS have now joined the cloudy PC party. ASUS’s …
And how long before your ability to download any content you've created via the thin client to store offline is blocked for 'reasons' and all your data is belong to us? All your company's data and IP sitting in someone else's datacentre and your level of 'ownership' only extends to creating content and accessing it. You might - for a very steep fee - have an option to download, but knowing how these cloud companies like to change the terms and conditions to be egregiously in their favour, do you want to take that chance?
Thin clients are a great solution for enterprise use.
But better to keep a team of three engineers and a modest server farm on-site. It'll be cheaper, more secure and adaptable to the organisation's needs.
Also won't go pop the next time Cloudflare or Azure is down...
The school I attended in the 2000s went through a full cycle of implementing thin clients in an attempt to do things on the cheap. It appeared that no-one thought to test whether a class of 35 IT students logging on at the same time was enough to crash the by then elderly server (it was). To add to the fun, the crash could come anywhere between 30 seconds and 30 minutes after logon, and needless to say students aren't the most diligent about making sure their work is saved...
Equally, I've worked at places that moved to thin-client operation to save a quick buck; in some cases they switched back to conventional desktops within 12 months.
Add internet connections, and running everything remotely on someone else's computer (that definitely doesn't keep to 365 days of uptime), and the cost doesn't look like that much of a saving from where I'm sitting.
But it's The Cloud™, which is really new-fangled tech as far as a CxO is concerned. That must make it good, right? Right?
Freelancing, I had a couple of gigs a few years apart running UAT for application server upgrades. It might have been a coincidence, but I'd worked on development of the application a decade or more previously. The application was character based and the client was running thin clients with a separate terminal server. By the time we got round to the second upgrade, someone, possibly the terminal server company, had put together some sort of layer that took the terminal control codes and turned them into a GUI interface, although the users were told not to use this.
The morning of go-live on the new server everything was working as expected and I was about to go home when I was called back. Performance had collapsed. It didn't take long to find out who'd switched to the GUI terminal setting in advance of the beefed-up terminal server that was due the following week.
So...using thin clients with inadequate infrastructure or choosing IT technology based on "saving a quick buck" results in poor results. Quelle surprise!
My place of work has around 200 thin clients in production areas, running IGEL OS, used mostly just for Citrix/RDP. This is not to save a quick buck.
Some of these fanless thin clients with no moving parts are in hazardous areas (ATEX/bio/chemical), which rules out a lot of computers and much other hardware too. In certain production areas a hardware fault means sending the unit to the incinerators and installing a spare. Firmware or OS updates are a breeze through central management since there's no other software installed. There's no need to back up the thin clients, which is a major plus.
I'm not saying thin clients are fine for everything. Outside of production we use regular computers.
Our cloud usage is limited to email and some other non-critical things. Production does not rely on cloud in any way.
>” So...using thin clients with inadequate infrastructure or choosing IT technology based on "saving a quick buck" results in poor results. Quelle surprise!”
Given the 365 focus, and Microsoft’s focus on cloud and subscriptions, we can be sure this will be sold in the high st. Hence use over inadequate infrastructure will be a given.
I think, too many years back (before ADSL), Gates specified that Windows needed to be able to work over a 28.8kbps dial-up line, even though 56kbps modems existed and the expectation was connection speeds would improve again.
So I suggest if 4 of these devices can’t be concurrently used at full pelt over a Social Tariff connection, they are not fit for purpose.
"Given the 365 focus, and Microsoft’s focus on cloud and subscriptions, we can be sure this will be sold in the high st. Hence use over inadequate infrastructure will be a given."
You are totally wrong. These are aimed at enterprises. Home users do not use Azure VMs in any way.
"So I suggest if 4 of these devices can’t be concurrently used at full pelt over a Social Tariff connection, they are not fit for purpose."
Social Tariffs...? Had to look them up since I'm not living in the UK. They seem to be multi-megabit connections. RDP works very well over low-bandwidth / low-latency connections.
In any case the end users are companies with RDP servers in the cloud, so your conjecture about Social Tariffs is ludicrous.
"Home users do not use Azure VMs in any way."
It is always great to see people make such definitive statements about something that they know nothing about.
I know several people ("home users") who create and use Azure VMs (as well as AWS VMs, GCE VMs, etc).
Does Azure prevent non-business accounts from creating Azure VMs? Of course not!
Of course *some* individuals use them. Doesn't change the fact that these are not aimed at consumers!
Roland6 brought up Social Tariffs for some reason.
Social Tariffs are for people claiming Universal Credit, Pension Credit and some other benefits. It's not like these people would rent VMs.
My mistake for not being clear about this.
That was kinda my angle, to be fair - there are certainly scenarios where thin client systems are useful, as you note in your post.
For more general usage, or indeed anything cloud-based, they just feel like an additional point of failure, given that most IT systems are seen as a necessary evil by CFOs and receive the minimum amount of investment possible, even if the lost productivity this results in likely costs the company more.
Given how unreliable Microsoft's 365 services have gotten even this year I can't see how it's workable for most normal use cases.
"...no-one thought to test out that 35 IT students logging on at the same time was enough to crash the by then elderly server..."
Ah, the feeling of deja-vu! My first job was in education, being a system and network admin. This was in the days of MS DOS and Novell Netware 3.x. One of the first things we realized* was that network and server load in classroom environments was atypical. All students logging on at the same time, typically starting the same assignments all at once, trying to save all at the same time just before the bell rings...
The fun we had, trying to explain that to the vendors of the era! Then along came the thin client hype, which made that an order of magnitude worse. Manglement had read the glossies printed on drool-proof paper, and decided in their wisdom that This Way Lay The Future. We tried to explain to them that This Way Lay Ruin, but to no avail.
Needless to say it never worked properly and the projected "cost savings" turned into a gaping money pit when it became clear that in order to make that nonsense work we'd need very, very chunky hardware indeed.
Thin clients, my backside...
*This was after we realized that the only way to keep the mouse balls from disappearing was to superglue the mice shut.
I've been having great fun for the last couple of days playing around with different software stacks on old Intel Compute Stick devices that I found in a drawer… complete PCs in the form factor of a disposable vape pen, just plug them into the HDMI port of a monitor & some USB power. Their original Windows 8/10 builds could charitably be described as "ambitious". OK, an Atom processor with 32GB of flash and 2GB of RAM isn't going to win any speed awards, but they run Linux Mint quite happily. Alas, Puppy Linux didn't seem too happy with my Ventoy USB drive… I may have to create dedicated boot media for that one.
Although I have 2 x Raspberry Pi and 2 first-generation Banana Pi boards, there are plenty of other options to buy a second-hand thin client and use it as a small, low-powered but capable computer.
I also have an IGEL M340C, definitely designed as a thin client, but it now has a 240 GB main SSD and a 512 GB second SSD internally, a quad-core processor and old but capable Radeon graphics, and it sits with at least a 2 TB HDD on USB 3 and enough ports to connect some more if needed.
The biggest issue with them is that the earlier 1+ GHz version uses old-fashioned legacy BIOS. The newer 2+ GHz version is UEFI and also has slightly newer graphics.
It isn't lightning fast but it does what it's needed for very well. It acts as a simple desktop to make web browsing and streaming on a smart TV more bearable than the TV's interface, and runs Jellyfin as a media server for many years of my own recordings from a headless Banana Pi running my TVheadend server. I have to transcode everything into .mp4 beforehand so it doesn't transcode on the fly from .mkv, as the graphics don't support hardware acceleration.
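For anyone facing a similar pile of recordings, a minimal sketch of that kind of bulk .mkv-to-.mp4 conversion with ffmpeg (the codec and quality settings here are my assumptions, not the exact ones used above; H.264 video and AAC audio are about the safest bet for an old iGPU that only has fixed-function decode):

```shell
#!/bin/sh
# Batch-transcode .mkv recordings to .mp4 for a client that can't
# hardware-accelerate on-the-fly transcoding.
for f in *.mkv; do
    out="${f%.mkv}.mp4"
    [ -e "$out" ] && continue   # skip files already converted
    # libx264/aac are assumed defaults; -movflags +faststart helps streaming
    ffmpeg -i "$f" -c:v libx264 -preset slow -crf 20 \
           -c:a aac -movflags +faststart "$out"
done
```

Run it once overnight in the recordings directory; the existence check means it can be safely re-run as new recordings land.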
Cost of the IGEL unit: £25 including PSU; even an old Raspberry Pi would cost me more.
There are plenty of other brands with similar aims as thin clients that can still be usable.
Horses for courses: if you need to do 3D rendering or video editing (or transcoding 10+ years of live TV recordings), use a suitably specced computer.
Thin clients and full desktops/workstations both have their place. Using something incapable makes your life a misery; using something massively over-specced won't make you much, if any, more productive. It'll just use more energy than needed and act as a source of heat for your house or office space.
A completely spurious analogy: you can hold a Citroën 2CV 24-hour race (Snetterton); it won't be fast, but they nearly all get to the end. Or you can have a Ferrari and go faster, but be more likely not to last the 24 hours. Faster can be reliable and even useful, but it can also just be an unnecessary, expensive ego boost!
Among my junk pile collection of tinkering machines I have an HP thin client, I think from 2016 or so. It has a rather unusual 2 GHz quad-core embedded AMD APU (GX series, which I'd never heard of before) with half-decent Radeon graphics, can take up to 16GB of RAM and has twin M.2 SATA drive bays, making it pretty flexible. It runs a pair of 1440p monitors without a hitch.
It's rather pleasant to use for general desktop work compared to the i7 Intel NUC it superseded, being 100% silent thanks to passive cooling, and it even runs a heavier desktop environment like KDE perfectly well. Thanks to the iGPU being fairly potent I've had some luck using it as a streaming box.
40+ years ago most of my work involved mini/midrange systems, terminals connected to a central computer.
Then we got PCs but the ERP & Finance were using terminal emulators.
Later processing was decentralised with dedicated PC applications talking to a central database.
As data volumes grew but comms couldn't keep up the use of remote desktops and thin clients became common.
PCs became more powerful and comms cheaper so the workload decentralised again.
Dedicated PC apps were replaced by web pages and the cloud so processing was effectively centralised.
Now we're back to talking about virtual desktops again!
Thank goodness I'm close enough to retirement to not have to go through too many more such cycles.
We've gone Back to the Future.
They've reinvented the DMS HiNet.
For those that don't know, it was a network of Z80-based diskless workstations that network-booted a copy of CP/M from a central 'server', where each user was allocated a chunk of storage. Most DMS systems offered their users office software, like WordStar (1980s Word), CalcStar (1980s Excel) and similar, including databases.
The wheels in IT go round and round, round and round...
The likes of Zotac have been doing this stuff for years, certainly over a decade.
Their Micro PCs - ZBoxes - come with a choice of a dozen CPUs, GPUs, Memory, SSDs, yada.
The one I've got is only 10.5x10.5x3cm and runs Win RDP, Citrix, etc fine for VPCs. With wired and wireless NICs, Bluetooth, HDMI, USB, eSATA, SD card slot, etc.
I use it as a multimedia hub but the company I worked for when I decided to get one myself had about 60 of them, booting into Linux with an RDP client to VMware VPCs and Servers for each user.
The only local computing was some financial stuff they didn't want to put in the cloud (it's an investment company) and local backups.
That was around 10 years ago.
By about 1984, NetWare 68/S-Net was around, with a proprietary box linking CP/M and early IBM DOS PCs. We had a few up and running for WP with diskless or floppy workstations. A couple of years later the Ethernet-based NetWare 86 series were everywhere in our organization. By the time I left, in the early '90s, we had tens of thousands of NetWare "seats" connected to many 386/486 networks. It was particularly useful for connecting disparate kit together (*NIX, PDP/VAX/DG etc.). If there were no approved drivers, our lowest common denominator was a PC running a terminal emulator, where we could copy-paste/scrape/save the screen and then transfer it. We did have some other networks like MS/IBM PC LAN Manager, DECnet, 3Com etc., but they were all replaced by NetWare.
Yes, it does seem similar to the Sun Ray in concept; to me, though, the HP Stream (first launched in 2014) seems the closer proposition. The Stream was an early attempt at a full-function Windows terminal, just capable of running Windows and Remote Desktop. What MS seem to be proposing is an updated version with a more capable processor, locked into 365: Microsoft's second attempt at a Chromebook.