> Even the likes of Skype for Business can’t be assumed to enjoy life on a desktop.
Fixed that for you.
Don’t buy either desktop virtualization or desktop-as-a-service solely to save cash. Don’t assume that because server virtualization went well the same will happen for desktops. And don’t assume that the cloudy desktops are cheaper and easier than on-premises virtual PCs. So said Mark Lockwood, a research director at Gartner …
From a software licensing perspective, VDI technology is an absolute nightmare. You see, most legacy licensing is device-based and perpetual. A VDI is not a device, and herein lies the problem.
In my experience, you need an incredibly bespoke rationale for requiring virtual desktops for them to be remotely good value for money.
Especially now, in the world of super-powered tablet devices.
It is licenses what kills VDIs etc.
The technology was there more than a decade ago; I would say it worked OK 15-16 years ago.
But you can't run anything that uses anything other than named-user or floating licenses.
So no Microsoft products, even if you CAN get a perfect Windows VDI.
Microsoft Office, and I assume all their other products, are licensed for VDI with Software Assurance licenses. Windows VDI on VMware or XenServer both work fine. I've worked with hundreds of vendors and have yet to come across one that doesn't have a license model for VDI based on device or concurrent licenses.
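A minimal sketch of the distinction these comments lean on: a floating (concurrent) license caps how many sessions run at once rather than binding to a physical device, which is why it survives VDI where device-based perpetual licensing doesn't. The class name and seat count here are illustrative, not any vendor's actual scheme.

```python
import threading

class FloatingLicensePool:
    """Toy model of concurrent ("floating") licensing: any user may check
    out a seat, but at most `seats` sessions run simultaneously. A
    device-bound perpetual license, by contrast, ties to hardware that a
    VDI session doesn't really have."""

    def __init__(self, seats: int):
        self.seats = seats
        self._sem = threading.BoundedSemaphore(seats)

    def checkout(self) -> bool:
        # Non-blocking: a full pool rejects the new session rather than queueing it.
        return self._sem.acquire(blocking=False)

    def release(self) -> None:
        self._sem.release()

pool = FloatingLicensePool(seats=2)
grants = [pool.checkout() for _ in range(3)]
print(grants)  # the third checkout fails once both seats are taken
```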
Scott McNealy made this argument in 1997 with Java-based thin clients, and rising Data Center costs are what kept it from going anywhere. Virtualization on the Server works, if it is done right. Virtualization on desktops is an unnecessary step aimed at removing the need for Service Desk Support. Desktop virtualization is slow, and would require significant investment in Data Center machinery. What will happen when users' desktops and laptops are slow or frozen? Are they going to call an automated Service Desk to get AI Support? No. That's not going to happen. The clowns in Management who try to do this just want to finance their own perks and bonuses, at the expense of system performance and Service Desk Support jobs.
We have around 1,000 VDI sessions, mainly for off-site developers who remote in, work on a corporate-standard desktop and access internal systems remotely.
We also have a bunch of instances where we need the client next to the server due to crap design. We are looking at HRG-VDI for our engineering teams but can't find a suitable hardware configuration for mass deployment at the moment. Ideally we would like composable GPUs so we can switch load from VDI to visualisation or ML when developers go home.
The real reason isn't in the article: users are idiots. PCs have spread into areas where the users can barely tie their own shoelaces. As much as the readers of the Reg would hate having to use a VDI, the tales from tech support, along with the examples of users plugging random crap into their machines, mean that for the typical job nowadays a VDI is a must if you are going to stay out of trouble. The tradeoff is a million niggling service calls vs. a few major breaches.
I've used VDI for years, chiefly for manageability, and have always been amazed at the resources I've had to throw at it to make it work okay. For instance 1:
"[Gartner] said VDI works nicely with latency of between 100 and 180 milliseconds, but at 200 milliseconds your users will be burning you in effigy."
Like Gartner itself, anything over 50ms is for occasional use only.
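Those thresholds are easy to sanity-check before a rollout. A minimal sketch, assuming your gateway answers TCP on a known port: measure the median connect time to it and map the result onto the figures quoted above (the hostname and the classification labels are mine, not Gartner's).

```python
import socket
import time

def classify_latency(ms: float) -> str:
    """Map a round-trip time in milliseconds to a rough usability verdict,
    using the thresholds quoted in the comment above."""
    if ms <= 50:
        return "good"            # the commenter's bar for everyday use
    if ms <= 180:
        return "workable"        # Gartner: VDI "works nicely" at 100-180 ms
    return "effigy-burning"      # Gartner: by 200 ms users are in revolt

def tcp_rtt_ms(host: str, port: int = 443, samples: int = 5) -> float:
    """Median TCP connect time to a host, in milliseconds. A crude proxy
    for session latency, but enough for a go/no-go check."""
    times = []
    for _ in range(samples):
        start = time.perf_counter()
        with socket.create_connection((host, port), timeout=3):
            pass
        times.append((time.perf_counter() - start) * 1000)
    times.sort()
    return times[len(times) // 2]

if __name__ == "__main__":
    rtt = tcp_rtt_ms("vdi-gateway.example.com")  # hypothetical gateway name
    print(f"{rtt:.1f} ms -> {classify_latency(rtt)}")
```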
For instance 2:
Is your VM rubbish? Low latency will not help if your VM is rubbish. And it can be hard to test when everything is remote.
For instance 3:
Windows: It's a bulimic pig (with apologies to bulimic pigs everywhere), puking great mounds of throwaway I/O. Sheer passive aggression by Microsoft is the laziest hypothesis.
So is VDI cheap? Lavish storage and networking are important clues. Is it unavoidable? Almost never. Is it nice to have? Yes.
I don't think that will work well. The benefits they cite seem mostly useful for locations where you want a lot of remote access, but RDP serves that well already. Otherwise, I don't see many of the claimed benefits of the system.

The sentence that most concerns me is the claim that IT people will less often need to go to desks and fix machines. What, pray tell, do they think the users are using to access their virtual desktops? Whatever it is, it looks a lot like a computer and it's on or under the desk. Users can mess that up just as well as they could a traditional machine. The main difference is that the IT person has less access if the user has managed some calamitous software problem. True, you can easily reboot it after you correctly plug in each cable, then reconnect to the virtual desktop, but if the issue is worse than that you will have to poke around in whatever thin client it is to find it.

Traditional desktops work fine most of the time, because they are straightforward to manage. If there is a location where repeatedly destroying or reimaging virtual desktops is routine, there are probably problems that virtualization won't fix.
#1 It costs MORE money than management wants to pay to make it work well!
#2 Management NEVER accounts for the lost productivity when it fails due to connection issues to the clients, whether in a teleworking environment or in house.
#3 Both #1 and #2 have been ignored numerous times, resulting in three failed rollouts. Management's answer is to buy the next generation of dumb terminals without paying for the bandwidth or service-level agreements required to make it useful.
Biting the hand that feeds IT © 1998–2022