First, a sort of disclaimer -- about a decade back, I briefly worked for a consultancy that pushed virtual desktop solutions hard to its clients, and most of said clients ran them.
So, it wasn't my choice or anything, but all the same, it's the most extensive personal experience I have to base my answer on.
There were several reasons. Different clients had different drivers.
One was a fleet of badly outdated desktop OSes -- some still running Win9x well into the 21st century. Many of the machines were underpowered and slow. Upgrading the OS would have been hugely labour-intensive, and possibly expensive in both software and hardware upgrades.
But almost anything, however old and slow, can run a remote desktop client. So, they pushed out remote desktops to everyone, and any old junkware clunker of a PC that could at least boot up would let them connect -- and once they'd logged in, everyone got the same performance.
Another reason: upgrading the client software (OS, apps, drivers, whatever, all of the above) is a significant management hassle. Put all the apps on a terminal server and you only have to update that 1 box.
Some clients want the terminal server in their office; fair enough, we could remote onto it like anyone else.
(In that case you really want at least 2, ideally a set of 3 or 4, so you can keep working if 1 goes down.)
Of course, then, maintenance windows become extremely hard to find, but that's a different issue.
Or, outsource running the servers too, and then their maintenance becomes the hosting company's job.
Maintaining a fleet gets even worse if you have lots of mobile workers, or remote workers, or people who are sometimes both and so have maybe a home-office PC *and* a travelling PC.
If it's all virtual, then this goes away. No data syncing issues. No remote-node-to-server connectivity issues: the server is also the client, or at least sits next to it in the rack.
LAN protocols (file sharing, print sharing, etc.) are not designed for WANs and often do not run well over them. Remote desktop protocols these days are highly compressed and designed to cope with intermittent connections: if the link is lost, the remote app -- whatever it happens to be -- doesn't panic, crash and corrupt the file.
It's inefficient, it's a bit of a bodge, but it *does* work and it does make some issues just go away.
Amusingly, while I worked at that company, it was in my contract that I never mention or discuss Linux or FOSS with clients. :-)
What they didn't know is that sometimes, I used my laptop with Linux to run an RDesktop client at work. The machine they gave me was, true to the company's methods, an abysmally old, slow, underpowered dog of a box. I spent a day or so cleaning up and updating the client box to make it work slightly better, and in that time, I used my own. With a full-screen RDesktop session, nobody could tell. :-D
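For the curious, that sort of session is trivial to start from a Linux box. A minimal sketch -- the host name and username here are made up, but the `rdesktop` flags are real (`-f` for full-screen, `-u` for the username):

```shell
# Connect to a (hypothetical) Windows terminal server, full-screen.
# -f = full-screen session (toggle in/out with Ctrl-Alt-Enter)
# -u = username to log in as
rdesktop -f -u jbloggs ts01.example.com
```

Full-screen, the session is pixel-for-pixel the same desktop any "real" client would show -- which is exactly why nobody could tell.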