Lots of non-financial reasons.
Anywhere you have remote offices handling personal information, or simply anywhere physical security is a potential issue, you can deploy dumb terminals and keep everything in your nice, PCI/DPA-compliant data centre.
Of course, you could deploy cheap fat clients and XenApp rather than full VDI, although people can still fall into bad habits and store things locally unless you go to the effort of fully locking the remote devices down (which tends to reduce productivity and ends up with users finding "creative" workflows).
There can also be benefits for application licensing, or significant hardware savings for demanding users. Think of scenarios where you have 3D modellers, CAD jockeys or animators who need a hefty Quadro card plus a Tesla accelerator, but who only really strain their workstations in occasional bursts.
Virtualise, say, two or three of them onto a single beefy machine using NVIDIA's GRID product, which can give each of them the power when they need it while saving the expense of three full-fat workstations, two of which are probably idling along at any given moment...
It's the same argument as for virtualising servers: at any given time, not all your servers or users will be maxing out their machine. Indeed, many of them will barely be touching 10% utilisation, so throw them onto shared hardware.
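If you want to put rough numbers on that, here's a quick back-of-the-envelope sketch. The burst frequency, idle load and host capacity below are purely illustrative assumptions, not measurements from any real deployment:

```python
import random

# Toy simulation of the consolidation argument: three bursty users
# sharing one GRID host. All figures are illustrative assumptions.
random.seed(42)
USERS = 3
HOST_CAPACITY = 1.5    # shared host's grunt, in "workstation equivalents"
SAMPLES = 10_000       # e.g. one sample per minute over roughly a week

def user_demand():
    """One user's instantaneous demand: idling at 10%, bursting to 90% about 15% of the time."""
    return 0.9 if random.random() < 0.15 else 0.1

contended = sum(
    1 for _ in range(SAMPLES)
    if sum(user_demand() for _ in range(USERS)) > HOST_CAPACITY
)
print(f"Samples where users actually contend: {100 * contended / SAMPLES:.1f}%")
```

With those assumed figures the users only step on each other's toes a few percent of the time, which is exactly why sharing the hardware pays off.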
It's not suitable for every case, and Windows licensing eats into the potential savings more than it would in a *nix ecosystem, but it suits plenty of environments depending on their precise business and workload.