The worst of all worlds!
So he's using WebGL Streaming?
Yep, I've used it and similar technologies. And you know what: they're fine for headless machines that want a nice remote UI over a LAN. Latency utterly kills them over the Internet, even for 2D UI; in most cases it's a worse experience than just sending a compressed video stream.
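To put rough numbers on that (a back-of-envelope sketch; the RTTs, the per-frame round-trip count, and the codec cost are all illustrative assumptions, not measurements):

    /* Back-of-envelope: why chatty remote-rendering protocols survive a
     * LAN but die over the Internet. Every number is an assumption
     * picked for illustration, not a measurement. */
    #include <stdio.h>

    int main(void) {
        double rtt_lan_ms = 1.0;   /* assumed LAN round-trip time */
        double rtt_wan_ms = 60.0;  /* assumed Internet round-trip time */
        int sync_per_frame = 4;    /* assumed synchronous calls per frame
                                      (queries, readbacks, fences) */

        /* A chatty protocol blocks on every synchronous call, so the
           minimum frame time is RTT * round trips. */
        double lan_ms = rtt_lan_ms * sync_per_frame;
        double wan_ms = rtt_wan_ms * sync_per_frame;

        /* A compressed video stream pays roughly one one-way trip plus
           codec time, no matter how chatty the renderer is. */
        double video_ms = rtt_wan_ms / 2.0 + 20.0; /* assumed 20 ms codec */

        printf("LAN, chatty protocol: %5.1f ms/frame (~%.0f fps cap)\n",
               lan_ms, 1000.0 / lan_ms);
        printf("WAN, chatty protocol: %5.1f ms/frame (~%.0f fps cap)\n",
               wan_ms, 1000.0 / wan_ms);
        printf("WAN, video stream:    %5.1f ms input lag, fps unaffected\n",
               video_ms);
        return 0;
    }

With those assumed numbers the chatty protocol still clears 250fps on a LAN, but the exact same code is capped in the single digits over the Internet, while the video stream just adds a fixed ~50ms of input lag.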
Thick clients - or "normal online gaming" - work because the user gets an instant response to their actions (even if the target moves fairly erratically due to latency).
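The trick behind that instant response is client-side prediction: apply your own input locally the moment it happens, then reconcile when the server's authoritative state arrives. A minimal sketch - the state layout, the inputs, and the "server has only seen input 0" point are all hypothetical:

    /* Minimal client-side prediction sketch. The State type, the inputs,
     * and the acknowledgement point are hypothetical. */
    #include <stdio.h>

    typedef struct { float x; } State;

    /* The same simulation step runs on client and server. */
    static State apply_input(State s, float move) {
        s.x += move;
        return s;
    }

    int main(void) {
        float inputs[3] = { 1.0f, 1.0f, 1.0f };

        /* Predict: apply each keypress locally the instant it happens,
           instead of waiting a full round trip per press. */
        State predicted = { 0.0f };
        for (int i = 0; i < 3; i++)
            predicted = apply_input(predicted, inputs[i]);
        printf("predicted x = %.1f (shown to the player immediately)\n",
               predicted.x);

        /* Reconcile: the server's authoritative state arrives later,
           having only acknowledged input 0. Rewind to it and replay the
           unacknowledged inputs. */
        State reconciled = (State){ 1.0f };
        for (int i = 1; i < 3; i++)
            reconciled = apply_input(reconciled, inputs[i]);
        printf("reconciled x = %.1f (agrees, so no visible snap)\n",
               reconciled.x);
        return 0;
    }

If the server disagrees - say you ran into a wall it knew about and you didn't - the replay corrects you with one small snap, rather than adding a full round trip to every keypress.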
OpenGL was originally designed to work with the GPU on the other end of a network - that's exactly what GLX indirect rendering is. Ask yourself why every single type of GPU workload since then has wanted the biggest, lowest-latency bus available between the CPU and GPU.
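A sketch of what that looked like in practice, using legacy immediate-mode GL and an explicitly indirect GLX context (classic GL/GLX headers assumed; note that modern Xorg disables indirect GLX by default, so this may fall back to a direct context):

    /* Indirect GLX: GL commands encoded into the X11 protocol stream,
     * i.e. the GPU on the other end of a (possibly network) connection.
     * Build with: cc demo.c -lX11 -lGL */
    #include <stdio.h>
    #include <X11/Xlib.h>
    #include <GL/gl.h>
    #include <GL/glx.h>

    int main(void) {
        Display *dpy = XOpenDisplay(NULL);
        if (!dpy) { fprintf(stderr, "no X display\n"); return 1; }

        int attribs[] = { GLX_RGBA, GLX_DOUBLEBUFFER, None };
        XVisualInfo *vi = glXChooseVisual(dpy, DefaultScreen(dpy), attribs);
        if (!vi) { fprintf(stderr, "no suitable visual\n"); return 1; }

        /* Boilerplate: a window with a colormap matching the GL visual. */
        XSetWindowAttributes swa;
        swa.colormap = XCreateColormap(dpy, RootWindow(dpy, vi->screen),
                                       vi->visual, AllocNone);
        swa.border_pixel = 0;
        swa.event_mask = ExposureMask;
        Window win = XCreateWindow(dpy, RootWindow(dpy, vi->screen),
                                   0, 0, 320, 240, 0, vi->depth, InputOutput,
                                   vi->visual,
                                   CWColormap | CWBorderPixel | CWEventMask,
                                   &swa);
        XMapWindow(dpy, win);

        /* False = request an *indirect* context: commands travel over
           the X connection rather than straight to local hardware. */
        GLXContext ctx = glXCreateContext(dpy, vi, NULL, False);
        if (!ctx) { fprintf(stderr, "no GL context\n"); return 1; }
        glXMakeCurrent(dpy, win, ctx);
        printf("context is %s\n",
               glXIsDirect(dpy, ctx) ? "direct (server refused indirect)"
                                     : "indirect");

        /* One-way traffic: these batch into the GLX render stream and
           pipeline reasonably well, even over a network. */
        glClear(GL_COLOR_BUFFER_BIT);
        glBegin(GL_TRIANGLES);
        glVertex2f(-0.5f, -0.5f);
        glVertex2f( 0.5f, -0.5f);
        glVertex2f( 0.0f,  0.5f);
        glEnd();

        /* Round trips: anything that returns data blocks on a reply from
           the remote server. One of these per frame caps you at
           1000/RTT frames per second. */
        GLenum err = glGetError();   /* full round trip */
        glFinish();                  /* full round trip */
        printf("glGetError() = 0x%x\n", err);

        glXSwapBuffers(dpy, win);
        glXMakeCurrent(dpy, None, NULL);
        glXDestroyContext(dpy, ctx);
        XFree(vi);
        XCloseDisplay(dpy);
        return 0;
    }

The batched drawing calls are survivable; it's the synchronous ones that make the whole model latency-bound.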
This product only ever has value if an application has massive CPU requirements relative to its GPU needs.
Nobody writes games like that, and even if they did, it just moves the cost from "need big GPUs nearby" to "need big CPUs nearby".
In short, this is not mass-market, and it's not for gaming. There used to be a possible market in CAD simulation acceleration and the like, but these days those workloads are being farmed out to banks of GPUs, which are a far better fit for the job.
Sorry guys, your ship sailed twenty years ago, and has long since been broken up for scrap.