Interesting to see the same old use cases being rolled out again. Quite a lot of them aren't really mass-market; they're much more niche.
FPGAs remain ******* hard to get working effectively. There's a lot of talk of direct model synthesis, but that's never very good. Someone doing some proper coding gets better results, and the number of people who actually work in VHDL or Verilog and are good at it is vanishingly small. You've really got to need an FPGA to justify using one. Software on a CPU is a lot easier.
They're also expensive, hot, and prone to having insufficient on-chip resources. If you thought Apple Silicon had expensive on-chip RAM, wait until you're paying Xilinx / Altera prices for it. They do not have that much on-chip storage - hundreds of megabytes at most - and as soon as that becomes insufficient, you're better off with a CPU and GPU. The thing I think is interesting is that these monster FPGAs are firmly in the land of large and hot; i.e. a big GPU or CPU is volumetrically and thermally competitive with them. The problem is that for computational applications, the GPU or CPU can easily outperform an FPGA whenever the problem's data set has to live in off-FPGA RAM (GPUs especially have huge memory bandwidths in comparison), and they're a whole lot cheaper to buy.
That's the benefit of there being a huge consumer market for CPUs and GPUs - the cost of the chips gets to be quite low. With very little of a market (in comparison) for FPGAs, they're always an expensive solution to a problem. There's only a very small range of problems where it's worth it.