For many years I've had the very strange hobby of creating programming languages and writing compilers for them. My first readily available one ran on 8-bitters under CP/M. It had the ability to define and use integers with user-specified bitwidths. Thought it would be useful on the memory-constrained machines of the day. Tried using them in one major-ish project I did. Bad idea. Never tried to use them again.
For programming FPGAs, having various-sized fields is pretty basic. But why does that have to be reflected back into something like the C programming language, which is intended for general-purpose programming? My gut tells me that implementers will be patching weird edge cases for years, and that any actual benefit will not be worth the overall cost.