
Been there, done that, don't do it
One of my first languages (in high school) was Autocoder for the IBM 1401. It let you define character-based, base-10 integer fields anywhere from 1 to 1,000 characters long (floating point? We didn't need no stinking floating point!). The excuse was that it mimicked punch cards.
I later worked (in a telco, in a production environment) with call records that were defined at the bit level. The excuse was that the switches couldn't produce bigger records.
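To give a feel for why bit-level records are a maintenance headache, here is a minimal sketch of the kind of unpacking code they force on you. The field names and bit widths below are invented for illustration; they are not the telco's actual format.

```python
# Hypothetical 5-byte (40-bit) call record, packed at the bit level:
# 17-bit trunk id | 6-bit call type | 17-bit duration (seconds).
# Every shift and mask here is a silent dependency on the switch's layout;
# change one field width and every line below must be re-derived by hand.
def unpack_call_record(raw: bytes) -> dict:
    bits = int.from_bytes(raw, "big")
    return {
        "trunk_id":  (bits >> 23) & 0x1FFFF,  # top 17 bits
        "call_type": (bits >> 17) & 0x3F,     # next 6 bits
        "duration":  bits & 0x1FFFF,          # low 17 bits
    }
```

Compare that with a fixed-width text record or a self-describing format, where a field's meaning is visible on inspection instead of buried in masks.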
Neither was easy to maintain, and the latter ended up costing the company a chunk of change due to poor maintenance (not mine).
Decades ago, space was at a premium; that's why Y2K happened.
Now it's just not worth it. The first rule of programming should be "write code that can be easily maintained," and part of that is "don't write weird code."