Sounds like a good idea
No point wasting time processing unused bits. Reminds me of my early days programming in the '70s and '80s, when RAM and disk space were at an absolute premium and I (everyone, really) used all manner of weird and wonderful ways to compress data to the minimum. It did of course eventually lead to problems like Y2K, with an assumed "19" or "20" depending on whether the two-digit year was more or less than 80, for example.

It also made handling other people's poorly documented code a nightmare, especially if they munged multiple values into one integer variable (using the higher bit positions) to hold boolean or other data; no bits wasted. I've come across some real pigs to debug later, when the bits "unexpectedly" overflowed into each other.