I worked on a contract where the project manager decided that such productivity could be measured by counting the number of semicolons in the source code, and got someone to write him a script to do so.
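For the curious, the whole "metric" could have been (and probably was) something about this crude. This is a hypothetical sketch, not the actual script, and the sample snippet is made up:

```python
# Hypothetical sketch of a semicolon-counting "productivity" metric,
# as described above -- not the real script from that contract.
def semicolon_metric(source: str) -> int:
    """Count semicolons, treating each one as a unit of 'work'."""
    return source.count(";")

snippet = "int main(void) { int x = 0; return x; }"
print(semicolon_metric(snippet))  # -> 2
```

The obvious counterplay is to sprinkle in extra semicolons (legal in C as empty statements), which is exactly the kind of gaming Goodhart's Law predicts.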
There's a name for this: Goodhart's Law (when a measure becomes a target, it ceases to be a good measure), and I've seen it play out hundreds of times over the years.
In an example almost identical to yours above, when I worked at one large, unnamed (*cough* mentioned in the article title here *cough*) Fortune 1 company decades ago, the metric was kLoc, or "thousands of lines of code". Yes, they measured software productivity by weight, essentially.
Management had a parser that went through each developer's code, including all transitively included headers, and counted the number of lines. There was a database module that was particularly huge, containing hundreds, if not thousands, of 3-5 line modules that handled the numerous permutations of data. All of it was completely unnecessary to every subsystem except the database module itself. One week, every subsystem, including graphics, communication, and all the others, suddenly included the root header file for the database, because doing so dragged in about 8,000 lines of headers.
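To make the gaming concrete, here's a rough sketch of how a line counter that follows `#include` lines might work, and why a single extra include could drag thousands of lines into a developer's total. The counting rules and file names here are my guesses for illustration, not the actual tool:

```python
import re
from pathlib import Path

# Matches local includes of the form: #include "foo.h"
INCLUDE_RE = re.compile(r'^\s*#\s*include\s*"([^"]+)"')

def kloc_count(path: Path, seen=None) -> int:
    """Count lines in a file plus everything it transitively includes."""
    if seen is None:
        seen = set()
    if path in seen:          # don't double-count a shared header
        return 0
    seen.add(path)
    total = 0
    for line in path.read_text().splitlines():
        total += 1
        m = INCLUDE_RE.match(line)
        if m:
            header = path.parent / m.group(1)
            if header.exists():
                total += kloc_count(header, seen)
    return total
```

Add one `#include "database_root.h"` to a 10-line module and, under a scheme like this, your "productivity" jumps by the full size of that header tree.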
Build times went up from 90 minutes to about four hours. Ugh.
When I asked what was going on, I was told "next week is audit week". Sure enough, after the audit was completed, the code was "re-factored", and through brilliant optimization the developers cut the four-hour build time by more than half, down to about 90 minutes. Management was extremely impressed, and I believe one developer even got an award for his brilliant optimization work of, err, removing the pointless header files that they'd only inserted a week earlier to make the kLoc audit look good.