Well, yes...
I do understand logarithmic scales, and I understand that +3dB is equivalent to a doubling of signal power. In my original comment I said I believe there is a problem that needs to be addressed. I'll admit using the range the way I did is misleading, but let's face it, we don't know what's going on in the new, more sensitive baseband hardware, so a direct calculation based on the logarithmic dB scale may be as misleading as my use of the range. The iPhone 4 seems to make better use of a -113dBm signal than other phones, and I'd imagine that this improved signal handling will be less marked (and less needed) at -51dBm. So what effect should that have on the bars displayed? Should bars be based on real-world usefulness, or strictly on signal strength in dBm, even though some phones will work better than others at low signal levels?
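Just to put numbers on how wide that range is: a dB difference converts to a linear power ratio via 10^(dB/10), so the gap between the -113dBm floor and the -51dBm ceiling is enormous. A trivial sketch (the arithmetic, nothing more):

```python
def power_ratio(db_difference):
    """Convert a dB difference into a linear power ratio."""
    return 10 ** (db_difference / 10)

print(power_ratio(3))           # ~2.0: +3dB is roughly double the power
print(power_ratio(-51 - -113))  # ~1.6 million: the power span from -113dBm to -51dBm
```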
If you read the AnandTech analysis you'll see that the problem is that the scale used to display signal bars is so skewed towards the top end that it drastically amplifies any loss in signal. Look here (http://fscked.co.uk/post/754590440/update-i-have-a-followup-piece-about-apples-new) for a neat little graph of the AnandTech data. Even given the logarithmic nature of the dB scale, that's a very big skew at the top end. I admit I'm not an RF engineer, but I've read around the subject, and people far more knowledgeable than me seem to agree that the scale is off. I'm sure the new scale Apple implements still won't be linear, but I'd bet it won't be so broad at the 5-bar end.
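To make the skew concrete, here's a rough sketch of the dBm-to-bars mapping using the cutoffs AnandTech reported for the iPhone 4. These are my reading of their data, so treat the exact values as approximate; the shape is what matters:

```python
import bisect

# Approximate iPhone 4 cutoffs per AnandTech (dBm), lowest first.
# Everything from -91dBm up to -51dBm -- a 40dB span -- shows 5 bars,
# while the other four bars are squeezed into just 22dB below that.
CUTOFFS = [-113, -107, -103, -101, -91]  # lower edges of 1..5 bars

def bars(dbm):
    """Map a signal reading in dBm to a displayed bar count (0-5)."""
    return bisect.bisect_right(CUTOFFS, dbm)

print(bars(-80))   # 5 bars
print(bars(-95))   # 4 bars (only 4dB below the 5-bar floor)
print(bars(-105))  # 2 bars
```

The striking part is that 40dB-wide 5-bar bucket: you can halve your signal power many times over without the display moving at all, and then lose the remaining bars very quickly.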
Also read the Guardian piece here (http://www.guardian.co.uk/technology/2010/jul/02/iphone-4-signal-apple), which points out that a signal of -80dBm is displayed as 5 bars on the iPhone 4, but only 2 bars on a BlackBerry.
Yes, a 19.8dB attenuation when the phone is held a certain way is very big, and bigger than on most (if not all) other smartphones. Yes, it's a problem that needs sorting. But the skewed signal scale makes it look far worse than it is, at least for some people; see the sketch below.
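Here's why "for some people": applying the same 19.8dB drop from different starting signals, through the same approximate cutoffs as above (repeated so this runs standalone), gives wildly different on-screen results:

```python
import bisect

# Same approximate AnandTech-reported cutoffs as the earlier sketch (dBm).
CUTOFFS = [-113, -107, -103, -101, -91]

def bars(dbm):
    return bisect.bisect_right(CUTOFFS, dbm)

for start in (-60, -75, -89, -100):
    held = start - 19.8  # the measured death-grip attenuation
    print(f"{start}dBm -> {held:.1f}dBm: {bars(start)} bars -> {bars(held)} bars")

# -60dBm -> -79.8dBm:  5 bars -> 5 bars  (strong signal: nothing visible happens)
# -75dBm -> -94.8dBm:  5 bars -> 4 bars
# -89dBm -> -108.8dBm: 5 bars -> 1 bar   (same drop, looks catastrophic on screen)
# -100dBm -> -119.8dBm: 4 bars -> 0 bars (weak signal: the call probably drops)
```

Same attenuation every time; whether it reads as a non-event or a disaster depends almost entirely on where you start within that huge 5-bar bucket.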