Instead of changing the definition...
Why not just make hard discs conform to the time-immemorial standard that 1GB = 1024MB?
Any punters who bought a Seagate hard drive in the US between 22 March 2001 and 26 September 2007 could be entitled to a five per cent discount on future Seagate products or free backup software. The settlement is the result of a US class action suit. The case alleges that Seagate sold drives with seven per cent less storage …
"digital definition of a gigabyte - so 1 GB = 1 billion bytes."
According to the wikifiddlers:
Adjective
digital (no comparative or superlative)
1. Having to do with digits (fingers or toes); performed with a finger.
2. Property of representing values as discrete numbers rather than a continuous spectrum.
* digital computer, digital clock
3. Of or relating to computers or the Computer Age.
"computer operating systems"
Of course, 'digital' (pun intended) hardware communicates in decimal, doesn't it?
The plaintiffs don't know what they're talking about, and don't deserve anything beyond a good LARTing.
For those of us older than 20: remember when disk drives used to be measured using the base-2 definition?
Which was all fine and dandy, until the HD manufacturers decided to use the SI definition but didn't bother to tell anyone, as it allowed them to sell "bigger" drives when in fact they had just redefined the measurement!
Although I hate this mismatch, and finding that my 300GB drive is nowhere near that, I'm surprised Seagate lost this. Isn't the problem technically Windows reporting it incorrectly?
a gigabyte *is* 1000 megabytes.
I thought the correct term for 1024 megabytes was something different (gibibyte?)
Hard drives are computer components; the only thing they get used in is a computer. Therefore, the clash between the manufacturer's claimed capacity and the capacity reported by ALL computers is one of their own making. I've never yet come across a device that doesn't use the 1GB=1024MB definition, so it's totally illogical for HDD makers to use 1GB=1000MB except as a way to legally sell a 930GB drive labelled as 1TB.
This is a long-standing issue which needs sorting out. The HDD makers and the rest of the computer industry have got to sit down and resolve it, as it's only going to get worse as larger and larger drives appear on the market. The confusion caused to average computer users is not acceptable (I'm sure many techies are tired of trying to explain why a brand new 500GB drive is 35GB short...)
I've already seen this happen to Western Digital, and now to Seagate as well, and it makes me sad. In this case and others like it, the hard drive manufacturers have done *nothing wrong*. A gigabyte (GB) *is* a billion bytes, according to SI definition. It is also defined that 1024 x 1024 x 1024 bytes is called a gibibyte (or giga binary byte), and should use the suffix "GiB".
The problem here is that both Windows and Mac OS X report disk and file sizes in gibibytes (GiB), but erroneously use the gigabyte suffix (GB). In essence, the OS is lying to the user about hard disk capacity, but nevertheless the user will still get the full capacity of the drive regardless of the misleading readings. For example, Windows XP reports my 200 GB drive as "186.29 GB", but I am still getting the full 200 GB out of it. 200 GB = 186.29 GiB. However you represent it, the number of bytes on the drive is exactly the same - 200,000,000,000.
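To put numbers on that conversion, here's a quick back-of-the-envelope sketch (plain Python of my own; the helper name decimal_gb_to_gib is mine, not from any OS or vendor):

    # Convert a labelled capacity in decimal gigabytes (1 GB = 10**9 bytes)
    # to binary gibibytes (1 GiB = 2**30 bytes), the unit the OS actually shows.
    def decimal_gb_to_gib(gb):
        total_bytes = gb * 10**9
        return total_bytes / 2**30

    print(decimal_gb_to_gib(200))   # ~186.26 -- close to the "186.29 GB" quoted above
                                    # (real drives vary slightly in raw byte count)
    print(decimal_gb_to_gib(500))   # ~465.66 -- hence the earlier "35GB short" complaint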
It's appalling that the hard drive manufacturers continue to take the blame for the standards ignorance of OS vendors.
I remember, back at college in the early '90s, one of my lecturers saying that for disk space, 1000 kilobytes meant a MB.
All HDD manufacturers have been doing this for years. As disk sizes increase, the difference between the proper binary GB and HDD GB increases.
So why were Seagate singled out now, rather than years ago?
This is no different from Acer doing their usual lie-through-their-teeth specification, where they claim to sell you a laptop with an 80GB HDD. First, the hard disk is never 80GB, usually around 70-74GB. Then they take 4GB for their "recovery" (ie, wipe everything back to the original, bloated, crap-infested and often unusable configuration), then partition the remaining space into two partitions and, to add final insult to injury, format them with FAT32 rather than NTFS. The end result: a supposed "80GB" HDD system with only about 30GB in each partition. Most users will never use the second partition, and all software will be installed by default onto the first partition (C:), along with data ("My Documents", etc). As a result, with the normal Windows bloat, swap file and shovel-ware, most users start off life with around 25GB of free disk space on an "80GB" HDD system.
Still, at least Seagate bother to put the capacity of their drives on the disks themselves these days.
Disks from IBM were the right size. Since a bit before they turned into Hitachi, they got into the 10^3 instead of the 2^10 definition. A long time ago all disks used the 2^10 notation, and it was quite fine.
It is a fact that the 10^3 notation is now used for hard drives, and is written on every hard drive I can remember as 1GByte = 1.000.000.000 Byte. Or so I tend to remember. Everyone also knows that when formatting a drive you lose some capacity, since filesystems take up some space for their own bookkeeping.
Dunno about Seagate; the only thing I liked about them was their plastic enclosures for the OEM market, which allowed a greater density of disks per standard-size delivery box.
(I am not saying that Seagate disks are bad, I just happen to have installed a bad series of them some 4 years ago, and all the customer support that went with the failing disks landed on me.)
Yes, blah blah blah, HD manufacturers aren't wrong with their GB rather than GiB nomenclature - stuff it.
If I buy a 500 gigabyte HDD, I expect it to show up on my computer as 500 gigs of available space, not 460. And yes, I'm well aware that every HDD I've ever bought to date fails this test.
Computing has always worked using the binary system, not decimal. HDD manufacturers use the decimal system because it allows them to inflate their capacity claims - simply marketing lies, as far as I'm concerned. Yes, it's not "technically wrong" but it's sure as hell misleading.
Screw 'em all - perhaps HDD manufacturers will now release products that actually give you what you expect. Just a shame that it took a class action lawsuit to get the ball rolling.
"... that the engine on my so-called 2 litre car is, in fact, only 1,998cc.
Whom do I sue?"
The difference is that 1,998cc DOES round up to "2 liters" quite nicely. If they say "you have a 2000cc engine" when it is in fact 1998cc, THEN they are lying...
By the same token, if they want to call that drive a "1TB drive", then all is well. To one significant digit, it IS 1 TB. But if they call it a "1.0 TB drive", then they are lying to you once more as it is only 0.9 TB.
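For what it's worth, the rounding argument checks out numerically; a quick sketch (Python, figures mine rather than anything from the settlement):

    # A "1 TB" drive under the decimal definition holds 10**12 bytes.
    total_bytes = 10**12
    tib = total_bytes / 2**40     # binary terabyte (TiB) = 1,099,511,627,776 bytes
    print(round(tib, 3))          # 0.909 -- to one decimal place that's 0.9, not 1.0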
Oh, and the definition of GB=1024x1024x1024 has been in place a LOT longer than the IEC definition of GiB. About the time drives started to hit triple-digit MB sizes (and that is MB=1024x1024) is when companies started to lie about their capacity.
Because they could. And it made the gullible think they were getting a bigger drive than they were.
I wonder who convinced the SI group to define GB=10^9 instead of the already-in-place and accepted GB=2^30 definition? I bet they worked for a hard drive vendor...
Strange how every other thing in the computer is referenced off a base 2 system (maybe because EVERYTHING in the computer IS base 2) except the hard drive capacity???
Hard disc manufacturers have been right all along.
It was operating systems that confused consumers using the 1024 multiples.
Standards bodies have defined the KiB, MiB, GiB for this. Linux has adopted this. Windows should too (although wasn't it Bill who started all this business in the first place?).
If there's one thing hard disc manufacturers should do, though, it's to quote an "estimated" GiB figure on the label. How much you actually get depends on how the disc is partitioned and formatted.
As for Seagate. They're not too bad as drives. Quieter than many. Could do worse and buy a Deathstar ;-)
Plus in my opinion 90% of hard disc "failures" are user error.
My 2.0 Mondeo has a 1,998cc engine too, i.e. 0.2% lower. I suspect that this is a bit of sensible leeway to ensure that after manufacturing tolerances and a few years of running wear, the capacity can't get to 2,000cc or greater, since this is the legal limit for taxation class (or some other legally significant borderline, I forget which).
Getting back to the main subject: since as far back as I can remember, when dealing with binary arithmetic and related subjects (such as, er, computers and binary data storage), 1 kilobyte = 1024 bytes, 1 megabyte = 1024 kilobytes, 1 gigabyte = 1024 megabytes. This was for the sake of mathematical/arithmetic convenience, and everyone knew that the numbers were related to base-2 arithmetic. More importantly, everyone who was closely involved with computers knew about this, and 'ordinary' people didn't go near them and didn't care.
As the PC and home/hobby computing hit the mainstream, hard drives effectively became commodity items and were given the commodity marketing treatment by corporate departments whose only concern was making a sale.
Hence they tell the punters that their hard drive is bigger than the other company's hard drive (even though it isn't), safe in the knowledge that there is a dictionary or encyclopedia definition somewhere that will say 1 giga of anything is 1,000,000,000 of it. As long as their position was thought to be defensible in law, that was fine for them, and the other sales organisations did exactly the same thing of course (they had to, or they would lose market share). I remember seeing this happen some years ago, when the PC market really started to take off and there were ridiculous adverts aimed at convincing people to buy a particular PC.
Sadly for the disk manufacturers, the OSes are still written by people who stick to the 'traditional' ways in binary land, where 1G = 1024M. Nowadays, with very big hard drives, where a relatively small percentage drop in actual over expected gives quite a substantial shortfall in absolute terms, people have got upset thinking of all the extra .mp3 files and DVD backups they could have fitted on there, if only the hard drive salesmen had been 'honest'.
Remember, if you buy a car from a salesman, check the meaning of all terms carefully. Similarly with a hard drive, because salesmen will always be salesmen.
I may be blonde, but I can't seem to find the name of the "backup and recovery software" they'll be peddling anywhere on the settlement site. It'd be nice to know in advance if it's actually worth digging out my receipts and taking the computer apart to get the drive serial number!
It has always amazed me that the monitor manufacturers stick to the diagonal measurement rather than the more useful surface area.
The difference between 15" and 17" does not sound like much, and it would be much more descriptive if it were in square inches ... reaches for sliderule ... thinks about Pythagoras ... too difficult :))
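The Pythagoras bit is actually not too difficult; a quick sketch (Python; the screen_area helper and the old 4:3 aspect ratio are my assumptions, not the commenter's):

    import math

    # Screen area from the diagonal, for a given aspect ratio (width:height).
    def screen_area(diagonal_inches, aspect_w=4, aspect_h=3):
        # Pythagoras: diagonal**2 = width**2 + height**2
        scale = diagonal_inches / math.hypot(aspect_w, aspect_h)
        return (aspect_w * scale) * (aspect_h * scale)

    print(screen_area(15))   # 108.0 sq in
    print(screen_area(17))   # ~138.7 sq in -- ~28% more area from a 13% longer diagonal

So the 17" really is substantially bigger than the diagonal alone suggests.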
"Although Seagate continues to deny all charges, it has agreed to settle the case."
My (non-lawyer) translation would be - this action is a crock of sh1t, but it could cost us a bundle to fight it, so why not offer existing customers a 5% discount on their next purchase? Since our gross margin is >>5%, we'll be quids (sorry $$$) in and it'll get these dumb-asses off our backs.
If I remember my motoring history right (although it's a bit fuzzy), a 1, 2 or 3 litre engine was just that, until manufacturers got wise and stripped a few cc off to get into the lower tax and insurance brackets (if you think about it, it's a great sales angle).
Soon afterwards, places like the DVLA revised their classifications, so now if you had a 2 litre engine you would probably be placed in the 3 litre category!
Oh, and Steven, about monitor sizes - everyone knew it was a con, but it was the standard way to measure. When LCDs started to take off, places like Dixons started to show the width and height dimensions, but the general public got all confused and they gave up after about 6 months.
So Seagate get their wrist slapped for something any computer-savvy person knew anyway. 1GB is 1024MB, always has been and always will be. If not, then why aren't memory chips and various other components measured using 1000? Or does this mean my PC has been 'upgraded' and now has 2.1GB of memory? In other words, why should HD manufacturers be allowed to count differently from all the other component makers!
I remember when WD changed their units. I bought 2 WD Caviar 21200 (I might have the model number wrong) drives many years ago about a week apart. Both had identical drive geometry. The first was advertised as a 200MB drive. The second as 212MB. Same drive. That was the week they changed.
And their excuse? Because DOS CHKDSK reports capacity in millions of bytes! Which it doesn't. It reports it in bytes, with commas as thousand separators.
People can argue about MB & MiB as much as they like but the MB started out meaning 1024*1024 and got changed.
And while with RAM you know what you're getting, I have NO IDEA how much data I can store on a 4GB SD card. It may be 4*1024*1024*1024, and it should be. No other number makes sense. I know how much RAM there is in a 1GB SDRAM stick: it's 1024*1024*1024. And for good reason. Computers work in powers of two and programmers need to as well (see the sketch at the end of this comment for how far apart the two readings are).
But marketing will always win the day, and WD probably did better advertising that drive as 212MB than they did as 200MB, even though it was IDENTICAL hardware.
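For the SD card question above, the gap between the two readings of "4GB" is easy to quantify (a quick Python sketch of my own):

    # The two possible readings of a "4GB" label.
    decimal_bytes = 4 * 10**9    # 4,000,000,000 -- the marketing/SI reading
    binary_bytes  = 4 * 2**30    # 4,294,967,296 -- the RAM-style binary reading
    print(binary_bytes - decimal_bytes)   # 294967296 -- nearly 300 million bytes of doubt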