Talk:Gigabyte/Archive 2


Pretty much every operating system disagrees with this definition

Surely a valid point? Windows defines a gigabyte as 1024×1024×1024 bytes. I am uncomfortable with some institution trying to enforce this definition when it makes life difficult for programmers, because memory is not addressed in decimal. It seems to me the only people who defined a gigabyte the decimal way were the hard drive manufacturers. I am personally never going to say "gibibyte", and neither are the vast majority of people. DarkShroom (talk)
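For concreteness, here is a minimal sketch (Python; illustrative only, not taken from any comment in this thread) of the two competing definitions being argued over:

    GIGABYTE_DECIMAL = 10**9    # SI definition: 1 GB = 1,000,000,000 bytes
    GIGABYTE_BINARY = 1024**3   # binary usage: 1 "GB" = 1,073,741,824 bytes (a gibibyte, GiB)

    # The binary unit is about 7.4% larger than the decimal one:
    print(GIGABYTE_BINARY / GIGABYTE_DECIMAL)  # 1.073741824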


Raymond Chen's arguments for Windows using 1024 instead of 1000 are pretty weak: Why does Explorer use the term KB instead of KiB?

Also note that Mac OS X shows the correct value: Do you use “kibibyte” as a unit of measurement in your programs? (one answer) — Preceding unsigned comment added by 124.169.158.170 (talk) 09:05, 21 August 2012 (UTC)


I agree, "gibibyte" is not a word I will ever utter. I'm a software developer at a software development company, and we were all flummoxed by what GiB refers to. Gigabyte has always referred to powers of 2. 2600:6C4E:1200:F3:A9E8:3A79:A1B0:A87D (talk) 17:53, 23 February 2018 (UTC)

As is thoroughly documented at Binary prefix, pretty much every hard drive manufacturer - hm, no: absolutely every hard drive manufacturer has marketed hard drives using GB = 1,000,000,000 bytes and, now, terabyte (TB) = 1000 GB, and they have always done so. In fact, since hard drives reached GB-scale capacities long before RAM did, the claim that the binary usage has priority through first use is obviously wrong. The same goes for "megabyte", for that matter. Jeh (talk) 19:41, 23 February 2018 (UTC)
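To make the practical consequence of the two definitions concrete, a hedged sketch (Python, illustrative only) of why a drive sold as "1 TB" shows up as roughly 931 GB in an operating system that divides by 1024:

    advertised_bytes = 1000**4                  # 1 TB as marketed: 10^12 bytes
    reported_gib = advertised_bytes / 1024**3   # the same byte count in binary units
    print(f"{reported_gib:.1f} GiB")            # 931.3 GiB

Both numbers describe the same drive; only the unit definition differs.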