Talk:Megabit


More meaningful byte value conversion

I just added a more meaningful byte value conversion, as that's something people would likely be looking for....

Someone's got an Mbit down as 10^6 (correct) = 1,048,... (whatever, but it's wrong) bits. The M in megabit is the decimal mega (10^6), which, no matter your binary conventions, does not equal anything other than one million. I have removed this spurious figure.

Thanks. It's hard to maintain all these articles against vandalism. — Omegatron 03:51, 25 January 2006 (UTC)

Untrue statement

The megabit is most commonly used when referring to data transfer rates in network speeds, e.g. a 100 Mbit/s Fast Ethernet connection. In this context, like elsewhere in telecommunications, it always equals 10^6 bits. Residential high speed internet is often measured in megabits.

A binary counterpart of megabit, useful for measuring RAM and ROM chip capacity, is mebibit.

^^ The above is untrue in terms of telecommunications, although (without checking) it holds true for packet-based networks.

With the development of PCM, a 4 kHz voice channel was sampled 8000 times per second and formed into 8-bit words (7 data bits plus parity). Hence 8000 x 8 = 64 Kb/s (should be 64 Kib/s).

Further, as defined by the SDH/SONET standards, the primary rate interface E1 (Europe) is 2.048 Mb/s (2.048 Mib/s). That is, 32 x 64 Kb/s channels are byte-interleaved to form 2.048 Mb/s (2.048 Mib/s). Thus the new terminology defined by the IEC 60027-2 standard (or working group) is factually correct.

Storage media for the most part (hard drives predominantly) continue to use the metric conversion.

The use of k or K does not in itself imply binary or metric; hence the move to the IEC 60027-2 standard for data transmission and solid-state memory devices is an important step.
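
For reference, the arithmetic behind the rates mentioned above multiplies out as follows (a minimal Python sketch; the variable names are illustrative and not taken from any standard):

    samples_per_second = 8000          # a 4 kHz voice channel sampled at twice its bandwidth
    bits_per_sample = 8                # 7 data bits plus parity
    voice_channel_rate = samples_per_second * bits_per_sample
    print(voice_channel_rate)          # 64000 bit/s, i.e. the 64 kbit/s channel rate

    e1_channels = 32                   # byte-interleaved 64 kbit/s channels in an E1 frame
    e1_rate = e1_channels * voice_channel_rate
    print(e1_rate)                     # 2048000 bit/s, i.e. the 2.048 Mbit/s primary rate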

125 KB?

Are you sure it is not 122 KB? Because 125 000 bytes divided by 1024 is 122.0703125, and kilobytes are 1024-based, not 1000-based.

I just made the change because you're right - KevinJones 20:33, 24 July 2007 (UTC)

I just changed it to 122KB again... who keeps changing it? --Poh Tay Toez (talk) 03:18, 8 January 2008 (UTC)

The kilobyte is an ambiguous unit. It's pointless switching between 122 kilobytes and 125 kilobytes because both are correct. In fact even the megabit is ambiguous (some define it as 1024^2 bits), which makes for yet more options. Let's keep it as simple as possible without losing the essence. Thunderbird2 (talk) 10:39, 8 January 2008 (UTC)
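
To make the two readings debated above explicit, here is a minimal Python sketch of the conversion (the names are illustrative only):

    bits = 1_000_000            # one megabit in the decimal (SI) sense
    bytes_ = bits // 8          # 125000 bytes
    print(bytes_ / 1000)        # 125.0        -> 125 kB if a kilobyte is 1000 bytes
    print(bytes_ / 1024)        # 122.0703125  -> ~122 KB if a kilobyte is 1024 bytes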

Removed some trivia

I removed the following as non-encyclopedic:

It is certain that the SNES Street Fighter II cartridge's proud claim that it came on a 'massive 16-Mbit cartridge' sounds more impressive than stating that the whole game weighed in at 2 MB, i.e. about the same size as a desktop-sized bitmap image these days.

—The preceding unsigned comment was added by Kaleja (talk · contribs) 22:47, 10 February 2007 (UTC).

Megabit

(Cross-posted from User talk:intgr)

Actually, SNES cartridge capacities are measured in mebibits, you are right, but the symbol for mebibit is Mibit, not Mbit as written in Megabit. Sarenne 21:50, 10 April 2007 (UTC)

Indeed, I missed that. -- intgr 22:19, 10 April 2007 (UTC)
What do you suggest we do about it? Perhaps the cartridge example is bad for this article in the first place? -- intgr 22:21, 10 April 2007 (UTC)
I simply suggest replacing "Mbit" with "Mibit", as in Mebibit, but it will need an explanation of the fact that these 'megabits' are actually 'mebibits'. Perhaps this example is not relevant for this article, you're right :) Sarenne 22:28, 10 April 2007 (UTC)

See also

The following discussion took place on my Talk Page:

Hello! I'd like to invite you to take a look at WP:ALSO, a section in the Wikipedia Manual of Style which describes how "see also" sections are generally used. Since the purpose of these sections is to provide a place for relevant wikilinks that aren't yet included in the article, with the eventual intent being to integrate those links into the article and remove the "see also" section entirely once that's done, I've removed a number of redundant links from the articles megabit and gigabyte. If you wish to discuss these changes, I'd be happy to do so. --DachannienTalkContrib 11:31, 7 December 2007 (UTC)

I have followed your suggestion to look at WP:ALSO. Though I have not read it thoroughly, the presence in it of a "See also" section suggests to me that its authors have not thought through the implications of their own guideline. In what sense does the removal of (e.g.) bit from the "See also" of megabit improve the megabit article? Thunderbird2 13:29, 7 December 2007 (UTC)
Thanks for your response. A person reading no more than the introduction to the article would have already come across the wikilinked term "bit" and could have clicked on it if they were interested. Including a wikilink in the "see also" section merely clutters that section up, making it more difficult for users to pick out the terms that are both (a) related to the topic at hand and (b) not already present elsewhere in the article. If "see also" were intended to be a handy summary of all the wikilinks in an article, it would have been made an automatic function of MediaWiki long ago. Hope this makes sense. --DachannienTalkContrib 17:02, 7 December 2007 (UTC)
If "See also" were used as a "handy summary of all the wikilinks in an article", I would agree with you. In megabit it is not used in that way, but as a summary of most directly related to the megabit article (bit, megabyte, mebibit). Collecting the important ones in one place helps the reader by saying Reading these particular articles will help you understand the concept of a megabit. Thunderbird2 (talk) 10:41, 8 December 2007 (UTC)
Not really. If a reader begins reading the article and comes across a concept or word he or she doesn't understand, then that word will already be wikilinked within the article, and they can click on it then. If it's unclear from the context of the article that a particular concept is important as background to understanding the topic at hand, then the article should be revised to make that context clearer. In other words, repeating the links at the end of the article isn't particularly useful, because someone unfamiliar with the concepts of a topic doesn't skip to the appendix material first - they read the article from the beginning. --DachannienTalkContrib 14:33, 8 December 2007 (UTC)

In the hope of achieving consensus, I'd like to know what other editors think. Please comment here. Thunderbird2 (talk) 14:43, 8 December 2007 (UTC)

neutrality question

This statement seems to imply mild contempt for standard industry usage: "The practice of using megabit or Mbit in this context is still widespread in the RAM and ROM manufacturing industry in 2007 #despite the availability of an unambiguous alternative#." The portion in hash marks seems based on opinion. IMHO, the author of this statement is indicating the opinion that RAM/ROM manufacturers should change their usage. Akamos (talk) 17:20, 3 March 2008 (UTC)

I'm not sure that the quoted statement is even true. When people refer to RAM and ROM it is usually in terms of bytes, not bits. If a source is found demonstrating its correctness (that the megabit is used in the binary sense), I consider it reasonable to point out that an unambiguous unit is available. Do you have a suggestion for rephrasing it in a more neutral way? Thunderbird2 (talk) 17:46, 3 March 2008 (UTC)
That passage arose due to binary prefix evangelism. I've pared it down to what's necessary to direct people to the mebibit article. --DachannienTalkContrib 18:15, 3 March 2008 (UTC)
It is true, though, that all the manufacturers I can think of, including Samsung, Toshiba, TI, Atmel, etc., use the terms kilobit (Kbit/Kb), megabit (Mbit/Mb), and gigabit (Gbit/Gb) in the base-two sense of the word, when referring to discrete SRAM, DRAM, EEPROM, and Flash chips. They do *not* refer to these chips in terms of their "byte" storage capacity (I hypothesize here) because of the historical ambiguity of the very term "byte", which has had different meanings from 5 bits up to 9 bits or more through various historical generations of computer architectures. JEDEC standards support this view as well. I was the person who most recently expanded on the recently deleted section; when I wrote it I actually surprised myself with how fair and balanced I thought I was being. I am personally in favour of "unit purity" and would like nothing more than to see the day when the misuse of "megabit" and the like is forgotten to history; however, that day has not yet come, and I thought it was important for people who are researching the issue to be presented with accurate facts about the current state of affairs within the industry. Goosnarrggh (talk) 13:47, 7 April 2008 (UTC)


I absolutely agree; MB and Mb are not, have not been, and will not be decimal. Just because the industry makes up standards to justify their market schemes does not make it true. In addition, there is no mention that Microsoft's OS uses the binary system. This document is absolutely biased: http://macosx.com/forums/apple-news-rumors-discussion/36506-computer-makers-sued-over-hard-drive-size-claims.html

"Market schemes" or even deceit do not account for the difference in definition. It is simply easier to communicate that a memory chip is "1 Megabit" when it has a capacity of 1024×1024 bits. — Preceding unsigned comment added by 2607:FB90:8BE:3CCD:7CDC:6737:4A07:6ABC (talk) 01:13, 12 May 2016 (UTC)[reply]

There are a lot of documents like this. Overnight the standard changed.--74.67.130.59 (talk) 19:24, 28 April 2011 (UTC)

Use of "mega" in the computer industry[edit]

In the computer industry, a mega-something is _always_ 2^20 of them. (kilo is 2^10.) The term was coined long ago in order to avoid the nasty-looking decimal equivalents of powers-of-2 numbers that are necessary to use in optimal digital system design. If you mean a million bytes, you should say a million bytes. If you say a megabyte, expect any digital designer to understand you're trying to make a distinction, that you're really talking about 1,048,576 bytes, not 1,000,000 bytes, but didn't want to write it out, or force the reader to recognize that number as a clean power of 2. I'm always a big fan of revisionist history, but please understand the confusion you perpetuate by attempting to co-opt the terms kilobit/byte, megabit/byte, and so forth to mean powers of 10. If you'd like to coin a new term like mebibibbibibit/byte, I recommend applying the new term to the powers of 10, as there is a large established industry which understands exactly what a "megabyte" is. Or just use the words we already have: million, billion, trillion, and so forth. Best Regards, Ron. 70.102.112.242 (talk) 17:58, 16 July 2008 (UTC)

For the purposes of this particular article, there are legitimate circumstances where different segments of the industry have standardized on opposite interpretations. It is therefore appropriate to identify these differences of opinion. The example is given of "megabits per second", for which the interpretation "millions of bits per second" is consistently accepted. It would, in fact, be an example of revisionist history to change its meaning to "2^20s of bits per second" today. Goosnarrggh (talk) 13:10, 4 August 2008 (UTC)
IEEE 802.3 (the governing spec for the "100 Mbit/s (megabit per second) Fast Ethernet connection" as cited in the article page) does not generally use the term "megabit" to define its data rate. Instead, it uses the word "million" when referring to bits per second. (In the version I'm looking at, IEEE 802.3-2002, the term "megabit" appears only twice, whereas "million" in the context of bits or bit-groups appears 14 times, including in the all-important definition of bit rate.) I don't think "Fast Ethernet" is a good example to support your point -- nor do I think the current article accurately reflects the statement that there are differences of opinion. Rather, it seems written from the point of view of a "deci-vangelist", to paraphrase a previous talk point, and attempts to pass off a new term as if it were in widespread use. If the goal of the page is to provide an unbiased representation of the actual use of the word "megabit", then it (currently) fails to meet that goal. Best Regards, Ron. 70.102.112.242 (talk) 23:10, 13 August 2008 (UTC)
Although most people involved in digital industries use "mega" to mean 2^20, this is incorrect. The SI defines mega as 10^6. —Preceding unsigned comment added by 84.13.176.173 (talk)
Why do you consider the SI to be correct, when it is not the accepted standard of measurement for memory size?


12:54, 3 June 2009 (UTC) I kinda opened this discussion up, and I'm happy to see the interest it has generated. The result of this discussion is clearly that the term "Megabit" has multiple meanings for multiple applications and has different values depending on who or what you are talking about. What must be given within the document is CONTEXT. Without context the article is misleading and wrong. It is incomplete to point out a single definition, and it must be more broadly described. The assumption that a megabit is 1,000,000 bits is false in many situations, and in common language many individuals (including myself) refer to a megabit as 1,048,576. People like me are no more incorrect or correct than those that use 1,000,000 bits. Now maybe as time passes this will change to the Mb and Mibit standard, but until such a change is made and until such a change is accepted by the industry as a whole, it is incorrect to have a singular definition.--74.67.130.59 (talk) 05:24, 2 May 2011 (UTC)

This article contains questionable information (e.g. expressing the size of old software in terms of kilobits or megabits instead of kilobytes or megabytes) because I think some persons have confused the concepts of a byte and a bit. Program sizes were normally referred to in terms of bytes, and thus KB and MB, not bits. During the whole 8-bit microprocessor period, I feel it was well-understood what a byte was--one character, which was denoted by combinations of 8 bits (each bit is a binary characteristic of either being on or off--a 1 or a 0). This began to get confusing in the 16-bit generation of microprocessors, so I have to agree with the claim that it is a "revisionist" effort to try to clarify these things by expressing them in terms of bits rather than bytes (i.e. Does a 16-bit processor have bytes that are twice as large as they were during the 8-bit era?) During the 8-bit era, there was little ambiguity among users and programmers. A byte was 8 bits, which in binary provided a value from 0 to 255 for the memory location (byte) being addressed, or the value loaded into the microprocessor's accumulator, to be interpreted as a single number. The only time I ever heard "bits" being used to describe large-scale data sizes (rather than the wiring of the microprocessor) was when modem data transfer was involved. For example, a modem of the time might send information over phone lines at a rate of 120 bps (notice that the b is not capitalized, and referred to bits per second, which no knowledgeable person at the time would ever have confused with a term like 4KB of RAM). Bits could also have been used to refer to other I/O operations, but again, the rate of data transfer was usually expressed as such (e.g. bps) and involved much smaller numbers than those involved with RAM and ROM (e.g. no metric prefix was usually needed: 120 bps vs. 64K RAM vs. 1MB total program size, like for a 4-disc software package). 136.181.195.29 (talk) 15:57, 27 November 2017 (UTC)
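
As a plain numerical comparison of the two interpretations of "mega" argued over in this section, here is a minimal Python sketch (illustrative only, not taken from any standard or from the article):

    decimal_megabit = 10**6     # 1000000 bits: the SI / telecommunications reading
    binary_megabit = 2**20      # 1048576 bits: the traditional RAM/ROM reading, now called a mebibit
    print(binary_megabit - decimal_megabit)     # 48576 bits
    print(binary_megabit / decimal_megabit)     # 1.048576, i.e. roughly 4.9% larger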

telecommunications mibi/mega

I'd like to see this statement addressed:

'Telecommunications always uses the correct SI definition of the unit.'

Can we please remove 'correct'. It doesn't add anything to the meaning of the sentence and it seems kinda condescending to the 1k = 1024 people.

Also, can someone cite a reference for this? I.e., can someone look up some example telecommunications standards (like Ethernet, telephone PRIs, etc.) which explicitly state that 1 megabit = 1 000 000 bits, in accordance with Wikipedia's rule for citing sources of information.

209.239.13.226 (talk) 20:32, 8 April 2009 (UTC)

N64

"This usage continued on the Nintendo 64, with cartridge sizes ranging between 32 and 512 megabits."

Is there a source for this? I've never seen N64 game sizes referred to as anything other than MiB/mebibytes. Further, the 32-512 figure is only accurate for Mib/mebibits. — Preceding unsigned comment added by Telanis (talk · contribs) 17:13, 10 July 2015 (UTC)
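
For comparison, here is how the quoted cartridge figures convert if they are read as mebibits (a small, purely illustrative Python sketch):

    for mebibits in (32, 512):
        mebibytes = mebibits / 8                      # 8 bits per byte
        print(mebibits, "Mibit =", mebibytes, "MiB")  # 32 Mibit = 4.0 MiB, 512 Mibit = 64.0 MiB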

Megabit's Symbol: Mb or Mbit or Both?

All the other wiki entries for base-10 bit units have two-letter abbreviations. Isn't Mb also a proper symbol for megabit? Look at the other base-10 bit units' symbols, with links to their respective Wikipedia articles, here: [http://www.lonniebest.com/DataUnitConverter/ Symbols of Units Found on Wikipedia] Should Mb be added as a symbol for Megabit in this article?

@anonymous editor, you are quite correct. The symbol Mb is also used for megabit, and I agree it would be helpful to include this information in the article. The same applies to kb for kilobit, Gb for gigabit, and so on, so perhaps these other articles need updating too. Dondervogel 2 (talk) 16:44, 2 November 2021 (UTC)