Talk:Code rate

From Wikipedia, the free encyclopedia

Code rate = information rate?[edit]

I doubt that. Information rate is usually measured in bit/s, while code rate is unitless. Mange01 (talk) 22:29, 4 March 2009 (UTC)[reply]
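The distinction in units can be sketched numerically (illustrative numbers only, not taken from any article):

```python
# Code rate is dimensionless: information symbols per transmitted symbol.
k, n = 4, 7              # e.g. a [7,4] Hamming code (assumed example)
code_rate = k / n        # no units

# An information rate in the bit/s sense needs a channel bit rate:
gross_bitrate = 1_000_000                  # channel bit rate in bit/s (assumed)
net_bitrate = gross_bitrate * code_rate    # useful bit/s
print(code_rate, net_bitrate)
```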

Quoting Huffman & Pless, page 88.:

"For a (possibly) nonlinear code over F_q with M codewords the information rate, or simply rate, of the code is defined to be n^−1 log_q M. Notice that if the code were actually an [n, k, d] linear code, it would contain M = q^k codewords and n^−1 log_q M = k/n."

--150.162.245.40 (talk) 23:14, 10 June 2009 (UTC)[reply]
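The identity quoted from Huffman &amp; Pless can be checked numerically; a minimal sketch, using the binary [7,4] Hamming code as an assumed example:

```python
import math

# For an [n, k] linear code over F_q there are M = q**k codewords,
# so the rate n^-1 * log_q(M) reduces to k/n.
q, n, k = 2, 7, 4          # binary [7,4] Hamming code
M = q ** k                 # number of codewords
rate = math.log(M, q) / n  # n^-1 * log_q M
print(rate, k / n)         # both equal 4/7
```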


Apparently information rate is defined differently in some coding theory publications than in the rest of the information theory and data communications fields. The most common definition of information rate is useful bit rate or net bit rate; search at http://books.google.com and you'll see. Currently Information rate redirects to bit rate, where a bit/s definition is given. How can we solve that at Wikipedia? Should we avoid the term? Or create an article where we define it as the number of useful bits per time unit, where the time unit may be either the bit transmission time or a second?
Wikipedia articles that were using the coding theory definition are: Code rate, Entropy rate, Block code and Hamming code. I replaced the term with "code rate" in the latter two articles. In Block code, both definitions occurred before my change.
Examples of articles using the bit/s definition: Information theory, Error detection and coding, Shannon–Hartley theorem, Channel capacity, Spectral efficiency, Classical information channel, Eb/N0, Peak Information Rate, Maximum Information Rate and Committed Information Rate.
Mange01 (talk) 09:29, 11 June 2009 (UTC)[reply]