I doubt that. Information rate is usually measured in bit/s, while code rate is unitless. Mange01 (talk) 22:29, 4 March 2009 (UTC)
Quoting Huffman & Pless, page 88:
"For a (possibly) nonlinear code over F<sub>q</sub> with M codewords the information rate, or simply rate, of the code is defined to be n<sup>−1</sup>log<sub>q</sub>M. Notice that if the code were actually an [n, k, d] linear code, it would contain M = q<sup>k</sup> codewords and n<sup>−1</sup>log<sub>q</sub>M = k/n."
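To make the quoted definition concrete, here is a minimal sketch (my own illustrative example, not from Huffman & Pless) using the [7, 4, 3] binary Hamming code, showing that the general definition reduces to k/n for a linear code:

```python
import math

# Hypothetical worked example: the [7, 4, 3] binary Hamming code.
n, k, q = 7, 4, 2
M = q ** k  # a linear [n, k] code over F_q has q^k codewords

# Huffman & Pless definition: rate = n^-1 * log_q(M)
rate = math.log(M, q) / n

# For a linear code this coincides with k/n (here 4/7), a unitless ratio.
assert math.isclose(rate, k / n)
print(rate)
```

The point of the exercise: the quantity is dimensionless (codewords per symbol), which is exactly why it clashes with the bit/s sense of "information rate" discussed above.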
Apparently "information rate" is defined differently in some coding theory publications than in the rest of the information theory and data communications fields. The most common definition of information rate is the useful bit rate or net bit rate. Search at http://books.google.com and you'll see. Currently Information rate redirects to bit rate, where a bit/s definition is given. How can we solve that at Wikipedia? Should we avoid the term? Or should we create an article defining it as the number of useful bits per time unit, where the time unit may be either the bit transmission time or a second?
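The relationship between the two senses can be sketched in a few lines (the numbers are hypothetical, chosen only to illustrate the units):

```python
# Data-communications sense: information rate (net bit rate) in bit/s
# is the gross (channel) bit rate scaled by the unitless code rate.
gross_bit_rate = 1_000_000  # bit/s transmitted on the channel (assumed value)
code_rate = 4 / 7           # unitless, e.g. a (7, 4) Hamming code

information_rate = gross_bit_rate * code_rate  # useful bit/s
print(information_rate)
```

So the coding-theory "rate" is the dimensionless factor, while the data-communications "information rate" carries the bit/s unit, which is the discrepancy noted at the top of this thread.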
Wikipedia articles that were using the coding theory definition are: Code rate, Entropy rate, Block code and Hamming code. I replaced the term with "code rate" in the latter two articles. In Block code, both definitions occurred before my change.