When you discuss sizeof you make reference to sizeof telling you the number of octets something is. This is not correct: an octet refers to an 8-bit byte, but not all platforms have 8-bit bytes (e.g. some machines in the old PDP line). sizeof is always in units of sizeof(char), which is defined to be 1; you have to look at CHAR_BIT in limits.h to see the number of bits in the byte. So I think this article should be changed to remove the references to octets, as they are not portable, and replace them with the word byte or even character. Also, the example which does a sizeof(c) where c is a char is only a good example if you are demonstrating that sizeof(char) is always defined to be 1 --Michael Lynn 02:34, 2 November 2006 (UTC)
- Changed to (hopefully) incorporate the best of both worlds :) (i.e. sizeof(char) is always 1 and sizes are in terms of bytes) Conor H. 02:19, 6 November 2006 (UTC)
- That pretty much fixes the parts I was concerned with, thanks --Michael Lynn 08:32, 6 November 2006 (UTC)
"The size of char on most architectures is 1 byte (usually 8 bits)"
I think this text is somewhat confusing. To understand it, I had to look at the ISO/IEC 9899:1999 standard, which contains the following definition:
- '3.6. byte: addressable unit of data storage large enough to hold any member of the basic character set of the execution environment
I had no idea that a byte could be larger than a char. I think "byte" is so synonymous (to the layman) with "char", or even with "8 bits", that this sentence requires something like a footnote or an explanatory link. —Preceding unsigned comment added by 184.108.40.206 (talk) 10:26, 17 September 2008 (UTC)
- How about just looking into Byte? --JonnyJD (talk) 00:54, 18 September 2008 (UTC) PS: You should add sections at the end of the discussion and not somewhere in the middle.
- In C, a byte cannot be larger than a char, because char is defined to be exactly 1 byte by the standard. CHAR_BIT is used to describe how many bits are in a character, and therefore how "wide" a byte is in the given implementation.
- Long story short, when we're talking C, "char" and "byte" are synonymous, but the number of bits in each can be greater than 8. Conor H. (talk) 21:15, 19 October 2008 (UTC)
Biased towards C89
I think the article was written solely with C89 in mind. Variable-length arrays are not mentioned. For a variable-length array a, sizeof(a) is evaluated at run-time, so it's not a compile-time constant in all cases. Another important use isn't mentioned either: sizeof(a) / sizeof(a[0]) is used to calculate the number of elements in an array. I'm not really interested in this article though because all of this is explained in the not-so-long standard (the last draft is freely available) and anyone who's calling himself a C programmer should have read it anyway. --220.127.116.11 16:58, 31 December 2006 (UTC)
- find me a compiler that is fully compliant with the new standard and then maybe we can talk, but you're not going to find a real compiler that is, because it includes so many stupid things that it's not going to happen --Michael Lynn
- You, Sir, are a blatant shame for Wikipedia. If you want to spout nonsense, why don't you write it in your own blog instead of spreading false information and abusing the resources of Wikipedia. I hold you personally liable for the extremely crappy quality of C code in general, caused by wanna-be developers incapable of proper reading and learning, relying instead on charlatans like you.
- wow, strong words for an anonymous coward... --Michael Lynn 00:34, 2 February 2007 (UTC)
Anyone who wants to code in C should read the standard himself. It's not that large, we're not talking about C++. Competent C wizards can be found on comp.lang.c and there's a huge archive of intelligent enlightening postings there, freely accessible at groups.google.com. Unfortunately, most other web sources have the same low standards as this one. --18.104.22.168 17:42, 1 February 2007 (UTC)
I re-added the small note about readability. I hope it's in a better place now. I always considered readability to be a huge reason for using sizeof.
- You're probably right, now that I think more about it. Conor H. 17:16, 14 August 2007 (UTC)
Use of parentheses with the sizeof operator
The article is inconsistent in its use of parentheses after the sizeof operator. Some examples use the parens, some don't, some leave whitespace and some don't. I'm for removing the parentheses where not strictly required, for the reason explained below. Asking politely for your input before I go ahead with editing.
The issue is similar to that with `include' or `require' statements in PHP -- using parentheses is superfluous (unless the following argument is an expression) and seems mostly harmless. However, placing unnecessary parentheses in examples causes many programmers to believe the statement (or the operator) to be a function call, with all the semantics and limitations of calling a function and being evaluated each time the code is run; this belief then influences coding habits.
Please excuse me not logging in; I'm at work and don't remember my WP password.
- Sounds fine to me. I always use parens just for consistency's sake, but you don't and that's fine, and I agree that it should be consistent. So I say, edit away (not that you need my permission). Conor H. (talk) 01:54, 27 November 2008 (UTC)
Errors in section "Using sizeof with arrays"
I've made some substantial changes here, and I thought I would explain.
The text states:
It should be noted that, since sizeof is a compile-time operation, it is impossible to use sizeof to determine the size of an array if it cannot be evaluated at compile time.
That's incorrect, since C99 introduced variable-length arrays. In that case, the size of the array can only be determined at run-time, and sizeof will be evaluated at run-time.
Then it goes on to say:
In the above example, since the size of the argv array is indeterminate, sizeof(argv) will be equivalent to sizeof(char **) — in other words, the size of the pointer type corresponding to the array, not the amount of memory the array itself (which the pointer refers to) takes.
That's at best misleading. argv isn't an array - it's a pointer (to a pointer). Thus, the size of argv will be the size of a pointer. This has nothing to do with the size of the argv array being indeterminate, and much to do with the way arrays are passed to functions (by decaying into a pointer to the first element). Even if there was an array which had a known number of elements (i.e., not "indeterminate" in size), if you were to pass it to a function, it would still decay into a pointer to the first element and sizeof() called over the parameter would yield the size of the pointer.
I didn't change the line that said "When sizeof is applied to an array, the result is the size in bytes of the array in memory", but I was slightly unhappy with it, since arrays are not somehow special in this respect - sizeof() also returns the size of simple types and structures.
Type returned by sizeof()
Surely any thorough discussion of sizeof() will also discuss why it has its own return type. The sample code in this page is using it correctly -- declaring the type returned by sizeof() and used by malloc() etc. as 'size_t', but no discussion of why this must/should be done exists. —Preceding unsigned comment added by 22.214.171.124 (talk) 01:00, 9 August 2010 (UTC)
- On 8/16/2012, User:Goutamiitg edited the above paragraph, including the signature block, in an apparent attempt to add the following comment. As it's usually inappropriate to edit another editor's comment on a talk page, I've reverted that, and included Goutamiitg's comment below.
One more thing: this page should say in which standard library sizeof() and its returned type are defined. (comment by User:Goutamiitg)
- And I've removed the leading space on that comment, so that it formats properly. As for the content of the comment, see the section on unnecessary parentheses a little way up this page. sizeof is not a function, but a unary operator. It's not in a library, but defined in the language, and shouldn't be written with trailing parentheses. Rojomoke (talk) 16:05, 30 October 2012 (UTC)
Size in unsigned chars, not bytes
According to the Everything you ever wanted to know about C types blog entries by ISO C comittee member Peter Seebach, the sizeof operator returns the size in units of unsigned char, not necessarily bytes. From Part 1: "In C, size is measured in units of unsigned char, and returned as a value of type size_t, which is some unsigned integer type; the size of a type is the number of unsigned char objects it would take to hold all the bits used to store the object. The built-in sizeof operator yields this size." He also gives the example of a 60-bit char, to emphasize the importance of not assuming that chars are bytes.
- The term byte is a bit ambiguous in this context. C defines the byte as "3.6. byte: addressable unit of data storage large enough to hold any member of the basic character set of the execution environment" - IOW, a char. So they're synonymous from a strict C perspective. Common modern usage, however, usually assumes bytes to be 8 bits, which is clearly not what sizeof returns (except, of course, on those machines where the C byte - and by extension the C char - is 8 bits). Rwessel (talk) 15:47, 24 April 2011 (UTC)
First array example
In the first array example, why is the code subtracting sizeof buffer? The last element of the array will always be overwritten by the \0, so what is the point of avoiding copying a char into that element before it gets overwritten? — Preceding unsigned comment added by 126.96.36.199 (talk) 17:50, 11 September 2013 (UTC)