sizeof(""+0) != sizeof(char *): bug or undefined behaviour?

The following C program:

#include <stdio.h>

int main(void)
{
    printf("%u %u %u\n",sizeof "",sizeof(""+0),sizeof(char *));
    return 0;
}

outputs 1 4 4 when compiled with GCC on Linux, but outputs 1 1 4 when compiled with Microsoft Visual C++ on Windows. The GCC result is what I would expect. Do they differ because MSVC has a bug or because sizeof(""+0) is undefined? For both compilers the behaviour (i.e. whether the middle value printed is equal to the first value or the last value) is the same no matter what string literal or integer constant you use.
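For illustration, here is a small variation of the posted program (my addition, not part of the original question) that uses a longer literal so the array size differs visibly from the pointer size. Under the standard's decay rule I would expect it to print 6 4 4 on a 32-bit implementation:

#include <stdio.h>

int main(void)
{
    /* "hello" has type char[6]: five characters plus the terminating '\0'.      */
    /* sizeof "hello": the array itself is the operand of sizeof, so no decay.   */
    /* sizeof("hello" + 0): the literal is an operand of +, so it should decay   */
    /* to char * before sizeof is applied.                                       */
    printf("%u %u %u\n",
           (unsigned)sizeof "hello",
           (unsigned)sizeof("hello" + 0),
           (unsigned)sizeof(char *));
    return 0;
}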

A relevant reference in the ANSI C Standard seems to be 6.2.2.1 - Lvalues and function designators:

"Except when it is the operand of the sizeof operator ... an lvalue that has type 'array of type' is converted to an expression that has type 'pointer to type' that points to the initial element of the array object and is not an lvalue".

Here, though, the "Except" should not apply, because in sizeof(""+0) the array (the string literal) is an operand of +, not of sizeof.
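The same reasoning can be checked with a named array, where I would expect both compilers to apply the decay rule as the quoted clause describes (a sketch of my own, not taken from the question):

#include <stdio.h>

int main(void)
{
    char a[10];

    /* The array is the operand of sizeof itself: no decay, so this is the array size. */
    printf("%u\n", (unsigned)sizeof a);      /* 10 */

    /* The array is an operand of +, not of sizeof, so it decays to char *  */
    /* before sizeof is applied; the result is the size of a pointer.       */
    printf("%u\n", (unsigned)sizeof(a + 0)); /* typically 4 or 8 */

    return 0;
}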

asked by Mchl on 1 February 2011 at 12:26