Objective-C: instance returned by alloc treated as the wrong class when calling its initializer

I thought I understood basic Objective-C as far as alloc and init... methods go, but apparently I don't. I've boiled the problem I encountered down to the minimal example below. (For the example I put all the source into one file, but the problem is exactly the same when the source is split into multiple source and header files, as it normally would be.)

Here is a synopsis of what the code does and what happens when I run it.

I define two classes, MyInteger and MyFloat, which are nearly identical except that one deals with int and the other with float. Both have an initializer method called initWithValue:, but with different argument types. A #define controls whether the MyInteger class is defined at all; the point is that merely defining it changes the behavior of the program, even though the class is never used.
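To make the collision easier to see, here are the two declarations side by side (excerpted from the full listing at the end); the selector is the same, only the argument type differs:

-(id)initWithValue:(int)value;    // declared by MyInteger
-(id)initWithValue:(float)value;  // declared by MyFloat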

main() only uses MyFloat. Its first two lines allocate an instance of MyFloat, initialize it with a value of 50.0, and print that value. Depending on whether MyInteger is defined, I get two different outputs.
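For reference, these are the two lines in question, copied from the full listing at the end:

MyFloat *mf1= [[MyFloat alloc] initWithValue:50.0f];
NSLog(@"Value:%f",[mf1 theValue]);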

Without MyInteger defined, just as I would expect:


[4138:903] float parameter value:50.000000
[4138:903] Value:50.000000
(next two output lines omitted)

With MyInteger defined, much to my surprise:


[4192:903] float parameter value:0.000000
[4192:903] Value:0.000000
(next two output lines omitted)

It seems to me that the compiler treats the call to initWithValue: as if it belonged to the MyInteger class. The next two lines of main() test this by casting the result of [MyFloat alloc] to MyFloat*, and that does produce the expected output, even when MyInteger is defined:


[4296:903] float parameter value:0.000000
[4296:903] Value:0.000000
[4296:903] float parameter value:50.000000
[4296:903] Value with cast:50.000000
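For reference, these are the two lines with the cast, copied from the full listing at the end:

MyFloat *mf2= [((MyFloat*)[MyFloat alloc]) initWithValue:50.0f];
NSLog(@"Value with cast:%f",[mf2 theValue]);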

Please explain what's going on! I've struggled with this for more than 24 hours now, even to the point of opening the doors to let some heat out so my computer might cool down :-) Thanks!

Another oddity: if I move the definition of MyInteger down below the definition of MyFloat, then everything is "good" and works as I would expect. History has proven me wrong too often for me to suspect the compiler is to blame. In any case, here is the compiler and project information: Xcode 4.0.2, tried with all three compiler options (GCC 4.2, LLVM GCC 4.2, and LLVM Compiler 2.0). The Xcode project for this example was set up using the standard configuration for a Mac OS X command-line tool based on Foundation.


#import <Foundation/Foundation.h>

#define DO_DEFINE_MYINTEGER 1

//------------- define MyInteger --------------
#if DO_DEFINE_MYINTEGER
@interface MyInteger : NSObject {
    int _value;
}
-(id)initWithValue:(int)value;
@end

@implementation MyInteger
-(id)initWithValue:(int)value {
    self= [super init];
    if (self) {
        _value= value;
    }
    return self;
}
@end
#endif


//------------- define MyFloat --------------
@interface MyFloat : NSObject {
    float _value;
}
-(id)initWithValue:(float)value;
-(float)theValue;
@end

@implementation MyFloat
-(id)initWithValue:(float)value {
    self= [super init];
    if (self) {
        NSLog(@"float parameter value:%f",value);
        _value= value;
    }
    return self;
}
-(float)theValue {
    return _value;
}
@end

//--------------- main ------------------------
int main (int argc, const char * argv[])
{
    MyFloat *mf1= [[MyFloat alloc] initWithValue:50.0f];
    NSLog(@"Value:%f",[mf1 theValue]);

    MyFloat *mf2= [((MyFloat*)[MyFloat alloc]) initWithValue:50.0f];
    NSLog(@"Value with cast:%f",[mf2 theValue]);

    return 0;
}