When I try to print an integer value to the console that is retrieved from an NSManagedObject, it displays a 6 or 8 digit value (the object ID?). However, if I use the debugger, the value shows up correctly.
It sounds like myProcess.sequence is an NSNumber (object) rather than an NSInteger (scalar). That would explain why it shows up correctly in an object's description but not when you explicitly try to print it as an integer.
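For illustration, a minimal sketch (assuming sequence is modelled as an NSNumber attribute, as the other answers suggest) of why the raw object prints as a large number while the unboxed value prints correctly:
NSNumber *boxed = myProcess.sequence;                     // Core Data hands back a boxed NSNumber
NSLog(@"as a pointer: %ld", (long)boxed);                 // prints the object's address: a large, meaningless number
NSLog(@"as a value:   %ld", (long)[boxed integerValue]);  // unboxes first: prints the stored integer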
What you get from your NSManagedObject would be an NSNumber, I think. It's easy to print that:
MyProcess *myProcess = [array objectAtIndex:i];
NSLog(@"sequence = %@", myProcess.sequence);
or, if you really need the NSInteger:
MyProcess *myProcess = [array objectAtIndex:i];
NSLog(@"sequence = %i", [myProcess.sequence integerValue]);
I think that in this bit of code
NSInteger sequence = [[NSNumber numberWithInteger:(NSInteger)myProcess.sequence] intValue];
the (NSInteger)myProcess.sequence actually gets the memory address of the NSNumber. You can't just cast an NSNumber into an NSInteger.
Try this:
NSLog(@"sequence = %li",(unsigned long)myProcess.sequence);
Depending on how the application is built, NSInteger might be 32 bits, or it might be 64 bits. If it's a 64-bit value, you'll need to do
NSLog(@"sequence = %qi", sequence)
so that it correctly treats sequence as a 64-bit value. Note, however, that this won't work for 32-bit applications; as far as I'm aware, there's no single format specifier that will work to print an NSInteger in both 32-bit and 64-bit worlds.
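The usual workaround is to cast to the widest plain C type and use its specifier, which then behaves the same under both architectures. A small standalone sketch (the variable names are just placeholders):
NSInteger value = 42;
NSUInteger count = 7;
NSLog(@"value = %ld, count = %lu", (long)value, (unsigned long)count);  // same output on 32-bit and 64-bit builds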
NSInteger is just a typedef for a plain C integer type, not an object; it is an int on 32-bit builds and a long on 64-bit builds:
typedef int NSInteger;   // 32-bit
typedef long NSInteger;  // 64-bit
In your first line of code:
NSInteger sequence = [[NSNumber numberWithInteger:(NSInteger)sequence] intValue];
All you are doing is assigning sequence to itself in a roundabout way, and since it was never initialized, it could be any random number.
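To make that concrete, here is a sketch of what the line does versus what was probably intended (myProcess stands in for the managed object from the other answers):
NSInteger sequence = [[NSNumber numberWithInteger:sequence] intValue];  // boxes and unboxes its own uninitialized garbage
NSInteger fromStore = [myProcess.sequence integerValue];                // reads the value from the managed object instead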