I am trying to convert an NSInteger to an NSUInteger. I googled it but found no real answer. How would I do this?
Since this might be useful for others coming across this issue, here is a little table showing the actual effect of casting. These values were taken straight from the debugger as hex. Choose accordingly: as you can see, the cast never changes the stored bit pattern, only how that pattern is interpreted, so -1 and NSUIntegerMax end up bit-identical. For 32-bit, lop off the trailing ffffffff from each value (e.g. NSIntegerMax becomes 0x7fffffff); for 16-bit, lop off the trailing ffffffffffff. Also note that -1 is always 0xffffffffffffffff, i.e. all bits set.
NSInteger si = NSIntegerMax; // si NSInteger 0x7fffffffffffffff
si = -1; // si NSInteger 0xffffffffffffffff
si = (NSInteger)-1; // si NSInteger 0xffffffffffffffff
si = (NSUInteger)-1; // si NSInteger 0xffffffffffffffff
si = NSUIntegerMax; // si NSInteger 0xffffffffffffffff
si = (NSInteger)NSIntegerMax; // si NSInteger 0x7fffffffffffffff
si = (NSUInteger)NSIntegerMax; // si NSInteger 0x7fffffffffffffff
si = (NSInteger)NSUIntegerMax; // si NSInteger 0xffffffffffffffff
si = (NSUInteger)NSUIntegerMax; // si NSInteger 0xffffffffffffffff

NSUInteger ui = NSUIntegerMax;        // ui NSUInteger 0xffffffffffffffff
ui = -1; // ui NSUInteger 0xffffffffffffffff
ui = (NSInteger)-1; // ui NSUInteger 0xffffffffffffffff
ui = (NSUInteger)-1; // ui NSUInteger 0xffffffffffffffff
ui = NSIntegerMax; // ui NSUInteger 0x7fffffffffffffff
ui = (NSInteger)NSIntegerMax; // ui NSUInteger 0x7fffffffffffffff
ui = (NSUInteger)NSIntegerMax; // ui NSUInteger 0x7fffffffffffffff
ui = (NSInteger)NSUIntegerMax; // ui NSUInteger 0xffffffffffffffff
ui = (NSUInteger)NSUIntegerMax; // ui NSUInteger 0xffffffffffffffff
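
If what you actually need is the conversion itself, the bare cast does it, but as the table shows it silently wraps negative values around to huge unsigned ones. Here is a minimal sketch of a checked version, assuming you want to trap negatives in debug builds; the toUnsigned helper name is my own, not a Foundation API:

#import <Foundation/Foundation.h>

// Hypothetical helper: converts NSInteger to NSUInteger, asserting
// that the value is non-negative instead of silently wrapping.
static NSUInteger toUnsigned(NSInteger value) {
    // A plain (NSUInteger)value keeps the bit pattern, so -1 would
    // quietly become NSUIntegerMax (see the table above).
    NSCAssert(value >= 0, @"cannot represent %ld as NSUInteger", (long)value);
    return (NSUInteger)value;
}

int main(void) {
    @autoreleasepool {
        NSInteger count = 42;
        NSUInteger ucount = toUnsigned(count);
        NSLog(@"%lu", (unsigned long)ucount);         // 42

        // The unchecked cast demonstrates the wrap-around:
        NSLog(@"%lu", (unsigned long)(NSUInteger)-1); // 18446744073709551615 on 64-bit
    }
    return 0;
}

If the wrap-around is what you want (common in hashing or bit-twiddling code), the bare cast (NSUInteger)value is fine and documents the intent.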