NSNumber claims to be NSString - ios

I am getting the warning Comparison of distinct pointer types ('NSString *' and 'NSNumber *') in this line of code:
if(beacon.beaconIdentifier == self.identifier) {
// do something
}
Both should contain an NSNumber. The object beacon is a CLBeacon; self.identifier is an NSNumber. Other possibly relevant code:
CLBeacon category
@interface CLBeacon ()
@property (nonatomic, strong) NSNumber *beaconIdentifier;
@end
static char const *const MyBeaconIdentifier = "MyBeaconIdentifier";
....
- (CLBeacon *)initWithJSON:(NSDictionary *)json {
self = [super init];
if (self) {
...
self.beaconIdentifier = json[@"id"];
}
return self;
}
- (void) setBeaconIdentifier:(NSNumber *)beaconIdentifier {
objc_setAssociatedObject(self, MyBeaconIdentifier, beaconIdentifier, OBJC_ASSOCIATION_RETAIN);
}
- (NSNumber *)beaconIdentifier {
return objc_getAssociatedObject(self, MyBeaconIdentifier);
}
json[@"id"] always contains a number. Apart from the warning, everything runs as it should. What is causing my problem, and what could solve it?
Thanks to a couple of great responses, I managed to get rid of the warning. Still, I would like to know what caused it, if that is possible.
beacon.beaconIdentifier.class logs __NSCFNumber, just as self.identifier does.

First, you are using pointer equality (“do these pointers point to the same address?”) where you want object equality (“are the objects pointed to by these pointers considered equal?”). Simply call -isEqual: instead of using == to fix this. [Comparing with == can appear to work for NSNumbers because small values are stored as tagged pointers or cached, but you shouldn’t rely on it anyway.]
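A minimal sketch of the difference (whether == happens to be YES here depends on tagged-pointer and caching details, so treat that result as unspecified):
#import <Foundation/Foundation.h>

int main(void) {
    @autoreleasepool {
        NSNumber *a = @(12345);
        NSNumber *b = @(12345);

        // Pointer equality: compares addresses. Whether this happens to be YES
        // depends on tagged pointers / caching, so never rely on it.
        NSLog(@"==:              %d", a == b);

        // Object equality: compares values. This is what you want.
        NSLog(@"isEqual:         %d", [a isEqual:b]);         // 1
        NSLog(@"isEqualToNumber: %d", [a isEqualToNumber:b]); // 1
    }
    return 0;
}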
Second, if the compiler complains about mismatched types, beaconIdentifier and self.identifier can't really both be declared as NSNumber; there has to be an NSString declaration somewhere.

You say json[@"id"] always contains a number, but that is not known at compile time; hence the warning. It's only at run time that the object actually turns out to be an NSNumber.
To check two NSNumber objects for equality, try this code:
if([beacon.beaconIdentifier isEqualToNumber:self.identifier]) {
// do something
}

Related

Accessors behave differently with a static int and a static NSArray

Below is some demo code; the logic of the process is not important.
@interface ViewController ()<UITableViewDataSource, UITableViewDelegate>
@end
static int channelIndex = 0;
static NSMutableArray *channelsDataArray = nil;
@implementation ViewController
- (void)getSomething {
// Append the desiredValuesDict dictionary to the following array.
if (!self.channelsDataArray) {
self.channelsDataArray = [[NSMutableArray alloc] initWithObjects: desiredValuesDict, nil];
} else {
[self.channelsDataArray addObject:desiredValuesDict];
NSLog(@"channelsDataArray : %@", self.channelsDataArray);
}
// This will print the result I expected.
NSLog(@"channelIndxBefore: %i", channelIndex);
++channelIndex;
NSLog(@"channelIndxAfter: %i", channelIndex);
}
@end
The question I have is: if I access channelIndex as "self.channelIndex++", the following warning comes up:
Format specifies type 'int' but the argument has type 'NSInteger *'
(aka 'long *')
If I write "channelIndex++" instead, it works properly.
Strangely, with the other static variable, the NSMutableArray channelsDataArray, if I call
[self.channelsDataArray addObject:desiredValuesDict];
it correctly adds the object to the array. But if I use
[channelsDataArray addObject:desiredValuesDict];
there is no warning, yet channelsDataArray stays nil and desiredValuesDict never gets added to it.
Question: when should I add the self prefix and when not? Both are static variables, so why does one need self while the other doesn't?
[Originally a comment:]
The error suggests you have another @interface (in a .h file) and that you've declared an instance variable in that file with the same name as the global variable declared in the quoted file. You need to remove one of them; which one depends on what you need.
HTH
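A minimal sketch of the scenario that comment describes, here modelled with a property of the same name assumed to be declared in the .h (an instance variable with accessors would behave similarly):
// ViewController.h (assumed)
#import <UIKit/UIKit.h>
@interface ViewController : UIViewController
@property (nonatomic, strong) NSMutableArray *channelsDataArray;
@end

// ViewController.m
static NSMutableArray *channelsDataArray = nil; // file-scope static: a separate variable

@implementation ViewController
- (void)demo {
    // self.channelsDataArray goes through the property accessors, so the
    // lazy initialisation below actually sticks:
    if (!self.channelsDataArray) {
        self.channelsDataArray = [NSMutableArray array];
    }
    [self.channelsDataArray addObject:@"works"];

    // The bare name refers to the file-scope static, which was never
    // initialised, so this message goes to nil and silently does nothing:
    [channelsDataArray addObject:@"lost"];
}
@end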

Objective-C method signatures: Parameter types can differ between declaration and implementation?

I can declare a method in the @interface having parameter type NSString*:
- (id) initWithString:(NSString*)str;
while in the implementation it is NSNumber*:
- (id) initWithString:(NSNumber*)str
For a full example, see the code below. When calling [Work test] the output is a.x = Hi, so the passed-in NSString* went through and one can see that the "correct" initWithString method was called.
Why is this code accepted by the compiler?
Can I make the compiler complain when parameter types differ?
Citing from Apple's documentation Defining Classes :
The only requirement is that the signature matches, which means you must keep the name of the method as well as the parameter and return types exactly the same.
My test code:
@interface ClassA : NSObject
@property (strong, nonatomic) NSNumber *x;
- (id) initWithString:(NSString*)str;
- (void) feed:(NSString*)str;
@end
@implementation ClassA
- (id) initWithString:(NSNumber*)str
{
self = [super init];
if (self) {
self.x = str;
}
return self;
}
- (void) feed:(NSNumber*)str
{
self.x = str;
}
@end
@implementation Work
+ (void) test
{
ClassA *a = [[ClassA alloc] initWithString:@"Hi"];
NSLog(@"a.x = %@", a.x);
}
@end
I added the feed method to see whether this behaviour is special to init-like methods, but the compiler doesn't complain there either.
(Ran this on Yosemite / Xcode 6.4 / iOS8.4 Simulator.)
PS: If I didn't use the right terms, please correct me :-)
Can I make the compiler complain when parameter types differ?
There's a warning for this which you can activate by including the following line in the header:
#pragma clang diagnostic error "-Wmethod-signatures"
You can also put -Wmethod-signatures into the project's "Other Warning Flags" Xcode build setting to activate this for the whole project.
I don't really understand why Apple is so hesitant to activate helpful warnings like this by default.
My standard pattern on virtually every project is to put -Weverything in "Other Warning Flags". This activates all warnings clang has to offer.
Since there are some warnings that are a little too pedantic or don't serve my coding style, I individually deactivate unwanted warning types as they pop up.
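For example, a small illustrative sketch of opting out of a single diagnostic in one region while -Weverything stays on elsewhere (the particular warning named here, -Wdirect-ivar-access, is just an assumed example of one that some people find too pedantic):
#pragma clang diagnostic push
#pragma clang diagnostic ignored "-Wdirect-ivar-access"
// ... code that accesses instance variables directly, which would
// otherwise trigger the warning under -Weverything ...
#pragma clang diagnostic pop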
I'm surprised by the quote you found stating that param and return types matter to the uniqueness of the method signature. Re-reading, I think you found a bug in the doc.
Defining a parameter type in the interface will generate a warning for callers that do not pass that type (or cast the argument to that type), no matter what the implementation does. Changing the parameter type in the implementation is exactly like casting the parameter within the method. There is nothing wrong with that, and it is not even a cause for a warning, so long as the different type shares methods (polymorphic or inherited) with the declared type.
In other words, restating by example...
The following will cause a compiler error, proving that distinct parameter types offer no distinction to the compiler (the same is true for the return type)...
// .h
- (void)foo:(NSString *)str;
// .m
- (void)foo:(NSString *)str {
NSLog(#"called foo %#", [str class]);
}
- (void)foo:(NSNumber *)str { <----- duplicate declaration error
}
The following will cause no compiler warnings, errors or runtime errors...
// .h
- (void)foo:(NSString *)str;
// .m
- (void)foo:(NSNumber *)str {
// everything implements 'class', so no problem here
NSLog(#"called foo %#", [str class]);
}
The following is exactly like the previous example in every respect...
// .h
- (void)foo:(NSString *)str;
// .m
- (void)foo:(NSString *)str {
NSNumber *number = (NSNumber *)str;
NSLog(#"called foo %#", [number class]);
}
The following will cause no warnings, but will generate a runtime error because we are abusing the cast by invoking a method that the passed type doesn't implement (presuming the caller calls with a string as the interface indicates)...
// .h
- (void)foo:(NSString *)str;
// .m
- (void)foo:(NSNumber *)str {
NSLog(#"does str equal 2? %d", [str isEqualToNumber:#2]); <--- crash!
}
All of the foregoing matches intuition and behavior in practice, just not that passage in the doc. Interesting find!
In Objective-C a method is identified by a string (known as a selector) of the form doSomethingWithParam:anotherParam:, or in your case initWithString:. Note that there are no parameter types in these strings. One side effect of defining methods like this is that Objective-C, unlike Java or C++, doesn't allow overloading methods by just changing the parameter type. Another side effect is the behavior you observed.
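A minimal sketch of that point, just printing the selector to show that it carries no type information:
#import <Foundation/Foundation.h>

int main(void) {
    @autoreleasepool {
        // The selector is only the name with colons; NSString* vs NSNumber*
        // never appears in it, so both "versions" are the same method as far
        // as the runtime is concerned.
        SEL sel = @selector(initWithString:);
        NSLog(@"selector = %@", NSStringFromSelector(sel)); // prints "initWithString:"
    }
    return 0;
}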
EDIT: Additionally, it appears that the compiler does not look at the implementation at all when checking method calls, just the interface. Proof: declare a method in a header, don't specify any implementation for that method, and call this method from your code. This will compile just fine, but of course you'll get an "unrecognized selector" exception when you run this code.
It'd be great if someone could provide a nice explanation of the default compiler behavior.

Doubts about __bridge, __bridge_retained and __bridge_transfer

I have read about __bridge, __bridge_retained and __bridge_transfer and did some experiments. However, the output does not match what I was expecting. In particular, I have the following code:
@interface ViewController ()
@property (nonatomic, strong) NSString *test;
@end
@implementation ViewController
CFStringRef cfString;
- (void)viewDidLoad
{
[super viewDidLoad];
self.test = @"123";
cfString = (__bridge CFStringRef)self.test;
self.test = nil;
}
- (void)viewDidAppear:(BOOL)animated
{
NSLog(@"%@", cfString);
NSLog(@"%@", self.test);
}
@end
I expect the program to crash, based on the following reasoning: __bridge does not transfer ownership, so when casting self.test to cfString there is no retain count increment. Once self.test is set to nil, ARC steps in and deallocates the string. That portion of memory is then freed, but cfString still points there, which should result in a bad access exception. In contrast to my reasoning, the output is 123 and null, and the program does not crash.
Moreover, if I replace
self.test = nil;
with
CFRelease(cfString);
I expect the program to crash as well, for similar reasons. Even stranger, the output is now 123 and 123.
Can anyone kindly explain why? By the way, the term ownership always troubles me; some explanation would be greatly appreciated.
Your problem is that you're using a constant string. It is placed straight into the program's memory, so the reference unexpectedly remains valid even though it shouldn't. Use something less constant than a constant string and your program will break the way you expect.
The problem is that you base your example on a literal NSString value.
In Objective-C, constant NSStrings (values known at compile time) are never released. In fact, their main memory management methods look like this:
+ (id)allocWithZone:(NSZone *)zone {
id _uniqueInstance = [self _singletonInstanceOfClass];
if( _uniqueInstance == nil )
_uniqueInstance = [super allocWithZone:zone];
return _uniqueInstance;
}
- (id)copyWithZone:(NSZone *)zone {
(void)zone;
return self;
}
- (id)retain {
return self;
}
- (NSUInteger)retainCount {
return NSUIntegerMax; // denotes an object that cannot be released
}
- (oneway void)release {
//do nothing
return;
}
- (id)autorelease {
return self;
}
As you can see, releasing them is not possible.
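For contrast, a minimal sketch of the ownership-transferring casts using a string created at run time rather than a compile-time constant (this only shows the correct patterns; what the original buggy code does with such a string depends on exactly when ARC releases the object):
#import <Foundation/Foundation.h>

int main(void) {
    @autoreleasepool {
        // A string built at run time, so it is NOT an immortal compile-time constant.
        NSString *test = [NSString stringWithFormat:@"%d%d%d", 1, 2, 3];

        // __bridge_retained: ARC hands us a +1 reference, so the object stays
        // alive even after every NSString reference is gone, and we become
        // responsible for the matching CFRelease.
        CFStringRef cfString = (__bridge_retained CFStringRef)test;
        test = nil;

        // __bridge is a plain type cast with no ownership change.
        NSLog(@"%@", (__bridge NSString *)cfString); // still valid: we own it
        CFRelease(cfString);                         // balance the +1 taken above

        // __bridge_transfer goes the other way: it moves a Create-rule (+1)
        // CF reference back under ARC's control, so no CFRelease is needed.
        CFStringRef cfCopy = CFStringCreateCopy(kCFAllocatorDefault, CFSTR("456"));
        NSString *managed = (__bridge_transfer NSString *)cfCopy;
        NSLog(@"%@", managed);
    }
    return 0;
}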

NSArray, trying to pass the index number of the object in the array to a function

I have spent many hours trying to find a solution, so if it IS here somewhere and I missed it I am sorry ...
In .h
@property (nonatomic, strong) NSArray *physicalMan;
-(int) getManeuverRating:(int *) value;
In .m
physicalMan = [[NSArray alloc] initWithObjects:@"Grip-Hold", @"Strike", @"Fall", @"Response", nil];
NSLog(@" The second element is: %@", [physicalMan objectAtIndex:1]);
NSLog (@" This is the index location of Grip-Hold: %i", [physicalMan indexOfObject:@"Grip-Hold"]);
[self getManeuverRating:[physicalMan indexOfObject:@"Grip-Hold"]];
}
-(int) getManeuverRating:(int *) value
{
int a = *value;
return a + 1;
}
The NSLogs work fine with the proper values, which is why I am so confused as to why the function will not work.
The compiler warning says "Incompatible integer to pointer conversion sending 'NSUInteger' (aka 'unsigned int') to parameter of type 'int *'"
I have tried removing the * and I have tried to find other data types, and converting data types, and I cannot get anything to work correctly. Please help or point me in the right direction ... what am I doing wrong? what am I missing?
The indexOfObject: method of NSArray returns an NSUInteger, not an int *. Passing an integer to a parameter of type int * is incorrect: the integer would be treated as a memory address, so dereferencing it would not be valid.
You should change the getManeuverRating: method as follows:
-(int) getManeuverRating:(NSUInteger) value
{
return value + 1;
}
You are not actually passing a pointer to an int... you should make the function like this:
-(NSInteger)getManeuverRating:(NSInteger) value
{
NSInteger a = value;
return a + 1;
}
If that is giving you issues, you should also try casting the integer at the call site...
So instead of
[self getManeuverRating:[physicalMan indexOfObject:@"Grip-Hold"]];
Do
NSInteger index = (NSInteger) [physicalMan indexOfObject:@"Grip-Hold"];
[self getManeuverRating:index];
You should be using NSInteger instead of int simply because it is idiomatic Objective-C, but it is just a typedef. You could also make the method take and return an NSUInteger and not cast at all.
Another modernization thing you could do (and this is an aside) is declare your array like this...
NSArray *physicalMan = @[@"Grip-Hold", @"Strike", @"Fall", @"Response"];
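Putting the pieces together, a minimal sketch (the class and method names here are made up for illustration; note that indexOfObject: returns NSNotFound when the object is absent, and casting to unsigned long for %lu is a safe way to log an NSUInteger):
#import <Foundation/Foundation.h>

@interface Maneuvers : NSObject
@property (nonatomic, strong) NSArray *physicalMan;
- (NSUInteger)maneuverRatingForIndex:(NSUInteger)index;
- (void)demo;
@end

@implementation Maneuvers

- (NSUInteger)maneuverRatingForIndex:(NSUInteger)index {
    return index + 1; // take the index by value; no pointer needed
}

- (void)demo {
    self.physicalMan = @[@"Grip-Hold", @"Strike", @"Fall", @"Response"];

    NSUInteger index = [self.physicalMan indexOfObject:@"Grip-Hold"];
    if (index != NSNotFound) { // indexOfObject: returns NSNotFound if absent
        NSLog(@"Index: %lu, rating: %lu",
              (unsigned long)index,
              (unsigned long)[self maneuverRatingForIndex:index]);
    }
}

@end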

NSDictionary<FBGraphUser> *user syntax explanation

In the Facebook iOS SDK requests are returned with the following handler:
^(FBRequestConnection *connection,
NSDictionary<FBGraphUser> *user,
NSError *error) { }
The user variable can then be accessed with calls like these...
self.userNameLabel.text = user.name;
self.userProfileImage.profileID = user.id;
This syntax is somewhat similar to the syntax id <protocolDelegate> object syntax that is a common property declaration, except that here NSDictionary takes the place of id explicitly, and that dictionary conforms to the protocol? But where does the dot syntax come from, and how does one state that an arbitrary NSFoundation object corresponds to a protocol without subclassing the object itself and making it conform?
I did some additional research about dot notation and NSDictionary, and it appears that it is not possible to use dot notation on a dictionary without adding a category to NSDictionary. However, I did not see any reference to the <> syntax in the Apple documentation to indicate that this particular instance of NSDictionary conforms to that notation.
And the Facebook documentation is a little sparse on how this wrapping works:
The FBGraphUser protocol represents the most commonly used properties
of a Facebook user object. It may be used to access an NSDictionary
object that has been wrapped with an FBGraphObject facade.
If one follows this lead to the FBGraphObject documentation, there are methods that return dictionaries that conform to this "facade"... but no further explanation of how one goes about wrapping a dictionary.
So I guess my questions are a few:
What would the underlying code look like to make this sort of syntax work?
Why does it exist?
Why would Facebook implement it this way as opposed to just making an object that they can convert the data into?
Any explanation or insight would be very appreciated!
Basically, NSDictionary<FBGraphUser> *user, implies an object that inherits from NSDictionary, adding functionality (specifically, typed access) declared by the FBGraphUser protocol.
The reasons behind this approach are described in quite a bit of detail in the FBGraphObject documentation (the FBGraphUser protocol extends the FBGraphObject protocol). What might be confusing you is that FBGraphObject is both a protocol and a class, and the class inherits from NSMutableDictionary.
In terms of inner implementation, it's some pretty advanced Objective-C dynamic magic, which you probably don't want to worry about. All you need to know is you can treat the object as a dictionary if you wish, or use the additional methods in the protocol. If you really want to know the details, you can look at the source code for FBGraphObject, in particular, these methods:
#pragma mark -
#pragma mark NSObject overrides
// make the respondsToSelector method do the right thing for the selectors we handle
- (BOOL)respondsToSelector:(SEL)sel
{
return [super respondsToSelector:sel] ||
([FBGraphObject inferredImplTypeForSelector:sel] != SelectorInferredImplTypeNone);
}
- (BOOL)conformsToProtocol:(Protocol *)protocol {
return [super conformsToProtocol:protocol] ||
([FBGraphObject isProtocolImplementationInferable:protocol
checkFBGraphObjectAdoption:YES]);
}
// returns the signature for the method that we will actually invoke
- (NSMethodSignature *)methodSignatureForSelector:(SEL)sel {
SEL alternateSelector = sel;
// if we should forward, to where?
switch ([FBGraphObject inferredImplTypeForSelector:sel]) {
case SelectorInferredImplTypeGet:
alternateSelector = @selector(objectForKey:);
break;
case SelectorInferredImplTypeSet:
alternateSelector = @selector(setObject:forKey:);
break;
case SelectorInferredImplTypeNone:
default:
break;
}
return [super methodSignatureForSelector:alternateSelector];
}
// forwards otherwise missing selectors that match the FBGraphObject convention
- (void)forwardInvocation:(NSInvocation *)invocation {
// if we should forward, to where?
switch ([FBGraphObject inferredImplTypeForSelector:[invocation selector]]) {
case SelectorInferredImplTypeGet: {
// property getter impl uses the selector name as an argument...
NSString *propertyName = NSStringFromSelector([invocation selector]);
[invocation setArgument:&propertyName atIndex:2];
//... to the replacement method objectForKey:
invocation.selector = @selector(objectForKey:);
[invocation invokeWithTarget:self];
break;
}
case SelectorInferredImplTypeSet: {
// property setter impl uses the selector name as an argument...
NSMutableString *propertyName = [NSMutableString stringWithString:NSStringFromSelector([invocation selector])];
// remove 'set' and trailing ':', and lowercase the new first character
[propertyName deleteCharactersInRange:NSMakeRange(0, 3)]; // "set"
[propertyName deleteCharactersInRange:NSMakeRange(propertyName.length - 1, 1)]; // ":"
NSString *firstChar = [[propertyName substringWithRange:NSMakeRange(0,1)] lowercaseString];
[propertyName replaceCharactersInRange:NSMakeRange(0, 1) withString:firstChar];
// the object argument is already in the right place (2), but we need to set the key argument
[invocation setArgument:&propertyName atIndex:3];
// and replace the missing method with setObject:forKey:
invocation.selector = @selector(setObject:forKey:);
[invocation invokeWithTarget:self];
break;
}
case SelectorInferredImplTypeNone:
default:
[super forwardInvocation:invocation];
return;
}
}
This syntax is somewhat similar to the syntax id <protocolDelegate> object syntax
"Somewhat similar"? How about "identical"?
and that dictionary conforms to the protocol
Nah, the declaration says that you have to pass in an object of which the class is NSDictionary, which, at the same time, conforms to the FBGraphUser protocol.
But where does the dot syntax come from
I don't understand this. It comes from the programmer who wrote the piece of code in question. And it is possible because the FBGraphUser protocol declares some properties, which can then be accessed via dot notation.
and how does one state that an arbitrary NSFoundation object corresponds to a protocol without subclassing the object itself and making it conform?
It's not called "NSFoundation", just Foundation. And it's not the object that doesn't "correspond" (because it rather "conforms") to the protocol, but its class. And you just showed the syntax for that yourself.
And how is it implemented? Simple: a category.
#import <Foundation/Foundation.h>
@protocol Foo
@property (readonly, assign) int answer;
@end
@interface NSDictionary (MyCategory) <Foo>
@end
@implementation NSDictionary (MyCategory)
- (int)answer
{
return 42;
}
@end
int main()
{
NSDictionary *d = [NSDictionary dictionary];
NSLog(@"%d", d.answer);
return 0;
}
This is an SSCCE, i.e. it compiles and runs as-is; try it!
What would the underlying code look like to make this sort of syntax work?
Answered above.
Why does it exist?
Because the language is defined like so.
Why would Facebook implement it this way as opposed to just making an object that they can convert the data into?
I don't know, ask the Facebook guys.
