int not returning same value from NSString - ios

I have a value saved in a singleton as an NSString. When I want to convert it to an int, I get some random number. For example, I call NSString *numberCoupons = [Manager sharedInstance].userProfile.numberOfCoupons, and po numberCoupons prints the expected value: 40.
But the problem is on the next line, where I want to convert the string to a number: int coupons = (int)numberCoupons; It returns some random number, e.g. 421414.
What could be the problem?

Try int coupons = [numberCoupons intValue];

numberOfCoupons is obviously an NSNumber object, which is used to store numbers within Objective-C collection classes (NSArray, NSDictionary, etc.), as only objects can be stored in them.
To get the wrapped value out of the object use:
NSInteger coupons = [numberOfCoupons integerValue];
I would recommend redeclaring numberOfCoupons as NSInteger, and not NSNumber, as NSNumber objects are difficult and expensive to manipulate compared to the primitive types they wrap.
If the value needs to go into a collection class then wrap it in an NSNumber object when adding it and unwrap it when removing it.

When you write (int)numberOfCoupons you are asking that the value in the variable numberOfCoupons be cast to the type int.
Now, the value in a variable of type NSString * is a reference to an object, that is, a memory address. When (Objective-)C casts a reference to an integer type, you get back the memory address. This is the “random” value you are seeing.
What you need to do is send a message to the object referenced by the value in your variable requesting that it return an integer value equivalent to itself. NSString has a method intValue for this, so [numberOfCoupons intValue] will do what you wish.
There is a whole family of xValue methods to obtain various integer and floating-point values of different precision/size.
Note: if you have a reference to an NSNumber, rather than an NSString, then exactly the same code will work.
Note 2: if you do have an NSNumber then the cast expression you first tried may return a value which has a completely different magnitude than you might expect for a memory address. This is because some integer values are represented by special tagged addresses which don't actually reference a real object. This is an optimisation you normally would not notice, except when you accidentally cast the reference value to an integer...
HTH

Related

Why Some Variables are Declared with an * Asterisk in Objective-C

I am just starting to learn Objective-C. I am confused to see that some types of variables are sometimes declared with an * asterisk, while others are not. For example, these are declared with a *:
@property NSString *firstName;
NSString *mainString = @"Hello World!";
NSNumber *longNumber = @42l;
NSArray *unsortedStrings = @[@"gammaString", @"alphaString", @"betaString"];
And these are not:
int someInteger = 42;
NSInteger anInteger = 64;
id firstObject = @"someString";
NSRange substringRange = [mainString rangeOfString:@"long"];
I found this explanation from Apple's documentation: https://developer.apple.com/library/archive/documentation/Cocoa/Conceptual/ProgrammingWithObjectiveC/WorkingwithObjects/WorkingwithObjects.html#//apple_ref/doc/uid/TP40011210-CH4-SW1
Both these properties are for Objective-C objects, so they use an asterisk to indicate that they are C pointers.
But this explanation is too general and vague for me to understand the concept. I know type * means it is a pointer type, and this type stores pointers of that type. But why some types are declared with *, others are not?
The int type is not an object. It is a C language “primitive data type”. You generally interact with primitive C data types directly. E.g.,
int i = 0; // it’s now `0`
i = 42; // it’s now `42`
NSInteger is just an alias for another primitive data type, long. NSRange is a struct (thus, also not an object), so the same rule applies. So, for basic interaction with these primitive data types, no * pointer reference is generally needed. (There actually are times you also deal with pointers to primitive data types, but that is beyond the scope of this question.)
NSString, NSNumber, and NSArray, however, are objects, so you must use pointers when declaring variables of those types.
Note, you've included id in the latter list where * is not used:
id firstObject = @"someString";
Be very careful. This actually is a bit misleading, because firstObject actually is a pointer to @"someString", an NSString object. The id type is an exception to the way we generally declare pointers with *; as discussed in “Objective-C is a Dynamic Language”, “The id type defines a generic object pointer.”
It’s analogous to declaring an NSString pointer, but “you lose compile-time information about the object”. Compare the id declaration above to the following NSString * syntax:
NSString *secondObject = @"bar";
This latter secondObject is an NSString * pointer, and because we explicitly declared it as an NSString * pointer, the compiler knows that you are dealing with a string (thus the compiler can perform greater validation regarding your subsequent use of that pointer). With id, you do not enjoy this compile-time validation.
For example, consider two invalid attempts to call the removeObject method on two NSString objects. NSString has no such method. The NSString * reference provides a useful compile-time error message that id does not afford us.
Now, if you used the id type and tried to run that code, it would crash when it attempted to call removeObject on firstObject. But, where possible, it is much better to have the compiler tell us about the error during compile-time, rather than at run-time.
So the id pattern should be used sparingly, only when you really need Objective-C’s dynamic type behaviors.

Objective-C Set Property from Another Class Returns Null

I am trying to init the properties firstName and lastName in XYZPerson from my int main() in different ways. (I'm learning Objective-C and exploring different ways of initializing values.)
However, NSLog always returns null for firstName. I know there are many questions similar to mine, but almost all of them are leaning towards a specific issue but not language grammar itself.
#import <Foundation/Foundation.h>
#import "XYZShoutingPerson.h"
int main(int argc, const char * argv[]) {
    @autoreleasepool {
        XYZPerson *somePerson = [[XYZPerson alloc] init];
        NSString *firstName = [somePerson firstName];
        [somePerson setFirstName:@"Jonny"];
        [somePerson setLastName:@"Appleseed"];
        NSLog(@"%@ %@", firstName, [somePerson lastName]); // prints "(null) Appleseed"
    }
    return 0;
}
@interface XYZPerson : NSObject
@property NSString *firstName;
@property NSString *lastName;
- (id)init;
- (id)initWithFirstName:(NSString *)aFirstName lastName:(NSString *)aLastName;
- (void)saySomething:(NSString *)greeting;
@end
I am just starting to learn Objective-C. I know this is most likely to be a very silly mistake, so pls don't be too harsh on me... Thanks!
Yeah, that worked! But I wonder why storing it in a local variable before setting it doesn't work. Doesn't the address firstName points to remain the same?
It does, which is why your code doesn't work...
The following rounds a few edges but hopefully gets the concepts across.
Let's take a step back from properties and just look at variables, the properties you have are just variables with a little code around them which has no impact here on what's going on.
A variable is just a named box, and that box is capable of storing a (computer) representation, aka value, of something with a particular type. E.g. the declaration:
int height;
is an instruction to use an available[1] box that is capable of holding a representation of something with type int, and to name that box height.
When you assign to a variable what happens is whatever is already in the box is thrown away[2] and then a copy of the value being assigned is placed in the box.
When you read from a variable a copy of what is in the box is made and returned to you.
Now it doesn't matter what the type of value is stored in the box, the above description of the process remains true.
In your case, what is stored in your boxes are values of type NSString *, that is, a reference or pointer to something which represents (in whatever way the computer does so) a string. Essentially, that reference is the representation of the string.
Now consider the assignment:
height = 42;
This always assigns the representation of 42 into the variable/box named height – an integer literal always represents the same value [3].
The same is true for a string literal, so the literal @"Jonny" always represents the string "Jonny".
Now let's look at your code and describe what it is doing in terms of the underlying variables (which the properties are just wrappers for):
NSString *firstName = [somePerson firstName];
Copy the current value in somePerson's firstName variable into the local variable firstName. From what you report at this point in your code somePerson's firstName contains the representation of null so that is what is copied.
[somePerson setFirstName:@"Jonny"];
Copy the representation of @"Jonny" into somePerson's firstName variable, throwing away whatever is already there – in your case a representation of null.
[somePerson setLastName:@"Appleseed"];
Copy the representation of @"Appleseed" into somePerson's lastName variable.
NSLog(@"%@ %@", firstName, [somePerson lastName]);
Copy the representations in the local variable firstName – a representation of null – and in somePerson's lastName – a representation of @"Appleseed" – and pass them to NSLog to convert into printable text and output.
Your misunderstanding might stem from hearing about the type NSMutableString. One of these is represented by a box whose contents can be modified (aka "mutated") rather than simply replaced. The representation of an NSMutableString is a reference/pointer to its box, just as an NSString, but in its case what is in the box can be modified while the reference/pointer will stay the same as the name of the box itself doesn't change.
Hope that helped more than it confused! (It is easier to explain using pictures/a whiteboard.)
[1] How the computer finds an available box is interesting but not important here.
[2] "thrown away" can be quite involved and you'll discover more as you learn; there is something called "automatic reference counting", which you'll find out about under the term "ARC" in Objective-C. But again, that is not important to your question...
[3] Those old/learned enough to know about the early days of FORTRAN who wish to pipe up and say "Ah, but..." at this point go stand in the corner ;-)

Why NSNumber *a = 0 do not display error

I know the correct way to initialize an NSNumber is NSNumber *a = @1;
and when I declare NSNumber *a = 1;, I get the error
Implicit conversion of 'int' to 'NSNumber *' is disallowed with ARC
But I don't know why there is no error when I declare NSNumber *a = 0;
In my case, I have written some functions in an NSNumber category,
and then:
If the value of the NSNumber is @0, I can use the functions in the category normally.
If the value of the NSNumber is 0, I can still use the functions in the category; no error happens, but when I run the app the function is never called.
The value 0 is synonymous with nil or NULL, which are valid values for a pointer.
It's a bit of compatibility with C that leads to this inconsistent behavior.
History
In the C language, there is no special symbol to represent a pointer that points to nothing. Instead, the value 0 (zero) was chosen to represent such a pointer. To make code more understandable, a preprocessor macro was introduced to represent this value: NULL. Because it is a macro, the C compiler itself never sees the symbol; it only sees a 0 (zero).
This means that 0 (zero) is a special value when assigned to pointers. Even though it is an integer, the compiler accepts the assignment without complaining of a type conversion, implicit or otherwise.
To keep compatibility with C, Objective-C allows assigning a literal 0 to any pointer. It is treated by the compiler as identical to assigning nil.
0 is a null pointer constant. A null pointer constant can be assigned to any pointer variable and sets it to NULL or nil. This was the case in C for the last 45 years at least and is also the case in Objective-C. Same as NSNumber* a = nil.
You can consider 0 equivalent to nil or NULL, which may be assigned to an object pointer; but 1 is an integer and cannot be assigned to an object pointer.
Objective-C silently ignores method calls on object pointers with value 0 (i.e. nil). That's why nothing happens when you call a method of your NSNumber category pointer which you assigned the value 0.
A nil value is the safest way to initialize an object pointer if you don’t have another value to use, because it’s perfectly acceptable in Objective-C to send a message to nil. If you do send a message to nil, obviously nothing happens.
Note: If you expect a return value from a message sent to nil, the return value will be nil for object return types, 0 for numeric types, and NO for BOOL types. Returned structures have all members initialized to zero.
See Apple's documentation: Working with nil.

Store ids in C Array

I would like to create a C array, which stores some objects but would like to declare it as follows:
id array = malloc(sizeof(NSObject * 4));
But this gives an error; it asks me to either:
Fix it: use __bridge to convert directly (no change in ownership).
Or:
Fix it: use CFBridgingRelease call to transfer ownership of a +1
'void *' into ARC.
I have tried both, but it still gives an error:
Missing )
I remember having done this; but I forgot how since it has been a while.
How can I store ids in a C array, retrieve them, and then cast them back down?
The size of a pointer is the same for all types, including objects, so the following is all you need:
id *myArray = malloc(sizeof(void *) * 4);
Note that the type used on the left in the posted example was also incorrect, since the memory being allocated is expected to be referenced as a C array of pointers to objects, rather than just an object.
If you're compiling with ARC enabled, you'll need to add a lifetime qualifier to the declaration of myArray, and cast the return value of malloc. That's because ARC can only manage the lifetimes of pointers to objects and the array in which you're going to store the objects is a C type. For example, to tell ARC explicitly that the pointers in the array are unmanaged you could modify the previous code as follows:
__unsafe_unretained id *myArray = (__unsafe_unretained id *) malloc(sizeof(void *) * 4);
Note that since ARC can't manage retain counts in C arrays, it will be up to you to ensure that whatever you store in the array can be used safely.
You can't store objects themselves in C arrays in Objective-C; you can only allocate and store pointers to objects. To do that, the right syntax would be like this:
id *array = malloc(sizeof(*array) * 4);
or possibly
id *array = malloc(sizeof(id) * 4);
but the former is more DRY.

Type casting, why is it so verbose? Or am I doing something wrong?

I have an NSDecimalNumber and an NSInteger. I want to multiply the former by the latter, so I'll have to convert the integer. This is what I came up with, after some trial and error.
NSDecimalNumber *factor = (NSDecimalNumber*) [NSDecimalNumber numberWithInteger:myInteger];
It feels like I'm really driving the point home:
Hi there, I want an NSDecimalNumber, let's call it factor.
By the way, just so you know, I want that to be an NSDecimalNumber, if that'd be possible.
So could you give me an NSDecimalNumber? Here, I'll give you an integer to use as a source.
Is it supposed to be this verbose? Or am I messing up somewhere?
The (NSDecimalNumber*) type-cast is "necessary" because +numberWithInteger: is an NSNumber method that returns an NSNumber object. However, this is actually going to cause problems, because it's returning an NSNumber object, not an NSDecimalNumber. (How to use NSDecimalNumber?)
To get your integer into a decimal number, simply do this:
NSDecimalNumber *factor = [NSDecimalNumber decimalNumberWithDecimal:[@(myInteger) decimalValue]];
It's still fairly verbose but two things to bear in mind:
Objective-C is very verbose, by design.
NSDecimalNumber is not intended for basic integer arithmetic; it's intended for use with numbers that are typically represented using scientific notation.
The API you call is an NSNumber convenience constructor which returns an NSNumber -- not necessarily an NSDecimalNumber. A convenience constructor does not need to return a type of the class you message, but an instance of the declared return type. Because NSDecimalNumber is a subclass of NSNumber, an explicit downcast is required when assigning an NSNumber to an NSDecimalNumber.
If a library writer intended to specify the expectation you have in mind (to return an instance of the class you have messaged), instancetype would be used for the return type. Unfortunately, it is a rather new keyword and does not exist in all places it possibly could exist. If instancetype had been used, the cast would not be necessary.
Before instancetype existed, a simple id was the convention. With id, no cast is required and no type checking performed when assigning/initializing a variable of an Objective-C type. For example: NSArray * array = [NSString string]; would not be flagged by the compiler if the return type were id, but it would be flagged if the return type were instancetype. NSMutableString * string = [NSMutableString string]; would be flagged by neither, but it would be flagged if +string's return type were declared + (NSString *)string;.
