NSUInteger in iOS 7

I'm having a really weird problem with NSUInteger in iOS 7.
Everything worked fine before iOS 7, so I guess it's related to the 64-bit support in iOS 7.
My code is like this, very simple:
if (blah blah blah) {
    NSUInteger firstRow = 0;
    firstRow = ([self.types containsObject:self.selectedMajorType] ?
                [self.types indexOfObject:self.selectedMajorType] + 1 : 0);
    ...
}
According to my console:
[self.types containsObject:self.selectedMajorType] is true
[self.types indexOfObject:self.selectedMajorType] + 1 is 1,
no doubt about it, and indexOfObject: also returns an NSUInteger (according to Apple's documentation).
Here's the screenshot:
But firstRow is always 0.
This is so creepy; I don't know what's going on with NSUInteger.
Can someone help me? Thanks a lot!!
____new finding____
I guess this is the problem? It's weird..

I tried to recreate this scenario, but I always got the expected result of 1.
Here is the screenshot:
Here is the project; try running it and see if you still hit the problem.
PS: I was using Xcode 5.1 and the 64-bit iPhone simulator.
=============UPDATE================
Here are some explanations of the lldb commands you used.
po: prints the Objective-C description of an object.
print / p: evaluates a generalized expression in the current frame. Specify the return type with a cast if the function's declaration isn't visible in your program.
Hope this screenshot helps you understand more.
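For example, a quick sanity check in lldb (a sketch reusing the names from the question) is to print the scalar with p, and to cast the method call explicitly so lldb knows its return type, as described above:

(lldb) p firstRow
(lldb) p (NSUInteger)[self.types indexOfObject:self.selectedMajorType]

po is the one to reach for when the value is an object rather than a plain integer.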

Related

Xcode 6 verificationController.m issues

I am using VerificationController.m, provided by Raywenderlich, for validating receipts for in-app purchases. It works fine in Xcode 5, but in Xcode 6 it gives a number of errors, probably due to the C/C++ code, such as:
Missing code for method declaration
@end must appear in an Objective-C context
Conflicting types for 'checkReceiptSecurity'
Can anyone tell me what needs to be done?
Edit: here is a screenshot of the errors.
Have you fixed this? I was running into the exact same problem, so I'll leave my fix here for anyone that comes looking. It turns out that in newer versions of Xcode you aren't allowed to put C/C++ declarations inside an Objective-C context anymore. So I moved the declarations for unsigned int iTS_intermediate_der_len, unsigned char iTS_intermediate_der[], char* base64_encode(const void* buf, size_t size), and void * base64_decode(const char* s, size_t * data_len) to the top of the file, above the @implementation tag.
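For illustration, a rough sketch of that layout (the class name and the abbreviated bodies are assumptions; the point is only the placement above @implementation):

// C declarations moved to file scope, above the Objective-C block
// (in the real file they keep their original definitions/initializers)
unsigned int iTS_intermediate_der_len;
unsigned char iTS_intermediate_der[];
char * base64_encode(const void *buf, size_t size);
void * base64_decode(const char *s, size_t *data_len);

@implementation VerificationController
// ... existing Objective-C methods, unchanged ...
@end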
Have you downloaded the sample code? I downloaded it and it's working fine on my side. It seems that you have missed or added an extra bracket } or { in your code.
Maybe this happened when you were trying to comment out the line [UIDevice currentDevice].uniqueIdentifier;, because that line originally produces an error.

EXC_BAD_ACCESS when updating Swift dictionary after using it to evaluate an NSExpression

I'm using a dictionary to evaluate an expression. When the expression has variables and the dictionary is actually used by NSExpression, something happens and I get EXC_BAD_ACCESS when trying to update the dictionary afterwards. This only happens when debugging on an iPhone 6, not in the simulator and not on an iPhone 4S.
let strExpression = "a+b+20"
let exp = NSExpression(format:strExpression)
self.dictionary = ["a":10.0, "b":15.0, "c":25.0]
let value:AnyObject = exp.expressionValueWithObject(self.dictionary, context: nil)
let doubleValue = value as Double
self.dictionary.updateValue(doubleValue, forKey: "c")
Something really weird is that if I add this line just after creating the dictionary, then it works fine:
let newDic = self.dictionary
I'm using iOS 8.1. Thanks in advance!
With @bensarz's comment in mind, I thought it might be helpful for others searching for answers if I put the response into an actual answer instead of a comment.
Per @LeeWhitney's response on a similar post:
Looks like a compiler bug.
Have you tried switching between Release and Debug then rebuilding? If debug works but not release it can be an indication of a compiler/optimizer bug.
Does it happen in the simulator also?
Your code works for me on iOS 8.1 with Xcode 6.1.
Solution:
The issue seems to be solved by changing the 'Optimization Level' under 'Swift Compiler - Code Generation' to 'None'. The problem appears to lie with the 'Fastest' compiler optimization level.
Also, a workaround that I originally found before changing the compiler setting:
If you use a let statement prior to assigning values in the dictionary, it seems to alleviate the issue. More information can be found at the link below:
EXC_BAD_ACCESS on iOS 8.1 with Dictionary

Swift App - casting from AnyObject to NSArray seems to fail

I posted this question on the Apple iOS Developers' Forum, with a notable lack of response. I'm hoping the StackOverflow wizards can help...
I'm developing an iOS 8 app using Swift. In Xcode beta 5 the code below worked, but it gives me a linker error in beta 6 and beta 7:
var sqlStr = "SELECT count(*) as count FROM nouns WHERE bucket = ?;"
var rs = db.executeQuery(sqlStr, withArgumentsInArray: [0] as NSArray)
The linker error is:
Undefined symbols for architecture x86_64:
__TFSs26_forceBridgeFromObjectiveCU__FTPSs9AnyObject_MQ__Q_", referenced from:
__TFC8les_Mots13WordGenerator9getBucketfS0_FT_Si in WordGenerator.o
(getBucket is a method in the UIViewController WordGenerator. If I reduce the method to just these two lines, I get the same error, and if I comment these two lines out, the error goes away, so I know the problem is here.)
The db.executeQuery() is an FMDB method with this signature:
- (BOOL)executeUpdate:(NSString*)sql withArgumentsInArray:(NSArray *)arguments;
If I change the code to this, it works in all betas:
var sqlStr = "SELECT count(*) as count FROM nouns WHERE bucket = '\(whereClause)';"
var rs = db.executeQuery(sqlStr, withArgumentsInArray: nil)
From the linker error and my trial-and-error attempts to debug this, it appears that the cast of [0], which is of type AnyObject, to the required NSArray is failing. I'm using this example, but I'm seeing similar problems in other areas of the app, always where an AnyObject has to be cast to an NSArray or NSDictionary.
The original code above works just fine in Xcode beta 5, but not in subsequent betas. Clearly I'm not understanding something about the AnyObject to NSArray cast, but I'm darned if I know what, and it appears that betas 6 and 7 enforce something not previously enforced. I've tried every kind of explicit cast I can think of, with no success.
Any help will be GREATLY appreciated.
I changed the Build location and the code now compiles, links and runs. Previously the Build location was Custom/relative to workspace; I changed it to Unique.
I have no idea why this would change things, as I had deleted derived data and cleaned the build folder many times. But...it worked, so if anyone is having the same problems, give this a try.

iOS / Xcode - NSNumber literal updates being ignored by Xcode

1) I declare an NSNumber property in a view controller header file: @property NSNumber *myNumber;
2) I set that property to an NSNumber literal and log the output:
self.myNumber = @9;
NSLog(@"myNumber is: %@", self.myNumber);
3) The above works as expected. Then I change the @9 to @10 and run the program. It does not work; the property is still set to a value of 9.
4) I make a small change to the NSLog text (a space, a full stop, anything) and then rerun the program. It now does work! (It updates to the new value, e.g. @10.)
Any thoughts on why this is happening would be MUCH appreciated, thanks
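Put together, the relevant bits look like this (a condensed sketch; the file names and the bare property declaration are just illustrative):

// ViewController.h (hypothetical file name)
@property NSNumber *myNumber;

// ViewController.m, inside e.g. viewDidLoad
self.myNumber = @9;                        // later edited to @10
NSLog(@"myNumber is: %@", self.myNumber);  // keeps logging 9 until something else in the file is changed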
Sometimes, Xcode gets stuck on little things like this. When something weird like this happens, the first thing to try is a clean: Cmd+Shift+K.

iswalpha() on iOS doesn't return the same value that it does on Mac OS

I have a problem with iswalpha() on iOS.
I am tuning my app in Xcode 4.5, and I tried to pass the Spanish character ú to iswalpha(). Xcode displays the int value of ú as 250.
When I run the app on a real device, iswalpha() returns 0; but in the simulator (I run Xcode on a MacBook Air with 10.8.2) it returns 1.
I guess the reason might be that iOS has a different wide-character implementation than Mac OS does. What is the best way to resolve this?
Enhanced details:
The UTF-16 (Unicode) encoding of the Spanish character ú is 250 as an int value. I think iswalpha() should return 1 on iOS, as it does on Mac OS, rather than 0.
Darn, as a new user I can't post an image here, so for the UTF-16 encoding of ú please refer to:
http://www.fileformat.info/info/unicode/char/fa/index.htm
Well, I can answer my own question now, and leave a development log in case I forget this later:
It seems to be a fault in Apple's implementation of libc on iOS. The implementation of iswalpha() is incomplete with respect to letters in languages other than English. Specific letters (ú, á, ó, ...) in different languages are not recognized by iswalpha() because they fall outside the 0x7F ASCII boundary, and somehow they are not recognized by iOS's locale-processing functions either, even though in the appropriate locale they are obviously still readable alphabetic letters.
Some details about it:
iswalpha() on iOS is tracked down to:
__DARWIN_CTYPE_static_inline int
__istype(__darwin_ct_rune_t _c, unsigned long _f)
{
#ifdef USE_ASCII
    return !!(__maskrune(_c, _f));
#else /* USE_ASCII */
    return (isascii(_c) ? !!(_DefaultRuneLocale.__runetype[_c] & _f)
            : !!__maskrune(_c, _f));
#endif /* USE_ASCII */
}
and it is __maskrune(_c, _f) that ends up returning 0.
It is understandable that Apple missed this, since hardly anybody uses iswalpha() in Objective-C. However, it may still be worth noting for porting projects: it is a widely used function, so it may matter to many legacy projects being ported to iOS. I hope Apple fixes it in a later release.
My workaround for this problem is a wrapper function around iswalpha() which handles these Latin letters with my own code. Now the app runs flawlessly on my iPhone!
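For anyone who needs something similar, here is a minimal sketch of such a wrapper (not the poster's actual code; it simply special-cases the Latin-1 letter range whenever the system iswalpha() says no):

#include <wctype.h>

// Hypothetical wrapper: accept whatever iswalpha() accepts, then additionally
// treat Latin-1 letters (U+00C0-U+00FF, excluding the multiplication sign
// U+00D7 and the division sign U+00F7) as alphabetic.
static int my_iswalpha(wint_t wc)
{
    if (iswalpha(wc))
        return 1;
    if (wc >= 0x00C0 && wc <= 0x00FF && wc != 0x00D7 && wc != 0x00F7)
        return 1;
    return 0;
}

With this, my_iswalpha(250) (ú, U+00FA) returns 1 regardless of what the system function does.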
