iOS: print unicode character in decimal notation

How would I print a string with one letter in decimal form?
I need to get a unique integer for a character and convert it to a string.
NSString *letter = @"a"; // ---> to @"97"

unichar c = [letter characterAtIndex:0]; // the UTF-16 code unit for 'a', i.e. 97
NSString *charAsNum = [NSString stringWithFormat:@"%d", c];
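A tiny sketch of the reverse conversion, from the decimal string back to the character (the variable names here are mine, not from the question):
// charAsNum holds @"97" from the snippet above.
unichar code = (unichar)[charAsNum intValue];
NSString *back = [NSString stringWithFormat:@"%C", code]; // @"a"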

Related

Objective C: Conversion of large hex to double gives wrong value

I have an issue converting the big hex number 0x500123fb2d414d3d1192a659d4d39dfd to its decimal value:
double serialNumberValue = 0;
NSScanner *scanner = [NSScanner scannerWithString:@"0x500123fb2d414d3d1192a659d4d39dfd"];
[scanner scanHexDouble:&serialNumberValue];
NSString *serialNumber = [NSString stringWithFormat:@"%.f", serialNumberValue];
NSLog(@"serialNumber decimal value: %@", serialNumber);
The result I get is 106344161744262488068834846534090620928, which corresponds to 0x500123FB2D414C000000000000000000 in hex,
while the correct conversion is 106344161744262493917718975250015952381.
How can I solve this issue?
FYI: I need the last 6 digits of the hexadecimal number in its decimal value.
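The root cause: a double carries only a 53-bit mantissa, so scanHexDouble: cannot represent a 128-bit integer exactly; the low bits are rounded away, which is exactly the corruption visible above. Since only the last 6 decimal digits are needed, big-number arithmetic can be avoided entirely by reducing mod 10^6 while walking the hex digits (Horner's rule). A minimal sketch of that idea, not from the original thread:
NSString *hex = @"500123fb2d414d3d1192a659d4d39dfd"; // the value without its 0x prefix
unsigned long long result = 0;
for (NSUInteger i = 0; i < hex.length; i++) {
    unichar c = [hex characterAtIndex:i];
    unsigned digit = (c <= '9') ? c - '0' : (c | 0x20) - 'a' + 10; // parse one hex digit
    result = (result * 16 + digit) % 1000000ULL; // keep only the value mod 10^6
}
NSLog(@"last 6 decimal digits: %06llu", result); // 952381, matching the correct conversion above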

Substring char * in Objective C

I need to substring a char *val to a given length and then convert it to an NSString.
I tried
NSString *tempString = [NSString stringWithCString:val encoding:NSASCIIStringEncoding];
NSRange range = NSMakeRange(0, length);
NSString *finalValue = [tempString substringWithRange:range];
This works, but not for languages with non-ASCII characters such as Chinese.
If I convert with UTF-8 encoding instead, the substring length no longer matches.
Is there another way to substring the char* and then convert it with UTF-8 encoding?
You have to use the encoding the string is actually encoded in.
In your case you tell it to interpret the string as ASCII, and ASCII has no Chinese characters, so this cannot work for Chinese text: the characters simply are not there.
Likely you have a UTF-8 encoded string. But simply switching to UTF-8 does not help. NSString (and OS X/iOS in general) stores text as 16-bit UTF-16 code units, while Unicode code points go beyond 16 bits, so Chinese characters outside the Basic Multilingual Plane need two code units. One effect is that -length returns the number of code units, not the number of characters. However, with -rangeOfComposedCharacterSequencesForRange: you can adjust the range.
For example 𠀖 (CJK Unified Ideograph U+20016):
NSString *str = @"𠀖"; // one Chinese character outside the BMP
NSLog(@"%lu", (unsigned long)[str length]); // logs 2: two UTF-16 code units
NSRange range = {0, 1}; // range for the "first" character
NSLog(@"%lu %lu", (unsigned long)range.location, (unsigned long)range.length); // 0 1
range = [str rangeOfComposedCharacterSequencesForRange:range];
NSLog(@"%lu %lu", (unsigned long)range.location, (unsigned long)range.length); // 0 2
You can get a better answer if you add information about the encoding of the string coming in and the encoding required for the output.
Strings are not "UTF-8 strings" or the like. Strings are strings. Their storage, their representation in computer memory, has an encoding, but the strings themselves do not.
I found the solution to my question:
char subString[length+1];
strncpy(subString, val, length);
subString[length] = '\0'; // place the null terminator
NSString *finalString = [NSString stringWithCString:subString encoding:NSUTF8StringEncoding];
This does both the char* substring and the UTF-8 conversion.
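Worth noting: truncating raw UTF-8 bytes only works when length happens to fall on a character boundary; otherwise the cut bytes will not decode. A safer variant (my own sketch, treating length as a count of UTF-16 code units rather than bytes) converts first and then snaps the range with -rangeOfComposedCharacterSequencesForRange: as described in the answer above:
NSString *tempString = [NSString stringWithCString:val encoding:NSUTF8StringEncoding];
NSRange range = NSMakeRange(0, MIN(length, tempString.length));
// Widen the range so it never splits a surrogate pair or composed sequence.
range = [tempString rangeOfComposedCharacterSequencesForRange:range];
NSString *finalValue = [tempString substringWithRange:range];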

How to convert unicode hex number variable to character in NSString?

I have a range of Unicode numbers and I want to show them in a UILabel. I can show them if I hardcode them, but that is too slow; I want to substitute a variable, change the variable, and get the corresponding character.
For example, given the code point U+095F, I want to show the range U+095F to U+096F in the UILabel. I can do that hardcoded, like
NSString *str = [NSString stringWithFormat:@"\u095f"];
but I want to do it like
NSInteger hex = 0x095f;
[NSString stringWithFormat:@"\u%ld", (long)hex];
so that I can change hex programmatically, just like with @"%ld", (long)hex. Does anybody know how to implement that?
You can initialize the string from a buffer containing the bytes of the hex value (you simply pass its pointer). The important point is that you provide the character encoding to be applied; in particular, pay attention to the byte order.
Here's an example:
UInt32 hex = 0x095f;
NSString *unicodeString = [[NSString alloc] initWithBytes:&hex length:sizeof(hex) encoding:NSUTF32LittleEndianStringEncoding];
Note that solutions using the %C format are fine as long as you use them for 16-bit Unicode characters; 32-bit Unicode characters such as emoji (for example 0x1f601, 0x1f41a) will not work with simple formatting.
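By contrast, the byte-buffer approach above handles such supplementary-plane characters too; a quick sketch using one of the code points just mentioned:
UInt32 emoji = 0x1f601; // U+1F601, outside the 16-bit range that %C can express
NSString *emojiString = [[NSString alloc] initWithBytes:&emoji length:sizeof(emoji) encoding:NSUTF32LittleEndianStringEncoding];
NSLog(@"%@", emojiString); // 😁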
You would have to use
[NSString stringWithFormat:@"%C", (unichar)hex];
or directly declare the unichar (unsigned short) as
unichar uni = 0x095f;
[NSString stringWithFormat:@"%C", uni];
A useful resource might be the String Format Specifiers, which lists %C as
16-bit Unicode character (unichar), printed by NSLog() as an ASCII character, or, if not an ASCII character, in the octal format \ddd or the Unicode hexadecimal format \udddd, where d is a digit.
Like this:
unichar charCode = 0x095f;
NSString *s = [NSString stringWithFormat:@"%C", charCode];
NSLog(@"String = %@", s); // Output: String = य़
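To cover the asker's actual goal, showing the whole range U+095F through U+096F, the %C approach extends naturally to a loop. A minimal sketch (the variable names are mine):
NSMutableString *result = [NSMutableString string];
for (unichar code = 0x095f; code <= 0x096f; code++) {
    // Every code point in this range fits in 16 bits, so %C is safe here.
    [result appendFormat:@"%C", code];
}
// e.g. label.text = result;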

Count characters before some string iOS

I need to count the characters in a string before some substring and after it. For example, in the string "This is example string" I need to know how many characters come before the word "example" (8 in this case) and how many come after it (7 in this case). My idea was to loop over the string and count every character, but how do I stop before the required word? Thanks for every idea!
Check this out:
NSString *sample = @"This is example string";
NSRange b = [sample rangeOfString:@"example"];
if (b.location != NSNotFound) {
    NSLog(@"%lu characters before", (unsigned long)b.location);
    NSLog(@"%lu characters after", (unsigned long)([sample length] - b.location - b.length));
}
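If the surrounding text itself is needed rather than just the counts, the same NSRange drives the extraction; a small sketch extending the answer above (not part of the original):
NSString *before = [sample substringToIndex:b.location]; // @"This is "
NSString *after = [sample substringFromIndex:b.location + b.length]; // @" string"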

Allowing special characters in iOS

I am allowing the user to input some data into a TextField. The user inputs Š1234D.
The code I have looks like this:
NSString *string = textField.text;
for (int nCtr = 0; nCtr < [string length]; nCtr++) {
    const char chars = [string characterAtIndex:nCtr];
    int isAlpha = isalpha(chars);
}
The string output looks like this: Š1234D
Then I printed the first chars value: it shows '`' instead of 'Š'. Why is this so? I would like to allow special characters in my code as well.
Any suggestions would be welcome; I need some guidance. Thanks.
You are truncating the character value, because [NSString characterAtIndex:] returns a unichar (16-bit), not a char (8-bit). Try:
unichar chars = [string characterAtIndex:nCtr];
UPDATE: Also note that you shouldn't be using isalpha() to test for letters, as it is restricted to Latin character sets and you need something that can cope with non-Latin characters. Use this code instead:
NSCharacterSet *letterSet = [NSCharacterSet letterCharacterSet];
NSString *string = textField.text;
for (NSUInteger nCtr = 0; nCtr < [string length]; nCtr++)
{
    const unichar c = [string characterAtIndex:nCtr];
    BOOL isAlpha = [letterSet characterIsMember:c];
    ...
}
characterAtIndex: returns a unichar (2-byte Unicode character), not char (1-byte ASCII character). By casting it to char, you are getting only one of the two bytes.
You should turn on your compiler warnings. I believe "Suspicious implicit conversions" should do the trick.
On a separate note, you can't use isalpha(char) with a unichar. Use [[NSCharacterSet letterCharacterSet] characterIsMember:chars] instead.
