I'm trying to grab a character from a UITextField and find the ASCII decimal code value for it. I can save the field value into a char variable, but I'm having trouble obtaining the decimal code value. See the code snippet below for the problem.
// grabs letter from text field
char dechar = [self.decInput.text characterAtIndex:0]; // trying to input a capital A (for example)
NSLog(@"dechar: %c", dechar); // verifies that dechar holds my intended letter
// below line is where I need help
int dec = sizeof(dechar);
NSLog(@"dec: %d", dec); // this returns a value of 1
// want to pass my char dechar into below 'A' since this returns the proper ASCII decimal code of 65
int decimalCode = 'A';
NSLog(@"value: %d", decimalCode); // this shows 65 as it should
I know going the other way I can just use...
int dec = [self.decInput.text floatValue];
char letter = dec;
NSLog(#"ch = %c",letter); //this returns the correct ASCII letter
any ideas?
Why are you using the sizeof operator?
Simply do:
int dec = dechar;
This will give you 65 for dec assuming that dechar is A.
BTW - you really should change dechar to unichar, not char.
iOS uses Unicode, not ASCII. Unicode characters in NSString are stored as 16-bit code units, not 8-bit values.
Look at using the NSString method characterAtIndex:, which returns a unichar. A unichar is a 16-bit integer rather than an 8-bit value, so it can represent a lot more characters.
If you want to get ASCII values from an NSString, you should first convert it to ASCII using the NSString method dataUsingEncoding: with NSASCIIStringEncoding, then iterate through the bytes in the data you get back.
Note that ASCII can only represent a tiny fraction of unicode characters though.
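For illustration, here's a minimal sketch of that dataUsingEncoding: approach (assuming the text still comes from self.decInput as in the question):
// Convert the field text to ASCII data and read the byte values.
// allowLossyConversion:NO makes the call return nil if any character has no ASCII equivalent.
NSData *asciiData = [self.decInput.text dataUsingEncoding:NSASCIIStringEncoding allowLossyConversion:NO];
if (asciiData != nil) {
    const unsigned char *bytes = asciiData.bytes;
    for (NSUInteger i = 0; i < asciiData.length; i++) {
        NSLog(@"character %lu has ASCII code %d", (unsigned long)i, bytes[i]);
    }
} else {
    NSLog(@"the text contains characters that cannot be represented in ASCII");
}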
I'm using an old Objective-C routine (let's call it oldObjectiveCFunction) which parses a String by analyzing each char. After analyzing the chars, it divides the String into Strings and returns them in an array called *functions. This is a super reduced sample of how that old function parses the String:
NSMutableArray *functions = [NSMutableArray new];
NSMutableArray *components = [NSMutableArray new];
NSMutableString *sb = [NSMutableString new];
char c;
int sourceLen = (int)source.length;
int index = 0;
while (index < sourceLen) {
    c = [source characterAtIndex:index];
    // here do some random work analyzing the char
    [sb appendString:[NSString stringWithFormat:@"%c", c]];
    if (some condition) {
        [components addObject:(NSString *)sb];
        sb = [NSMutableString new];
        [functions addObject:[components copy]];
    }
    index++; // advance to the next character
}
Later, I'm getting each String of *functions by doing this in Swift:
let functions = oldObjectiveCFunction(string) as? [[String]]
functions?.forEach({ (function) in
    var functionCopy = function.map { $0 }
    for index in 0..<functionCopy.count {
        let string = functionCopy[index]
    }
})
The problem is that it works perfectly with normal strings, but if the String contains Russian names, like this:
РАЦИОН
the output, the content of my let string variable, is this:
\u{10}&\u{18}\u{1e}\u{1d}
How can I get the same Russian string instead of that?
I tried doing this:
let string2 = String(describing: string?.cString(using: String.Encoding.utf8))
but it returns an even stranger result:
"Optional([32, 16, 38, 24, 30, 29, 0])"
Analysis. Sorry, I don't speak Swift or Objective-C, so the following example is given in Python; however, the 4th and 5th columns (Unicode reduced to 8 bits) recall the weird numbers in your question.
for ch in 'РАЦИОН':
    print(ch,                               # character itself
          ord(ch),                          # character unicode in decimal
          '{:04x}'.format(ord(ch)),         # character unicode in hexadecimal
          ord(ch) & 0xFF,                   # unicode reduced to 8-bit decimal
          '{:02x}'.format(ord(ch) & 0xFF))  # unicode reduced to 8-bit hexadecimal
Р 1056 0420 32 20
А 1040 0410 16 10
Ц 1062 0426 38 26
И 1048 0418 24 18
О 1054 041e 30 1e
Н 1053 041d 29 1d
Solution. Hence, you need to fix every place in your code that reduces 16-bit characters to 8 bits:
first, declare unichar c; instead of char c;, and then use [sb appendString:[NSString stringWithFormat:@"%C", c]]; where the character is appended to sb. Note that
the Latin capital letter C in the %C specifier formats a 16-bit UTF-16 code unit (unichar), whereas
the Latin small letter c in the %c specifier formats an 8-bit unsigned character (unsigned char).
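For illustration, here's a small self-contained sketch (the string literal is just an assumed example) showing that keeping each character as a unichar and formatting it with %C preserves the Cyrillic text:
NSString *source = @"РАЦИОН";
NSMutableString *sb = [NSMutableString new];
for (NSUInteger i = 0; i < source.length; i++) {
    unichar c = [source characterAtIndex:i];                 // 16-bit UTF-16 code unit
    [sb appendString:[NSString stringWithFormat:@"%C", c]];  // %C keeps all 16 bits
}
NSLog(@"%@", sb); // prints РАЦИОН; with char and %c this would produce control characters instead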
Resources. My answer is based on answers to the following questions at SO:
What are the supported Swift String format specifiers?
objective-c - difference between char and unichar?
Your last result is not strange. The optional comes from the string?, and the cString() function returns an array of CChar (Int8).
I think the problem comes from here - but I'm not sure because the whole thing looks confusing:
[sb appendString:[NSString stringWithFormat:@"%c", c]];
Have you tried:
[sb appendString: [NSString stringWithCString:c encoding:NSUTF8StringEncoding]];
Instead of stringWithFormat?
(The solution of %C instead of %c proposed by your commenters looks like a good idea too.) Oops, I just saw that you already tried that without success.
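For what it's worth, a hedged sketch of what that stringWithCString: idea would need in order to compile: the method expects a NUL-terminated C string, not a single char, so you would have to wrap the character in a small buffer first (and it still cannot represent anything outside 8 bits, so it would not fix the Cyrillic case):
// Hypothetical variant of the stringWithCString: suggestion above.
// For a byte that is not valid UTF-8 the method returns nil, hence the check.
char buffer[2] = { c, '\0' };
NSString *piece = [NSString stringWithCString:buffer encoding:NSUTF8StringEncoding];
if (piece != nil) {
    [sb appendString:piece];
}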
I need to find a way to convert an arbitrary character typed by a user into an ASCII representation to be sent to a network service. My current approach is to create a lookup dictionary and send the corresponding code. After creating this dictionary, I can see that it is hard to maintain and hard to verify for completeness:
__asciiKeycodes[@"F1"] = @(112);
__asciiKeycodes[@"F2"] = @(113);
__asciiKeycodes[@"F3"] = @(114);
//...
__asciiKeycodes[@"a"] = @(97);
__asciiKeycodes[@"b"] = @(98);
__asciiKeycodes[@"c"] = @(99);
Is there a better way to get the ASCII character code from an arbitrary key typed by a user (using a standard 104-key keyboard)?
Objective-C has C's primitive data types, so there is a little trick you can use: store the keystroke in a char, then cast it to an int. The default conversion in C from a char to an int is that char's ASCII value. Here's a quick example.
char character = 'a';
NSLog(@"a = %d", (int)character);
Console output: a = 97
To go the other way around, cast an int to a char:
int asciiValue = 97;
NSLog(@"97 = %c", (char)asciiValue);
Console output: 97 = a
Alternatively, you can do a direct conversion within initialization of your int or char and store it in a variable.
char asciiToCharOf97 = (char)97; //Stores 'a' in asciiToCharOf97
int charToAsciiOfA = (int)'a'; //Stores 97 in charToAsciiOfA
This seems to work for most keyboard keys, not sure about function keys and return key.
NSString *input = @"abcdefghijklmnopqrstuvwxyzABCDEFGHIJKLMNOPQRSTUVWXYZ1234567890!@#$%^&*()_+[]\\{}|;':\"\\,./<>?~ ";
for (int i = 0; i < input.length; i++)
{
    NSLog(@"Found (at %i): %i", i, [input characterAtIndex:i]);
}
Use a stringWithFormat: call and pass the int values.
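A minimal sketch of that suggestion (assuming you already have the character code as an int):
// Format the integer code as text, or turn it back into its character.
int code = 97; // ASCII code for 'a'
NSString *asNumber = [NSString stringWithFormat:@"%d", code];       // @"97"
NSString *asLetter = [NSString stringWithFormat:@"%c", (char)code]; // @"a"
NSLog(@"%@ -> %@", asNumber, asLetter);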
Now I have a range of Unicode code points that I want to show in a UILabel. I can show them if I hardcode them, but that's too slow, so I want to substitute a variable and then change the variable to get the relevant character.
For example, I know the code point is U+095F and I want to show the range U+095F to U+096F in the UILabel. I can do that with a hardcoded value like
NSString *str = [NSString stringWithFormat:@"\u095f"];
but I want to do that like
NSInteger hex = 0x095f;
[NSString stringWithFormat:@"\u%ld", (long)hex];
I can change hex programmatically, just like using @"%ld", (long)hex, so does anybody know how to implement that?
You can initialize the string with a buffer containing the bytes of the hex value (you simply provide a pointer to it). The important thing to notice is that you must provide the character encoding to be applied, and in particular you should pay attention to the byte order.
Here's an example:
UInt32 hex = 0x095f;
NSString *unicodeString = [[NSString alloc] initWithBytes:&hex length:sizeof(hex) encoding:NSUTF32LittleEndianStringEncoding];
Note that solutions like using the %C format are fine as long as you use them for 16-bit Unicode characters; 32-bit Unicode characters like emoji (for example 0x1f601, 0x1f41a) will not work with simple formatting.
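For example, a quick sketch of the same byte-buffer approach with one of those 32-bit code points (0x1f601 from the note above):
// 0x1F601 does not fit in a 16-bit unichar, but initWithBytes: still handles it.
UInt32 emoji = 0x1F601;
NSString *emojiString = [[NSString alloc] initWithBytes:&emoji
                                                 length:sizeof(emoji)
                                               encoding:NSUTF32LittleEndianStringEncoding];
NSLog(@"%@", emojiString); // prints the grinning-face emoji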
You would have to use
[NSString stringWithFormat:@"%C", (unichar)hex];
or directly declare the unichar (unsigned short) as
unichar uni = 0x095f;
[NSString stringWithFormat:@"%C", uni];
A useful resource might be the String Format Specifiers, which lists %C as
16-bit Unicode character (unichar), printed by NSLog() as an ASCII character, or, if not an ASCII character, in the octal format \ddd or the Unicode hexadecimal format \udddd, where d is a digit.
Like this:
unichar charCode = 0x095f;
NSString *s = [NSString stringWithFormat:@"%C", charCode];
NSLog(@"String = %@", s); // Output: String = य़
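And since the question asks for a whole range, here is a hedged sketch (assuming every code point in the range is a valid 16-bit character) that builds U+095F through U+096F in a loop:
// Build one string containing every code point from U+095F to U+096F.
NSMutableString *range = [NSMutableString new];
for (unichar code = 0x095F; code <= 0x096F; code++) {
    [range appendFormat:@"%C", code];
}
NSLog(@"range = %@", range); // the result can be assigned to the UILabel's text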
I have a set of legacy data that includes individual Unicode chars, formed based on this struct:
struct LocalGrRec {
    wchar_t cBegin;
    int x2;
    wchar_t cEnd;
    int x2;
};
and a typical record looks like this, i.e., it includes both long and short Unicode characters:
{L'a', 0, L'¥', 3}
I can change the struct to make it easier to handle reading these characters into character variables:
wchar_t c = rec.cBegin;
// or
UTF32Char c = rec.cBegin;
Which one (or perhaps another choice that I don't know of) would make it easier to handle it. Please note that I need to process them as individual chars, but eventually I'll need to include them in an NSString.
What solution gives me the maximum flexibility and minimum pain?
And how would I read that character into an NSString?
Thanks
edit:
I need to compose NSString with it, not the other way around.
With unichar, here's the problem:
unichar c = L'•';
NSLog(@"%c", c); // produces: (") wrong character, presumably the first half of '•'
NSLog(@"%C", c); // produces: (\342\200)
I think you are looking for this method:
+ (NSString *)stringWithCharacters:(const unichar *)characters length:(NSUInteger)length;
Just pass it an array of unichars and a length, and it will give you an NSString back:
unichar list[3] = {'A', 'B', 'C'};
NSString *listString = [NSString stringWithCharacters:list length:3];
NSLog(@"listString: %@", listString);
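For code points that may not fit in a single unichar (the question also mentions UTF32Char), a hedged alternative is to go through a UTF-32 byte buffer, along these lines:
// Convert one UTF32Char value (for example rec.cBegin read into a 32-bit variable)
// into an NSString; this works for BMP characters and for code points above U+FFFF alike.
UTF32Char c32 = 0x2022; // '•' used as an example code point
NSString *s = [[NSString alloc] initWithBytes:&c32
                                       length:sizeof(c32)
                                     encoding:NSUTF32LittleEndianStringEncoding];
NSLog(@"%@", s); // prints •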
I have a question regarding converting a string to an int value. If I have a string called "001223", I get 1223 as the intValue, but I want to keep 001223 as the final value. Please let me know if my question is not clear. Thanks for your time.
There is no difference in value between the numbers 001223, 1223, 2446/2 or 1223.000. They all refer to the same number.
If you want to keep leading zeroes, then you need to either keep it as a string or maintain another piece of information so it can be rebuilt later, basically the number of zeroes at the front, such as:
struct sNumWithLeadingZeros {
    size_t zeroCount;
    unsigned int actualValue;
};
I'd probably suggest the former (keeping it as a string) since that's likely to be less effort.
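If you do go the struct route, here is a rough sketch (assuming the input contains only digits, with a non-zero value after the leading zeros) of splitting @"001223" into the two pieces and rebuilding the text later:
NSString *input = @"001223";
struct sNumWithLeadingZeros num;

// Count the leading zeros, then keep the numeric value separately.
num.zeroCount = 0;
while (num.zeroCount < input.length && [input characterAtIndex:num.zeroCount] == '0') {
    num.zeroCount++;
}
num.actualValue = (unsigned int)[input intValue];

// Rebuild the original text: the counted zeros followed by the value.
NSString *zeros = [@"" stringByPaddingToLength:num.zeroCount withString:@"0" startingAtIndex:0];
NSLog(@"%@%u", zeros, num.actualValue); // 001223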
"Leading zeros" are to do with the textual representation of an integer, when stored as integer values in a computer the leading zeros do not exist.
However, if what you want is to display the number with the same number of digits it had before being converted from text, you can do this: if the string contains only the digits of the number, e.g. @"001223", take the length of that string to determine the number of digits. Later, when converting the number back to string form, use a formatted conversion such as stringWithFormat: with a format specifier that requests the required number of digits. You'll need to read up on formats in the documentation, but here is an example:
NSString *input = @"001223";
int x = [input intValue];
int digits = (int)input.length;
NSString *output = [NSString stringWithFormat:@"%0*d", digits, x];
The value of output will be the same as input. The format broken down: 0 - pad with leading zeros; * - use a dynamic field width, taken from the value of digits; d - int.
HTH
One cannot store leading 0s in an int data type. Also, a number written with a 0 prefix in source code is octal, not decimal; an octal value is just the same number expressed in a different base, and for base conversions you can use a wrapper class like Integer.
But if one wants leading 0s purely for displaying data, then he/she can use the following code:
public class Sample
{
    public static void main(final String[] argv)
    {
        System.out.printf("%06d", 1223);
        System.out.println();
    }
}