I'm struggling to convert Chinese words/characters to ASCII or hexadecimal, and none of the values I've got so far are what I was supposed to get.
For example, the character 手 should convert to hex 1534b.
The methods I've followed so far are below; I got a variety of results, but not the one I was looking for.
I'd really appreciate it if you could help me out with this issue.
Thanks,
Mike
- (NSString *)stringToHex:(NSString *)str {
    NSUInteger len = [str length];
    unichar *chars = malloc(len * sizeof(unichar));
    [str getCharacters:chars];
    NSMutableString *hexString = [[NSMutableString alloc] init];
    for (NSUInteger i = 0; i < len; i++) {
        [hexString appendFormat:@"%02x", chars[i]];
    }
    free(chars);
    return hexString;
}
and
const char *cString = [@"手" cStringUsingEncoding:NSASCIIStringEncoding]; // returns NULL: 手 has no ASCII representation
Below is similar code in Java for Android; maybe it helps.
public boolean sendText(INotifiableManager manager, String text) {
    final int codeOffset = 0xf100;
    for (char c : text.toCharArray()) {
        int code = (int) c + codeOffset;
        if (!mConnection.getBoolean(manager, "SendKey", Integer.toString(code))) {
            return false; // abort if sending this key code failed
        }
    }
    return true;
}
Your Java code is just doing this: take each 16-bit character of the string and add 0xf100 to it.
If you do the same thing in your Objective-C code above, you will get the result you want. (手 is U+624B, and 0x624B + 0xF100 = 0x1534B, exactly the value you expect.)
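For instance, here is a minimal sketch of your stringToHex method with the same offset applied (the 0xf100 constant is lifted from your Java code; everything else mirrors your original method):

- (NSString *)stringToHex:(NSString *)str {
    const unsigned int codeOffset = 0xf100; // same offset as the Java code
    NSUInteger len = [str length];
    unichar *chars = malloc(len * sizeof(unichar));
    [str getCharacters:chars];
    NSMutableString *hexString = [[NSMutableString alloc] init];
    for (NSUInteger i = 0; i < len; i++) {
        // The sum can need five hex digits (0x624b + 0xf100 = 0x1534b),
        // so use plain %x rather than a fixed width.
        [hexString appendFormat:@"%x", chars[i] + codeOffset];
    }
    free(chars);
    return hexString;
}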
I am trying to convert a byte array to a hex NSString.
Here is the solution I referred to for converting it into a hex NSString. But I discovered it adds ffffffffffffff. How can I get the correct hex NSString?
Best way to serialize an NSData into a hexadecimal string
const char myByteArray[] = {
    0x12, 0x23, 0x34, 0x45, 0x56, 0x67, 0x78, 0x89,
    0x12, 0x23, 0x34, 0x45,
    0x56, 0x67, 0x78, 0x89 };
NSData *myByteData = [NSData dataWithBytes:myByteArray length:sizeof(myByteArray)];
NSMutableString *myHexString = [NSMutableString stringWithCapacity:myByteData.length * 2];
for (int i = 0; i < myByteData.length; i++) {
    NSString *resultString = [NSString stringWithFormat:@"%02lx", (unsigned long)myByteArray[i]];
    [myHexString appendString:resultString];
}
The output string:
12233445566778ffffffffffffff8912233445566778ffffffffffffff89
Don't use unsigned long for each of your bytes. And what's the point of myByteData if you don't use it?
And since you are not really using the bytes as characters, use uint8_t.
Try this:
const uint8_t myByteArray[] = {
    0x12, 0x23, 0x34, 0x45, 0x56, 0x67, 0x78, 0x89,
    0x12, 0x23, 0x34, 0x45,
    0x56, 0x67, 0x78, 0x89 };
size_t len = sizeof(myByteArray) / sizeof(uint8_t);
NSMutableString *myHexString = [NSMutableString stringWithCapacity:len * 2];
for (size_t i = 0; i < len; i++) {
    [myHexString appendFormat:@"%02x", (int)myByteArray[i]];
}
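This produces 12233445566778891223344556677889: each 0x89 byte now prints as 89 instead of being sign-extended to ffffffffffffff89.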
Your initial byte data is char rather than unsigned char. This means that any value > 127 (0x7f) is treated as a two's-complement negative number, and the cast to unsigned long sign-extends it, giving ffffffffffffff89.
If you change your data to be unsigned char you will get the desired result.
const unsigned char myByteArray[] = {
    0x12, 0x23, 0x34, 0x45, 0x56, 0x67, 0x78, 0x89,
    0x12, 0x23, 0x34, 0x45,
    0x56, 0x67, 0x78, 0x89 };
NSData *myByteData = [NSData dataWithBytes:myByteArray length:sizeof(myByteArray)];
NSMutableString *myHexString = [NSMutableString stringWithCapacity:myByteData.length * 2];
for (int i = 0; i < myByteData.length; i++) {
    NSString *resultString = [NSString stringWithFormat:@"%02lx", (unsigned long)myByteArray[i]];
    [myHexString appendString:resultString];
}
How can I find a string's length without using the length method? What kind of algorithm is used to find a string's length?
I already know [str length].
Is any other option available? If so, please tell me.
Thanks.
I hope this helps you
NSString *foo = @"IDontWantToUseStringLength";
// NSUTF16StringEncoding produces 16-bit units, so walk the buffer as unichar
// (wchar_t is 32 bits wide on Apple platforms and would misread it).
const unichar *str = (const unichar *)[foo cStringUsingEncoding:NSUTF16StringEncoding];
int len = 0;
while (str[len] != 0) {
    len++;
}
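Note that [str length] and the loop above both count UTF-16 code units. If you just want to avoid the length method, another option is plain C strlen() on the UTF-8 representation; this is a sketch that counts bytes, so it only matches the character count for pure ASCII strings:

#include <string.h>

NSString *foo = @"IDontWantToUseStringLength";
size_t len = strlen([foo UTF8String]); // byte count; equals the character count only for pure ASCII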
I don't need any serious security; I just need to stop 11-year-olds with plist editors from easily editing the number of coins in my game.
I created a function that takes a string and, for each character, raises its Unicode value by 220 plus 14 times the character's position in the string.
Obviously this will fail (I think) if the string were, say, a million characters long, because eventually you run out of Unicode characters; but for all intents and purposes this will only be used on strings of 20 characters or less.
Are there any Unicode characters in this range that will not be stored in a plist, or will be ignored by Apple's underlying code when I save the plist, so that when I retrieve and decrypt it the character will be gone and I can't decrypt it?
+ (NSString *)encryptString:(NSString *)theString {
    NSMutableString *encryptedFinal = [[NSMutableString alloc] init];
    for (int i = 0; i < theString.length; i++) {
        unichar uniCharacter = [theString characterAtIndex:i];
        uniCharacter += 220 + (14 * i);
        [encryptedFinal appendFormat:@"%C", uniCharacter];
    }
    return encryptedFinal;
}
+ (NSString *)decryptString:(NSString *)theString {
    NSMutableString *decryptedFinal = [[NSMutableString alloc] init];
    for (int i = 0; i < theString.length; i++) {
        unichar uniCharacter = [theString characterAtIndex:i];
        uniCharacter -= 220 + (14 * i); // subtract the offset encryptString added, rather than adding it again
        [decryptedFinal appendFormat:@"%C", uniCharacter];
    }
    return decryptedFinal;
}
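For reference, a quick round-trip check of the two methods (this assumes the subtraction fix in decryptString above; StringCoder is a hypothetical name for whatever class holds these class methods):

NSString *original = @"coins1234";
NSString *scrambled = [StringCoder encryptString:original];
NSString *restored = [StringCoder decryptString:scrambled];
// restored should equal original if the two offsets cancel out
NSLog(@"round trip %@", [restored isEqualToString:original] ? @"OK" : @"FAILED");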
It works for strings of 20 characters or less, provided each character is one of the first 26+26+10+30 characters in the Unicode index at any given position along the 20-character line. It probably works higher; I just didn't test it any higher.
This is the code I created to test it; all Unicode characters were stored in an NSString and stayed valid for counting later.
int i = 0;
NSMutableString *encryptedFinal = [[NSMutableString alloc] init];
NSString *theString = @"a";
int j = 26 + 26 + 10 + 30; // letters + capital letters + numbers + 30 extra things like ?><.\]!#$
int f = 0;
int z = 0;
while (f < j) {
    while (i < 220 + 220 + (14 * 20)) {
        unichar uniCharacter = [theString characterAtIndex:0];
        uniCharacter += f;
        uniCharacter += 220 + (14 * i);
        [encryptedFinal appendFormat:@"%C", uniCharacter];
        i++;
    }
    z += i;
    f++;
    i = 0;
}
NSLog(@"%@", encryptedFinal);
NSLog(@"%i == %lu?", z, (unsigned long)encryptedFinal.length);
There are two things that you can do:

1. Save the number of coins using NSData rather than NSNumber, then use NSData+AES to encrypt it (see the sketch after this list). You can even encrypt your entire .plist file to ensure that no other fields are changed.

2. Security through obscurity. Just save the number of coins under an important-sounding field name, e.g. Security Token Number. You can also create a bogus number-of-coins field whose value is ignored. Or save the same value in both fields and flag the user for cheating if the two values don't match.
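A minimal sketch of the first option, using Apple's CommonCrypto directly rather than the NSData+AES category (the helper name and the bare key handling here are assumptions for illustration, not a recommendation):

#import <CommonCrypto/CommonCryptor.h>

// Hypothetical helper: AES-128-encrypts a plist-ready NSData blob with a 16-byte key.
NSData *EncryptedCoinsData(NSData *plainData, NSData *key16) {
    size_t outLength = plainData.length + kCCBlockSizeAES128; // room for PKCS7 padding
    NSMutableData *cipher = [NSMutableData dataWithLength:outLength];
    size_t written = 0;
    CCCryptorStatus status = CCCrypt(kCCEncrypt, kCCAlgorithmAES128,
                                     kCCOptionPKCS7Padding,
                                     key16.bytes, kCCKeySizeAES128,
                                     NULL, // zero IV: fine for obfuscating coins, weak for real crypto
                                     plainData.bytes, plainData.length,
                                     cipher.mutableBytes, outLength, &written);
    if (status != kCCSuccess) return nil;
    cipher.length = written; // trim to the actual ciphertext size
    return cipher;
}

Decryption is the same call with kCCDecrypt.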
I am allowing the user to input some data into the TextField. The user inputs Š1234D into the TextField.
The code I have looks like this:
NSString *string = textField.text;
for (int nCtr = 0; nCtr < [string length]; nCtr++) {
    const char chars = [string characterAtIndex:nCtr];
    int isAlpha = isalpha(chars);
}
The string output looks like this: Š1234D
Then I printed the first chars value; it looks like '`' instead of 'Š'. Why is this so? I would like to allow special characters in my code as well.
Any suggestions would be welcome; I need some guidance. Thanks.
You are truncating the character value, as [NSString characterAtIndex:] returns unichar (16-bit) and not char (8-bit). Try:
unichar chars = [string characterAtIndex:nCtr];
UPDATE: Also note that you shouldn't use isalpha() to test for letters, as it is restricted to Latin character sets; you need something that can cope with non-Latin characters. Use this code instead:
NSCharacterSet *letterSet = [NSCharacterSet letterCharacterSet];
NSString *string = textField.text;
for (NSUInteger nCtr = 0; nCtr < [string length]; nCtr++)
{
    const unichar c = [string characterAtIndex:nCtr];
    BOOL isAlpha = [letterSet characterIsMember:c];
    ...
}
characterAtIndex: returns a unichar (2-byte Unicode character), not char (1-byte ASCII character). By casting it to char, you are keeping only the low byte: Š is U+0160, and its low byte 0x60 is '`'.
You should turn on your compiler warnings. I believe "Suspicious implicit conversions" should do the trick.
On a separate note, you can't use isalpha() with a unichar. Use [[NSCharacterSet letterCharacterSet] characterIsMember:chars] instead.
I am trying to create an NSString object from a const unichar buffer where I don't know the length of the buffer.
I want to use the NSString stringWithCharacters:length: method to create the string (this seems to work), but please can you help me find out the length?
I have:
const unichar *c_emAdd = [... returns successfully from a C++ function...];
NSString *emAdd = [NSString stringWithCharacters:c_emAdd length:unicharLen];
Can anyone help me find out how to check what unicharLen is? I don't get this length passed back to me by the call to the C++ function, so I presume I'd need to iterate until I find a terminating character? Anyone have a code snippet to help? Thanks!
Is your buffer null-terminated? Is it 16-bit Unicode? If so, the %S format specifier consumes a null-terminated array of 16-bit Unicode characters directly:
NSString *emAdd = [NSString stringWithFormat:@"%S", c_emAdd];
Your unichar buffer should be null-terminated, so when you reach a 16-bit null (a unichar equal to 0x0000) in the pointer you will know the length.
unsigned long long unistrlen(const unichar *chars)
{
    unsigned long long length = 0llu;
    if (NULL == chars) return length;
    while (chars[length] != 0) // scan for the 16-bit terminator
        length++;
    return length;
}

//...
// Inside some method or function
unichar chars[] = { 0x005A, 0x0065, 0x0062, 0x0072, 0x0061, 0x0000 };
NSString *string = [NSString stringWithCharacters:chars length:unistrlen(chars)];
NSLog(@"%@", string);
Or, even simpler, use the %S format specifier:
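// Same result via %S, which reads unichars until the 0x0000 terminator:
NSString *string2 = [NSString stringWithFormat:@"%S", chars];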