In Java, this code produces the hex string "efbfbc":
new String((((char)-4 )+ "").getBytes("utf8"),"iso8859-1")
How can I implement this in iOS Objective-C, i.e. from the byte -4, get the hex string "efbfbc"? Here is my attempt:
NSString *string = @"efbfbc";
char converted[([string length] + 1)];
NSString* str = [[NSString alloc]
initWithCString: converted encoding: NSISOLatin1StringEncoding];
NSLog(@"Your result string: %@", str);
but I did not get the expected result.
Check out this AppleDoc for more information about the initWithCString: method. It will give you a clearer idea.
I have solved the problem:
unichar asciiChar = -4;
NSString *string = [NSString stringWithCharacters:&asciiChar length:1];
string = [NSString stringWithCString:[string UTF8String] encoding: NSISOLatin1StringEncoding];
Now the string's hex is "efbfbc".
Thanks everyone.
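As a quick sanity check (just a sketch to confirm the bytes; it assumes string is the result from the snippet above), you can encode the result back to Latin-1 and log the data:
// Re-encode the 3-character result as Latin-1 bytes to verify them
NSData *latin1Bytes = [string dataUsingEncoding:NSISOLatin1StringEncoding];
NSLog(@"%@", latin1Bytes); // logs <efbfbc>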
I am looking for a solution to store English + Arabic + emoji characters in a database and retrieve them back for display.
Below is the code I used to support emoji; after applying it, Arabic text no longer shows.
+(NSString *)emojiToSave:(NSString *)str
{
NSData *dataForEmoji = [str dataUsingEncoding:NSNonLossyASCIIStringEncoding];
NSString *encodevalue = [[NSString alloc]initWithData:dataForEmoji encoding:NSUTF8StringEncoding];
return encodevalue;
}
+(NSString *)emojiToDisplay:(NSString *)str
{
NSData *msgData = [str dataUsingEncoding:NSUTF8StringEncoding];
NSString *goodMsg = [[NSString alloc] initWithData:msgData encoding:NSNonLossyASCIIStringEncoding];
return goodMsg;
}
Can anyone please suggest what change I should make to support Arabic as well?
Thanks in advance.
Try converting the string to Base64, then insert the Base64 string into the database:
//Original string to base64 string
NSString *emojiString = @"مرحبا 😀 Hello";
NSData *emojiData = [emojiString dataUsingEncoding:NSUTF8StringEncoding];
NSString *base64String = [emojiData base64EncodedStringWithOptions:NSDataBase64Encoding64CharacterLineLength];
//Base64 string to original string
NSData *base64Data = [[NSData alloc] initWithBase64EncodedString:base64String options:NSDataBase64DecodingIgnoreUnknownCharacters];
NSString *originalString =[[NSString alloc] initWithData:base64Data encoding:NSUTF8StringEncoding];
NSLog(#"Result: %#",originalString);
Output:
Result: مرحبا 😀 Hello
You have to use an encoding that supports emoji and Arabic characters; ASCII doesn't support that.
You should use NSUTF8StringEncoding everywhere, and you'll be fine.
Why are you using ASCII anyway? Why are you converting a string to an NSData and then back to an NSString again? It doesn't make sense.
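For example, a minimal UTF-8 round-trip sketch (the string literal is just a sample; save the stored bytes in whatever database column you use):
// Mixed Arabic + emoji + English survives a plain UTF-8 round trip
NSString *original = @"مرحبا 😀 Hello";
NSData *stored = [original dataUsingEncoding:NSUTF8StringEncoding]; // bytes to save
NSString *restored = [[NSString alloc] initWithData:stored encoding:NSUTF8StringEncoding];
NSLog(@"%@ equal: %d", restored, [restored isEqualToString:original]); // prints the original and 1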
I'm using
[NSString stringWithFormat:@"%C", (unichar)decimalValueX];
but I have to call it thousands of times and it's simply too slow.
As an alternative I tried this:
sprintf (cString, "%C", (unichar)decimalValueX);
[NSString stringWithCString:cString encoding:NSUTF16StringEncoding];
but no characters are correctly translated.
If I try UTF8 instead of 16:
sprintf (cString, "%C", (unichar)decimalValueX);
[NSString stringWithCString:cString encoding:NSUTF8StringEncoding];
I get alphanumeric, but I don't get foreign characters or other special characters.
Can anyone explain what's going on? Or how to make stringWithFormat faster?
Thanks!
It seems that the %C format does not work with sprintf and related functions for non-ASCII characters. But there is a simpler method:
stringWithCharacters:length:
creates an NSString directly from a unichar array (UTF-16 code points).
For a single unichar this would be just
NSString *string = [NSString stringWithCharacters:&decimalValueX length:1];
Example:
unichar decimalValueX = 8364; // The Euro character
NSString *string = [NSString stringWithCharacters:&decimalValueX length:1];
NSLog(#"%#", string); // €
Example for multiple UTF-16 code points:
unichar utf16[] = { 945, 946, 947 };
NSString *string3 = [NSString stringWithCharacters:utf16 length:3];
NSLog(#"%#", string3); // αβγ
For characters outside of the "basic multilingual plane" (i.e.
characters > U+FFFF) you would have to use 2 UTF-16 code points
per character (surrogate pair).
Or use a different API like
uint32_t utf32[] = { 128123, 128121 };
NSString *string4 = [[NSString alloc] initWithBytes:utf32 length:2*4 encoding:NSUTF32LittleEndianStringEncoding];
NSLog(#"%#", string4); // 👻👹
First I have to convert the hex, but the < > around the value (from the NSData description) generates an exception when I try to convert it to an integer:
NSString *steps = characteristic.value;
int value2 = [steps intValue];
First I have to remove the < >, then convert this hexadecimal string into an integer value.
NSString *strVal = [[NSString alloc] initWithData:characteristic.value encoding:NSUTF8StringEncoding];
int intValue = strVal.intValue;
You can prefix your string with 0x (scanHexInt: also accepts bare hex digits):
NSString *orig = @"000000ae";
NSString *str = [NSString stringWithFormat:@"0x%@", orig];
to be able to use NSScanner:
unsigned int outVal;
NSScanner* scanner = [NSScanner scannerWithString:str];
[scanner scanHexInt:&outVal];
outVal holds your result now.
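Putting the two steps above together (decode the NSData to text, then scan it as hex), a minimal sketch, assuming the characteristic's value really contains the ASCII text of a hex number:
// NSData -> hex text -> unsigned integer
NSString *hexText = [[NSString alloc] initWithData:characteristic.value encoding:NSUTF8StringEncoding];
unsigned int result = 0;
NSScanner *hexScanner = [NSScanner scannerWithString:hexText];
[hexScanner scanHexInt:&result]; // e.g. "000000ae" -> 174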
I have a long string, and I would like to remove a specific hexadecimal character from it.
NSString * myString = @"longlongstringwithcharacters\"ofallsorts\"";
Any suggestions?
The hex character I am after is 08, which corresponds to backspace. How can I use code like the following to substitute it? I have no idea how to represent 08 in a string:
NSString *stringWithoutSpaces = [myString
stringByReplacingOccurrencesOfString:@" " withString:@""];
EDIT:
I will try to clarify a bit more what I am trying to do.
I am trying to remove all occurrences of the character corresponding to hex 08 from the string I receive as the payload.
The payload is in string format; I found the offending character by using the Xcode debugger to view the hex codes of the string, because there was an invalid character when trying to convert the NSData corresponding to the string to an NSDictionary.
I am not sure how to phrase the problem correctly.
- (NSString *)stringFromHexString:(NSString *)hexString {
// The hex codes should all be two characters.
if (([hexString length] % 2) != 0)
return nil;
NSMutableString *string = [NSMutableString string];
for (NSInteger i = 0; i < [hexString length]; i += 2) {
NSString *hex = [hexString substringWithRange:NSMakeRange(i, 2)];
unsigned int decimalValue = 0;
sscanf([hex UTF8String], "%x", &decimalValue);
[string appendFormat:@"%c", (char)decimalValue];
}
return string;
}
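To tie this back to the question, the helper above can produce the one-character string to strip (a usage sketch; it assumes the method lives on the same class as the caller):
// Build the 0x08 character from its hex code, then remove every occurrence of it
NSString *backspace = [self stringFromHexString:@"08"];
NSString *cleaned = [myString stringByReplacingOccurrencesOfString:backspace withString:@""];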
Try this code. It will help you convert hex to a string:
NSString * str = @"68656C6C6F";
NSMutableString * newString = [[[NSMutableString alloc] init] autorelease];
int i = 0;
while (i < [str length])
{
NSString * hexChar = [str substringWithRange: NSMakeRange(i, 2)];
int value = 0;
sscanf([hexChar cStringUsingEncoding:NSASCIIStringEncoding], "%x", &value);
[newString appendFormat:@"%c", (char)value];
i+=2;
}
This will help you convert hex to an NSString.
This code worked for me:
NSString * dataString = message.payloadString;
NSString * wrongCharacter = [[NSString alloc] initWithFormat:@"%c", (char)0x08];
dataString = [dataString stringByReplacingOccurrencesOfString:wrongCharacter withString:@""];
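Since 0x08 is the backspace character, the same replacement can also be written with the \b escape:
// @"\b" is the 0x08 backspace character
dataString = [dataString stringByReplacingOccurrencesOfString:@"\b" withString:@""];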
How do I convert a std::string to an NSString, and why is the result garbled? When I use the lldb po command and look at the console, _data displays the correct string. Why?
std::string resultString = getResult();
NSString *str= [NSString stringWithCString:resultString.c_str() encoding:NSUTF8StringEncoding];
but str comes out garbled.
Maybe the issue here is with the string encoding. I did a little test on this input string and found some results.
Here I used the complex string "\x18\xa4\tp\x01" that you showed in your log. From the result I conclude that NSUTF8StringEncoding does not work with this string.
Here is the code:
+ (void) stringTest {
std::string *resultString = new std::string("\x18\xa4\tp\x01");
NSString *str= [NSString stringWithCString:resultString->c_str() encoding:NSUTF8StringEncoding];
NSString *str2= [NSString stringWithCString:resultString->c_str() encoding:NSASCIIStringEncoding];
NSString *str3= [NSString stringWithCString:resultString->c_str() encoding:[NSString defaultCStringEncoding]];
NSLog(#"str :%#",str);
NSLog(#"str2 :%#",str2);
NSLog(#"str3 :%#",str3);
}
In the log you can see that NSUTF8StringEncoding returns a nil string while the other encodings give a result. I'm not sure which encoding scheme is valid for your string; if we knew the encoding of resultString, we could get a more accurate result here.
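If the encoding is unknown, one way to avoid losing bytes is to carry the raw std::string bytes across as NSData and only convert once the encoding has been identified (a sketch; NSISOLatin1StringEncoding below is just a placeholder candidate):
// Preserve the raw bytes, then try a candidate encoding explicitly
std::string resultString = getResult();
NSData *raw = [NSData dataWithBytes:resultString.data() length:resultString.size()];
NSString *decoded = [[NSString alloc] initWithData:raw encoding:NSISOLatin1StringEncoding]; // replace with the real encoding once known
NSLog(@"%@ -> %@", raw, decoded);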