Convert hex code to Unicode in Objective-C - iOS

I'm trying to convert this:
NSString *encodedString = @"Les Profs (Com&#xE9;die)";
into another NSString with the actual Unicode characters:
NSString *decodedString = @"Les Profs (Comédie)";
I can't figure out how to do that easily...
Thanks in advance!

Your encoded string contains HTML entities. You need to convert them to their Unicode representation to get the decoded string you want.
For the conversion you can use the following NSString category:
http://code.google.com/p/google-toolbox-for-mac/source/browse/trunk/Foundation/GTMNSString%2BHTML.h
http://code.google.com/p/google-toolbox-for-mac/source/browse/trunk/Foundation/GTMNSString%2BHTML.m
Here's how you would then decode the string:
decodedStr = [encodedStr gtm_stringByUnescapingFromHTML];

CFStringTransform is the Core Foundation function that gives you what you need.
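For the hex numeric entities in particular, here is a minimal sketch using the kCFStringTransformToXMLHex transform applied in reverse (the input string is assumed to contain entities like &#xE9;, as in the question above):
NSMutableString *title = [@"Les Profs (Com&#xE9;die)" mutableCopy];
// reverse = true turns "&#xE9;"-style hex entities back into the characters they stand for
CFStringTransform((__bridge CFMutableStringRef)title, NULL, kCFStringTransformToXMLHex, true);
NSLog(@"%@", title); // Les Profs (Comédie)
Note that this transform only handles numeric hex entities; named entities such as &eacute; still need something like the GTM category above.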

Related

Decoding string in base 64

I need to decode a base 64 string. To do so I use the following code:
// Get the base 64 string vector.
NSString *vector64String = insertRequestDictionnary[@"check"];
// Decode the base 64 string into data.
NSData *vector64Data = [NSData dataFromBase64String: vector64String];
// Get the string from decoded data.
NSString *decodeVector = [[NSString alloc] initWithData: vector64Data
encoding: NSUTF8StringEncoding];
But every time I get a nil string (decodeVector).
I check with this website (http://www.base64decode.org/), my first string (vector64string) is base 64. For example : "h508ILubppN1xLpmXWLfnw==" gives "< uĺf]bߟ"
Thanks.
Not all data is a UTF-8 string. The point of Base64 is to create a string representation of data that is not naturally a string.
NSString *vector64String = @"h508ILubppN1xLpmXWLfnw==";
NSData *vector64Data = [[NSData alloc] initWithBase64EncodedString:vector64String options:0];
NSLog(@"vector64Data: %@", vector64Data);
NSLog output:
vector64Data: <879d3c20 bb9ba693 75c4ba66 5d62df9f>
vector64Data is the decoded Base64 content.
It is not a UTF-8 string; it is just data.
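For contrast, a minimal sketch of a round trip where the payload really is UTF-8 text (the sample string is made up); this only produces a readable NSString because the original bytes were text to begin with:
NSData *textData = [@"hello" dataUsingEncoding:NSUTF8StringEncoding];
NSString *base64 = [textData base64EncodedStringWithOptions:0];            // @"aGVsbG8="
NSData *decodedData = [[NSData alloc] initWithBase64EncodedString:base64 options:0];
NSString *roundTripped = [[NSString alloc] initWithData:decodedData encoding:NSUTF8StringEncoding]; // @"hello"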

Encoding NSString containing 3 byte ASCII characters to a proper NSString

A JSON request returns strings with an HTML encoded Unicode character.
It looks like this: valószín&#369; which should be decoded to valószínű.
In other words, &#369; should be ű.
I found a reference list of non-standard HTML character codes here:
http://www.starr.net/is/type/htmlcodes.html
Is there any easy way to correct this?
It appears that the string is partially escaped. If you encode "valószín&#369;" into an NSData object using:
NSData * data = [@"valószín&#369;" dataUsingEncoding:NSUTF8StringEncoding];
and then create an attributed string using
NSAttributedString * attrString = [[NSAttributedString alloc] initWithHTML:data documentAttributes:nil];
the "u" will be properly converted, but the preceding marks would be mangled:
resulting in
valószínű
An alternative would be to see the following post:
iOS HTML Unicode to NSString?
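On iOS, where initWithHTML:documentAttributes: is not available, a similar sketch (assuming the data really is UTF-8) is to go through the HTML document type and state the encoding explicitly, which should avoid the mangling:
NSData *data = [@"valószín&#369;" dataUsingEncoding:NSUTF8StringEncoding];
NSDictionary *options = @{NSDocumentTypeDocumentAttribute: NSHTMLTextDocumentType,
                          NSCharacterEncodingDocumentAttribute: @(NSUTF8StringEncoding)};
NSAttributedString *attrString = [[NSAttributedString alloc] initWithData:data
                                                                  options:options
                                                       documentAttributes:nil
                                                                    error:NULL];
NSString *decoded = attrString.string; // valószínű
Apple documents that the HTML importer should only be used from the main thread.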

Xcode - UTF-8 String Encoding

I have a strange problem encoding my string.
For example:
NSString *str = @"\u0e09\u0e31\u0e19\u0e23\u0e31\u0e01\u0e04\u0e38\u0e13";
NSString *utf = [str stringByReplacingPercentEscapesUsingEncoding:NSUTF8StringEncoding];
NSLog("utf: %#", utf);
This worked perfectly in log
utf: ฉันรักคุณ
But when I try the same thing with the string I parsed from JSON:
//str is the string parsed from JSON
NSString *str = [spaces stringByReplacingOccurrencesOfString:@"U" withString:@"u"];
NSLog(@"str: %@", str);
NSString *utf = [str stringByReplacingPercentEscapesUsingEncoding:NSUTF8StringEncoding];
NSLog(@"utf: %@", utf);
This doesn't work; the log shows:
str: \u0e09\u0e31\u0e19\u0e23\u0e31\u0e01\u0e04\u0e38\u0e13
utf: \u0e09\u0e31\u0e19\u0e23\u0e31\u0e01\u0e04\u0e38\u0e13
I have been looking for the answer for hours but still have no clue.
Any help would be very much appreciated! Thanks!
The string returned by JSON is actually different: it contains escaped backslashes (for each "\" you see when printing the JSON string, the string-literal equivalent is "\\").
In contrast, your manually created string already consists of "ฉันรักคุณ" from the beginning. You never insert backslash characters; instead, @"\u0e09" (and so on) is a single code point.
You could replace this line
NSString *utf = [str stringByReplacingPercentEscapesUsingEncoding:NSUTF8StringEncoding];
with this line
NSString *utf = str;
and your example output would not change. stringByReplacingPercentEscapesUsingEncoding: deals with a different kind of escaping: percent encoding, as used in URLs.
What you actually need to do is parse the string for string representations of Unicode code points. Here is a link to one potential solution: Using Objective C/Cocoa to unescape unicode characters. However, I would advise you to check the JSON library you are using (if you are using one); it's likely that it provides some way to handle this for you transparently. JSONKit, for example, does.
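If you do end up decoding the literal \uXXXX escapes yourself, one commonly used trick is a round trip through NSNonLossyASCIIStringEncoding, which is exactly the "backslash-u escape for anything non-ASCII" representation. A sketch, using the string from the question written with escaped backslashes:
NSString *escaped = @"\\u0e09\\u0e31\\u0e19\\u0e23\\u0e31\\u0e01\\u0e04\\u0e38\\u0e13";
NSData *escapedData = [escaped dataUsingEncoding:NSUTF8StringEncoding];
NSString *decoded = [[NSString alloc] initWithData:escapedData encoding:NSNonLossyASCIIStringEncoding];
NSLog(@"%@", decoded); // prints the Thai text ฉันรักคุณ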

convert unicode string to nsstring

I have a Unicode string like this:
{\rtf1\ansi\ansicpg1252\cocoartf1265
{\fonttbl\f0\fswiss\fcharset0 Helvetica;\f1\fnil\fcharset0 LucidaGrande;}
{\colortbl;\red255\green255\blue255;}
{\*\listtable{\list\listtemplateid1\listhybrid{\listlevel\levelnfc23\levelnfcn23\leveljc0\leveljcn0\levelfollow0\levelstartat1\levelspace360\levelindent0{\*\levelmarker \{check\}}{\leveltext\leveltemplateid1\'01\uc0\u10003 ;}{\levelnumbers;}\fi-360\li720\lin720 }{\listname ;}\listid1}}
{\*\listoverridetable{\listoverride\listid1\listoverridecount0\ls1}}
\paperw11900\paperh16840\margl1440\margr1440\vieww22880\viewh16200\viewkind0
\pard\li720\fi-720\pardirnatural
\ls1\ilvl0
\f0\fs24 \cf0 {\listtext
\f1 \uc0\u10003
\f0 }One\
{\listtext
\f1 \uc0\u10003
\f0 }Two\
}
Here I have the Unicode data \u10003, which is equivalent to the "✓" character. I have used
[NSString stringWithCharacters:"\u10003" length:NSUTF16StringEncoding]
which throws a compilation error. Please let me know how to convert these Unicode escapes to "✓".
Regards,
Boom
I had the same problem and the following code solved my issue.
To encode:
NSData *dataenc = [yourtext dataUsingEncoding:NSNonLossyASCIIStringEncoding];
NSString *encodevalue = [[NSString alloc]initWithData:dataenc encoding:NSUTF8StringEncoding];
To decode:
NSData *data = [yourtext dataUsingEncoding:NSUTF8StringEncoding];
NSString *decodevalue = [[NSString alloc] initWithData:data encoding:NSNonLossyASCIIStringEncoding];
Thanks
I have used the code below to convert a Unicode string to an NSString. This should work fine.
NSData *unicodedStringData =
[unicodedString dataUsingEncoding:NSUTF8StringEncoding];
NSString *emojiStringValue =
[[NSString alloc] initWithData:unicodedStringData encoding:NSNonLossyASCIIStringEncoding];
In Swift 4
let emoji = "😃"
let unicodedData = emoji.data(using: String.Encoding.utf8, allowLossyConversion: true)
let emojiString = String(data: unicodedData!, encoding: String.Encoding.utf8)
I assume that:
You are reading this RTF data from a file or other external source.
You are parsing it yourself (not using, say, AppKit's built-in RTF parser).
You have a reason why you're parsing it yourself, and that reason isn't “wait, AppKit has this built in?”.
You have come upon \u… in the input you're parsing and need to convert that to a character for further handling and/or inclusion in the output text.
You have ruled out \uc, which is a different thing (it specifies the number of non-Unicode bytes that follow the \u… sequence, if I understood the RTF spec correctly).
\u is followed by a number. In RTF this is a signed 16-bit decimal value, and it is the Unicode code point number for the character the sequence represents. You need to parse those digits into a number, then create an NSString containing that character.
If you're using NSScanner to parse the input, then (assuming you have already scanned past the \u itself) you can simply ask the scanner to scanInt:. Pass a pointer to an int variable.
If you're not using NSScanner, do whatever makes sense for however you're parsing it. For example, if you've converted the RTF data to a C string and are reading through it yourself, you'll want to use strtol to parse the number in base 10; it will also leave a pointer to the next character wherever you want it.
Your int or long variable will then contain the Unicode code point value for the specified character. In the example from your question, that is 10003, i.e. U+2713, the check mark "✓".
Now, for characters in the Basic Multilingual Plane (code points up to 0xFFFF, which includes U+2713), you can simply assign that value to a unichar variable and create an NSString from it. A code point above 0xFFFF won't fit in a single unichar and has to be encoded as a surrogate pair.
Fortunately, CFString has a function that handles both cases for you:
unsigned int codePoint = /*…*/;
unichar characters[2];
NSUInteger numCharacters = 0;
if (CFStringGetSurrogatePairForLongCharacter(codePoint, characters)) {
numCharacters = 2;
} else {
characters[0] = codePoint;
numCharacters = 1;
}
You can then use stringWithCharacters:length: to create an NSString from this array of 16-bit characters.
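Putting the pieces together, a sketch that assumes scanner is an NSScanner positioned just after a \u keyword:
int value = 0;
if ([scanner scanInt:&value]) {
    // RTF writes code points above 32767 as negative 16-bit values, so wrap them back around.
    UTF32Char codePoint = (value < 0) ? (UTF32Char)(value + 0x10000) : (UTF32Char)value;
    unichar characters[2];
    NSUInteger numCharacters = 1;
    if (CFStringGetSurrogatePairForLongCharacter(codePoint, characters)) {
        numCharacters = 2;
    } else {
        characters[0] = (unichar)codePoint;
    }
    NSString *scanned = [NSString stringWithCharacters:characters length:numCharacters];
    // For \u10003 this yields U+2713, "✓".
}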
Use this:
NSString *myUnicodeString = @"\u2713"; // ✓ (the compiler's \u escape takes exactly four hex digits)
Thanks to modern Objective-C.
Let me know if it's not what you want.
NSString *strUnicodeString = @"\u2714";
NSData *unicodedStringData = [strUnicodeString dataUsingEncoding:NSUTF8StringEncoding];
NSString *emojiStringValue = [[NSString alloc] initWithData:unicodedStringData encoding:NSUTF8StringEncoding];

Why is the directly commented encoded string not converting to Arabic?

NSString * string = @"االْحَمْدُ لِلَّهِ رَبِّ الْعَالَمِينَ";
const char *c = [string cStringUsingEncoding:NSUTF8StringEncoding];
NSString *newString = [[NSString alloc]initWithCString:c encoding:NSISOLatin1StringEncoding];
NSLog(#"%#",newString);
// NSString * staticEncodedString = #"اÙÙØ­ÙÙ Ùد٠ÙÙÙÙÙÙ٠رÙبÙ٠اÙÙعÙاÙÙÙ ÙÙÙÙ";
const char *cvvv = [newString cStringUsingEncoding:NSISOLatin1StringEncoding];
NSString *newStringV = [[NSString alloc]initWithCString:cvvv encoding:NSUTF8StringEncoding];
NSLog(#"%#",newStringV);
Why is the directly commented encoded string not converting to Arabic?
When I hardcode the Arabic it encodes and then decodes correctly, but why is the static encoded string not readable as Arabic?
Thanks for your reply, Jake. Yes, I lose data while decoding the staticEncodedString. But all I want is to decode the following string back to Arabic.
NSString * staticEncodedString = #"اÙÙØ­ÙÙ Ùد٠ÙÙÙÙÙÙ٠رÙبÙ٠اÙÙعÙاÙÙÙ ÙÙÙÙ";
The encoding is ANSI, I think; change it to UTF-8 with any tool.
Use Notepad++, for example, and then you can encode it within SQLite or iOS.
Latin-1 cannot represent the Arabic characters, so you cannot encode that string as Latin-1. Arabic has its own 8-bit character set (ISO-8859-6, Latin/Arabic). The method cStringUsingEncoding: will return NULL if the string cannot be losslessly encoded in the specified encoding.
Why would you want to encode an Arabic string to a Latin-X encoding at all? UTF-8 will most likely be the best representation, since it uses only standard characters and a straightforward approach with no headaches. It may take a few more bytes than an Arabic-specific 8-bit encoding, but in most cases it will be worth it.
Converting to Latin-1 will make you lose your text.
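A minimal sketch of that behaviour (the sample word is assumed, written here with \u escapes):
NSString *arabic = @"\u0627\u0644\u0639\u0631\u0628\u064A\u0629"; // "العربية"
const char *latin1 = [arabic cStringUsingEncoding:NSISOLatin1StringEncoding]; // NULL: Latin-1 cannot hold Arabic
NSData *utf8 = [arabic dataUsingEncoding:NSUTF8StringEncoding];               // works: 14 bytes of UTF-8
NSLog(@"latin1: %s, utf8 length: %lu", latin1 ? "non-NULL" : "NULL", (unsigned long)utf8.length);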
