NSArray element to NSString - iOS

I have an application where I receive a message from a socket like "\r\nIDLE|03/17/2013 19:48\n". I convert this message into a UTF-8 string with this code:
NSString* newStr = [[NSString alloc] initWithData:data encoding:NSUTF8StringEncoding];
then I try to separate the command and time from this string:
NSArray *nums = [command componentsSeparatedByString:[NSString stringWithFormat:@"%c", (char)13]];
NSLog(@"First separate = %@", nums);
if ([nums count] == 3)
{
    NSArray *nums1 = [[nums objectAtIndex:1] componentsSeparatedByString:@"|"];
    NSLog(@"second separate = %@", nums1);
    if ([nums1 count] == 2)
    {
        NSString* strState = [(NSString *)[nums1 objectAtIndex:0] description];
        NSLog(@"State = %@", strState);
        ...
    }
}
in the log I see the following:
First separate = (
    "IDLE|03/17/2013 19:48",
    "\n"
)
second separate = (
    "IDLE",
    "03/17/2013 19:48"
)
State = I
After the second split, element 0 contains the text IDLE, but when I try to copy this text into the strState variable, I see only the first character of the text.
Can anybody help me get the full command from this element?
Thank you.
UPDATE1
As I said previously, I get a message from the socket like "\r\nIDLE|03/17/2013 19:48\n".
I'm sure that the server sends me this message.
In the socket read callback I use the following code to read this message:
int result = CFReadStreamRead(_inputStream, (UInt8*)[data mutableBytes], length);
NSString* newStr = [[NSString alloc] initWithData:data encoding:NSUTF8StringEncoding];
Here the app reads 50 bytes, which is right for this message; you can see my debugger at this point in the next picture:
but in the newStr variable I see only \r.
Then I try to separate this message with the code I wrote above.
When I separate the message, the logger shows me the whole message instead of just \r:
Thank you

Have you considered using sscanf()?
char command[COMMAND_SIZE_LIMIT];
char date[DATE_SIZE_LIMIT];
char time[TIME_SIZE_LIMIT];
sscanf (input, "\r\n%[^|]%*[|]%s%s\n", command, date, time);
(Note: Above is example code; you've got a security issue if the scanned string has oversized command or datetime substrings.)
# test
ebg@ebg$ ./foo
Input: '
IDLE|03/17/2013 19:48
'
Command: 'IDLE'
Date: '03/17/2013'
Time: '19:48'
It is easy to convert NSString <==> C string.

There is a fairly straightforward way to accomplish what you want using the -[NSString stringByTrimmingCharactersInSet:] method. Passing the whitespace and newline character set allows you to trim the leading and trailing whitespace and newlines from your string, and then you only need to do a single call to componentsSeparatedByString:.
NSString *raw = @"\r\nIDLE|03/17/2013 19:48\n";
NSString *trimmedString = [raw stringByTrimmingCharactersInSet:[NSCharacterSet whitespaceAndNewlineCharacterSet]];
NSArray *components = [trimmedString componentsSeparatedByString:@"|"];
NSString *command = components[0];
NSString *time = components[1];
NSLog(@"command = %@, time = %@", command, time);
This code yields the following output:
command = IDLE, time = 03/17/2013 19:48

Related

Display Unicode String as Emoji

I currently receive emojis in a payload in the following format:
\\U0001F6A3\\U0000200D\\U00002640\\U0000FE0F
which represents "🚣‍♀️"
However, if I try to display this, it only shows the string above (escaped with one less backslash), not the emoji, e.g.
NSString *emoji = payload[@"emoji"];
NSLog(@"%@", emoji) then displays as \U0001F6A3\U0000200D\U00002640\U0000FE0F
It's as if the unicode escape is not being recognised. How can I get the string above to show as an emoji?
Please assume that the format the data is received in from the server cannot be changed.
UPDATE
I found another way to do it, but I think the answer by Albert posted below is better. I am only posting this for completeness and reference:
NSArray *emojiArray = [unicodeString componentsSeparatedByString:@"\\U"];
NSString *transformedString = @"";
for (NSInteger i = 0; i < [emojiArray count]; i++) {
    NSString *code = emojiArray[i];
    if ([code length] == 0) continue;
    NSScanner *hexScan = [NSScanner scannerWithString:code];
    unsigned int hexNum;
    [hexScan scanHexInt:&hexNum];
    UTF32Char inputChar = hexNum;
    NSString *res = [[NSString alloc] initWithBytes:&inputChar length:4 encoding:NSUTF32LittleEndianStringEncoding];
    transformedString = [transformedString stringByAppendingString:res];
}
Remove the excess backslashes, then convert with a reverse string transform, stringByApplyingTransform:. The transform must use the key "Any-Hex" for emojis.
NSString *payloadString = @"\\U0001F6A3\\U0000200D\\U00002640\\U0000FE0F";
NSString *unescapedPayloadString = [payloadString stringByReplacingOccurrencesOfString:@"\\\\" withString:@"\\"];
NSString *transformedString = [unescapedPayloadString stringByApplyingTransform:@"Any-Hex" reverse:YES];
NSLog(@"%@", transformedString); // logs "🚣‍♀️"
I investigated this, and it seems you may not be receiving what you say you are receiving. If you see \U0001F6A3\U0000200D\U00002640\U0000FE0F in your NSLog, chances are you are actually receiving \\U0001F6A3\\U0000200D\\U00002640\\U0000FE0F at your end instead. I tried using a variable
NSString *toDecode = @"\U0001F6A3\U0000200D\U00002640\U0000FE0F";
self.tv.text = toDecode;
And in textview it is displaying the emoji fine.
So you have to fix that first, and then it will display fine.

How to show special character in UILabel iOS

I am trying to implement an app where I would like to show some text in Spanish format. For example, I would like to show "España", but my label shows "Espa√ɬ±a", and it also mangles some of the other text.
How do I get rid of this? If anybody could help. Thanks.
Edit: When I get my response, it logs the result below:
Message = (
"Espa\U221a\U00c9\U00ac\U00b1a:1.3\U221a\U00c7\U00ac\U00a2/min"
);
But when I extract the value by key from the dictionary, it shows
España:1.3¢/min
It seems that when I get the value from the dictionary it can't do the decoding properly.
How can I resolve this? Any idea?
First convert your response string to NSData using the NSUTF8StringEncoding encoding, then convert the same data back to finalString as below.
NSString *string = @"España"; // Your response string goes here
NSData *data = [string dataUsingEncoding:NSUTF8StringEncoding];
NSString *finalString = [[NSString alloc] initWithData:data encoding:NSUTF8StringEncoding];
lblTemp.text = finalString;
UPDATE 1
I think there is some error in your response. Please see below:
NSString *string = @"Nu\\u0161a Florjan\\u010di\\u010d";
NSString *finalString = [NSString
    stringWithCString:[string cStringUsingEncoding:NSUTF8StringEncoding]
             encoding:NSNonLossyASCIIStringEncoding];
NSLog(@"finalString = %@", finalString);
Output of above code is,
finalString = Nuša Florjančič
UPDATE 2
If you want an output string like "España", your response should be "Espa\u00F1a". See below:
NSString *string = @"Espa\\u00F1a";
NSString *finalString = [NSString
    stringWithCString:[string cStringUsingEncoding:NSUTF8StringEncoding]
             encoding:NSNonLossyASCIIStringEncoding];
NSLog(@"%@", finalString);
Output is España

Get substring between two ASCII control characters

Example Response:
_STX_<Response>
<Error>Cancelled</Error>
</Response>
_ETX_<AuditCharacter_not_XML>_EOT_
_VAR_ are actual ASCII hex codes (reference):
_STX_ = 0x02 // start of text
_ETX_ = 0x03 // end of text
_EOT_ = 0x04 // end of transmission
We are doing some integration with 3rd party devices; for one of them we have a socket that reads in a response from the device. We use CocoaAsyncSocket for this, so the data is available to us as NSData or NSString.
EG:
NSData *strData = [data subdataWithRange:NSMakeRange(0, [data length])];
NSString *msg = [[NSString alloc] initWithData:strData encoding:NSUTF8StringEncoding];
What I am trying to do is get the XML between the ASCII control characters.
One way that works is the following (knowing that NSString is UTF-16):
NSRange rSub = NSMakeRange(1, [msg length] - 5);
NSString *sub = [msg substringWithRange:rSub];
This correctly returns the XML, but it is very limited; what happens when the AuditCharacter is more than one byte/character? We should be getting the string between the two control characters, STX and ETX.
We have tried the following
unichar STX = 0x02; // \u0002 Start of Text
unichar ETX = 0x03; // \u0003 End of Text
unichar EOT = 0x04; // \u0004 End of transmission
unichar ACK = 0x06; // \u0006 ACK
unichar NAK = 0x15; // \u0015 NAK
NSScanner *scanner = [NSScanner scannerWithString:msg];
// Tried as NSString
NSCharacterSet *seperator = [NSCharacterSet characterSetWithCharactersInString:[NSString stringWithFormat:@"<%c>", ETX]];
// Tried as NSData
NSCharacterSet *seperator = [NSCharacterSet characterSetWithBitmapRepresentation:[[NSData alloc] initWithBytes:&ETX length:2]]; // tried length as 1 and 2
That always seems to just return the whole string.
We then tried using a range
NSRange r1 = [msg rangeOfString:[NSString stringWithFormat:@"<%c>", STX]];
NSRange r2 = [msg rangeOfString:[NSString stringWithFormat:@"<%c>", ETX]];
But both ranges always return a length of zero.
I know this has to do with the fact that we're trying to split on/locate the control characters in the string, but I am not sure what the correct way to do this would be.
Any help would be appreciated.
You're searching for the string <_STX_> with angle brackets, but your actual string doesn't have angle brackets there. Just remove the angle brackets.

How to right pad a string using stringWithFormat

I would like to be able to right align a string using spaces. I have to be able to use the stringWithFormat: method.
So far I have tried the recommended format and it does not seem to work: [NSString stringWithFormat:@"%10@", @"test"].
I would expect this to return a string that has six spaces followed by "test" but all I am getting is "test" with no spaces.
It appears that stringWithFormat ignores the sizing requests of the %@ format specifier. However, the %s specifier works correctly:
NSString *test = @"test";
NSString *str = [NSString stringWithFormat:@"%10s", [test cStringUsingEncoding:NSASCIIStringEncoding]];
NSLog(@"'%@'", str);
This prints '      test'.
This is C-style formatting: %nd means the minimum field width is n.
Check the following code:
NSLog(#"%10#",[NSString stringWithFormat:#"%10#",#"test"]);
NSLog(#"%#",[NSString stringWithFormat:#" %#",#"test"]);
NSLog(#"%10#", #"test");
NSLog(#"%10s", [#"test" cStringUsingEncoding:[NSString defaultCStringEncoding]]);
NSLog(#"%10d", 1);
NSString *str = #"test";
int padding = 10-[str length]; //6
if (padding > 0)
{
NSString *pad = [[NSString string] stringByPaddingToLength:padding withString:#" " startingAtIndex:0];
str = [pad stringByAppendingString:str];
}
NSLog(#"%#", str);

Decoding the scanned barcode value to int value

When I scan a barcode I get some value. If its length is 2 I need to display it with "==" appended, if its length is 3 then with "=", and if it is 4 or more it is invalid.
But when I decode the scanned barcode value using NSASCIIStringEncoding, it only displays values up to 127; after that it shows invalid results. E.g., if my barcode value is 9699, the result value is "jem", my added result value is "jem=", the actualstring value is "%å", and the asc value only shows 37.
Here is my code:
- (void)readerView:(ZBarReaderView *)view didReadSymbols:(ZBarSymbolSet *)syms fromImage:(UIImage *)img
{
    // do something useful with results -- cool thing is that you get access to the image too
    for (ZBarSymbol *symbol in syms) {
        [resultsBox setText:symbol.data];
        if ([resultsBox.text length] == 2) {
            addedresult.text = [resultsBox.text stringByAppendingString:@"=="];
        } else if ([resultsBox.text length] == 3) {
            addedresult.text = [resultsBox.text stringByAppendingString:@"="];
        }
        if ([resultsBox.text length] >= 4) {
            addedresult.text = @"Invalid";
        }
        [Base64 initialize];
        NSString *myString = [[NSString alloc] initWithString:addedresult.text];
        NSData *data = [Base64 decode:myString];
        NSString *actualString = [[NSString alloc] initWithData:data encoding:NSASCIIStringEncoding];
        NSLog(@"%@", actualString);
        labeltext.text = actualString;
        int asc = [actualString characterAtIndex:0];
        label.text = [NSString stringWithFormat:@"%d", asc];
        [actualString release];
        break;
    }
}
Since someone revived this question's comments, I'll revive this entire post.
You shouldn't go through NSData to create an NSString from something you already have, and you're probably losing something along the way. Go directly to NSString using stringWithFormat. Also, ASCII will come back and byte you later, if you have a choice, use UTF8.
NSString *actualStringUTF8 = [NSString stringWithFormat:@"%@", [addedresult.text urlEncodeUsingEncoding:NSUTF8StringEncoding]];
NSString *actualStringASCII = [NSString stringWithFormat:@"%@", [addedresult.text urlEncodeUsingEncoding:NSASCIIStringEncoding]];
NSLog(@"%@", actualStringUTF8);
NSLog(@"%s", [actualStringUTF8 UTF8String]); // UTF8String returns a const char*, so use %s
Secondly, I looked into the SDK and it says symbol.data is already an NSString*. Depending on what you want, you may not need to do anything. If you do end up needing to change encoding, make sure you understand why you need to (one good reason is "the rest of the application uses NS****StringEncoding").
Also make sure you compare strings the correct "Objective-C" way:
[actualString isEqualToString: testString];
NOT actualString == testString;
