I want to get a C string from an NSString.
So I used the cStringUsingEncoding: method.
However, the lifetime of the buffer returned by cStringUsingEncoding: is not guaranteed.
(Apple's docs note that the returned C string is guaranteed to be valid only until the receiver is freed or the current autorelease pool is emptied.)
So Apple recommends the getCString:maxLength:encoding: method instead.
I want to pass the exact length to maxLength.
Example 1)
NSString *tmp = #"中日韓" // cString 9bytes
char *buffer = new tmp[9 + 1];
[tmp getCString:buffer maxLength:9+1 encoding:NSUTF8StringEncoding];
Example 2)
NSString *tmp = #"中日韓123" // cString 12bytes
char *buffer = new tmp[12 + 1];
[tmp getCString:buffer maxLength:12+1 encoding:NSUTF8StringEncoding];
Is there a way to know the lengths 9 and 12 in the examples above?
// Add one because this doesn't include the NULL
NSUInteger maxLength = [string maximumLengthOfBytesUsingEncoding:NSUTF8StringEncoding] + 1;
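Note that maximumLengthOfBytesUsingEncoding: only gives a safe upper bound. If you want the exact byte counts (the 9 and 12 from the examples), lengthOfBytesUsingEncoding: should return them; here is a minimal, untested sketch combining the two:
NSString *tmp = @"中日韓";
// Exact number of bytes of the UTF-8 representation, NOT counting the trailing '\0' (9 here)
NSUInteger exactLength = [tmp lengthOfBytesUsingEncoding:NSUTF8StringEncoding];
char *buffer = malloc(exactLength + 1); // +1 for the NUL terminator
if ([tmp getCString:buffer maxLength:exactLength + 1 encoding:NSUTF8StringEncoding]) {
    // buffer now holds the NUL-terminated UTF-8 string
}
free(buffer);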
You can use cStringUsingEncoding to get the length. If you need the resulting char * to live longer than tmp, then simply copy the C-string:
NSString *tmp = #"中日韓" // cString 9bytes
const char *cStr = [tmp cStringUsingEncoding:NSUTF8StringEncoding];
size_t len = strlen(cStr);
char *buffer = new tmp[len + 1];
strcpy(buffer, cStr);
I want to convert an Objective-C NSString that contains, for example, the string "0d" to the hex value 0x0d. What's the easiest way to achieve that?
Example:
NSString *str = #"50";
unsigned char mac[1];
mac[0] = 0x50; //<- how to set mac[0] to 0x50 from "str" string?
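For a single two-character hex string like the "50" above, one possible approach (a minimal sketch, not taken from the answers) is to use NSScanner's scanHexInt: method:
NSString *str = @"50";
unsigned int value = 0;
// scanHexInt: parses the string as hexadecimal, so "50" becomes 0x50
[[NSScanner scannerWithString:str] scanHexInt:&value];
unsigned char mac[1];
mac[0] = (unsigned char)value; // mac[0] is now 0x50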
- (NSData *)dataFromHexString {
    const char *chars = [self UTF8String];
    NSUInteger i = 0, len = self.length;
    NSMutableData *data = [NSMutableData dataWithCapacity:len / 2];
    char byteChars[3] = {'\0', '\0', '\0'};
    unsigned long wholeByte;
    while (i < len) {
        byteChars[0] = chars[i++];
        byteChars[1] = chars[i++];
        wholeByte = strtoul(byteChars, NULL, 16); // parse two hex characters as one byte
        [data appendBytes:&wholeByte length:1];
    }
    return data;
}
I have a string variable in iOS and I would like to convert it to a character array and then to hex bytes like 0xD6, 0xD6, etc.
It would be great if there is a library in Objective-C that I can use for this.
Swift 4
String to bytes:
let strChar = "A"
let data1 = [UInt8](strChar.utf8)
Maybe the answer is here:
String to chars:
NSString *s = @"Some string";
const char *c = [s UTF8String];
Chars to hex:
- (NSData *)dataFromHexString {
    const char *chars = [self UTF8String];
    NSUInteger i = 0, len = self.length;
    NSMutableData *data = [NSMutableData dataWithCapacity:len / 2];
    char byteChars[3] = {'\0', '\0', '\0'};
    unsigned long wholeByte;
    while (i < len) {
        byteChars[0] = chars[i++];
        byteChars[1] = chars[i++];
        wholeByte = strtoul(byteChars, NULL, 16); // parse two hex characters as one byte
        [data appendBytes:&wholeByte length:1];
    }
    return data;
}
Reference: NSString (hex) to bytes
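Assuming dataFromHexString is added as a category method on NSString (a hypothetical NSString+Hex category, sketched for illustration), usage would look roughly like this:
NSString *hex = @"D6D6";
NSData *bytes = [hex dataFromHexString];
const unsigned char *raw = [bytes bytes];
NSLog(@"first byte: 0x%02X", raw[0]); // logs 0xD6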
I have an NSData object that contains just <64>, which is supposed to represent the int 100.
How can I convert this NSData to an int?
I can convert it to its character equivalent "d" using
NSString *string = [[NSString alloc] initWithData:characteristic.value encoding:NSUTF8StringEncoding];
but I need the decimal equivalent, 100.
Thanks
<64> means that the NSData object contains a single byte with the value 0x64 = 100,
so the following should work:
const uint8_t *bytes = [data bytes]; // pointer to the bytes in data
int value = bytes[0]; // first byte
int *b = (int *)data.bytes;
printf("%d",*b); //prints 100
The logic below converts NSData to an integer perfectly. The length of the bytes does not matter; it just works.
NSData *data; // assume this already holds the bytes to convert
NSString *stringData = [data description];
stringData = [stringData substringWithRange:NSMakeRange(1, [stringData length] - 2)]; // strip the "<" and ">"
unsigned dataAsInt = 0;
NSScanner *scanner = [NSScanner scannerWithString:stringData];
[scanner scanHexInt:&dataAsInt];
According to my requirements:
The input string has to be converted into byte values.
Each character of the string, which is a 16-bit value, has to be converted to its low 8 bits.
The SHA-1 is then computed over the byte array.
The resulting SHA-1 is converted into a 40-character string.
I know how to convert a string into a SHA-1, but the rest of the process is a bit gloomy to me.
I have been able to do the last two steps:
unsigned char digest[CC_SHA1_DIGEST_LENGTH];
NSData *dataString = [yourString dataUsingEncoding:NSUTF8StringEncoding];
if (CC_SHA1([dataString bytes], (CC_LONG)[dataString length], digest)) {
    // SHA-1 is calculated & stored in digest.
}
Any help will be appreciated.
I have created this function, which works according to your requirements. You just have to pass in a string.
#import <CommonCrypto/CommonDigest.h>
- (NSString *)calculateSHA:(NSString *)yourString
{
    const char *ptr = [yourString UTF8String];
    int i = 0;
    int len = (int)strlen(ptr);
    Byte byteArray[len];
    while (i != len)
    {
        unsigned eachChar = *(ptr + i);
        unsigned low8Bits = eachChar & 0xFF; // keep only the low 8 bits
        byteArray[i] = low8Bits;
        i++;
    }
    unsigned char digest[CC_SHA1_DIGEST_LENGTH];
    CC_SHA1(byteArray, len, digest);
    // Convert the 20-byte digest into a 40-character hex string
    NSMutableString *hex = [NSMutableString string];
    for (int j = 0; j < CC_SHA1_DIGEST_LENGTH; j++)
        [hex appendFormat:@"%02x", digest[j]];
    NSString *immutableHex = [NSString stringWithString:hex];
    return immutableHex;
}
Then you just have to call the above method.
[self calculateSHA:yourString];
NSData *dataString = [yourString dataUsingEncoding: NSUTF8StringEncoding];
converts the string to UTF-8 bytes, e.g. "é" (U+00E9) is converted to the two bytes C3 A9, and "€" (U+20AC) is converted to the three bytes E2 82 AC.
If your requirement is to "truncate" the Unicode characters to their lower 8 bits, you have to do this "manually"; I do not know of a built-in encoding that could be used for that:
NSMutableData *dataString = [NSMutableData dataWithLength:[yourString length]];
uint8_t *dataBytes = [dataString mutableBytes];
for (NSUInteger i = 0; i < [yourString length]; i++) {
    // assigning the unichar to a uint8_t truncates it to the lower 8 bits:
    dataBytes[i] = [yourString characterAtIndex:i];
}
Based on your code snippet, you want to do something like:
unsigned char digest[CC_SHA1_DIGEST_LENGTH];
NSData *dataString = [yourString dataUsingEncoding:NSUTF8StringEncoding];
NSMutableString *outString = [NSMutableString string]; // must be initialized, or appendFormat: does nothing
if (CC_SHA1([dataString bytes], (CC_LONG)[dataString length], digest)) {
    for (int i = 0; i < CC_SHA1_DIGEST_LENGTH; i++) {
        [outString appendFormat:@"%02x", digest[i]];
    }
}
Where outString will be your 40-char string.
Here's an NSString category for creating a SHA1 hash of an NSString.
Creating SHA1 Hash from NSString
In my iPhone app I am getting the device token from Apple, which I assign to a public property inside the delegate file, as shown below:
- (void)application:(UIApplication *)application didRegisterForRemoteNotificationsWithDeviceToken:(NSData *)deviceToken
{
    self.dToken = [[NSString alloc] initWithData:deviceToken encoding:NSUTF8StringEncoding];
}
The dToken property is declared as shown below:
NSString *dToken;
@property (nonatomic, retain) NSString *dToken;
But when I try to retrieve the device token from another file I get the null value.
+ (NSString *)getDeviceToken
{
    NSString *deviceToken = [(MyAppDelegate *)[[UIApplication sharedApplication] delegate] dToken];
    NSLog(@"getDeviceToken = %@", deviceToken); // This prints NULL
    return deviceToken;
}
What am I doing wrong?
I suggest you convert the token to a string this way:
self.dToken = [[[deviceToken description]
    stringByTrimmingCharactersInSet:[NSCharacterSet characterSetWithCharactersInString:@"<>"]]
    stringByReplacingOccurrencesOfString:@" "
    withString:@""];
UPDATED:
As many people have mentioned, it is better to use the following approach to convert NSData * to NSString *:
@implementation NSData (Conversion)
- (NSString *)hexadecimalString
{
    const unsigned char *dataBuffer = (const unsigned char *)[self bytes];
    if (!dataBuffer) {
        return [NSString string];
    }
    NSUInteger dataLength = [self length];
    NSMutableString *hexString = [NSMutableString stringWithCapacity:(dataLength * 2)];
    for (int i = 0; i < dataLength; ++i) {
        [hexString appendFormat:@"%02lx", (unsigned long)dataBuffer[i]];
    }
    return hexString;
}
@end
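With that category in place, the delegate method from the question could store the token roughly like this (a sketch; hexadecimalString is the category method defined above):
- (void)application:(UIApplication *)application didRegisterForRemoteNotificationsWithDeviceToken:(NSData *)deviceToken
{
    // Store the token as a hex string instead of trying to decode it as UTF-8
    self.dToken = [deviceToken hexadecimalString];
}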
From the discussion at Best way to serialize an NSData into a hexadeximal string, here is a better way to do it. It's longer, but your code will be future-proof if Apple changes the way NSData emits debugger descriptions.
Extend NSData as follows:
@implementation NSData (Hex)
- (NSString *)hexString {
    unichar *hexChars = (unichar *)malloc(sizeof(unichar) * (self.length * 2));
    unsigned char *bytes = (unsigned char *)self.bytes;
    for (NSUInteger i = 0; i < self.length; i++) {
        // high nibble
        unichar c = bytes[i] / 16;
        if (c < 10) c += '0';
        else c += 'A' - 10;
        hexChars[i * 2] = c;
        // low nibble
        c = bytes[i] % 16;
        if (c < 10) c += '0';
        else c += 'A' - 10;
        hexChars[i * 2 + 1] = c;
    }
    NSString *retVal = [[NSString alloc] initWithCharactersNoCopy:hexChars
                                                           length:self.length * 2
                                                     freeWhenDone:YES];
    return [retVal autorelease];
}
@end
I know that this is an old question, and this may be new information that has come up since then, but I'd just like to point something out to all of the people claiming that using the description method is a really bad idea. In most cases, you'd be exactly right: the description property is generally only meant for debugging. But for the NSData class it is specifically documented as returning a hexadecimal representation of the receiver's contents, which is exactly what is needed here. Since Apple has put it in their documentation, I think you're pretty safe as far as them changing it.
This can be found in the NSData Class Reference here: https://developer.apple.com/library/ios/documentation/Cocoa/Reference/Foundation/Classes/NSData_Class/Reference/Reference.html