iOS: Convert string to hexadecimal array - ios

I have a string representing data. I need to convert that data to a hex array so that, using the hex array, I can pass it to the CRC for writing to the peripheral.
My string data is like this:
NSString *stringsdata = @"helloworld1234567812345q";
I need to convert it to a hex-format array like
{0x0h, 0x0e, ..., 0x0q}.
so that, using this array, I can put the data in the CRC and write it to the peripheral as
Byte comm[24];
comm[0]=0x01;
comm[1]=0x30;
comm[2]=0x62;
comm[3]=0x00; // ...
I have tried many possible solutions but had no luck. Any help would be greatly appreciated.

A. The hexadecimal format is simply another representation of the same data.
B. You do not convert them into a hex array. Every character has a number. For example, in ASCII and UTF-8 the letter A has the number 65 (decimal representation). This is 0x41 in hexadecimal representation.
'A' (ASCII) == 65 == 0x41.
A hex number has the digits 0-9 and a-f, where a has the value 10, b the value 11, and so on. It is converted into decimal representation by multiplying the upper digit by 16 and adding the lower digit. (0x41: 4 x 16 + 1 = 65.)
Please read and understand this: http://en.wikipedia.org/wiki/Hexadecimal
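As a quick illustration, a minimal Objective-C one-liner that prints all three representations of the same character (just a sketch, the format specifiers are standard printf-style ones):
NSLog(@"%c == %d == 0x%X", 'A', 'A', 'A'); // prints: A == 65 == 0x41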
C. To convert a string into its numbers, you have to know which encoding you want to apply. Probably you want to use UTF-8.
NSString *text = @"helloworld123123989128739";
NSUInteger length = [text lengthOfBytesUsingEncoding:NSUTF8StringEncoding];
char data[length + 1]; // +1 for the terminating NUL
[text getCString:data maxLength:length + 1 encoding:NSUTF8StringEncoding];
// Here we go
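If you then want to inspect those bytes in hexadecimal notation, a minimal sketch reusing data and length from the snippet above might look like this:
for (NSUInteger i = 0; i < length; i++) {
    printf("0x%02X ", (unsigned char)data[i]); // e.g. 0x68 for 'h'
}
printf("\n");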

A Byte[] is an array of characters: you can only store one character (one byte) at each index.
If we have
Byte comm[24]; then comm[0]=0x01; may look confusing here, because it still only stores one byte; a character assignment would look like comm[0]='x';.
The code below creates a Byte[] from the given string.
NSString *stringsdata = @"helloworld1234567812345q";
CFStringRef cfString = (__bridge CFStringRef)stringsdata;
char *array = charArrayFromCFStringRef(cfString);
size_t length = array ? strlen(array) : 0;
Byte comm[24];
for (int i = 0; i < length && i < 24; i++) { // don't overflow comm
    comm[i] = array[i];
}
free(array); // charArrayFromCFStringRef returns a malloc'ed buffer
Conversion function:
char *charArrayFromCFStringRef(CFStringRef stringRef) {
    if (stringRef == NULL) {
        return NULL;
    }
    CFIndex length = CFStringGetLength(stringRef);
    CFIndex maxSize = CFStringGetMaximumSizeForEncoding(length, kCFStringEncodingUTF8) + 1; // +1 for the NUL terminator
    char *buffer = (char *)malloc(maxSize);
    if (CFStringGetCString(stringRef, buffer, maxSize, kCFStringEncodingUTF8)) {
        return buffer; // caller is responsible for calling free()
    }
    free(buffer); // don't leak the buffer on failure
    return NULL;
}
Output:
Printing description of comm:
(Byte [24]) comm = {
[0] = 'h'
[1] = 'e'
[2] = 'l'
[3] = 'l'
[4] = 'o'
[5] = 'w'
[6] = 'o'
[7] = 'r'
[8] = 'l'
[9] = 'd'
[10] = '1'
[11] = '2'
[12] = '3'
[13] = '4'
[14] = '5'
[15] = '6'
[16] = '7'
[17] = '8'
[18] = '1'
[19] = '2'
[20] = '3'
[21] = '4'
[22] = '5'
[23] = 'q'
}
The thing here is that even if you convert the characters, you can still only store one character at each index of the Byte[],
because each character's hex value is longer than one character, and a Byte[] element holds only a single byte.
I suggest using an NSArray to save each character's hex value as an NSString, as sketched below.
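A minimal sketch of that suggestion, reusing comm and length from the snippet above (the %02X format and the hexValues name are just one choice):
NSMutableArray<NSString *> *hexValues = [NSMutableArray arrayWithCapacity:length];
for (int i = 0; i < length && i < 24; i++) {
    // Each byte becomes a two-digit uppercase hex string, e.g. 'h' -> @"68".
    [hexValues addObject:[NSString stringWithFormat:@"%02X", comm[i]]];
}
NSLog(@"%@", hexValues); // prints something like ( "68", "65", "6C", ... )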

Related

Converting C unsigned char array with null-terminating char in middle to Objective-C NSString

In an iOS app, I'm using a 3rd party SDK written in C.
As a result of an SDK method call I receive an unsigned char array (see example below).
I need to convert this array to an Objective-C string (NSString) so I can save it and later convert it back to C to pass it as a parameter to another SDK method.
I've seen multiple ways to convert a C string to Objective-C.
NSString *example1 = [NSString stringWithFormat:@"%s", myArray]; // "8oFDO{c."rägÕªö"
NSString *example2 = [[NSString alloc] initWithBytes:myArray length:sizeof(myArray) encoding:NSASCIIStringEncoding]; // "8oFDO{c."rägÕªö"
...
But none of them seems to allow \0 (the null-terminating character) in the middle (see [29] in the example below).
Example unsigned char array response:
{
[0] = '8'
[1] = '\b'
[2] = '\x01'
[3] = 'o'
[4] = '\x01'
[5] = '\x03'
[6] = 'F'
[7] = 'D'
[8] = 'O'
[9] = '\x02'
[10] = '\x10'
[11] = '\x0e'
[12] = '{'
[13] = '\x8d'
[14] = 'c'
[15] = '.'
[16] = '\x19'
[17] = '"'
[18] = 'r'
[19] = '\xe4'
[20] = 'g'
[21] = '\x18'
[22] = '\xd5'
[23] = '\xaa'
[24] = '\xf6'
[25] = '\x95'
[26] = '\x18'
[27] = '\x03'
[28] = '\x01'
[29] = '\0'
[30] = '\x04'
[31] = '\x01'
[32] = '\x05'
[33] = '\x05'
[34] = '\x01'
[35] = '\x05'
[36] = '\x06'
[37] = '\x01'
[38] = '\x01'
...
}
How can I convert from C to Objective-C and then back from Objective-C to C?
This unsigned char array doesn't seem to be a string at all. You should handle it as if it is void * and use NSData to store such arguments.
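A minimal sketch of that NSData approach; myArrayLength is a hypothetical variable standing in for however many bytes the SDK actually returned (sizeof only works on a true array, not a pointer):
// Wrap the raw bytes, embedded '\0' included, in an NSData object for storage.
NSData *data = [NSData dataWithBytes:myArray length:myArrayLength];
// Later, hand the bytes back to the C SDK.
const unsigned char *cBytes = (const unsigned char *)data.bytes;
size_t cLength = (size_t)data.length;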

Randomisation Services iOS (Random bytes conversion to NSString)

I am trying to convert the random bytes received from SecRandomCopyBytes into a random string.
uint8_t resultBytes[10];
SecRandomCopyBytes(kSecRandomDefault, 10, resultBytes);
The resultBytes is shown below.
(uint8_t [10]) resultBytes = ([0] = '\xf5', [1] = '[', [2] =
'\x0e', [3] = '\xb0', [4] = '\xaf', [5] = '|',
[6] = '\x13', [7] = '\xfb', [8] = 'r', [9] = '\xb8')
If I convert to NSString with ASCII encoding it works, but it does not work with UTF-8 encoding.
NSLog(@"Random Number Results=<%@>", [[NSString alloc] initWithBytes:resultBytes length:10 encoding:NSASCIIStringEncoding]);
Number Results =<õ[°¯|ûr¸>
Is the output string correct, or am I doing something wrong?
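Arbitrary random bytes are generally not valid UTF-8, which is why the UTF-8 conversion can fail. If the goal is simply a printable random string, one common alternative is to hex-encode the bytes instead; a minimal sketch reusing resultBytes from above:
NSMutableString *hex = [NSMutableString stringWithCapacity:2 * 10];
for (int i = 0; i < 10; i++) {
    [hex appendFormat:@"%02x", resultBytes[i]]; // two lowercase hex digits per byte
}
NSLog(@"Random Number Results=<%@>", hex); // e.g. <f55b0eb0af7c13fb72b8>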

How to convert an Int to Hex String in Swift

In Obj-C I used to convert an unsigned integer n to a hex string with
NSString *st = [NSString stringWithFormat:@"%2X", n];
I tried for a long time to translate this into Swift language, but unsuccessfully.
You can now do:
let n = 14
var st = String(format:"%02X", n)
st += " is the hexadecimal representation of \(n)"
print(st)
0E is the hexadecimal representation of 14
Note: The 2 in this example is the field width and represents the minimum length desired. The 0 tells it to pad the result with leading 0's if necessary. (Without the 0, the result would be padded with leading spaces). Of course, if the result is larger than two characters, the field length will not be clipped to a width of 2; it will expand to whatever length is necessary to display the full result.
This only works if you have Foundation imported (this includes the import of Cocoa or UIKit). This isn't a problem if you're doing iOS or macOS programming.
Use uppercase X if you want A...F and lowercase x if you want a...f:
String(format: "%x %X", 64206, 64206) // "face FACE"
If you want to print integer values larger than UInt32.max, add ll (el-el, not eleven) to the format string:
let n = UInt64.max
print(String(format: "%llX is hexadecimal for \(n)", n))
FFFFFFFFFFFFFFFF is hexadecimal for 18446744073709551615
Original Answer
You can still use NSString to do this. The format is:
var st = NSString(format:"%2X", n)
This makes st an NSString, so then things like += do not work. If you want to be able to append to the string with += make st into a String like this:
var st = NSString(format:"%2X", n) as String
or
var st = String(NSString(format:"%2X", n))
or
var st: String = NSString(format:"%2X", n)
Then you can do:
let n = 123
var st = NSString(format:"%2X", n) as String
st += " is the hexadecimal representation of \(n)"
// "7B is the hexadecimal representation of 123"
In Swift there is a specific init method on String for exactly this:
let hex = String(0xF, radix: 16, uppercase: false)
print("hex=\(hex)") // Output: f
With Swift 5, according to your needs, you may choose one of the three following methods in order to solve your problem.
#1. Using String's init(_:radix:uppercase:) initializer
Swift String has a init(_:radix:uppercase:) initializer with the following declaration:
init<T>(_ value: T, radix: Int = 10, uppercase: Bool = false) where T : BinaryInteger
Creates a string representing the given value in base 10, or some other specified base.
The Playground code below shows how to create a String instance that represents an integer value in hexadecimal format by using init(_:radix:uppercase:) and without having to import Foundation:
let string1 = String(2, radix: 16)
print(string1) // prints: "2"
let string2 = String(211, radix: 16)
print(string2) // prints: "d3"
let string3 = String(211, radix: 16, uppercase: true)
print(string3) // prints: "D3"
#2. Using String's init(format:​_:​) initializer
Foundation provides String a init(format:​_:​) initializer. init(format:​_:​) has the following declaration:
init(format: String, _ arguments: CVarArg...)
Returns a String object initialized by using a given format string as a template into which the remaining argument values are substituted.
Apple's String Programming Guide gives a list of the format specifiers that are supported by String and NSString. Among those format specifiers, %X has the following description:
Unsigned 32-bit integer (unsigned int), printed in hexadecimal using the digits 0–9 and uppercase A–F.
The Playground code below shows how to create a String instance that represents an integer value in hexadecimal format with init(format:​_:​):
import Foundation
let string1 = String(format:"%X", 2)
print(string1) // prints: "2"
let string2 = String(format:"%02X", 1)
print(string2) // prints: "01"
let string3 = String(format:"%02X", 211)
print(string3) // prints: "D3"
let string4 = String(format: "%02X, %02X, %02X", 12, 121, 255)
print(string4) // prints: "0C, 79, FF"
#3. Using String's init(format:​arguments:​) initializer
Foundation provides String a init(format:​arguments:​) initializer. init(format:​arguments:​) has the following declaration:
init(format: String, arguments: [CVarArg])
Returns a String object initialized by using a given format string as a template into which the remaining argument values are substituted according to the user’s default locale.
The Playground code below shows how to create a String instance that represents an integer value in hexadecimal format with init(format:​arguments:​):
import Foundation
let string1 = String(format:"%X", arguments: [2])
print(string1) // prints: "2"
let string2 = String(format:"%02X", arguments: [1])
print(string2) // prints: "01"
let string3 = String(format:"%02X", arguments: [211])
print(string3) // prints: "D3"
let string4 = String(format: "%02X, %02X, %02X", arguments: [12, 121, 255])
print(string4) // prints: "0C, 79, FF"
Swift 5.2.4
let value = 200
let hexString = String(format: "%02X", value)
The answers above work fine for values in the range of a 32-bit Int, but anything larger won't work, as the value will roll over.
You need to use the length modifier for values greater than a 32-bit Int:
%x = Unsigned 32-bit integer (unsigned int)
ll = Length modifiers specifying that a following d, o, u, x, or X conversion specifier applies to a long long or unsigned long long argument.
let hexString = String(format:"%llX", decimalValue)
To use
let string2 = String(format:"%02X", 1)
print(string2) // prints: "01"
in Swift 3, import Foundation is not required, at least not in a project.
String should have all the same functionality as NSString.

Hashing the device id into a 64 bit (or greater) then convert that into base-31 [iOS]

I am having trouble hashing my device's ID into a 64-bit (or greater) representation and then converting that into a base-31 representation. Anyone have any tips or guidance? I have been looking online and can't seem to find much.
Each base-31 digit should be represented by this list: 2 3 4 5 6 7 8 9 A B C D E F G H J K M N P Q R S T U V W X Y Z
What I've tried:
NSString *myID = [[[UIDevice currentDevice] identifierForVendor] UUIDString];
NSLog(@"Non Hash: %@", myID); // Logs the 36 character string
myID = [NSString stringWithFormat:@"%lu", (unsigned long)[myID hash]]; // Changed thanks to rokjarc
NSLog(@"Hash: %@", myID); // Logs same 36 character string
// Logs 36 character string
NSLog(@"UUIDString: %@", [[[UIDevice currentDevice] identifierForVendor] UUIDString]);
// Logs out a 10 character numeric value
NSLog(@"Hash: %lu", (unsigned long)[[[[UIDevice currentDevice] identifierForVendor] UUIDString] hash]);
// Logs out a 2 character numeric value
NSLog(@"LongLong: %lld", [[[[UIDevice currentDevice] identifierForVendor] UUIDString] longLongValue]);
[[[UIDevice currentDevice] identifierForVendor] UUIDString] returns a UUID, which is comprised of 32 hex characters, i.e. 128 bits. Twelve base-31 characters can only represent about 59 bits, so the entire UUID cannot be represented.
The best bet is to run the UUID through SHA (which seems to be what [myID hash] does) and convert 59 of the bits of that into 12 base-31 characters.
The reason for the hash function (SHA) is to remove any pattern in the UUID; each bit in the result of SHA is equally likely to be a 1 or a 0.
Notes:
31^12 ≈ 7.87E17 and 2^64 ≈ 1.84E19,
thus a 64-bit number cannot be represented in 12 base-31 characters; 59 bits can, however, since 2^59 ≈ 5.76E17 is less than 31^12.
Base32 is a lot simpler than base-31 for values larger than 64 bits.
Here is a code sample that creates a string of base31 characters from a 64-bit integer:
uint64_t uid = 14467240737094581;
NSString *baseCharacters = @"23456789ABCDEFGHJKMNPQRSTUVWXYZ";
NSUInteger base = baseCharacters.length;
NSMutableString *baseString = [NSMutableString new];
while (baseString.length < 12) {
    uint64_t remainder = uid % base;
    uid /= base;
    NSString *baseCharacter = [baseCharacters substringWithRange:NSMakeRange(remainder, 1)];
    [baseString insertString:baseCharacter atIndex:0];
}
NSLog(@"baseString: %@", baseString);
NSLog output:
baseString: 2KP7MAR5CX86
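To obtain the 64-bit input in the first place, the suggestion above is to hash the UUID. A minimal sketch using CommonCrypto; the choice of SHA-256 and of taking the first 8 digest bytes are assumptions for illustration, not part of the original answer:
#import <UIKit/UIKit.h>
#import <CommonCrypto/CommonDigest.h>

NSString *uuidString = [[[UIDevice currentDevice] identifierForVendor] UUIDString];
NSData *uuidData = [uuidString dataUsingEncoding:NSUTF8StringEncoding];
unsigned char digest[CC_SHA256_DIGEST_LENGTH];
CC_SHA256(uuidData.bytes, (CC_LONG)uuidData.length, digest);
// Pack the first 8 digest bytes into a 64-bit integer to feed the base-31 loop above.
uint64_t uid = 0;
for (int i = 0; i < 8; i++) {
    uid = (uid << 8) | digest[i];
}
// Note: the 12-character base-31 loop above keeps only the low ~59 bits of uid.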

How do you convert 8-bit bytes to 6-bit characters?

I have a specific requirement to convert a stream of bytes into a character encoding that happens to be 6-bits per character.
Here's an example:
Input: 0x50 0x11 0xa0
Character Table:
010100 T
000001 A
000110 F
100000 SPACE
Output: "TAF "
Logically I can understand how this works:
Taking 0x50 0x11 0xa0 and showing it as binary:
01010000 00010001 10100000
Regrouped into 6-bit chunks this is 010100 000001 000110 100000, which is "TAF ".
What's the best way to do this programmatically (pseudocode or C++)? Thank you!
Well, every 3 bytes, you end up with four characters. So for one thing, you need to work out what to do if the input isn't a multiple of three bytes. (Does it have padding of some kind, like base64?)
Then I'd probably take each 3 bytes in turn. In C#, which is close enough to pseudo-code for C :)
for (int i = 0; i < array.Length; i += 3)
{
    // Top 6 bits of byte i
    int value1 = array[i] >> 2;
    // Bottom 2 bits of byte i, top 4 bits of byte i+1
    int value2 = ((array[i] & 0x3) << 4) | (array[i + 1] >> 4);
    // Bottom 4 bits of byte i+1, top 2 bits of byte i+2
    int value3 = ((array[i + 1] & 0xf) << 2) | (array[i + 2] >> 6);
    // Bottom 6 bits of byte i+2
    int value4 = array[i + 2] & 0x3f;
    // Now use value1...value4, e.g. putting them into a char array.
    // You'll need to decode from the 6-bit number (0-63) to the character.
}
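As a quick check of the bit arithmetic, here is a small self-contained C sketch that decodes the question's example input. The lookup table is hypothetical and only filled with the four entries the question lists; every other 6-bit value prints as '?':
#include <stdio.h>
#include <string.h>

int main(void) {
    unsigned char input[] = { 0x50, 0x11, 0xA0 };
    char table[64];
    memset(table, '?', sizeof table); // unknown 6-bit values
    table[0x14] = 'T';                // 010100
    table[0x01] = 'A';                // 000001
    table[0x06] = 'F';                // 000110
    table[0x20] = ' ';                // 100000
    for (size_t i = 0; i + 2 < sizeof input; i += 3) {
        int v1 = input[i] >> 2;
        int v2 = ((input[i] & 0x3) << 4) | (input[i + 1] >> 4);
        int v3 = ((input[i + 1] & 0xF) << 2) | (input[i + 2] >> 6);
        int v4 = input[i + 2] & 0x3F;
        printf("%c%c%c%c", table[v1], table[v2], table[v3], table[v4]);
    }
    printf("\n"); // prints "TAF "
    return 0;
}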
Just in case someone is interested, here is another variant that extracts 6-bit numbers from the stream as soon as they appear there. That is, results can be obtained even if fewer than 3 bytes have been read so far, which is useful for unpadded streams.
The code carries the state of the accumulator a across iterations; the variable n stores the number of bits left over in the accumulator from the previous read.
int n = 0;
unsigned char a = 0;
unsigned char b = 0;
while (read_byte(&b)) {
    // save the (6 - n) most significant bits of the input byte to the proper
    // position in the accumulator
    a |= (b >> (n + 2)) & (077 >> n);
    store_6bit(a);
    a = 0;
    // save the remaining least significant bits of the input byte to the proper
    // position in the accumulator
    a |= (b << (4 - n)) & ((077 << (4 - n)) & 077);
    if (n == 4) {
        store_6bit(a);
        a = 0;
    }
    n = (n + 2) % 6;
}
