Swift 3 - How to interpret the memory of an Int32 as four characters - ios

I want to convert an Int32 to a string consisting of four C-style, 1-byte wide characters (probably closely related to this but in Swift 3).
The use for this is that many API functions of Core Audio return an OSStatus (really an Int32), which can often be interpreted as a string consisting of four C-style characters.
func interpretAsString(possibleMsg: Int32) -> String {
    // Blackbox
}

Actually a "four character code" is usually an unsigned 32-bit
value:
public typealias FourCharCode = UInt32
public typealias OSType = FourCharCode
The four bytes (from the MSB to the LSB) each define one character.
Here is a simple Swift 3 function to convert the integer to a string,
inspired by the various C/Objective-C/Swift 1+2 solutions in
iOS/C: Convert "integer" into four character string:
func fourCCToString(_ value: FourCharCode) -> String {
    let utf16 = [
        UInt16((value >> 24) & 0xFF),
        UInt16((value >> 16) & 0xFF),
        UInt16((value >> 8) & 0xFF),
        UInt16(value & 0xFF) ]
    return String(utf16CodeUnits: utf16, count: 4)
}
Example:
print(fourCCToString(0x48454C4F)) // HELO
I have chosen an array with the UTF-16 code points as intermediate storage because that can directly be used to create a string.
If you really need it for a signed 32-bit integer then you can
call
fourCCToString(FourCharCode(bitPattern: i32value))
or define a similar function taking an Int32 parameter.
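A minimal sketch of such an Int32 overload (my own addition, simply forwarding to the function above):
func fourCCToString(_ value: Int32) -> String {
    // Reinterpret the signed bit pattern as an unsigned FourCharCode
    return fourCCToString(FourCharCode(bitPattern: value))
}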
As Tim Vermeulen suggested below, the UTF-16 array can also be
created with map:
let utf16 = stride(from: 24, through: 0, by: -8).map {
    UInt16((value >> $0) & 0xFF)
}
or
let utf16 = [24, 16, 8, 0].map { UInt16((value >> $0) & 0xFF) }
Unless the function is performance critical for your application,
pick what you feel most familiar with (otherwise measure and compare).

I haven't tested this code, but try this:
func interpretAsString(possibleMsg: Int32) -> String {
    var result = String()
    // Mask each byte so the UInt32 conversion cannot trap on negative values
    result.append(Character(UnicodeScalar(UInt32((possibleMsg >> 24) & 0xFF))!))
    result.append(Character(UnicodeScalar(UInt32((possibleMsg >> 16) & 0xFF))!))
    result.append(Character(UnicodeScalar(UInt32((possibleMsg >> 8) & 0xFF))!))
    result.append(Character(UnicodeScalar(UInt32(possibleMsg & 0xFF))!))
    return result
}
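For example (my own quick check, reusing the HELO value from the answer above):
print(interpretAsString(possibleMsg: 0x48454C4F)) // HELO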

This may be an old question, but since it was asked in the context of Core Audio, I just wanted to share a variant I was playing with.
For Core Audio, where some (but not all?) OSStatus/Int32 values are defined using four characters, some code from Apple's old Core Audio Utility Classes can provide inspiration (very similar to the linked question).
From CAXException.h:
class CAX4CCStringNoQuote {
public:
    CAX4CCStringNoQuote(OSStatus error) {
        // see if it appears to be a 4-char-code
        UInt32 beErr = CFSwapInt32HostToBig(error);
        char *str = mStr;
        memcpy(str, &beErr, 4);
        if (isprint(str[0]) && isprint(str[1]) && isprint(str[2]) && isprint(str[3])) {
            str[4] = '\0';
        } else if (error > -200000 && error < 200000)
            // no, format it as an integer
            snprintf(str, sizeof(mStr), "%d", (int)error);
        else
            snprintf(str, sizeof(mStr), "0x%x", (int)error);
    }
    const char *get() const { return mStr; }
    operator const char *() const { return mStr; }
private:
    char mStr[16];
};
In Swift 5, one rough translation (without the hex representation for large values) might be:
private func osStatusToString(_ value: OSStatus) -> String {
    let data = withUnsafeBytes(of: value.bigEndian, { Data($0) })
    // If all bytes are printable characters, we treat it like characters of a string
    if data.allSatisfy({ 0x20 <= $0 && $0 <= 0x7e }) {
        return String(data: data, encoding: .ascii)!
    } else {
        return String(value)
    }
}
Note that the Data initializer makes a copy of the bytes, though it may be possible to avoid that if desired.
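For instance, a minimal sketch of a copy-free variant (my own illustration with a hypothetical name, not from the original answer) could do the printability check directly on the raw buffer:
private func osStatusToStringNoCopy(_ value: OSStatus) -> String {
    return withUnsafeBytes(of: value.bigEndian) { rawBuffer -> String in
        // UnsafeRawBufferPointer is a collection of UInt8, so we can check
        // printability and decode it without an intermediate Data value
        if rawBuffer.allSatisfy({ 0x20 <= $0 && $0 <= 0x7e }) {
            return String(decoding: rawBuffer, as: UTF8.self)
        } else {
            return String(value)
        }
    }
}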
Of course, with Core Audio we encounter four character codes with both Int32 and UInt32 types. I haven't done generics with Swift before, but one way to handle them in a single function could be:
private func stringifyErrorCode<T: FixedWidthInteger>(_ value: T) -> String {
    let data = withUnsafeBytes(of: value.bigEndian, { Data($0) })
    // If all bytes are printable characters, we treat it like characters of a string
    if data.allSatisfy({ 0x20 <= $0 && $0 <= 0x7e }) {
        return String(data: data, encoding: .ascii)!
    } else {
        return String(value, radix: 10)
    }
}
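For example (a hypothetical usage check of my own; 0x666D743F is the 'fmt?' code that Core Audio reports as kAudioFormatUnsupportedDataFormatError):
print(stringifyErrorCode(Int32(0x666D743F)))  // fmt?
print(stringifyErrorCode(UInt32(0x666D743F))) // fmt?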
This may not be suitable for general-purpose handling of four character codes (I've seen other answers that support characters in the Mac OS Roman encoding versus the ASCII range checked above; there's likely some history there I'm not aware of), but it may be reasonable for Core Audio status/selector codes.

Related

Cannot Initialise Variable for type UnsafePointer Swift 3.0 conversion

Hi, I am converting existing Swift 2.0 code to Swift 3.0 but came across an error during conversion:
Cannot invoke initializer for type 'UnsafePointer' with an argument list of type '(UnsafeRawPointer)'
Here is my code:
extension Data {
    var hexString : String {
        let buf = UnsafePointer<UInt8>(bytes) // here is the error
        let charA = UInt8(UnicodeScalar("a").value)
        let char0 = UInt8(UnicodeScalar("0").value)
        func itoh(_ i: UInt8) -> UInt8 {
            return (i > 9) ? (charA + i - 10) : (char0 + i)
        }
        let p = UnsafeMutablePointer<UInt8>.allocate(capacity: count * 2)
        for i in 0..<count {
            p[i*2] = itoh((buf[i] >> 4) & 0xF)
            p[i*2+1] = itoh(buf[i] & 0xF)
        }
        return NSString(bytesNoCopy: p, length: count*2, encoding: String.Encoding.utf8.rawValue, freeWhenDone: true)! as String
    }
}
In Swift 3 you have to use withUnsafeBytes() to access the raw bytes of a Data value. In your case:
withUnsafeBytes { (buf: UnsafePointer<UInt8>) in
    for i in 0..<count {
        p[i*2] = itoh((buf[i] >> 4) & 0xF)
        p[i*2+1] = itoh(buf[i] & 0xF)
    }
}
Alternatively, use the fact that Data is a collection of bytes:
for (i, byte) in self.enumerated() {
    p[i*2] = itoh((byte >> 4) & 0xF)
    p[i*2+1] = itoh(byte & 0xF)
}
Note that there is another problem in your code:
NSString(..., freeWhenDone: true)
uses free() to release the memory, which means that it must be
allocated with malloc().
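A minimal sketch of that fix (my own suggestion, not from the original answer) would replace the allocation line with:
// Allocate with malloc() so that freeWhenDone: true releases
// the memory with the matching free()
let p = malloc(count * 2)!.assumingMemoryBound(to: UInt8.self)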
Other (shorter, but potentially less efficient) methods to create
a hex representation of a Data value can be found at
How to convert Data to hex string in swift.

iOS Swift 3 BLE read characteristic returning "1 bytes"? [duplicate]

I want the hexadecimal representation of a Data value in Swift.
Eventually I'd want to use it like this:
let data = Data(base64Encoded: "aGVsbG8gd29ybGQ=")!
print(data.hexString)
A simple implementation (taken from How to hash NSString with SHA1 in Swift?, with an additional option for uppercase output) would be
extension Data {
    struct HexEncodingOptions: OptionSet {
        let rawValue: Int
        static let upperCase = HexEncodingOptions(rawValue: 1 << 0)
    }
    func hexEncodedString(options: HexEncodingOptions = []) -> String {
        let format = options.contains(.upperCase) ? "%02hhX" : "%02hhx"
        return self.map { String(format: format, $0) }.joined()
    }
}
I chose a hexEncodedString(options:) method in the style of the existing method base64EncodedString(options:).
Data conforms to the Collection protocol, therefore one can use
map() to map each byte to the corresponding hex string.
The %02x format prints the argument in base 16, filled up to two digits
with a leading zero if necessary. The hh modifier causes the argument
(which is passed as an integer on the stack) to be treated as a one byte
quantity. One could omit the modifier here because $0 is an unsigned
number (UInt8) and no sign-extension will occur, but it does no harm leaving
it in.
The result is then joined to a single string.
Example:
let data = Data([0, 1, 127, 128, 255])
// For Swift < 4.2 use:
// let data = Data(bytes: [0, 1, 127, 128, 255])
print(data.hexEncodedString()) // 00017f80ff
print(data.hexEncodedString(options: .upperCase)) // 00017F80FF
The following implementation is faster by a factor of about 50
(tested with 1000 random bytes). It is inspired by
RenniePet's solution
and Nick Moore's solution, but takes advantage of
String(unsafeUninitializedCapacity:initializingUTF8With:),
which was introduced with Swift 5.3/Xcode 12 and is available on macOS 11 and iOS 14 or newer.
This method makes it possible to create a Swift string from UTF-8 units efficiently, without unnecessary copying or reallocations.
An alternative implementation for older macOS/iOS versions is also provided.
extension Data {
    struct HexEncodingOptions: OptionSet {
        let rawValue: Int
        static let upperCase = HexEncodingOptions(rawValue: 1 << 0)
    }
    func hexEncodedString(options: HexEncodingOptions = []) -> String {
        let hexDigits = options.contains(.upperCase) ? "0123456789ABCDEF" : "0123456789abcdef"
        if #available(macOS 11.0, iOS 14.0, watchOS 7.0, tvOS 14.0, *) {
            let utf8Digits = Array(hexDigits.utf8)
            return String(unsafeUninitializedCapacity: 2 * self.count) { (ptr) -> Int in
                var p = ptr.baseAddress!
                for byte in self {
                    p[0] = utf8Digits[Int(byte / 16)]
                    p[1] = utf8Digits[Int(byte % 16)]
                    p += 2
                }
                return 2 * self.count
            }
        } else {
            let utf16Digits = Array(hexDigits.utf16)
            var chars: [unichar] = []
            chars.reserveCapacity(2 * self.count)
            for byte in self {
                chars.append(utf16Digits[Int(byte / 16)])
                chars.append(utf16Digits[Int(byte % 16)])
            }
            return String(utf16CodeUnits: chars, count: chars.count)
        }
    }
}
This code extends the Data type with a computed property. It iterates through the bytes of data and concatenates the byte's hex representation to the result:
extension Data {
    var hexDescription: String {
        return reduce("") { $0 + String(format: "%02x", $1) }
    }
}
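For example (my own quick check of this property):
Data([0xde, 0xad, 0xbe, 0xef]).hexDescription // "deadbeef"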
My version. It's about 10 times faster than the [original] accepted answer by Martin R.
public extension Data {
    private static let hexAlphabet = Array("0123456789abcdef".unicodeScalars)
    func hexStringEncoded() -> String {
        String(reduce(into: "".unicodeScalars) { result, value in
            result.append(Self.hexAlphabet[Int(value / 0x10)])
            result.append(Self.hexAlphabet[Int(value % 0x10)])
        })
    }
}
Swift 4 - From Data to Hex String
Based upon Martin R's solution but even a tiny bit faster.
extension Data {
    /// A hexadecimal string representation of the bytes.
    func hexEncodedString() -> String {
        let hexDigits = Array("0123456789abcdef".utf16)
        var hexChars = [UTF16.CodeUnit]()
        hexChars.reserveCapacity(count * 2)
        for byte in self {
            let (index1, index2) = Int(byte).quotientAndRemainder(dividingBy: 16)
            hexChars.append(hexDigits[index1])
            hexChars.append(hexDigits[index2])
        }
        return String(utf16CodeUnits: hexChars, count: hexChars.count)
    }
}
Swift 4 - From Hex String to Data
I've also added a fast solution for converting a hex String into Data (based on a C solution).
extension String {
    /// A data representation of the hexadecimal bytes in this string.
    func hexDecodedData() -> Data {
        // Get the UTF8 characters of this string
        let chars = Array(utf8)
        // Keep the bytes in an UInt8 array and later convert it to Data
        var bytes = [UInt8]()
        bytes.reserveCapacity(count / 2)
        // It is a lot faster to use a lookup map instead of strtoul
        let map: [UInt8] = [
            0x00, 0x01, 0x02, 0x03, 0x04, 0x05, 0x06, 0x07, // 01234567
            0x08, 0x09, 0x00, 0x00, 0x00, 0x00, 0x00, 0x00, // 89:;<=>?
            0x00, 0x0a, 0x0b, 0x0c, 0x0d, 0x0e, 0x0f, 0x00, // @ABCDEFG
            0x00, 0x00, 0x00, 0x00, 0x00, 0x00, 0x00, 0x00  // HIJKLMNO
        ]
        // Grab two characters at a time, map them and turn it into a byte
        for i in stride(from: 0, to: count, by: 2) {
            let index1 = Int(chars[i] & 0x1F ^ 0x10)
            let index2 = Int(chars[i + 1] & 0x1F ^ 0x10)
            bytes.append(map[index1] << 4 | map[index2])
        }
        return Data(bytes)
    }
}
Note: this function does not validate the input. Make sure that it is only used for hexadecimal strings with an even number of characters.
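If you want a safety net, a minimal pre-check could look like this (my own illustrative addition with a hypothetical name, not part of the original answer):
extension String {
    /// True when the string has an even length and contains only hex digits.
    var isLikelyHex: Bool {
        return count % 2 == 0 && allSatisfy({ $0.isHexDigit })
    }
}
Usage: guard hexString.isLikelyHex else { ... } before calling hexDecodedData().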
Backward compatible and fast solution:
extension Data {
    /// Fast convert to hex by reserving memory (instead of mapping and join).
    public func toHex(uppercase: Bool = false) -> String {
        // Constants (Hex has 2 characters for each Byte).
        let size = self.count * 2
        let digitToCharMap = Array((
            uppercase ? "0123456789ABCDEF" : "0123456789abcdef"
        ).utf16)
        // Reserve dynamic memory (plus one for null termination).
        let buffer = UnsafeMutablePointer<unichar>.allocate(capacity: size + 1)
        // Convert each byte.
        var index = 0
        for byte in self {
            buffer[index] = digitToCharMap[Int(byte / 16)]
            index += 1
            buffer[index] = digitToCharMap[Int(byte % 16)]
            index += 1
        }
        // Set null termination.
        buffer[index] = 0
        // Casts to string (without any copying).
        return String(utf16CodeUnitsNoCopy: buffer,
                      count: size, freeWhenDone: true)
    }
}
Note that the above passes ownership of buffer to the returned String object.
Also note that, because Swift's internal String storage is UTF-16 (though it can be UTF-8 since Swift 5), all solutions provided in the accepted answer do a full copy (and are slower), at least when NOT #available(macOS 11.0, iOS 14.0, watchOS 7.0, tvOS 14.0, *) ;-)
As mentioned on my profile, usage under the Apache 2.0 license is allowed too (without attribution).
This doesn't really answer the OP's question since it works on a Swift byte array, not a Data object. And it's much bigger than the other answers. But it should be more efficient since it avoids using String(format:).
Anyway, in the hopes someone finds this useful ...
public class StringMisc {
    // MARK: - Constants
    // This is used by the byteArrayToHexString() method
    private static let CHexLookup : [Character] =
        [ "0", "1", "2", "3", "4", "5", "6", "7", "8", "9", "A", "B", "C", "D", "E", "F" ]
    // MARK: - Public methods
    /// Method to convert a byte array into a string containing hex characters, without any
    /// additional formatting.
    public static func byteArrayToHexString(_ byteArray : [UInt8]) -> String {
        var stringToReturn = ""
        for oneByte in byteArray {
            let asInt = Int(oneByte)
            stringToReturn.append(StringMisc.CHexLookup[asInt >> 4])
            stringToReturn.append(StringMisc.CHexLookup[asInt & 0x0f])
        }
        return stringToReturn
    }
}
Test case:
// Test the byteArrayToHexString() method
let byteArray : [UInt8] = [ 0x25, 0x99, 0xf3 ]
assert(StringMisc.byteArrayToHexString(byteArray) == "2599F3")
A bit different from other answers here:
extension DataProtocol {
    func hexEncodedString(uppercase: Bool = false) -> String {
        return self.map {
            if $0 < 16 {
                return "0" + String($0, radix: 16, uppercase: uppercase)
            } else {
                return String($0, radix: 16, uppercase: uppercase)
            }
        }.joined()
    }
}
However, in my basic XCTest + measure setup this was the fastest of the four I tried.
Going through 1000 bytes of (the same) random data 100 times each:
Above: Time average: 0.028 seconds, relative standard deviation: 1.3%
MartinR: Time average: 0.037 seconds, relative standard deviation: 6.2%
Zyphrax: Time average: 0.032 seconds, relative standard deviation: 2.9%
NickMoore: Time average: 0.039 seconds, relative standard deviation: 2.0%
Repeating the test returned the same relative results. (Nick's and Martin's sometimes swapped places.)
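For reference, a sketch of the kind of measurement setup described above (names and details are illustrative, not the author's actual test):
import XCTest

final class HexBenchmarkTests: XCTestCase {
    func testHexEncodingPerformance() {
        // 1000 bytes of random data, encoded 100 times per measured run
        let data = Data((0..<1000).map { _ in UInt8.random(in: .min ... .max) })
        measure {
            for _ in 0..<100 {
                _ = data.hexEncodedString()
            }
        }
    }
}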
Edit:
Nowadays I use this (as a computed property in a Data extension):
extension Data {
    var hexEncodedString: String {
        return self.reduce(into: "") { result, byte in
            result.append(String(byte >> 4, radix: 16))
            result.append(String(byte & 0x0f, radix: 16))
        }
    }
}
Maybe not the fastest, but data.map({ String($0, radix: 16) }).joined() does the job. As mentioned in the comments, this solution is flawed: String($0, radix: 16) emits only one digit for bytes below 0x10, so the leading zeros are lost.

How do I convert an NSData object with hex data to ASCII in Swift?

I have an NSData object with hex data and I want to convert it to an ASCII string. I've seen several similar questions to mine but they are all either in Objective-C and/or they convert a string into hex data instead of the other way around.
I found this function but it doesn't work in Swift 2 and the Apple documentation doesn't explain the difference between the old stride and the new stride (it doesn't explain stride at all):
func hex2ascii (example: String) -> String
{
    var chars = [Character]()
    for c in example.characters
    {
        chars.append(c)
    }
    let numbers = stride(from: 0, through: chars.count, by: 2).map { // error: 'stride(from:through:by:)' is unavailable: call the 'stride(through:by:)' method instead.
        strtoul(String(chars[$0 ..< $0+2]), nil, 16)
    }
    var final = ""
    var i = 0
    while i < numbers.count {
        final.append(Character(UnicodeScalar(Int(numbers[i]))))
        i++
    }
    return final
}
I don't know what stride is and I don't know what it does.
How do you convert hex to ASCII in Swift 2? Maybe an NSData extension...
Thanks!
try:
let asciiString = String(data: data, encoding: NSASCIIStringEncoding)
print(asciiString)
Sorry for answering my own question, but I just (accidentally) found an amazing solution to my problem and hopefully this will help someone.
If you have an NSData object with a hex representation of an ASCII string, then all you have to do is write String(data: theNSDataObject, encoding: NSUTF8StringEncoding) and that is the ASCII string.
Hope this helps someone!
In Swift 2.0, stride became a method on Int rather than a standalone function, so now you write something like
0.stride(through: 10, by: 2)
So now the code you posted should be:
func hex2ascii (example: String) -> String {
var chars = [Character]()
for c in example.characters {
chars.append(c)
}
let numbers = 0.stride(through: chars.count, by: 2).map{
strtoul(String(chars[$0 ..< $0+2]), nil, 16)
}
var final = ""
var i = 0
while i < numbers.count {
final.append(Character(UnicodeScalar(Int(numbers[i]))))
i++
}
return final
}

strtoul() Function - Swift

I'm trying to create a Swift iOS program that converts a number into dec, bin, and hex numbers. I've come across the strtoul function, but I don't quite understand how to use it. Would someone be able to explain it? Thanks!
The function strtoul is pretty simple to use. You will also need String(_:radix:) to convert in the other direction. You can create extensions to convert from hexadecimal to decimal and from binary to decimal as follows:
Usage of String(_:radix:)
extension Int {
    var toBinary: String {
        return String(self, radix: 2)
    }
    var toHexa: String {
        return String(self, radix: 16)
    }
}
Usage of strtoul()
extension String {
    var hexaToDecimal: Int {
        return Int(strtoul(self, nil, 16))
    }
    var hexaToBinary: String {
        return hexaToDecimal.toBinary
    }
    var binaryToDecimal: Int {
        return Int(strtoul(self, nil, 2))
    }
    var binaryToHexa: String {
        return binaryToDecimal.toHexa
    }
}
Testing
let myBinFromInt = 255.toBinary // "11111111"
let myhexaFromInt = 255.toHexa // "ff"
let myIntFromHexa = "ff".hexaToDecimal // 255
let myBinFromHexa = "ff".hexaToBinary // "11111111"
let myIntFromBin = "11111111".binaryToDecimal // 255
let myHexaFromBin = "11111111".binaryToHexa // "ff"
The strtoul() function converts the string in str to an unsigned long
value. The conversion is done according to the given base, which must be between 2 and 36 inclusive, or be the special value 0.
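For example (my own illustration of the special base 0, which infers the base from the string's prefix):
strtoul("0xff", nil, 0)  // 255 (hexadecimal, from the 0x prefix)
strtoul("0755", nil, 0)  // 493 (octal, from the leading 0)
strtoul("42", nil, 0)    // 42 (decimal)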
Really it sounds like you want to use NSString.
From what it sounds like, you want to convert an unsigned integer to decimal, hex and binary.
For example, if you had an integer n:
var st = NSString(format:"%2X", n)
would convert the integer to hexadecimal and store it in the variable st.
//NSString(format:"%2X", 10) would give you 'A' as 10 is A in hex
//NSString(format:"%2X", 17) would give you 11 as 17 is 11 in hex
Unsigned decimal (note that format strings have no binary conversion; use String(n, radix: 2) for binary):
var st = NSString(format:"%u", n)
Fixed-point with 2 decimal places (for floating-point values):
var st = NSString(format:"%.02f", 3.14159)

In Swift, how to convert String to Int for base 2 though base 36 like Long.parseLong in Java?

How do I convert a String to a Long in Swift?
In Java I would do Long.parseLong("str", Character.MAX_RADIX).
We now have these conversion functions built into the Swift Standard Library:
Encode using base 2 through 36: https://developer.apple.com/documentation/swift/string/2997127-init
Decode using base 2 through 36: https://developer.apple.com/documentation/swift/int/2924481-init
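For example (my own quick illustration of those initializers):
String(255, radix: 16)    // "ff"
String(60623, radix: 36)  // "1arz"
Int("1ARZ", radix: 36)    // Optional(60623) (case-insensitive)
Int("zz", radix: 36)      // Optional(1295)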
As noted here, you can use the standard library function strtoul():
let base36 = "1ARZ"
let number = strtoul(base36, nil, 36)
println(number) // Output: 60623
The third parameter is the radix. See the man page for how the function handles whitespace and other details.
Here is parseLong() in Swift. Note that the function returns an Int? (optional Int) that must be unwrapped to be used.
// Function to convert a String to an Int?. It returns nil
// if the string contains characters that are not valid digits
// in the base or if the number is too big to fit in an Int.
func parseLong(string: String, base: Int) -> Int? {
    let digits = Array("0123456789ABCDEFGHIJKLMNOPQRSTUVWXYZ")
    var number = 0
    for char in string.uppercaseString {
        if let digit = find(digits, char) {
            if digit < base {
                // Compute the value of the number so far
                // allowing for overflow
                let newnumber = number &* base &+ digit
                // Check for overflow and return nil if
                // it did overflow
                if newnumber < number {
                    return nil
                }
                number = newnumber
            } else {
                // Invalid digit for the base
                return nil
            }
        } else {
            // Invalid character not in digits
            return nil
        }
    }
    return number
}

if let result = parseLong("1110", 2) {
    println("1110 in base 2 is \(result)") // "1110 in base 2 is 14"
}
if let result = parseLong("ZZ", 36) {
    println("ZZ in base 36 is \(result)") // "ZZ in base 36 is 1295"
}
Swift ways:
"123".toInt() // Return an optional
C way:
atol("1232")
Or use the NSString's integerValue method
("123" as NSString).integerValue
