Convert Hexadecimal String to Base64 in Swift

Is there a way to convert a hexadecimal string to base64 in Swift? For example, I would like to convert:
BA5E64C0DE
to:
ul5kwN4=
It's possible to convert a normal string to base64 by using:
let hex: String = "BA5E64C0DE"
let utf8str: NSData = hex.dataUsingEncoding(NSUTF8StringEncoding)!
let base64Encoded: NSString = utf8str.base64EncodedStringWithOptions(NSDataBase64EncodingOptions(rawValue: 0))
let base64: String = (base64Encoded as String)
But this would give a result of:
QkE1RTY0QzBERQ==
Because it's just treating the hex as a normal UTF-8 String, and not hexadecimal.
It's possible to convert it to base64 correctly by looping through every six hex characters and converting them to their four respective base64 characters, but this would be highly inefficient, and is just plain stupid (there would need to be 16,777,216 if statements):
if(hex == "000000"){base64+="AAAA"}
else if(hex == "000001"){base64+="AAAB"}
else if(hex == "000002"){base64+="AAAC"}
//...
else if(hex == "BA5E64"){base64+="ul5k"}
//...
It would be nice if there was something like this:
let hex: String = "BA5E64C0DE"
let data: NSData = hex.dataUsingEncoding(NSHexadecimalEncoding)!
let base64Encoded: NSString = data.base64EncodedStringWithOptions(NSDataBase64EncodingOptions(rawValue: 0))
let base64: String = (base64Encoded as String)
But sadly, there's no NSHexadecimalEncoding. Is there any efficient way to convert a hexadecimal string to its base64 representation in Swift?

The base-64 string, "ul5kwN4=", translates to a binary NSData consisting of the five bytes BA, 5E, 64, C0, and DE.
Now, if you really had a string with the hexadecimal representation, you could convert it to a binary NSData using a routine like the one here: https://stackoverflow.com/a/26502285/1271826
Once you have a NSData, you could build your base 64 string:
let hexString = "BA5E64C0DE"
let binaryData = hexString.dataFromHexadecimalString()
let base64String = binaryData?.base64EncodedStringWithOptions([])
That generates the desired output, ul5kwN4=.
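For reference, here is a self-contained sketch of the same pipeline in modern Swift. The hex-parsing loop is my own, not the dataFromHexadecimalString routine from the linked answer:

```swift
import Foundation

// Sketch: hex string -> Data -> base64.
// Returns nil for odd-length or non-hex input.
func hexToBase64(_ hex: String) -> String? {
    guard hex.count % 2 == 0 else { return nil }
    var bytes = [UInt8]()
    var index = hex.startIndex
    while index < hex.endIndex {
        let next = hex.index(index, offsetBy: 2)
        guard let byte = UInt8(hex[index..<next], radix: 16) else { return nil }
        bytes.append(byte)
        index = next
    }
    return Data(bytes).base64EncodedString()
}

// hexToBase64("BA5E64C0DE") == "ul5kwN4="
```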

First, convert the hex string to Data using the following routine (works in Swift 3.0.2).
extension String {

    /// Expanded encoding
    ///
    /// - bytesHexLiteral: Hex string of bytes
    /// - base64: Base64 string
    enum ExpandedEncoding {
        /// Hex string of bytes
        case bytesHexLiteral
        /// Base64 string
        case base64
    }

    /// Convert to `Data` with expanded encoding
    ///
    /// - Parameter encoding: Expanded encoding
    /// - Returns: data
    func data(using encoding: ExpandedEncoding) -> Data? {
        switch encoding {
        case .bytesHexLiteral:
            guard self.characters.count % 2 == 0 else { return nil }
            var data = Data()
            var byteLiteral = ""
            for (index, character) in self.characters.enumerated() {
                if index % 2 == 0 {
                    byteLiteral = String(character)
                } else {
                    byteLiteral.append(character)
                    guard let byte = UInt8(byteLiteral, radix: 16) else { return nil }
                    data.append(byte)
                }
            }
            return data
        case .base64:
            return Data(base64Encoded: self)
        }
    }
}
Then, convert Data to Base64 String using Data.base64EncodedString(options:).
Usage
let base64 = "BA5E64C0DE".data(using: .bytesHexLiteral)?.base64EncodedString()
if let base64 = base64 {
    print(base64)
    // Prints "ul5kwN4="
}

Related

Convert NSDecimalNumber to hex String

I managed to convert an NSDecimalNumber to a hex String using the BInt library. Here is my code:
public func toDecimalHex(value: NSDecimalNumber) -> String {
    let bint = BInt(value.stringValue)
    return (bint?.asString(radix: 16))!
}
Example:
let number = NSDecimalNumber(string: "1000000000000000000000000")
let hex = toDecimalHex(value: number)
// result: d3c21bcecceda1000000
Basically, how can I convert without using any library like BInt? This brings too much overhead. I just want to get rid of it.
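One possible approach without BInt is to perform the radix conversion directly on NSDecimalNumber by repeated division. This is only a sketch, assuming the value is a non-negative whole number; the helper name hexString(from:) is mine:

```swift
import Foundation

// Sketch: convert a non-negative whole NSDecimalNumber to hex by
// repeatedly dividing by 16, rounding each quotient down to an integer.
func hexString(from value: NSDecimalNumber) -> String {
    let sixteen = NSDecimalNumber(value: 16)
    let roundDown = NSDecimalNumberHandler(roundingMode: .down, scale: 0,
                                           raiseOnExactness: false, raiseOnOverflow: false,
                                           raiseOnUnderflow: false, raiseOnDivideByZero: false)
    let digits = Array("0123456789abcdef")
    var quotient = value
    var hex = ""
    repeat {
        let next = quotient.dividing(by: sixteen, withBehavior: roundDown)
        // remainder = quotient - next * 16, always in 0...15
        let remainder = quotient.subtracting(next.multiplying(by: sixteen))
        hex = String(digits[remainder.intValue]) + hex
        quotient = next
    } while quotient.compare(NSDecimalNumber.zero) == .orderedDescending
    return hex
}

// hexString(from: NSDecimalNumber(string: "1000000000000000000000000"))
// == "d3c21bcecceda1000000"
```

NSDecimalNumber carries up to 38 decimal digits of mantissa, so this stays exact for values in that range; beyond it, a big-integer library is still needed.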

How to convert from unichar to character

Is there a better way to convert from unichar to Character?
tried :
var unichar = ...
let str = NSString(characters: &unichar, length: 1) as String
let character = Array(str.characters)[0]
You can convert unichar -> UnicodeScalar -> Character:
let c = unichar(8364)
if let uc = UnicodeScalar(c) {
    let char = Character(uc)
    print(char) // €
} else {
    print("illegal input")
}
Input values in the range 0xD800...0xDFFF
(high and low surrogates) are not allowed because they do not
correspond to valid Unicode scalar values.
If it is guaranteed that those input values do not occur then you
can simplify the conversion to
let char = Character(UnicodeScalar(c)!)
To replace a possible invalid input value by a default character
(e.g. a question mark), use
let char = Character(UnicodeScalar(c) ?? "?")
public class func stringForIcon(_ icon: NSInteger) -> Character {
    return Character(UnicodeScalar(icon)!)
}
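As noted above, surrogate code units cannot be turned into a single UnicodeScalar. If the input may be a surrogate pair (two unichar values), one possible workaround, sketched here rather than taken from the answer above, is to run the pair through the UTF-16 decoder:

```swift
import Foundation

// Sketch: decode UTF-16 code units (including a surrogate pair)
// through String, then take the first Character.
let units: [unichar] = [0xD83D, 0xDC4D] // surrogate pair encoding U+1F44D
let str = String(utf16CodeUnits: units, count: units.count)
let character = str.first!
print(character) // 👍
```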

Decode base64URL to base64 -- Swift

I haven't found a proper way to decode base64url to base64 format in Swift.
According to base64url to base64, hJQWHABDBjoPHorYF5xghQ (base64url) should become hJQWHABDBjoPHorYF5xghQ== (base64). There could be more differences.
There are no solutions on Stack Overflow.
"base64url" differs from the standard Base64 encoding in two aspects:
- different characters are used for index 62 and 63 (- and _ instead of + and /)
- no mandatory padding with = characters to make the string length a multiple of four
(compare https://en.wikipedia.org/wiki/Base64#Variants_summary_table).
Here is a possible conversion function:
func base64urlToBase64(base64url: String) -> String {
    var base64 = base64url
        .replacingOccurrences(of: "-", with: "+")
        .replacingOccurrences(of: "_", with: "/")
    if base64.characters.count % 4 != 0 {
        base64.append(String(repeating: "=", count: 4 - base64.characters.count % 4))
    }
    return base64
}
Example:
let base64url = "hJQWHABDBjoPHorYF5xghQ"
let base64 = base64urlToBase64(base64url: base64url)
print(base64) // hJQWHABDBjoPHorYF5xghQ==

if let data = Data(base64Encoded: base64) {
    print(data as NSData) // <8494161c 0043063a 0f1e8ad8 179c6085>
}
For the sake of completeness, this would be the opposite conversion:
func base64ToBase64url(base64: String) -> String {
    let base64url = base64
        .replacingOccurrences(of: "+", with: "-")
        .replacingOccurrences(of: "/", with: "_")
        .replacingOccurrences(of: "=", with: "")
    return base64url
}
Update for Swift 4:
func base64urlToBase64(base64url: String) -> String {
    var base64 = base64url
        .replacingOccurrences(of: "-", with: "+")
        .replacingOccurrences(of: "_", with: "/")
    if base64.count % 4 != 0 {
        base64.append(String(repeating: "=", count: 4 - base64.count % 4))
    }
    return base64
}
Here is a cleaned up version of what Martin posted within a Swift 4 extension:
import Foundation

/// Extension for making base64 representations of `Data` safe for
/// transmitting via URL query parameters
extension Data {

    /// Instantiates data by decoding a base64url string into base64
    ///
    /// - Parameter string: A base64url encoded string
    init?(base64URLEncoded string: String) {
        self.init(base64Encoded: string.toggleBase64URLSafe(on: false))
    }

    /// Encodes the string into a base64url safe representation
    ///
    /// - Returns: A string that is base64 encoded but made safe for passing
    ///   in as a query parameter into a URL string
    func base64URLEncodedString() -> String {
        return self.base64EncodedString().toggleBase64URLSafe(on: true)
    }
}

extension String {

    /// Encodes or decodes into a base64url safe representation
    ///
    /// - Parameter on: Whether or not the string should be made safe for URL strings
    /// - Returns: if `on`, then a base64url string; if `off` then a base64 string
    func toggleBase64URLSafe(on: Bool) -> String {
        if on {
            // Make base64 string safe for passing into URL query params
            let base64url = self.replacingOccurrences(of: "/", with: "_")
                .replacingOccurrences(of: "+", with: "-")
                .replacingOccurrences(of: "=", with: "")
            return base64url
        } else {
            // Return to base64 encoding
            var base64 = self.replacingOccurrences(of: "_", with: "/")
                .replacingOccurrences(of: "-", with: "+")
            // Add any necessary padding with `=`
            if base64.count % 4 != 0 {
                base64.append(String(repeating: "=", count: 4 - base64.count % 4))
            }
            return base64
        }
    }
}
This is the Objective-C version of the base64url to Base64 conversion:
- (NSString *)base64urlToBase64:(NSString *)base64url
{
    NSString *base64 = [base64url stringByReplacingOccurrencesOfString:@"-" withString:@"+"];
    base64 = [base64 stringByReplacingOccurrencesOfString:@"_" withString:@"/"];
    if (base64.length % 4 != 0) {
        base64 = [base64 stringByAppendingString:[self stringWithRepeatString:"=" times:(4 - base64.length % 4)]];
    }
    return base64;
}

- (NSString *)stringWithRepeatString:(char *)characters times:(unsigned int)repetitions
{
    unsigned long stringLength = strlen(characters);
    unsigned long repeatStringLength = stringLength * repetitions + 1;
    char repeatString[repeatStringLength];
    for (unsigned int i = 0; i < repetitions; i++) {
        unsigned long pointerPosition = i * stringLength;
        memcpy(repeatString + pointerPosition, characters, stringLength);
    }
    repeatString[repeatStringLength - 1] = 0;
    return [NSString stringWithCString:repeatString encoding:NSASCIIStringEncoding];
}

String with Unicode (variable) [duplicate]

I have a problem I couldn't find a solution to.
I have a string variable holding the Unicode code "1f44d" and I want to convert it to the Unicode character 👍.
Usually one would do something like this:
println("\u{1f44d}") // 👍
Here is what I mean:
let charAsString = "1f44d" // code in variable
println("\u{\(charAsString)}") // not working
I have tried several other ways but somehow the workings behind this magic stay hidden for me.
One should imagine the value of charAsString coming from an API call or from another object.
One possible solution (explanations "inline"):
let charAsString = "1f44d"
// Convert hex string to numeric value first:
var charCode: UInt32 = 0
let scanner = NSScanner(string: charAsString)
if scanner.scanHexInt(&charCode) {
    // Create string from Unicode code point:
    let str = String(UnicodeScalar(charCode))
    println(str) // 👍
} else {
    println("invalid input")
}
Slightly simpler with Swift 2:
let charAsString = "1f44d"
// Convert hex string to numeric value first:
if let charCode = UInt32(charAsString, radix: 16) {
    // Create string from Unicode code point:
    let str = String(UnicodeScalar(charCode))
    print(str) // 👍
} else {
    print("invalid input")
}
Note also that not all code points are valid Unicode scalars,
compare Validate Unicode code point in Swift.
Update for Swift 3:
public init?(_ v: UInt32)
is now a failable initializer of UnicodeScalar and checks if the
given numeric input is a valid Unicode scalar value:
let charAsString = "1f44d"
// Convert hex string to numeric value first:
if let charCode = UInt32(charAsString, radix: 16),
   let unicode = UnicodeScalar(charCode) {
    // Create string from Unicode code point:
    let str = String(unicode)
    print(str) // 👍
} else {
    print("invalid input")
}
This can be done in two steps:
1. convert charAsString to an Int code
2. convert code to a Unicode character
The second step can be done e.g. like this:
var code = 0x1f44d
var scalar = UnicodeScalar(code)
var string = "\(scalar)"
As for the first step, see here how to convert a String in hex representation to Int.
As of Swift 2.0, every Int type has an initializer able to take a String as input. You can then easily generate a corresponding UnicodeScalar and print it afterwards, without having to change your representation of chars as string ;).
UPDATED: Swift 3.0 changed UnicodeScalar initializer
print("\u{1f44d}") // 👍
let charAsString = "1f44d" // code in variable
let charAsInt = Int(charAsString, radix: 16)! // As indicated by @MartinR, radix is required; the default won't do it
let uScalar = UnicodeScalar(charAsInt)! // In Swift 3.0 this initializer is failable, so you'll need either force unwrapping or optional binding
print("\(uScalar)")
You can use
let char = "-12"
print(char.unicodeScalars.map { $0.value })
You'll get the values as:
[45, 49, 50]
Here are a couple ways to do it:
let string = "1f44d"
Solution 1:
"&#x\(string);".applyingTransform(.toXMLHex, reverse: true)
Solution 2:
"U+\(string)".applyingTransform(StringTransform("Hex/Unicode"), reverse: true)
I made this extension that works pretty well:
extension String {
    var unicode: String? {
        if let charCode = UInt32(self, radix: 16),
           let unicode = UnicodeScalar(charCode) {
            let str = String(unicode)
            return str
        }
        return nil
    }
}
How to test it:
if let test = "e9c8".unicode {
    print(test)
}
// prints the decoded character
You cannot use string interpolation in Swift as you try to use it. Therefore, the following code won't compile:
let charAsString = "1f44d"
print("\u{\(charAsString)}")
You will have to convert your string variable into an integer (using init(_:radix:) initializer) then create a Unicode scalar from this integer. The Swift 5 Playground sample code below shows how to proceed:
let validCodeString = "1f44d"
let validUnicodeScalarValue = Int(validCodeString, radix: 16)!
let validUnicodeScalar = Unicode.Scalar(validUnicodeScalarValue)!
print(validUnicodeScalar) // 👍

How do I convert an NSData object with hex data to ASCII in Swift?

I have an NSData object with hex data and I want to convert it to an ASCII string. I've seen several similar questions to mine but they are all either in Objective-C and/or they convert a string into hex data instead of the other way around.
I found this function but it doesn't work in Swift 2 and the Apple documentation doesn't explain the difference between the old stride and the new stride (it doesn't explain stride at all):
func hex2ascii (example: String) -> String
{
    var chars = [Character]()
    for c in example.characters
    {
        chars.append(c)
    }
    let numbers = stride(from: 0, through: chars.count, by: 2).map { // error: 'stride(from:through:by:)' is unavailable: call the 'stride(through:by:)' method instead.
        strtoul(String(chars[$0 ..< $0+2]), nil, 16)
    }
    var final = ""
    var i = 0
    while i < numbers.count {
        final.append(Character(UnicodeScalar(Int(numbers[i]))))
        i++
    }
    return final
}
I don't know what stride is and I don't know what it does.
How do you convert hex to ASCII in Swift 2? Maybe an NSData extension...
Thanks!
try:
let asciiString = String(data: data, encoding: NSASCIIStringEncoding)
print(asciiString)
Sorry for answering my own question, but I just (accidentally) found an amazing solution to my problem and hopefully this will help someone.
If you have an NSData object whose bytes represent an ASCII string, then all you have to do is write String(data: theNSDataObject, encoding: NSUTF8StringEncoding) and that is the ASCII string.
Hope this helps someone!
In Swift 2.0, stride became a method on Int rather than a standalone function, so now you do something like:
0.stride(through: 10, by: 2)
So now the code you posted should be (note that it needs stride(to:by:) rather than through:, otherwise chars[$0 ..< $0+2] runs past the end of the array on the final step):
func hex2ascii (example: String) -> String {
    var chars = [Character]()
    for c in example.characters {
        chars.append(c)
    }
    let numbers = 0.stride(to: chars.count, by: 2).map {
        strtoul(String(chars[$0 ..< $0+2]), nil, 16)
    }
    var final = ""
    var i = 0
    while i < numbers.count {
        final.append(Character(UnicodeScalar(Int(numbers[i]))))
        i++
    }
    return final
}
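For completeness, here is a tidier sketch in current Swift syntax; the function name and the nil-on-invalid-input behavior are my own choices, not from the answers above:

```swift
import Foundation

// Sketch: parse the hex string two characters at a time into bytes,
// then decode the bytes as ASCII. Returns nil for invalid input.
func hex2ascii(_ hex: String) -> String? {
    guard hex.count % 2 == 0 else { return nil }
    var bytes = [UInt8]()
    var index = hex.startIndex
    while index < hex.endIndex {
        let next = hex.index(index, offsetBy: 2)
        guard let byte = UInt8(hex[index..<next], radix: 16) else { return nil }
        bytes.append(byte)
        index = next
    }
    return String(bytes: bytes, encoding: .ascii)
}

// hex2ascii("48656C6C6F") == "Hello"
```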
