Properly use _NSGetExecutablePath - iOS

I am trying to get the path to my application at runtime. I found some old C sources and converted them according to the function's parameter type definition:
var path = [Int8](count: 1024, repeatedValue: 0)
var bufsize: UInt32 = 1024
if _NSGetExecutablePath(&path, &bufsize) == 0 {
    println("executable path is \(path)")
}
It runs, but the result is an Int8 array, not a string. So I would have to search for the end of the character chain and convert it back to a string. What is the correct way to use this function in Swift?

You need to create a Swift String from a C string:
let executablePath = String(CString: path, encoding: NSASCIIStringEncoding)!
println("executable path is \(executablePath)")
But there is an easier way to get the path to the executable:
let executablePath = NSBundle.mainBundle().executablePath!

In Swift 4
let executablePath = Bundle.main.executablePath!
print(executablePath)
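If you do need the raw call itself, here is a minimal sketch in modern Swift. The retry logic is an assumption based on the man page, which says the function returns -1 and writes the required size into bufsize when the buffer is too small:
import Darwin

var bufsize: UInt32 = 1024
var buffer = [CChar](repeating: 0, count: Int(bufsize))
if _NSGetExecutablePath(&buffer, &bufsize) != 0 {
    // Buffer was too small; bufsize now holds the required length, so grow and retry.
    buffer = [CChar](repeating: 0, count: Int(bufsize))
    _NSGetExecutablePath(&buffer, &bufsize)
}
let path = String(cString: buffer)
print("executable path is \(path)")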

NSNetService dictionaryFromTXTRecord fails an assertion on invalid input

The input to dictionary(fromTXTRecord:) comes from the network, potentially from outside the app or even from outside the device. However, Apple's docs say:
... Fails an assertion if txtData cannot be represented as an NSDictionary object.
Failing an assertion leaves the programmer (me) with no way of handling the error, which seems illogical for a method that processes external data.
If I run this in Terminal on a Mac:
dns-sd -R 'My Service Name' _myservice._tcp local 4567 asdf asdf
my app, running on an iPhone, crashes.
dictionary(fromTXTRecord:) expects the TXT record data (asdf asdf) to be in key=val form. If, as above, a word doesn't contain any =, the method won't be able to parse it and will fail the assertion.
I see no way of solving this problem other than not using that method at all and implementing my own parsing, which feels wrong.
Am I missing something?
Here's a solution in Swift 4.2, assuming the TXT record has only strings:
/// Decode the TXT record as a string dictionary, or [:] if the data is malformed
public func dictionary(fromTXTRecord txtData: Data) -> [String: String] {
    var result = [String: String]()
    var data = txtData
    while !data.isEmpty {
        // The first byte of each record is its length, so prefix that much data
        let recordLength = Int(data.removeFirst())
        guard data.count >= recordLength else { return [:] }
        let recordData = data[..<(data.startIndex + recordLength)]
        data = data.dropFirst(recordLength)
        guard let record = String(bytes: recordData, encoding: .utf8) else { return [:] }
        // The format of the entry is "key=value"
        // (According to the reference implementation, = is optional if there is no value,
        // and any equals signs after the first are part of the value.)
        // `omittingEmptySubsequences` is necessary, otherwise an empty string will crash the next line
        let keyValue = record.split(separator: "=", maxSplits: 1, omittingEmptySubsequences: false)
        let key = String(keyValue[0])
        // If there's no value, make the value the empty string
        switch keyValue.count {
        case 1:
            result[key] = ""
        case 2:
            result[key] = String(keyValue[1])
        default:
            fatalError()
        }
    }
    return result
}
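As a quick sanity check, here is the malformed record from the question fed through this function. The byte layout is reconstructed by hand (each entry is a length byte followed by that many bytes):
// Two length-prefixed "asdf" entries with no '=', the case that trips Apple's parser
let malformed = Data([4] + Array("asdf".utf8) + [4] + Array("asdf".utf8))
print(dictionary(fromTXTRecord: malformed)) // ["asdf": ""], no assertion failure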
I'm still hoping there's something I'm missing here, but in the meantime I ended up checking the data for correctness and only then calling Apple's own method.
Here's my workaround:
func dictionaryFromTXTRecordData(data: NSData) -> [String: NSData] {
    let buffer = UnsafeBufferPointer<UInt8>(start: UnsafePointer(data.bytes), count: data.length)
    var pos = 0
    while pos < buffer.count {
        let len = Int(buffer[pos])
        // A record must fit in the bytes remaining after its length byte
        if len > (buffer.count - pos - 1) {
            return [:]
        }
        let subdata = data.subdataWithRange(NSRange(location: pos + 1, length: len))
        guard let substring = String(data: subdata, encoding: NSUTF8StringEncoding) else {
            return [:]
        }
        if !substring.containsString("=") {
            return [:]
        }
        pos = pos + len + 1
    }
    return NSNetService.dictionaryFromTXTRecordData(data)
}
I'm using Swift 2 here. All contributions are welcome: Swift 3 versions, Objective-C versions, improvements, corrections.
I just ran into this one using Swift 3. In my case the problem only occurred when I used NetService.dictionary(fromTXTRecord:); it did not occur when I switched to Objective-C and called NSNetService's dictionaryFromTXTRecord:. When the Objective-C call encounters an entry without an equals sign, it creates a key containing the data and shoves it into the dictionary with an NSNull value. From what I can tell, the Swift version then enumerates that dictionary and throws a fit when it sees the NSNull. My solution was to add an Objective-C file with a utility function that calls dictionaryFromTXTRecord: and cleans up the results before handing them back to my Swift code.

String with Unicode (variable) [duplicate]

I have a problem I couldn't find a solution to.
I have a string variable holding the code point "1f44d" and I want to convert it to the Unicode character 👍.
Usually one would do something like this:
println("\u{1f44d}") // ๐Ÿ‘
Here is what I mean:
let charAsString = "1f44d" // code in variable
println("\u{\(charAsString)}") // not working
I have tried several other ways, but somehow the workings behind this magic stay hidden from me.
One should imagine the value of charAsString coming from an API call or from another object.
One possible solution (explanations "inline"):
let charAsString = "1f44d"
// Convert hex string to numeric value first:
var charCode: UInt32 = 0
let scanner = NSScanner(string: charAsString)
if scanner.scanHexInt(&charCode) {
    // Create string from Unicode code point:
    let str = String(UnicodeScalar(charCode))
    println(str) // 👍
} else {
    println("invalid input")
}
Slightly simpler with Swift 2:
let charAsString = "1f44d"
// Convert hex string to numeric value first:
if let charCode = UInt32(charAsString, radix: 16) {
    // Create string from Unicode code point:
    let str = String(UnicodeScalar(charCode))
    print(str) // 👍
} else {
    print("invalid input")
}
Note also that not all code points are valid Unicode scalars,
compare Validate Unicode code point in Swift.
Update for Swift 3:
public init?(_ v: UInt32)
is now a failable initializer of UnicodeScalar and checks if the
given numeric input is a valid Unicode scalar value:
let charAsString = "1f44d"
// Convert hex string to numeric value first:
if let charCode = UInt32(charAsString, radix: 16),
   let unicode = UnicodeScalar(charCode) {
    // Create string from Unicode code point:
    let str = String(unicode)
    print(str) // 👍
} else {
    print("invalid input")
}
This can be done in two steps:
convert charAsString to an Int code
convert code to a Unicode character
The second step can be done e.g. like this:
var code = 0x1f44d
var scalar = UnicodeScalar(code)
var string = "\(scalar)"
As for the first step, see here how to convert a String in hex representation to Int.
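For completeness, a minimal sketch of the two steps together in modern Swift (essentially the same as Martin R's Swift 3 update above):
let charAsString = "1f44d"
// Step 1: hex string to number; step 2: number to Unicode scalar to String
if let code = UInt32(charAsString, radix: 16), let scalar = UnicodeScalar(code) {
    print("\(scalar)") // 👍
}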
As of Swift 2.0, every Int type has an initializer that takes a String as input. You can then easily generate the corresponding UnicodeScalar and print it afterwards, without having to change your representation of chars as strings ;).
UPDATED: Swift 3.0 changed the UnicodeScalar initializer
print("\u{1f44d}") // 👍
let charAsString = "1f44d" // code in variable
let charAsInt = Int(charAsString, radix: 16)! // As indicated by @MartinR, radix is required; the default won't do it
let uScalar = UnicodeScalar(charAsInt)! // In Swift 3.0 this initializer is failable, so you'll need either force unwrapping or optional binding
print("\(uScalar)")
You can use
let char = "-12"
print(char.unicodeScalars.map { $0.value })
You'll get the values as:
[45, 49, 50]
Here are a couple ways to do it:
let string = "1f44d"
Solution 1:
"&#x\(string);".applyingTransform(.toXMLHex, reverse: true)
Solution 2:
"U+\(string)".applyingTransform(StringTransform("Hex/Unicode"), reverse: true)
I made this extension that works pretty well:
extension String {
    var unicode: String? {
        if let charCode = UInt32(self, radix: 16),
           let unicode = UnicodeScalar(charCode) {
            let str = String(unicode)
            return str
        }
        return nil
    }
}
How to test it:
if let test = "e9c8".unicode {
    print(test)
}
//print:
You cannot use string interpolation inside a Unicode escape sequence, which is what you are trying to do. Therefore, the following code won't compile:
let charAsString = "1f44d"
print("\u{\(charAsString)}")
You will have to convert your string variable into an integer (using init(_:radix:) initializer) then create a Unicode scalar from this integer. The Swift 5 Playground sample code below shows how to proceed:
let validCodeString = "1f44d"
let validUnicodeScalarValue = Int(validCodeString, radix: 16)!
let validUnicodeScalar = Unicode.Scalar(validUnicodeScalarValue)!
print(validUnicodeScalar) // 👍

swift2 decrypt MD5

Hello, I just want to decrypt from MD5 back to a 'normal' string.
extension String {
    func MD5() -> String {
        let data = (self as NSString).dataUsingEncoding(NSUTF8StringEncoding)
        let result = NSMutableData(length: Int(CC_MD5_DIGEST_LENGTH))
        let resultBytes = UnsafeMutablePointer<CUnsignedChar>(result!.mutableBytes)
        CC_MD5(data!.bytes, CC_LONG(data!.length), resultBytes)
        let buff = UnsafeBufferPointer<CUnsignedChar>(start: resultBytes, count: result!.length)
        let hash = NSMutableString()
        for i in buff {
            hash.appendFormat("%02x", i)
        }
        return hash as String
    }
}

var x = "abc".MD5()
I want to get "abc" back from x.
It's not possible; that's the whole point of hashing. You can, however, brute-force it by going through all possibilities (all possible characters in every possible order), hashing each one, and checking for a collision.
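A minimal sketch of that brute-force idea in modern Swift, using CryptoKit's Insecure.MD5 (iOS 13+/macOS 10.15+) in place of the CommonCrypto extension above; the three-lowercase-letter search space is an assumption for illustration:
import CryptoKit
import Foundation

// Hex-encode the MD5 digest of a string; CryptoKit files MD5 under "Insecure" for a reason.
func md5Hex(_ s: String) -> String {
    Insecure.MD5.hash(data: Data(s.utf8)).map { String(format: "%02x", $0) }.joined()
}

let target = md5Hex("abc") // pretend we only know this digest
let alphabet = "abcdefghijklmnopqrstuvwxyz"
search: for a in alphabet {
    for b in alphabet {
        for c in alphabet {
            let candidate = "\(a)\(b)\(c)"
            if md5Hex(candidate) == target {
                print("found: \(candidate)") // found: abc
                break search
            }
        }
    }
}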
It is hard to reverse by design. Also check https://en.wikipedia.org/wiki/MD5
Simple: not possible, because an MD5 hash cannot be inverted.
Read about one-way functions.

iOS: Convert UnsafeMutablePointer<Int8> to String in swift?

As the title says, what is the correct way to convert an UnsafeMutablePointer<Int8> to a String in Swift?
// let's say x is an UnsafeMutablePointer<Int8>
var str = x.memory.????
I tried using x.memory.description, but that is obviously wrong: it gives me the wrong string value.
If the pointer points to a NUL-terminated C string of UTF-8 bytes, you can do this:
import Foundation
let x: UnsafeMutablePointer<Int8> = ...
// or UnsafePointer<Int8>
// or UnsafePointer<UInt8>
// or UnsafeMutablePointer<UInt8>
let str = String(cString: x)
Times have changed. In Swift 3+ you would do it like this:
If you want the UTF-8 to be validated:
let str: String? = String(validatingUTF8: c_str)
If you want invalid UTF-8 to be converted to the Unicode replacement character (�):
let str: String = String(cString: c_str)
This assumes c_str is of type UnsafePointer<CChar> (CChar is a type alias for Int8), which is what most C functions return; String(cString:) also accepts UnsafePointer<UInt8>.
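A quick illustration of the difference between the two initializers, using a deliberately invalid byte (0xFF can never appear in well-formed UTF-8); the byte values are my own example:
let bytes: [CChar] = [72, 105, -1, 0]       // "Hi", an invalid 0xFF byte, then the NUL terminator
print(String(validatingUTF8: bytes) as Any) // nil, validation fails
print(String(cString: bytes))               // "Hi�", the bad byte becomes U+FFFD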
this:
let str: String? = String(validatingUTF8: c_str)
doesn't appear to work with UnsafeMutablePointer<UInt8>
(which is what appears to be in my data).
This is me trivially figuring out how to do something like the C/Perl system function:
let task = Process()
task.launchPath = "/bin/ls"
task.arguments = ["-lh"]
let pipe = Pipe()
task.standardOutput = pipe
task.launch()
let data = pipe.fileHandleForReading.readDataToEndOfFile()
var unsafePointer = UnsafeMutablePointer<UInt8>.allocate(capacity: data.count)
data.copyBytes(to: unsafePointer, count: data.count)
let output: String = String(cString: unsafePointer)
print(output)
//let output: String? = String(validatingUTF8: unsafePointer)
//print(output!)
if I switch to validatingUTF8 (with optional) instead of cString, I get this error:
./ls.swift:19:37: error: cannot convert value of type 'UnsafeMutablePointer<UInt8>' to expected argument type 'UnsafePointer<CChar>' (aka 'UnsafePointer<Int8>')
let output : String? = String(validatingUTF8: unsafePointer)
^~~~~~~~~~~~~
Thoughts on how to validate UTF-8 on the output of the pipe (so I don't get the Unicode replacement symbol anywhere)?
(Yes, I'm not doing proper checking of my optional for the print(); that's not the problem I'm currently solving ;-) )
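One way to get validating behavior here without any pointer juggling: the bytes are already in a Data, and String(data:encoding:) returns nil on invalid UTF-8:
let data = pipe.fileHandleForReading.readDataToEndOfFile()
if let output = String(data: data, encoding: .utf8) {
    print(output) // validated UTF-8, no replacement characters
} else {
    print("output was not valid UTF-8")
}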

use fscanf() function in swift code

In Objective-C, I use fscanf to read a stream from a file and assign the values to variables:
int count;
char type[5];
fscanf(myFile, "count is %d, type is %4s ", &count, type);
I want to do the same thing in Swift. I tried:
//ERROR: Type annotation missing in pattern
//What type should I use for `count`?
var count
//ERROR: Consecutive statements on a line must be separated by ';'
var type[5] : char
fscanf(myFile, "count is %d, type is %4s ", &count, type)
But I got the compiler errors shown above. What is the correct way to use fscanf in Swift?
If you know a Swift way to achieve the same thing (without using fscanf), that would be great too!
I recommend using the Foundation framework for reading/writing file data. Here is sample code I used in my app to stream a file into NSData:
if let fileHandle = NSFileHandle(forReadingAtPath: "path/to/file") {
    fileHandle.seekToFileOffset(0)
    var data = fileHandle.readDataOfLength(5)
    var chars = [UInt8](count: 5, repeatedValue: 0)
    data.getBytes(&chars, length: 5)
    fileHandle.closeFile()
}
In case you need to read Int64 data from the file at a specific location:
if let fileHandle = NSFileHandle(forReadingAtPath: "path/to/file") {
    fileHandle.seekToFileOffset(0)
    var data = fileHandle.readDataOfLength(500)
    var intFetched: Int64 = 0
    let location = 100 // start at the 101st byte of the file
    data.getBytes(&intFetched, range: NSMakeRange(location, 8))
    println(intFetched.littleEndian)
    fileHandle.closeFile()
}
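If you specifically want the fscanf behavior, here is a hedged sketch: Swift cannot call the variadic fscanf directly, but on Apple platforms the va_list variant vfscanf is exposed and callable through withVaList, with the pointer arguments passed as CVarArg. The file path and format string are placeholders taken from the question:
import Foundation

guard let myFile = fopen("myfile.txt", "r") else { fatalError("cannot open file") }
var count: Int32 = 0
var type = [CChar](repeating: 0, count: 5) // room for 4 chars + NUL terminator

let matched: Int32 = withUnsafeMutablePointer(to: &count) { countPtr in
    type.withUnsafeMutableBufferPointer { typeBuf in
        withVaList([countPtr, typeBuf.baseAddress!]) { args in
            vfscanf(myFile, "count is %d, type is %4s ", args)
        }
    }
}
fclose(myFile)

if matched == 2 {
    print("count = \(count), type = \(String(cString: type))")
}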
