Error writing to disk in Swift 5 (iOS 12)

I want to store JSON text (as String) in a text file, or rather append each time I have fresh data to add. However, the following code always returns -1 as the return code from output.write(). I'm doing something wrong but I cannot figure out what:
let fileURL = (try! FileManager.default.urls(for: FileManager.SearchPathDirectory.documentDirectory, in: FileManager.SearchPathDomainMask.userDomainMask)).first!.appendingPathComponent("data.json")
let json = "..."
let tenGB = 10 * 1000 * 1000 * 1000
if let output = OutputStream(url: fileURL, append: true) {
    output.open()
    let bytes = output.write(json, maxLength: tenGB)
    if bytes < 0 {
        print("Failure writing to disk")
    } else if bytes == 0 {
        print("Failure writing to disk (capacity)")
    } else {
        print("\(bytes) bytes written to disk")
    }
    output.close()
} else {
    print("Unable to open file")
}
I don't expect the data to be 10 GB at all, more in the kB-MB range, but I thought I'd give it a large value.
The output of streamError: Error Domain=NSPOSIXErrorDomain Code=22 "Invalid argument" UserInfo={_kCFStreamErrorCodeKey=22, _kCFStreamErrorDomainKey=1}

As discussed in the comments, the problem comes from the 10 GB maxLength value.
You need to write only as many bytes as the data actually contains. Replace the line:
let bytes = output.write(json, maxLength: tenGB)
with
let bytes = output.write(json, maxLength: json.utf8.count)
If you need to append more data after that, have a look at this question, which does something similar.
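For context, a minimal hedged sketch of the corrected call in place (fileURL and json are the placeholders carried over from the question):
if let output = OutputStream(url: fileURL, append: true) {
    output.open()
    // Write only as many bytes as the UTF-8 representation of the string contains.
    let bytes = output.write(json, maxLength: json.utf8.count)
    if bytes < 0 {
        print("Write failed: \(String(describing: output.streamError))")
    } else {
        print("\(bytes) bytes written")
    }
    output.close()
}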

I wrapped the code in SwiftUI to test it:
import SwiftUI

let json = "[ 1, 2, 3, 4, 5 ]\n"

func stringWrite(_ string: String) {
    let fileURL = FileManager.default.urls(for: FileManager.SearchPathDirectory.documentDirectory, in: FileManager.SearchPathDomainMask.userDomainMask).first!.appendingPathComponent("data.json")
    if let output = OutputStream(url: fileURL, append: true) {
        output.open()
        let out = [UInt8](string.utf8)
        let bytes = output.write(out, maxLength: out.count)
        if bytes < 0 {
            print("Failure writing to disk")
            print("Error: \(String(describing: output.streamError))")
        } else if bytes == 0 {
            print("Failure writing to disk (capacity)")
        } else {
            print("\(bytes) bytes written to disk")
        }
        output.close()
    } else {
        print("Unable to open file")
    }
}

struct ContentView: View {
    var body: some View {
        Button(
            action: { stringWrite(json) },
            label: { Text("Do it") }
        )
    }
}
OutputStream.write expects a pointer to a UInt8 buffer. I also added printing of streamError and removed the try from the FileManager call, since urls(for:in:) doesn't throw. data.json looks like this after running a few times:
[ 1, 2, 3, 4, 5 ]
[ 1, 2, 3, 4, 5 ]
[ 1, 2, 3, 4, 5 ]
[ 1, 2, 3, 4, 5 ]
This question is more or less a duplicate of Writing a String to an NSOutputStream in Swift.

Related

Having trouble with input image with iOS Swift TensorFlowLite Image Classification Model?

I've been trying to add a plant recognition classifier to my app through a Firebase cloud-hosted ML model, and I've gotten close. The problem is, I'm pretty sure I'm messing up the image input data somewhere along the way: the classifier is churning out nonsense probabilities/results, while the same classifier run through a Python script gives me accurate results.
The model requires a 224x224 input image with 3 channels, scaled to [0, 1]. I've done all this but can't seem to get the CGImage right coming from the Camera/ImagePicker. Here is the bit of code that processes the input image:
if let imageData = info[.originalImage] as? UIImage {
    DispatchQueue.main.async {
        let resizedImage = imageData.scaledImage(with: CGSize(width: 224, height: 224))
        let ciImage = CIImage(image: resizedImage!)
        let CGcontext = CIContext(options: nil)
        let image: CGImage = CGcontext.createCGImage(ciImage!, from: ciImage!.extent)!
        guard let context = CGContext(
            data: nil,
            width: image.width, height: image.height,
            bitsPerComponent: 8, bytesPerRow: image.width * 4,
            space: CGColorSpaceCreateDeviceRGB(),
            bitmapInfo: CGImageAlphaInfo.noneSkipFirst.rawValue
        ) else {
            return
        }
        context.draw(image, in: CGRect(x: 0, y: 0, width: image.width, height: image.height))
        guard let imageData = context.data else { return }
        print("Image data showing as: \(imageData)")
        var inputData = Data()
        do {
            for row in 0 ..< 224 {
                for col in 0 ..< 224 {
                    let offset = 4 * (row * context.width + col)
                    // (Ignore offset 0, the unused alpha channel)
                    let red = imageData.load(fromByteOffset: offset + 1, as: UInt8.self)
                    let green = imageData.load(fromByteOffset: offset + 2, as: UInt8.self)
                    let blue = imageData.load(fromByteOffset: offset + 3, as: UInt8.self)
                    // Normalize channel values to [0.0, 1.0].
                    var normalizedRed = Float32(red) / 255.0
                    var normalizedGreen = Float32(green) / 255.0
                    var normalizedBlue = Float32(blue) / 255.0
                    // Append normalized values to Data object in RGB order.
                    let elementSize = MemoryLayout.size(ofValue: normalizedRed)
                    var bytes = [UInt8](repeating: 0, count: elementSize)
                    memcpy(&bytes, &normalizedRed, elementSize)
                    inputData.append(&bytes, count: elementSize)
                    memcpy(&bytes, &normalizedGreen, elementSize)
                    inputData.append(&bytes, count: elementSize)
                    memcpy(&bytes, &normalizedBlue, elementSize)
                    inputData.append(&bytes, count: elementSize)
                }
            }
            print("Successfully added inputData")
            self.parent.invokeInterpreter(inputData: inputData)
        } catch let error {
            print("Failed to add input: \(error)")
        }
    }
}
Afterwards, I process the inputData with the following:
func invokeInterpreter(inputData: Data) {
    do {
        var interpreter = try Interpreter(modelPath: ProfileUserData.sharedUserData.modelPath)
        var labels: [String] = []
        try interpreter.allocateTensors()
        try interpreter.copy(inputData, toInputAt: 0)
        try interpreter.invoke()
        let output = try interpreter.output(at: 0)
        switch output.dataType {
        case .uInt8:
            guard let quantization = output.quantizationParameters else {
                print("No results returned because the quantization values for the output tensor are nil.")
                return
            }
            let quantizedResults = [UInt8](output.data)
            let results = quantizedResults.map {
                quantization.scale * Float(Int($0) - quantization.zeroPoint)
            }
            let sum = results.reduce(0, +)
            print("Sum of all dequantized results is: \(sum)")
            print("Count of dequantized results is: \(results.indices.count)")
            let filename = "plantLabels"
            let fileExtension = "csv"
            guard let labelPath = Bundle.main.url(forResource: filename, withExtension: fileExtension) else {
                print("Labels file not found in bundle. Please check labels file.")
                return
            }
            do {
                let contents = try String(contentsOf: labelPath, encoding: .utf8)
                labels = contents.components(separatedBy: .newlines)
                print("Count of label rows is: \(labels.indices.count)")
            } catch {
                fatalError("Labels file named \(filename).\(fileExtension) cannot be read. Please add a " +
                    "valid labels file and try again.")
            }
            let zippedResults = zip(labels.indices, results)
            // Sort the zipped results by confidence value in descending order.
            let sortedResults = zippedResults.sorted { $0.1 > $1.1 }.prefix(3)
            print("Printing sortedResults: \(sortedResults)")
        case .float32:
            print("Output tensor data type [Float32] is unsupported for this model.")
        default:
            print("Output tensor data type \(output.dataType) is unsupported for this model.")
            return
        }
    } catch {
        // Error with interpreter
        print("Error with running interpreter: \(error.localizedDescription)")
    }
}

NFC Money/Transport Card Reading in Swift

I'm trying to build an iOS app for reading money/transportation card (e.g. credit card) information. My app already detects the card. However, I still can't get at the data inside the bytes I retrieve.
Here's my APDU command code:
if case let .iso7816(tag) = nfcTag {
    let apdu = NFCISO7816APDU(instructionClass: 13, instructionCode: 0xB2, p1Parameter: 0, p2Parameter: 0, data: ReaderType.cardInfo.rawValue.hexStringToData(), expectedResponseLength: 32)
    self.tagSession?.alertMessage = "Select Command Success!"
    tag.sendCommand(apdu: apdu) { (data: Data, sw1: UInt8, sw2: UInt8, error: Error?) in
        guard error == nil else {
            print("Error send command: \(error!)")
            session.invalidate(errorMessage: "Error send command: \(error!)")
            return
        }
        self.processingData(data: data, sTag: tag, session: session)
    }
}
Here's my data processing code:
func processingData(data: Data, sTag: NFCISO7816Tag, session: NFCReaderSession) {
    print(String(data: data, encoding: .utf8) ?? "")
    let tagUIDData = sTag.identifier
    var byteData: [UInt8] = []
    tagUIDData.withUnsafeBytes { byteData.append(contentsOf: $0) }
    var uidString = ""
    for byte in byteData {
        let decimalNumber = String(byte, radix: 16)
        if (Int(decimalNumber) ?? 0) < 10 {
            uidString.append("0\(decimalNumber)")
        } else {
            uidString.append(decimalNumber)
        }
    }
    print("\n\(byteData) converted to Tag UID: \(uidString)")
    let value: String = getBalanceTagConverter(byteData.hexa)
    print("\ncardUID : \(value)")
    self.tagSession?.alertMessage = "Your UID is \(byteData.hexa)"
    self.tagSession?.invalidate()
}

In iOS, how to create an audio file (.wav, .mp3) from data?

I am working on a BLE project where the hardware records audio data and sends it to the iOS application. I am writing logic to convert that data into an mp3/wav file.
Here is the mp3 file conversion logic I wrote, starting from Data:
func storeMusicFile(data: Data) {
    let fileName = "Record-1"
    guard mediaDirectoryURL != nil else {
        print("Error: Failed to fetch mediaDirectoryURL")
        return
    }
    let filePath = mediaDirectoryURL!.appendingPathComponent("/\(fileName).mp3")
    do {
        try data.write(to: filePath, options: .atomic)
    } catch {
        print("Failed while storing files.")
    }
}
But while playing the audio file in AVAudioPlayer, I get the error "The operation couldn’t be completed. (OSStatus error 1954115647.)"
So I am confused whether the audio file conversion logic is wrong or the data from the hardware still needs to be decoded.
The previous answer from @sagar-thummar saved me a ton of time. Unfortunately I am not allowed to vote or comment on it.
A few corrections I needed to make were:
change mediaDirectoryURL to documentDirectoryURL
create an ARError exception (a minimal sketch of such a type follows below)
adjust the sample rate AND bits per sample to my settings
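ARError and AlertMessage are referenced by the code further down but never shown; purely as an assumption about their shape, a minimal sketch could look like this:
// Hypothetical minimal error type matching the call site in storeMusicFile(data:);
// the original ARError is not shown in the answer.
struct ARError: Error {
    let localizedDescription: String
}

// Hypothetical constants referenced by storeMusicFile(data:).
enum AlertMessage {
    static let medioDirectoryPathNotAvaiable = "Media directory path is not available"
}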
To create an audio file (.mp3/.wav), you have to dynamically calculate the header data and prepend that header to the actual audio data transferred from the hardware.
Reference: WAVE PCM soundfile format
Below is a Swift 4 code snippet for reference:
// MARK: Logic for Creating Audio file
class ARFileManager {
    static let shared = ARFileManager()
    let fileManager = FileManager.default

    var documentDirectoryURL: URL? {
        return fileManager.urls(for: .documentDirectory, in: .userDomainMask).first
    }

    func createWavFile(using rawData: Data) throws -> URL {
        // Prepare Wav file header
        let waveHeaderFormate = createWaveHeader(data: rawData) as Data
        // Prepare final Wav file data
        let waveFileData = waveHeaderFormate + rawData
        // Store Wav file in document directory.
        return try storeMusicFile(data: waveFileData)
    }

    private func createWaveHeader(data: Data) -> NSData {
        let sampleRate: Int32 = 2000
        let chunkSize: Int32 = 36 + Int32(data.count)
        let subChunkSize: Int32 = 16
        let format: Int16 = 1
        let channels: Int16 = 1
        let bitsPerSample: Int16 = 8
        let byteRate: Int32 = sampleRate * Int32(channels * bitsPerSample / 8)
        let blockAlign: Int16 = channels * bitsPerSample / 8
        let dataSize: Int32 = Int32(data.count)
        let header = NSMutableData()
        header.append([UInt8]("RIFF".utf8), length: 4)
        header.append(intToByteArray(chunkSize), length: 4)
        // WAVE
        header.append([UInt8]("WAVE".utf8), length: 4)
        // FMT
        header.append([UInt8]("fmt ".utf8), length: 4)
        header.append(intToByteArray(subChunkSize), length: 4)
        header.append(shortToByteArray(format), length: 2)
        header.append(shortToByteArray(channels), length: 2)
        header.append(intToByteArray(sampleRate), length: 4)
        header.append(intToByteArray(byteRate), length: 4)
        header.append(shortToByteArray(blockAlign), length: 2)
        header.append(shortToByteArray(bitsPerSample), length: 2)
        header.append([UInt8]("data".utf8), length: 4)
        header.append(intToByteArray(dataSize), length: 4)
        return header
    }

    private func intToByteArray(_ i: Int32) -> [UInt8] {
        return [
            // little endian
            UInt8(truncatingIfNeeded: (i      ) & 0xff),
            UInt8(truncatingIfNeeded: (i >> 8 ) & 0xff),
            UInt8(truncatingIfNeeded: (i >> 16) & 0xff),
            UInt8(truncatingIfNeeded: (i >> 24) & 0xff)
        ]
    }

    private func shortToByteArray(_ i: Int16) -> [UInt8] {
        return [
            // little endian
            UInt8(truncatingIfNeeded: (i     ) & 0xff),
            UInt8(truncatingIfNeeded: (i >> 8) & 0xff)
        ]
    }

    func storeMusicFile(data: Data) throws -> URL {
        let fileName = "Record \(Date().dateFileName)"
        guard mediaDirectoryURL != nil else {
            debugPrint("Error: Failed to fetch mediaDirectoryURL")
            throw ARError(localizedDescription: AlertMessage.medioDirectoryPathNotAvaiable)
        }
        let filePath = mediaDirectoryURL!.appendingPathComponent("\(fileName).wav")
        debugPrint("File Path: \(filePath.path)")
        try data.write(to: filePath)
        return filePath // Save file's path respected to document directory.
    }
}
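For completeness, a hedged usage sketch: rawAudioData is an assumed placeholder for the bytes received over BLE, and the AVAudioPlayer playback is illustrative rather than part of the original answer.
import AVFoundation

// Hypothetical call site; rawAudioData stands in for the raw audio payload from the hardware.
let rawAudioData = Data() // placeholder
do {
    let wavURL = try ARFileManager.shared.createWavFile(using: rawAudioData)
    let player = try AVAudioPlayer(contentsOf: wavURL)
    player.play()
} catch {
    print("Could not create or play WAV file: \(error)")
}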

Decrypt Media Files in chunks and play via AVPlayer

I have an mp4 video file which I am encrypting to save and decrypting to play via AVPlayer, using the CryptoSwift library for encryption/decryption.
It works fine when I decrypt the whole file at once, but my file is quite big, so that takes 100% CPU and a lot of memory. I therefore need to decrypt the encrypted file in chunks.
I tried to decrypt the file in chunks, but AVPlayer does not play the video because it doesn't recognize the decrypted chunk data; maybe the data is not stored sequentially while encrypting the file. I have tried ChaCha20, AES, AES.CTR and AES.CBC to encrypt and decrypt the files, but to no avail.
extension PlayerController: AVAssetResourceLoaderDelegate {
    func resourceLoader(resourceLoader: AVAssetResourceLoader, shouldWaitForLoadingOfRequestedResource loadingRequest: AVAssetResourceLoadingRequest) -> Bool {
        let request = loadingRequest.request
        guard let path = request.URL?.path where request.URL?.scheme == Constants.customVideoScheme else { return true }
        if let contentRequest = loadingRequest.contentInformationRequest {
            do {
                let fileAttributes = try NSFileManager.defaultManager().attributesOfItemAtPath(path)
                if let fileSizeNumber = fileAttributes[NSFileSize] {
                    contentRequest.contentLength = fileSizeNumber.longLongValue
                }
            } catch { }
            if fileHandle == nil {
                fileHandle = NSFileHandle(forReadingAtPath: (request.URL?.path)!)!
            }
            contentRequest.contentType = "video/mp4"
            contentRequest.byteRangeAccessSupported = true
        }
        if let data = decryptData(loadingRequest, path: path), dataRequest = loadingRequest.dataRequest {
            dataRequest.respondWithData(data)
            loadingRequest.finishLoading()
            return true
        }
        return true
    }

    func decryptData(loadingRequest: AVAssetResourceLoadingRequest, path: String) -> NSData? {
        print("Current OFFSET: \(loadingRequest.dataRequest?.currentOffset)")
        print("requested OFFSET: \(loadingRequest.dataRequest?.requestedOffset)")
        print("Current Length: \(loadingRequest.dataRequest?.requestedLength)")
        if loadingRequest.contentInformationRequest != nil {
            var data = fileHandle!.readDataOfLength((loadingRequest.dataRequest?.requestedLength)!)
            fileHandle!.seekToFileOffset(0)
            data = decodeVideoData(data)!
            return data
        } else {
            fileHandle?.seekToFileOffset(UInt64((loadingRequest.dataRequest?.currentOffset)!))
            let data = fileHandle!.readDataOfLength((loadingRequest.dataRequest?.requestedLength)!)
            // let data = fileHandle!.readDataOfLength(length!) ** When I use this its not playing video but play fine when try with requestedLength **
            return decodeVideoData(data)
        }
    }
}
The code that decodes the NSData:
func decodeVideoData(data: NSData) -> NSData? {
    if let cha = ChaCha20(key: Constants.Encryption.SecretKey, iv: Constants.Encryption.IvKey) {
        let decrypted: NSData = try! data.decrypt(cha)
        return decrypted
    }
    return nil
}
I need help with this issue; kindly guide me to the right way to achieve this.

For an in-depth and more complete CommonCrypto wrapper, check out my CommonCrypto wrapper. I've extracted bits and pieces from it for this answer.
First of all, we need to define some functions that will do the encryption/decryption. I'm assuming, for now, that you use AES(256) CBC with PKCS#7 padding. Summarising the snippet below: there's an update function that can be called repeatedly to consume the chunks, and a final function that wraps up any leftovers (it usually deals with padding).
import CommonCrypto
import Foundation

enum CryptoError: Error {
    case generic(CCCryptorStatus)
}

func getOutputLength(_ reference: CCCryptorRef?, inputLength: Int, final: Bool) -> Int {
    CCCryptorGetOutputLength(reference, inputLength, final)
}

func update(_ reference: CCCryptorRef?, data: Data) throws -> Data {
    var output = [UInt8](repeating: 0, count: getOutputLength(reference, inputLength: data.count, final: false))
    let status = data.withUnsafeBytes { dataPointer -> CCCryptorStatus in
        CCCryptorUpdate(reference, dataPointer.baseAddress, data.count, &output, output.count, nil)
    }
    guard status == kCCSuccess else {
        throw CryptoError.generic(status)
    }
    return Data(output)
}

func final(_ reference: CCCryptorRef?) throws -> Data {
    var output = [UInt8](repeating: 0, count: getOutputLength(reference, inputLength: 0, final: true))
    var moved = 0
    let status = CCCryptorFinal(reference, &output, output.count, &moved)
    guard status == kCCSuccess else {
        throw CryptoError.generic(status)
    }
    output.removeSubrange(moved...)
    return Data(output)
}
Next up, for the purpose of demonstration, the encryption.
let key = Data(repeating: 0x0a, count: kCCKeySizeAES256)
let iv = Data(repeating: 0, count: kCCBlockSizeAES128)
let bigFile = (0 ..< 0xffff).map { _ in
    return Data(repeating: UInt8.random(in: 0 ... UInt8.max), count: kCCBlockSizeAES128)
}.reduce(Data(), +)

var encryptor: CCCryptorRef?
CCCryptorCreate(CCOperation(kCCEncrypt), CCAlgorithm(kCCAlgorithmAES), CCOptions(kCCOptionPKCS7Padding), Array(key), key.count, Array(iv), &encryptor)
do {
    let ciphertext = try update(encryptor, data: bigFile) + final(encryptor)
    print(ciphertext) // 1048576 bytes
} catch {
    print(error)
}
That appears to me to be quite a large file. Decrypting would be done in a similar fashion.
var decryptor: CCCryptorRef?
CCCryptorCreate(CCOperation(kCCDecrypt), CCAlgorithm(kCCAlgorithmAES), CCOptions(kCCOptionPKCS7Padding), Array(key), key.count, Array(iv), &decryptor)
do {
    var plaintext = Data()
    for i in 0 ..< 0xffff {
        plaintext += try update(decryptor, data: ciphertext[i * kCCBlockSizeAES128 ..< i * kCCBlockSizeAES128 + kCCBlockSizeAES128])
    }
    plaintext += try final(decryptor)
    print(plaintext == bigFile, plaintext) // true 1048560 bytes
} catch {
    print(error)
}
The encryptor can be altered for different modes and should also be released once it's done. I'm not too sure how arbitrary output sizes from the update function will behave, but this should be enough to give you an idea of how it can be done using CommonCrypto.
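On the point about releasing, a small sketch of freeing the cryptor contexts from the snippets above once you are finished with them:
// Release the cryptor contexts when encryption/decryption is done,
// so CommonCrypto can free its internal state.
if let encryptor = encryptor {
    CCCryptorRelease(encryptor)
}
if let decryptor = decryptor {
    CCCryptorRelease(decryptor)
}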

Swift base64 decoding returns nil

I am trying to decode a base64 string to an image in Swift using the following code:
let decodedData = NSData(base64EncodedString: encodedImageData, options: NSDataBase64DecodingOptions.IgnoreUnknownCharacters)
Unfortunately, the variable decodedData turns out to have a value of nil.
Debugging through the code, I verified that the variable encodedImageData is not nil and contains the correct encoded image data (verified by using an online base64 to image converter). What could possibly be the reason behind my issue?
This method requires padding with "="; the length of the string must be a multiple of 4.
In some implementations of base64 the padding character is not needed for decoding, since the number of missing bytes can be calculated. But in Foundation's implementation it is mandatory.
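A quick illustration of the difference (the sample string here is mine, not from the original answer):
// "aGk=" is the base64 encoding of "hi"; "aGk" is the same string without its padding.
let unpadded = Data(base64Encoded: "aGk")  // nil: Foundation insists on the "=" padding
let padded = Data(base64Encoded: "aGk=")   // 2 bytes: 0x68 0x69 ("hi")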
Updated:
As noted in the comments, it's a good idea to check first whether the string length is already a multiple of 4. If encoded64 holds your base64 string and it's not a constant, you can do something like this:
Swift 2
let remainder = encoded64.characters.count % 4
if remainder > 0 {
    encoded64 = encoded64.stringByPaddingToLength(encoded64.characters.count + 4 - remainder,
                                                  withPad: "=",
                                                  startingAt: 0)
}
Swift 3
let remainder = encoded64.characters.count % 4
if remainder > 0 {
    encoded64 = encoded64.padding(toLength: encoded64.characters.count + 4 - remainder,
                                  withPad: "=",
                                  startingAt: 0)
}
Swift 4
let remainder = encoded64.count % 4
if remainder > 0 {
    encoded64 = encoded64.padding(toLength: encoded64.count + 4 - remainder,
                                  withPad: "=",
                                  startingAt: 0)
}
Updated one line version:
Or you can use this one-line version that returns the same string when its length is already a multiple of 4:
encoded64.padding(toLength: ((encoded64.count + 3) / 4) * 4,
                  withPad: "=",
                  startingAt: 0)
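For example, a 22-character string is padded to ((22 + 3) / 4) * 4 = 24 characters (two "=" are appended), while a 24-character string maps to ((24 + 3) / 4) * 4 = 24 and comes back unchanged.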
When the number of characters is already divisible by 4, you need to avoid adding padding:
private func base64PaddingWithEqual(encoded64: String) -> String {
    let remainder = encoded64.characters.count % 4
    if remainder == 0 {
        return encoded64
    } else {
        // padding with equal
        let newLength = encoded64.characters.count + (4 - remainder)
        return encoded64.stringByPaddingToLength(newLength, withString: "=", startingAtIndex: 0)
    }
}
(Swift 3)
I have been in this situation. Trying to get the data from a base64-encoded string returned nil for me when I used this line:
let imageData = Data(base64Encoded: strBase64, options: .ignoreUnknownCharacters)
I tried padding the string and that didn't work out either.
This is what worked for me:
func imageForBase64String(_ strBase64: String) -> UIImage? {
    do {
        let imageData = try Data(contentsOf: URL(string: strBase64)!)
        let image = UIImage(data: imageData)
        return image!
    } catch {
        return nil
    }
}
Check the content of your encoded string and look for the prefix "data:image/png;base64". I had this issue and noticed that my base64 String had a prefix like this, so I used this approach and it worked:
extension String {
    func getImageFromBase64() -> UIImage? {
        guard let url = URL(string: self) else {
            return nil
        }
        do {
            let data = try Data(contentsOf: url)
            return UIImage(data: data)
        } catch {
            return nil
        }
    }
}
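Usage would then look something like this (the data URL below is a truncated, hypothetical placeholder rather than a real image):
let base64WithPrefix = "data:image/png;base64,iVBORw0KGgo..." // placeholder string
let image = base64WithPrefix.getImageFromBase64()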
This helped me:
extension String {
    func fromBase64() -> String? {
        guard let data = Data(base64Encoded: self.replacingOccurrences(of: "_", with: "="), options: Data.Base64DecodingOptions(rawValue: 0)) else {
            return nil
        }
        return String(data: data, encoding: .utf8)
    }
}
Usage:
print(base64EncodedString.fromBase64())
Another one-line version:
let length = encoded64.characters.count
encoded64 = encoded64.padding(toLength: length + (4 - length % 4) % 4, withPad: "=", startingAt: 0)
Decoding gave me problems with special characters, but an interesting point is that if we use NSData and NSString, it works fine:
static func decodeBase64(input: String) -> String {
    let base64Decoded = NSData(base64Encoded: input, options: NSData.Base64DecodingOptions(rawValue: 0))
        .map({ NSString(data: $0 as Data, encoding: String.Encoding.utf8.rawValue) })
    return base64Decoded!! as String
}
You can use this extension to make sure that the string has the correct length for decoding with Foundation (divisible by 4):
extension String {
    var paddedForBase64Decoding: String {
        appending(String(repeating: "=", count: (4 - count % 4) % 4))
    }
}
Usage:
Data(base64Encoded: base64String.paddedForBase64Decoding)
