I need some sample code to see how to transfer an audio file (or any other binary data) over a CoreBluetooth L2CAP channel. Assuming the file is not that small, let us say it is a few hundred kilobytes.
I am working on a small iOS app that does this, but it only works halfway.
At this point I can transfer a few thousand bytes, but it does not go any further.
Just in case, this is the related code I have on the sending side:
let path = Bundle.main.path(forResource: "\(name)", ofType: nil)!
let url = URL(fileURLWithPath: path)
do {
    let audioData = try Data(contentsOf: url)
    // do something useful with audioData to send it
    // to the other device.
    ..........
    let bytesWritten = audioData.withUnsafeBytes {
        outStream!.write($0.bindMemory(to: UInt8.self).baseAddress!, maxLength: audioData.count)
    }
    if bytesWritten > 0 {
        ..........
    }
} catch {
    print("Error: \(error.localizedDescription)")
}
On the receiving side:
let inData = Data(reading: inStream)
if inData.count != 0 {
// Data has been received.
.......
}
I am obviously not showing much detail, but my code is missing some critical parts anyway, which is why I would be glad to find a small working sample to see how it is done.
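For reference, this is the rough shape I have in mind: a delegate-driven send loop that writes the next chunk only when the output stream reports space available. The class name and chunk size below are arbitrary placeholders of mine, not tested code:
import Foundation
import CoreBluetooth

// Placeholder sketch: feed the L2CAP output stream in chunks from a StreamDelegate.
class L2CAPSender: NSObject, StreamDelegate {
    private let outStream: OutputStream
    private let pendingData: Data
    private var offset = 0

    init(channel: CBL2CAPChannel, data: Data) {
        self.outStream = channel.outputStream
        self.pendingData = data
        super.init()
        outStream.delegate = self
        outStream.schedule(in: .main, forMode: .default)
        outStream.open()
    }

    func stream(_ aStream: Stream, handle eventCode: Stream.Event) {
        // Write only when the stream can actually accept more bytes
        if eventCode.contains(.hasSpaceAvailable) {
            sendNextChunk()
        }
    }

    private func sendNextChunk() {
        guard offset < pendingData.count else { return } // everything has been sent
        let chunkSize = min(4096, pendingData.count - offset)
        let written = pendingData.withUnsafeBytes { (raw: UnsafeRawBufferPointer) -> Int in
            let base = raw.baseAddress!.advanced(by: offset).assumingMemoryBound(to: UInt8.self)
            return outStream.write(base, maxLength: chunkSize)
        }
        if written > 0 {
            offset += written // advance by what was actually accepted
        }
    }
}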
I'm making an audio player.
I'm importing a file from iCloud Drive using .fileImporter.
I get a file URL that looks like this: file:///private/var/mobile/Library/Mobile%20Documents/com~apple~CloudDocs/_Storage/Audio-books/%D0%91%D1%80%D0%B5%D0%BD%D0%B4%D1%8F%D1%82%D0%B8%D0%BD%D0%B0/Audiobook.mp3
Then I pass it to an audio player (I tried AVPlayer and AVAudioPlayer). Both work on the iOS simulator.
On the device, after the import I get this error: The operation couldn’t be completed. (OSStatus error -54.)
I know it's possible; an app called Evermusic does much the same with on-device files.
Are there permissions I need to be granted to play audio stored on the device?
How can I access the container for com~apple~CloudDocs?
Thank you very much for the help; any suggestions are greatly appreciated, I'm seriously stuck.
For future reference, here's the repo of the project: https://github.com/yaosamo/AudioPlayer
You need to use startAccessingSecurityScopedResource in order to get read access to those files. See the documentation:
https://developer.apple.com/documentation/foundation/nsurl/1417051-startaccessingsecurityscopedreso
https://developer.apple.com/documentation/corefoundation/1543318-cfurlstartaccessingsecurityscope
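The call pattern itself is small; a minimal sketch (assuming url is the URL handed to you by .fileImporter or resolved from a bookmark) looks like this:
let didStartAccess = url.startAccessingSecurityScopedResource()
defer {
    // Balance every successful start with a stop once you are done with the file
    if didStartAccess {
        url.stopAccessingSecurityScopedResource()
    }
}
// Reads (for example creating an AVAudioPlayer) should now succeed on the device
let audioData = try? Data(contentsOf: url)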
In addition to #jnpdx's answer, I want to add some details and my solution example.
A few core things:
✅ In my app, to access secured audio you need to use startAccessingSecurityScopedResource().
❌ You can't simply store the URL and use it later; in fact, you don't store the file URL at all. You need to call bookmarkData() on your URL and store that, so you can restore the URL later.
✅ Watch the Apple presentation here.
Here's how I import the file:
.fileImporter(isPresented: $presentImporter, allowedContentTypes: [.mp3]) { result in
    switch result {
    case .success(let url):
        // Start accessing the security-scoped URL
        let startAccess = url.startAccessingSecurityScopedResource()
        defer {
            // Must stop accessing once we stop using it
            if startAccess {
                url.stopAccessingSecurityScopedResource()
            }
        }
        print("---- Access Granted?", startAccess)
        // Creating a new book
        let newBook = Book(context: viewContext)
        // Getting bookmarkData of the URL
        let bookmarkData = try? url.bookmarkData()
        newBook.name = "\(url.lastPathComponent)"
        // Save the bookmark data into Core Data
        newBook.urldata = bookmarkData
        // Specifying the parent item in Core Data
        newBook.origin = playlist.self
        try? viewContext.save()
    case .failure(let error):
        print(error)
    }
}
Player retrieving URL:
func Audioplayer(bookmarkData: Data) {
    // Restore the security-scoped bookmark
    var bookmarkDataIsStale = false
    let playNow = try? URL(resolvingBookmarkData: bookmarkData, bookmarkDataIsStale: &bookmarkDataIsStale)
    do {
        player = try AVAudioPlayer(contentsOf: playNow!)
        // Delegate listens for when audio is finished
        player?.delegate = del
        NotificationCenter.default.addObserver(forName: NSNotification.Name("ended"), object: nil, queue: .main) { _ in
            player?.stop()
            ended = true
            print("---- Book has ended ----")
        }
    } catch let error {
        print("Player Error", error.localizedDescription)
    }
    player?.prepareToPlay()
    player?.play()
}
Thank you, and once again here's the repo on GitHub.
I'm trying to write out an audio file after doing some processing, and am getting an error. I've reduced the error to this simple standalone case:
import Foundation
import AVFoundation
do {
    let inputFileURL = URL(fileURLWithPath: "/Users/andrewmadsen/Desktop/test.m4a")
    let file = try AVAudioFile(forReading: inputFileURL, commonFormat: .pcmFormatFloat32, interleaved: true)
    guard let buffer = AVAudioPCMBuffer(pcmFormat: file.processingFormat, frameCapacity: AVAudioFrameCount(file.length)) else {
        throw NSError()
    }
    buffer.frameLength = buffer.frameCapacity
    try file.read(into: buffer)

    let tempURL = URL(fileURLWithPath: NSTemporaryDirectory())
        .appendingPathComponent("com.openreelsoftware.AudioWriteTest")
        .appendingPathComponent(UUID().uuidString)
        .appendingPathExtension("caf")
    let fm = FileManager.default
    let dirURL = tempURL.deletingLastPathComponent()
    if !fm.fileExists(atPath: dirURL.path, isDirectory: nil) {
        try fm.createDirectory(at: dirURL, withIntermediateDirectories: true, attributes: nil)
    }

    var settings = buffer.format.settings
    settings[AVAudioFileTypeKey] = kAudioFileCAFType
    let tempFile = try AVAudioFile(forWriting: tempURL, settings: settings)
    try tempFile.write(from: buffer)
} catch {
    print(error)
}
When this code runs, the tempFile.write(from: buffer) call throws an error:
Error Domain=com.apple.coreaudio.avfaudio Code=-50 "(null)" UserInfo={failed call=ExtAudioFileWrite(_imp->_extAudioFile, buffer.frameLength, buffer.audioBufferList)}
test.m4a is a stereo, 44.1 kHz AAC file (from the iTunes Store), though the failure occurs with other stereo files in other formats (AIFF and WAV) as well.
The code does not fail, and instead correctly saves the original audio out to a new file if I change the interleaved parameter to false when creating the original input AVAudioFile (file). However, in this case, the following message is logged to the console:
Audio files cannot be non-interleaved. Ignoring setting AVLinearPCMIsNonInterleaved YES.
It seems strange and confusing that writing a non-interleaved buffer works fine, despite a message saying that files must be interleaved, while writing an interleaved buffer fails. This is the opposite of what I expected.
I'm aware that reading a file using the plain AVAudioFile(forReading:) initializer without specifying a format defaults to non-interleaved (i.e. the "standard" AVAudioFormat at the file's actual sample rate and channel count). Does this mean that I really do have to convert interleaved audio to non-interleaved before trying to write it?
Notably, in the actual program where this problem came up, I'm doing something much more complex than simply reading a file in and writing it back out again, and I do need to handle interleaved audio. I have confirmed, however, that the original, more complex code also fails only for interleaved stereo audio.
Is there something tricky I need to do to get AVAudioFile to write out a buffer containing interleaved PCM audio?
The mixup here is that there are TWO formats in play: the format of the output file, and the format of the buffers you will write (the processing format). The initializer AVAudioFile(forWriting:settings:) does not let you choose the processing format and defaults to deinterleaved, hence your error.
This opens the file for writing using the standard format (deinterleaved floating point).
You need to use the other initializer, AVAudioFile(forWriting:settings:commonFormat:interleaved:), whose last two arguments specify the processing format (the argument names could have been clearer about that, tbh).
var settings: [String : Any] = [:]
settings[AVFormatIDKey] = kAudioFormatMPEG4AAC
settings[AVAudioFileTypeKey] = kAudioFileCAFType
settings[AVSampleRateKey] = buffer.format.sampleRate
settings[AVNumberOfChannelsKey] = 2
settings[AVLinearPCMIsFloatKey] = (buffer.format.commonFormat == .pcmFormatFloat32 || buffer.format.commonFormat == .pcmFormatFloat64) // true only for float sample formats
let tempFile = try AVAudioFile(forWriting: tempURL, settings: settings, commonFormat: buffer.format.commonFormat, interleaved: buffer.format.isInterleaved)
try tempFile.write(from: buffer)
P.S. Passing the buffer format settings directly to AVAudioFile gets you an LPCM CAF file, which you may not want, hence I reconstruct the file settings.
I'm not positive here, but since you're making the output file settings the same as the processing format, it's possible that the processing format has an inflexible policy on interleaving while the file settings format would be fine with it, or vice versa.
Here's what I'd try first. Incomplete example, but should be enough to illustrate the areas to test.
let sourceFile: AVAudioFile
let format: AVAudioFormat
do {
    // for the moment, try this without any specific format and see what it gives you
    sourceFile = try AVAudioFile(forReading: inputFileURL)
    format = sourceFile.processingFormat
    print(format) // let's see what we're getting so far, maybe some clues
} catch {
    fatalError("Unable to load the source audio file: \(error.localizedDescription).")
}
let sourceSettings = sourceFile.fileFormat.settings
var outputSettings = sourceSettings // start with the settings of the original file rather than the buffer format settings
outputSettings[AVAudioFileTypeKey] = kAudioFileCAFType
// etc...
I am currently working on code that downloads .m4a audio files from Firebase and lets the user hear the music play within the same view controller. Through lots of research on SO and other websites, I managed to get the audio downloaded, but when the audio is supposed to play, I only hear static. In fact, it plays the entire duration of the audio clip (e.g. the audio is 6s long, so the user hears static for exactly 6s), so I know for a fact it is actually downloading something; just nothing is playing.
Below is my source code for the download and play-music functions. I tried searching this topic on SO, but there are very few articles about downloading audio to iOS from Firebase (mostly it seems to be "how to download images", etc.).
Thank you very much in advance!
Sam
//Function to start playing music
func playMusic() {
    do {
        self.audioPlay = try AVAudioPlayer(contentsOf: self.localSongURL)
        self.audioPlay.delegate = self
        self.audioPlay.prepareToPlay()
        self.audioPlay.play()
        self.audioPlay.volume = 1.0 //I tried adjusting volumes to see if it would make a difference
    } catch {
        createAlert(title: "File Not Found", message: "Audio downloaded cannot be interpreted.")
    }
}
//Function to download music
//In the Firebase storage, songs are listed underneath an ID, so the "tillAt"s help to parse through
//the appropriate strings to get the right path for collection
func downloadMusic() {
    let tillAt1 = self.songPath.components(separatedBy: ID + "/")
    let tillAt2 = tillAt1[1].components(separatedBy: ".")
    store = Storage.storage().reference().child(patientID).child(tillAt1[1])
    //From here is where I think the issue is within
    self.localSongURL = try! FileManager.default.url(for: .documentDirectory,
                                                     in: .userDomainMask,
                                                     appropriateFor: nil,
                                                     create: false).appendingPathComponent(tillAt2[0] + ".m4a")
    store.getData(maxSize: 128 * 1024 * 1024, completion: { (data, error) in
        if let error = error {
            print("Error Here _______________ Level 1")
            print(error)
        } else {
            if let d = data {
                do {
                    try d.write(to: self.localSongURL)
                } catch {
                    print("Error Here _______________ Level 2")
                    print(error)
                }
            }
        }
    })
}
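One variant I am experimenting with, purely as a guess on my part, is to start playback only from inside the download completion, so the player is never created before the write to localSongURL has finished:
store.getData(maxSize: 128 * 1024 * 1024) { (data, error) in
    if let error = error {
        print(error)
        return
    }
    guard let d = data else { return }
    do {
        try d.write(to: self.localSongURL)
        // Only start the player once the bytes are actually on disk
        DispatchQueue.main.async {
            self.playMusic()
        }
    } catch {
        print(error)
    }
}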
I decided to use Amazon S3 to upload files; however, I find the AWS docs a bit confusing about S3 capabilities on the iOS platform.
I would like to know how my app would act in the following scenarios:
Scenario 1: During the upload user has accidentally lost internet connection
Scenario 2: App crashes during the upload
I heard that the iOS SDK takes care of such issues itself by resuming the remaining upload when possible, but I failed to find relevant information in the docs.
Will the AWSS3 framework cover both of these scenarios? Does it need any additional lines of code to avoid being vulnerable to potential crashes and network errors?
I've found some relevant information for the Android platform.
I'd love to know what I can expect from the following code:
let image = UIImage(named: "12.jpeg")
let fileManager = FileManager.default
let imageData = UIImageJPEGRepresentation(image!, 0.99)
let path = (NSSearchPathForDirectoriesInDomains(.documentDirectory, .userDomainMask, true)[0] as NSString).appendingPathComponent("\(imageData!).jpeg")
fileManager.createFile(atPath: path as String, contents: imageData, attributes: nil)
let fileUrl = NSURL(fileURLWithPath: path)
let uploadRequest = AWSS3TransferManagerUploadRequest()
uploadRequest?.bucket = "bucketname"
uploadRequest?.key = "folder/12.jpeg"
uploadRequest?.contentType = "image/jpeg"
uploadRequest?.body = fileUrl as URL!
uploadRequest?.serverSideEncryption = AWSS3ServerSideEncryption.awsKms
uploadRequest?.uploadProgress = { (bytesSent, totalBytesSent, totalBytesExpectedToSend) -> Void in
    DispatchQueue.main.async(execute: {
        print("bytes sent \(bytesSent), total bytes sent \(totalBytesSent), of total \(totalBytesExpectedToSend)")
    })
}
let transferManager = AWSS3TransferManager.default()
transferManager?.upload(uploadRequest).continue(with: AWSExecutor.mainThread(), withSuccessBlock: { (taskk: AWSTask) -> Any? in
    if taskk.error != nil {
        // Error.
    } else {
        // Do something with your result.
    }
    return nil
})
Is it already crash/network proof?
EDIT:
This is the part of docs that sounds ambiguous to me:
S3 provides a multipart upload feature that lets you upload a single
object as a set of parts. Each part is a contiguous portion of the
object's data, and the object parts are uploaded independently and in
any order. If transmission of any part fails, you can retransmit that
part without affecting other parts. After all parts of the object are
uploaded, S3 assembles these parts and creates the object.
Does that mean it has its own inherent mechanism to manage that? Let's say I kill the app while it is uploading a file; when I relaunch it and start the upload process over, will it resume from the last chunk where it left off before I killed the app?
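For what it's worth, the part of the SDK I keep seeing pointed at for surviving network drops is AWSS3TransferUtility, which is built on a background URLSession. Below is a rough sketch of how I understand it would be used (not verified by me; the bucket and key are just the values from my snippet above):
import AWSS3

func uploadWithTransferUtility(fileURL: URL) {
    // Progress callback, dispatched back to the main queue for UI updates
    let expression = AWSS3TransferUtilityUploadExpression()
    expression.progressBlock = { _, progress in
        DispatchQueue.main.async {
            print("fraction completed: \(progress.fractionCompleted)")
        }
    }

    let transferUtility = AWSS3TransferUtility.default()
    _ = transferUtility.uploadFile(fileURL,
                                   bucket: "bucketname",
                                   key: "folder/12.jpeg",
                                   contentType: "image/jpeg",
                                   expression: expression) { _, error in
        if let error = error {
            print("upload failed: \(error)")
        } else {
            print("upload finished")
        }
    }
}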
I'm trying to transfer an array of MPMediaItem(s) to another device using MultipeerConnectivity, so I can show a list (table view) of songs and thereby remote-control the peer.
This piece of code encodes my music library items so I can send them to another peer.
func encodeLibrary(lib: [MPMediaItem]) -> NSData {
    print("before encoding: \(lib.count)")
    // --> prints "before encoding: 511"
    let data = NSMutableData()
    let archiver = NSKeyedArchiver.init(forWritingWithMutableData: data)
    archiver.encodeObject(lib, forKey: "data")
    archiver.finishEncoding()

    let unarchiver = NSKeyedUnarchiver.init(forReadingWithData: data)
    let newLib = unarchiver.decodeObjectForKey("data") as! [MPMediaItem]
    print("decoded: \(newLib.count)")
    // --> prints "decoded: 511"

    return data
}
The following code is then executed on another peer:
func decodeLibrary(data: NSData) -> [MPMediaItem] {
    let unarchiver = NSKeyedUnarchiver.init(forReadingWithData: data)
    let lib = unarchiver.decodeObjectForKey("data") as! [MPMediaItem]
    print("items: \(lib.count)")
    // --> prints "items: 0"
    return lib
}
To send the data I use the following call:
try! session.sendData(data, toPeers: [peerID], withMode: .Reliable)
It's not a problem with the encoding/decoding, because it works when I run the decoding on the same device right after the encoding, as you can see; it prints 511 songs before and after.
There has to be a problem during transmission, or something else I can't think of.
When the data arrives on the other device, everything except these MPMediaItems is available.
I do not receive any errors and other parts of the communication are working fine. Just this one array does not seem to be available on other devices. Any idea how to fix this?
Thanks in advance,
KlixxOne
EDIT: Actually the array is there, but it has no content (whereas on the sending device it had 511 entries).
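One workaround I am considering, on the assumption that MPMediaItem objects are backed by the local media library and therefore don't survive being unarchived on another device, is to send only plain metadata dictionaries instead of the items themselves (a sketch, untested):
func encodeLibraryMetadata(lib: [MPMediaItem]) -> NSData {
    // Send only plain values (title, artist, persistent ID) instead of MPMediaItem objects
    let metadata: [[String: AnyObject]] = lib.map { item -> [String: AnyObject] in
        return [
            "title": (item.title ?? "") as NSString,
            "artist": (item.artist ?? "") as NSString,
            "persistentID": NSNumber(unsignedLongLong: item.persistentID)
        ]
    }
    return NSKeyedArchiver.archivedDataWithRootObject(metadata)
}

func decodeLibraryMetadata(data: NSData) -> [[String: AnyObject]] {
    // On the receiving side, rebuild the table view rows from these dictionaries
    return NSKeyedUnarchiver.unarchiveObjectWithData(data) as? [[String: AnyObject]] ?? []
}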