I am developing an application in which audio is recorded and transcribed to text. I am using the SpeechKit provided by Nuance Developers.
The functions I am adding are:
Save the recorded audio file to persistent memory
Display the audio files in a table view
Load the saved audio files later
Play the audio files
How do I save the audio files to persistent storage?
Here's the code: https://gist.github.com/buildFlash/48d143217b721823ff4c3c03a925ba55
When you record audio with AVAudioRecorder, you have to pass the URL of the location where you want the audio stored, so by default the recording is saved at that location.
For example:
import AVFoundation

// Configure the shared audio session for recording and playback (Swift 1.x era API).
var audioSession: AVAudioSession = AVAudioSession.sharedInstance()
audioSession.setCategory(AVAudioSessionCategoryPlayAndRecord, error: nil)
audioSession.setActive(true, error: nil)

// Build a destination URL inside the app's Documents directory.
var documents: AnyObject = NSSearchPathForDirectoriesInDomains(NSSearchPathDirectory.DocumentDirectory, NSSearchPathDomainMask.UserDomainMask, true)[0]
var str = documents.stringByAppendingPathComponent("myRecording1.caf")
var url = NSURL.fileURLWithPath(str as String)

// Recording settings: IMA4 compression, stereo, 44.1 kHz.
var recordSettings = [
    AVFormatIDKey: kAudioFormatAppleIMA4,
    AVSampleRateKey: 44100.0,
    AVNumberOfChannelsKey: 2,
    AVEncoderBitRateKey: 12800,
    AVLinearPCMBitDepthKey: 16,
    AVEncoderAudioQualityKey: AVAudioQuality.Max.rawValue
]

println("url: \(url)")

// audioRecorder is assumed to be a property, so it outlives this scope.
var error: NSError?
audioRecorder = AVAudioRecorder(URL: url, settings: recordSettings, error: &error)
if let e = error {
    println(e.localizedDescription)
} else {
    audioRecorder.record()
}
So, here url is the location where your audio is stored; you can use that same url to play the audio back, and you can read the file at that url (or path) as data if you want to send it to a server.
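For instance, a minimal sketch reusing the url from the example above to load the recording as data for an upload:

let audioData = NSData(contentsOfURL: url) // nil if the file could not be read
// send audioData to your server as needed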
So, if you are using a third-party library, check where it stores the audio; you can get the file from there, or the library should have some method that returns its location.
PS: there is no need to use a third-party library to record audio, because you can easily manage it via AVAudioRecorder and AVAudioPlayer (for playing audio from a URL).
In short, if you are recording audio, you are necessarily storing it at the same time!
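For playback, a minimal sketch reusing the same url (for longer sounds, keep a strong reference, e.g. a property, so the player is not deallocated mid-sound):

var playError: NSError?
let audioPlayer: AVAudioPlayer? = AVAudioPlayer(contentsOfURL: url, error: &playError)
audioPlayer?.prepareToPlay()
audioPlayer?.play()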
You can also refer to Ravi Shankar's tutorial.
Reference: this SO post.
Since iOS 10, Apple has provided support for downloading HLS (m3u8) video for offline viewing.
My question is: is it only possible to download HLS while it is being played, or can we download it when the user presses a download button and show progress?
Has anyone implemented this in the Objective-C version? My previous app is written in Objective-C, and now I want to add support for downloading HLS rather than MP4 (previously I downloaded MP4 for offline viewing).
I would really appreciate any thoughts or code if you have implemented this.
I used the Apple code guide to download HLS content with the following code:
var configuration: URLSessionConfiguration?
var downloadSession: AVAssetDownloadURLSession?
var downloadIdentifier = "\(Bundle.main.bundleIdentifier!).background"

func setupAssetDownload(videoUrl: String) {
    // Create a new background session configuration.
    configuration = URLSessionConfiguration.background(withIdentifier: downloadIdentifier)

    // Create a new AVAssetDownloadURLSession with background configuration, delegate, and queue.
    downloadSession = AVAssetDownloadURLSession(configuration: configuration!,
                                                assetDownloadDelegate: self,
                                                delegateQueue: OperationQueue.main)

    if let url = URL(string: videoUrl) {
        let asset = AVURLAsset(url: url)

        // Create a new AVAssetDownloadTask for the desired asset.
        let downloadTask = downloadSession?.makeAssetDownloadTask(asset: asset,
                                                                  assetTitle: "Some Title",
                                                                  assetArtworkData: nil,
                                                                  options: nil)
        // Start the task and begin the download.
        downloadTask?.resume()
    }
} // end method
func urlSession(_ session: URLSession, assetDownloadTask: AVAssetDownloadTask, didFinishDownloadingTo location: URL) {
    // Do not move the asset from the download location.
    UserDefaults.standard.set(location.relativePath, forKey: "testVideoPath")
}
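The question also asks about showing progress; for that you can implement the AVAssetDownloadDelegate progress callback on the same delegate. A minimal sketch, printing in place of a real progress UI:

func urlSession(_ session: URLSession, assetDownloadTask: AVAssetDownloadTask,
                didLoad timeRange: CMTimeRange, totalTimeRangesLoaded loadedTimeRanges: [NSValue],
                timeRangeExpectedToLoad: CMTimeRange) {
    // Sum the loaded time ranges to compute a completion fraction.
    var percentComplete = 0.0
    for value in loadedTimeRanges {
        let loadedTimeRange = value.timeRangeValue
        percentComplete += loadedTimeRange.duration.seconds / timeRangeExpectedToLoad.duration.seconds
    }
    print("Download progress: \(percentComplete * 100)%")
}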
If you don't understand what's going on, read up about it here:
https://developer.apple.com/library/archive/documentation/AudioVideo/Conceptual/MediaPlaybackGuide/Contents/Resources/en.lproj/HTTPLiveStreaming/HTTPLiveStreaming.html
Now you can use the stored HLS content to play the video in AVPlayer with the following code:
// Get the saved link from the user defaults.
let savedLink = UserDefaults.standard.string(forKey: "testVideoPath")
let baseUrl = URL(fileURLWithPath: NSHomeDirectory())    // app's home directory
let assetUrl = baseUrl.appendingPathComponent(savedLink!) // append the saved link to the home path
Now use the path to play the video in AVPlayer:
let avAsset = AVAsset(url: assetUrl)
let playerItem = AVPlayerItem(asset: avAsset)
let player = AVPlayer(playerItem: playerItem) // video path coming from the function above
let playerViewController = AVPlayerViewController()
playerViewController.player = player
self.present(playerViewController, animated: true, completion: {
    player.play()
})
The only way you can do this is to set up an HTTP server to serve the files locally after you've downloaded them.
The live playlist uses a sliding window: you need to reload it periodically, after the target-duration time, and download only the new segments as they appear in the list (they are removed again later).
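A minimal sketch of that reload loop, assuming a plain m3u8 media playlist; downloadSegment is a hypothetical helper you would implement yourself:

import Foundation

// Track segment URIs we have already fetched across reloads.
var seenSegments = Set<String>()

func pollPlaylist(url: URL, targetDuration: TimeInterval) {
    URLSession.shared.dataTask(with: url) { data, _, _ in
        guard let data = data, let playlist = String(data: data, encoding: .utf8) else { return }
        // In an m3u8 playlist, segment URIs are the non-empty, non-comment lines.
        let segments = playlist.components(separatedBy: .newlines)
            .filter { !$0.isEmpty && !$0.hasPrefix("#") }
        for segment in segments where !seenSegments.contains(segment) {
            seenSegments.insert(segment)
            // downloadSegment(segment) // hypothetical: fetch and store this segment locally
        }
        // Reload after the target duration to pick up newly appended segments.
        DispatchQueue.main.asyncAfter(deadline: .now() + targetDuration) {
            pollPlaylist(url: url, targetDuration: targetDuration)
        }
    }.resume()
}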
Here are some related answers: Can IOS devices stream m3u8 segmented video from the local file system using html5 video and phonegap/cordova?
You can easily download an HLS stream with AVAssetDownloadURLSession's makeAssetDownloadTask. Have a look at the AssetPersistenceManager in Apple's sample code: https://developer.apple.com/library/content/samplecode/HLSCatalog/Introduction/Intro.html
It should be fairly straightforward to use the Objective-C version of the API.
Yes, you can download a video stream served over HLS and watch it later.
There is a very straightforward sample app (HLSCatalog) from Apple on this. The code is fairly simple; you can find it here - https://developer.apple.com/services-account/download?path=/Developer_Tools/FairPlay_Streaming_Server_SDK_v3.1/FairPlay_Streaming_Server_SDK_v3.1.zip
You can find more about offline HLS streaming here.
I've been scratching my head over this for a full day now and don't seem to be getting any closer, so I hope you guys can guide me down the right path :)
Here's the situation.
I have an AVCaptureSession properly initialized, where I add the audio input as follows:
let audioDevice = AVCaptureDevice.defaultDeviceWithMediaType(AVMediaTypeAudio)
let audioIn = try! AVCaptureDeviceInput(device: audioDevice)
if session.canAddInput(audioIn) {
    session.addInput(audioIn)
}
The audio output is added as follows:
self.audioOutput = AVCaptureAudioDataOutput()
if session.canAddOutput(self.audioOutput) {
    session.addOutput(self.audioOutput)
    self.audioConnection = self.audioOutput.connectionWithMediaType(AVMediaTypeAudio)
}
I then set up the recording settings:
if let audioAssetWriterOutput = self.audioOutput.recommendedAudioSettingsForAssetWriterWithOutputFileType(AVFileTypeAppleM4A) {
    return audioAssetWriterOutput as? [String: AnyObject]
}
which I assign to my AVAssetWriterInput that is initialized in audio mode, with those settings and the correct format description.
_audioInput = AVAssetWriterInput(mediaType: AVMediaTypeAudio, outputSettings: audioSettings, sourceFormatHint: audioFormatDescription)
_audioInput!.expectsMediaDataInRealTime = true
Then I simply start the AVCaptureSession via startRunning(), which captures the audio data into an .m4a file.
Everything is fine during the audio capture; here are the observations I made:
The recorded file exists on disk as expected.
The file can be played by any player: my Mac, my iPhone (I imported it via iTunes); all seems good.
The file is at the correct location when I set up my AVAudioPlayer.
I tried initializing the AVAudioPlayer with NSData or NSURL; same result.
Now, later in my code, I try to read that audio file via an AVAudioPlayer:
self.audioPlayer = [[AVAudioPlayer alloc] initWithContentsOfURL:[NSURL fileURLWithPath:pathLink] error:&error];
I get thrown an error:
Error Domain=NSOSStatusErrorDomain Code=1954115647 "(null)"
which, after double checking, is "Unsupported type".
What's incorrect in my setup? Does this come from my AVCaptureSession, or can something be wrong with my AVAudioPlayer setup?
Thanks!
In iOS 8/Xcode 6 I had a function that included a sound effect. It no longer works in iOS 9, even after changing the code multiple times. This is what I've tried:
Original:
let bangSoundEffect = SKAction.playSoundFileNamed("Bang.mp3", waitForCompletion: false)
runAction(bangSoundEffect)
Other attempt:
self.runAction(SKAction.playSoundFileNamed("Bang.mp3", waitForCompletion: false))
Also:
func playRocketExplosionSound(filename: String) {
    let url = NSBundle.mainBundle().URLForResource(filename, withExtension: nil)
    if url == nil {
        print("Could not find file: \(filename)")
        return
    }

    var error: NSError? = nil
    do {
        backgroundMusicPlayer = try AVAudioPlayer(contentsOfURL: url!)
    } catch let error1 as NSError {
        error = error1
        backgroundMusicPlayer = nil
    }

    if backgroundMusicPlayer == nil {
        print("Could not create audio player: \(error!)")
        return
    }

    backgroundMusicPlayer.numberOfLoops = 1
    backgroundMusicPlayer.prepareToPlay()
    backgroundMusicPlayer.play()
}

playRocketExplosionSound("Bang.mp3")
I'm pulling my hair out. I'm using the same code in a different scene for another sound effect and it works fine!! What's going wrong?
I've noticed that the sound effect sometimes begins to play in the simulator, but it doesn't complete and throws this error:
2015-09-24 19:12:14.554 APPNAME[4982:270835] 19:12:14.553 ERROR: 177: timed out after 0.012s (735 736); mMajorChangePending=0
It doesn't work at all on actual devices.
What is the problem? :'(
Possible problem with MP3 file
The problem is most likely connected to the MP3 file you're using. Since the code works for other sounds, this suggests that the MP3 file might be corrupted and AVAudioPlayer fails to decode it. You can try re-encoding the file and see if the problem persists. Or, even better, convert it to WAV.
Using WAVs
A general rule of thumb when creating short sound effects for games is to use WAV unless you really feel you need to trim the fat.
Top-notch games go for top-of-the-line production quality, so they record and produce assets uncompressed at 24-bit/48 kHz. Titles with slightly lesser ambitions might record and produce at 16-bit/44.1 kHz, the official standard for CD-quality audio.
This has at least two benefits: the sound quality is better, and the CPU does not have to decode the file to play it.
Corrupt data file | AVAudioPlayer out of scope
1. Corrupt data file
This will ensure you have found the file:
var backgroundMusicPlayer: AVAudioPlayer? = nil

if let url = Bundle.main.url(forResource: "Bang", withExtension: "mp3") {
    do {
        try backgroundMusicPlayer = AVAudioPlayer(contentsOf: url)
        backgroundMusicPlayer!.play()
    } catch {}
}
2. AVAudioPlayer out of scope
The variable retaining the backgroundMusicPlayer must not go out of scope before playback has completed. This is generally achieved by using a class variable:
var backgroundMusicPlayer: AVAudioPlayer? = nil
Don't do this: the following sound will play for, at best, outOfScopeDelay seconds, due to the local scope of var audioPlayer.
let outOfScopeDelay = 0.5
do {
    var audioPlayer: AVAudioPlayer! // Incorrectly scoped variable
    try audioPlayer = AVAudioPlayer(contentsOf: audioRecorder.url)
    audioPlayer.play()
    Thread.sleep(forTimeInterval: outOfScopeDelay)
} catch {}
► Find this solution on GitHub and additional details on Swift Recipes.
Try this:
dispatch_async(dispatch_get_main_queue(), {
    self.playRocketExplosionSound("Bang.mp3")
})
It's no longer safe to play audio on a background thread under iOS 9.
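For reference, in later Swift versions the same main-queue dispatch reads:

DispatchQueue.main.async {
    self.playRocketExplosionSound("Bang.mp3")
}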
My app has a "Click" sound feature. I used
import AVFoundation
and then the following function to play the "Click" sound:
var audioPlayer = AVAudioPlayer()

func playSound() {
    let soundPath = NSBundle.mainBundle().pathForResource("tick", ofType: "wav")
    let soundURL = NSURL.fileURLWithPath(soundPath!)
    self.audioPlayer = AVAudioPlayer(contentsOfURL: soundURL, error: nil)
    self.audioPlayer.play()
}
Now if the user is running a music player, my app causes the music player to stop. I read about the Audio Session Default Behavior in the documentation, but I don't know how to apply it.
Can you please help?
Thank you!
If you are wondering about the syntax for Swift 2, here it is:
let audioSession = AVAudioSession.sharedInstance()
do {
    try audioSession.setCategory(AVAudioSessionCategoryPlayAndRecord, withOptions: .DuckOthers)
} catch {
    print("AVAudioSession cannot be set: \(error)")
}
Depending on how you want the app to behave, i.e., how your app's sound effect or music should interact with another app's background audio session, you might need to tweak both the audio session category and the category options.
If you just want to play a sound effect, like a "tick" sound, then AVAudioSessionCategoryAmbient and .DuckOthers should be used respectively, for example:
let audioSession = AVAudioSession.sharedInstance()
var error: NSErrorPointer = nil
audioSession.setCategory(AVAudioSessionCategoryAmbient, withOptions: .DuckOthers, error: error)
However, I suppose you are actually trying to play a sound effect; in this case, the AudioServices API is a more suitable choice. You can check func AudioServicesPlaySystemSound(inSystemSoundID: SystemSoundID) in the AudioToolbox framework for more details.
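For example, a minimal sketch of a short effect played through AudioServices (Swift 2 era syntax; assumes a tick.wav bundled with the app):

import AudioToolbox

var tickSound: SystemSoundID = 0
if let url = NSBundle.mainBundle().URLForResource("tick", withExtension: "wav") {
    // Register the sound once, then trigger it; short system sounds mix with other audio.
    AudioServicesCreateSystemSoundID(url, &tickSound)
    AudioServicesPlaySystemSound(tickSound)
}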
Another common scenario: if you want your app to play audio exclusively, even if other apps are playing music in the background, you need to set the category to AVAudioSessionCategorySoloAmbient, for example:
let audioSession = AVAudioSession.sharedInstance()
var error: NSErrorPointer = nil
audioSession.setCategory(AVAudioSessionCategorySoloAmbient, error: error)
I hope you've got what you're looking for.
I am trying to play an audio file I saved on Parse. I am getting the URL from the PFFile of the object I saved to Parse. When I run the app, the AVPlayer produces no audio. I tested whether the AVPlayer was playing with the first code snippet below; it prints out "Playing", which means the player is playing, but there is no audio. I also tried setting the volume on the AVPlayer, and that didn't help. I don't understand why it won't play, so I'd appreciate any help.
Audio File URL: http://files.parsetfss.com/292b6f11-5fee-4be7-b317-16fd494dfa3d/tfss-ccc3a843-967b-4773-b92e-1cf2e8f3c1c6-testfile.wav
This code checks whether the AVPlayer is playing:
if (player.rate > 0) && (player.error == nil) {
    // player is playing
    println("Playing")
} else {
    println("Not Playing")
}
AVPlayer Code:
let objectAudio: PFObject = object as PFObject
let parseAudio: PFFile = objectAudio.valueForKey("audioFileParse") as PFFile
let audioPath: String = parseAudio.url
let urlParse: NSURL = NSURL(fileURLWithPath: audioPath)!
player = AVPlayer(URL: urlParse)
println(player) //prints out <AVPlayer: 0x79e863c0>
player.volume = 1.0
player.play()
You are using the wrong method to create an NSURL here: you try to create a local file URL from a URL that points to a resource on a remote server.
Instead of NSURL(fileURLWithPath: audioPath) you should use the initializer that accepts a URL string as input (see https://developer.apple.com/library/mac/documentation/Cocoa/Reference/Foundation/Classes/NSURL_Class/#//apple_ref/occ/instm/NSURL/initWithString:).
Your current code points to a local resource that does not exist on the local file system, whereas it should point to the file on the Parse server.
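A minimal corrected sketch, reusing the variables from the question's code:

// Use the string initializer for a remote resource.
if let urlParse = NSURL(string: audioPath) {
    player = AVPlayer(URL: urlParse)
    player.volume = 1.0
    player.play()
}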
Just as a reference on the difference between URLWithString and fileURLWithPath: What is difference between URLWithString and fileURLWithPath of NSURL?