I'm thinking about building an iOS alarm app. When the alarm fires, I want to play custom music from Apple Music or another source.
Unfortunately, background operations on iOS are really restrictive, and notifications only allow bundled sound files to be played. Is there any way to achieve this with background tasks or something else?
There is no real choice other than notifications; see also: Background Execution.
The approach is to change the notification's sound.
1. Import the audio files.
Query the iPod music library:
import MediaPlayer

let mediaQuery = MPMediaQuery.songs()
if let items = mediaQuery.items {
    for item in items {
        let title = item.title
        // assetURL is nil for Apple Music / iCloud items (see the note below)
        if let url = item.assetURL {
            saveNotificationSound(url, name: title, isLast: item == items.last)
        }
    }
}
The important property is assetURL; it gives you access to the audio file. NOTE: if the item was downloaded from Apple Music or lives in iCloud, its assetURL is nil.
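If you want to skip items that cannot be exported up front, MPMediaItem exposes isCloudItem and hasProtectedAsset for exactly this check; a small sketch:

let exportableItems = (MPMediaQuery.songs().items ?? []).filter { item in
    // iCloud items and DRM-protected assets have no usable assetURL
    !item.isCloudItem && !item.hasProtectedAsset && item.assetURL != nil
}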
Alternatively, use file sharing (enabled by adding the UIFileSharingEnabled key, "Application supports iTunes file sharing", to your Info.plist):
How to enable file sharing for my App
2. Trim the audio to 30 seconds and a notification-compatible format.
Notification sounds are limited: 1. the duration must not exceed 30 seconds; 2. only certain formats are supported, so we trim the audio and convert it to m4a.
/**
 Trims the audio to the given range, converts it to m4a, then saves it.
 - parameter audioPath:  source file URL
 - parameter startTime:  trim start, in seconds
 - parameter endTime:    trim end, in seconds
 - parameter saveDirect: destination file URL
 - parameter handler:    completion callback; succeed is true when the export completed
 */
func cutoffAudio(audioPath: URL, startTime: Int64, endTime: Int64, saveDirect: URL, handler: @escaping (_ succeed: Bool) -> Void) {
    let audioAsset = AVURLAsset(url: audioPath, options: nil)
    if let exportSession = AVAssetExportSession(asset: audioAsset, presetName: AVAssetExportPresetAppleM4A) {
        let start = CMTime(value: startTime, timescale: 1)
        let stop = CMTime(value: endTime, timescale: 1)
        exportSession.outputURL = saveDirect
        // Output is m4a
        exportSession.outputFileType = .m4a
        exportSession.timeRange = CMTimeRange(start: start, end: stop)
        exportSession.exportAsynchronously {
            handler(exportSession.status == .completed)
        }
    }
}
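A hypothetical call, trimming the first 30 seconds of a library item straight into Library/Sounds (the directory step 3 requires; "song" stands in for an MPMediaItem obtained in step 1):

let library = FileManager.default.urls(for: .libraryDirectory, in: .userDomainMask)[0]
let soundsDir = library.appendingPathComponent("Sounds")
try? FileManager.default.createDirectory(at: soundsDir, withIntermediateDirectories: true)
// "song" is a hypothetical MPMediaItem from the query in step 1
cutoffAudio(audioPath: song.assetURL!, startTime: 0, endTime: 30,
            saveDirect: soundsDir.appendingPathComponent("alarm.m4a")) { succeed in
    print(succeed ? "Notification sound saved" : "Export failed")
}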
3. Set it as the notification sound.
Attention: the custom audio files can only be placed in /Library/Sounds inside the app's sandbox. The sound name then only needs to be the file name (including the extension); files in that directory are treated just like files in the main bundle.
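A minimal sketch of scheduling a local notification with that sound, assuming the alarm.m4a produced in step 2 is already in Library/Sounds (the title and the 60-second trigger are placeholders):

import UserNotifications

let content = UNMutableNotificationContent()
content.title = "Alarm"
// Just the file name; UserNotifications looks in Library/Sounds before the main bundle
content.sound = UNNotificationSound(named: UNNotificationSoundName("alarm.m4a"))
let trigger = UNTimeIntervalNotificationTrigger(timeInterval: 60, repeats: false)
let request = UNNotificationRequest(identifier: "alarm", content: content, trigger: trigger)
UNUserNotificationCenter.current().add(request)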
Here is a demo from github.com/ToFind1991. The demo's code is not compatible with the current Swift version, so you will need to adjust it.
Related
I'm trying to display my audio file's duration in my app, but my code always returns 0. I'm hoping there is a method that notifies me when the AVPlayer has loaded data from the file, so I can then fetch the duration. Any suggestions?
func loadAudioUrl() {
    guard let url = URL(string: sampleShortAudioUrl) else { return }
    audioPlayer = AVPlayer(url: url)
    audioPlayer?.play()
    // This runs before the item has loaded, so the duration is not yet known
    if let duration = audioPlayer?.currentItem?.duration {
        print(duration)
    }
}
You can get the duration, but you need to wait for the content to load; your code assumes it is available instantly.
Use an AVPlayerItem with your AVPlayer. Once the item's status is .readyToPlay, you can ask for the duration. A complete code example is in Apple's documentation:
AVPlayerItem
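A minimal sketch of that pattern using block-based key-value observation (the URL is a placeholder):

let item = AVPlayerItem(url: URL(string: "https://example.com/audio.mp3")!)
let player = AVPlayer(playerItem: item)

// Keep a strong reference to the observation, or it is deallocated immediately
var statusObservation: NSKeyValueObservation?
statusObservation = item.observe(\.status, options: [.new]) { item, _ in
    guard item.status == .readyToPlay else { return }
    print("Duration: \(CMTimeGetSeconds(item.duration)) seconds")
}
player.play()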
You can't get the duration from a remote audio URL; you must store its duration remotely in your database and grab it while listening or downloading.
I'm streaming live and on-demand audio (m3u8 files) from my iOS sender app to a Chromecast. While doing so, the receiver's screen stays black and shows no information about the currently streamed audio; the information only appears while the stream is paused. Is there any way to show the audio information during playback? When mp3 files are streamed to the Chromecast, the audio information is shown in both the playing and paused states.
I'm using the Cast 4.0.1 SDK; this is my media setup:
// setup metadata with playback information
let metadata = GCKMediaMetadata(metadataType: GCKMediaMetadataType.musicTrack)
metadata.setString(playback.title, forKey: kGCKMetadataKeyTitle)
metadata.setString(playback.subtitle, forKey: kGCKMetadataKeyArtist)
metadata.addImage(GCKImage(url: playback.imageSmall, width: 1024, height: 1024))
let streamType: GCKMediaStreamType = .buffered
let contentType: String = "application/vnd.apple.mpegurl"
let duration: Double = playback.duration
let mediaInfo = GCKMediaInformation(contentID: streamURL.absoluteString, streamType: streamType, contentType: contentType, metadata: metadata, streamDuration: duration, mediaTracks: nil, textTrackStyle: nil, customData: nil)
// seek to start position
let mediaLoadOptions = GCKMediaLoadOptions()
mediaLoadOptions.playPosition = playPosition
// load media and start playback
let request = session.remoteMediaClient?.loadMedia(mediaInfo, with: mediaLoadOptions)
request?.delegate = self
You should use the Chrome Remote Debugger to check for errors or debug information that explains the issue you are seeing.
I'm not sure about this, but you need a custom receiver in order to use remote debugging, and you may not have added the code to your custom receiver to display what you want.
Try switching to the default receiver. If that displays what you want, add more display code to your custom receiver; if it doesn't, try adding more metadata to your mediaInfo.
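A small sketch of what "more metadata" could look like, using standard GCKMediaMetadata keys (the values are placeholders):

// Extra fields some receiver UIs use for the "now playing" screen
metadata.setString("Album Name", forKey: kGCKMetadataKeyAlbumTitle)
metadata.setString("Album Artist", forKey: kGCKMetadataKeyAlbumArtist)
metadata.setInteger(1, forKey: kGCKMetadataKeyTrackNumber)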
I am having a really difficult time playing audio in the background of my app. The app is a countdown timer that plays bells, and everything originally worked using the timer. Since you cannot run a timer for more than about 3 minutes in the background, I need to play the bells another way.
The user can choose bells and set the times at which they play (e.g. play a bell immediately, after 5 minutes, repeat another bell every 10 minutes, etc.).
So far I have tried scheduling the sounds via DispatchQueue.main, and this works fine if the user does not pause the timer. If they re-enter the app and pause, though, I cannot seem to cancel or pause this queue in any way.
Next I tried AVAudioEngine and created a set of nodes. These play while the app is in the foreground but seem to stop on backgrounding. Additionally, when I pause the engine and resume later, it doesn't pause the sequence properly: it squashes the bells into playing one after the other, or not at all.
If anyone has ideas on how to solve this, that would be great. Technically I could remove everything from the engine and recreate it from the paused time when the user pauses/resumes, but that seems quite costly, and it doesn't solve the audio stopping in the background. I have the required background mode 'App plays audio or streams audio/video using AirPlay', and it is also checked under Background Modes in Capabilities.
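One assumption worth checking: background audio also needs an active AVAudioSession with the .playback category, on top of the background mode. A minimal sketch of that setup:

import AVFoundation

func configureAudioSession() {
    let session = AVAudioSession.sharedInstance()
    do {
        // .playback keeps the engine running when the app is backgrounded
        try session.setCategory(.playback, mode: .default)
        try session.setActive(true)
    } catch {
        print("Audio session setup failed: \(error)")
    }
}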
Below is a sample of how I tried to set up the audio engine. The registerAndPlaySound method is called several more times to build the chain of nodes (or is this done incorrectly?). The code is a bit messy at the moment because I have been trying many approaches to get this working.
func setupSounds() {
    if attached {
        engine.detach(player)
    }
    engine.attach(player)
    attached = true
    let mixer = engine.mainMixerNode
    engine.connect(player, to: mixer, format: mixer.outputFormat(forBus: 0))
    do {
        try engine.start()
    } catch {
        return
    }
    if let bell = currentSession.bellObject?.startBell {
        guard let url = Bundle.main.url(forResource: bell, withExtension: "mp3") else {
            return
        }
        registerAndPlaySound(url: url, delay: warmUpTime)
    }
}
func registerAndPlaySound(url: URL, delay: Double) {
    do {
        let file = try AVAudioFile(forReading: url)
        let format = file.processingFormat
        let capacity = file.length
        // AVAudioPCMBuffer's initializer is failable in current Swift
        guard let buffer = AVAudioPCMBuffer(pcmFormat: format, frameCapacity: AVAudioFrameCount(capacity)) else {
            return
        }
        do {
            try file.read(into: buffer)
        } catch {
            return
        }
        // Convert the delay in seconds into a sample-time position
        let sampleRate = buffer.format.sampleRate
        let sampleTime = sampleRate * delay
        let futureTime = AVAudioTime(sampleTime: AVAudioFramePosition(sampleTime), atRate: sampleRate)
        player.scheduleBuffer(buffer, at: futureTime, options: [], completionHandler: nil)
        player.play()
    } catch {
        return
    }
}
I need to create an audio player for a streamed URL (m3u8 format). I have built the music player using AVPlayer, but I need to show a visualizer for the streamed song. I have tried different solutions but have not found any working example.
I have created a visualizer using AVAudioPlayer (averagePower), but it doesn't support streamed URLs.
Any help showing a visualizer for AVPlayer? Thanks in advance.
I have also tried MYAudioTapProcessor, which most people suggested, but for a streamed URL the tracks array always comes back empty.
I added MYAudioTapProcessor.h and MYAudioTapProcessor.m to the project:
//Initialization of player
let playerItem = AVPlayerItem(url: URL(string: "https://bitdash-a.akamaihd.net/content/sintel/hls/playlist.m3u8")!)
let audioPlayer = AVPlayer(playerItem: playerItem)
//Added periodic time observer
audioPlayer.addPeriodicTimeObserver(forInterval: CMTime(seconds: 1, preferredTimescale: 1), queue: DispatchQueue.main) { _ in
    if audioPlayer.currentItem?.status == .readyToPlay {
        if let playerItem = audioPlayer.currentItem {
            print(playerItem.asset.tracks.count)
            // tracks is a non-optional array; check for contents instead of nil
            if !playerItem.asset.tracks.isEmpty {
                self.tapProcessor = MYAudioTapProcessor(avPlayerItem: playerItem)
                playerItem.audioMix = self.tapProcessor.audioMix
                self.tapProcessor.delegate = self
            }
        }
    }
}
//Delegate callback method for MYAudioTapProcessor
func audioTabProcessor(_ audioTabProcessor: MYAudioTapProcessor!, hasNewLeftChannelValue leftChannelValue: Float, rightChannelValue: Float) {
    print("volume: \(leftChannelValue) : \(rightChannelValue)")
    volumeSlider.value = leftChannelValue
}
I also tried adding a "tracks" observer:
playerItem.addObserver(self, forKeyPath: "tracks", options: NSKeyValueObservingOptions.new, context: nil);
Now, if I play an mp3 file, the callback method fires, but for m3u8 it never does. The m3u8 URL fails because the tracks array count is always zero, whereas for mp3 files the tracks array contains one item.
You cannot get tracks for HLS via AVPlayer. Use progressive download or a local file if you need the audio tracks while playing media.
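If a local file is an option, one alternative to MYAudioTapProcessor is AVAudioEngine's installTap on the main mixer node, computing a rough level per buffer; a minimal sketch (the file path is a placeholder):

import AVFoundation

let engine = AVAudioEngine()
let player = AVAudioPlayerNode()
engine.attach(player)
engine.connect(player, to: engine.mainMixerNode, format: nil)

// Tap the mixer output and compute a rough RMS level per buffer
engine.mainMixerNode.installTap(onBus: 0, bufferSize: 1024, format: nil) { buffer, _ in
    guard let samples = buffer.floatChannelData?[0] else { return }
    let count = Int(buffer.frameLength)
    var sum: Float = 0
    for i in 0..<count { sum += samples[i] * samples[i] }
    let rms = sqrt(sum / Float(max(count, 1)))
    print("level: \(rms)")  // feed this to the visualizer
}

do {
    let file = try AVAudioFile(forReading: URL(fileURLWithPath: "/path/to/local.m4a"))  // placeholder path
    try engine.start()
    player.scheduleFile(file, at: nil, completionHandler: nil)
    player.play()
} catch {
    print("Playback setup failed: \(error)")
}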
Since iOS 10, Apple has provided support for downloading HLS (m3u8) video for offline viewing.
My question is: can HLS only be downloaded while it is being played, or can the download start when the user presses a download button, with progress shown?
Has anyone implemented this in Objective-C? My previous app is written in Objective-C, and I now want to add support for downloading HLS rather than MP4 (previously I downloaded MP4 for offline viewing).
I am really stuck on this. Please share thoughts, or any code if you have implemented it.
I used the Apple code guide to download HLS content with the following code:
var configuration: URLSessionConfiguration?
var downloadSession: AVAssetDownloadURLSession?
var downloadIdentifier = "\(Bundle.main.bundleIdentifier!).background"

func setupAssetDownload(videoUrl: String) {
    // Create new background session configuration.
    configuration = URLSessionConfiguration.background(withIdentifier: downloadIdentifier)
    // Create a new AVAssetDownloadURLSession with background configuration, delegate, and queue
    downloadSession = AVAssetDownloadURLSession(configuration: configuration!,
                                                assetDownloadDelegate: self,
                                                delegateQueue: OperationQueue.main)
    if let url = URL(string: videoUrl) {
        let asset = AVURLAsset(url: url)
        // Create new AVAssetDownloadTask for the desired asset
        let downloadTask = downloadSession?.makeAssetDownloadTask(asset: asset,
                                                                  assetTitle: "Some Title",
                                                                  assetArtworkData: nil,
                                                                  options: nil)
        // Start task and begin download
        downloadTask?.resume()
    }
} //end method

func urlSession(_ session: URLSession, assetDownloadTask: AVAssetDownloadTask, didFinishDownloadingTo location: URL) {
    // Do not move the asset from the download location
    UserDefaults.standard.set(location.relativePath, forKey: "testVideoPath")
}
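To show download progress (part of the original question), the same AVAssetDownloadDelegate can implement the loading callback; a small sketch, with the print standing in for whatever UI update you need:

func urlSession(_ session: URLSession, assetDownloadTask: AVAssetDownloadTask,
                didLoad timeRange: CMTimeRange, totalTimeRangesLoaded loadedTimeRanges: [NSValue],
                timeRangeExpectedToLoad: CMTimeRange) {
    // Sum the loaded ranges and divide by the expected duration
    var percentComplete = 0.0
    for value in loadedTimeRanges {
        let loadedTimeRange = value.timeRangeValue
        percentComplete += loadedTimeRange.duration.seconds / timeRangeExpectedToLoad.duration.seconds
    }
    print("Downloaded \(Int(percentComplete * 100))%")  // update your progress UI here
}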
If you don't understand what's going on, read up on it here:
https://developer.apple.com/library/archive/documentation/AudioVideo/Conceptual/MediaPlaybackGuide/Contents/Resources/en.lproj/HTTPLiveStreaming/HTTPLiveStreaming.html
Now you can use the stored HLS content to play the video in AVPlayer with the following code:
//get the saved link from the user defaults
let savedLink = UserDefaults.standard.string(forKey: "testVideoPath")
//the sandbox path changes between launches, so rebuild it from the home directory
let baseUrl = URL(fileURLWithPath: NSHomeDirectory()) //app's home directory
let assetUrl = baseUrl.appendingPathComponent(savedLink!) //append the saved relative path
Now use the path to play the video in AVPlayer:
let avAsset = AVAsset(url: assetUrl)
let playerItem = AVPlayerItem(asset: avAsset)
let player = AVPlayer(playerItem: playerItem) // video path coming from the function above
let playerViewController = AVPlayerViewController()
playerViewController.player = player
self.present(playerViewController, animated: true, completion: {
    player.play()
})
The only way you can do this is to set up an HTTP server to serve the files locally after you've downloaded them.
A live playlist uses a sliding window. You need to reload it periodically, after each target-duration interval, and download only the new segments as they appear in the list (older ones are removed over time); a rough sketch of that polling loop follows the link below.
Here are some related answers: Can IOS devices stream m3u8 segmented video from the local file system using html5 video and phonegap/cordova?
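A minimal sketch of that reload loop, assuming a plain media playlist URL (the URL and 6-second interval are placeholders, and the parsing is deliberately naive, only collecting segment URIs):

import Foundation

let playlistURL = URL(string: "https://example.com/live/playlist.m3u8")!  // placeholder
var knownSegments = Set<String>()

func pollPlaylist() {
    URLSession.shared.dataTask(with: playlistURL) { data, _, _ in
        guard let data = data, let text = String(data: data, encoding: .utf8) else { return }
        // Non-comment lines in a media playlist are segment URIs
        let segments = text.split(separator: "\n").map(String.init).filter { !$0.hasPrefix("#") }
        for segment in segments where !knownSegments.contains(segment) {
            knownSegments.insert(segment)
            print("new segment: \(segment)")  // download it here
        }
    }.resume()
}

// Re-poll roughly once per target duration (placeholder: 6 seconds)
Timer.scheduledTimer(withTimeInterval: 6, repeats: true) { _ in pollPlaylist() }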
You can easily download an HLS stream with AVAssetDownloadURLSession's makeAssetDownloadTask. Have a look at the AssetPersistenceManager in Apple's sample code: https://developer.apple.com/library/content/samplecode/HLSCatalog/Introduction/Intro.html
It should be fairly straightforward to use the Objective-C version of the API.
Yes, you can download a video stream served over HLS and watch it later.
There is a very straightforward sample app (HLSCatalog) from Apple on this. The code is fairly simple; you can find it here: https://developer.apple.com/services-account/download?path=/Developer_Tools/FairPlay_Streaming_Server_SDK_v3.1/FairPlay_Streaming_Server_SDK_v3.1.zip
You can find more about offline HLS streaming here.