Overriding MPNowPlayingInfoCenter while using WKWebView - ios

I am trying to create a multi-source music player with tracks coming from YouTube and SoundCloud, and I would like to override the content of MPNowPlayingInfoCenter to provide information about the artists/releases instead of the name of the YouTube video.
Everything worked well when I used UIWebView, but for performance reasons I had to switch to the new WKWebView, and now the method I used before to set the nowPlayingInfo has no effect.
Is there a way to disable the automatic mapping of the <audio> and <video> tags inside the HTML and/or to override the info it provides with my own?
Here's the code I use, which works on iOS 7 and worked on iOS 8 when I used UIWebView:
let newInfos = [
    MPMediaItemPropertyTitle: (currentPlaylist[currentPlaylistIndex] as! Track).trackName,
    MPMediaItemPropertyArtist: (currentPlaylist[currentPlaylistIndex] as! Track).trackArtist,
    MPMediaItemPropertyPlaybackDuration: NSNumber(integer: self.getDuration()),
    MPNowPlayingInfoPropertyElapsedPlaybackTime: NSNumber(integer: self.getCurrentTime()),
    MPNowPlayingInfoPropertyPlaybackRate: NSNumber(double: self.playing ? 1.0 : 0.0),
    MPMediaItemPropertyArtwork: MPMediaItemArtwork(image: image)
]
MPNowPlayingInfoCenter.defaultCenter().nowPlayingInfo = newInfos
I checked that none of the variables I use are nil, and I activated my AudioSession in the AppDelegate:
var audioSession = AVAudioSession.sharedInstance()
var error : NSError?
audioSession.setCategory(AVAudioSessionCategoryPlayback, error: &error)
audioSession.setActive(true, error: &error)
UIApplication.sharedApplication().beginReceivingRemoteControlEvents()
Any ideas?

Although it's impossible to override the property itself on iOS, it should be noted that changing the "title" property of the <video> element in JavaScript changes the title the system displays.
I've achieved this behaviour with:
var control = ...; //Create video DOM control
control.title = "Desired title";

Related

MPNowPlayingInfoCenter doesn't update any information after assigning to nowPlayingInfo

I'm making an iOS app using Swift 3 in which displaying information about the currently playing item on the lock screen and control center would be nice.
Currently, I'm using the following code to attempt to insert this information into the nowPlayingInfo dictionary. I've also included a reference to the VideoInfo class that is used in videoBeganPlaying(_:).
class VideoInfo {
    var channelName: String
    var title: String
    init(channelName: String, title: String) {
        self.channelName = channelName
        self.title = title
    }
}
// ...
var videoInfoNowPlaying: VideoInfo?
// ...
@objc private func videoBeganPlaying(_ notification: NSNotification?) {
    // apparently these have to be here in order for this to work... but it doesn't
    UIApplication.shared.beginReceivingRemoteControlEvents()
    self.becomeFirstResponder()
    guard let info = self.videoInfoNowPlaying else { return }
    let artwork = MPMediaItemArtwork(boundsSize: .zero, requestHandler: { _ -> UIImage in
        #imageLiteral(resourceName: "DefaultThumbnail") // this is filler
    })
    MPNowPlayingInfoCenter.default().nowPlayingInfo = [
        MPMediaItemPropertyTitle: info.title,
        MPMediaItemPropertyArtist: info.channelName,
        MPMediaItemPropertyArtwork: artwork
    ]
    print("Current title: ", MPNowPlayingInfoCenter.default().nowPlayingInfo?[MPMediaItemPropertyTitle])
}
The function is called when it should be, and the print statement executes, outputting Optional("title"). However, Control Center and the lock screen do not update their information. Pause/play and the skip-forward button work, as I set them up in viewDidLoad() using MPRemoteCommandCenter.
What's going wrong?
EDIT:
As matt pointed out, AVPlayerViewController makes the MPNowPlayingInfoCenter funky. This was my issue. I should have specified that this is the class I am using, not just AVPlayer.
See:
https://developer.apple.com/reference/avkit/avplayerviewcontroller/1845194-updatesnowplayinginfocenter
It does work, and you don't need all that glop about the first responder and so on, as you can readily prove to yourself by just going ahead and setting the now playing info and nothing else:
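For illustration, a minimal sketch with placeholder values (a reconstruction, not necessarily the answer's original snippet):
import MediaPlayer

// Setting the info and nothing else: no first responder dance, no remote control events.
MPNowPlayingInfoCenter.default().nowPlayingInfo = [
    MPMediaItemPropertyTitle: "Some Title",
    MPMediaItemPropertyArtist: "Some Artist"
]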
So why isn't it working for you? Probably because you are using some sort of player (such as AVPlayerViewController) that sets the now playing info itself in some way, and thus overrides your settings.
If you are using AVPlayerViewController, you can specify that the AVPlayerViewController instance doesn't update the NowPlayingInfoCenter:
playerViewController.updatesNowPlayingInfoCenter = false
This caught me out. Found a helpful post regarding this. https://nowplayingapps.com/how-to-give-more-control-to-your-users-using-mpnowplayinginfocenter/
"In order for the show the nowplayinginfo on the notification screen, you need to add this one last piece of code inside your AppDelegate’s didFinishLaunchingWithOptions function."
func application(_ application: UIApplication, didFinishLaunchingWithOptions launchOptions: [UIApplication.LaunchOptionsKey: Any]?) -> Bool {
    application.beginReceivingRemoteControlEvents()
    return true
}
2021 and I was still experiencing this issue when using AVPlayerViewController. I'm working on an app that plays both podcasts and videos. The now playing info center works correctly with podcasts, but it doesn't show any info for videos played using AVPlayerViewController, even though I'm setting MPNowPlayingInfoCenter.nowPlayingInfo correctly.
Found a solution watching "Now Playing and Remote Commands on tvOS", even though I'm not working with tvOS.
Basically, instead of filling the info using MPNowPlayingInfoCenter, I set externalMetadata on the AVPlayerItem and it works.
let playerItem = AVPlayerItem(url: data.url)
let title = AVMutableMetadataItem()
title.identifier = .commonIdentifierTitle
title.value = "Title" as NSString
title.extendedLanguageTag = "und"
let artist = AVMutableMetadataItem()
artist.identifier = .commonIdentifierArtist
artist.value = "Artist" as NSString
artist.extendedLanguageTag = "und"
let artwork = AVMutableMetadataItem()
artwork.identifier = .commonIdentifierArtwork
artwork.value = imageData as NSData
artwork.dataType = kCMMetadataBaseDataType_JPEG as String
artwork.extendedLanguageTag = "und"
playerItem.externalMetadata = [title, artist, artwork]
This code updates now playing info correctly.

How to save recorded audio iOS?

I am developing an application in which audio is recorded and transcribed to text. I am using the SpeechKit SDK provided by Nuance Developers.
The functions I am adding are:
Save the recorded audio file to persistent memory
Display the audio files in a table view
Load the saved audio files later
Play the audio files
How do I save the audio files to persistent storage?
Here's the code : https://gist.github.com/buildFlash/48d143217b721823ff4c3c03a925ba55
When you record audio with AVAudioRecorder, you have to pass the URL of the location where you want the audio stored, so by default it stores the audio at that location.
for example,
var audioSession: AVAudioSession = AVAudioSession.sharedInstance()
audioSession.setCategory(AVAudioSessionCategoryPlayAndRecord, error: nil)
audioSession.setActive(true, error: nil)

var documents: AnyObject = NSSearchPathForDirectoriesInDomains(NSSearchPathDirectory.DocumentDirectory, NSSearchPathDomainMask.UserDomainMask, true)[0]
var str = documents.stringByAppendingPathComponent("myRecording1.caf")
var url = NSURL.fileURLWithPath(str as String)

var recordSettings = [
    AVFormatIDKey: kAudioFormatAppleIMA4,
    AVSampleRateKey: 44100.0,
    AVNumberOfChannelsKey: 2,
    AVEncoderBitRateKey: 12800,
    AVLinearPCMBitDepthKey: 16,
    AVEncoderAudioQualityKey: AVAudioQuality.Max.rawValue
]
println("url : \(url)")

var error: NSError?
audioRecorder = AVAudioRecorder(URL: url, settings: recordSettings, error: &error)
if let e = error {
    println(e.localizedDescription)
} else {
    audioRecorder.record()
}
So, here url is the location where your audio is stored, and you can use that same url to play the audio. You can also read the file at that url (or path) as data if you want to send it to a server.
So, if you are using a third-party library, check where it stores the audio; you can get the file from there, or the library should have some method that returns its location.
PS: there is no need to use a third-party library to record audio, because you can easily manage it via AVAudioRecorder and AVAudioPlayer (for playing audio from a url).
In short, if you are recording audio, you are already storing it to a file at the same time!
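To play the recording back from that same url, here is a minimal AVAudioPlayer sketch (written in current Swift syntax rather than the Swift 1.x of the snippet above; `url` is assumed to be the file URL that was passed to AVAudioRecorder):
import AVFoundation

var audioPlayer: AVAudioPlayer?  // keep a strong reference so playback isn't deallocated mid-play

func playRecording(at url: URL) {
    do {
        audioPlayer = try AVAudioPlayer(contentsOf: url)
        audioPlayer?.prepareToPlay()
        audioPlayer?.play()
    } catch {
        print("Could not play recording: \(error)")
    }
}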
You can also refer to Ravi Shankar's tutorial.
Reference: this SO post
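For the "display the audio files in a table view" part of the question, one approach is simply to list the .caf files in the Documents directory and use that array as the table's data source. A rough sketch in current Swift (the helper name is illustrative, not from the answer):
import Foundation

// Returns the URLs of all .caf recordings saved in the Documents directory.
func savedRecordingURLs() -> [URL] {
    let documents = FileManager.default.urls(for: .documentDirectory, in: .userDomainMask)[0]
    let contents = (try? FileManager.default.contentsOfDirectory(
        at: documents, includingPropertiesForKeys: nil)) ?? []
    return contents.filter { $0.pathExtension == "caf" }
}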

Set MPNowPlayingInfoCenter with other background audio playing

I am trying to play a video using MPMoviePlayerController for an iOS app in Swift.
My goal is to be able to play system music with something like Apple Music, then open my app and have the audio mix in, but I want my app to be able to take control of MPNowPlayingInfoCenter.
How can I use AVAudioSession.sharedInstance().setCategory(AVAudioSessionCategoryPlayback, withOptions: .MixWithOthers) while still setting the MPNowPlayingInfoCenter?
Google Maps mixes in audio while still setting the MPNowPlayingInfoCenter. Below is how I am trying to set the MPNowPlayingInfoCenter:
func setMeta() {
    UIApplication.sharedApplication().beginReceivingRemoteControlEvents()
    self.becomeFirstResponder()
    if let player = PlayWorkoutViewController.player {
        let coverArt = MPMediaItemArtwork(image: UIImage(named: "AlbumArt")!)
        let dict: [String: AnyObject] = [
            MPMediaItemPropertyArtwork: coverArt,
            MPMediaItemPropertyTitle: workout.title,
            MPMediaItemPropertyArtist: "Alex",
            MPMediaItemPropertyAlbumTitle: workout.program.title,
            MPNowPlayingInfoPropertyPlaybackRate: player.currentPlaybackRate,
            MPNowPlayingInfoPropertyElapsedPlaybackTime: player.currentPlaybackTime,
            MPMediaItemPropertyPlaybackDuration: player.playableDuration
        ]
        MPNowPlayingInfoCenter.defaultCenter().nowPlayingInfo = dict
    }
}
The above function works when I am not playing outside music with the (.MixWithOthers) option at the same time, but while outside music is playing with (.MixWithOthers), the info center does not update.
Edit 1: Just to make things super clear: I already have video playing properly; I am trying to play video alongside other background audio while still being able to set MPNowPlayingInfoCenter.
This isn't currently possible in iOS. Even just changing your category options to .MixWithOthers causes your nowPlayingInfo to be ignored.
My guess is iOS only considers non-mixing apps for inclusion in MPNowPlayingInfoCenter, because there is uncertainty as to which app would show up in (e.g.) Control Center if there are multiple mixing apps playing at the same time.
I'd very much like it if iOS used a best-effort approach for choosing the "now playing app", something like this:
If there's a non-mixing app playing, pick that. Else..
If there's only one mixing app playing, pick that. Else..
If there are multiple mixing apps playing, just pick one :) Or pick none, I'm fine with either.
If you'd like this behavior as well, I'd encourage you to file a bug with Apple.
Have you tried implementing your own custom function to update the MPNowPlayingInfoCenter? Recently I was using an AVAudioPlayer to play music and needed to do the updating manually.
This is basically the function I called upon a new song being loaded.
func updateNowPlayingCenter() {
    let center = MPNowPlayingInfoCenter.defaultCenter()
    if nowPlayingItem == nil {
        center.nowPlayingInfo = nil
    } else {
        var songInfo = [String: AnyObject]()
        // Add item to dictionary if it exists
        if let artist = nowPlayingItem?.artist {
            songInfo[MPMediaItemPropertyArtist] = artist
        }
        if let title = nowPlayingItem?.title {
            songInfo[MPMediaItemPropertyTitle] = title
        }
        if let albumTitle = nowPlayingItem?.albumTitle {
            songInfo[MPMediaItemPropertyAlbumTitle] = albumTitle
        }
        if let playbackDuration = nowPlayingItem?.playbackDuration {
            songInfo[MPMediaItemPropertyPlaybackDuration] = playbackDuration
        }
        if let artwork = nowPlayingItem?.artwork {
            songInfo[MPMediaItemPropertyArtwork] = artwork
        }
        center.nowPlayingInfo = songInfo
    }
}
I am not sure if doing this upon a movie being loaded will override the MPMoviePlayerController, but it seems worth a shot.
Additionally, they have deprecated MPMoviePlayerController and replaced it with AVPlayerViewController, so that's also worth looking into.
Edit: Also, I would check to make sure that you are properly receiving remote control events, as this impacts the data being displayed by the info center.
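To double-check that part, here is a rough sketch of registering remote commands with MPRemoteCommandCenter (current API rather than the era of the code above; the handler bodies are placeholders, not the original poster's code):
import UIKit
import MediaPlayer

func setupRemoteCommands() {
    UIApplication.shared.beginReceivingRemoteControlEvents()
    let commandCenter = MPRemoteCommandCenter.shared()
    commandCenter.playCommand.addTarget { _ in
        // resume playback here
        return .success
    }
    commandCenter.pauseCommand.addTarget { _ in
        // pause playback here
        return .success
    }
}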
To play the video in Swift, use this:
func playVideoEffect() {
    let path = NSBundle.mainBundle().pathForResource("egg_grabberAnmi", ofType: "mp4")
    let url = NSURL.fileURLWithPath(path!)
    self.moviePlayer = MPMoviePlayerController(contentURL: url)
    if let player = moviePlayer {
        let screenSize: CGRect = UIScreen.mainScreen().bounds
        player.view.frame = CGRect(x: frame.size.width * 0.10, y: frame.size.width / 2, width: screenSize.width * 0.80, height: screenSize.width * 0.80)
        player.view.sizeToFit()
        player.scalingMode = MPMovieScalingMode.Fill
        player.fullscreen = true
        player.controlStyle = MPMovieControlStyle.None
        player.movieSourceType = MPMovieSourceType.File
        player.play()
        self.view?.addSubview(player.view)
        var timer = NSTimer.scheduledTimerWithTimeInterval(6.0, target: self, selector: Selector("update"), userInfo: nil, repeats: false)
    }
}
// And use the function to play the video
playVideoEffect()

Stream audio from microphone via bluetooth to another iPhone

I am trying to take incoming microphone audio and stream it to another iPhone, basically a phone call but via Bluetooth. I have the audio coming in via AVAudioRecorder:
func startRecording() {
    audioRecorder = nil
    let audioSession: AVAudioSession = AVAudioSession.sharedInstance()
    audioSession.setCategory(AVAudioSessionCategoryRecord, error: nil)

    var recordSettings: NSMutableDictionary = NSMutableDictionary(capacity: 10)
    recordSettings.setObject(NSNumber(integerLiteral: kAudioFormatLinearPCM), forKey: AVFormatIDKey)
    recordSettings.setObject(NSNumber(float: 44100.0), forKey: AVSampleRateKey)
    recordSettings.setObject(NSNumber(int: 2), forKey: AVNumberOfChannelsKey)
    recordSettings.setObject(NSNumber(int: 16), forKey: AVLinearPCMBitDepthKey)
    recordSettings.setObject(NSNumber(bool: false), forKey: AVLinearPCMIsBigEndianKey)
    recordSettings.setObject(NSNumber(bool: false), forKey: AVLinearPCMIsFloatKey)

    soundPath = documentsDirectory.stringByAppendingPathComponent("record.caf")
    refURL = NSURL(fileURLWithPath: soundPath as String)

    var error: NSError?
    audioRecorder = AVAudioRecorder(URL: refURL, settings: recordSettings as [NSObject: AnyObject], error: &error)
    if audioRecorder.prepareToRecord() == true {
        audioRecorder.meteringEnabled = true
        audioRecorder.record()
    } else {
        println(error?.localizedDescription)
    }
}
Then I tried using StreamReader from HERE - StreamReader from @martin-r
Using:
if let aStreamReader = StreamReader(path: documentsDirectory.stringByAppendingPathComponent("record.caf")) {
    while let line = aStreamReader.nextLine() {
        let dataz = line.dataUsingEncoding(NSUTF8StringEncoding)
        println(line)
    }
}
Then send the data to another device using:
self.appDelegate.mpcDelegate.session.sendData(data: NSData!, toPeers: [AnyObject]!, withMode: MCSessionSendDataMode, error: NSErrorPointer )
I convert line to NSData, then, using a dispatch_after of 0.5 seconds running constantly, I send it to another device via Bluetooth.
It does not seem to work, and I don't think this is a practical way of doing it. I have done numerous searches and haven't seen much on streaming data via Bluetooth. The keyword "streaming" (understandably) sends me to pages about server streaming.
My question is: how can I take audio from a microphone and send it to another iPhone via Bluetooth? I have the Bluetooth part all set up and it works great. My question is very similar to THIS, except with iPhones and Swift: I want to have a phone call via Bluetooth.
Thank you in advance.
To simultaneously record and redirect output you need to use the category AVAudioSessionCategoryMultiRoute.
Here's the link to the categories list:
https://developer.apple.com/library/ios/documentation/AVFoundation/Reference/AVAudioSession_ClassReference/#//apple_ref/doc/constant_group/Audio_Session_Categories
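A minimal sketch of configuring that category (current Swift syntax; not taken from the linked documentation):
import AVFoundation

func configureMultiRouteSession() {
    let session = AVAudioSession.sharedInstance()
    do {
        // Multi-route lets you record from the microphone while routing output elsewhere
        try session.setCategory(.multiRoute, mode: .default, options: [])
        try session.setActive(true)
    } catch {
        print("Audio session error: \(error)")
    }
}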
If all else fails you can use a pre-made solution:
http://audiob.us
It has an API that lets you integrate audio streaming from one app to another:
https://developer.audiob.us/
It supports multiple output endpoints.
Hope this helps.

Programmatically get the name and artist of the currently playing track in Swift

I have tried the following:
let nowPlaying = MPNowPlayingInfoCenter.defaultCenter().nowPlayingInfo
However, I get back nil every time I run it with a song playing.
I would like to be able to grab the track title and artist and display it in my app.
You're going about this completely the wrong way. MPNowPlayingInfoCenter has nothing to do with learning what is currently playing. If you want to know what the Music app is currently playing, ask the "iPod music player" (in iOS 8, it is called MPMusicPlayerController.systemMusicPlayer).
Try this if you are writing an iOS app:
let musicPlayer = MPMusicPlayerController.systemMusicPlayer
if let nowPlayingItem = musicPlayer.nowPlayingItem {
    print(nowPlayingItem.title)
} else {
    print("Nothing's playing")
}
This is a modified version of this answer.
Using Swift, you can get the Now Playing info, including title, artist, artwork and app on an iOS device using the following private API:
// Load framework
let bundle = CFBundleCreate(kCFAllocatorDefault, NSURL(fileURLWithPath: "/System/Library/PrivateFrameworks/MediaRemote.framework"))

// Get a Swift function for MRMediaRemoteGetNowPlayingInfo
guard let MRMediaRemoteGetNowPlayingInfoPointer = CFBundleGetFunctionPointerForName(bundle, "MRMediaRemoteGetNowPlayingInfo" as CFString) else { return }
typealias MRMediaRemoteGetNowPlayingInfoFunction = @convention(c) (DispatchQueue, @escaping ([String: Any]) -> Void) -> Void
let MRMediaRemoteGetNowPlayingInfo = unsafeBitCast(MRMediaRemoteGetNowPlayingInfoPointer, to: MRMediaRemoteGetNowPlayingInfoFunction.self)

// Get song info
MRMediaRemoteGetNowPlayingInfo(DispatchQueue.main, { (information) in
    let bundleInfo = Dynamic._MRNowPlayingClientProtobuf.initWithData(information["kMRMediaRemoteNowPlayingInfoClientPropertiesData"])
    print("\(information["kMRMediaRemoteNowPlayingInfoTitle"] as! String) by \(information["kMRMediaRemoteNowPlayingInfoArtist"] as! String) playing on \(bundleInfo.displayName.asString!)")
})
Returns SONG by ARTIST playing on APP.
Note this uses the Dynamic package to easily execute private headers.
This cannot be used in an App Store app due to the use of private API.
This is not an API for getting the currently playing item's information from Music or another app; it is for telling the system that your app is currently playing something, and for giving it the information needed to display that on the lock screen.
So basically, what you're trying to do won't work as you expect.
Did you set them?
var audioPlayer: MPMoviePlayerController = MPMoviePlayerController()
MPNowPlayingInfoCenter.defaultCenter().nowPlayingInfo = [
    MPMediaItemPropertyAlbumTitle: "Album Title",
    MPMediaItemPropertyTitle: "Title",
    MPNowPlayingInfoPropertyElapsedPlaybackTime: audioPlayer.currentPlaybackTime,
    MPMediaItemPropertyPlaybackDuration: audioPlayer.duration
]
The now playing info center supports the following media item property keys:
MPMediaItemPropertyAlbumTitle
MPMediaItemPropertyAlbumTrackCount
MPMediaItemPropertyAlbumTrackNumber
MPMediaItemPropertyArtist
MPMediaItemPropertyArtwork
MPMediaItemPropertyComposer
MPMediaItemPropertyDiscCount
MPMediaItemPropertyDiscNumber
MPMediaItemPropertyGenre
MPMediaItemPropertyPersistentID
MPMediaItemPropertyPlaybackDuration
MPMediaItemPropertyTitle
