I am trying to play a video using MPMoviePlayerController for an iOS app in Swift.
My goal is to play system music with something like Apple Music, then open my app and have its audio mix in, while my app takes control of MPNowPlayingInfoCenter.
How can I use AVAudioSession.sharedInstance().setCategory(AVAudioSessionCategoryPlayback, withOptions: .MixWithOthers) while still setting the MPNowPlayingInfoCenter?
Google Maps manages to mix in audio while still setting MPNowPlayingInfoCenter. Below is how I am trying to set the MPNowPlayingInfoCenter:
func setMeta() {
    UIApplication.sharedApplication().beginReceivingRemoteControlEvents()
    self.becomeFirstResponder()
    if let player = PlayWorkoutViewController.player {
        let coverArt = MPMediaItemArtwork(image: UIImage(named: "AlbumArt")!)
        let dict: [String: AnyObject] = [
            MPMediaItemPropertyArtwork: coverArt,
            MPMediaItemPropertyTitle: workout.title,
            MPMediaItemPropertyArtist: "Alex",
            MPMediaItemPropertyAlbumTitle: workout.program.title,
            MPNowPlayingInfoPropertyPlaybackRate: player.currentPlaybackRate,
            MPNowPlayingInfoPropertyElapsedPlaybackTime: player.currentPlaybackTime,
            MPMediaItemPropertyPlaybackDuration: player.playableDuration
        ]
        MPNowPlayingInfoCenter.defaultCenter().nowPlayingInfo = dict
    }
}
The function above works when I am not playing outside music with the (.MixWithOthers) option at the same time, but while outside music is playing with (.MixWithOthers), the info center does not update.
Edit 1: Just to make things super clear, I already have video playing properly. I am trying to play video alongside other background audio while still being able to set MPNowPlayingInfoCenter.
This isn't currently possible in iOS. Even just changing your category options to .MixWithOthers causes your nowPlayingInfo to be ignored.
My guess is iOS only considers non-mixing apps for inclusion in MPNowPlayingInfoCenter, because there is uncertainty as to which app would show up in (e.g.) Control Center if there are multiple mixing apps playing at the same time.
I'd very much like it if iOS used a best-effort approach for choosing the "now playing app", something like this:
If there's a non-mixing app playing, pick that. Else..
If there's only one mixing app playing, pick that. Else..
If there are multiple mixing apps playing, just pick one :) Or pick none, I'm fine with either.
If you'd like this behavior as well, I'd encourage you to file a bug with Apple.
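For reference, here's a minimal sketch (Swift 2 era, to match the question) of the two session configurations; per the behavior described above, only the non-mixing one lets your nowPlayingInfo through:

import AVFoundation

do {
    // Non-mixing: nowPlayingInfo set afterwards is honored
    try AVAudioSession.sharedInstance().setCategory(AVAudioSessionCategoryPlayback)
    // Mixing: with this option instead, the same nowPlayingInfo is ignored
    // try AVAudioSession.sharedInstance().setCategory(AVAudioSessionCategoryPlayback, withOptions: .MixWithOthers)
    try AVAudioSession.sharedInstance().setActive(true)
} catch {
    print("Audio session error: \(error)")
}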
Have you tried implementing your own custom function to update the MPNowPlayingInfoCenter? Recently I was using an AVAudioPlayer to play music and needed to do the updating manually.
This is basically the function I called upon a new song being loaded.
func updateNowPlayingCenter() {
    let center = MPNowPlayingInfoCenter.defaultCenter()
    if nowPlayingItem == nil {
        center.nowPlayingInfo = nil
    } else {
        var songInfo = [String: AnyObject]()
        // Add each value to the dictionary only if it exists
        if let artist = nowPlayingItem?.artist {
            songInfo[MPMediaItemPropertyArtist] = artist
        }
        if let title = nowPlayingItem?.title {
            songInfo[MPMediaItemPropertyTitle] = title
        }
        if let albumTitle = nowPlayingItem?.albumTitle {
            songInfo[MPMediaItemPropertyAlbumTitle] = albumTitle
        }
        if let playbackDuration = nowPlayingItem?.playbackDuration {
            songInfo[MPMediaItemPropertyPlaybackDuration] = playbackDuration
        }
        if let artwork = nowPlayingItem?.artwork {
            songInfo[MPMediaItemPropertyArtwork] = artwork
        }
        center.nowPlayingInfo = songInfo
    }
}
I am not sure if doing this upon a movie being loaded will override the MPMoviePlayerController, but it seems worth a shot.
Additionally, Apple has deprecated MPMoviePlayerController and replaced it with AVPlayerViewController, so that's also worth looking into.
Edit: Also I would check to make sure that you are properly receiving remote control events, as this impacts the data being displayed by the info center.
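If you want to double-check that, a minimal remote-control setup in a view controller looks something like this (Swift 2 era; a sketch, not a drop-in):

override func viewDidAppear(animated: Bool) {
    super.viewDidAppear(animated)
    // Ask the system to route lock screen / Control Center events to us
    UIApplication.sharedApplication().beginReceivingRemoteControlEvents()
    becomeFirstResponder()
}

override func canBecomeFirstResponder() -> Bool {
    return true
}

override func remoteControlReceivedWithEvent(event: UIEvent?) {
    // React to play/pause etc. from the lock screen or Control Center
    if event?.subtype == .RemoteControlTogglePlayPause {
        // toggle your player here
    }
}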
To play the video in Swift, use this:
func playVideoEffect() {
    let path = NSBundle.mainBundle().pathForResource("egg_grabberAnmi", ofType: "mp4")
    let url = NSURL.fileURLWithPath(path!)
    self.moviePlayer = MPMoviePlayerController(contentURL: url)
    if let player = moviePlayer {
        let screenSize: CGRect = UIScreen.mainScreen().bounds
        player.view.frame = CGRect(x: frame.size.width * 0.10, y: frame.size.width / 2,
                                   width: screenSize.width * 0.80, height: screenSize.width * 0.80)
        player.view.sizeToFit()
        player.scalingMode = MPMovieScalingMode.Fill
        player.fullscreen = true
        player.controlStyle = MPMovieControlStyle.None
        player.movieSourceType = MPMovieSourceType.File
        player.play()
        self.view?.addSubview(player.view)
        _ = NSTimer.scheduledTimerWithTimeInterval(6.0, target: self, selector: Selector("update"), userInfo: nil, repeats: false)
    }
}
// And call the function to play the video
playVideoEffect()
I would like to make a 5-band audio equalizer (60Hz, 230Hz, 910Hz, 4kHz, 14kHz) using AVAudioEngine. I would like to have the user input gain per band through a vertical slider and accordingly adjust the audio that is playing. I tried using AVAudioUnitEQ to do this, but I hear no difference when playing the audio. I tried to hardcode in values to specify a gain at each frequency, but it still does not work. Here is the code I have:
var audioEngine: AVAudioEngine = AVAudioEngine()
var equalizer: AVAudioUnitEQ!
var audioPlayerNode: AVAudioPlayerNode = AVAudioPlayerNode()
var audioFile: AVAudioFile!

// in viewDidLoad():
equalizer = AVAudioUnitEQ(numberOfBands: 5)
audioEngine.attach(audioPlayerNode)
audioEngine.attach(equalizer)
let bands = equalizer.bands
let freqs = [60, 230, 910, 4000, 14000]
audioEngine.connect(audioPlayerNode, to: equalizer, format: nil)
audioEngine.connect(equalizer, to: audioEngine.outputNode, format: nil)

for i in 0...(bands.count - 1) {
    bands[i].frequency = Float(freqs[i])
}

bands[0].gain = -10.0
bands[0].filterType = .lowShelf
bands[1].gain = -10.0
bands[1].filterType = .lowShelf
bands[2].gain = -10.0
bands[2].filterType = .lowShelf
bands[3].gain = 10.0
bands[3].filterType = .highShelf
bands[4].gain = 10.0
bands[4].filterType = .highShelf

do {
    if let filepath = Bundle.main.path(forResource: "song", ofType: "mp3") {
        let filepathURL = NSURL.fileURL(withPath: filepath)
        audioFile = try AVAudioFile(forReading: filepathURL)
        audioEngine.prepare()
        try audioEngine.start()
        audioPlayerNode.scheduleFile(audioFile, at: nil, completionHandler: nil)
        audioPlayerNode.play()
    }
} catch _ {}
Since the low frequencies have a gain of -10 and the high frequencies have a gain of 10, there should be a very noticeable difference when playing any media. However, when the media starts playing, it sounds the same as if played without any equalizer attached.
I'm not sure why this is happening, but I tried several different things to debug. I thought that it might be the order of the functions so I tried switching it so that audioEngine.connect is called after adjusting all of the bands, but that did not make a difference either.
I tried this same code with using an AVAudioUnitTimePitch, and it worked perfectly, so I am dumbfounded as to why it does not work with AVAudioUnitEQ.
I do not want to use any third-party libraries or CocoaPods for this project; I would like to do it using AVFoundation alone.
Any help would be greatly appreciated!
Thanks in advance.
Looking through the documentation for AVAudioUnitEQFilterParameters, I noticed that I had set every parameter except bypass, and it turns out that changing this flag fixed everything!
So I believe the main issue here is that each AVAudioUnitEQ band starts out bypassed, and you must explicitly set bypass = false on each band rather than relying on the system-provided defaults.
So, I changed
for i in 0...(bands.count - 1) {
    bands[i].frequency = Float(freqs[i])
}
to
for i in 0...(bands.count - 1) {
    bands[i].frequency = Float(freqs[i])
    bands[i].bypass = false
    bands[i].filterType = .parametric
}
and everything started working. Furthermore, to make an effective equalizer that allows the user to modify individual frequencies, the filterType for each band should be set to .parametric.
I am still unsure what I should set the bandwidth to, but I can probably check online or just experiment until the sound matches another equalizer application.
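In case it helps anyone wiring this up to sliders, here is a rough sketch of how the per-band configuration and a gain slider could fit together; the 1.0-octave bandwidth, the tag-based wiring, and the dB range are assumptions, not tested values:

// In viewDidLoad(), after connecting the nodes (sketch; values are guesses):
for i in 0..<bands.count {
    bands[i].filterType = .parametric
    bands[i].frequency = Float(freqs[i])
    bands[i].bandwidth = 1.0   // in octaves; a starting point, tune by ear
    bands[i].gain = 0
    bands[i].bypass = false
}

// Hypothetical slider action; sender.tag is assumed to hold the band index 0...4
@objc func sliderChanged(_ sender: UISlider) {
    equalizer.bands[sender.tag].gain = sender.value   // slider range assumed to be -24...24 dB
}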
I'm making an iOS app using Swift 3 in which displaying information about the currently playing item on the lock screen and control center would be nice.
Currently, I'm using the following code to attempt to insert this information into the nowPlayingInfo dictionary. I've also included reference to the VideoInfo class that is used in videoBeganPlaying(_:).
class VideoInfo {
    var channelName: String
    var title: String
    init(channelName: String, title: String) {
        self.channelName = channelName
        self.title = title
    }
}
// ...
var videoInfoNowPlaying: VideoInfo?
// ...
@objc private func videoBeganPlaying(_ notification: NSNotification?) {
    // apparently these have to be here in order for this to work... but it doesn't
    UIApplication.shared.beginReceivingRemoteControlEvents()
    self.becomeFirstResponder()
    guard let info = self.videoInfoNowPlaying else { return }
    let artwork = MPMediaItemArtwork(boundsSize: .zero, requestHandler:
        { (_) -> UIImage in #imageLiteral(resourceName: "DefaultThumbnail") }) // this is filler
    MPNowPlayingInfoCenter.default().nowPlayingInfo = [
        MPMediaItemPropertyTitle: info.title,
        MPMediaItemPropertyArtist: info.channelName,
        MPMediaItemPropertyArtwork: artwork
    ]
    print("Current title: ", MPNowPlayingInfoCenter.default().nowPlayingInfo?[MPMediaItemPropertyTitle])
}
The function is called when it should be, and the print statement executed, outputting Optional("title"). However, the control center and lockscreen do not update their information. Pause/play, and the skip forward button work, as I set them in viewDidLoad() using MPRemoteCommandCenter.
What's going wrong?
EDIT:
As matt pointed out, AVPlayerViewController makes the MPNowPlayingInfoCenter funky. This was my issue. I should have specified that this is the class I am using, not just AVPlayer.
See:
https://developer.apple.com/reference/avkit/avplayerviewcontroller/1845194-updatesnowplayinginfocenter
It does work, and you don't need all that glop about the first responder and so on, as you can readily prove to yourself by just going ahead and setting the now playing info and nothing else.
So why isn't it working for you? Probably because you are using some sort of player (such as AVPlayerViewController) that sets the now playing info itself in some way, and thus overrides your settings.
If you are using AVPlayerViewController, you can specify that the AVPlayerViewController instance doesn't update the NowPlayingInfoCenter:
playerViewController.updatesNowPlayingInfoCenter = false
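A fuller sketch (Swift 3 era; videoURL and the metadata values are placeholders) of setting the flag before presenting and then supplying your own info:

let playerVC = AVPlayerViewController()
playerVC.player = AVPlayer(url: videoURL)       // videoURL is a placeholder
playerVC.updatesNowPlayingInfoCenter = false    // stop it from clobbering your info
present(playerVC, animated: true) {
    playerVC.player?.play()
    MPNowPlayingInfoCenter.default().nowPlayingInfo = [
        MPMediaItemPropertyTitle: "Some Title",
        MPMediaItemPropertyArtist: "Some Artist"
    ]
}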
This caught me out. Found a helpful post regarding this. https://nowplayingapps.com/how-to-give-more-control-to-your-users-using-mpnowplayinginfocenter/
"In order for the show the nowplayinginfo on the notification screen, you need to add this one last piece of code inside your AppDelegate’s didFinishLaunchingWithOptions function."
func application(_ application: UIApplication, didFinishLaunchingWithOptions launchOptions: [UIApplication.LaunchOptionsKey: Any]?) -> Bool {
application.beginReceivingRemoteControlEvents()
}
It's 2021 and I was still experiencing this issue when using AVPlayerViewController. I'm working on an app that plays both podcasts and videos. The now playing info center works correctly with podcasts, but it doesn't show any info for videos played with AVPlayerViewController, even though I'm setting MPNowPlayingInfoCenter.nowPlayingInfo correctly.
I found a solution watching the WWDC session Now Playing and Remote Commands on tvOS, even though I'm not working with tvOS.
Basically, instead of filling the info using MPNowPlayingInfoCenter I set externalMetadata on the AVPlayerItem and it works.
let playerItem = AVPlayerItem(url: data.url)

let title = AVMutableMetadataItem()
title.identifier = .commonIdentifierTitle
title.value = "Title" as NSString
title.extendedLanguageTag = "und"

let artist = AVMutableMetadataItem()
artist.identifier = .commonIdentifierArtist
artist.value = "Artist" as NSString
artist.extendedLanguageTag = "und"

let artwork = AVMutableMetadataItem()
artwork.identifier = .commonIdentifierArtwork
artwork.value = imageData as NSData
artwork.dataType = kCMMetadataBaseDataType_JPEG as String
artwork.extendedLanguageTag = "und"

playerItem.externalMetadata = [title, artist, artwork]
This code updates now playing info correctly.
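If you set more than a couple of fields, the repetition can be factored into a small helper along these lines (the helper name is mine; artwork still needs its dataType set separately):

func makeMetadataItem(_ identifier: AVMetadataIdentifier, value: NSCopying & NSObjectProtocol) -> AVMetadataItem {
    let item = AVMutableMetadataItem()
    item.identifier = identifier
    item.value = value
    item.extendedLanguageTag = "und"
    return item
}

playerItem.externalMetadata = [
    makeMetadataItem(.commonIdentifierTitle, value: "Title" as NSString),
    makeMetadataItem(.commonIdentifierArtist, value: "Artist" as NSString)
]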
I'm making an iOS app in Swift that plays a video in a loop in a small layer in the top right corner of the screen, showing a video of a specific coloured item. The user then taps the corresponding coloured item on the screen. When they do, the videoName variable is randomly changed to the next colour and the corresponding video is triggered. I have no problem raising, playing and looping the video with AVPlayer and AVPlayerItem, as seen in the attached code.
Where I'm stuck is that whenever the next video is shown, previous ones stay open behind it. Also, after 16 videos have played, the player disappears altogether on my iPad. I've tried many suggestions presented on this and other sites, but either Swift finds a problem with them or it just doesn't work.
So, question: within my code here, how do I tell it "hey, the next video has started to play; remove the previous video and its layer and free up the memory so I can play as many videos as needed"?
//set variables for video play
var playerItem: AVPlayerItem?
var player: AVPlayer?

//variables that contain video file path, name and extension
var videoPath = NSBundle.mainBundle().resourcePath!
var videoName = "blue"
let videoExtension = ".mp4"

//DISPLAY VIDEO
func showVideo() {
    //Assign url path
    let url = NSURL(fileURLWithPath: videoPath+"/Base.lproj/"+videoName+videoExtension)
    playerItem = AVPlayerItem(URL: url)
    player = AVPlayer(playerItem: playerItem!)
    let playerLayer = AVPlayerLayer(player: player!)
    //set player location in uiview and show video
    playerLayer.frame = CGRectMake(700, 5, 350, 350)
    self.view.layer.addSublayer(playerLayer)
    player!.play()
    // Add notification to know when the video ends, then replay it again. THIS IS A CONTINUAL LOOP
    NSNotificationCenter.defaultCenter().addObserverForName(AVPlayerItemDidPlayToEndTimeNotification, object: player!.currentItem, queue: nil) { notification in
        let t1 = CMTimeMake(5, 100)
        self.player!.seekToTime(t1)
        self.player!.play()
    }
}
I adapted @Anupam Mishra's Swift code suggestion. It wasn't working at first, but I finally figured out I had to take the playerLayer outside the function and close the playerLayer after setting the player to nil. Also, instead of using 'if(player!.rate > 0 ...)', which would no doubt still work, I created a variable switch to indicate when to say "kill the player AND the layer", as seen below. It may not be pretty, but it works! The following is for absolute newbies like myself.
WHAT I LEARNED FROM THIS EXPERIENCE: it seems to me that an iOS device only allows 16 layers to be added to a viewController (or superLayer), so each layer needs to be deleted before opening the next layer with its player, unless you really want 16 layers running all at once.
WHAT THIS CODE BELOW ACTUALLY DOES FOR YOU: this code creates a re-sizable layer over an existing viewController and plays a video from your bundle in an endless loop. When the next video is about to be called, the current video and the layer are totally removed, freeing up the memory, and then a new layer with the new video is played. The video layer size and location are totally customizable using the playerLayer.frame = CGRectMake(left, top, width, height) parameters.
HOW TO MAKE IT ALL WORK: assuming you've already added your videos to your bundle, create another function for your button tap. In that function, first call the closePlayer() function, change the videoName variable to the new video name you desire, then call the showVideo() function; a sketch of such a tap handler follows the code below. (If you need to change the video extension, change videoExtension from a let to a var.)
//set variables for video play
var playerItem: AVPlayerItem?
var player: AVPlayer?
var playerLayer = AVPlayerLayer() //NEW playerLayer var location

//variables that contain video file path, name and extension
var videoPath = NSBundle.mainBundle().resourcePath!
var videoName = "blue"
let videoExtension = ".mp4"
var createLayerSwitch = true /*NEW switch to say whether or not to create the layer when referenced by the closePlayer and showVideo functions*/

//DISPLAY VIDEO
func showVideo() {
    //Assign url path
    let url = NSURL(fileURLWithPath: videoPath+"/Base.lproj/"+videoName+videoExtension)
    playerItem = AVPlayerItem(URL: url)
    player = AVPlayer(playerItem: playerItem!)
    playerLayer = AVPlayerLayer(player: player!) //NEW: 'let' removed from playerLayer here
    //set player location in uiview and show video
    playerLayer.frame = CGRectMake(700, 5, 350, 350)
    self.view.layer.addSublayer(playerLayer)
    player!.play()
    createLayerSwitch = false //NEW switch to tell if a layer is already created or not. I set the switch to false so that when the next tapped item/button references the closePlayer() function, the condition is triggered to close the player AND the layer
    // Add notification to know when the video ends, then replay it again without a pause between replays. THIS IS A CONTINUAL LOOP
    NSNotificationCenter.defaultCenter().addObserverForName(AVPlayerItemDidPlayToEndTimeNotification, object: player!.currentItem, queue: nil) { notification in
        let t1 = CMTimeMake(5, 100)
        self.player!.seekToTime(t1)
        self.player!.play()
    }
}
//NEW function to kill the current player and layer before playing the next video
func closePlayer() {
    if (createLayerSwitch == false) {
        player!.pause()
        player = nil
        playerLayer.removeFromSuperlayer()
    }
}
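As described above, a sketch of what the button-tap function could look like (the function and video names are just examples):

//Sketch of the tap handler described above; names are examples
func colorItemTapped() {
    closePlayer()       // remove the current player and its layer first
    videoName = "red"   // pick the next video name however your game decides
    showVideo()         // raise a fresh layer and start the new loop
}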
Why not just use replaceCurrentItemWithPlayerItem on your player? You will keep the same player for all your videos. I think it's a better way to do it.
Edit: replaceCurrentItemWithPlayerItem has to be called on the main thread.
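A sketch of what that could look like with the question's code (Swift 2 era):

let nextURL = NSURL(fileURLWithPath: videoPath+"/Base.lproj/"+videoName+videoExtension)
let nextItem = AVPlayerItem(URL: nextURL)
dispatch_async(dispatch_get_main_queue()) {
    self.player?.replaceCurrentItemWithPlayerItem(nextItem)
}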
Before moving on to the next track, first check whether the player is playing any video or music; to stop it, do the following checks:
Swift code:
if player!.rate > 0 && player!.error == nil {
    player!.pause()
    player = nil
}
Objective-C code:
if (player.rate > 0 && !player.error) {
    [player setRate:0.0];
}
I have tried the following:
let nowPlaying = MPNowPlayingInfoCenter.defaultCenter().nowPlayingInfo
However, I get back nil every time I run it with a song playing.
I would like to be able to grab the track title and artist and display it in my app.
You're going about this completely the wrong way. MPNowPlayingInfoCenter has nothing to do with learning what is currently playing. If you want to know what the Music app is currently playing, ask the "iPod music player" (in iOS 8, it is called MPMusicPlayerController.systemMusicPlayer).
Try this if you are writing an iOS app:
let musicPlayer = MPMusicPlayerController.systemMusicPlayer
if let nowPlayingItem = musicPlayer.nowPlayingItem {
    print(nowPlayingItem.title)
} else {
    print("Nothing's playing")
}
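The same item also carries artist and artwork. Note that on iOS 10 and later, reading the music library prompts for permission, so you may also need an NSAppleMusicUsageDescription entry in Info.plist (worth verifying for your deployment target):

if let item = musicPlayer.nowPlayingItem {
    let title = item.title ?? "Unknown title"
    let artist = item.artist ?? "Unknown artist"
    // Artwork can be rendered at whatever size your UI needs
    let artworkImage = item.artwork?.image(at: CGSize(width: 100, height: 100))
    print("\(title) by \(artist), artwork: \(artworkImage != nil)")
}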
This is a modified version of this answer.
Using Swift, you can get the Now Playing info, including title, artist, artwork and app on an iOS device using the following private API:
// Load framework
let bundle = CFBundleCreate(kCFAllocatorDefault, NSURL(fileURLWithPath: "/System/Library/PrivateFrameworks/MediaRemote.framework"))

// Get a Swift function for MRMediaRemoteGetNowPlayingInfo
guard let MRMediaRemoteGetNowPlayingInfoPointer = CFBundleGetFunctionPointerForName(bundle, "MRMediaRemoteGetNowPlayingInfo" as CFString) else { return }
typealias MRMediaRemoteGetNowPlayingInfoFunction = @convention(c) (DispatchQueue, @escaping ([String: Any]) -> Void) -> Void
let MRMediaRemoteGetNowPlayingInfo = unsafeBitCast(MRMediaRemoteGetNowPlayingInfoPointer, to: MRMediaRemoteGetNowPlayingInfoFunction.self)

// Get song info
MRMediaRemoteGetNowPlayingInfo(DispatchQueue.main, { (information) in
    let bundleInfo = Dynamic._MRNowPlayingClientProtobuf.initWithData(information["kMRMediaRemoteNowPlayingInfoClientPropertiesData"])
    print("\(information["kMRMediaRemoteNowPlayingInfoTitle"] as! String) by \(information["kMRMediaRemoteNowPlayingInfoArtist"] as! String) playing on \(bundleInfo.displayName.asString!)")
})
Returns SONG by ARTIST playing on APP.
Note this uses the Dynamic package to easily execute private headers.
This cannot be used in an App Store app due to the use of private API.
This is not an API to get the currently playing item's information from Music or another app, but to tell the system that your app is currently playing something and give it the information needed to display it on the lock screen.
So basically what you're trying to do won't work as you expect it.
Did you set them?
var audioPlayer: MPMoviePlayerController = MPMoviePlayerController()
MPNowPlayingInfoCenter.defaultCenter().nowPlayingInfo = [
    MPMediaItemPropertyAlbumTitle: "Album Title",
    MPMediaItemPropertyTitle: "Title",
    MPNowPlayingInfoPropertyElapsedPlaybackTime: audioPlayer.currentPlaybackTime,
    MPMediaItemPropertyPlaybackDuration: audioPlayer.duration
]
The now playing info center supports the following media item property keys:
MPMediaItemPropertyAlbumTitle
MPMediaItemPropertyAlbumTrackCount
MPMediaItemPropertyAlbumTrackNumber
MPMediaItemPropertyArtist
MPMediaItemPropertyArtwork
MPMediaItemPropertyComposer
MPMediaItemPropertyDiscCount
MPMediaItemPropertyDiscNumber
MPMediaItemPropertyGenre
MPMediaItemPropertyPersistentID
MPMediaItemPropertyPlaybackDuration
MPMediaItemPropertyTitle
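One detail worth knowing (a sketch, Swift 2 era to match the code above): nowPlayingInfo has value semantics, so to change a single key, such as the elapsed time after a seek, read the dictionary back, modify it, and reassign it:

var info = MPNowPlayingInfoCenter.defaultCenter().nowPlayingInfo ?? [String: AnyObject]()
info[MPNowPlayingInfoPropertyElapsedPlaybackTime] = audioPlayer.currentPlaybackTime
MPNowPlayingInfoCenter.defaultCenter().nowPlayingInfo = info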
I am using an AVPlayer object to play a live stream of an MP3, just using an HTTP link.
As with normal live video playing in iOS, there is a button that you can press to skip to live.
Is it possible to do this with AVPlayer?
E.g. I am listening live, pause, then play again after however long. It continues from where I left off. But what if I want to skip to live?
I had the same need and made this extension to AVPlayer:
extension AVPlayer {
    func seekToLive() {
        if let items = currentItem?.seekableTimeRanges, !items.isEmpty {
            let range = items[items.count - 1]
            let timeRange = range.timeRangeValue
            let startSeconds = CMTimeGetSeconds(timeRange.start)
            let durationSeconds = CMTimeGetSeconds(timeRange.duration)
            seek(to: CMTimeMakeWithSeconds(startSeconds + durationSeconds, 1))
        }
    }
}
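Usage is then a one-liner from, say, a hypothetical "go live" button action:

// Hypothetical button action; player is your AVPlayer for the stream
@objc func goLiveTapped() {
    player.seekToLive()
    player.play()
}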