How to recognize a tap gesture while using an MPMoviePlayerController - ios

I am attempting to recognize a tap gesture while a video is playing so that I can dismiss it, similar to how Snapchat does it. However, the compiler says that MPMoviePlayerController has no member for adding touch gestures. Is this true, or am I using the incorrect method?
var MP4: NSData?
var MarkerLong: CLLocationDegrees?
var MarkerLat: CLLocationDegrees?
var Url: String?
var videoPlayer: MPMoviePlayerController!
private var firstAppear = true

override func viewDidAppear(animated: Bool) {
    super.viewDidAppear(animated)
    if firstAppear {
        do {
            try playVideo()
            firstAppear = false
        } catch AppError.InvalidResource(let name, let type) {
            debugPrint("Could not find resource \(name).\(type)")
        } catch {
            debugPrint("Generic error")
        }
    }
}

private func playVideo() throws {
    self.videoPlayer = MPMoviePlayerController()
    self.videoPlayer.repeatMode = MPMovieRepeatMode.None
    self.videoPlayer.contentURL = NSURL(string: Url!)
    self.videoPlayer.controlStyle = MPMovieControlStyle.None
    self.view.addSubview(self.videoPlayer.view)
    NSNotificationCenter.defaultCenter().addObserver(self, selector: #selector(PlayVideoViewController.videoPlayBackDidFinish(_:)), name: MPMoviePlayerPlaybackDidFinishNotification, object: self.videoPlayer)
    self.videoPlayer.view.frame.size = CGSizeMake(640, 1136)
    self.videoPlayer.view.center = self.view.center
    self.videoPlayer.play()
    let gesture = UITapGestureRecognizer(target: self, action: "someAction:")
    self.videoPlayer.addGestureRecognizer(gesture)
}

I would recommend using AVPlayerViewController, but make sure not to subclass it, as Apple says you shouldn't.
1) MPMoviePlayerController is deprecated (don't use this code anymore).
2) AVPlayerViewController gives you a much richer API and allows more customization.
If you really want to customize things further, you can drive your own view with an AVPlayer (via an AVPlayerLayer) and play the video there, but then you will have to add your own play/pause controls, etc.
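For example, here is a rough sketch of the AVPlayerViewController route with a tap-to-dismiss gesture, written against the question's Url property (Swift 2 syntax to match the question; presentVideo and dismissVideo are made-up method names on the presenting view controller, not part of any API):
import AVKit
import AVFoundation

func presentVideo() {
    guard let urlString = Url else { return }
    guard let url = NSURL(string: urlString) else { return }
    let playerController = AVPlayerViewController()
    playerController.player = AVPlayer(URL: url)
    playerController.showsPlaybackControls = false
    // Tap anywhere on the player's view to dismiss it, Snapchat-style.
    let tap = UITapGestureRecognizer(target: self, action: #selector(dismissVideo))
    playerController.view.addGestureRecognizer(tap)
    presentViewController(playerController, animated: true) {
        playerController.player?.play()
    }
}

func dismissVideo() {
    dismissViewControllerAnimated(true, completion: nil)
}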

MPMoviePlayerController is a controller object, not a view, and gesture recognizers can only be added to views. You need to add the gesture to the MPMoviePlayerController's view.
(Even better, stop using MPMoviePlayerController; it is deprecated.)
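If you do stay on MPMoviePlayerController for now, the fix against the question's own code is just the following (a sketch; the someAction: selector is assumed to exist on the view controller):
let gesture = UITapGestureRecognizer(target: self, action: "someAction:")
self.videoPlayer.view.addGestureRecognizer(gesture)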

Related

Sliding across piano keys (swift/iOS)

I'm working on a piano app that has a series of buttons, each with a different mp3. The screen shows 12 buttons (piano keys), and I want the user to be able to play an individual sound or swipe across several to hear multiple notes, just like a real piano. I've seen many apps do this, but mine has a problem when the user slides across multiple buttons quickly: at the same speed, other apps will play all the notes, but mine skips a few. Thank you for any help! This will make all the difference in my app!
A couple of quick notes about this code:
- I just have the bare bones here to save space.
- I only show 6 audio players, but you get the idea.
- Likewise, locationInNote1 through locationInNote6 only cover 6 notes here to save space, but you get the idea.
- "note1" in the button action is a string that can change when the user selects a different octave to play from, but it's just a number, so the audio files are ultimately 1.mp3, 2.mp3, etc.
- The button action playNote1 is the same as the other button actions, so I didn't repeat them all here.
var audioPlayer = AVAudioPlayer()
var audioPlayer2 = AVAudioPlayer()
var audioPlayer3 = AVAudioPlayer()
var audioPlayer4 = AVAudioPlayer()
var audioPlayer5 = AVAudioPlayer()
var audioPlayer6 = AVAudioPlayer()

func playNote(for locationInView: CGPoint) {
    let locationInNote1 = note1Button.convert(locationInView, from: view)
    let locationInNote2 = note2Button.convert(locationInView, from: view)
    let locationInNote3 = note3Button.convert(locationInView, from: view)
    let locationInNote4 = note4Button.convert(locationInView, from: view)
    let locationInNote5 = note5Button.convert(locationInView, from: view)
    let locationInNote6 = note6Button.convert(locationInView, from: view)
    if note1Button.point(inside: locationInNote1, with: nil) {
        playNote1(self)
    }
    if note2Button.point(inside: locationInNote2, with: nil) {
        playNote2(self)
    }
    if note3Button.point(inside: locationInNote3, with: nil) {
        playNote3(self)
    }
    if note4Button.point(inside: locationInNote4, with: nil) {
        playNote4(self)
    }
    if note5Button.point(inside: locationInNote5, with: nil) {
        playNote5(self)
    }
    if note6Button.point(inside: locationInNote6, with: nil) {
        playNote6(self)
    }
}

@IBAction func playNote1(_ sender: Any) {
    let note1mp3 = note1
    if let path = Bundle.main.path(forResource: note1mp3, ofType: "mp3") {
        let url = URL(fileURLWithPath: path)
        do {
            audioPlayer = try AVAudioPlayer(contentsOf: url)
            audioPlayer.prepareToPlay()
            audioPlayer.play()
        }
        catch {
            print(error)
        }
    }
}

override func viewDidLoad() {
    super.viewDidLoad()
    let panGesture = UIPanGestureRecognizer(target: self, action: #selector(handlePan(_:)))
    view.addGestureRecognizer(panGesture)
}
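The handlePan(_:) method referenced by the selector above isn't shown here; a minimal sketch of what it might look like, just forwarding the gesture's location to playNote(for:):
@objc func handlePan(_ gesture: UIPanGestureRecognizer) {
    // Forward the current finger position so playNote(for:) can hit-test the keys.
    playNote(for: gesture.location(in: view))
}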
Here is a way to hopefully:
- improve your code,
- make your code simpler, and
- fix the problem you're having.
In this example, I'll just use 3 audio players. I'll try to make it as straightforward as possible...
import UIKit
import AVFoundation

class MyViewController: UIViewController {

    var audioPlayer = AVAudioPlayer()
    var audioPlayer2 = AVAudioPlayer()
    var audioPlayer3 = AVAudioPlayer()

    @IBOutlet var notes: [UIButton]!

    let references = [note1, note2, note3]

    @IBAction func notePressed(_ sender: UIButton) {
        play(note: notes.index(of: sender)! + 1)
    }

    func play(note: Int) {
        let reference = references[note - 1]
        if let path = Bundle.main.path(forResource: reference, ofType: "mp3") {
            let url = URL(fileURLWithPath: path)
            do {
                audioPlayer = try AVAudioPlayer(contentsOf: url)
                audioPlayer.prepareToPlay()
                audioPlayer.play()
            }
            catch {
                print(error)
            }
        }
    }
}
Explanation
@IBOutlet var notes: [UIButton]!
This is an Outlet Collection of buttons. To make one, connect a button as usual but select Outlet Collection instead of just Outlet. Make sure you connect the buttons in order (1, 2, 3); otherwise it will break!
let references = [note1, note2, note3]
This is an array of references to each of the note files. This is so that we only need one function to play a note.
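For example, assuming the numeric file names described in the question, these references could simply be:
let note1 = "1"
let note2 = "2"
let note3 = "3"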
@IBAction func notePressed(_:)
This function gets called by the buttons. For each button, connect the Touch Drag Enter (for sliding along notes) and Touch Down events; you can add others if you want. The function compares the sender against the notes Outlet Collection to find out which note was pressed, then passes that number to the play(note:) function.
func play(note:)
This function takes the note number and plays the corresponding file. It is almost identical to your original one, but instead of a fixed note, it plays whichever one is passed in by notePressed(_:).
I hope this is helpful, good luck!

Change string in other class (without making it global) in swift

I would like to modify a string in another class. This class will then use the variable in a function.
Here is what I've tried so far. I always get an error when unwrapping it: fatal error: unexpectedly found nil while unwrapping an Optional value. Does anyone have an idea how to change the urlString (preferably without making it global)? I couldn't find solutions for Swift that also involved functions on Stack Overflow... If you think I will have to make it global, please let me know!
In class #1
let videoUrl = "https:...sometestvideo.mov"
videoPlayerView.urlString = videoUrl
In class #2
var urlString: String?

//setup video player
func setupPlayerView() {
    print(urlString)
    //URL needed here
    if let videoURL = NSURL(string: urlString!) { //here the error occurs
I would like to add that it is very important that the function is called asap in the second class. Therefore I didn't use setupPlayerView(_urlString)...
Accordingly it currently looks like this (class 2, a UIView):
override init(frame: CGRect){
    super.init(frame: frame)
    //function below this override init
    setupPlayerView()
EDIT:
First of all, thank you for your solution! Nonetheless, one little problem remains (and I thought calling the function immediately would solve it... I'm quite new to Swift): the video player (which is set up by this function) now sits above all the other subviews, so one only sees the video covering the entire screen, although the opposite is desired (video using the entire screen, but subviews covering some parts of it). I will provide more code below showing how the other subviews are added (they are all closures) and the function that sets up the view. Is there a way I can keep the video player below all the other subviews (even if it needs to be loaded from a server first)? What would you suggest?
Code below incorporates code from the first answer, but does not necessarily have to start from there
Class 2
override init(frame: CGRect){
    super.init(frame: frame)
    setupPlayerView()
    //add subview with controls (e.g. spinner)
    controlsContainerView.frame = frame
    addSubview(controlsContainerView)
    //add to subview and center spinner in subview
    controlsContainerView.addSubview(activityIndicatorView)
    activityIndicatorView.centerXAnchor.constraint(equalTo: centerXAnchor).isActive = true
    activityIndicatorView.centerYAnchor.constraint(equalTo: centerYAnchor).isActive = true
    //add various subviews
    controlsContainerView.addSubview(whiteDummyView)
    controlsContainerView.addSubview(userProfilePicture)
    //... add further subviews
    //enable interaction
    controlsContainerView.isUserInteractionEnabled = true
    //... set to true for other subviews as well
    //function below this override init
    defInteractions()
    //background color of player
    backgroundColor = .black
}

//create controls container view (a closure, like most of the other subviews)
let controlsContainerView: UIView = {
    //set properties of controls container view
    let controlView = UIView()
    controlView.backgroundColor = UIColor(white: 0, alpha: 1)
    return controlView
}()
function setupPlayerView()
//setup video player
func setupPlayerView() {
    //check URL if can be converted to NSURL
    if let urlString = self.urlString, let videoURL = NSURL(string: urlString) {
        print(urlString)
        //player's video
        if self.player == nil {
            player = AVPlayer(url: videoURL as URL)
        }
        //add sub-layer
        let playerLayer = AVPlayerLayer(player: player)
        self.controlsContainerView.layer.addSublayer(playerLayer)
        playerLayer.frame = self.frame
        //when are frames actually rendered (when is video loaded)
        player?.addObserver(self, forKeyPath: "currentItem.loadedTimeRanges", options: .new, context: nil)
        //loop through video
        NotificationCenter.default.addObserver(forName: .AVPlayerItemDidPlayToEndTime, object: self.player?.currentItem, queue: nil, using: { (_) in
            DispatchQueue.main.async {
                self.player?.seek(to: kCMTimeZero)
                self.player?.play()
            }
        })
    }
}
Without more context about the relationship between your classes, it looks like a good solution would be a property observer (didSet).
class Class2: UIView {

    var urlString: String? {
        didSet {
            setupPlayerView()
        }
    }

    override init(frame: CGRect){
        super.init(frame: frame)
        setupPlayerView()
    }

    required init?(coder aDecoder: NSCoder) {
        super.init(coder: aDecoder)
    }

    func setupPlayerView() {
        if let urlString = self.urlString, let videoURL = NSURL(string: urlString) {
            // do stuff here
        }
    }
}
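Setting the property from class #1 then triggers the setup automatically. For example (a sketch using the names from the question; the frame is hypothetical):
let videoUrl = "https:...sometestvideo.mov"
let videoPlayerView = Class2(frame: view.bounds)   // hypothetical frame
videoPlayerView.urlString = videoUrl               // didSet calls setupPlayerView()
view.addSubview(videoPlayerView)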
When navigating to the second view controller:
let viewController2 = ViewController2() //viewController2 == class2
viewController2.urlString = videoUrl // anything you want to pass to class2.
navigationController?.pushViewController(viewController2, animated: true)
In your class 2:
var urlString: String = ""

override func viewDidLoad() {
    super.viewDidLoad()
    // Do any additional setup after loading the view.
    print(urlString) // use it
    setupPlayerView()
}
Your func:
func setupPlayerView() {
    //check URL if can be converted to NSURL
    if NSURL(string: urlString) != nil {
        print(urlString)
    }
}

tvOS app memory issue: How to resolve it?

I have a tvOS app which has a video playing in it.
There are basically two videos (different speed versions of the same video). One is 12MB in size and another is 1.9MB.
When the app starts, it runs fine (Xcode shows 191MB). However, after clicking the normal-speed button once, the memory shoots to 350MB. As I keep clicking the normal and fast buttons, it keeps increasing, and at one point it exceeds 1GB. You can see the attachment. It even reached 3GB, at which point the video stuttered and the app stopped.
Is there any way to solve the memory issue and save the app from stopping?
Another problem: on Apple TV, when we switch from this app to another app and come back, the video stops again. In the Simulator, however, this does not happen. Can someone help me solve these two issues?
Here is the code I am using:
var avPlayerLayer: AVPlayerLayer!
var paused: Bool = false

func playmyVideo(myString: String) {
    let bundle: Bundle = Bundle.main
    let videoPlayer: String = bundle.path(forResource: myString, ofType: "mov")!
    let movieUrl: NSURL = NSURL.fileURL(withPath: videoPlayer) as NSURL
    print(movieUrl)
    viewVideo.playVideoWithURL(url: movieUrl)
}

@IBAction func normalPressed(_ sender: Any) {
    playmyVideo(myString: "normal")
}

@IBAction func forwardPressed(_ sender: Any) {
    playmyVideo(myString: "fast")
}

class VideoPlay: UIView {

    private var player: AVPlayer!
    private var playerLayer: AVPlayerLayer!

    init() {
        super.init(frame: CGRect.zero)
        self.initializePlayerLayer()
    }

    override init(frame: CGRect) {
        super.init(frame: frame)
        self.initializePlayerLayer()
        self.autoresizesSubviews = false
    }

    required init?(coder aDecoder: NSCoder) {
        super.init(coder: aDecoder)
        self.initializePlayerLayer()
    }

    private func initializePlayerLayer() {
        playerLayer = AVPlayerLayer()
        playerLayer.backgroundColor = UIColor.clear.cgColor
        playerLayer.videoGravity = AVLayerVideoGravityResizeAspectFill
        self.layer.addSublayer(playerLayer)
        playerLayer.frame = UIScreen.main.bounds
    }

    func playVideoWithURL(url: NSURL) {
        player = AVPlayer(url: url as URL)
        player.isMuted = false
        playerLayer.player = player
        player.play()
        loopVideo(videoPlayer: player)
    }

    func toggleMute() {
        player.isMuted = !player.isMuted
    }

    func isMuted() -> Bool {
        return player.isMuted
    }

    func loopVideo(videoPlayer: AVPlayer) {
        NotificationCenter.default.addObserver(forName: NSNotification.Name.AVPlayerItemDidPlayToEndTime, object: nil, queue: nil) { notification in
            let t1 = CMTimeMake(5, 100)
            self.player.seek(to: t1)
            videoPlayer.seek(to: kCMTimeZero)
            self.player.play()
        }
    }
}
I see two problems in your code:
Each time the playVideoWithURL method is called, you create a new AVPlayer instance instead of reusing the existing one. You can call the replaceCurrentItem(with:) method on your player property when you want to play another URL.
That itself is a bit inefficient, but shouldn't cause the memory issue you described. I think the reason is:
Each time the loopVideo method is called, you pass a closure to NotificationCenter.default.addObserver. This closure creates a strong reference to videoPlayer, and you never remove the observer from the notification center.
As loopVideo is called every time you create a new AVPlayer instance, those instances are never deallocated, leading to the memory issue you described.
To fix it, you can:
- initialize the player property only once, and use replaceCurrentItem(with:) in playVideoWithURL when you want to play another video;
- change the "loop" logic so that you call NotificationCenter.default.addObserver only once.
- The closure you pass to NotificationCenter.default.addObserver also creates a memory leak (see this question). You can get rid of it by capturing self weakly:
NotificationCenter.default.addObserver(forName: NSNotification.Name.AVPlayerItemDidPlayToEndTime, object: nil, queue: nil) { [weak self] notification in
    self?.player.seek(to: kCMTimeZero)
    self?.player.play()
}
- Also remember to call removeObserver in the deinit method of the VideoPlay class.
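Putting those points together, here is a minimal sketch of how the VideoPlay class could look with these fixes applied (it keeps the original property and method names, but reuses one AVPlayer and registers the loop observer only once; treat it as an outline rather than a drop-in replacement):
import UIKit
import AVFoundation

class VideoPlay: UIView {

    private var player = AVPlayer()        // created once and reused
    private var playerLayer: AVPlayerLayer!
    private var loopObserver: NSObjectProtocol?

    override init(frame: CGRect) {
        super.init(frame: frame)
        initializePlayerLayer()
    }

    required init?(coder aDecoder: NSCoder) {
        super.init(coder: aDecoder)
        initializePlayerLayer()
    }

    private func initializePlayerLayer() {
        playerLayer = AVPlayerLayer(player: player)
        playerLayer.videoGravity = AVLayerVideoGravityResizeAspectFill
        layer.addSublayer(playerLayer)
        playerLayer.frame = UIScreen.main.bounds

        // Register the loop observer exactly once, capturing self weakly.
        loopObserver = NotificationCenter.default.addObserver(forName: NSNotification.Name.AVPlayerItemDidPlayToEndTime, object: nil, queue: nil) { [weak self] _ in
            self?.player.seek(to: kCMTimeZero)
            self?.player.play()
        }
    }

    func playVideoWithURL(url: NSURL) {
        // Swap the item on the existing player instead of creating a new AVPlayer per call.
        player.replaceCurrentItem(with: AVPlayerItem(url: url as URL))
        player.isMuted = false
        player.play()
    }

    deinit {
        if let observer = loopObserver {
            NotificationCenter.default.removeObserver(observer)
        }
    }
}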

AVPlayer stuck video, but audio works

I create an AVPlayerItem through an AVURLAsset; here is my code:
let asset = AVURLAsset(URL: safeURL, options: [AVURLAssetPreferPreciseDurationAndTimingKey: true])
asset.loadValuesAsynchronouslyForKeys([assetKeyPlayable, self.assetKeyTracks, self.assetKeyHasProtectedContent]) { () -> Void in
    dispatch_async(dispatch_get_main_queue(), { () -> Void in
        // Use the AVAsset playable property to detect whether the asset can be played
        if !asset.playable {
            let localizedDescription = "Item cannot be played,Item cannot be played description"
            let localizedFailureReason = "The assets tracks were loaded, but could not be made playable,Item cannot be played failure reason"
            let userInfo = [NSLocalizedDescriptionKey: localizedDescription, NSLocalizedFailureReasonErrorKey: localizedFailureReason]
            let error = NSError(domain: "domain", code: 0, userInfo: userInfo)
            self.videoPlayerDelegate?.videoPlayer?(self, playerItemStatusDidFail: error)
            self.cleanPlayer()
            return
        }
        // At this point we're ready to set up for playback of the asset. Stop observing
        if let _ = self.player?.currentItem {
            self.cleanPlayer()
        }
        if asset.URL.absoluteString != safeURL.absoluteString {
            return
        }
        var error: NSError?
        let status = asset.statusOfValueForKey(self.assetKeyTracks, error: &error)
        var playerItem = AVPlayerItem(URL: safeURL)
        if status == .Loaded {
            playerItem = AVPlayerItem(asset: asset)
        } else {
            // You should deal with the error appropriately. If Loaded fails, create an AVPlayerItem directly from the URL
            playerItem = AVPlayerItem(URL: safeURL)
        }
        self.player = self.playerWithPlayerItem(playerItem)
        self.registerMonitoring()
        self.registerNotification()
        self.addTimeObserver()
        completionBlock?(loadURLString: playerURL.absoluteString)
    })
}
Then I add an AVPlayerLayer to display the video in my view:
// MARK: - Property
var player: AVPlayer? {
    get {
        return playerLayer.player
    }
    set {
        playerLayer.player = newValue
    }
}

var playerLayer: AVPlayerLayer {
    return layer as! AVPlayerLayer
}
Displaying the video once loading completes:
self.videoPlayer?.loadPlayer({ [weak self] (loadURLString) in
    if let strongSelf = self {
        strongSelf.player = strongSelf.videoPlayer?.player
        strongSelf.startPlay()
    }
})
Calling the seekToTime method to jump to a specific time:
self.player?.currentItem?.seekToTime(CMTimeMakeWithSeconds(time, Int32(NSEC_PER_SEC)), toleranceBefore: kCMTimeZero, toleranceAfter: kCMTimeZero) { [weak self] finished in
    if let weakSelf = self {
        if weakSelf.isPlaying {
            weakSelf.videoPlayerDelegate?.videoPlayerDidplayerItemSeekToTime?(weakSelf)
        }
    }
}
Some pictures of the stuck interface:
In the first picture the sound is audible, but the interface is stuck.
In the second picture, the video works, but I get no sound.
My question:
When I call the seekToTime method in the completion handler, sometimes the video has sound but the picture is stuck; occasionally the video plays correctly. I tried calling the CALayer setNeedsDisplay method to refresh the AVPlayerLayer, but that didn't help. I don't know what to do anymore; I would be grateful for any help.
Since this is happening for many of us in different ways and has not been answered, I am answering it here. For me this happened when I tried to play the video in a UIView using AVPlayer and AVPlayerLayer, and then used the same AVPlayer to load the video in an AVPlayerViewController: the video got stuck while the audio kept playing.
The trick is to destroy the AVPlayerLayer before using the same AVPlayer somewhere else. Can't believe I found this enlightenment over here:
https://twitter.com/skumancer/status/294605708073263104
How did I implement this? I created 2 view controller scenes in my main storyboard:
1) View Controller Scene
i) This has a UIView in it, referenced by the IBOutlet 'videoView'.
ii) A button with a 'show' segue to the AV Player View Controller Scene, referenced by the IBOutlet 'fullscreen'.
2) AV Player View Controller Scene
// Code in your controller class.
// Make sure to reference your IBOutlets in your storyboard.
@IBOutlet weak var videoView: UIView!
@IBOutlet weak var fullscreen: UIButton!

var player: AVPlayer? = nil
var playerLayer: AVPlayerLayer? = nil

override func viewDidLoad() {
    super.viewDidLoad()
    // Your URL to play; could be an HLS-type video
    let url = URL(string: "https://your_url.m3u8")
    let asset = AVAsset(url: url!)
    let playerItem = AVPlayerItem(asset: asset)
    player = AVPlayer(playerItem: playerItem)
}

override func viewDidAppear(_ animated: Bool) {
    playInUIView()
    pauseVideo()
}

// Creates an AVPlayerLayer and adds it as a sublayer of the UIView.
func playInUIView() {
    // Creating player layer to render video.
    playerLayer = AVPlayerLayer(player: player)
    playerLayer?.frame = self.videoView.bounds
    playerLayer?.videoGravity = .resizeAspect
    // Adding a sublayer to display it in the UIView
    self.videoView.layer.addSublayer(playerLayer!)
}

// Destroys the AVPlayerLayer and the sublayers of your UIView
func destroyPlayInView() {
    self.videoView.layer.sublayers = nil
    playerLayer = nil
}

// On button click, open AVPlayerViewController.
override func prepare(for segue: UIStoryboardSegue, sender: Any?) {
    let destination = segue.destination as! AVPlayerViewController
    // Destroy the AVPlayerLayer before rendering video in AVPlayerViewController.
    destroyPlayInView()
    // Set the same AVPlayer in AVPlayerViewController.
    destination.player = self.player
    self.present(destination, animated: true) {
        destination.player!.play()
    }
}
PS: I have not implemented any controls yet, and there may be bad practices here, as I only started with Swift and iOS development a couple of days ago, so please let me know wherever I am wrong.
Try this approach; I encountered the same issue and solved it this way.
player.pause()
player.currentItem?.seek(to: CMTime(), completionHandler: { (_) in
    player.play()
})
The same code as user nferocious76's, but in Objective-C:
[self.player pause];
AVPlayerItem *playerItem = [self.player currentItem];
__weak typeof(self) weakSelf = self;
[playerItem seekToTime:playerItem.currentTime completionHandler:^(BOOL finished){
    if (finished){
        [weakSelf.player play];
    }
}];

VKVideoPlayer usage in Swift

I am trying to implement VKVideoPlayer in Swift. I have added it as a pod and imported the library in the bridging header.
Now I am using the code below to start the video player. The player appears in my view; however, the video never streams.
var player: VKVideoPlayer = VKVideoPlayer()
player.view.frame = self.view.bounds
player.delegate = self
self.view.addSubview(player.view)

var videotrack: VKVideoPlayerTrack = VKVideoPlayerTrack()
videotrack.streamURL = NSURL.fileURLWithPath("https://v.cdn.vine.co/r/videos/AA3C120C521177175800441692160_38f2cbd1ffb.1.5.13763579289575020226.mp4")
videotrack.hasNext = true
player.loadVideoWithTrack(videotrack)
The activity indicator keeps loading, but the video never appears.
@IBAction func play() {
    let videoURLString = "http://......."
    let videoVC = VKVideoPlayerViewController()
    presentViewController(videoVC, animated: true, completion: nil)
    videoVC.playVideoWithStreamURL(NSURL(string: videoURLString))
}
I don't know why it does not work when I use a segue to switch view controllers, so I just present it in code.
