I would like to modify a string in another class. This class will then use the variable in a function.
Here is what I've tried so far. I always get an error when unwrapping it: fatal error: unexpectedly found nil while unwrapping an Optional value. Does anyone have an idea how to change the urlString (preferably without making it global)? I couldn't find Swift solutions on Stack Overflow that also involve functions... If you think I will have to make it global, please let me know!
In class #1
let videoUrl = "https:...sometestvideo.mov"
videoPlayerView.urlString = videoUrl
In class #2
var urlString : String?
//setup video player
func setupPlayerView() {
print(urlString)
//URL needed here
if let videoURL = NSURL(string: urlString!){ //here the error occurs
I would like to add that it is very important that the function is called as soon as possible in the second class. Therefore I didn't use setupPlayerView(_ urlString)...
Accordingly it currently looks like this (class 2, a UIView):
override init(frame: CGRect){
super.init(frame: frame)
//function below this override init
setupPlayerView()
EDIT:
First of all, thank you for your solution! Nonetheless, one little problem remains (and I thought calling the function immediately would solve it... I'm quite new to Swift): the video player (which is set up using this function) now sits above all the other subviews (one can only see the video covering the entire screen), although the opposite is desired (video using the entire screen, but subviews covering some parts). I will provide more code below regarding the addition of the other subviews (all of these are closures) and the function setting up the view. Is there a way I can keep the video player below all the other subviews (even if it needs to be loaded from a server first)? What would you suggest I do?
Code below incorporates code from the first answer, but does not necessarily have to start from there
Class 2
override init(frame: CGRect){
super.init(frame: frame)
setupPlayerView()
//add subview with controls (e.g. spinner)
controlsContainerView.frame = frame
addSubview(controlsContainerView)
//add to subview and center spinner in subview
controlsContainerView.addSubview(activityIndicatorView)
activityIndicatorView.centerXAnchor.constraint(equalTo: centerXAnchor).isActive = true
activityIndicatorView.centerYAnchor.constraint(equalTo: centerYAnchor).isActive = true
//add various subviews
controlsContainerView.addSubview(whiteDummyView)
controlsContainerView.addSubview(userProfilePicture)
//... add further subviews
//enable interaction
controlsContainerView.isUserInteractionEnabled = true
//... set to true for other subviews as well
//function below this override init
defInteractions()
//background color of player
backgroundColor = .black
}
//create controls container view (a closure, like most of the other subviews)
let controlsContainerView: UIView = {
//set properties of controls container view
let controlView = UIView()
controlView.backgroundColor = UIColor(white: 0, alpha: 1)
return controlView
}()
Function setupPlayerView():
//setup video player
func setupPlayerView() {
//check URL if can be converted to NSURL
if let urlString = self.urlString, let videoURL = NSURL(string: urlString){
print(urlString)
//player's video
if self.player == nil {
player = AVPlayer(url: videoURL as URL)
}
//add sub-layer
let playerLayer = AVPlayerLayer(player: player)
self.controlsContainerView.layer.addSublayer(playerLayer)
playerLayer.frame = self.frame
//when are frames actually rendered (when is video loaded)
player?.addObserver(self, forKeyPath: "currentItem.loadedTimeRanges", options: .new, context: nil)
//loop through video
NotificationCenter.default.addObserver(forName: .AVPlayerItemDidPlayToEndTime, object: self.player?.currentItem, queue: nil, using: { (_) in
DispatchQueue.main.async {
self.player?.seek(to: kCMTimeZero)
self.player?.play()
}
})
}
}
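One way to address the layering problem described in the edit (a sketch, not a guaranteed fix): insert the player layer at index 0 of the container's layer, so sublayers and subviews added afterwards render above the video:

```swift
//inside setupPlayerView(), instead of addSublayer:
let playerLayer = AVPlayerLayer(player: player)
playerLayer.frame = self.frame
//index 0 keeps the video beneath every layer added afterwards
self.controlsContainerView.layer.insertSublayer(playerLayer, at: 0)
```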
Without too much context about the relationship between your classes, it looks like a potentially good solution is to use a property observer pattern.
class Class2 {
var urlString: String? {
didSet {
setupPlayerView()
}
}
override init(frame: CGRect){
super.init(frame: frame)
setupPlayerView()
}
func setupPlayerView() {
if let urlString = self.urlString, let videoURL = NSURL(string: urlString) {
// do stuff here
}
}
}
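For completeness, class 1 then only needs to assign the property; the didSet observer calls setupPlayerView() automatically (a sketch; the URL is a placeholder):

```swift
let videoPlayerView = Class2(frame: UIScreen.main.bounds)
//assigning urlString fires didSet, which runs setupPlayerView()
videoPlayerView.urlString = "https://example.com/sometestvideo.mov" //placeholder
```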
While navigating to the second view controller:
let viewController2 = ViewController2() //viewController2 == class2
viewController2.urlString = videoUrl // anything you want to pass to class2.
navigationController?.pushViewController(viewController2, animated: true)
In your class 2:
var urlString: String = ""
override func viewDidLoad() {
super.viewDidLoad()
// Do any additional setup after loading the view.
print(urlString) // use it
setupPlayerView()
}
Your func:
func setupPlayerView() {
//check URL if can be converted to NSURL
if NSURL(string: urlString) != nil{
print(urlString)
}
}
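The print-only body above can be fleshed out the same way as in the question's code; a sketch, assuming the class has a player property of type AVPlayer?:

```swift
func setupPlayerView() {
    //urlString is non-optional here, so only the URL conversion can fail
    guard let videoURL = URL(string: urlString) else { return }
    player = AVPlayer(url: videoURL)
    //attach the video layer to the view controller's view
    let playerLayer = AVPlayerLayer(player: player)
    playerLayer.frame = view.bounds
    view.layer.addSublayer(playerLayer)
}
```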
Related
I am using AVPlayer to play a video from the server in the player. I am doing it like the code below. The problem I am facing is that I set
automaticallyWaitsToMinimizeStalling = true
According to the documentation:
A Boolean value that indicates whether the player should automatically delay playback in order to minimize stalling.
but it takes too much time to load the video/audio; for an 8-minute video it takes almost 2 to 3 minutes before playback starts. If during this wait time of 2 to 3 minutes the user pauses the video and plays it again, the video plays without any delay. This unnecessary wait should not happen.
Can anyone guide me on how to decrease the stalling wait? I cannot use this answer:
player.automaticallyWaitsToMinimizeStalling = false
because when this value is set to false, my player stops again and again and the user has to resume playback manually, which is a very bad experience.
// MARK: - Outlets
@IBOutlet weak var audioView: UIView!
// MARK: - Variables
var player: AVPlayer = AVPlayer()
let playerController = AVPlayerViewController()
var obs = Set<NSKeyValueObservation>()
// MARK: - Helper Method
private func settingForAudioPlayer() {
guard let lesson = viewModal.lesson else {
print("lesson not found")
return
}
var path = "\(Server.audioVideoBasurl + (lesson.videoPath ?? ""))"
path = path.replacingOccurrences(of: " ", with: "%20")
print("path:\(path)")
guard let url = URL(string: path) else {
print("Path not converted to url")
return
}
self.player = AVPlayer(url: url)
self.player.automaticallyWaitsToMinimizeStalling = true
self.player.playImmediately(atRate: 1.0)
self.playerController.player = self.player
DispatchQueue.main.async {
self.playerController.view.clipsToBounds = true
self.playerController.view.removeFromSuperview()
self.playerController.delegate = self
self.showSpinner(onView: self.audioView, identifier: "audioView", title: "")
self.audioView.addSubview(self.playerController.view)
self.audioView.layoutIfNeeded() // Previously we were playing only audio, but later we decided to add videos as well; that's why this view is named audioView. Don't get confused by the name.
self.playerController.view.frame.size.width = self.audioView.frame.width
self.playerController.view.frame.size.height = self.audioView.frame.height
self.playerController.view.backgroundColor = .clear
self.playerController.videoGravity = AVLayerVideoGravity.resizeAspectFill
var ob : NSKeyValueObservation!
ob = self.playerController.observe(\.isReadyForDisplay, options: [.initial, .new]) { vc, ch in
guard let ok = ch.newValue, ok else {return}
self.obs.remove(ob)
DispatchQueue.main.async {
print("finishing")
self.removeSpinner(identifier: "audioView") // This is my Custom Method
vc.view.isHidden = false // The idea of adding a KVO observer came from the Internet
}
}
self.obs.insert(ob)
let iv = self.audioBackgroundImageView ?? UIImageView()
let v = self.playerController.contentOverlayView!
iv.translatesAutoresizingMaskIntoConstraints = false
NSLayoutConstraint.activate([
iv.bottomAnchor.constraint(equalTo:v.bottomAnchor),
iv.topAnchor.constraint(equalTo:v.topAnchor),
iv.leadingAnchor.constraint(equalTo:v.leadingAnchor),
iv.trailingAnchor.constraint(equalTo:v.trailingAnchor),
])
NSLayoutConstraint.activate([
v.bottomAnchor.constraint(equalTo:self.playerController.view.bottomAnchor),
v.topAnchor.constraint(equalTo:self.playerController.view.topAnchor),
v.leadingAnchor.constraint(equalTo:self.playerController.view.leadingAnchor),
v.trailingAnchor.constraint(equalTo:self.playerController.view.trailingAnchor),
])
self.view.layoutIfNeeded()
}
}
In the above piece of code I have a video URL (path), which I pass to the AVPlayer.
The player is passed to the AVPlayerViewController,
and an observer is added to the playerController to check when the AVPlayerViewController is ready for display; then the observer is removed.
After that I am only setting the constraints.
Kindly guide me on how to decrease the wait in Swift; on the Android side the video/audio plays within seconds.
It may be a duplicate of this, but my scenario is different and the solution there did not work for me. Kindly let me know in case you need more information to help me.
If you don't want to use automaticallyWaitsToMinimizeStalling = true, you can also observe the player buffer and decide whether to start playing the video. Here are the steps to do that:
Create observer variables in the class where you intend to handle the player:
var playbackBufferEmptyObserver: NSKeyValueObservation?
var playbackBufferFullObserver: NSKeyValueObservation?
var playbackLikelyToKeepUpObserver: NSKeyValueObservation?
Instantiate AVPlayer with an AVPlayerItem instead of a URL, like this:
let playerItem = AVPlayerItem(url: url)
let player = AVPlayer(playerItem: playerItem)
Create observers and assign them to variables:
playbackBufferEmptyObserver = self.playerItem.observe(\.isPlaybackBufferEmpty, options: [.new, .initial ], changeHandler: { [weak self] (player, bufferEmpty) in
if let self = self {
DispatchQueue.main.async {
if bufferEmpty.newValue == true {
// handle showing loading, player not playing
}
}
}
})
playbackBufferFullObserver = self.playerItem.observe(\.isPlaybackBufferFull, options: [.new, .initial], changeHandler: {[weak self] (player, bufferFull) in
if let self = self {
DispatchQueue.main.async {
if bufferFull.newValue == true {
//handle when player buffer is full (e.g. hide loading) start player
}
}
}
})
playbackLikelyToKeepUpObserver = self.playerItem.observe(\.isPlaybackLikelyToKeepUp, options: [.new, .initial], changeHandler: { [weak self] (player, _) in
if let self = self {
if (self.playerItem?.status ?? .unknown) == .readyToPlay {
// handle that player is ready to play (e.g. hide loading indicator, start player)
} else {
// player is not ready to play yet
}
}
})
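These NSKeyValueObservation tokens stay active while they are retained, so when tearing the player down it is worth invalidating them (a sketch, assuming the three properties from the first step):

```swift
//e.g. call from deinit, or before replacing the player item
func removePlayerItemObservers() {
    //invalidate() stops each observation; nil-ing releases the tokens
    playbackBufferEmptyObserver?.invalidate()
    playbackBufferFullObserver?.invalidate()
    playbackLikelyToKeepUpObserver?.invalidate()
    playbackBufferEmptyObserver = nil
    playbackBufferFullObserver = nil
    playbackLikelyToKeepUpObserver = nil
}
```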
I'm working on a piano app that has a series of buttons, each with a different mp3. The screen shows 12 buttons (piano keys), and I want the user to be able to play an individual sound or swipe across a couple to hear multiple, just like a real piano. I've seen many apps do this, but mine seems to have a problem when the user slides across multiple buttons quickly. At the same speed, other apps will play all the notes, but mine will skip a few. Thank you for any help! This will make all the difference in my app!
A couple of quick notes about this code:
-I just have the bare bones here to save space.
-I just showed 6 audio players, but you get the idea.
-locationInNote1...Note2...Note3 is just showing 6 here to save space, but you get the idea.
-"note1" in the button action is a string that can be changed when the user selects a different octave to play from, but it's just a number, so the audio files are ultimately 1.mp3, 2.mp3, etc.
-The button action playNote1 is the same as the other button actions, so I didn't repeat them all here.
var audioPlayer = AVAudioPlayer()
var audioPlayer2 = AVAudioPlayer()
var audioPlayer3 = AVAudioPlayer()
var audioPlayer4 = AVAudioPlayer()
var audioPlayer5 = AVAudioPlayer()
var audioPlayer6 = AVAudioPlayer()
func playNote(for locationInView: CGPoint) {
let locationInNote1 = note1Button.convert(locationInView, from: view)
let locationInNote2 = note2Button.convert(locationInView, from: view)
let locationInNote3 = note3Button.convert(locationInView, from: view)
let locationInNote4 = note4Button.convert(locationInView, from: view)
let locationInNote5 = note5Button.convert(locationInView, from: view)
let locationInNote6 = note6Button.convert(locationInView, from: view)
if note1Button.point(inside: locationInNote1, with: nil) {
playNote1(self)
}
if note2Button.point(inside: locationInNote2, with: nil) {
playNote2(self)
}
if note3Button.point(inside: locationInNote3, with: nil) {
playNote3(self)
}
if note4Button.point(inside: locationInNote4, with: nil) {
playNote4(self)
}
if note5Button.point(inside: locationInNote5, with: nil) {
playNote5(self)
}
if note6Button.point(inside: locationInNote6, with: nil) {
playNote6(self)
}
}
@IBAction func playNote1(_ sender: Any) {
let note1mp3 = note1
if let path = Bundle.main.path(forResource: note1mp3, ofType: "mp3") {
let url = URL(fileURLWithPath: path)
do {
audioPlayer = try AVAudioPlayer(contentsOf: url)
audioPlayer.prepareToPlay()
audioPlayer.play()
}
catch {
print(error)
}
}
}
override func viewDidLoad() {
super.viewDidLoad()
let panGesture = UIPanGestureRecognizer(target: self, action: #selector(handlePan(_:)))
view.addGestureRecognizer(panGesture)
}
Here is a way to hopefully:
Improve your code
Make your code simpler
Fix the problem you're having
In this example, I'll just put 3 audio players.
I'll try to make it as straightforward as possible...
import UIKit
class MyViewController: UIViewController {
var audioPlayer = AVAudioPlayer()
var audioPlayer2 = AVAudioPlayer()
var audioPlayer3 = AVAudioPlayer()
@IBOutlet var notes: [UIButton]!
let references = [note1, note2, note3]
@IBAction func notePressed(_ sender: UIButton) {
play(note: notes.index(of: sender)! + 1)
}
func play(note: Int) {
let reference = references[note - 1]
if let path = Bundle.main.path(forResource: reference, ofType: "mp3") {
let url = URL(fileURLWithPath: path)
do {
audioPlayer = try AVAudioPlayer(contentsOf: url)
audioPlayer.prepareToPlay()
audioPlayer.play()
}
catch {
print(error)
}
}
}
}
Explanation
@IBOutlet var notes: [UIButton]!
This is an Outlet Collection of buttons. To make this, connect a button but select Outlet Collection instead of just Outlet. Make sure you connect each button in order (1, 2, 3), otherwise it will break!
let references = [note1, note2, note3]
This is an array of references to each of the note files. This is so that we only need one function to play a note.
@IBAction func notePressed(_:)
This function gets called by the buttons. For each button, connect the Touch Drag Enter (for sliding along notes) and Touch Down actions. You can add other ones if you want. The function compares the sender against the notes Outlet Collection to find out which note was pressed, then passes it to the play(note:) function.
func play(note:)
This function takes the note number and plays the corresponding file. It's almost identical to your original one, but instead of having a fixed note, it takes one that is passed by the @IBAction func notePressed(_:) method.
I hope this is helpful, good luck!
I am trying to make a video play if a certain view is tapped once. The .play() function works well if I call it directly within an if statement inside a function (the if that checks the URL inside the setupPlayerView() function). The first functions I am going to show below (setupPlayerView and defInteractions) are called in an override init which sets the properties and subviews/sublayers etc. for the video player. The last function is triggered by the second function. Pay attention to the declaration of player and my comment at the bottom...
Code: func #1
func setupPlayerView() {
//insert url
let urlString = "https://blurtime.com/images/testvideo.mov"
//check URL if can be converted to NSURL
if let videoURL = NSURL(string: urlString){
//player's video
let player = AVPlayer(url: videoURL as URL)
//add sub-layer
let playerLayer = AVPlayerLayer(player: player)
self.layer.addSublayer(playerLayer)
playerLayer.frame = self.frame
//when are frames actually rendered (when is video loaded)
player.addObserver(self, forKeyPath: "currentItem.loadedTimeRanges", options: .new, context: nil)
//if I call player.play() here the video plays directly
}
}
As I use a UITapGestureRecognizer to detect a single tap, I have the following function, which is called in the override init as well (shortly after the previous function):
Code: func #2
//set interactions
func defInteractions (){
//enable interaction
controlsContainerView.isUserInteractionEnabled = true
//singletap
let singleTap = UITapGestureRecognizer(target: self, action: #selector(singleTapDetected(_:)))
singleTap.numberOfTapsRequired = 1
//controlsContainerView
controlsContainerView.addGestureRecognizer(singleTap)
}
Now, I would like to call player.play() inside the function singleTapDetected which currently looks like this:
Code: func #3
func singleTapDetected(_ sender: UITapGestureRecognizer) {
player.play()
}
However, it does not work, of course, as this function is outside the override init, unlike the others, and I get the error use of unresolved identifier 'player'. How can I call player.play() and get the same result as if I called it in the first function? Can I access it within the if? I could use some help...
Make player an instance variable:
var player: AVPlayer?
Then initialize it like this:
if self.player == nil {
player = AVPlayer(url: videoURL as URL)
}
Then you can easily access it from anywhere in your class.
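With the optional in place, the tap handler can use optional chaining (a sketch):

```swift
func singleTapDetected(_ sender: UITapGestureRecognizer) {
    //optional chaining: does nothing if setupPlayerView() has not created the player yet
    player?.play()
}
```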
Make player a property of the class, like this:
var player : AVPlayer?
Initialize it in setupPlayerView() :
func setupPlayerView() {
//insert url
let urlString = "https://blurtime.com/images/testvideo.mov"
//check URL if can be converted to NSURL
if let videoURL = NSURL(string: urlString){
//player's video
self.player = AVPlayer(url: videoURL as URL)
//add sub-layer
let playerLayer = AVPlayerLayer(player: player)
self.layer.addSublayer(playerLayer)
playerLayer.frame = self.frame
//when are frames actually rendered (when is video loaded)
player?.addObserver(self, forKeyPath: "currentItem.loadedTimeRanges", options: .new, context: nil)
//if I call player.play() here the video plays directly
}
}
and use it like this:
func singleTapDetected(_ sender: UITapGestureRecognizer) {
// with self
self.player?.play()
}
Hope it helps
Declare player at the class level as an optional:
var player: AVPlayer?
Instead of declaring player as a local variable in the setupPlayerView function, you should declare it as an instance variable so it is accessible in the scope of the whole class/struct.
For your case, I would suggest that instead of implementing a setupPlayerView function, it would be good practice to declare it as a lazy property, as follows:
lazy var player: AVPlayer? = {
//insert url
let urlString = "https://blurtime.com/images/testvideo.mov"
guard let videoURL = NSURL(string: urlString) else {
return nil
}
let player = AVPlayer(url: videoURL as URL)
//add sub-layer
let playerLayer = AVPlayerLayer(player: player)
self.layer.addSublayer(playerLayer)
playerLayer.frame = self.frame
//when are frames actually rendered (when is video loaded)
player.addObserver(self, forKeyPath: "currentItem.loadedTimeRanges", options: .new, context: nil)
return player
}()
Usage:
// this means if the urlString is valid
if let player = player {
player.play()
}
This way, all the needed setup for player runs only when it is actually needed. You might want to check this Q&amp;A.
I have a tvOS app which has a video playing in it.
There are basically two videos (different speed versions of the same video). One is 12MB in size and another is 1.9MB.
When the app starts, it runs fine (Xcode shows 191MB). However, when clicking the normal speed button once, the memory shoots to 350MB. As I keep clicking the normal and fast buttons, it goes on increasing and at one point exceeds 1GB. You can see the attachment. It even went to 3GB, at which point the video stuttered and the app stopped.
Is there any way to solve the memory issue and save the app from stopping?
Another problem is: when, on Apple TV, we go to another app from this app and come back, the video stops again. However, in the Simulator this does not happen. Can someone help me solve these two issues?
Here is the code I am using:
var avPlayerLayer: AVPlayerLayer!
var paused: Bool = false
func playmyVideo(myString: String) {
let bundle: Bundle = Bundle.main
let videoPlayer: String = bundle.path(forResource: myString, ofType: "mov")!
let movieUrl : NSURL = NSURL.fileURL(withPath: videoPlayer) as NSURL
print(movieUrl)
viewVideo.playVideoWithURL(url: movieUrl)
}
#IBAction func normalPressed(_ sender: Any) {
playmyVideo(myString: "normal")
}
#IBAction func forwardPressed(_ sender: Any) {
playmyVideo(myString: "fast")
}
class VideoPlay: UIView {
private var player : AVPlayer!
private var playerLayer : AVPlayerLayer!
init() {
super.init(frame: CGRect.zero)
self.initializePlayerLayer()
}
override init(frame: CGRect) {
super.init(frame: frame)
self.initializePlayerLayer()
self.autoresizesSubviews = false
}
required init?(coder aDecoder: NSCoder) {
super.init(coder: aDecoder)
self.initializePlayerLayer()
}
private func initializePlayerLayer() {
playerLayer = AVPlayerLayer()
playerLayer.backgroundColor = UIColor.clear.cgColor
playerLayer.videoGravity = AVLayerVideoGravityResizeAspectFill
self.layer.addSublayer(playerLayer)
playerLayer.frame = UIScreen.main.bounds
}
func playVideoWithURL(url: NSURL) {
player = AVPlayer(url: url as URL)
player.isMuted = false
playerLayer.player = player
player.play()
loopVideo(videoPlayer: player)
}
func toggleMute() {
player.isMuted = !player.isMuted
}
func isMuted() -> Bool
{
return player.isMuted
}
func loopVideo(videoPlayer: AVPlayer) {
NotificationCenter.default.addObserver(forName: NSNotification.Name.AVPlayerItemDidPlayToEndTime, object: nil, queue: nil) { notification in
let t1 = CMTimeMake(5, 100);
self.player.seek(to: t1)
videoPlayer.seek(to: kCMTimeZero)
self.player.play()
}
}
}
I see two problems in your code:
Each time the playVideoWithURL method is called, you create a new AVPlayer instance instead of reusing the already existing one. You can call the replaceCurrentItem(with:) method on your player property when you want to play another URL.
That itself is a bit inefficient, but shouldn't cause the memory issue you described. I think the reason is:
Each time loopVideo method is called, you pass a closure to NotificationCenter.default.addObserver. This closure creates a strong reference to videoPlayer. You never remove the observer from the notification center.
As loopVideo is called each time you create a new AVPlayer instance, these instances are never deallocated, leading to the memory issue you described.
To fix it, you can:
initialize player property only once in playVideoWithURL, then use replaceCurrentItem when you want to play another video
also change the "loop" logic, so that you call NotificationCenter.default.addObserver only once
the closure you pass to NotificationCenter.default.addObserver creates a memory leak (see this question). You can get rid of it by capturing self weakly:
NotificationCenter.default.addObserver(forName: NSNotification.Name.AVPlayerItemDidPlayToEndTime, object: nil, queue: nil) { [weak self] notification in
self?.player.seek(to: kCMTimeZero)
self?.player.play()
}
also remember to call removeObserver in deinit method of VideoPlay class.
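Putting the points above together (a sketch based on the question's VideoPlay class; layer wiring is omitted, and the endObserver property name is illustrative, since block-based observers are removed via the token returned by addObserver, not via removeObserver(self)):

```swift
import AVFoundation
import UIKit

class VideoPlay: UIView {
    private var player: AVPlayer?
    private var endObserver: NSObjectProtocol? //illustrative name for the token

    func playVideoWithURL(url: URL) {
        let item = AVPlayerItem(url: url)
        if let existing = player {
            //reuse the existing player instead of creating a new one
            existing.replaceCurrentItem(with: item)
        } else {
            player = AVPlayer(playerItem: item)
            //register the loop observer exactly once, capturing self weakly
            endObserver = NotificationCenter.default.addObserver(forName: .AVPlayerItemDidPlayToEndTime, object: nil, queue: nil) { [weak self] _ in
                self?.player?.seek(to: kCMTimeZero)
                self?.player?.play()
            }
        }
        player?.play()
    }

    deinit {
        //block-based observers must be removed with their token
        if let token = endObserver {
            NotificationCenter.default.removeObserver(token)
        }
    }
}
```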
I am attempting to recognize a tap gesture while a video is playing so that I can dismiss it, similarly to how Snapchat does. However, it says that MPMoviePlayerController has no member to add touch gestures. Is this true, or am I using an incorrect method?
var MP4 : NSData?
var MarkerLong : CLLocationDegrees?
var MarkerLat : CLLocationDegrees?
var Url : String?
var videoPlayer : MPMoviePlayerController!
private var firstAppear = true
override func viewDidAppear(animated: Bool) {
super.viewDidAppear(animated)
if firstAppear {
do {
try playVideo()
firstAppear = false
} catch AppError.InvalidResource(let name, let type) {
debugPrint("Could not find resource \(name).\(type)")
} catch {
debugPrint("Generic error")
}
}
}
private func playVideo() throws {
self.videoPlayer = MPMoviePlayerController()
self.videoPlayer.repeatMode = MPMovieRepeatMode.None
self.videoPlayer.contentURL = NSURL(string: Url!)
self.videoPlayer.controlStyle = MPMovieControlStyle.None
self.view.addSubview(self.videoPlayer.view)
NSNotificationCenter.defaultCenter().addObserver(self, selector: #selector(PlayVideoViewController.videoPlayBackDidFinish(_:)), name: MPMoviePlayerPlaybackDidFinishNotification, object: self.videoPlayer)
self.videoPlayer.view.frame.size = CGSizeMake(640, 1136)
self.videoPlayer.view.center = self.view.center
self.videoPlayer.play()
let gesture = UITapGestureRecognizer(target: self, action: "someAction:")
self.videoPlayer.addGestureRecognizer(gesture)
}
I would recommend using AVPlayerViewController, but make sure not to subclass it, as Apple states not to.
1) MPMoviePlayer is deprecated (Don't use this code anymore)
2) AVPlayerViewController has a much more intricate set of code to allow more customization.
If you really want to customize something, you can subclass AVPlayer and make your own customized view where the video will be played, but you will have to add your own pause/start, etc...
MPMoviePlayerController is a view controller. Gesture recognizers are added to views. You need to add this gesture to the MPMoviePlayerController's view.
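Applying that to the question's code, the minimal change is one line (kept in the question's Swift 2-era style):

```swift
let gesture = UITapGestureRecognizer(target: self, action: "someAction:")
//attach the recognizer to the controller's view, not the controller itself
self.videoPlayer.view.addGestureRecognizer(gesture)
```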
(Even better, stop using MPMoviePlayerController; it is deprecated.)