AVPlayer does not play video, play button with a line through it - iOS

I'm new to Swift, but I like it more than Objective-C because its syntax looks a bit like Java to me.
My problem is that most of the source code samples out there are for Objective-C, so they're unreadable to me =)
Anyway, I managed to run a few code snippets like this (I'm not at my Mac, but they were similar):
let streamingURL: NSURL = NSURL(string: "http://....")!
let player = AVPlayer(URL: streamingURL)
player.allowsExternalPlayback = false
PlayWorkoutViewController.playerController = AVPlayerViewController()
PlayWorkoutViewController.playerController.player = player
self.addChildViewController(PlayWorkoutViewController.playerController)
self.view.addSubview(PlayWorkoutViewController.playerController.view)
PlayWorkoutViewController.playerController.view.frame = videoContainerView.frame
PlayWorkoutViewController.playerController.showsPlaybackControls = false
player.play()
This results in a play button with a line through it, and it doesn't play the stream.
The stream source is an MPEG-1/2 stream according to VLC, and it's coming from a Linux-based satellite receiver.
Another thing I tried was changing the "string:" part to "fileURLWithPath:" on the NSURL variable, but that didn't work either.
Is there a way to buffer the stream, or is this just a codec issue? What workaround options do I have?
I've been stuck on this for three days; I hope it's not a duplicate question. Thanks.
EDIT: content of the stream.m3u file:
#EXTM3U
#EXTVLCOPT--http-reconnect=true
192.168.178.20:8001/1:0:1:445D:453:1:C00000:0:0:0:

A security setting might be the problem. Go to Info.plist and add "App Transport Security Settings". Under that, add "Allow Arbitrary Loads" and set it to YES. I hope this fixes the problem.
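In source form, the resulting Info.plist entry looks roughly like this (a minimal fragment; in a real app, scoping exceptions to specific domains is preferable to allowing arbitrary loads):
<key>NSAppTransportSecurity</key>
<dict>
    <key>NSAllowsArbitraryLoads</key>
    <true/>
</dict>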

Related

iOS: Apply audio modifications to Music library content

I'm working on an iOS/Flutter application and am trying to work out whether it's possible to play audio from the Music library on iOS with audio modifications (e.g. equalization settings) applied.
It seems like I'm looking for a solution that works with MPMusicPlayerController, since that appears to be the way to play local audio from the user's iOS Music library. I can find examples of applying EQ to audio on iOS (e.g. using AVAudioUnitEQ and AVAudioEngine: SO link, tutorial), but I'm unable to find any resources that help me understand whether it's possible to bridge the gap between them.
Flutter specific context:
There are Flutter plugins that provide some of the functionality I'm looking for, but don't appear to work together. For example, the just_audio plugin has a robust set of features for modifying audio, but does not work with the local Music application on iOS/MPMusicPlayerController. Other plugins that do work with MPMusicPlayerController, like playify, do not have the ability to modify/transform the audio.
Even though I'm working with Flutter, any general advice on the iOS side would be very helpful. I appreciate any insight someone with more knowledge may be able to share with me!
Updating with my own answer here for future readers: it looks like my only path forward (for now) is leaning into AVAudioEngine directly. This is the rough POC that worked for me:
import AVFoundation
import MediaPlayer

var audioPlayer = AVAudioPlayerNode()
var audioEngine = AVAudioEngine()
var eq = AVAudioUnitEQ()

// Grab a song from the user's Music library.
let mediaItemCollection: [MPMediaItem] = MPMediaQuery.songs().items!
let song = mediaItemCollection[0]

do {
    // Bridge the MPMediaItem into the AVAudioEngine world via its asset URL.
    let file = try AVAudioFile(forReading: song.assetURL!)

    audioEngine.attach(audioPlayer)
    audioEngine.attach(eq)
    audioEngine.connect(audioPlayer, to: eq, format: nil)
    audioEngine.connect(eq, to: audioEngine.outputNode, format: file.processingFormat)

    audioPlayer.scheduleFile(file, at: nil)
    try audioEngine.start()
    audioPlayer.play()
} catch {
    // handle errors
}
The trickiest part for me was working out how to bridge the "Music library/MPMediaItem" world to the "AVAudioEngine" world -- which turned out to be just AVAudioFile(forReading: song.assetURL!).

How can I save the last paused time in an AVPlayer, and seek to it in another storyboard?

I'm new to Xcode/Swift, so please forgive my inexperience. I have a ViewController where my users can listen to some audio playback. The playback is accessed like this when the user clicks a play button:
@IBAction func buttonClicked(_ sender: RoundButton) {
    self.clickedButton = sender
    guard let url = sender.url else {
        return
    }
    let player = AVPlayer(url: url)
    let controller = AVPlayerViewController()
    controller.player = player
    present(controller, animated: true) {
        player.play()
    }
}
I got this code from another StackOverflow question, so I don't completely understand it. My goal is to save the URL and the last played time so that the user can minimize the app, or navigate to a different screen, and then click a "continue listening" button which pulls up another AVPlayer with the last used URL. This "continue listening" AVPlayer will then seek to the last played time.
I know that I need to observe the first AVPlayer somehow, so that when it is paused, stopped, or put in the background, I save the currentTime to UserDefaults (I think?). I also need to save the URL, because there are many different URLs the user could click on.
I tried doing this, and besides not being able to figure out the observation, I also couldn't figure out the type inconsistencies with UserDefaults. I tried to retrieve the URL's default value as a String after setting it, but when I went to convert the String to a URL using URL(string: lastPlayedURL), Xcode complained "Cannot convert type Data? to expected type String".
My issue with using other StackOverflow questions to solve my problem is that I don't understand where to put the code blocks. Where do I create the observer? Xcode did not seem happy when I created it inside the body of "buttonClicked".
Thank you for listening to my rambling.
Yes, UserDefaults seems appropriate for storing the URL. Use this method to do that.
To observe the player you need to use an AVPlayerItem. Here's some code that shows that.
About your general issues with Xcode (note the capitalisation) and the Swift language: I'm afraid these are things you need to work through yourself by reading/watching tutorials and documentation. Then, when you run into specific issues, post your code here and ask.
Good luck and have fun!
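For illustration, a minimal sketch of the save/restore flow (the helper functions and the "lastPlayedURL"/"lastPlayedTime" key names are just made up for this example):
import AVFoundation

// Save the URL and current position, e.g. whenever the player is paused or dismissed.
func savePlaybackState(of player: AVPlayer, url: URL) {
    UserDefaults.standard.set(url, forKey: "lastPlayedURL")
    UserDefaults.standard.set(player.currentTime().seconds, forKey: "lastPlayedTime")
}

// Later, in the "continue listening" screen, rebuild a player and seek back.
func makeContinuePlayer() -> AVPlayer? {
    guard let url = UserDefaults.standard.url(forKey: "lastPlayedURL") else { return nil }
    let player = AVPlayer(url: url)
    let seconds = UserDefaults.standard.double(forKey: "lastPlayedTime")
    player.seek(to: CMTime(seconds: seconds, preferredTimescale: 600))
    return player
}
To notice pauses, keep a reference to the player in a property of your view controller (not a local inside buttonClicked, or it will be deallocated when the action returns) and, for example, observe its timeControlStatus (iOS 10+) or simply save the state in viewWillDisappear.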

Streaming audio with AVPlayer, long buffering - playImmediately(atRate:) doesn't work

I'm streaming audio in my iOS Swift app.
The main issue is that AVPlayer has to load the entire file before starting playback.
Using playImmediately(atRate:) doesn't work, because isPlaybackBufferEmpty stays true until the file is completely downloaded, which is a problem for long audio files.
Any ideas?
Not really an AVPlayer-related answer, but you could use VLCKit to handle the stream.
Here is a basic sample in Swift:
let mediaPlayer = VLCMediaPlayer()
// replace streamURL by the url of the stream
mediaPlayer.media = VLCMedia(url: streamURL)
// outputView is the view where you want to display the stream
mediaPlayer.drawable = outputView
mediaPlayer.play()
If you have any issue with VLCKit, feel free to ping me!
For iOS 10 and later I set:
avplayer.automaticallyWaitsToMinimizeStalling = false
and that seemed to fix it for me. This could have other consequences, but I haven't hit them yet.
I got the idea for it from:
AVPlayer stops playing video after buffering
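Putting the two ideas together, a rough sketch (assuming streamURL is your remote audio URL):
import AVFoundation

let player = AVPlayer(url: streamURL)
if #available(iOS 10.0, *) {
    // Don't wait for a large buffer before starting playback.
    player.automaticallyWaitsToMinimizeStalling = false
    player.playImmediately(atRate: 1.0)
} else {
    player.play()
}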

How to customize the WebRTC video source?

Does anyone know how to change the WebRTC (https://cocoapods.org/pods/libjingle_peerconnection) video source?
I am working on a screen-sharing app.
At the moment, I retrieve the rendered frames in real time as CVPixelBuffers. Does anyone know how I could feed my frames in as the video source?
Is it possible to set another video source instead of the camera device source? If yes, what format does the video have to be in, and how do I do it?
Thanks.
let connectionFactory: RTCPeerConnectionFactory = RTCPeerConnectionFactory()
let videoSource: RTCVideoSource = connectionFactory.videoSource()
videoSource.capturer(videoCapturer, didCapture: videoFrame!)
Mounis' answer is wrong. It leads to nothing, at least not at the time of this writing; there is simply nothing happening.
In fact, you would need to satisfy this delegate
- (void)capturer:(RTCVideoCapturer *)capturer didCaptureVideoFrame:(RTCVideoFrame *)frame;
(Note the difference to the Swift version: didCapture vs. didCaptureVideoFrame)
Since this delegate is, for unclear reasons, not available at the Swift level (the compiler says you have to use didCapture, since it was renamed from didCaptureVideoFrame in Swift 3), you have to put the code into an Obj-C class. I copied this and this (which is part of this sample project) into my project, made my videoCapturer an instance of ARDBroadcastSampleHandler
self.videoCapturer = ARDExternalSampleCapturer(delegate: videoSource)
and within the capture callback I call it:
let capturer = self.videoCapturer as? ARDExternalSampleCapturer
capturer?.didCapture(sampleBuffer)

Amazon Polly not playing multiple text inputs

I have integrated Amazon Polly into one of my Swift projects and am asking it to synthesize multiple text strings. I am using the code below to play the sound:
builder.continueOnSuccessWith { (awsTask: AWSTask<NSURL>) -> Any? in
    // The result of the getPresignedURL task is an NSURL.
    // Again, we ignore the errors in the example.
    let url = awsTask.result!

    // Try playing the data using the player (an AVPlayer here).
    self.audioPlayer.replaceCurrentItem(with: AVPlayerItem(url: url as URL))
    self.audioPlayer.play()
    return nil
}
While debugging I found that replaceCurrentItem(with:) swaps in a new item to play and discards the previous one. I would like some suggestions on how Polly handles such multiple calls within its framework.
Any help appreciated! Thanks
I was able to make this work by inserting each item into an AVQueuePlayer and playing at the end, but I am keen to know how Amazon handles playing multiple files in Polly.
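For reference, a minimal sketch of the AVQueuePlayer approach described above (assuming presignedURLs already holds the URLs returned by the getPresignedURL tasks):
import AVFoundation

// Queue one AVPlayerItem per synthesized clip and play them back to back.
let items = presignedURLs.map { AVPlayerItem(url: $0) }
let queuePlayer = AVQueuePlayer(items: items)
queuePlayer.play()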
