Using multiple VLC players in the same UIView causes lag - iOS

I'm developing an iOS app using Swift, and in this app I want to monitor several IP cameras in the same view.
I created two VLCMediaPlayer instances and fed them two different RTSP links, and both of them are extremely laggy. I've also changed "network-caching" to 10000 for both players. If I use just one VLCMediaPlayer in that view, the streaming is fine.
I'm wondering whether this is the right way to display multiple VLC players, or should I use another approach or another media player?
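For reference, a minimal sketch of the setup described above, assuming MobileVLCKit's VLCMediaPlayer and VLCMedia APIs; the RTSP URLs, class name, and side-by-side layout are placeholders, not part of the original question.

    import UIKit
    import MobileVLCKit  // assumes the MobileVLCKit dependency is linked

    final class CameraWallViewController: UIViewController {

        // One player per camera feed (the RTSP URLs below are placeholders).
        private let leftPlayer = VLCMediaPlayer()
        private let rightPlayer = VLCMediaPlayer()
        private let leftVideoView = UIView()
        private let rightVideoView = UIView()

        override func viewDidLoad() {
            super.viewDidLoad()
            let half = view.bounds.width / 2
            leftVideoView.frame = CGRect(x: 0, y: 0, width: half, height: view.bounds.height)
            rightVideoView.frame = CGRect(x: half, y: 0, width: half, height: view.bounds.height)
            view.addSubview(leftVideoView)
            view.addSubview(rightVideoView)

            configure(leftPlayer, in: leftVideoView, urlString: "rtsp://192.168.1.10/stream1")
            configure(rightPlayer, in: rightVideoView, urlString: "rtsp://192.168.1.11/stream1")
        }

        private func configure(_ player: VLCMediaPlayer, in videoView: UIView, urlString: String) {
            guard let url = URL(string: urlString) else { return }
            let media = VLCMedia(url: url)
            // Per-stream cache in milliseconds; a larger value trades latency for smoothness.
            media.addOption(":network-caching=10000")
            player.media = media
            player.drawable = videoView   // the UIView the player renders into
            player.play()
        }
    }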

Related

I want to design a screen which will show a different number of videos on the same screen based on certain conditions

This is an iOS problem.
I want to design a screen which shows a video full screen. After some time, based on a backend condition, if another video is available I have to show it by splitting the screen into two vertical halves. After some more time, if one more video is available, I have to split the screen again and show the third video horizontally at the bottom of the screen.
I am new to iOS and I am not able to manage the screen split at runtime based on the backend condition. Please help me in this regard.
Using AVPlayer, it is possible to play multiple videos in a view. You can use Apple's AVPlayer:
An AVPlayer is a controller object used to manage the playback and timing of a media asset. You can use an AVPlayer to play local and remote file-based media, such as QuickTime movies and MP3 audio files, as well as audiovisual media served using HTTP Live Streaming.
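A minimal sketch of that idea, using one AVPlayer plus one AVPlayerLayer per video and laying the layers out by frame; the class name and the exact split layout are assumptions based on the question, not a definitive implementation.

    import UIKit
    import AVFoundation

    final class SplitVideoViewController: UIViewController {
        private var players: [AVPlayer] = []
        private var playerLayers: [AVPlayerLayer] = []

        // Rebuilds the layout whenever the backend reports a new list of videos.
        func show(videoURLs: [URL]) {
            playerLayers.forEach { $0.removeFromSuperlayer() }
            players = videoURLs.map { AVPlayer(url: $0) }
            playerLayers = players.map { player in
                let layer = AVPlayerLayer(player: player)
                layer.videoGravity = .resizeAspectFill
                view.layer.addSublayer(layer)
                return layer
            }
            layoutPlayerLayers()
            players.forEach { $0.play() }
        }

        override func viewDidLayoutSubviews() {
            super.viewDidLayoutSubviews()
            layoutPlayerLayers()
        }

        private func layoutPlayerLayers() {
            let bounds = view.bounds
            switch playerLayers.count {
            case 1:
                playerLayers[0].frame = bounds
            case 2:
                // Two vertical halves, side by side.
                let half = CGRect(x: 0, y: 0, width: bounds.width / 2, height: bounds.height)
                playerLayers[0].frame = half
                playerLayers[1].frame = half.offsetBy(dx: bounds.width / 2, dy: 0)
            case 3:
                // Two on top, third across the bottom.
                let topHalf = CGRect(x: 0, y: 0, width: bounds.width / 2, height: bounds.height / 2)
                playerLayers[0].frame = topHalf
                playerLayers[1].frame = topHalf.offsetBy(dx: bounds.width / 2, dy: 0)
                playerLayers[2].frame = CGRect(x: 0, y: bounds.height / 2,
                                               width: bounds.width, height: bounds.height / 2)
            default:
                break
            }
        }
    }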

AVPlayerLayer vs AVPlayerViewController

I am new to the iOS app development field. I am trying to capture video using AVFoundation, and I am successful in this. But when I tried to play the video back using MPMoviePlayerController, I got too many issues. So I am trying to play it using AVPlayer.
But with AVPlayer there are two approaches: AVPlayerLayer and AVPlayerViewController.
I tried searching about them, but I didn't find any particular reason to choose one over the other.
Can anyone suggest which is better to use?
AVPlayerViewController is an all-in-one solution. You set up your AVPlayer with a video and present the player controller. It handles all the playing and has its own controls baked in (I'm sure you've seen this in other apps). It is the simplest way to show a video.
AVPlayerLayer is for when you want to add some customization, like adding your own controls or extra views, or not making the video full screen.
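For illustration, a rough sketch of both routes (the video URL and class name are placeholders): AVPlayerViewController presents full-screen playback with stock controls, while AVPlayerLayer lets you embed the video anywhere and supply your own controls.

    import UIKit
    import AVKit          // AVPlayerViewController
    import AVFoundation   // AVPlayer, AVPlayerLayer

    final class PlaybackViewController: UIViewController {
        private let videoURL = URL(string: "https://example.com/movie.mp4")!  // placeholder

        // Option 1: AVPlayerViewController - full-screen playback with stock controls.
        func playWithPlayerViewController() {
            let player = AVPlayer(url: videoURL)
            let controller = AVPlayerViewController()
            controller.player = player
            present(controller, animated: true) {
                player.play()
            }
        }

        // Option 2: AVPlayerLayer - embed the video in any part of the view
        // hierarchy and supply your own controls.
        private var inlinePlayer: AVPlayer?

        func playWithPlayerLayer() {
            let player = AVPlayer(url: videoURL)
            let layer = AVPlayerLayer(player: player)
            layer.frame = CGRect(x: 0, y: 100, width: view.bounds.width, height: 200)
            layer.videoGravity = .resizeAspect
            view.layer.addSublayer(layer)
            inlinePlayer = player   // keep a strong reference so playback continues
            player.play()
        }
    }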

How can we overcome the missing muted/volume property on HTML5 video on UIWebView/iOS Safari?

As many hybrid app developers know, Apple has decided to disallow setting the volume property of HTML5 video elements in JavaScript. This also applies to the muted property. The concept of muted videos which autoplay when scrolled into view, with the option of unmuting on tap, is growing increasingly popular (pioneered by Vine, Facebook, etc.). I'm trying to find a way around this limitation in my design. From what I've been able to read on the subject, there isn't any hack or solution that solves this design requirement of mine.
Here are my thoughts so far:
I could split the audio from the video into a separate stream, sync its current time with the video, and call play() when the user taps. However, iOS Safari/UIWebView does not support simultaneous audio/video streams, so this is simply not an option.
I could encode two videos, one with sound and one without, and then swap the src on tap. However, this requires reloading the entire stream and also nearly doubles the amount of data required. The latency is noticeable, so this won't be a viable solution.
I could embed a native AVPlayer element in the webview. However, this would be an overlay and would not be manageable from within the webview; custom controls and UI interaction from within the DOM would not be possible. Thus, this is not an option.
I could simply disable the audio output of the app and dynamically switch it on whenever the user taps a video element. However, to my knowledge this is not possible. I could show the native software volume slider, but that would defeat the purpose of this whole thing.
Do you have any suggestions or ways around this limitation?
I managed to find an acceptable solution. I split the videos into three files: one without audio, one without video, and one with both video and audio for desktop browsers/Android.
It seems that running simultaneous streams CAN work as long as they don't conflict with each other, which basically means a separate audio track and a video with no audio channels play just fine in unison.

AVComposition breaks on AirPlay

I have a video composition which I'd like to play over AirPlay (without mirroring). The app works as expected when using normal AirPlay mirroring, but I'd like to get the speed, reliability, and resolution bump you get from using AirPlay video instead.
The problem is that when I set
player.usesAirPlayVideoWhileAirPlayScreenIsActive = YES;
...the player goes blank.
Notes:
Since I don't create separate windows for each display, they are both trying to use the same AVPlayer.
My AVVideoComposition contains different files and adds opacity ramps between them.
This unanswered question suggests that the problem is more likely due to the fact that I'm playing an AVComposition than the use of a shared player: AVComposition doesn't play via Airplay Video
Two questions:
Do I have to get rid of the player on the iPad?
Can an AVVideoComposition ever be played over AirPlay?
I can't make comments, so I had to post this as an answer although it might not fully answer the questions.
I had a similar issue, and in the end I found out that when AVPlayer plays an AVComposition it simply doesn't display anything on the external display. That's why I had to do it myself by listening to the UIScreen connection notifications.
I have to say that it all works pretty well. I first check whether there is more than one screen, and if there is I simply move the AVPlayer to that screen while displaying a simple message on the device's screen that the content is playing on... plus the name of the AirPlay device. This way I can put whatever I want on the external display, and it is not very complicated. I do the same thing when I receive UIScreenDidConnectNotification.
That was fine until I noticed that the composition plays really choppily on the external display, even if it consists of only one video without any complex edits or overlays. The same video plays perfectly if I save it to the Camera Roll or if I use MPMoviePlayerController.
I've tried many things like lowering the resolution, lowering the renderScale and so on, but with no success.
One thing that bothers me more is how Apple actually does this in iMovie - if you have AirPlay enabled and you play a project (note that it's still not rendered, so it must be using a composition to display it), right after tapping the play button it opens a player that plays the content really smoothly on the external monitor. If, however, you activate AirPlay from within the player, it closes and starts rendering the project. After that it plays it, I think by using MPMoviePlayerController.
I'm still trying to find a solution and will post back if I have any success.
So for the two questions:
I don't see why you would have to get rid of it.
Yes, it can be played, but with a different technique and, obviously, with some issues.
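For reference, a rough sketch of the UIScreen-notification approach described above; the class name is a placeholder, and the "content is playing on..." message shown on the device's screen is left out.

    import UIKit
    import AVFoundation

    final class ExternalPlaybackController: NSObject {
        private let player: AVPlayer
        private var externalWindow: UIWindow?

        init(player: AVPlayer) {
            self.player = player
            super.init()
            NotificationCenter.default.addObserver(self,
                selector: #selector(screenDidConnect(_:)),
                name: UIScreen.didConnectNotification, object: nil)
            NotificationCenter.default.addObserver(self,
                selector: #selector(screenDidDisconnect(_:)),
                name: UIScreen.didDisconnectNotification, object: nil)
            // An external screen may already be attached at launch.
            if UIScreen.screens.count > 1 {
                moveToExternalScreen(UIScreen.screens[1])
            }
        }

        deinit {
            NotificationCenter.default.removeObserver(self)
        }

        @objc private func screenDidConnect(_ note: Notification) {
            guard let screen = note.object as? UIScreen else { return }
            moveToExternalScreen(screen)
        }

        @objc private func screenDidDisconnect(_ note: Notification) {
            externalWindow = nil  // dropping the window also drops its player layer
        }

        private func moveToExternalScreen(_ screen: UIScreen) {
            let window = UIWindow(frame: screen.bounds)
            window.screen = screen
            let host = UIViewController()
            let playerLayer = AVPlayerLayer(player: player)
            playerLayer.frame = window.bounds
            host.view.layer.addSublayer(playerLayer)
            window.rootViewController = host
            window.isHidden = false
            externalWindow = window
        }
    }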
In the app's .plist, create a new item called:
Required background modes
Add a new array element called:
App plays audio or streams audio/video using AirPlay
Not sure if you have already tried this, but you don't mention it in your post.
Cheers!

Play a list of media (video/audio) items on iOS

I'm developing an iPhone app and I need to implement a player to play a list of media (video/audio) items.
Following is a screenshot of a player I captured from another iPhone app; I found many apps playing video/audio with a player that has a similar UI to this.
Can this player (including the UI) be implemented with official APIs, or does it have to be implemented manually?
How can I implement a player like this?
To play audio, follow this blog.
And to play video, you can either use a custom movie player or go with a UIWebView with embedded HTML.
Movie Player sample code from here.
To play video as embedded HTML in a UIWebView, check this:
https://stackoverflow.com/a/7551375/790794
Depending on the file type, you can modify the source type.
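As a rough sketch of the embedded-HTML route, here is how a video file could be loaded into a UIWebView; the video URL and MIME type are placeholders, and note that UIWebView has since been deprecated in favor of WKWebView.

    import UIKit

    final class VideoWebViewController: UIViewController {
        private let webView = UIWebView()  // deprecated; WKWebView is the modern replacement

        override func viewDidLoad() {
            super.viewDidLoad()
            webView.frame = view.bounds
            webView.allowsInlineMediaPlayback = true
            view.addSubview(webView)

            // The video URL is a placeholder; adjust the source "type" to match the file.
            let html = """
            <html><body style="margin:0">
            <video width="100%" height="100%" controls>
              <source src="https://example.com/sample.mp4" type="video/mp4">
            </video>
            </body></html>
            """
            webView.loadHTMLString(html, baseURL: nil)
        }
    }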
