Applications such as Netflix can play video out of the dock connector when the video connector is hooked up. Can applications put arbitrary data on video out, or is the video player the only component on iOS that can do so? If the former is possible, how can my app access video output?
To draw to video out, place a window on the screen returned by [[UIScreen screens] objectAtIndex:1]. Assign a window you have already created to the screen "theScreen" by setting window.screen = theScreen.
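For illustration, here is a minimal Swift sketch of that answer. The global externalWindow variable and the empty view controller are just placeholders; the important parts are the screens array and the screen assignment.

import UIKit

// Minimal sketch of the answer above in Swift. Keep a strong reference to the
// window, or it will be deallocated and nothing will appear on the external screen.
var externalWindow: UIWindow?

func presentOnExternalScreen() {
    let screens = UIScreen.screens
    guard screens.count > 1 else { return }      // no external display attached

    let screen = screens[1]
    let window = UIWindow(frame: screen.bounds)
    window.screen = screen                       // same as window.screen = theScreen above

    let controller = UIViewController()
    controller.view.backgroundColor = .black     // draw anything you like here, not just video
    window.rootViewController = controller
    window.isHidden = false

    externalWindow = window
}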
A client's app displays two versions of a video: one regular, one 360° view. The regular video is handled by AVPlayer. The 360° video is rendered by the open-source package Swifty360Player (which works very well, by the way). The client would like the app to be able to display either video on a big screen using AirPlay.
For the normal video, this is no problem. The 360 Video, however, is produced by a SceneKit view, so it's technically more akin to a 3D game than a video. I know that we can display game scenes on an AirPlay device if/when the user manually mirrors his iPhone/iPad to the AirPlay screen.
But I wonder, is there any way to generate a live video stream from the SceneKit view, and then transmit that video stream to the AirPlay device in real time? My wish is that the user could then use an AVRoutePickerView to select an AirPlay device from within the app.
ReplayKit does this for streaming services like Twitch but this app isn't looking to broadcast, just to share video with a single screen in the same room.
Is there any way to accomplish this?
This is an iOS problem.
I want to design a screen that shows a video full screen. After some time, based on a backend condition, if another video is available I have to show it by splitting the screen into two vertical halves. After some more time, if a third video is available, I have to split the screen again and show the third video horizontally at the bottom of the screen.
I am new to iOS and I am not able to manage the screen split at runtime based on the backend condition. Please help me in this regard.
It is possible to play multiple videos in a view using Apple's AVPlayer.
An AVPlayer is a controller object used to manage the playback and timing of a media asset. You can use an AVPlayer to play local and remote file-based media, such as QuickTime movies and MP3 audio files, as well as audiovisual media served using HTTP Live Streaming.
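For the split layout described in the question, a rough sketch of this idea might look like the following: each video gets its own AVPlayer and AVPlayerLayer, and the frames are recomputed whenever the backend reports another video. The class name and the layout logic are placeholders, not a fixed recipe.

import UIKit
import AVFoundation

// One AVPlayer/AVPlayerLayer per video; call add(videoURL:) each time the
// backend says a new video is available, and the layout adjusts.
final class SplitVideoViewController: UIViewController {
    private var players: [AVPlayer] = []
    private var playerLayers: [AVPlayerLayer] = []

    func add(videoURL: URL) {
        let player = AVPlayer(url: videoURL)
        let layer = AVPlayerLayer(player: player)
        layer.videoGravity = .resizeAspectFill
        view.layer.addSublayer(layer)
        players.append(player)
        playerLayers.append(layer)
        player.play()
        layoutPlayers()
    }

    private func layoutPlayers() {
        let bounds = view.bounds
        switch playerLayers.count {
        case 1:                                   // one video: full screen
            playerLayers[0].frame = bounds
        case 2:                                   // two videos: vertical halves
            let half = bounds.width / 2
            playerLayers[0].frame = CGRect(x: 0, y: 0, width: half, height: bounds.height)
            playerLayers[1].frame = CGRect(x: half, y: 0, width: half, height: bounds.height)
        case 3:                                   // third video: horizontal strip at the bottom
            let half = bounds.width / 2
            let topHeight = bounds.height / 2
            playerLayers[0].frame = CGRect(x: 0, y: 0, width: half, height: topHeight)
            playerLayers[1].frame = CGRect(x: half, y: 0, width: half, height: topHeight)
            playerLayers[2].frame = CGRect(x: 0, y: topHeight, width: bounds.width, height: bounds.height - topHeight)
        default:
            break
        }
    }

    override func viewDidLayoutSubviews() {
        super.viewDidLayoutSubviews()
        layoutPlayers()
    }
}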
Is it possible to stream the content of a UIView as a direct video stream in Swift? I am not really looking for a "view screenshotting" approach followed by assembling a video; that solution is possible, but the frame rate is far from ideal.
Update: maybe by using an OpenGL view?
1. View screenshots: What is your current timing mechanism?
I believe that if you use CADisplayLink you can get a better frame rate. In my project I get roughly 15–20 fps live-streaming a full-screen video view on an iPhone 7 Plus.
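As an illustration of that screenshot-per-frame approach, here is a small sketch driven by CADisplayLink. The ViewCapturer class and the handleFrame callback are made up for this example; what you do with each captured image (encode it, send it over the network) is up to you.

import UIKit

// Captures snapshots of a view on every display-link tick.
final class ViewCapturer: NSObject {
    private let sourceView: UIView
    private var displayLink: CADisplayLink?
    var handleFrame: ((UIImage) -> Void)?

    init(sourceView: UIView) {
        self.sourceView = sourceView
        super.init()
    }

    func start() {
        let link = CADisplayLink(target: self, selector: #selector(captureFrame))
        link.preferredFramesPerSecond = 20     // roughly the rate mentioned above
        link.add(to: .main, forMode: .common)
        displayLink = link
    }

    func stop() {
        displayLink?.invalidate()
        displayLink = nil
    }

    @objc private func captureFrame() {
        // Render the current view hierarchy into an image ("view screenshotting").
        let renderer = UIGraphicsImageRenderer(bounds: sourceView.bounds)
        let image = renderer.image { _ in
            _ = sourceView.drawHierarchy(in: sourceView.bounds, afterScreenUpdates: false)
        }
        handleFrame?(image)
    }
}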
2. Using ReplayKit: I don't think I need to rewrite the introduction, because Apple's docs are clear enough:
Record or stream video from the screen, and audio from the app and microphone. Using the ReplayKit framework, users can record video from the screen, and audio from the app and microphone. They can then share their recordings with other users through email, messages, and social media. You can build app extensions for live broadcasting your content to sharing services. ReplayKit is incompatible with AVPlayer content.
The frame rate is considerably higher than drawing screenshots of views, but currently ReplayKit only supports capturing the whole screen.
So if you want to capture just one view, consider this approach: crop the buffer of each output CMSampleBufferRef frame down to that view's region.
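A rough sketch of that idea using RPScreenRecorder's in-app capture API might look like this. The cropRect parameter (the target view's region in the screen's pixel coordinate space) is an assumption you would have to compute yourself, and encoding or forwarding the cropped frames is left out.

import ReplayKit
import CoreMedia
import CoreImage

// Capture the whole screen with ReplayKit, then crop each video sample
// buffer down to the region of interest.
func startCroppedCapture(cropRect: CGRect) {
    let recorder = RPScreenRecorder.shared()
    recorder.isMicrophoneEnabled = false

    recorder.startCapture(handler: { sampleBuffer, bufferType, error in
        guard error == nil, bufferType == .video,
              let pixelBuffer = CMSampleBufferGetImageBuffer(sampleBuffer) else { return }

        // Crop the full-screen frame down to the region we care about.
        let fullFrame = CIImage(cvPixelBuffer: pixelBuffer)
        let cropped = fullFrame.cropped(to: cropRect)

        // Hand `cropped` to an encoder / writer of your choice.
        _ = cropped
    }, completionHandler: { error in
        if let error = error {
            print("Could not start capture: \(error)")
        }
    })
}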
Edit: if it's about mirroring a view to an external screen, there are other solutions besides ReplayKit or view screenshots.
I have created a web page that auto plays a full screen background video.
On top of the video is a div that contains text & a link to an external site - this works fine in all desktop browsers.
How do I recreate the same setup to play on mobile devices? Would I need to use JavaScript in order to achieve this?
I have spent many hours trawling google for a definitive answer and am now very confused.
Thanks in advance.
Autoplay for HTML5 video is not allowed on mobile platforms such as iOS and Android. You can read this for the whys and hows on iOS.
On iPhone the video plays in the default (fullscreen) QuickTime player, so there is no real background notion (this could be accomplished in a native app, where inline playback of video is allowed, but not in Safari/the web browser). You would need to stick to an image, I guess.
On the iPad or on Android, to accomplish what you want you will need to bind your video tag to a touch event/button (for example, when a user touches to enter your site) and, on that event, initiate the play sequence for the video (in your case, with the video set to occupy the full width and height of the viewport).
So I have this player that I am using to play video locally. I need to be able to AirPlay-mirror the screen to a TV, but every time I set AirPlay to mirror, the player just sends the whole video to the external screen instead of mirroring.
https://github.com/NOUSguide/NGMoviePlayer
I contacted the developers about how to disable the AirPlay part of it. They told me to remove the AirPlay layer from the player. Can someone point me in the right direction to get this done?
If you want to disable AirPlay, just set the allowsAirPlay property to false on your movie player controller object.
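If the player in question is backed by an AVPlayer that you can reach (an assumption about NGMoviePlayer's internals), the equivalent AVFoundation properties look like this. With external playback turned off, AirPlay mirrors the screen instead of routing the whole video to the external display.

import AVFoundation

// Sketch of the same idea with the stock AVFoundation API; these two
// properties are the AVPlayer counterparts of allowsAirPlay.
func disableAirPlayVideoRouting(for player: AVPlayer) {
    player.allowsExternalPlayback = false
    player.usesExternalPlaybackWhileExternalScreenIsActive = false
}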
You can access the mirrored screen like this:
UIScreen * screen = [[UIScreen screens] objectAtIndex:1];
Before accessing index 1, make sure the screens property contains more than one object.
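In Swift, with that safety check built in, it might look like the following sketch. The notification observer is only there so you notice when a mirrored/external screen becomes available after launch; the print statement is a placeholder for whatever your app should do in that case.

import UIKit

// Return the mirrored/external screen only if one is actually attached.
func mirroredScreenIfAvailable() -> UIScreen? {
    let screens = UIScreen.screens
    return screens.count > 1 ? screens[1] : nil
}

// React when an external screen is connected (e.g. AirPlay mirroring starts).
let screenObserver = NotificationCenter.default.addObserver(
    forName: UIScreen.didConnectNotification,
    object: nil,
    queue: .main
) { notification in
    if let screen = notification.object as? UIScreen {
        print("External/mirrored screen connected: \(screen.bounds)")
    }
}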