Play video with alpha channel in iOS?

Is it possible to play a video with an alpha channel in your iPhone app?
I'm thinking of a UIView with a subview (the movie view) and playing a movie with an alpha channel in that view.
Is this possible?

Take a peek at the "APNG" app on the App Store. This app is a free demo that shows how animations with an alpha channel can be implemented in an iOS app using the .apng file format. You can also look at "h.264 with an alpha channel" for a detailed description of how a pair of h.264 videos can be used to implement video with an alpha channel; a sketch of the per-frame merge that approach requires follows.
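The two-video approach amounts to re-attaching an alpha channel at render time. Below is a minimal Core Image sketch of that per-frame merge, assuming a decoded color frame and a matching grayscale mask frame are already available; the function and variable names are placeholders, not part of any project mentioned above.

    import CoreImage

    // Sketch of the paired-h.264 idea: one clip carries the RGB color, a
    // second grayscale clip carries what would have been the alpha channel.
    // Frame extraction (e.g. via AVPlayerItemVideoOutput) is elided; this
    // shows only the per-frame merge.
    func mergeFrames(rgbFrame: CIImage, maskFrame: CIImage) -> CIImage? {
        // A fully transparent background the same size as the color frame.
        let clearBackground = CIImage(color: .clear).cropped(to: rgbFrame.extent)

        // CIBlendWithMask uses the mask's luminance to pick between the
        // input and the background, effectively re-attaching the alpha.
        let filter = CIFilter(name: "CIBlendWithMask")
        filter?.setValue(rgbFrame, forKey: kCIInputImageKey)
        filter?.setValue(clearBackground, forKey: kCIInputBackgroundImageKey)
        filter?.setValue(maskFrame, forKey: kCIInputMaskImageKey)
        return filter?.outputImage
    }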

It looks like there's no way to do this with iOS-provided solutions. But according to this answer to a similar question, you might succeed with ffmpeg. The problem with ffmpeg is that the GPL/LGPL licenses are incompatible with Apple's terms, so you can't use it in an app for the App Store.

Related

Video quality selection in iOS

I am working on a video player app that streams video from an m3u8 URL.
Due to Apple limitations, I can't set a specific video quality at startup or during playback. I'm curious because I can see this feature supported in the YouTube iOS app.
Apple provides preferredPeakBitRate, but it doesn't guarantee that the video will actually play at that bitrate.
How is YouTube able to select, and then maintain, a specific video quality on iOS?
Just let me know if I am doing anything wrong or missing anything. Thanks in advance.
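For what it's worth, preferredPeakBitRate is only an upper bound; AVPlayer's adaptive logic still chooses the variant. A minimal sketch of using it (the URL is a placeholder):

    import AVFoundation

    // preferredPeakBitRate is in bits per second and caps, but does not
    // pin, the HLS variant AVPlayer selects.
    let url = URL(string: "https://example.com/stream/master.m3u8")!
    let item = AVPlayerItem(url: url)
    item.preferredPeakBitRate = 2_000_000 // ask AVPlayer to stay at or below ~2 Mbps
    let player = AVPlayer(playerItem: item)
    player.play()

Apps that offer a hard quality menu commonly serve a separate playlist per quality and swap the current item (e.g. player.replaceCurrentItem(with: AVPlayerItem(url: url720p)), where url720p is a hypothetical per-quality playlist) rather than relying on the cap; whether YouTube does exactly this is speculation.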

File Format for Saving Video with alpha channel in iOS

I am using AVFoundation to create a video and have added an effect that clips the video so it has a transparent background. What file format should I save this as to preserve the transparency in my iOS app?
AVAnimator is a library with which you can display video with an alpha channel on iOS; it is, however, not free to use in commercial products.
I don't think it's natively possible.
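These answers reflect the state of iOS at the time they were written. For reference, here is a hedged sketch assuming a deployment target of iOS 13 or later, where AVFoundation added AVVideoCodecType.hevcWithAlpha for storing transparency natively in a .mov container; the output path and dimensions are placeholders:

    import AVFoundation

    // Sketch: configure an AVAssetWriter to encode HEVC with alpha (iOS 13+).
    func makeAlphaVideoWriter() throws -> (AVAssetWriter, AVAssetWriterInput) {
        let outputURL = URL(fileURLWithPath: NSTemporaryDirectory() + "alpha.mov")
        let settings: [String: Any] = [
            AVVideoCodecKey: AVVideoCodecType.hevcWithAlpha,
            AVVideoWidthKey: 1280,
            AVVideoHeightKey: 720
        ]
        let writer = try AVAssetWriter(outputURL: outputURL, fileType: .mov)
        let input = AVAssetWriterInput(mediaType: .video, outputSettings: settings)
        if writer.canAdd(input) { writer.add(input) }
        // Append BGRA pixel buffers (kCVPixelFormatType_32BGRA) through an
        // AVAssetWriterInputPixelBufferAdaptor so the alpha values survive encoding.
        return (writer, input)
    }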

iOS 7+: Is it possible to capture video from the front camera while showing another video on the screen?

I have a task.
There is an iOS device, and there is an app I should create.
The app shows a video file (a local file on the device) while the front camera captures the user's face.
Showing the video and capturing the user's face happen simultaneously.
I see that FaceTime and Skype for iOS can do this. But the former is made by Apple (they can do whatever they like on their own devices), while the latter is owned by Microsoft (big companies with big money are sometimes allowed more than ordinary developers).
Moreover, I have doubts about whether video capture can coexist with video playback at all.
So I am not sure this task is 100% implementable and publishable.
Is it possible on iOS 7+?
Is it allowed by Apple (there are many technical possibilities on iOS, but only some of them are acceptable to Apple, especially during the review process)?
Are there good technical references?
I believe so. A search on the App Store shows a number of video-conferencing apps that do exactly this:
Zoom cloud
Polycom
VidyoMobile
Fuze
Just search for "video conferencing". A minimal sketch of running capture and playback side by side follows.
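As a proof of concept, capture and playback can run in the same view controller. This sketch uses the modern capture API (iOS 10+) rather than the iOS 7-era equivalents; the file name "clip.mp4" is a placeholder, and camera-permission handling is elided:

    import AVFoundation
    import UIKit

    final class MirrorPlaybackViewController: UIViewController {
        private let session = AVCaptureSession()
        private var player: AVPlayer?

        override func viewDidLoad() {
            super.viewDidLoad()

            // Front camera -> live preview layer.
            if let camera = AVCaptureDevice.default(.builtInWideAngleCamera,
                                                    for: .video, position: .front),
               let input = try? AVCaptureDeviceInput(device: camera),
               session.canAddInput(input) {
                session.addInput(input)
            }
            let preview = AVCaptureVideoPreviewLayer(session: session)
            preview.frame = CGRect(x: 0, y: 0, width: 160, height: 240)
            view.layer.addSublayer(preview)
            session.startRunning() // in production, start on a background queue

            // Local file -> AVPlayerLayer, playing at the same time.
            if let url = Bundle.main.url(forResource: "clip", withExtension: "mp4") {
                let player = AVPlayer(url: url)
                let playerLayer = AVPlayerLayer(player: player)
                playerLayer.frame = view.bounds
                view.layer.insertSublayer(playerLayer, at: 0)
                player.play()
                self.player = player
            }
        }
    }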

iOS: overlaying alpha channel video on another video

I have been trying to create a video template that overlays alpha channel video on mp4 videos and images.
This is the kind of video I need to create: http://viewptch.ptchcdn.com/rendered/52b28a9f8d4f980f3a3f99c3_cb44bf2b/52b28a9f8d4f980f3a3f99c3_lrg_main_main.mov
For overlaying alpha video on other videos I have used AVAnimator, and I succeeded in playing a preview using AVFoundation, AVSynchronizedLayer and AVAnimator.
When rendering the video from the composition, however, the frames of the alpha channel videos render very slowly.
I need to create a video with an alpha channel video on top of another video.
Can anyone suggest possible ways to render a video like the one linked above?
You mention that you have looked at AVAnimator; did you download the KittyBoom example project and try it out? The specifics of how it works are detailed in this post. One thing to note: when you build and run on the device, you need to turn Debug mode off, otherwise it will not execute quickly, because a number of extra checks are done in Debug mode. Also, make sure to test on an actual device; the Simulator is not a good measure of performance on real hardware. Performance is the key problem with video that contains an alpha channel, since iOS does not support alpha-channel video by default.

AVFoundation to overlay two video clips with alpha compositing in iOS 5?

I'm hoping to use iOS 5 AVFoundation, with or without OpenGL, to record video from the camera and overlay/merge another video clip on top using some form of alpha channel compositing / foreground matting.
A sample use case of the combined output might be a video of an animated character interacting with the user's recorded video clip from the iPhone/iPad camera.
Is this possible right now with iOS 5, or potentially with Brad Larson's GPUImage framework? Can the alpha channels of the two video sources be combined easily?
If anyone has any sample code they could share, or any guidance to offer, I'd be really appreciative.
The Apple AVEditDemo (plus the accompanying WWDC 2010 video) would be a good start. It doesn't show video overlays with alpha, but if you haven't worked with AVFoundation before, it's an excellent intro.
Here's another good walkthrough: video-composition-with-ios. For the alpha question specifically, see the sketch below.
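On the narrower question of whether the alpha channels can be combined easily: once each frame is available as a CIImage with valid alpha, the merge itself is a single source-over composite. A sketch of just that step (frame extraction, e.g. via AVPlayerItemVideoOutput, is elided, and the parameter names are placeholders):

    import CoreImage

    // Standard "source over" blending: the foreground frame's alpha decides
    // how much of the background frame shows through.
    func composite(foreground: CIImage, over background: CIImage) -> CIImage {
        return foreground.composited(over: background)
    }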
