I am working on an app that plays some short animated clips. At this point my only option is to embed the animated frames, and it works just fine. But I am wondering if there's a way to embed the animated files directly as a movie clip, in order to avoid importing tons of frames; it might also compress better.
In that regard I am aware of these alternatives: the MediaPlayer framework's MPMoviePlayerController, presented here http://mobile.tutsplus.com/tutorials/iphone/mediaplayer-framework_mpmovieplayercontroller_ios4/ and also some other techniques shown here http://www.raywenderlich.com/13418/how-to-play-record-edit-videos-in-ios
However, these would mean using Apple's stock UI. I would instead like to create a more integrated experience that uses my own app's UI.
Is there a library for that?
Take a look at the AVPlayer Class Reference:
https://developer.apple.com/library/mac/#documentation/AVFoundation/Reference/AVPlayer_Class/Reference/Reference.html
and the AVPlayerLayer Class Reference:
http://developer.apple.com/library/ios/#documentation/AVFoundation/Reference/AVPlayerLayer_Class/Reference/Reference.html
Together they can do what you need, and you can also animate the player layer just like any other CALayer.
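For illustration, here is a minimal sketch of embedding an AVPlayerLayer in your own view hierarchy; the bundled file name and the fade animation are just examples, not from the original answer:

```objc
#import <AVFoundation/AVFoundation.h>
#import <QuartzCore/QuartzCore.h>

// Inside your view controller, e.g. in viewDidLoad.
// "clip.mp4" is a hypothetical file in the app bundle.
NSURL *clipURL = [[NSBundle mainBundle] URLForResource:@"clip" withExtension:@"mp4"];
AVPlayer *player = [AVPlayer playerWithURL:clipURL];

// The player layer renders the video; add it to any view you like.
AVPlayerLayer *playerLayer = [AVPlayerLayer playerLayerWithPlayer:player];
playerLayer.frame = self.view.bounds;
playerLayer.videoGravity = AVLayerVideoGravityResizeAspect;
[self.view.layer addSublayer:playerLayer];

[player play];

// Because it is an ordinary CALayer, you can animate it, e.g. fade it in.
CABasicAnimation *fade = [CABasicAnimation animationWithKeyPath:@"opacity"];
fade.fromValue = @0.0;
fade.toValue = @1.0;
fade.duration = 0.5;
[playerLayer addAnimation:fade forKey:@"fadeIn"];
```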
Related
I am building a video editing app. I have an animation video and I am using AVPlayer to play it, and I need to add some label text to the video frame by frame.
Is it possible to add an overlay text label to the video frame by frame?
Yes, it is possible; you will have to use AVFoundation to do this. However, it can be complex with no previous knowledge of AVFoundation. The best thing I can do is point you to this tutorial and then this one (which is what you want to do, but written in Objective-C). These tutorials will explain the basics of how the framework works. You can also check Apple's videos about the framework.
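One common way to burn text onto a video with AVFoundation (a sketch, not necessarily exactly what those tutorials do) is AVVideoCompositionCoreAnimationTool, which composites Core Animation layers over the frames at export time. Here, `videoURL`, `outputURL`, and the overlay string are placeholders:

```objc
#import <AVFoundation/AVFoundation.h>
#import <QuartzCore/QuartzCore.h>

AVAsset *asset = [AVAsset assetWithURL:videoURL]; // your source clip
AVMutableVideoComposition *videoComposition =
    [AVMutableVideoComposition videoCompositionWithPropertiesOfAsset:asset];
CGSize renderSize = videoComposition.renderSize;

// The video layer is where the original frames are rendered;
// the parent layer lets you stack overlays on top of it.
CALayer *videoLayer = [CALayer layer];
videoLayer.frame = CGRectMake(0, 0, renderSize.width, renderSize.height);
CALayer *parentLayer = [CALayer layer];
parentLayer.frame = videoLayer.frame;
[parentLayer addSublayer:videoLayer];

// A text overlay; animate its string or opacity with Core Animation
// if the label needs to change over time.
CATextLayer *textLayer = [CATextLayer layer];
textLayer.string = @"My overlay";
textLayer.fontSize = 36;
textLayer.frame = CGRectMake(0, 20, renderSize.width, 50);
[parentLayer addSublayer:textLayer];

videoComposition.animationTool = [AVVideoCompositionCoreAnimationTool
    videoCompositionCoreAnimationToolWithPostProcessingAsVideoLayer:videoLayer
                                                            inLayer:parentLayer];

// Export the composited movie.
AVAssetExportSession *exporter =
    [[AVAssetExportSession alloc] initWithAsset:asset
                                     presetName:AVAssetExportPresetHighestQuality];
exporter.videoComposition = videoComposition;
exporter.outputURL = outputURL;                 // where to write the result
exporter.outputFileType = AVFileTypeQuickTimeMovie;
[exporter exportAsynchronouslyWithCompletionHandler:^{
    NSLog(@"Export finished with status %ld", (long)exporter.status);
}];
```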
I am new to the iOS app development field. I am trying to capture video using AVFoundation, and I am successful in that. But when I tried to play the video back using MPMoviePlayerController, I ran into too many issues, so I am trying to play it using AVPlayer.
But with AVPlayer there are two approaches: AVPlayerLayer and AVPlayerViewController.
I tried searching about these, but I didn't find any particular reason to choose one over the other.
Can anyone suggest which is better to use?
AVPlayerViewController is an all-in-one solution. You set up your AVPlayer with a video and present the player controller. It handles all the playback and has its own controls baked in (I'm sure you've seen this in other apps). It is the simplest way to show a video.
AVPlayerLayer is for when you want to add some customization, like adding your own controls or extra views, or not making the video full screen.
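In rough outline, the two approaches look like this; a sketch where `videoURL` and `videoContainerView` are placeholders:

```objc
#import <AVKit/AVKit.h>
#import <AVFoundation/AVFoundation.h>

// Option 1: AVPlayerViewController - Apple's player with built-in controls.
AVPlayerViewController *playerVC = [[AVPlayerViewController alloc] init];
playerVC.player = [AVPlayer playerWithURL:videoURL];
[self presentViewController:playerVC animated:YES completion:^{
    [playerVC.player play];
}];

// Option 2: AVPlayerLayer - just the video; you build your own controls around it.
AVPlayer *player = [AVPlayer playerWithURL:videoURL];
AVPlayerLayer *layer = [AVPlayerLayer playerLayerWithPlayer:player];
layer.frame = self.videoContainerView.bounds;   // any subview you control
[self.videoContainerView.layer addSublayer:layer];
[player play];
```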
I want to capture video of the screen.
I am unable to capture video from the AVPlayerLayer, but I can record other views.
Do I need to look into OpenGL ES code for this?
You can have a look at the blog; I think it covers what you want.
Note that the blog does not use AVPlayer; it uses AVAssetWriter instead.
Look into the comments there and you can get a hint about using AVPlayer.
Or you can look at this project, which is for screen recording.
These two libraries are the best options you can use to record video of your application's screen:
https://github.com/wess/Glimpse
https://everyplay.com/about
Yes, I'm working on the same task. It comes out as a black screen because of the high frameDuration of the video, but I've solved this.
Don't add the AVPlayer to that view. Add it to another view, note the time at which playback happens, then crop the video to the time range that played and finally merge it with the recorded video.
I've stated it briefly, but I hope you will understand.
You need to work with OpenGL ES to render the video layer; otherwise it will appear black.
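For what it's worth (this is my own aside, not part of the answer above), the usual way to get at the actual frames for your own rendering or recording is AVPlayerItemVideoOutput. A minimal sketch, assuming an existing `player` that is already playing:

```objc
#import <AVFoundation/AVFoundation.h>
#import <QuartzCore/QuartzCore.h>

// Attach a video output to the player item so frames can be read back as pixel buffers.
NSDictionary *attrs = @{ (id)kCVPixelBufferPixelFormatTypeKey : @(kCVPixelFormatType_32BGRA) };
AVPlayerItemVideoOutput *videoOutput =
    [[AVPlayerItemVideoOutput alloc] initWithPixelBufferAttributes:attrs];
[player.currentItem addOutput:videoOutput];

// Later, e.g. driven by a CADisplayLink, pull the frame for the current time.
CMTime itemTime = [videoOutput itemTimeForHostTime:CACurrentMediaTime()];
if ([videoOutput hasNewPixelBufferForItemTime:itemTime]) {
    CVPixelBufferRef pixelBuffer =
        [videoOutput copyPixelBufferForItemTime:itemTime itemTimeForDisplay:NULL];
    if (pixelBuffer) {
        // Draw the buffer (e.g. as an OpenGL ES texture), or append it
        // to an AVAssetWriterInputPixelBufferAdaptor for recording.
        CVBufferRelease(pixelBuffer);
    }
}
```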
I would suggest using the GPUImage framework, for a number of reasons:
* It IS OpenGL, but wrapped in Objective-C, so there's no need to learn OpenGL to use it.
* It is 100% App Store vetted - I have an app in the store that uses it now, as do dozens of others.
* There is a class named GPUImageMovie that wraps AVAssetReader for playback and another named GPUImageMovieWriter that lets you write textures to file - these can be the same file, or separate (see the sketch below).
* There are quite a few examples available within the repository that should be pretty easy to understand.
Source Code / Git Repo
* https://github.com/BradLarson/GPUImage
Blog
* http://www.sunsetlakesoftware.com/2012/02/12/introducing-gpuimage-framework
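As a rough idea of what the GPUImageMovie / GPUImageMovieWriter pipeline looks like (a sketch only; `inputURL`, `outputURL`, and the sepia filter are placeholders, and method names may differ across GPUImage versions - check the repository's examples):

```objc
#import "GPUImage.h"

// Read the source movie, run it through a filter, and write the result to disk.
GPUImageMovie *movieFile = [[GPUImageMovie alloc] initWithURL:inputURL];
GPUImageSepiaFilter *filter = [[GPUImageSepiaFilter alloc] init];
[movieFile addTarget:filter];

GPUImageMovieWriter *movieWriter =
    [[GPUImageMovieWriter alloc] initWithMovieURL:outputURL
                                             size:CGSizeMake(640.0, 480.0)];
[filter addTarget:movieWriter];

[movieWriter startRecording];
[movieFile startProcessing];

// Stop writing once the source movie has been fully processed.
[movieWriter setCompletionBlock:^{
    [filter removeTarget:movieWriter];
    [movieWriter finishRecording];
}];
```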
Hope that helps!
For testing, you can achieve that goal with this framework: https://github.com/gabriel/CaptureRecord for screen capture. However, it can only record in the simulator, and you can't submit an app to the App Store using this code, as it uses private APIs.
I'm using AVAudioPlayer to play mp3 files, but I need to implement a UI like the one shown below:
I think it may be an iOS system control I can use, but I can't find which control it is.
That is an MPMoviePlayerController. Compare, for example, this illustration from my book:
The same section of my book tells you how to make and work with one of these. Despite the name, it's great for playing audio with a user interface. The only difference between our screenshots is that you had an AirPlay device present on the network at the time.
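For reference, a minimal sketch of using MPMoviePlayerController for an mp3; the file name and the `audioPlayer` property are placeholders:

```objc
#import <MediaPlayer/MediaPlayer.h>

// "song.mp3" is a hypothetical file in the app bundle.
NSURL *audioURL = [[NSBundle mainBundle] URLForResource:@"song" withExtension:@"mp3"];
MPMoviePlayerController *player =
    [[MPMoviePlayerController alloc] initWithContentURL:audioURL];
player.view.frame = CGRectMake(20.0, 100.0, 280.0, 44.0); // a small transport bar
[self.view addSubview:player.view];
[player prepareToPlay];
[player play];

// Keep a strong reference (e.g. a property), or the controller will be
// deallocated and playback will stop.
self.audioPlayer = player; // hypothetical property
```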
My iPad app has the option to play videos. I use the MPMoviePlayerViewController class to play my videos.
My question is: if I want to play the videos on an attached external monitor, how do I keep the playback controls on the iPad like YouTube does? If I add the view of the MPMoviePlayerViewController's player to the external screen's hierarchy I can play the video fine, but I now have no control over it. Is there a way to move or duplicate the view where the controls lie and place it on a view which resides on the iPad?
I'm not aware of an officially supported way of pulling out the original UI in this way. The MPMoviePlayerViewController only exposes the MPMoviePlayerController object it uses via its moviePlayer property. The MPMoviePlayerController in turn only exposes view and backgroundView, which aren't helpful for such a purpose. You could in theory inspect the subviews of the movie player's main view, find the playback controls, and try to move them to the other screen. I have a feeling this will not end well though, as they're anything but static. You also never know what will happen in later iOS versions, or whether they'll allow your hack onto the App Store. It's probably less trouble to just re-do the UI yourself.
Actually controlling the video playback programmatically is straightforward - the view controller's moviePlayer implements the MPMediaPlayback protocol.
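A hedged sketch of how that could look: keep the player's view in a UIWindow on the external screen, hide the built-in controls, and drive playback from your own iPad controls through the moviePlayer. The `externalWindow` and `moviePlayerViewController` properties are assumptions for illustration:

```objc
#import <MediaPlayer/MediaPlayer.h>

// When an external screen is available, move the video there.
- (void)moveVideoToExternalScreenIfAvailable {
    if ([[UIScreen screens] count] > 1) {
        UIScreen *externalScreen = [UIScreen screens][1];
        // hypothetical property holding the external window
        self.externalWindow = [[UIWindow alloc] initWithFrame:externalScreen.bounds];
        self.externalWindow.screen = externalScreen;

        MPMoviePlayerController *moviePlayer = self.moviePlayerViewController.moviePlayer;
        moviePlayer.controlStyle = MPMovieControlStyleNone;   // hide the built-in controls
        moviePlayer.view.frame = self.externalWindow.bounds;
        [self.externalWindow addSubview:moviePlayer.view];
        self.externalWindow.hidden = NO;
    }
}

// Your own controls on the iPad then call the MPMediaPlayback methods.
- (IBAction)playTapped:(id)sender {
    [self.moviePlayerViewController.moviePlayer play];
}

- (IBAction)pauseTapped:(id)sender {
    [self.moviePlayerViewController.moviePlayer pause];
}

- (IBAction)scrubbed:(UISlider *)sender {
    MPMoviePlayerController *moviePlayer = self.moviePlayerViewController.moviePlayer;
    moviePlayer.currentPlaybackTime = sender.value * moviePlayer.duration;
}
```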