I am working on a sample application similar to Vine. My requirement is to create a 'ghost' filter for video, as in Vine.
The actual requirements are:
-Record a video on a long press on the view.
-When recording pauses, show the last frame of the recorded video above my view. Please see the expected behaviour here.
I have checked the PBJVision library and found this feature working there, but I need to implement the feature separately in my application.
While analysing the code, I found that this can be achieved using OpenGL ES. I have tried using a GLKView, but it just shows a dark shade instead of the image frame. Since I am new to this area, please help me.
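For reference, one approach that avoids OpenGL ES entirely is to pull the last frame out of the recorded segment with AVAssetImageGenerator and overlay it as a translucent UIImageView. The following is only a rough sketch under that assumption; segmentURL and previewView are stand-ins for whatever objects the app actually has.

#import <AVFoundation/AVFoundation.h>
#import <UIKit/UIKit.h>

// Pull the last frame out of the recorded segment and overlay it as a
// translucent "ghost" above the live preview.
- (void)showGhostFrameForSegmentAtURL:(NSURL *)segmentURL overView:(UIView *)previewView
{
    AVAsset *asset = [AVAsset assetWithURL:segmentURL];
    AVAssetImageGenerator *generator = [[AVAssetImageGenerator alloc] initWithAsset:asset];
    generator.appliesPreferredTrackTransform = YES;   // respect the recorded orientation

    // Ask for a frame just before the very end of the clip.
    CMTime lastFrameTime = CMTimeSubtract(asset.duration, CMTimeMake(1, 30));

    NSError *error = nil;
    CGImageRef cgImage = [generator copyCGImageAtTime:lastFrameTime
                                           actualTime:NULL
                                                error:&error];
    if (cgImage == NULL) {
        NSLog(@"Could not extract last frame: %@", error);
        return;
    }

    UIImageView *ghostView = [[UIImageView alloc] initWithFrame:previewView.bounds];
    ghostView.contentMode = UIViewContentModeScaleAspectFill;
    ghostView.image = [UIImage imageWithCGImage:cgImage];
    ghostView.alpha = 0.5f;   // the translucent "ghost" look
    [previewView addSubview:ghostView];
    CGImageRelease(cgImage);
}

This trades the GPU-composited ghost that PBJVision uses for a simple view overlay, which is usually enough for a preview effect.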
Related
I am taking a picture using UIImagePickerController with a custom overlay view. I need to enable the native editing mode of UIImagePickerController once the photo is taken and the user taps the "Edit" button on my custom overlay view.
I have set:
imagePickerController.showsCameraControls = NO;
How can this be achieved?
Thanks,
You might be trying to use UIImagePickerController, but I know of one solution to your problem: you can do this easily using the AVCamCaptureManager and AVCamRecorder classes. Apple has a demo project, named AVCam, on its developer site here. In simple terms, when you open the camera it calls the classes and methods responsible for opening the iPhone's camera and recording video or capturing audio. These are the same classes that UIImagePickerController calls, so your camera will open and start taking input.
Now, if you open the XIB file in the AVCam project, you'll find a small UIView object. This view is responsible for displaying the camera's feed. You can resize that view to whatever size you want, and the camera's input will be displayed in exactly that area. You can also place a frame image around it if you like.
It worked for me when I wanted to resize the camera's input feed and capture photos. I hope it works for you as well.
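To illustrate the idea outside the AVCam sample, here is a minimal sketch of pointing a capture session's preview layer at a custom-sized view; previewView is an assumed outlet, not something from the sample project.

#import <AVFoundation/AVFoundation.h>

- (void)startPreviewInView:(UIView *)previewView
{
    // Build a minimal capture session with the default camera.
    // (In a real app, keep a strong reference to the session, e.g. in a property.)
    AVCaptureSession *session = [[AVCaptureSession alloc] init];
    AVCaptureDevice *camera = [AVCaptureDevice defaultDeviceWithMediaType:AVMediaTypeVideo];
    NSError *error = nil;
    AVCaptureDeviceInput *input = [AVCaptureDeviceInput deviceInputWithDevice:camera error:&error];
    if (input != nil) {
        [session addInput:input];
    }

    // The preview layer is sized to whatever view you placed in the XIB,
    // so the live feed fills exactly that area.
    AVCaptureVideoPreviewLayer *previewLayer =
        [AVCaptureVideoPreviewLayer layerWithSession:session];
    previewLayer.frame = previewView.bounds;
    previewLayer.videoGravity = AVLayerVideoGravityResizeAspectFill;
    [previewView.layer addSublayer:previewLayer];

    [session startRunning];
}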
In the iOS 7 camera app, if you tap the icon in the bottom-right corner, it opens a collection view with a 3x3 grid showing live previews of different filters. How can this be recreated for our own app? I have tried using GPUImage, but I am only able to display one GPUImageView at a time and can't figure out what I am doing wrong.
GPUImage comes with a MultiViewFilterExample that fulfils this exact requirement. I would use it as a starting point.
The example can be found in the GPUImage GitHub repository (https://github.com/BradLarson/GPUImage) under the directory /examples/iOS/MultiViewFilterExample/.
The project in that folder shows how to display four GPUImageViews, each with its own filter, on a single UIView.
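The core idea in that example is a single GPUImageVideoCamera feeding several filters, each of which targets its own GPUImageView. A minimal sketch follows; sepiaView and grayscaleView are assumed GPUImageView outlets you have already laid out in your grid.

#import "GPUImage.h"

// One camera, several filters, several on-screen previews.
GPUImageVideoCamera *videoCamera =
    [[GPUImageVideoCamera alloc] initWithSessionPreset:AVCaptureSessionPreset640x480
                                        cameraPosition:AVCaptureDevicePositionBack];
videoCamera.outputImageOrientation = UIInterfaceOrientationPortrait;

GPUImageSepiaFilter *sepiaFilter = [[GPUImageSepiaFilter alloc] init];
GPUImageGrayscaleFilter *grayscaleFilter = [[GPUImageGrayscaleFilter alloc] init];

// sepiaView and grayscaleView are GPUImageViews placed in the grid layout.
[videoCamera addTarget:sepiaFilter];
[sepiaFilter addTarget:self.sepiaView];

[videoCamera addTarget:grayscaleFilter];
[grayscaleFilter addTarget:self.grayscaleView];

[videoCamera startCameraCapture];

Adding more cells to the grid is just a matter of creating more filters and more GPUImageViews and wiring each pair up the same way.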
I have an MPMoviePlayerController instance in my iOS app that plays a local file in fullscreen mode. This all works fine, but now I want to add a custom button to the window for changing the playback speed. We support both iPhone and iPad in all orientations.
I know how to set the playback speed from code (using setCurrentPlaybackRate), but I need to let the user do it while watching the video, which means adding some kind of button to the playback screen next to the existing buttons, e.g. next to "play", "pause", or in the top bar.
Looking on Stack Overflow, I have found various replies to similar but not identical questions: some say it cannot be done in fullscreen, while others say it can be done (but is very complex) by creating some kind of overlay, effectively replacing the entire built-in overlay with a custom one.
However, I have yet to find any code examples (apart from a few snippets without context), getting-started tutorials, or anything similar for this, so any pointers to example code would be greatly appreciated.
Maybe this Apple example could help you:
https://developer.apple.com/library/ios/samplecode/MoviePlayer_iPhone/Introduction/Intro.html
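One approach that is sometimes suggested is to wait for the fullscreen transition and then add your own button on top of the player's view, wired to currentPlaybackRate. The sketch below assumes a self.moviePlayer property holding the embedded MPMoviePlayerController; whether a subview added this way stays visible above the built-in controls has varied between iOS versions, so treat it as a starting point rather than a drop-in solution.

#import <MediaPlayer/MediaPlayer.h>

- (void)viewDidLoad
{
    [super viewDidLoad];
    // Register for the fullscreen notification somewhere during setup.
    [[NSNotificationCenter defaultCenter] addObserver:self
                                             selector:@selector(playerDidEnterFullscreen:)
                                                 name:MPMoviePlayerDidEnterFullscreenNotification
                                               object:self.moviePlayer];
}

- (void)playerDidEnterFullscreen:(NSNotification *)notification
{
    // Add a custom speed button on top of the player's view.
    UIButton *speedButton = [UIButton buttonWithType:UIButtonTypeRoundedRect];
    speedButton.frame = CGRectMake(20.0f, 20.0f, 80.0f, 32.0f);
    [speedButton setTitle:@"2x" forState:UIControlStateNormal];
    [speedButton addTarget:self
                    action:@selector(toggleSpeed:)
          forControlEvents:UIControlEventTouchUpInside];
    [self.moviePlayer.view addSubview:speedButton];
}

- (void)toggleSpeed:(UIButton *)sender
{
    // Flip between normal and double speed.
    self.moviePlayer.currentPlaybackRate =
        (self.moviePlayer.currentPlaybackRate == 1.0f) ? 2.0f : 1.0f;
}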
I've tried to figure out if the functionality I need is possible with a PhoneGap plugin and haven't found a clear answer.
I'd just like some clarification as to whether this is within the scope of a plugin.
The functionality would be...
Within the webview a user triggers a button.
A camera preview screen of custom size (not full screen) pops up over the webview.
The preview shows the view from the front-facing camera, except it's cropped to the custom size.
A video automatically starts recording for a set amount of time, then stops.
Once the video stops recording, the preview screen goes away and the local path to the movie file is returned to a callback.
Is this all reasonable functionality for a PhoneGap plugin?
Yes, you can definitely implement that functionality as a PhoneGap plugin.
Possible steps that you can take are the following:
- Write the native code of the functionality first (note: you can call and start native activities through a PhoneGap plugin).
- Create a layout (maybe something that overlays the webview with a transparent background).
- Use the File API of PhoneGap to access the movie file you just captured.
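As a rough illustration of the native entry point on iOS: the plugin subclasses CDVPlugin, receives the JavaScript call, starts your custom capture UI, and hands the movie path back through the callback. The class name VideoClipPlugin, the recordClip: action, and the placeholder path are invented for this sketch; only the Cordova base class and result APIs are real.

#import <Cordova/CDVPlugin.h>

// Hypothetical plugin class for this sketch.
@interface VideoClipPlugin : CDVPlugin
- (void)recordClip:(CDVInvokedUrlCommand *)command;
@end

@implementation VideoClipPlugin

- (void)recordClip:(CDVInvokedUrlCommand *)command
{
    // 1. Present your custom, non-fullscreen camera preview over the webview here
    //    (e.g. add a small preview view above self.webView and start an
    //    AVFoundation recording for a fixed duration).
    //
    // 2. When recording finishes, return the local file path to JavaScript:
    NSString *moviePath = @"/path/to/captured/movie.mov"; // placeholder value
    CDVPluginResult *result = [CDVPluginResult resultWithStatus:CDVCommandStatus_OK
                                                messageAsString:moviePath];
    [self.commandDelegate sendPluginResult:result callbackId:command.callbackId];
}

@end

On the JavaScript side you would reach this through cordova.exec, passing success and error callbacks; the success callback receives the movie path returned above.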
I have an array of images that I am passing to this class:
https://github.com/jberlana/iOSKenBurns
This basically takes all of the images, builds a slideshow from them, and then adds an automatic, random Ken Burns effect.
Everything works great; however, I am now trying to add the ability to export the slideshow to a movie file. There will just be a simple button to accomplish this, but I have no idea where to start with turning this into a video.
I have found the QTMovie Class Reference, which is EXACTLY what I'm looking for (compiling an array of images into a movie file), but I need to retain the Ken Burns effect the class has added, and I don't know if this applies to iOS either.
I'm not really sure if this is even possible, or what to do with it. Any help would be great, even just a pointer in the right direction! Thanks in advance!
The QTMovie class does not function under iOS (it is part of QTKit, which is Mac-only). You can take a look at an example Xcode project I created to show off this sort of movie composition logic in my library; the Xcode project is at AVRender. This example creates a lossless output file that basically contains all the images as frames of video. If you want to convert that to H.264 when finished, there is another example on the same page, named AVDecodeEncode, that shows how to do that and play the results with AVPlayer. You could also roll your own code to do all this using the AVAsset APIs under iOS, but be warned that those APIs are not simple to use.
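If you do go the roll-your-own AVFoundation route, the "images to movie" part boils down to an AVAssetWriter fed through a pixel buffer adaptor. A compressed sketch follows just to show the shape of the API; outputURL and images are assumed to exist, error handling is omitted, and renderPixelBufferForImage: is a hypothetical helper you would have to write to draw each UIImage into a CVPixelBuffer.

#import <AVFoundation/AVFoundation.h>

// Write an array of UIImages to an H.264 movie at outputURL, one image per second.
NSError *error = nil;
AVAssetWriter *writer = [AVAssetWriter assetWriterWithURL:outputURL
                                                 fileType:AVFileTypeQuickTimeMovie
                                                    error:&error];

NSDictionary *settings = @{ AVVideoCodecKey  : AVVideoCodecH264,
                            AVVideoWidthKey  : @640,
                            AVVideoHeightKey : @480 };
AVAssetWriterInput *input =
    [AVAssetWriterInput assetWriterInputWithMediaType:AVMediaTypeVideo
                                       outputSettings:settings];
AVAssetWriterInputPixelBufferAdaptor *adaptor =
    [AVAssetWriterInputPixelBufferAdaptor assetWriterInputPixelBufferAdaptorWithAssetWriterInput:input
                                                              sourcePixelBufferAttributes:nil];
[writer addInput:input];
[writer startWriting];
[writer startSessionAtSourceTime:kCMTimeZero];

[images enumerateObjectsUsingBlock:^(UIImage *image, NSUInteger idx, BOOL *stop) {
    // renderPixelBufferForImage: is a hypothetical helper that draws the UIImage
    // (with any Ken Burns transform already applied) into a CVPixelBuffer.
    CVPixelBufferRef buffer = [self renderPixelBufferForImage:image];
    while (!input.readyForMoreMediaData) {
        [NSThread sleepForTimeInterval:0.05]; // crude; requestMediaDataWhenReady is nicer
    }
    [adaptor appendPixelBuffer:buffer withPresentationTime:CMTimeMake(idx, 1)];
    CVPixelBufferRelease(buffer);
}];

[input markAsFinished];
[writer finishWritingWithCompletionHandler:^{ NSLog(@"Movie written to %@", outputURL); }];

Note that to preserve the animated Ken Burns motion you would render many intermediate frames per image rather than the single frame per image shown here.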