How to get frame-by-frame images from a movie file on iOS

In iOS, I would like to get frame-by-frame images from a movie file.
I tried using AVAssetImageGenerator, but it gives me only one image per second for a 30 fps movie. It should be 30 images!
I heard that there is a way to do it with FFmpeg.
But on newer OSs like iOS 7, is there an API to do this without using external libraries like FFmpeg?

You can also try OpenCV to capture the frames, but the speed will depend on the device and processor. On an iPhone 5 the speed should be good enough to capture all the frames. You can have a look at this link if it helps.
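To answer the original question about built-in APIs: AVFoundation can hand you every decoded frame via AVAssetReader, with no external library. A minimal sketch, assuming `movieURL` is a local file URL and error handling is omitted:

```objc
#import <AVFoundation/AVFoundation.h>

// Decode every frame of a movie with AVAssetReader (no external libraries).
AVURLAsset *asset = [AVURLAsset URLAssetWithURL:movieURL options:nil];
AVAssetTrack *videoTrack = [[asset tracksWithMediaType:AVMediaTypeVideo] firstObject];

NSError *error = nil;
AVAssetReader *reader = [AVAssetReader assetReaderWithAsset:asset error:&error];
NSDictionary *settings = @{ (id)kCVPixelBufferPixelFormatTypeKey :
                                @(kCVPixelFormatType_32BGRA) };
AVAssetReaderTrackOutput *output =
    [AVAssetReaderTrackOutput assetReaderTrackOutputWithTrack:videoTrack
                                               outputSettings:settings];
[reader addOutput:output];
[reader startReading];

CMSampleBufferRef sampleBuffer;
while ((sampleBuffer = [output copyNextSampleBuffer])) {
    CVImageBufferRef pixelBuffer = CMSampleBufferGetImageBuffer(sampleBuffer);
    // ...convert pixelBuffer to a CGImage/UIImage, or process it directly...
    CFRelease(sampleBuffer);
}
```

Also worth knowing: AVAssetImageGenerator snaps to the nearest keyframe by default, which is why you see roughly one image per second. Setting its requestedTimeToleranceBefore and requestedTimeToleranceAfter properties to kCMTimeZero makes it return exact frames, though AVAssetReader is much faster when you need every frame.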

Related

Processing live video and still images simultaneously at two different resolutions on iPhone?

I'm working on a video processing app for the iPhone using OpenCV.
For performance reasons, I want to process live video at a relatively low resolution. I'm doing object detection on each frame of the video. When the objects are found in a low-resolution video frame, I need to acquire that exact same frame at a much higher resolution.
I've been able to semi-accomplish this using a videoDataBufferOutput and a stillImageOutput from AVFoundation, but the still image is not the exact frame that I need.
Are there any good implementations of this or ideas on how to implement it myself?
With AVCaptureSessionPresetPhoto, the session uses a small video preview (about 1000x700 on an iPhone 6) and a high-resolution photo (about 3000x2000).
So I use a modified CvPhotoCamera class to process the small preview and take the photo at full size. I posted that code here: https://stackoverflow.com/a/31478505/1994445
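For reference, the plain-AVFoundation setup behind this approach pairs a low-resolution video data output (for per-frame detection) with a still-image output (for the full-resolution shot). A hedged sketch, with delegate wiring and error handling omitted:

```objc
#import <AVFoundation/AVFoundation.h>

AVCaptureSession *session = [[AVCaptureSession alloc] init];
session.sessionPreset = AVCaptureSessionPresetPhoto;

AVCaptureDevice *camera = [AVCaptureDevice defaultDeviceWithMediaType:AVMediaTypeVideo];
AVCaptureDeviceInput *input = [AVCaptureDeviceInput deviceInputWithDevice:camera
                                                                    error:nil];
[session addInput:input];

// Low-resolution preview frames, delivered per frame for object detection.
AVCaptureVideoDataOutput *videoOutput = [[AVCaptureVideoDataOutput alloc] init];
[videoOutput setSampleBufferDelegate:self queue:dispatch_get_main_queue()];
[session addOutput:videoOutput];

// Full-resolution stills, triggered once the object has been detected.
AVCaptureStillImageOutput *stillOutput = [[AVCaptureStillImageOutput alloc] init];
[session addOutput:stillOutput];

[session startRunning];
```

As the question notes, a still captured this way is not guaranteed to be the exact same frame as the preview frame that triggered it; the linked CvPhotoCamera answer works around that by processing the preview stream itself.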

AVFoundation max render size

I've searched quite a lot and couldn't find a definite answer to what the maximum render size of a video is on iOS using AVFoundation.
I need to stitch two or more videos side by side or one above the other and render them into one new video with a final size larger than 1920x1080. For example, with two Full HD videos (1920x1080) side by side, the final composition would be 3840x1080.
I've tried AVAssetExportSession and it always shrinks the final video proportionally to a maximum of 1920 in width or 1080 in height. That's understandable given the available AVAssetExportSession settings (preset, file type, etc.).
I also tried AVAssetReader and AVAssetWriter, but the results are the same; I only get more control over the quality, bitrate, etc.
So... is there a way this can be achieved on iOS, or do we have to stick to Full HD at most?
Thanks
Well... actually the answer should be YES and also NO, at least from what I've found so far.
H.264 allows higher resolutions only with a higher-level profile, which is fine. However, on iOS the maximum profile that can be used is AVVideoProfileLevelH264High41, which, according to the spec, permits at most 1,920×1,080@30.1 fps or 2,048×1,024@30.0 fps.
So encoding with H.264 won't do the job, and the answer should be NO.
The other option is to use a different compression/codec. I tried AVVideoCodecJPEG and was able to render such a video, so the answer should be YES.
But... the problem is that this video is not playable on iOS, which changes the answer back to NO.
To summarise, I'd say: it is possible if the video is meant to be used off the device; otherwise it simply won't be usable.
Hope this helps other people as well, and if someone gives a better or different answer I'll be glad.
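To illustrate, the AVAssetWriter output settings for the JPEG variant look roughly like this; the keys are standard AVFoundation video settings, and the 3840x1080 dimensions are the side-by-side example from the question:

```objc
#import <AVFoundation/AVFoundation.h>

// Motion-JPEG sidesteps the H.264 level cap, at the cost of iOS playability
// and much larger files.
NSDictionary *videoSettings = @{
    AVVideoCodecKey  : AVVideoCodecJPEG,
    AVVideoWidthKey  : @3840,
    AVVideoHeightKey : @1080,
};
AVAssetWriterInput *writerInput =
    [AVAssetWriterInput assetWriterInputWithMediaType:AVMediaTypeVideo
                                       outputSettings:videoSettings];
```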

Phonegap video capture reduce output file size

When capturing video with PhoneGap on iOS, the file size for even a 1-minute capture is ridiculously large; far too large to upload reliably over a 3G connection. I read that there is a native AVCaptureSession object that allows the bitrate to be altered in order to reduce the file size. Has anyone implemented this in the PhoneGap video capture, or can anyone give me any pointers?
Found the details I needed here:
How can I reduce the file size of a video created with UIImagePickerController?
The important line for Phonegap users is in CDVCapture.m:
pickerController.videoQuality = UIImagePickerControllerQualityTypeHigh;
There are several presets that can be used, e.g.
UIImagePickerControllerQualityTypeLow
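So, assuming a stock CDVCapture.m, the one-line change is to swap the preset, for example:

```objc
// In CDVCapture.m: trade resolution/bitrate for file size.
// Available presets include UIImagePickerControllerQualityTypeHigh,
// ...Medium, ...Low, and fixed-size presets such as ...Type640x480.
pickerController.videoQuality = UIImagePickerControllerQualityTypeMedium;
```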
PhoneGap also provides a quality argument when navigating to the camera, e.g.:
quality: 50
This is recommended for iOS. If you need to reduce the file size, reduce the quality; it takes an integer value (0-100).

How can I make AVCaptureSessionPresetPhoto take pictures at a lower resolution?

I'm currently using AVCaptureSessionPresetPhoto to take my pictures and I'm adding filters to them. The problem is that the resolution is so big that I have memory warnings ringing all over the place. The picture is simply way too large to process; it crashes my app every single time. Is there any way I can specify the resolution to shoot at?
EDIT**
Photography apps like Instagram or the Facebook Camera app, for example, can do this without any problems. These applications take pictures at high resolution, scale them down, and process them without any delay. I did a comparison check: the native iOS camera maintains a much higher-quality resolution than the pictures taken by other applications. That extreme level of quality isn't really needed on a mobile platform, so it seems these images are being taken at a lower resolution to allow for faster processing and quick upload times. Thus there must be a way to shoot at a lower resolution. If anyone has a solution to this problem, it would be greatly appreciated!
You need to resize the image after capturing it with AVCaptureSession, and store the resized version.
You can find lots of similar questions on Stack Overflow; I'm putting some links below that may help you.
One more suggestion: use SDWebImage to display images asynchronously, so the app keeps working smoothly. There are also other ways to do asynchronous tasks in iOS, for example Grand Central Dispatch (GCD) or NSOperationQueue.
Re-size Image:-
How to resize an image in iOS?
UIImage resizing not working properly
How to ReSize Image with Good Quality in iPhone
How to resize the image programmatically in objective-c in iphone
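The linked answers all boil down to redrawing the captured image into a smaller bitmap context. A minimal sketch; the target size you pass in is up to you:

```objc
#import <UIKit/UIKit.h>

// Redraw a large UIImage into a smaller bitmap context.
// Scale factor 1.0 gives exactly `newSize` pixels regardless of screen scale.
- (UIImage *)resizedImage:(UIImage *)image toSize:(CGSize)newSize {
    UIGraphicsBeginImageContextWithOptions(newSize, NO, 1.0);
    [image drawInRect:CGRectMake(0, 0, newSize.width, newSize.height)];
    UIImage *resized = UIGraphicsGetImageFromCurrentImageContext();
    UIGraphicsEndImageContext();
    return resized;
}
```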

Exporting PNG sequences for iOS the right way!

I made an animation for an iOS loading screen. The devs now tell me that the files are way too big (100 frames/PNGs, 22.5 MB for the whole bunch). I tried to reduce it by rasterising the PNGs on export (which helped a bit), and then I shortened the animation by 50 frames; I'm at 10 MB now. This is still huge. I made the animation in Flash. Is this the problem? Each PNG is 235 KB. I need your help :-)
First, try to reduce the size of your PNGs using pngcrush or Photoshop's export for web. The other idea is to use an MPEG video: create a PNG of just the first frame, open the video when the application runs, and you should be able to switch from the static PNG to the first frame of the video without glitches. That way the user doesn't notice that you switched from image to video and just sees the video playing.
Hope this helps