Processing live video and still images simultaneously at two different resolutions on iPhone?

I'm working on a video processing app for the iPhone using OpenCV.
For performance reasons, I want to process live video at a relatively low resolution. I'm doing object detection on each frame of the video. When objects are found in the low-resolution video frame, I need to acquire that exact same frame at a much higher resolution.
I've been able to semi-accomplish this using an AVCaptureVideoDataOutput and an AVCaptureStillImageOutput from AVFoundation, but the still image is not the exact frame that I need.
Are there any good implementations of this or ideas on how to implement it myself?

With AVCaptureSessionPresetPhoto the session uses a small video preview (about 1000x700 on an iPhone 6) and a high-resolution photo (about 3000x2000).
So I use a modified CvPhotoCamera class to process the small preview and take the full-size photo. I posted that code here: https://stackoverflow.com/a/31478505/1994445
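For reference, here is a plain AVFoundation sketch of the same idea, independent of OpenCV's CvPhotoCamera: one session feeds a low-resolution video data output for detection and an AVCapturePhotoOutput (the modern replacement for AVCaptureStillImageOutput) for the full-size still. Names such as DualResolutionCapture and detect(_:) are placeholders, and since the still is requested on demand it will not be the exact analyzed frame.

import AVFoundation
import UIKit

// Sketch only: one AVCaptureSession that feeds small frames to a video data
// output for detection and captures a full-resolution photo on demand.
final class DualResolutionCapture: NSObject, AVCaptureVideoDataOutputSampleBufferDelegate, AVCapturePhotoCaptureDelegate {

    private let session = AVCaptureSession()
    private let videoOutput = AVCaptureVideoDataOutput()
    private let photoOutput = AVCapturePhotoOutput()   // modern replacement for AVCaptureStillImageOutput
    private let processingQueue = DispatchQueue(label: "video.processing")

    func configure() throws {
        session.beginConfiguration()
        session.sessionPreset = .photo                 // small preview frames + full-size stills

        guard let device = AVCaptureDevice.default(for: .video) else { return }
        let input = try AVCaptureDeviceInput(device: device)
        if session.canAddInput(input) { session.addInput(input) }

        videoOutput.alwaysDiscardsLateVideoFrames = true
        videoOutput.setSampleBufferDelegate(self, queue: processingQueue)
        if session.canAddOutput(videoOutput) { session.addOutput(videoOutput) }
        if session.canAddOutput(photoOutput) { session.addOutput(photoOutput) }

        session.commitConfiguration()
        session.startRunning()
    }

    // Low-resolution preview frames arrive here; run the object detection on them.
    func captureOutput(_ output: AVCaptureOutput, didOutput sampleBuffer: CMSampleBuffer, from connection: AVCaptureConnection) {
        if detect(sampleBuffer) {
            // Request a full-resolution still. This captures the next available
            // frame, which is why it is not guaranteed to match the analyzed one.
            photoOutput.capturePhoto(with: AVCapturePhotoSettings(), delegate: self)
        }
    }

    // The full-resolution photo arrives here.
    func photoOutput(_ output: AVCapturePhotoOutput, didFinishProcessingPhoto photo: AVCapturePhoto, error: Error?) {
        guard let data = photo.fileDataRepresentation(), let image = UIImage(data: data) else { return }
        // Hand `image` to the high-resolution processing path.
        _ = image
    }

    private func detect(_ buffer: CMSampleBuffer) -> Bool { false } // placeholder for the OpenCV detector
}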

Related

Swift send compressed video frames using GPUImage

I'm writing a Swift app that sends an iPhone camera's video frames over the network, so I can later display them in a macOS app.
Currently, I'm grabbing video frames from an AVCaptureSession and getting a CVPixelBuffer from the captureOutput method.
Since each frame is huge (raw pixels), I'm converting the CVPixelBuffer to a CGImage with VTCreateCGImageFromCVPixelBuffer and then to a UIImage with JPEG compression (50%). I then send that JPEG over the network and display it in the macOS app.
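Roughly, that conversion amounts to the following (compressFrame is just an illustrative helper name, not a library API):

import VideoToolbox
import UIKit

// CVPixelBuffer -> CGImage -> JPEG data, as described above.
func compressFrame(_ pixelBuffer: CVPixelBuffer) -> Data? {
    var cgImage: CGImage?
    VTCreateCGImageFromCVPixelBuffer(pixelBuffer, options: nil, imageOut: &cgImage)
    guard let cgImage = cgImage else { return nil }
    return UIImage(cgImage: cgImage).jpegData(compressionQuality: 0.5)   // 50% JPEG
}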
As you can see, this is far from ideal, and it runs at ~25 FPS on an iPhone 11. After some research, I came across GPUImage 2. It seems that I could get the data from the camera and apply something like this (so that the transformation is done on the GPU):
let camera = try Camera(sessionPreset: AVCaptureSessionPreset640x480)
let pictureOutput = PictureOutput()
pictureOutput.encodedImageFormat = .JPEG
pictureOutput.imageAvailableCallback = { image in
    // Send the picture through the network here
}
camera --> pictureOutput
And I should be able to transmit that UIImage and display it in the macOS app. Is there a better way to implement this whole process? Maybe I could use the iPhone's hardware H.264 encoder instead of converting images to JPEG, but it seems that it's not that straightforward (and it seems that GPUImage does something like that, from what I've read).
Any help is appreciated, thanks in advance!
I understand that you want to do this operation in a non-internet environment.
What are your project constraints?
Minimum FPS?
Minimum video resolution?
Should sound be transmitted?
What is your network environment?
Minimum iOS and OSX version?
Apart from these, GPUImage is not a suitable solution for you. If you are going to transfer video, you have to encode it as H.264 or H.265 (HEVC); that is how you can transmit video with good performance.
Your current pipeline, CMSampleBuffer -> CVPixelBuffer -> JPEG -> Data, puts a serious load on the processor. It also increases the risk of memory leaks.
If you can share a few more details, I would like to help. I have experience with video processing.
Sorry for my English.
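To illustrate the H.264 route, here is a minimal VideoToolbox sketch of hardware encoding. It is only a sketch: SPS/PPS handling and networking are omitted, the bitrate is arbitrary, and names such as H264Encoder and onEncodedFrame are placeholders rather than a library API.

import VideoToolbox
import CoreMedia

// Sketch of a hardware H.264 encoder fed with raw camera frames.
final class H264Encoder {
    private var session: VTCompressionSession?
    var onEncodedFrame: ((CMSampleBuffer) -> Void)?   // deliver compressed samples to the network layer

    init?(width: Int32, height: Int32) {
        let status = VTCompressionSessionCreate(
            allocator: nil,
            width: width,
            height: height,
            codecType: kCMVideoCodecType_H264,
            encoderSpecification: nil,
            imageBufferAttributes: nil,
            compressedDataAllocator: nil,
            outputCallback: nil,                       // using the block-based encode call below
            refcon: nil,
            compressionSessionOut: &session)
        guard status == noErr, let session = session else { return nil }
        VTSessionSetProperty(session, key: kVTCompressionPropertyKey_RealTime, value: kCFBooleanTrue)
        VTSessionSetProperty(session, key: kVTCompressionPropertyKey_AverageBitRate, value: NSNumber(value: 1_000_000))
        VTCompressionSessionPrepareToEncodeFrames(session)
    }

    // Call this from captureOutput(_:didOutput:from:) with each CVPixelBuffer.
    func encode(_ pixelBuffer: CVPixelBuffer, presentationTime: CMTime) {
        guard let session = session else { return }
        VTCompressionSessionEncodeFrame(
            session,
            imageBuffer: pixelBuffer,
            presentationTimeStamp: presentationTime,
            duration: .invalid,
            frameProperties: nil,
            infoFlagsOut: nil) { [weak self] status, _, sampleBuffer in
                guard status == noErr, let sampleBuffer = sampleBuffer else { return }
                // Compressed H.264 sample buffer; extract its data and send it over the network.
                self?.onEncodedFrame?(sampleBuffer)
        }
    }
}

On the receiving side the samples can be decoded with a VTDecompressionSession or handed to an AVSampleBufferDisplayLayer for display.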

AVCaptureSessionPreset Photo and High Optimization

I have been trying all kinds of settings for the AVCaptureSessionPreset to match my desired output, but I don't seem to be able to get it right.
The Photo preset captures a photo at a resolution that is too high, so it takes some time before the image has finished processing.
The High preset is perfect in terms of performance. The image gets processed and returned almost instantaneously. But the aspect ratio is not right: it is 16:9, compared to the Photo preset's 4:3.
I have also tried changing the AVCaptureDevice's activeFormat to a lower resolution. But the performance is just not as good as when using the High preset.
Someone with a similar problem from 2014:
AVCaptureSession preset creates a photo that is too big
The problem seems to be that you are attempting to perform some kind of time-consuming processing on the large photo data returned from the capture. Don't. It's large! Instead, when you configure the session, ask for a preview image at the desired size, and when the capture takes place, obtain the preview image and operate on that.
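With the newer AVCapturePhotoOutput API, that advice maps to requesting an embedded preview image at a working size and operating on that instead of the full photo. A sketch, where the 1024x768 size and the delegate wiring are assumptions:

import AVFoundation

// Request a downscaled preview image alongside the full-resolution photo.
func capturePhoto(with output: AVCapturePhotoOutput, delegate: AVCapturePhotoCaptureDelegate) {
    let settings = AVCapturePhotoSettings()
    if let previewFormat = settings.availablePreviewPhotoPixelFormatTypes.first {
        settings.previewPhotoFormat = [
            kCVPixelBufferPixelFormatTypeKey as String: previewFormat,
            kCVPixelBufferWidthKey as String: 1024,    // 4:3 working size
            kCVPixelBufferHeightKey as String: 768
        ]
    }
    output.capturePhoto(with: settings, delegate: delegate)
}

// In the delegate, operate on the small preview rather than the full-size photo:
// func photoOutput(_ output: AVCapturePhotoOutput, didFinishProcessingPhoto photo: AVCapturePhoto, error: Error?) {
//     if let preview = photo.previewPixelBuffer {
//         // process the downscaled image here
//     }
// }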

How to get frame by frame images from movie file in iPhone

In iOS, I would like to get frame-by-frame images from a movie file.
I tried using AVAssetImageGenerator, but it only gives me one image per second for a 30 fps movie. It should be 30 images!
I heard that there is a way to do it with FFmpeg.
But in newer OS versions like iOS 7, is there a new API to do this without using external libraries like FFmpeg?
You can also try OpenCV to capture the frames, but the speed will depend on the device and processor. Hopefully on an iPhone 5 the speed is good enough to capture all the frames. You can have a look at this link if it helps.
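Worth noting: AVAssetImageGenerator returns roughly one distinct image per second because its time tolerances default to unlimited; setting them to zero makes it decode the exact requested times. A minimal sketch, assuming a local movie URL (for long movies an AVAssetReader is usually faster and lighter on memory):

import AVFoundation
import UIKit

// Pull an image for every frame time by disabling the generator's time tolerance.
func extractFrames(from url: URL, fps: Double = 30) -> [UIImage] {
    let asset = AVAsset(url: url)
    let generator = AVAssetImageGenerator(asset: asset)
    generator.appliesPreferredTrackTransform = true
    generator.requestedTimeToleranceBefore = .zero   // decode the exact requested time
    generator.requestedTimeToleranceAfter = .zero

    var images: [UIImage] = []
    let duration = CMTimeGetSeconds(asset.duration)
    var t = 0.0
    while t < duration {
        let time = CMTime(seconds: t, preferredTimescale: 600)
        if let cgImage = try? generator.copyCGImage(at: time, actualTime: nil) {
            images.append(UIImage(cgImage: cgImage))
        }
        t += 1.0 / fps
    }
    return images
}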

Phonegap video capture reduce output file size

When capturing video with PhoneGap on iOS, the file size for even a 1-minute capture is ridiculously large. Well, far too large to upload over a 3G connection reliably. I read that there is a native AVCaptureSession object that allows the bitrate to be altered in order to reduce the file size. Has anyone implemented this in the PhoneGap video capture, or can anyone give me any pointers?
Found the details I needed here:
How can I reduce the file size of a video created with UIImagePickerController?
The important line for Phonegap users is in CDVCapture.m:
pickerController.videoQuality = UIImagePickerControllerQualityTypeHigh;
There are several presets that can be used e.g.
UIImagePickerControllerQualityTypeLow
PhoneGap provides a quality argument when you open the camera; set it like this:
quality: 50
This is the recommended value for iOS. If you need to reduce the file size further, reduce the quality. It takes an integer value (0-100).
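If the quality option alone is not enough, the captured file can also be re-encoded natively before upload. A sketch using AVAssetExportSession (this is not PhoneGap code; the medium-quality preset and the file URLs are assumptions):

import AVFoundation

// Re-encode a captured movie at a lower bitrate before uploading it.
func compressVideo(at inputURL: URL, to outputURL: URL, completion: @escaping (Bool) -> Void) {
    let asset = AVAsset(url: inputURL)
    guard let export = AVAssetExportSession(asset: asset, presetName: AVAssetExportPresetMediumQuality) else {
        completion(false)
        return
    }
    export.outputURL = outputURL
    export.outputFileType = .mp4
    export.shouldOptimizeForNetworkUse = true   // helpful for slow (3G) uploads
    export.exportAsynchronously {
        completion(export.status == .completed)
    }
}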

How can I make AVCaptureSessionPresetPhoto take pictures at a lower resolution?

I'm currently using AVCaptureSessionPresetPhoto to take my pictures, and I'm adding filters to them. The problem is that the resolution is so big that I have memory warnings ringing all over the place. The picture is simply way too large to process; it crashes my app every single time. Is there any way I can specify the resolution to shoot at?
EDIT**
Photography apps like Instagram or the Facebook Camera app, for example, can do this without any problems. These applications can take pictures at high resolutions, scale them down, and process them without any delay. I did a comparison check: the native iOS camera maintains a much higher quality resolution when compared to pictures taken by other applications. That extreme level of quality isn't really needed on a mobile platform, so it seems as if these images are being taken at a lower resolution to allow for faster processing and quick upload times. Thus there must be a way to shoot at a lower resolution. If anyone has a solution to this problem, it would be greatly appreciated!
You need to resize the image after capturing it with AVCaptureSession and store the resized version; see the sketch after the links below.
You can find lots of similar questions on Stack Overflow; I'm putting some links below that may help you.
One more thing: as a suggestion, use SDWebImage for displaying images asynchronously, so the app keeps working smoothly. There are also other ways to do asynchronous tasks in iOS, for example Grand Central Dispatch (GCD) and NSOperationQueue.
Resize image:
How to resize an image in iOS?
UIImage resizing not working properly
How to ReSize Image with Good Quality in iPhone
How to resize the image programmatically in objective-c in iphone
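As referenced above, a small sketch of the resizing step itself, using UIGraphicsImageRenderer (the 1024x768 target size is just an example):

import UIKit

// Downscale the captured UIImage before filtering, keeping memory use proportional to the target size.
func resized(_ image: UIImage, to targetSize: CGSize) -> UIImage {
    let renderer = UIGraphicsImageRenderer(size: targetSize)
    return renderer.image { _ in
        image.draw(in: CGRect(origin: .zero, size: targetSize))
    }
}

// Usage: let small = resized(capturedImage, to: CGSize(width: 1024, height: 768))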
