Phonegap video capture reduce output file size - iOS

When capturing video with PhoneGap on iOS, the file size for even a one-minute capture is ridiculously large; far too large to upload reliably over a 3G connection. I've read that there is a native AVCaptureSession object that allows the bitrate to be altered in order to reduce the file size. Has anyone implemented this in the PhoneGap video capture, or can anyone give me any pointers?

Found the details I needed here:
How can I reduce the file size of a video created with UIImagePickerController?
The important line for PhoneGap users is in CDVCapture.m:
pickerController.videoQuality = UIImagePickerControllerQualityTypeHigh;
There are several presets that can be used; for example, to shrink the output file, switch it to:
pickerController.videoQuality = UIImagePickerControllerQualityTypeLow;
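For anyone working in native Swift rather than in the Objective-C plugin source, the same knob looks roughly like this (a minimal sketch, not the Cordova plugin code):

    import UIKit
    import MobileCoreServices

    // Illustrative native equivalent of the CDVCapture.m change above.
    let picker = UIImagePickerController()
    picker.sourceType = .camera
    picker.mediaTypes = [kUTTypeMovie as String]  // capture video, not stills
    picker.videoQuality = .typeLow                // .typeHigh, .typeMedium, .type640x480 also exist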

PhoneGap provides a quality argument you can set when invoking the camera:
quality: 50
This is the recommended value for iOS. It takes an integer value (0-100); if you need to reduce the file size, reduce the quality.

Related

Swift send compressed video frames using GPUImage

I'm writing a Swift app that sends an iPhone camera video input (frames) over the network, so I can later display them in a macOS app.
Currently, I'm grabbing video frames from an AVCaptureSession and getting a CVPixelBuffer from the captureOutput method.
Since each frame is huge (raw pixels), I'm converting the CVPixelBuffer to a CGImage with VTCreateCGImageFromCVPixelBuffer and then to a UIImage with JPEG compression (50%). I then send that JPEG over the network and display it in the macOS app.
As you can see, this is far from ideal, and it runs at ~25 FPS on an iPhone 11. After some research, I came across GPUImage 2. It seems that I could get the data from the camera and apply something like this (so that the transformation is done on the GPU):
camera = try Camera(sessionPreset: AVCaptureSessionPreset640x480)
let pictureOutput = PictureOutput()
pictureOutput.encodedImageFormat = .JPEG
pictureOutput.imageAvailableCallback = { image in
    // Send the picture through the network here
}
camera --> pictureOutput
And I should be able to transmit that UIImage and display it in the macOS app. Is there a better way to implement this whole process? Maybe I could use the iPhone's H.264 hardware encoding instead of converting images to JPEG, but it seems that it's not that straightforward (and it seems that GPUImage does something like that, from what I've read).
Any help is appreciated, thanks in advance!
Any help is appreciated, thanks in advance!
I understand that you want to do this operation in a non-internet (local network) environment.
What are your project constraints?
Minimum FPS?
Minimum video resolution?
Should sound be transmitted?
What is your network environment?
Minimum iOS and macOS versions?
Apart from these, GPUImage is not a suitable solution for you. If you are going to transfer video, you have to encode every frame as H.264 or H.265 (HEVC). That way you can transmit video efficiently.
The pipeline you have now (CMSampleBuffer → CVPixelBuffer → JPEG → Data) puts a serious burden on the processor. It also increases the risk of memory leaks.
If you can tell me a little more, I would like to help. I have experience with video processing.
Sorry for my English.
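To make the H.264 suggestion concrete, here is a minimal sketch of a hardware encoder built on VideoToolbox's VTCompressionSession. The H264Encoder class name, the 640x480 dimensions, the 1 Mbps bitrate, and the onEncoded callback are illustrative assumptions, not code from the question:

    import VideoToolbox
    import CoreMedia

    final class H264Encoder {
        private var session: VTCompressionSession?

        init?(width: Int32 = 640, height: Int32 = 480) {
            var newSession: VTCompressionSession?
            let status = VTCompressionSessionCreate(
                allocator: kCFAllocatorDefault, width: width, height: height,
                codecType: kCMVideoCodecType_H264,
                encoderSpecification: nil, imageBufferAttributes: nil,
                compressedDataAllocator: nil, outputCallback: nil, refcon: nil,
                compressionSessionOut: &newSession)
            guard status == noErr, let s = newSession else { return nil }
            // Real-time mode and a target bitrate keep latency and size down.
            VTSessionSetProperty(s, key: kVTCompressionPropertyKey_RealTime,
                                 value: kCFBooleanTrue)
            VTSessionSetProperty(s, key: kVTCompressionPropertyKey_AverageBitRate,
                                 value: 1_000_000 as CFNumber)  // assumed 1 Mbps target
            self.session = s
        }

        // Call from captureOutput(_:didOutput:from:) with each frame's pixel buffer.
        func encode(_ pixelBuffer: CVPixelBuffer, presentationTime: CMTime,
                    onEncoded: @escaping (CMSampleBuffer) -> Void) {
            guard let session = session else { return }
            VTCompressionSessionEncodeFrame(
                session, imageBuffer: pixelBuffer,
                presentationTimeStamp: presentationTime, duration: .invalid,
                frameProperties: nil, infoFlagsOut: nil) { status, _, sampleBuffer in
                    if status == noErr, let sampleBuffer = sampleBuffer {
                        onEncoded(sampleBuffer)
                    }
            }
        }
    }

Each CMSampleBuffer handed to onEncoded holds compressed H.264 data, far smaller than a 50% JPEG of the same frame, and can be packetized for the network without ever touching CGImage or UIImage.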

Processing live video and still images simultaneously at two different resolutions on iPhone?

I'm working on a video processing app for the iPhone using OpenCV.
For performance reasons, I want to process live video at a relatively low resolution. I'm doing object detection on each frame of the video. When the objects are found in the low-resolution video frame, I need to acquire that exact same frame at a much higher resolution.
I've been able to semi-accomplish this using an AVCaptureVideoDataOutput and an AVCaptureStillImageOutput from AVFoundation, but the still image is not the exact frame that I need.
Are there any good implementations of this or ideas on how to implement it myself?
With AVCaptureSessionPresetPhoto, the session uses a small video preview (about 1000x700 on an iPhone 6) and a high-resolution photo (about 3000x2000).
So I use a modified CvPhotoCamera class to process the small preview and take the full-size photo. I posted that code here: https://stackoverflow.com/a/31478505/1994445
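For anyone not using OpenCV's camera wrappers, the same idea in plain AVFoundation looks roughly like the sketch below. The DualResolutionCapture class name, the queue label, and photoDelegate are illustrative assumptions; error handling and canAdd checks are omitted:

    import AVFoundation

    final class DualResolutionCapture: NSObject, AVCaptureVideoDataOutputSampleBufferDelegate {
        let session = AVCaptureSession()
        private let videoOutput = AVCaptureVideoDataOutput()
        private let photoOutput = AVCapturePhotoOutput()

        func configure() throws {
            session.sessionPreset = .photo  // small preview frames plus large stills
            guard let camera = AVCaptureDevice.default(for: .video) else { return }
            session.addInput(try AVCaptureDeviceInput(device: camera))
            videoOutput.setSampleBufferDelegate(self, queue: DispatchQueue(label: "frames"))
            session.addOutput(videoOutput)
            session.addOutput(photoOutput)
            session.startRunning()
        }

        // Low-resolution preview frames arrive here; run detection on them.
        func captureOutput(_ output: AVCaptureOutput, didOutput sampleBuffer: CMSampleBuffer,
                           from connection: AVCaptureConnection) {
            // When the (hypothetical) detector fires, request a full-size still, e.g.:
            // photoOutput.capturePhoto(with: AVCapturePhotoSettings(), delegate: photoDelegate)
            // where photoDelegate conforms to AVCapturePhotoCaptureDelegate.
        }
    }

The question's caveat still applies: the still is captured slightly after the frame that triggered it, so it is close to, but not guaranteed to be, the exact detected frame.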

AVFoundation max render size

I've searched quite a lot, and it seems I can't find a definitive answer on the maximum render size of a video on iOS using AVFoundation.
I need to stitch two or more videos side by side or one above the other and render them into one new video with a final size larger than 1920 x 1080. So, for example, if I have two Full HD videos (1920 x 1080) side by side, the final composition would be 3840 x 1080.
I've tried AVAssetExportSession, and it always shrinks the final video proportionally to a max of 1920 in width or 1080 in height. That's quite understandable given the available AVAssetExportSession settings like preset, file type, etc.
I also tried using AVAssetReader and AVAssetWriter, but the results are the same. I only get more control over the quality, bitrate, etc.
So... is there a way this can be achieved on iOS, or do we have to stick to a max of Full HD?
Thanks
Well... actually the answer should be YES and also NO. At least that's what I've found so far.
H.264 allows higher resolutions only with a higher-level profile, which is fine. However, on iOS the max profile that can be used is AVVideoProfileLevelH264High41, which, according to the spec, permits a max resolution of 1,920×1,080 @ 30.1 fps or 2,048×1,024 @ 30.0 fps.
So encoding with H.264 won't do the job, and the answer should be NO.
The other option is to use a different compression/codec. I tried AVVideoCodecJPEG and was able to render such a video. So the answer should be YES.
But... the problem is that this video is not playable on iOS, which again changes the answer to NO.
To summarise, I'd say: it is possible if the video is meant to be used outside the device; otherwise the video will simply not be usable.
Hope it will help other people as well, and if someone gives a better or different answer, I'll be glad.
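For reference, the writer configuration this answer describes looks roughly like the sketch below; the output URL and the 3840x1080 dimensions are illustrative:

    import AVFoundation

    // Sketch: a 3840x1080 writer input using the JPEG codec instead of H.264.
    // (outputURL is a placeholder; force-try is for brevity.)
    let outputURL = URL(fileURLWithPath: NSTemporaryDirectory() + "stitched.mov")
    let writer = try! AVAssetWriter(outputURL: outputURL, fileType: .mov)
    let settings: [String: Any] = [
        AVVideoCodecKey: AVVideoCodecType.jpeg,  // AVVideoCodecJPEG in older SDKs
        AVVideoWidthKey: 3840,                   // wider than the H.264 level allows
        AVVideoHeightKey: 1080
    ]
    let input = AVAssetWriterInput(mediaType: .video, outputSettings: settings)
    writer.add(input)
    // Append the composited 3840x1080 frames to `input` as usual.

The result is a Motion-JPEG file, which is why it renders fine off-device but, as noted above, won't play back on iOS itself.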

Compress iOS video using Phonegap

Are there any tips, suggestions, or available plugins for PhoneGap to capture a video using PhoneGap's capture feature and then programmatically compress or reduce the quality of the video so that it can be uploaded to a server?
The video file sizes can become rather large for videos longer than 5 minutes. I would strongly prefer not to have to save the video to the iPhone.
Currently it is not possible. You might have a look at this plugin, but up to Cordova 3.5 it is not possible to compress a video, and you can set its duration only on iOS.

How to get frame by frame images from movie file in iPhone

In iOS, I would like to get frame-by-frame images from a movie file.
I tried using AVAssetImageGenerator, but it gets only one image per second for a 30 fps movie. It should be 30 images!
I've heard that there is a way to do it with FFmpeg.
But in newer OS versions like iOS 7, is there a new API to do this without using external libraries like FFmpeg?
You can also try OpenCV to capture the frames, but the speed will depend on the device and processor. Hopefully on an iPhone 5 the speed is sufficient to capture all the frames. You can have a look at this link if it helps.
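Staying within AVFoundation, and as an alternative neither answer spells out, AVAssetReader decodes a file sequentially instead of seeking per image the way AVAssetImageGenerator does, so it delivers every frame. A minimal sketch (the file path is a placeholder; force-try is for brevity):

    import AVFoundation

    let asset = AVAsset(url: URL(fileURLWithPath: "/path/to/movie.mov"))
    guard let track = asset.tracks(withMediaType: .video).first else { fatalError("no video track") }

    let reader = try! AVAssetReader(asset: asset)
    let output = AVAssetReaderTrackOutput(track: track, outputSettings: [
        kCVPixelBufferPixelFormatTypeKey as String: kCVPixelFormatType_32BGRA
    ])
    reader.add(output)
    reader.startReading()

    // copyNextSampleBuffer() returns every decoded frame in order, so a
    // 30 fps movie yields 30 buffers per second of footage.
    while let sampleBuffer = output.copyNextSampleBuffer() {
        if let pixelBuffer = CMSampleBufferGetImageBuffer(sampleBuffer) {
            // Convert or process the frame here.
            _ = pixelBuffer
        }
    }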
