Two parts:
Correct me if I'm wrong, but is there no standard video file format that can hold 2048 x 1536 frames? (i.e. is recording the full resolution of the iPad Retina display impossible?)
My app uses a glReadPixels call to record the screen and appends the pixel buffers to an AVAssetWriterInputPixelBufferAdaptor. If the video needs to be resized for export, what's the best way to do this? I'm currently trying AVMutableVideoCompositionLayerInstruction and CGAffineTransform, but it's not working. Any ideas?
Thanks
Sam
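Not part of the original thread, but since the question mentions AVMutableVideoCompositionLayerInstruction and CGAffineTransform, here is a minimal sketch of how that resizing path is usually wired up. The helper name and the 1024 x 768 target are made up for illustration; the asset is assumed to contain a single video track:

#import <AVFoundation/AVFoundation.h>

// Hypothetical helper: builds a composition that scales a 2048 x 1536
// recording down to 1024 x 768 for export.
static AVMutableVideoComposition *MakeScaledComposition(AVAsset *asset)
{
    AVAssetTrack *videoTrack = [[asset tracksWithMediaType:AVMediaTypeVideo] objectAtIndex:0];
    CGFloat scale = 1024.0f / 2048.0f;

    AVMutableVideoCompositionLayerInstruction *layerInstruction =
        [AVMutableVideoCompositionLayerInstruction videoCompositionLayerInstructionWithAssetTrack:videoTrack];
    [layerInstruction setTransform:CGAffineTransformMakeScale(scale, scale) atTime:kCMTimeZero];

    AVMutableVideoCompositionInstruction *instruction =
        [AVMutableVideoCompositionInstruction videoCompositionInstruction];
    instruction.timeRange = CMTimeRangeMake(kCMTimeZero, asset.duration);
    instruction.layerInstructions = @[layerInstruction];

    AVMutableVideoComposition *videoComposition = [AVMutableVideoComposition videoComposition];
    videoComposition.renderSize = CGSizeMake(1024.0, 768.0);
    videoComposition.frameDuration = CMTimeMake(1, 30);
    videoComposition.instructions = @[instruction];
    return videoComposition;
}

The result is assigned to the videoComposition property of the AVAssetExportSession before exporting; note that the scale transform alone does not change the output dimensions, the renderSize does.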
Yes, it is possible. My app also records large-frame video.
Don't use glReadPixels; it causes a lot of delay, especially when you record big frames like 2048 x 1536.
Since iOS 5.0 you can use a faster path based on texture caches (link).
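The answer only names the technique, so here is a hedged sketch of how the texture-cache path is typically set up: render into a texture that is backed by a CVPixelBuffer, so the finished frame is already in a buffer you can append through the AVAssetWriterInputPixelBufferAdaptor, with no glReadPixels copy. eaglContext and the framebuffer object are assumed to exist already:

#import <Foundation/Foundation.h>
#import <CoreVideo/CoreVideo.h>
#import <OpenGLES/ES2/gl.h>
#import <OpenGLES/ES2/glext.h>

// One-time setup: a texture cache tied to the EAGL context used for rendering.
CVOpenGLESTextureCacheRef textureCache = NULL;
CVOpenGLESTextureCacheCreate(kCFAllocatorDefault, NULL, eaglContext, NULL, &textureCache);

// A pixel buffer that will back the render target; IOSurface backing enables the zero-copy path.
NSDictionary *attributes = @{ (id)kCVPixelBufferIOSurfacePropertiesKey : @{} };
CVPixelBufferRef pixelBuffer = NULL;
CVPixelBufferCreate(kCFAllocatorDefault, 2048, 1536, kCVPixelFormatType_32BGRA,
                    (__bridge CFDictionaryRef)attributes, &pixelBuffer);

// Wrap the pixel buffer in an OpenGL texture and attach it to the FBO.
CVOpenGLESTextureRef renderTexture = NULL;
CVOpenGLESTextureCacheCreateTextureFromImage(kCFAllocatorDefault, textureCache, pixelBuffer, NULL,
                                             GL_TEXTURE_2D, GL_RGBA, 2048, 1536,
                                             GL_BGRA, GL_UNSIGNED_BYTE, 0, &renderTexture);
glBindTexture(CVOpenGLESTextureGetTarget(renderTexture), CVOpenGLESTextureGetName(renderTexture));
glTexParameteri(GL_TEXTURE_2D, GL_TEXTURE_MIN_FILTER, GL_LINEAR);
glTexParameteri(GL_TEXTURE_2D, GL_TEXTURE_MAG_FILTER, GL_LINEAR);
glFramebufferTexture2D(GL_FRAMEBUFFER, GL_COLOR_ATTACHMENT0, GL_TEXTURE_2D,
                       CVOpenGLESTextureGetName(renderTexture), 0);

// Draw the frame into this FBO; pixelBuffer then already contains the pixels
// and can be appended to the AVAssetWriterInputPixelBufferAdaptor.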
I'm using this code (thanks damikdk) to render a Lottie animation to video:
https://github.com/damikdk/LottieExportDemo/blob/master/LottieExportDemo/ViewController.swift
I'm using oldExport() from the file above, plus these two methods (append and fill) to fill the pixel buffer with the image:
https://github.com/damikdk/LottieExportDemo/blob/master/LottieExportDemo/Helpers.swift
It works great on an iPhone 5s and exports a 30-second video in approximately 1 minute. But on an iPhone X it takes up to 10 minutes to export the same video with the same resolution settings. Is there a way to optimize this so it works better on newer devices?
This repository was created as a demonstration of bugs; you should not use it even as a starting point. Sorry if that's not obvious from the description.
I am not sure what is wrong in your case, but I will take a look if you share your code in the issues.
I'm trying to capture video with AVAssetWriter and AVCaptureOutput.
You can find the sample project here.
The video should be in portrait mode; the resolution doesn't really matter. The main problem is that it must be in portrait mode.
I've tried different settings, but in the end the video always comes out rotated and scaled to 1920x1080 on an iPhone SE.
Is it possible to control this resolution? Or at least the orientation?
Video resolution is determined by the AVCaptureSession sessionPreset. You're setting that to medium, so you're getting the resolution that comes with that. If you want a different resolution, pass a different session preset, or use AVCaptureDevice to set a specific capture format. (For a good overview of capture session presets vs device formats, go back to this WWDC13 video.)
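For illustration, a hedged sketch of both options; the 1080p preset and the 1280x720 format are arbitrary examples, and the helper name, session, and device are placeholders for your own capture session and camera:

#import <AVFoundation/AVFoundation.h>

// Hypothetical helper: raise the capture resolution for a session/device pair.
static void ConfigureCaptureResolution(AVCaptureSession *session, AVCaptureDevice *device)
{
    // Option A: ask for a specific preset instead of AVCaptureSessionPresetMedium.
    if ([session canSetSessionPreset:AVCaptureSessionPreset1920x1080]) {
        session.sessionPreset = AVCaptureSessionPreset1920x1080;
    }

    // Option B: pick an explicit capture format on the device (iOS 7+);
    // setting activeFormat overrides the session preset.
    for (AVCaptureDeviceFormat *format in device.formats) {
        CMVideoDimensions dims = CMVideoFormatDescriptionGetDimensions(format.formatDescription);
        if (dims.width == 1280 && dims.height == 720 && [device lockForConfiguration:NULL]) {
            device.activeFormat = format;
            [device unlockForConfiguration];
            break;
        }
    }
}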
Per this Apple Developer Q&A, you need to set an orientation on the capture connection after you start the capture session in order to get "physically" rotated frame buffers (at a capture performance cost), or set the transform property on your asset writer (so that buffers are recorded in the sensor's native orientation, but clients display it in your intended orientation).
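A short sketch of those two options; videoOutput and writerInput are placeholders for your capture output and AVAssetWriterInput, and the 90-degree rotation is the usual value for portrait capture from the back camera:

// Option 1: physically rotate the buffers (costs some capture performance).
// Set this after the session has started running.
AVCaptureConnection *connection = [videoOutput connectionWithMediaType:AVMediaTypeVideo];
if (connection.isVideoOrientationSupported) {
    connection.videoOrientation = AVCaptureVideoOrientationPortrait;
}

// Option 2: keep the sensor-native (landscape) buffers and record a display
// transform instead, so players show the movie in portrait.
writerInput.transform = CGAffineTransformMakeRotation(M_PI_2);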
I'm using OpenGL ES to display CVPixelBuffers on iOS. The OpenGL pipeline uses the fast texture upload APIs (CVOpenGLESTextureCache*). When running my app on an actual device the display is great, but on the simulator it's not the same (I understand those APIs don't work on the simulator).
I noticed that, when using the simulator, the pixel format is kCVPixelFormatType_422YpCbCr8. I'm trying to extract the Y and UV components and upload them with glTexImage2D, but I'm getting incorrect results. For now I'm concentrating on the Y component only, and the result looks like the image is half of the expected width and duplicated, if that makes sense.
I would like to know from someone who has successfully displayed YUV422 video frames on the iOS simulator whether I'm on the right track, and/or get some pointers on how to solve my problem.
Thanks!
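No answer was posted in this thread, but for reference: kCVPixelFormatType_422YpCbCr8 is the '2vuy' format, an interleaved single-plane layout of Cb Y'0 Cr Y'1, i.e. two bytes per pixel. Treating the base address as a tightly packed one-byte-per-pixel luminance plane therefore only covers half of each row, which is a common cause of squeezed or duplicated-looking output like the one described. A hedged sketch (the function name is made up) of pulling out just the Y bytes before uploading; it assumes the destination texture is already bound:

#include <stdlib.h>
#import <CoreVideo/CoreVideo.h>
#import <OpenGLES/ES2/gl.h>

// Copies the luma (Y) bytes out of a '2vuy' pixel buffer and uploads them
// as a GL_LUMINANCE texture.
static void UploadLumaFrom2vuy(CVPixelBufferRef pixelBuffer)
{
    CVPixelBufferLockBaseAddress(pixelBuffer, kCVPixelBufferLock_ReadOnly);
    size_t width  = CVPixelBufferGetWidth(pixelBuffer);
    size_t height = CVPixelBufferGetHeight(pixelBuffer);
    size_t stride = CVPixelBufferGetBytesPerRow(pixelBuffer);
    const uint8_t *src = CVPixelBufferGetBaseAddress(pixelBuffer);

    uint8_t *yPlane = malloc(width * height);
    for (size_t row = 0; row < height; row++) {
        const uint8_t *in = src + row * stride;
        uint8_t *out = yPlane + row * width;
        for (size_t col = 0; col < width; col++) {
            out[col] = in[col * 2 + 1]; // '2vuy' is Cb Y0 Cr Y1, so luma sits at the odd byte offsets
        }
    }

    glPixelStorei(GL_UNPACK_ALIGNMENT, 1); // rows of yPlane are tightly packed
    glTexImage2D(GL_TEXTURE_2D, 0, GL_LUMINANCE, (GLsizei)width, (GLsizei)height,
                 0, GL_LUMINANCE, GL_UNSIGNED_BYTE, yPlane);

    free(yPlane);
    CVPixelBufferUnlockBaseAddress(pixelBuffer, kCVPixelBufferLock_ReadOnly);
}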
I've searched quite a lot and it seems I couldn't find a definite answer to what the maximum render size of a video is on iOS using AVFoundation.
I need to stitch two or more videos side by side or one above the other and render them into one new video with a final size larger than 1920 x 1080. For example, with two Full HD videos (1920 x 1080) side by side, the final composition would be 3840 x 1080.
I've tried with AVAssetExportSession and it always shrinks the final video proportionally to a maximum of 1920 in width or 1080 in height, which is understandable given the available AVAssetExportSession settings (preset, file type, etc.).
I also tried using AVAssetReader and AVAssetWriter, but the results are the same; I only get more control over the quality, bitrate, etc.
So... is there a way to achieve this on iOS, or do we have to stick to Full HD at most?
Thanks
Well... actually the answer should be YES and also NO, at least from what I've found so far.
H.264 allows higher resolutions only with a higher profile and level, which is fine. However, on iOS the highest profile that can be used is AVVideoProfileLevelH264High41, which, according to the spec, permits a maximum resolution of 1,920×1,080 @ 30.1 fps or 2,048×1,024 @ 30.0 fps.
So encoding with H.264 won't do the job, and the answer should be NO.
The other option is to use a different codec. I tried AVVideoCodecJPEG and was able to render such a video, so the answer should be YES.
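Not the original poster's code, but output settings of roughly this shape are how a Motion JPEG track at that size would be requested from AVAssetWriter; the 3840 x 1080 numbers are the example from the question:

#import <AVFoundation/AVFoundation.h>

// Example only: request a 3840 x 1080 Motion JPEG video input for AVAssetWriter.
NSDictionary *outputSettings = @{
    AVVideoCodecKey  : AVVideoCodecJPEG,
    AVVideoWidthKey  : @3840,
    AVVideoHeightKey : @1080,
};
AVAssetWriterInput *writerInput =
    [AVAssetWriterInput assetWriterInputWithMediaType:AVMediaTypeVideo
                                       outputSettings:outputSettings];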
But... the problem is that this video is not playable on iOS, which again changes the answer to NO.
To summarise, I'd say: it is possible if the video is meant to be used off the device; otherwise it simply won't be usable.
Hope this helps other people as well, and if someone gives a better or different answer, I'll be glad.
I am a Cocos2d game developer, and I am developing a game using Retina display images.
I have created texture files with and without the -hd suffix using Zwoptex.
I have added those Zwoptex plist texture files in the app delegate, like [[CCSpriteFrameCache sharedSpriteFrameCache] addSpriteFramesWithFile:@"Background.plist"];.
I have enabled Retina display support with [director enableRetinaDisplay:YES];.
I use the PNG files from the plist wherever I want, like CCSprite *background = [CCSprite spriteWithSpriteFrameName:@"sample.png"];.
All the PNG files I have included are high-resolution images provided in both sizes, 960x640 and 480x320. But for no apparent reason the images look blurry and fuzzy when I run the game in the simulator or on an iPhone. Can anyone please help me solve this issue?
(The following image was posted as an example in a comment.)
cocos2d applies anti-aliasing to sprites by default. You need to turn that off:
[background.texture setAliasTexParameters];
hope this helps.
The screenshot you posted (I took the liberty of adding it to your question) shows that it was taken from the iPhone Simulator and not the iPhone (Retina) Simulator. Therefore it will not use the HD images.
With the iPhone Simulator running go to the Hardware -> Device menu and select iPhone (Retina) as the device. Then restart your app.
Note also that the iPhone Simulator will only render the game with a color depth of 16 bits, regardless of settings in cocos2d or your Mac. The iOS Simulator renderer is limited to 16-bit rendering for performance reasons (it only uses software rendering, no hardware acceleration). Only by looking at the game on an actual device can you make judgement calls about image quality.
To test whether the game is actually loading the HD assets or for some reason just loads the SD images, try running it without the SD images. If the game tries to load the SD images it will cause an error. If not, it is loading the HD images and the "blur issue" has a different cause. You could also log which files are loaded by adding an NSLog statement to the CCFileUtils class method fullPathFromRelativePath, which performs the file name changes to load -hd images whenever possible.
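Not from the original answer, but a quick alternative check that doesn't require touching CCFileUtils is to log the loaded texture's pixel size right after creating the sprite; since the sprite frame comes from the Background.plist atlas, the logged texture should report the -hd sheet's dimensions (twice those of the SD sheet) when the Retina assets are in use:

#import "cocos2d.h"

CCSprite *background = [CCSprite spriteWithSpriteFrameName:@"sample.png"];
// The texture here is the whole sprite sheet; an -hd sheet reports double the SD dimensions.
NSLog(@"sprite size in points: %@, sheet texture: %lu x %lu pixels",
      NSStringFromCGSize(background.contentSize),
      (unsigned long)background.texture.pixelsWide,
      (unsigned long)background.texture.pixelsHigh);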
You'll find that even a minuscule amount of scaling or rotation applied to a sprite may make it look blurred, so check whether you happen to do that. Any change in blend modes (using ccBlendFunc) could also cause blurred images. Also check that your images are fully opaque (opacity == 255).