Playing YUV on VLC Player - vlc

When playing a raw video (.yuv) file in VLC, how can I tell VLC the width, height and frame rate of the video?

Try this:
vlc --demux rawvideo --rawvid-fps 25 --rawvid-width 1920 --rawvid-height 1080 --rawvid-chroma I420 input.yuv

--demux rawvideo was neither helpful nor harmful for my purposes, but the correct chroma choice turned out to be --rawvid-chroma UYVY. If uncertain, just experiment: you won't get crashes, just wonky imagery.
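For reference, assuming the same dimensions and frame rate as in the answer above, the UYVY variant would look like this (only the chroma flag changes):
vlc --rawvid-fps 25 --rawvid-width 1920 --rawvid-height 1080 --rawvid-chroma UYVY input.yuv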

Related

Recorded video from iPhone rotated 180 degrees on FB, Vimeo, Youtube

This is my first question.
I developed an app that records video.
The problem is that video recorded on the iPhone appears rotated 180 degrees on FB, Vimeo and YouTube after sharing, but it appears normally in iMessage and Instagram. I'm using FFmpeg while recording it.
Could you let me know the cause of the problem and how to fix it in code?
Depending on which version of ffmpeg you have and how it's compiled, one of the following should work (note that the filter option belongs after -i, as an output option):
ffmpeg -i input.mov -vfilters "rotate=90" output.mp4
...or...
ffmpeg -i input.mov -vf "transpose=1" output.mp4
Please try this, hope it helps!
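Note that transpose=1 rotates 90 degrees clockwise; since the question describes a 180-degree rotation, a variant like the following may be closer to what's needed (untested here, but hflip and vflip are standard ffmpeg filters and together give a 180-degree turn):
ffmpeg -i input.mov -vf "hflip,vflip" output.mp4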

Scale the video up from a smaller size(Re-scale)

What I'm actually looking for is not just to improve the quality, but to resize the entire video to a greater resolution using AVFoundation.
I have videos at 320x240 and 176x144 in mp4 format and I want to upscale them to 1280x720, but AVAssetExportSession does not allow scaling the video up from a smaller size.
Try AVMutableVideoCompositionLayerInstruction and CGAffineTransform.
This code will help with understanding:
https://gist.github.com/zrxq/9817265
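For context, here is a rough Swift sketch of that approach. It is my own illustration, not the gist's code; the 4x scale factor and the 30 fps frame duration are assumptions for a 320x240 source.

import AVFoundation

// Sketch: upscale a small video track by a fixed factor using a video composition.
// 4x turns 320x240 into 1280x960; crop or letterbox separately if you need exactly 1280x720.
func makeUpscaledComposition(for asset: AVAsset) -> AVMutableVideoComposition? {
    guard let track = asset.tracks(withMediaType: .video).first else { return nil }
    let scale: CGFloat = 4.0

    // Apply the scale transform to the source track for the whole duration.
    let layerInstruction = AVMutableVideoCompositionLayerInstruction(assetTrack: track)
    layerInstruction.setTransform(CGAffineTransform(scaleX: scale, y: scale), at: .zero)

    let instruction = AVMutableVideoCompositionInstruction()
    instruction.timeRange = CMTimeRange(start: .zero, duration: asset.duration)
    instruction.layerInstructions = [layerInstruction]

    // Render at the enlarged size instead of the source's natural size.
    let composition = AVMutableVideoComposition()
    composition.instructions = [instruction]
    composition.frameDuration = CMTime(value: 1, timescale: 30)
    composition.renderSize = CGSize(width: track.naturalSize.width * scale,
                                    height: track.naturalSize.height * scale)
    return composition
}

The returned composition can then be assigned to AVAssetExportSession's videoComposition property before exporting, so the export renders at renderSize rather than the source size.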

iOS Video File Sizes and Bandwidth Considerations

I'm building an app whose core functionality is centered around 1-10 second videos. Currently, I'm recording video using PBJVision with the preset set to AVCaptureSessionPresetMedium. A 10 second video is around ~3-5MB. Considering each user could theoretically download hundreds or even thousands of videos a day, I was wondering if there was a more bandwidth efficient way of packing these videos up.
Could WebM be a more suitable container format?
I searched across the web, but couldn't find any articles pertaining to this specific question.
Edit: this looks promising
Modern video codecs (including WebM's VP8) usually achieve a compression ratio of around 1/50. By adjusting codec parameters we can reach roughly 1/100 (IMHO), but it is very difficult and the picture quality is horrible.
Roughly, we can think of 1 camera pixel as 1.5 bytes (YUV, 12 or 16 bits per pixel).
If the resolution is 720x480 and the frame rate is 30 fps:
720 x 480 x 1.5 x 30 = 15,552,000 bytes/sec
x 10 sec = 155,520,000 bytes
/ 50 = 3,110,400 bytes
~= 3 MB
It seems PBJVision is doing well.
Reducing the resolution or lowering the frame rate would be my first consideration.
iOS won't play back WebM unless you use a software decoder. A software decoder will use more CPU/battery and produce more heat, and WebM will not even solve your problem. What you want is to reduce the bitrate, but this will also reduce the quality, so it's a trade-off.
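To make the bitrate trade-off concrete, here is a hedged Swift sketch of explicit H.264 output settings for an AVAssetWriterInput. It is not tied to PBJVision's API, and the 600 kbps bitrate, 480x360 size, profile level and keyframe interval are illustrative numbers, not recommended defaults.

import AVFoundation

// Sketch: explicit H.264 encoder settings that trade picture quality for bandwidth.
let videoSettings: [String: Any] = [
    AVVideoCodecKey: AVVideoCodecType.h264,
    AVVideoWidthKey: 480,
    AVVideoHeightKey: 360,
    AVVideoCompressionPropertiesKey: [
        AVVideoAverageBitRateKey: 600_000,          // bits per second (illustrative)
        AVVideoMaxKeyFrameIntervalKey: 30,          // about one keyframe per second at 30 fps
        AVVideoProfileLevelKey: AVVideoProfileLevelH264Main31
    ]
]
let writerInput = AVAssetWriterInput(mediaType: .video, outputSettings: videoSettings)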

Export frames/images from compressed video

I have a compressed movie (mp4) and I want to extract every single frame/image from it. I know that, because of the video compression, each individual frame only contains the pixels that changed relative to the last keyframe. But that is exactly what I want: I just want to see those differences, to visually see how the compressor works.
Is there some tool out there, like ImageMagick, that can do things like that?
Found a solution on my own using ffmpeg:
ffmpeg -i video.mp4 -f image2 -vf "select=eq(pict_type\,PICT_TYPE_P)" -vsync vfr pframe_%04d.png
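If you also want the keyframes for comparison, the same approach with the I-frame constant should work (the iframe_%04d.png naming is just an assumption to keep the two sets apart):
ffmpeg -i video.mp4 -f image2 -vf "select=eq(pict_type\,PICT_TYPE_I)" -vsync vfr iframe_%04d.png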

AVFoundation max render size

I've searched quite a lot and it seems that couldn't find a definite answer to what is the maximum render size of a video on iOS using AVFoundation.
I need to stitch two or more videos side by side or one above the other and render them into one new video with a final size larger than 1920 x 1080. So, for example, if I have two Full HD videos (1920 x 1080) side by side, the final composition would be 3840 x 1080.
I've tried with AVAssetExportSession and it always shrinks the final video proportionally to max 1920 in width or 1080 in height. It's quite understandable because of all possible AVAssetExportSession settings like preset, file type etc.
I tried also using AVAssetReader and AVAssetWriter but the results are the same. I only have more control over the quality, bitrate etc.
So... is there a way this can be achieved on iOS, or do we have to stick to a maximum of Full HD?
Thanks
Well... Actually the answer should be YES and also NO, at least from what I've found so far.
H.264 allows higher resolutions only with a higher-level profile, which is fine. However, on iOS the max profile that can be used is AVVideoProfileLevelH264High41, which according to the specs permits a max resolution of 1,920x1,080 @ 30.1 fps or 2,048x1,024 @ 30.0 fps.
So encoding with H.264 won't do the job and the answer should be NO.
The other option is to use other compression/codec. I've tried AVVideoCodecJPEG and was able to render such a video. So the answer should be YES.
But.. the problem is that this video is not playable on iOS which again changes the answer to NO.
To summarise, I'd say: it is possible if the video is meant to be used off the device; otherwise it will simply not be usable.
Hope it will help other people as well and if someone else gives a better, even different answer I'll be glad.
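For completeness, a minimal Swift sketch of the JPEG workaround described above. The output path and the 3840x1080 canvas are assumptions, and sample buffers still have to be appended from your own compositing pipeline.

import AVFoundation

// Sketch: an AVAssetWriter configured for a 3840x1080 canvas using the JPEG codec,
// which sidesteps the H.264 High 4.1 size limit at the cost of huge files and no
// hardware-accelerated playback on the device.
func makeWideWriter() throws -> AVAssetWriter {
    let outputURL = URL(fileURLWithPath: NSTemporaryDirectory()).appendingPathComponent("stitched.mov")
    let writer = try AVAssetWriter(outputURL: outputURL, fileType: .mov)
    let settings: [String: Any] = [
        AVVideoCodecKey: AVVideoCodecType.jpeg,
        AVVideoWidthKey: 3840,
        AVVideoHeightKey: 1080
    ]
    let input = AVAssetWriterInput(mediaType: .video, outputSettings: settings)
    input.expectsMediaDataInRealTime = false
    if writer.canAdd(input) { writer.add(input) }
    return writer
}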
