I want to achieve something like this:
Reference Video
Say I have a vertical video (dimensions: 720x1280). I want to create a horizontal video with an adaptive background like the one in the video I've shown.
I have written some code for reading from and writing to a file.
import cv2

video_index = 0
cap = cv2.VideoCapture(videofiles[video_index])  # videofiles: list of input paths

# output resolution: 1920x1080 px
out = cv2.VideoWriter("video.mp4",
                      cv2.VideoWriter_fourcc(*'mp4v'),
                      30, (1920, 1080), True)
What is this effect called, where the background is a smudged version of the video filling the sides, in OpenCV/FFmpeg terms or otherwise?
How do I achieve this effect using code or tools (I am open to using OSS desktop tools)?
That effect is simply realized by scaling the initial video to the desired size, blurring it, and overlaying the original video on top at the center.
For the blur, I suggest starting with Gaussian blur, available in OpenCV.
As suggested by Gyan, I used the following command:
ffmpeg -i video720x1280.mp4 \
  -filter_complex \
  "[0]scale=hd1080,setsar=1,boxblur=20:20[b];[0]scale=-1:1080[v];[b][v]overlay=(W-w)/2" \
  video1920x1080.mp4
It worked like a charm!
Link to original answer.
Related
Basically I have an SKScene, and I want to play a video over the scene. The video is confetti falling with an alpha background. It will play when the player gets a high score. I am using an SKScene with shapes and images drawn with shape nodes and image nodes. I was just wondering if anyone could please tell me how to play the video over the screen, still see the game behind it, and be able to touch the buttons through the video. It is supposed to look like an animation playing.
I am using a video because I thought playing a video would be more processor-efficient than having the game generate particles.
There is no built-in iOS solution. You can play 24BPP (fully opaque) movies under iOS, but the only built-in way to display alpha channel video would be to load a series of PNG images with alpha. The downside is that this takes up a huge amount of memory and bloats the app download. If you want to look at some working examples of this kind of functionality in a 3rd party app, see Alpha Channel Examples. You might also be interested in this blog post, which shows example code for how alpha channel textures could be implemented in OpenGL, and which could be built on top of SpriteKit too. The cube example shows rendering an alpha channel movie onto a cube; it was adapted from a Ray Wenderlich tutorial.
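For reference, the PNG-sequence route can be done directly in SpriteKit. A minimal sketch, inside your SKScene, assuming roughly 60 bundled frames named confetti_0.png through confetti_59.png (the names and frame count are hypothetical):

let textures = (0..<60).map { SKTexture(imageNamed: "confetti_\($0)") }
let overlay = SKSpriteNode(texture: textures.first)
overlay.size = size                          // cover the whole scene
overlay.position = CGPoint(x: size.width / 2, y: size.height / 2)
overlay.zPosition = 100                      // draw above the game nodes
overlay.isUserInteractionEnabled = false     // touches fall through to the scene
addChild(overlay)
let play = SKAction.animate(with: textures, timePerFrame: 1.0 / 30.0)
overlay.run(SKAction.sequence([play, .removeFromParent()]))

As noted above, every frame kept in memory is costly, so keep the sequence short and the frames small.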
Here is an answer on how to do that with GPUImageView. There is also a project on GitHub and a similar question on Stack Overflow.
The video stack doesn't yet support formats with alpha. For confetti, you should use SKEmitterNode. Size it to the area you envisioned for your video, and see Creating Particle Effects, in particular its link to "Add a particle emitter to your project", and try out the "Snow" effect. It looks more like confetti when you give it a different color than white. Click the dot under "Color Ramp" to set the color.
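For example, a minimal sketch inside your SKScene, assuming a particle file named Confetti.sks created from the Snow template (the file name is hypothetical):

if let confetti = SKEmitterNode(fileNamed: "Confetti") {
    // position the emitter along the top edge so particles fall over the scene
    confetti.position = CGPoint(x: size.width / 2, y: size.height)
    confetti.particlePositionRange = CGVector(dx: size.width, dy: 0)
    confetti.zPosition = 100  // render above the game nodes
    addChild(confetti)
}

Since the emitter is just a node, touches still reach the rest of the scene underneath it.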
I want to develop a feature in an application by which a vintage projector effect can be applied to a video being recorded or a pre-recorded video. If you want, I can share an image; I want an effect similar to it: one part of the video shows in the bottom frame and the bottom part of the video shows in the top frame. Along with this, the whole view should shake like a real vintage projector recording.
Did you take a look at GPUImage?
It has lots of options for video recording/processing and allows you to add and combine different filters.
I have an issue with cropping video, and I also need to expand the video
cropping frame like the native application does: whenever the user taps on the
cropping frame, I need to expand the video frames (thumbnails) and also want
to crop the video, but I don't understand how to do it.
Here's a link to an open source control that you can use as a foundation.
https://github.com/andrei200287/SAVideoRangeSlider
This control already generates thumbnails from the source video and draws a linear trim bar with draggable handles that let you set the start/end point of the video. All you'd need to do is add some code to reload the trim bar with detailed images from around the time of the handle that's being dragged at the moment.
This control also comes with a sample app that exports the trimmed video from the selected start and end points.
Hopefully that's what you're looking for.
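The detailed images can be pulled with AVAssetImageGenerator. This is not part of SAVideoRangeSlider itself, so here is a minimal sketch under that assumption, with placeholder names:

import AVFoundation
import CoreGraphics

// Pull a handful of thumbnails around the time where a handle is being dragged.
func thumbnails(around center: CMTime, in assetURL: URL, count: Int) -> [CGImage] {
    let asset = AVAsset(url: assetURL)
    let generator = AVAssetImageGenerator(asset: asset)
    generator.appliesPreferredTrackTransform = true          // respect rotation
    generator.maximumSize = CGSize(width: 120, height: 120)  // thumbnail-sized
    var images: [CGImage] = []
    for i in 0..<count {
        // sample half a second apart, centered on the handle position
        let offset = CMTime(seconds: (Double(i) - Double(count) / 2) * 0.5,
                            preferredTimescale: 600)
        if let image = try? generator.copyCGImage(at: CMTimeAdd(center, offset),
                                                  actualTime: nil) {
            images.append(image)
        }
    }
    return images
}

You would call this from the handle's drag callback and reload the trim bar's image views with the result.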
I am looking for a way to slice out the middle part of the video, remove it, and merge the other two parts into a new track. Is there any way in AVFoundation to perform this?
For example, see the illustration below: if the full video height runs from 1 to 4, I need to crop out the slice from 2 to 3 and combine the [1-2] and [3-4] parts into a new video track. (The following illustrates ONE frame, not the whole clip.)
1----------------------------------
2----------------------------------
3----------------------------------
4----------------------------------
I think you should try to overlay a cropped video in the middle of your background video instead of trying to stitch the two outer slices into a whole.
Take a look at GPUImage's blend filters and then use a movie writer.
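For instance, a rough sketch using GPUImage from Swift via a bridging header. The URLs, output size, and crop region are placeholders, and positioning the cropped layer within the frame would additionally need something like GPUImageTransformFilter, which is left out here:

// Wiring: background movie -> blend <- crop <- foreground movie, then a writer.
let background = GPUImageMovie(url: backgroundURL)
let foreground = GPUImageMovie(url: foregroundURL)

// Crop region is normalized (0-1): here, the middle half of the frame height.
let crop = GPUImageCropFilter(cropRegion: CGRect(x: 0, y: 0.25, width: 1, height: 0.5))
let blend = GPUImageNormalBlendFilter()

background.addTarget(blend)   // first input: full background frame
foreground.addTarget(crop)
crop.addTarget(blend)         // second input: cropped band, composited on top

let writer = GPUImageMovieWriter(movieURL: outputURL,
                                 size: CGSize(width: 720, height: 1280))
blend.addTarget(writer)

writer.startRecording()
background.startProcessing()
foreground.startProcessing()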
I am able to show a square preview of the recording using the AVCaptureVideoPreviewLayer class, but it saves the video as a rectangle in the library. Instead of a rectangle I want a square video. I have used a composition class to crop the video, and it works, but it takes too much time. I checked the Vine app, which has square video output.
Please give me a suggestion.
It's a late answer, but it may be useful to others. See PBJVision, which records square video:
https://github.com/piemonte/PBJVision
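A rough sketch of the square-recording setup (Swift, via a bridging header; previewView is a placeholder, and the exact bridged enum spellings may differ by library version, so treat this as an assumption to verify against the README):

let vision = PBJVision.sharedInstance()
vision.delegate = self            // adopt PBJVisionDelegate to receive the file
vision.cameraMode = .video
vision.outputFormat = .square     // record 1:1 directly, no post-crop composition
vision.previewLayer.frame = previewView.bounds
previewView.layer.addSublayer(vision.previewLayer)
vision.startPreview()

// start and stop recording:
vision.startVideoCapture()
// ...
vision.endVideoCapture()          // delegate callback delivers the captured video

Because the output format is square at capture time, there is no slow AVMutableComposition crop pass afterwards.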