Is there an option in JavaCV to capture 1080p video from images?
If I use FFmpegFrameRecorder, I only get 480p.
Or is there an alternative library for Java?
I want to use it to create a video from a set of pictures (with zooming and rotating effects).
Greetings
//EDIT
Okay, now I've tested a very simple piece of code:
FrameRecorder recorder = FFmpegFrameRecorder.createDefault("out.avi", 1920, 1080);
recorder.start();
recorder.record(iplImage);
recorder.stop();
and it works! But the file is very large (10 seconds is around 300 MB...).
Now I want to add a codec like Xvid. I get the following exception:
com.googlecode.javacv.FrameRecorder$Exception: codec not found
But I've installed the Xvid package. Do I have to put the codec in a special folder, like the ffmpeg bin directory?
Okay, now I've tested the MP4 codec and everything works fine :)
//UPDATE
For JavaCV 0.2:
FrameRecorder recorder = FFmpegFrameRecorder.createDefault("out.avi", 1920, 1080);
recorder.setCodecID(CODEC_ID_MPEG4);
recorder.setPixelFormat(PIX_FMT_YUV420P);
recorder.start();
.....
recorder.stop();
For JavaCV 0.3:
FrameRecorder recorder = FFmpegFrameRecorder.createDefault("out.avi", 1920, 1080);
recorder.setVideoCodec(CODEC_ID_MPEG4);
recorder.setFrameRate(fps);
recorder.setFormat("avi");
recorder.start();
.....
recorder.stop();
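For anyone on a current JavaCV release (the org.bytedeco packages), here is a minimal, self-contained sketch of the same recording. It also sets an explicit bitrate, which is what keeps the file size in check; the 4 Mbit/s value, the 25 fps rate, and the blank BufferedImage are placeholders of mine, not from the original post:
import java.awt.image.BufferedImage;
import org.bytedeco.javacv.FFmpegFrameRecorder;
import org.bytedeco.javacv.Frame;
import org.bytedeco.javacv.Java2DFrameConverter;
import static org.bytedeco.ffmpeg.global.avcodec.AV_CODEC_ID_MPEG4;
import static org.bytedeco.ffmpeg.global.avutil.AV_PIX_FMT_YUV420P;

public class Record1080p {
    public static void main(String[] args) throws Exception {
        FFmpegFrameRecorder recorder = new FFmpegFrameRecorder("out.avi", 1920, 1080);
        recorder.setVideoCodec(AV_CODEC_ID_MPEG4);
        recorder.setPixelFormat(AV_PIX_FMT_YUV420P);
        recorder.setFormat("avi");
        recorder.setFrameRate(25);
        recorder.setVideoBitrate(4_000_000); // ~4 Mbit/s; raise or lower to trade size for quality
        recorder.start();

        Java2DFrameConverter converter = new Java2DFrameConverter();
        BufferedImage img = new BufferedImage(1920, 1080, BufferedImage.TYPE_3BYTE_BGR);
        for (int i = 0; i < 250; i++) {           // 10 seconds at 25 fps
            Frame frame = converter.convert(img); // draw the zoom/rotate effect into img here
            recorder.record(frame);
        }
        recorder.stop();
        recorder.release();
    }
}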
I am facing an issue using the library to convert from m4a to wav. I am using the FormatConverter class, and my call looks correct, but the output is still m4a; it is not converted to wav.
This is the conversion code I am using, similar to the sample provided in the library:
import AudioKit

var options = FormatConverter.Options()
options.format = "wav"
options.sampleRate = 48000
options.bitDepth = 24

let converter = FormatConverter(inputURL: selectedFileURL, outputURL: sandboxFileURL, options: options)
converter.start { error in
    if error == nil {
        print("Conversion succeeded.")
    }
}
The file does actually change: it goes from selectedFileURL to sandboxFileURL with an increased size (the sample I am using is 76 KB and grows to 1.3 MB after the conversion), but it keeps the same m4a format.
My original plan is to take a long voice note recorded with the Voice Memos app, upload it into the application, and convert it from m4a to wav, but it stays m4a.
I uploaded two screenshots, taken in the Files app, from before and after the conversion.
Before the conversion
After the conversion
Is there something I am missing?
Thank you.
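Not a confirmed diagnosis, but one thing I would rule out first: that sandboxFileURL actually ends in .wav. If the destination URL keeps an .m4a extension, anything that keys off the path extension will happily keep writing m4a. A small sketch (the URLs here are stand-ins for the question's selectedFileURL and sandboxFileURL):
import AudioKit
import Foundation

let inputURL = URL(fileURLWithPath: "/tmp/recording.m4a") // stands in for selectedFileURL
var outputURL = URL(fileURLWithPath: "/tmp/converted.m4a") // stands in for sandboxFileURL

// Force the destination to carry the target extension before converting.
if outputURL.pathExtension.lowercased() != "wav" {
    outputURL.deletePathExtension()
    outputURL.appendPathExtension("wav")
}

var options = FormatConverter.Options()
options.format = "wav"
options.sampleRate = 48000
options.bitDepth = 24

let converter = FormatConverter(inputURL: inputURL, outputURL: outputURL, options: options)
converter.start { error in
    if let error = error {
        print("Conversion failed: \(error)")
    } else {
        print("Conversion succeeded: \(outputURL.path)")
    }
}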
I write a video using cv2.VideoWriter with fps=5; VLC plays it back at 5 fps, and ffprobe reports tbr=5.
I write it with fps=4.5; VLC plays it back at 4.5 fps, but ffprobe reports tbr=9.
How can I detect 4.5 fps?
EDIT: BTW, the metadata shown in the Windows file manager and by cv2's get(cv2.CAP_PROP_FPS) is 600 fps.
EDIT2:
It turns out the issue only occurs on the Raspberry Pi. It looks like cv2 does not write the metadata correctly there, since ffprobe on the rpi reads a laptop-created file fine. However, even the rpi-created file plays fine, so there must be a way VLC detects the fps.
import cv2

source = "test.avi"
reader = cv2.VideoCapture(source)
# Re-encode at 4.5 fps; the frame size given here must match the frames fed to write().
writer = cv2.VideoWriter("temp.avi", cv2.VideoWriter_fourcc(*'DIVX'), 4.5, (800, 600))
while True:
    res, img = reader.read()
    if not res:
        break
    writer.write(img)
reader.release()
writer.release()
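For what it's worth, tbr is only ffmpeg's rounded guess at the frame rate, and 4.5 fps is exactly 9/2. A sketch of what I would query instead, assuming ffprobe is on the PATH: the stream's avg_frame_rate, which ffprobe reports as an exact fraction.
import json
import subprocess
from fractions import Fraction

# Ask ffprobe for the exact rate fractions instead of the rounded tbr.
probe = subprocess.run(
    ["ffprobe", "-v", "error", "-select_streams", "v:0",
     "-show_entries", "stream=avg_frame_rate,r_frame_rate",
     "-of", "json", "temp.avi"],
    capture_output=True, text=True, check=True,
)
stream = json.loads(probe.stdout)["streams"][0]
fps = float(Fraction(stream["avg_frame_rate"]))  # "9/2" -> 4.5
print(fps)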
I'm coding something that:
records video+audio with the built-in camera and mic (AVCaptureSession),
does some stuff with the video and audio sample buffers in real time,
saves the result into a local .mp4 file using AVAssetWriter,
then (later) reads the file (video+audio) using AVAssetReader,
does some other stuff with the sample buffers (for now I do nothing),
and writes the result into a final video file using AVAssetWriter.
Everything works well, but I have an issue with the audio format:
When I capture the audio from the capture session, I can log about 44 sample buffers per second, which seems normal.
When I read the .mp4 file, I only log about 3-5 audio sample buffers per second!
But the two files look and sound exactly the same (in QuickTime).
I didn't set any audio settings for the capture session (as Apple doesn't allow it).
I configured the outputSettings of the two audio AVAssetWriterInputs as follows:
NSDictionary *settings = @{
    AVFormatIDKey: @(kAudioFormatLinearPCM),
    AVNumberOfChannelsKey: @(2),
    AVSampleRateKey: @(44100.),
    AVLinearPCMBitDepthKey: @(16),
    AVLinearPCMIsNonInterleaved: @(NO),
    AVLinearPCMIsFloatKey: @(NO),
    AVLinearPCMIsBigEndianKey: @(NO)
};
I pass nil to the outputSettings of the audio AVAssetReaderTrackOutput in order to receive samples as stored in the track (according to the doc).
So the sample rate should be 44100 Hz all the way from the capture session to the final file. Why am I reading only a few audio sample buffers? And why does it work anyway? I have the intuition that it will not work well once I have to manipulate the samples (I need to update their timestamps, for example).
I tried several other settings (such as kAudioFormatMPEG4AAC), but AVAssetReader can't read compressed audio formats.
Thanks for your help :)
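An unverified hunch about those numbers: a CMSampleBuffer is a container that can hold many PCM frames, so AVAssetReader may simply hand back fewer, larger buffers than the live capture did (3-5 big buffers per second can still add up to 44100 frames per second). A quick way to check, sketched in Objective-C with a helper name of my own:
#import <CoreMedia/CoreMedia.h>
#import <Foundation/Foundation.h>

// Log the PCM frame count of every buffer, from both the capture callback
// and the AVAssetReaderTrackOutput loop; if the frames-per-second totals
// match, the difference is just buffer granularity, not missing audio.
static void LogAudioBuffer(CMSampleBufferRef sampleBuffer) {
    CMItemCount frames = CMSampleBufferGetNumSamples(sampleBuffer);
    CMTime pts = CMSampleBufferGetPresentationTimeStamp(sampleBuffer);
    NSLog(@"buffer at %.3f s holds %ld PCM frames",
          CMTimeGetSeconds(pts), (long)frames);
}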
I'm doing a project which requires converting SWF movies to H.264 video on the server side, so they can be played both in the Flash player and on iPhone/iPad. And I'm really stuck.
I'm using Melt from http://www.mltframework.org/ and this is my command line:
melt movie.swf -consumer avformat:video.mp4 r=30 s=640x360 f=mp4 acodec=aac ab=128k ar=48000 vcodec=libx264 b=1000k an=1
It does play in the Flash player but fails to play on iDevices. I googled the iPhone video requirements, and my files seem to satisfy them (frame size, frame rate and bitrate). What settings should I change to make it play?
I spent a lot of time on Google but eventually managed to gather all the pieces, so these are the parameters that work for iPhone:
r=30 s=640x360 f=mp4 acodec=aac ab=128k ar=48000 vcodec=libx264 level=30 b=1024k flags=+loop+mv4 cmp=256 partitions=+parti4x4+parti8x8+partp4x4+partp8x8+partb8x8 me_method=hex subq=7 trellis=1 refs=1 bf=0 flags2=+mixed_refs-wpred-dct8x8 coder=0 wpredp=0 me_range=16 g=250 keyint_min=25 sc_threshold=40 i_qfactor=0.71 qmin=10 qmax=51 qdiff=4 maxrate=10M bufsize=10M an=1 threads=0
Also, I use faac -w to convert the audio to the appropriate format, and MP4Box to mux the video and audio together. (Essentially, the x264 options above, no B-frames (bf=0), CAVLC entropy coding (coder=0), and no 8x8 DCT or weighted prediction, force a Baseline-compatible H.264 stream at level 3.0, which older iDevices require.)
I'm trying to get a video to play in an Away3D texture on iOS. It's fine on Android and Windows, and the video plays in Starling on iOS, so I know it's not the video itself.
Here is how I add the video:
sphereGeometry = new SphereGeometry(5000, 64, 48);
panoTextureMaterial = new TextureMaterial(panoTexture2DBase, false, false, false);
panoVideoMesh = new Mesh(sphereGeometry, panoTextureMaterial);
panoVideoMesh.scaleX *= -1;
panoVideoMesh.rotate(Vector3D.Y_AXIS,-90);
scene.addChild(panoVideoMesh);
panoTexture2DBase.player.play();
view.render();
On iOS I get this from the NetStream status events when I try to load it as a video texture:
NetStream.Play.Start
NetStream.Play.Failed
NetStream.Play.Stop
I'm using the Away3D NativeVideoTexture class:
texture = context.createVideoTexture();
texture.attachNetStream(_player.ns);
I think it might be something to do with the MP4 encoding. I've had a good look around and can't find anything that works; currently I'm trying this in FFmpeg (note -s expects WxH, i.e. 1024x768, not 1024:768):
-vcodec libx264 -profile:v main -level 3.1 -crf 23 -s 1024x768 -movflags +faststart
But whatever I set doesn't seem to make much difference.
Any idea why my video is failing to load as a VideoTexture on iOS?
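Two things I would check before blaming the encoding (both are guesses, not confirmed causes): whether the device and context actually support video textures at all, and what the full status info object says when playback fails. A small ActionScript sketch, using the question's _player.ns stream:
// Guard texture creation: Context3D.supportsVideoTexture is false on
// configurations where createVideoTexture() is unavailable.
if (!Context3D.supportsVideoTexture) {
    trace("VideoTexture not supported on this device/context");
}

// Log every status event with its full info object, not just the code;
// the extra fields often say why NetStream.Play.Failed fired.
_player.ns.addEventListener(NetStatusEvent.NET_STATUS,
    function(e:NetStatusEvent):void {
        trace(e.info.code, JSON.stringify(e.info));
    });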