Is there any way to change (increase/decrease) audio playback speed using Flutter?
I couldn't find any info about it, and it seems like writing native code is the only option.
The audioplayers package has built-in support for setting the playback rate:
_audioPlayer.setPlaybackRate(playbackRate: 1.0);
Try the audioplayers_with_rate fork of audioplayers. It worked great for me, though I couldn't find where its code is hosted (its listing links back to audioplayers). The same functionality is provided by this fork or this fork of audioplayers.
As mentioned in the release highlights, OpenCV (4.5.5) now has audio support in the videoio module. However, there's no documentation on this topic.
I've tried a few things on my own like:
cv::VideoCapture cap(fileName,cv::CAP_MSMF);
However, no results so far.
How can I activate Audio Support? Am I missing something?
(It works neither with cameras nor with video files.)
Additionally, I normally build OpenCV from source rather than using pre-built binaries, but I also tried the pre-built Windows binaries and they didn't work either.
As far as I can see, this feature is unusable until they expose an interface for it; that's probably why there's no documentation on the topic. I hope they'll ship that with 4.5.6.
So I am running VLC on a Pi 4, and I have installed a VLC extension that shows the elapsed seconds of the video. However, when I launch playback through python-vlc, the extension is not enabled. Is there a way to enable it using python-vlc?
I think you're confusing VLC and LibVLC. By "extension", I believe you mean VLC app add-ons. I don't think you can easily have these running when using standard libvlc builds.
However, there is a way to achieve what you want using the LibVLC APIs. Look into the marquee API, a video filter that lets you display custom text at custom positions on the video.
Docs: https://www.videolan.org/developers/vlc/doc/doxygen/html/group__libvlc__video.html#ga53b271a8a125a351142a32447e243a96
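To make that concrete: instead of enabling the Lua extension, you can reproduce its elapsed-seconds readout with the marquee filter driven from python-vlc. A minimal sketch, assuming python-vlc is installed (`pip install python-vlc`) and a local file path; the `format_elapsed` helper and the once-per-second polling loop are my own illustration, not part of the libvlc API:

```python
import time


def format_elapsed(ms):
    """Format a millisecond count as H:MM:SS for the marquee text."""
    if ms < 0:  # get_time() returns -1 before playback starts
        ms = 0
    seconds = ms // 1000
    return "%d:%02d:%02d" % (seconds // 3600, (seconds // 60) % 60, seconds % 60)


def play_with_elapsed_marquee(path):
    import vlc  # pip install python-vlc

    player = vlc.MediaPlayer(path)

    # Enable the marquee video filter and place it near the top-left corner.
    player.video_set_marquee_int(vlc.VideoMarqueeOption.Enable, 1)
    player.video_set_marquee_int(vlc.VideoMarqueeOption.X, 10)
    player.video_set_marquee_int(vlc.VideoMarqueeOption.Y, 10)

    player.play()

    # Push the current elapsed time into the marquee once a second.
    while player.get_state() not in (vlc.State.Ended, vlc.State.Error):
        player.video_set_marquee_string(
            vlc.VideoMarqueeOption.Text, format_elapsed(player.get_time()))
        time.sleep(1)
```

This doesn't run the extension itself; it just renders the same information via the marquee filter, which is the approach suggested above.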
I absolutely need to play remote files in an iOS app I'm developing, so using AVPlayer seems to be my only option. (I don't want to download the files as NSData and then use AVAudioPlayer, because that doesn't let playback start immediately.) These remote files sometimes vary greatly in output volume from one to another. How can I implement some form of automatic gain control for the AVPlayer? It's seeming like it may not even be possible.
Also: I've explored MTAudioProcessingTap, but couldn't get it to work using the information from the following blogs:
http://venodesigns.net/2014/01/08/recording-live-audio-streams-on-ios/
and
https://chritto.wordpress.com/2013/01/07/processing-avplayers-audio-with-mtaudioprocessingtap/
I'm open to any ideas that involve the AVPlayer. Can it be done? (Thanks in advance - cheers!)
I've hit a roadblock with using GPUImage. I'm trying to apply a filter (SepiaFilter or OpacityFilter) to a prerecorded video. What I'm expecting to see is the video played back with the filter applied. I followed the SimpleFileVideoFilter example for my code. What I ended up with is a video that QuickTime cannot play (an .m4v file), and a live preview of the rendering that is all skewed. I thought it was my code at first, so I ran the example app from the examples directory, and lo and behold I got the same issue. Is the library broken? I just refreshed from master on GitHub.
Thanks!
Here's a sample output of the video generated
http://youtu.be/SDb9GfVf9Lc
No matter which filter is applied, the resulting videos all look similar (all skewed).
@Brad Larson (I hope you see this message), do you know what I could be doing wrong? I am using the latest Xcode and the latest GPUImage source. I also tried the latest from CocoaPods. Both end up the same.
I assume you're trying to run this example via the Simulator. Movie playback in the Simulator has been broken for as long as I can remember. You need to run this on an actual device to get movie playback to work.
Unfortunately, one of the recent pull requests that I brought in appears to have introduced some crashing bugs even there, and I may need to revert those changes and figure out what went wrong. Even that's not an iOS version thing, it's a particular bug with a recent code addition. I haven't had the time to dig into it and fix it, though.
Could you point me to any examples where the JUCE library has been used to process audio on iOS? Thanks in advance.
Regards,
Waruna.
Look at the JUCE demo included with JUCE; it runs just fine on iOS. Just edit that code and register an AudioIODeviceCallback with your AudioDeviceManager object to do custom audio processing.