Can I access a VLC extension using python-vlc on a Raspberry Pi 4?

So I am running VLC on a Pi 4 and I have installed an extension to VLC that shows the elapsed seconds for the video. However, when I use python-vlc to launch VLC it does not enable the extension. Is there a way to do this using python-vlc?

I think you're confusing VLC and LibVLC. By "extension", I believe you mean VLC app add-ons. I don't think you can easily have these running when using standard LibVLC builds.
However, there is a way to achieve what you want using the LibVLC APIs. Look into the marquee API, a video filter that lets you display custom text at custom positions on the video.
Docs: https://www.videolan.org/developers/vlc/doc/doxygen/html/group__libvlc__video.html#ga53b271a8a125a351142a32447e243a96
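A minimal sketch of the marquee approach via python-vlc (the media path and the position value 6, meaning top-right, are assumptions; python-vlc and a system VLC install are required to actually run it):

```python
import time

def format_elapsed(ms):
    """Turn a player position in milliseconds into whole seconds as text.

    get_time() returns -1 when no media is playing, so clamp to 0.
    """
    return str(max(ms, 0) // 1000)

def show_elapsed(path):
    import vlc  # requires the python-vlc package and a VLC install

    player = vlc.MediaPlayer(path)
    player.play()

    # Enable the marquee video filter and configure it.
    player.video_set_marquee_int(vlc.VideoMarqueeOption.Enable, 1)
    player.video_set_marquee_int(vlc.VideoMarqueeOption.Position, 6)  # top-right
    player.video_set_marquee_int(vlc.VideoMarqueeOption.Timeout, 0)   # never hide
    player.video_set_marquee_int(vlc.VideoMarqueeOption.Refresh, 1000)

    # Update the overlay text once per second with the elapsed time.
    while player.get_state() != vlc.State.Ended:
        player.video_set_marquee_string(vlc.VideoMarqueeOption.Text,
                                        format_elapsed(player.get_time()))
        time.sleep(1)
```

Call it as `show_elapsed("video.mp4")` with your own file path.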

Related

Changing audio playback speed

Is there any way to change (increase/decrease) audio playback speed using Flutter?
Couldn't find any info about it and it seems like writing native code is the only option.
The audioplayers package has built-in support for setting the playback rate:
_audioPlayer.setPlaybackRate(playbackRate: 1.0);
Try the audioplayers_with_rate fork of audioplayers. It worked great for me, though I couldn't find where its code is hosted (it links back to audioplayers). The same functionality is provided by this fork or this fork of audioplayers.

Using the AVPlayer in swift (xcode 6), how can I implement automatic gain control (AGC) while playing remote files?

I absolutely need to play remote files in an iOS app I'm developing, so using AVPlayer seems to be my only option. (I don't want to download/load files as NSData and then use AVAudioPlayer, because that doesn't allow the files to start playing immediately.) These remote files sometimes vary greatly in output volume from one to another. How can I implement some form of automatic gain control for the AVPlayer? It seems like it might not even be possible.
Also: I've explored the MTAudioProcessingTap, but couldn't get it to work using information from the following blogs:
http://venodesigns.net/2014/01/08/recording-live-audio-streams-on-ios/
and
https://chritto.wordpress.com/2013/01/07/processing-avplayers-audio-with-mtaudioprocessingtap/
I'm open to any ideas that involve the AVPlayer. Can it be done? (Thanks in advance - cheers!)

Videosink on Gstreamer 1.2.3 (iOS)

Does GStreamer for iOS currently support displaying video? I'm following the tutorial, which calls for creating a pipeline:
gst_parse_launch("videotestsrc ! warptv ! videoconvert ! autovideosink", &error);
and then connecting the video overlay.
video_sink = gst_bin_get_by_interface(GST_BIN(pipeline), GST_TYPE_VIDEO_OVERLAY);
However, video_sink is always nil. If I change the pipeline to just playbin, that works, but playbin is for playing from a URI, and I need to construct a full GStreamer video pipeline.
I also can't find any video sinks other than autovideosink. Is displaying video from a GStreamer pipeline currently supported on iOS?
This is on iOS 7.1 with GStreamer 1.2.3.
With some help from the mailing list I got the test video displaying. I put up my working version of the iOS video tutorial app.
The short answer is that GStreamer 1.2.3 does support video display, using eglglessink. However, you need to modify the #defines in gst_ios_init.h to make sure eglglessink is included. You also need to use a GLKView to provide the GL primitives, and the video_overlay methods to wire this up.
I found it difficult to discover this from the documentation, so hopefully others may find the tutorial useful.
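The overlay lookup can be sketched with GStreamer's Python bindings (a desktop sketch, not iOS code; PyGObject and GStreamer are assumed to be installed). One likely reason for the nil sink is that autovideosink only instantiates its real child sink once the pipeline leaves the NULL state:

```python
def build_pipeline():
    import gi  # PyGObject; requires GStreamer and its Python bindings
    gi.require_version("Gst", "1.0")
    gi.require_version("GstVideo", "1.0")
    from gi.repository import Gst, GstVideo

    Gst.init(None)
    # Same pipeline string as the tutorial; autovideosink picks a platform sink.
    pipeline = Gst.parse_launch(
        "videotestsrc ! warptv ! videoconvert ! autovideosink")

    # autovideosink creates its actual sink element during the NULL -> READY
    # transition, so bring the pipeline up to READY before the lookup.
    pipeline.set_state(Gst.State.READY)

    # Equivalent of gst_bin_get_by_interface(..., GST_TYPE_VIDEO_OVERLAY).
    video_sink = pipeline.get_by_interface(GstVideo.VideoOverlay)
    return pipeline, video_sink
```

On a real platform you would then hand your window or view handle to the sink via the overlay's set_window_handle() before playing.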

Play RTSP stream on iOS with LIVE555

How can I modify the "openRTSP" example from LIVE555 to display an RTSP stream in an iOS app?
You can look at dropcam: https://github.com
Personally I find it harder to use than some other frameworks, including the ones we use, but the dropcam example should give you what you need.
You can use MobileVLCKit, which uses live555 internally.
You can find more information here: https://stackoverflow.com/a/48340854/2003734

iOS Video Player FFmpeg

So I want to make an app for the iPhone that will play live mms:// video streams.
I have looked around, and everything says that I'll need FFmpeg to accomplish it. So I successfully compiled the FFmpeg libraries, but now:
Do I have to convert the mms:// link to a .m3u8 link? Or can I just use Apple's AV Foundation framework?
Thanks!
You need libmms as well as the FFmpeg libs. However, I think the latest versions of FFmpeg have the MMS code built in, so you may not need libmms. MMS is just a streaming protocol, so the actual format is likely some MPEG variant: MP4, H.264.
Once you have that, you extract the frames and use FFmpeg's avcodec_decode_video2 to decode each one into an AVFrame, just like any other video.