Play RTSP stream on iOS with LIVE555

How do I modify the "openRTSP" example from Live555 to display an RTSP stream in an iOS app?

You can look at dropcam: https://github.com
Personally, I find it harder to use than some other frameworks, including the ones we use, but the dropcam example should give you what you need.

You can use MobileVLCKit, which uses live555 internally.
You can find more information here: https://stackoverflow.com/a/48340854/2003734
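If you'd rather stay with Live555 itself, the bundled testRTSPClient example is the usual starting point. Below is a minimal client skeleton in that spirit; the URL and application name are placeholders, and a real iOS app would run the event loop on a background thread and forward received frames to a decoder such as VideoToolbox:

    #include "liveMedia.hh"
    #include "BasicUsageEnvironment.hh"

    // Handler for the DESCRIBE response. A full client would create a
    // MediaSession from the returned SDP, then SETUP/PLAY each subsession
    // and attach a custom MediaSink that hands frames to the decoder.
    void continueAfterDESCRIBE(RTSPClient* client, int resultCode, char* resultString) {
        if (resultCode != 0) {
            client->envir() << "DESCRIBE failed: " << resultString << "\n";
        } else {
            client->envir() << "Got SDP description:\n" << resultString << "\n";
        }
        delete[] resultString;
    }

    int main() {
        TaskScheduler* scheduler = BasicTaskScheduler::createNew();
        UsageEnvironment* env = BasicUsageEnvironment::createNew(*scheduler);

        // Placeholder URL; verbosity level 1 logs the RTSP exchange.
        RTSPClient* client = RTSPClient::createNew(*env, "rtsp://camera.example/stream",
                                                   1, "MyIOSApp");
        client->sendDescribeCommand(continueAfterDESCRIBE);

        env->taskScheduler().doEventLoop(); // runs forever
        return 0;
    }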

Related

OpenCV Audio support (MSMF and GStreamer backends)

As mentioned in the release highlights, OpenCV (4.5.5) now has audio support in the videoio module. However, there is no documentation on this topic.
I've tried a few things on my own, like:
cv::VideoCapture cap(fileName, cv::CAP_MSMF);
However, no results so far.
How can I activate audio support? Am I missing something?
(It works for neither cameras nor video files.)
Additionally, I normally build from source, but I also tried the pre-built Windows binaries and they didn't work either.
As far as I can see, this question doesn't make sense until they implement an interface; that's why there is no documentation on the topic. Hopefully they'll ship that feature with 4.5.6.
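For reference, the videoio sample code that later appeared drives the audio interface through capture parameters. A hedged sketch, assuming the CAP_PROP_AUDIO_* properties listed in the 4.5.5 release notes behave as advertised:

    #include <opencv2/videoio.hpp>
    #include <iostream>
    #include <vector>

    int main() {
        // Assumption: audio is selected via capture parameters
        // (audio stream 0, video stream -1 to disable video).
        std::vector<int> params = {
            cv::CAP_PROP_AUDIO_STREAM, 0,
            cv::CAP_PROP_VIDEO_STREAM, -1
        };
        cv::VideoCapture cap("test.mp4", cv::CAP_MSMF, params);
        if (!cap.isOpened()) {
            std::cerr << "Could not open file with audio support\n";
            return 1;
        }
        // Audio channels are retrieved as extra planes starting at this index.
        int base     = (int)cap.get(cv::CAP_PROP_AUDIO_BASE_INDEX);
        int channels = (int)cap.get(cv::CAP_PROP_AUDIO_TOTAL_CHANNELS);
        cv::Mat samples;
        while (cap.grab()) {
            for (int ch = 0; ch < channels; ch++) {
                cap.retrieve(samples, base + ch);
                std::cout << "channel " << ch << ": " << samples.cols << " samples\n";
            }
        }
        return 0;
    }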

How to add .h and .so files to iOS project

I have to make an app in Objective-C (iOS) to handle a video stream from an IP camera.
After lots of research I have no idea where to start :(
The RTSP protocol is difficult to implement by hand, so I looked for a library and found these.
The site where I downloaded it says the library is for macOS. I'd like to know if it is possible to add a .so library to my Xcode/iOS project, and if so, how to do it?
Or do you have other solutions for RTSP streaming on iOS?
Sorry for my bad English.
Thanks in advance.
No, this is not possible: the .so file is a binary compiled for the x86 (x86_64) architecture, while iOS devices run on ARM. (You can check a binary's architecture with the file command.)

How Does VideoToolbox.framework work?

iOS 8 just released its beta version, and I'm very interested in direct video encoding/decoding.
Video Toolbox Framework
The Video Toolbox framework (VideoToolbox.framework) includes direct access to hardware video encoding and decoding.
But I can't find any tutorial documents for this right now. As far as I know it was a private framework before, and some people were already using it in jailbreak apps.
Can anyone share some very simple tutorial code for this?
Try this link: https://github.com/davidliu/VideoTimeLine (the video I loaded seems to be broken, but you can get a feel for how to use it).
In Xcode, write import VideoToolbox, type CMVideoFormatDescriptionCreateFromH264ParameterSets, and Cmd+click it to display the documentation :)
Take a look at this video from WWDC: https://developer.apple.com/videos/wwdc/2014/#513
Check out this code: https://github.com/manishganvir/iOS-h264Hw-Toolbox
Check out this code: https://github.com/McZonk/VideoToolboxPlus
You might also want to see a detailed description of how to decompress H.264 using VideoToolbox: How to use VideoToolbox to decompress H.264 video stream
Hope that helps :)
Import the framework and look at the headers; they're all documented.
Apple also released a sample using VTDecompressionSession.
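To give a feel for the API, here is a hedged sketch (VideoToolbox is a plain C API, so this compiles as C++/Objective-C++) of the two calls mentioned above: building a format description from the H.264 SPS/PPS parameter sets, then creating a decompression session. Feeding it CMSampleBuffers via VTDecompressionSessionDecodeFrame is left out for brevity:

    #include <VideoToolbox/VideoToolbox.h>

    // Called once per decoded frame; hand the CVImageBuffer to your renderer,
    // e.g. wrap it in a CIImage or push it into an AVSampleBufferDisplayLayer.
    static void didDecodeFrame(void* refCon, void* frameRefCon, OSStatus status,
                               VTDecodeInfoFlags flags, CVImageBufferRef imageBuffer,
                               CMTime pts, CMTime duration) {
        if (status == noErr && imageBuffer != NULL) {
            // render imageBuffer here
        }
    }

    VTDecompressionSessionRef createSession(const uint8_t* sps, size_t spsSize,
                                            const uint8_t* pps, size_t ppsSize) {
        const uint8_t* paramSets[2] = { sps, pps };
        size_t paramSizes[2] = { spsSize, ppsSize };
        CMVideoFormatDescriptionRef format = NULL;
        CMVideoFormatDescriptionCreateFromH264ParameterSets(
            kCFAllocatorDefault, 2, paramSets, paramSizes,
            4 /* NAL unit length field size */, &format);
        if (format == NULL) return NULL;

        VTDecompressionOutputCallbackRecord callback = { didDecodeFrame, NULL };
        VTDecompressionSessionRef session = NULL;
        VTDecompressionSessionCreate(kCFAllocatorDefault, format,
                                     NULL /* decoder specification */,
                                     NULL /* pixel buffer attributes */,
                                     &callback, &session);
        CFRelease(format); // the session retains it
        return session;
    }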

Samples/file extensions supported by iOS sampler

I'm writing an iOS app which can play MIDI and output its content using the AUSampler and AUGraph APIs. I know for sure it supports SoundFont (.sf2) files, but that format seems quite antiquated.
Question: Are there any other files or sample types which this framework supports?
Thanks.
The AUSampler also supports the DLS format (.dls) and the AUPreset format (.aupreset).
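For completeness, loading any of those formats comes down to a single property set on the sampler unit. A minimal sketch (plain C API from AudioToolbox, usable as-is from C++/Objective-C++); the URL and preset ID are placeholders:

    #include <AudioToolbox/AudioToolbox.h>

    // Loads a preset from a .sf2 file into an already-initialized AUSampler.
    // For .dls use kInstrumentType_DLSPreset, for .aupreset use
    // kInstrumentType_AUPreset instead.
    OSStatus loadSF2Preset(AudioUnit samplerUnit, CFURLRef sf2URL, UInt8 presetID) {
        AUSamplerInstrumentData data;
        data.fileURL        = sf2URL;
        data.instrumentType = kInstrumentType_SF2Preset;
        data.bankMSB        = kAUSampler_DefaultMelodicBankMSB;
        data.bankLSB        = kAUSampler_DefaultBankLSB;
        data.presetID       = presetID;
        return AudioUnitSetProperty(samplerUnit,
                                    kAUSamplerProperty_LoadInstrument,
                                    kAudioUnitScope_Global, 0,
                                    &data, sizeof(data));
    }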

iOS Video Player FFmpeg

So I want to make an app for the iPhone that will play live mms:// video streams.
I have looked around, and everyone says I'll need FFmpeg to accomplish it, so I successfully compiled the FFmpeg libraries. But now:
Do I have to convert the mms:// link to a .m3u8 link, or can I just use Apple's AVFoundation framework?
Thanks!
You need libmms as well as the FFmpeg libs. However, I think the latest versions of FFmpeg have the MMS code built in, so you may not need libmms. MMS is just a streaming protocol, so the actual format is likely some MPEG variant: MP4, H.264.
Once you have that, you extract the frames and use FFmpeg's avcodec_decode_video2 to decode each one into an AVFrame, just like any other video.
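To make that concrete, here is a hedged sketch of the classic demux/decode loop using the era-appropriate API (avcodec_decode_video2 and AVStream::codec are deprecated in modern FFmpeg in favor of avcodec_send_packet/avcodec_receive_frame and codecpar):

    extern "C" {
    #include <libavformat/avformat.h>
    #include <libavcodec/avcodec.h>
    }

    void decodeAllFrames(const char* url) { // e.g. "mms://..." or a file path
        av_register_all();
        avformat_network_init();

        AVFormatContext* fmt = NULL;
        if (avformat_open_input(&fmt, url, NULL, NULL) < 0) return;
        if (avformat_find_stream_info(fmt, NULL) < 0) return;

        int videoIdx = av_find_best_stream(fmt, AVMEDIA_TYPE_VIDEO, -1, -1, NULL, 0);
        if (videoIdx < 0) return;

        AVCodecContext* ctx = fmt->streams[videoIdx]->codec; // pre-3.x API
        AVCodec* codec = avcodec_find_decoder(ctx->codec_id);
        if (!codec || avcodec_open2(ctx, codec, NULL) < 0) return;

        AVFrame* frame = av_frame_alloc();
        AVPacket pkt;
        while (av_read_frame(fmt, &pkt) >= 0) {
            if (pkt.stream_index == videoIdx) {
                int gotFrame = 0;
                avcodec_decode_video2(ctx, frame, &gotFrame, &pkt);
                if (gotFrame) {
                    // frame->data / frame->linesize hold the decoded picture;
                    // convert with sws_scale and hand it to your view layer.
                }
            }
            av_free_packet(&pkt);
        }
        av_frame_free(&frame);
        avcodec_close(ctx);
        avformat_close_input(&fmt);
    }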
