My partner and I are developing an app for a client that has a camera that connects to the app. Currently the connection runs over HTTP, but we want to use RTSP. We scoured the internet today looking for possible approaches, but they all seem outdated.
We tried incorporating DFURTSPPlayer but kept getting compile errors related to the SDK itself.
We want to use something like VideoStreamSDK.
Can anyone point us toward a current approach?
Thanks!
SOLVED
We found an RTSP player that we were able to incorporate easily into our project.
[edit: This requires purchasing their license :( . It is currently not working and throws the error: Error Domain=com.imoreapps.avplayer.errordomain Code=-149]
Have you tried MobileVLCKit? It's really easy and works well!
I wrote a small example here: https://github.com/rvi/ONVIFCamera
If you want to try it, just type pod try ONVIFCamera in your terminal.
Here is how to do it:
import MobileVLCKit

let mediaPlayer = VLCMediaPlayer()

// Associate the movieView (a plain UIView in your layout) with the VLC media player
mediaPlayer.drawable = self.movieView

// URL(string:) is failable, so unwrap before handing the URL to VLCMedia
guard let url = URL(string: "rtsp://IP_ADDRESS:PORT/params") else { return }
mediaPlayer.media = VLCMedia(url: url)
mediaPlayer.play()
I successfully compiled and used DFURTSPPlayer but it crashed on some iOS devices.
A few days ago I integrated this solution for playing RTSP video streams from IP cameras: https://github.com/Bilibili/ijkplayer
The only issue I have is an unstable connection; the player establishes a connection only after a few attempts.
Related
I have a video hosted in Azure Media Services. I have encoded the video using the H264 Multiple Bitrate 1080p Encoding preset (I've tried others as well). After publishing for streaming I get the following endpoints. (I replaced my actual site name with mysite below)
Smooth Streaming
http://mysite.streaming.mediaservices.windows.net/eaaa9f34-e39a-4393-a93b-14a7609ebd27/sampleVid.ism/manifest
MPEG-DASH
http://mysite.streaming.mediaservices.windows.net/eaaa9f34-e39a-4393-a93b-14a7609ebd27/sampleVid.ism/manifest(format=mpd-time-csf)
HLS(v3)
http://mysite.streaming.mediaservices.windows.net/eaaa9f34-e39a-4393-a93b-14a7609ebd27/sampleVid.ism/manifest(format=m3u8-aapl-v3)
HLS(v4)
http://mysite.streaming.mediaservices.windows.net/eaaa9f34-e39a-4393-a93b-14a7609ebd27/sampleVid.ism/manifest(format=m3u8-aapl)
I have successfully streamed the video on Android using the HLS(v4) URL, so I know the video works.
For iOS I followed this tutorial: https://developer.xamarin.com/recipes/ios/media/video_and_photos/play_a_video_using_avplayer/
I could successfully play a local video. I was also able to play a remote video by following this Apple tutorial: https://developer.apple.com/library/prerelease/content/documentation/AudioVideo/Conceptual/AVFoundationPG/Articles/02_Playback.html
There must be something wrong with the URL I'm using, but I can't figure out what it is. Here is the code in my ViewDidLoad method.
// Azure Media Services HLS(v4) endpoint (actual site name replaced with "myurl")
var myUrl = NSUrl.FromString("http://myurl.streaming.mediaservices.windows.net/eaaa9f34-e39a-4393-a93b-14a7609ebd27/sampleVid.ism/manifest(format=m3u8-aapl)");
// Known-good Apple test stream, used to verify the player setup itself
var appleUrl = NSUrl.FromString("http://devimages.apple.com/iphone/samples/bipbop/bipbopall.m3u8");

_playerItem = new AVPlayerItem(myUrl);
_player = new AVPlayer(_playerItem);
_playerLayer = AVPlayerLayer.FromPlayer(_player);
// Use Bounds rather than Frame: the sublayer's coordinates are relative to the container's layer
_playerLayer.Frame = ProfileVideoContainerView.Bounds;
ProfileVideoContainerView.Layer.AddSublayer(_playerLayer);
_player.Play();
myUrl does not work, but appleUrl does.
It turns out I just needed to use https instead of http. I thought I had tried this, but apparently not. Also, I'm not sure why the appleUrl works with http.
This is App Transport Security (ATS), a feature introduced in iOS 9 that disallows a connection to an http endpoint unless the endpoint is listed in the Info.plist as an allowed insecure connection. I had forgotten about this and got caught by it a few days/weeks ago.
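For reference, a minimal sketch of the Info.plist exception; the global switch shown here disables ATS for all connections, and per-domain exceptions via NSExceptionDomains are the better choice in production:

<key>NSAppTransportSecurity</key>
<dict>
    <!-- Broadest escape hatch: allows plain-http loads everywhere -->
    <key>NSAllowsArbitraryLoads</key>
    <true/>
</dict>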
I have successfully obtained the live video streaming URL from facebook.com using the live_videos API. Now I am receiving a URL in RTMP form. I don't know how to play it, because AVPlayer cannot handle RTMP, and I was somehow unable to build the VideoCore library that was mentioned somewhere. Does anyone have an idea?
I tried https://github.com/jgh-/VideoCore but was unable to compile it because of a CocoaPods issue.
You can try kxmovie!
I hope it will be useful for you!
Compared to KxMovie, ijkplayer may be more suitable for you, and it is still being developed actively. But KxMovie's code is simpler to read. Both are based on FFmpeg. Hope this helps.
How can I access another app's currently playing audio (the actual audio item, though metadata is welcome too)? I can see that this question has been asked a lot, but with few solutions offered over the years. I understand Apple's philosophy of probably not wanting an app to be able to do this. I also understand that such a request is probably outside of the iOS API. With that being said, I would really like some kind of solution.
Logically, I feel that
MPNowPlayingInfoCenter.defaultCenter().nowPlayingInfo
should return the info for whatever is currently playing; however, as others have mentioned, this value is always nil for audio being played outside of your app. Notably, popular audio apps seem not to use the MPNowPlayingInfoCenter class at all, so their audio never appears there.
If using the default music app, one can use
MPMusicPlayerController.systemMusicPlayer().nowPlayingItem
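For example, reading that metadata looks like this (a minimal sketch in Swift 4+ property syntax; on recent iOS it also assumes an NSAppleMusicUsageDescription entry in Info.plist, and it only ever covers the built-in Music app):

import MediaPlayer

// Read metadata for the track playing in the built-in Music app only
let player = MPMusicPlayerController.systemMusicPlayer
if let item = player.nowPlayingItem {
    print("Now playing: \(item.title ?? "?") by \(item.artist ?? "?")")
}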
However, what is a more consistent way to access audio playing through the podcasts app, Spotify, Pandora, Safari, etc?
Has anyone found a solution to this? Are there any old Objective-C frameworks that support this functionality?
One approach might be viable if there is some way I can access the audio path of the item currently being played. For example, if I could get the path of the currently playing item, I could create an AV object from it:
AVAudioPlayer(contentsOfURL: audioUrl)
So is there a way I can get the audio url of the currently playing item and use it that way?
Is another approach better?
If a native solution does not exist, is it possible to bodge something together for it to work? Any advice or ideas welcome.
Edit: I don't believe anybody has been able to achieve this; however, I think a lot of people would like to. If this has been addressed, please link it! :)
This isn't currently possible in iOS. Just changing your AVAudioSession category options to .MixWithOthers, which might look like a way to get song info from other apps, causes your own nowPlayingInfo to be ignored.
iOS only considers non-mixing apps for inclusion in MPNowPlayingInfoCenter, because there is uncertainty as to which app would show up in (e.g.) Control Center if there are multiple mixing apps playing at the same time.
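To illustrate, a minimal sketch of opting into mixing (standard AVFoundation calls; the point is only the side effect described above):

import AVFoundation

// Opting into mixing: the app plays alongside other audio, but iOS
// will then ignore this app's nowPlayingInfo in Control Center.
let session = AVAudioSession.sharedInstance()
try? session.setCategory(.playback, mode: .default, options: [.mixWithOthers])
try? session.setActive(true)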
Proposal: an option would be to use a music fingerprinting algorithm to recognize what is being played by recording it from your app (see the sketch after the list below).
Some interesting projects in this direction:
Gracenote https://developer.gracenote.com/ Gracenote (which is owned by Sony) has opened up its SDKs and APIs and has a proper dev portal.
EchoNest & Spotify API http://developer.echonest.com/ Merged into the Spotify API in March 2016
ACRCloud https://www.acrcloud.com/ offers ACR solutions for custom files such as TV commercials, music, etc.
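As a starting point for the recording side, a minimal, hypothetical sketch (the file name and settings are placeholders, recording requires microphone permission, and the microphone only picks up ambient sound rather than the other app's audio directly):

import AVFoundation

// Record a short clip to submit to a fingerprinting service
let session = AVAudioSession.sharedInstance()
try session.setCategory(.record, mode: .default)
try session.setActive(true)

let fileURL = FileManager.default.temporaryDirectory.appendingPathComponent("snippet.m4a")
let settings: [String: Any] = [
    AVFormatIDKey: kAudioFormatMPEG4AAC,   // AAC in an .m4a container
    AVSampleRateKey: 44_100,
    AVNumberOfChannelsKey: 1
]
let recorder = try AVAudioRecorder(url: fileURL, settings: settings)
recorder.record(forDuration: 10) // a few seconds is typically enough to fingerprint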
I'm trying to find out how I can use Wowza Media Server (which I just installed) to receive a live audio stream from an iOS device (Xcode simulator) and play it in a web browser (Safari) using HTTP Live Streaming. I just need direction, guidance, or a tutorial to start with, or just the basic concept of how it works.
Sorry for the newbie question, but I really did try digging through the documentation; there is nothing about iOS HTTP Live Streaming specifically, as it concentrates more on Flash streaming (FLV).
Thanks.
Use Wowza's MPEG-TS capabilities
There are a couple of Google hits and topics on the Wowza forums. This one looks fine for getting you started: Using an MPEG Transport Stream (MPEG-TS) encoder with Wowza Pro (MPEG-TS).
You can then just click a simple link in Safari on iOS devices and it should work fine. The link will look something like:
http://myserver.com/appName/myStream/playlist.m3u8
Wowza recently released an add-on called "GoCoder". With it you can encode your live stream on supported iOS devices and broadcast it to any screen.
You can download the encoder here:
https://itunes.apple.com/us/app/wowza-gocoder/id640338185?ls=1&mt=8
Here are the steps to configure the GoCoder add-on with a live application:
http://www.wowza.com/forums/content.php?500
I want to be able to record footage using my iOS device and stream it directly to a server.
There are quite a few articles on S.O. that talk about this, but I'm not sure any have answered the question very well.
Should I be using HTTP Live Streaming, or is this just for sending data to an iPhone?
Should I be using AVCaptureSession to grab the video (a segment at a time?), sending each segment to the server?
Should I be using AVCaptureVideoDataOutput and ffmpeg for streaming?
I'm a little lost with all this, so any sample code or docs or links would be really appreciated.
Thanks for your help guys.
Duncan
You have to choose a network protocol for that purpose and find an appropriate media server to receive and process the stream. If the RTMP format is OK for your project, check the angl library, which supports RTMP streaming from iOS. It is currently compatible with iOS 6 and 7.
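Whatever protocol you pick, the capture side usually looks the same. A minimal, hypothetical sketch using AVCaptureSession and AVCaptureVideoDataOutput (the class and queue names are placeholders; encoding the frames and sending them over the chosen protocol is left to the streaming library):

import AVFoundation

// Grab raw camera frames; each frame is delivered to the delegate below.
final class CaptureController: NSObject, AVCaptureVideoDataOutputSampleBufferDelegate {
    let session = AVCaptureSession()
    private let frameQueue = DispatchQueue(label: "camera.frames")

    func start() throws {
        guard let camera = AVCaptureDevice.default(for: .video) else { return }
        let input = try AVCaptureDeviceInput(device: camera)
        if session.canAddInput(input) { session.addInput(input) }

        let output = AVCaptureVideoDataOutput()
        output.setSampleBufferDelegate(self, queue: frameQueue)
        if session.canAddOutput(output) { session.addOutput(output) }

        session.startRunning()
    }

    // Called once per captured video frame.
    func captureOutput(_ output: AVCaptureOutput,
                       didOutput sampleBuffer: CMSampleBuffer,
                       from connection: AVCaptureConnection) {
        // Hand sampleBuffer to your encoder / uploader here.
    }
}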