Live stream implementation using RTSP protocol - iOS

I'm trying to gain access to a live stream through the RTSP protocol on iOS. I'm trying to run the example from this website: http://www.gdcl.co.uk/2013/02/20/iOS-Video-Encoding.html and it's advertised that you can just take the URL (rtsp://) and paste it into QuickTime Player, VLC, or some other player, but whenever I try, it fails. When I try in QuickTime Player, it gives me this error: The document “Macintosh HD” could not be opened. The file may be damaged or may not be a movie file that is compatible with QuickTime Player.
What am I doing wrong? Is the example broken, or do I need to update something in the code? I'm running iOS 9.3, and the example is advertised to work on iOS 7.0 and later.

I was able to play this back in VLC when compiling and running on my iOS device. You need to make sure you are on Wi-Fi (as opposed to LTE or 3G). I'm on iOS 9.2.1 and played it back with VLC version 2.2.2.
You can then take it a step further: I was able to ingest it into Wowza via a stream file with the following configuration:
{
    uri: "rtsp://[rtsp-address-as-published-on-the-app]",
    streamTimeout: 12000,
    reconnectWaitTime: 12000,
    rtpTransportMode: "udp",
    rtspValidationFrequency: 15000,
    rtspFilterUnknownTracks: true,
    rtspStreamAudioTrack: false,
    rtspStreamVideoTrack: true,
    rtpDebugSession: true,
    rtspSessionTimeout: 12000,
    rtspConnectionTimeout: 12000
}
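For context, in Wowza Streaming Engine a configuration like this typically lives in a .stream file in the server's content directory and is connected as a stream file source, though your setup may differ.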
I would suggest reviewing what the console logs say in your iOS application (Xcode), and also taking a look at VLC's error messages/logs, to see what the exact issue is when you try to play it back.

Related

MP4 file created with ffmpeg plays in the Simulator/a PC browser but not in an iOS app/Safari on a real device

The backend team uses ffmpeg to create a video from images.
The strange thing is that the video plays in a Mac browser and in the iPhone Simulator, but not in the browser or the iOS app on a real phone.
I tried using AVPlayer to print the error, but the error is nil.
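Roughly, the check I ran looks like this (a minimal sketch, not my exact code):
import AVFoundation
let url = URL(string: "https://firebasestorage.googleapis.com/v0/b/chatchatapp123.appspot.com/o/image_rendered.mp4?alt=media")!
let item = AVPlayerItem(url: url)
let player = AVPlayer(playerItem: item)
// .failed is the status case that carries item.error.
let statusObservation = item.observe(\.status, options: [.new]) { item, _ in
    switch item.status {
    case .failed:
        print("item error:", item.error?.localizedDescription ?? "nil") // prints nil for me
    case .readyToPlay:
        player.play()
    default:
        break
    }
}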
Here is the file: https://firebasestorage.googleapis.com/v0/b/chatchatapp123.appspot.com/o/image_rendered.mp4?alt=media
Here is its metadata: https://www.metadata2go.com/result/46e72635-7fac-46ee-acfe-cb6ffda49692
Has anyone encountered this before and if so, any ideas as to why?
Thanks.
I've noticed that the field order in the metadata is tt, which means:
Interlaced video, top field coded and displayed first
But according to Apple's official "HLS Authoring Specification for Apple Devices" document, "Important: Interlaced video is not supported on iOS devices."
https://developer.apple.com/documentation/http_live_streaming/hls_authoring_specification_for_apple_devices
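If interlacing is the cause, one possible fix is to have the backend deinterlace while encoding, for example with ffmpeg's yadif filter (an illustrative command, not verified against your exact pipeline; input.mp4/output.mp4 are placeholders):
ffmpeg -i input.mp4 -vf yadif -c:v libx264 -pix_fmt yuv420p -movflags +faststart output.mp4
yadif produces progressive frames, and yuv420p keeps the pixel format broadly compatible with iOS hardware decoders.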

HLS/fMP4 (CMAF) redundant/fallback (Primary/Backup) workflow on Apple Devices is not working

I'm trying to publish my HLS/fMP4 (CMAF) stream to Akamai with a Primary/Backup workflow.
When I tested with Shaka Player, it worked fine: whenever publishing to the Primary entry point stopped, the player properly switched to the Backup stream and kept playing.
However, somehow it does not work in Safari on iOS 11 or on macOS High Sierra.
I'm wondering if this is a limitation of Apple devices, or whether there's a compatibility issue in my master playlist.
Here's the sample master playlist file.
#EXTM3U
#EXT-X-VERSION:7
#EXT-X-INDEPENDENT-SEGMENTS
#EXT-X-MEDIA:TYPE=AUDIO,GROUP-ID="default-p",NAME="audio-eng-p",LANGUAGE="eng",DEFAULT=YES,URI="https://foo.akamaized.net/cmaf/live/123456/FailOverTest/index_bitrate128K.m3u8"
#EXT-X-MEDIA:TYPE=AUDIO,GROUP-ID="default-b",NAME="audio-eng-b",LANGUAGE="eng",DEFAULT=NO,URI="https://foo.akamaized.net/cmaf/live/123456-b/FailOverTest/index_bitrate128K.m3u8"
#EXT-X-STREAM-INF:BANDWIDTH=928000,RESOLUTION=640x360,CODECS="avc1.4d401f,mp4a.40.2",AUDIO="default-p"
https://foo.akamaized.net/cmaf/live/123456/FailOverTest/index_bitrate800K.m3u8
#EXT-X-STREAM-INF:BANDWIDTH=928000,RESOLUTION=640x360,CODECS="avc1.4d401f,mp4a.40.2",AUDIO="default-b"
https://foo.akamaized.net/cmaf/live/123456-b/FailOverTest/index_bitrate800K.m3u8
NOTE: for readability, I included only one video/audio pair.
Please let me know if you notice something.

Google Cast plugin for Unity can't stream Video clip

I'm trying to play a video inside Unity and stream it to a Google Cast device.
Google provides a plugin that enables the connection to a Cast device, and it works fine once a correct Cast App ID is given.
Unity recently added a VideoPlayer component that enables video playback on mobile devices. I tried to use the two together to stream video content to the Cast device, but when I play the video, the app stops responding with SIGABRT at
reinterpret_cast<PInvokeFunc>(_native_GCKUnityRenderRemoteDisplay)();
I also tried playing the video using the AVPro plugin, but the same issue appeared.
The plugin works just fine without a video, and its last update was in April 2016, so I suspect it has a compatibility issue with Unity's newer VideoPlayer component.
Is there something I can do about it?
There are currently no plans to update the Google Cast Unity plugin.

get shared/server url using CocoaHTTPServer

I am creating an application that plays video on a TV (an Apple TV, or a monitor with an HDMI port) via a Chromecast device. The application plays URLs like "https://serveraddress/video.mp4", but it does not play video stored in the app bundle (local video).
I have found that to make a video play on a remote machine (here, the TV), you have to run a local server on the iPhone.
I found the CocoaHTTPServer SDK and ran its sample application, but I can't figure out how to get a URL for a video that is stored in the app bundle.
Can someone please help me with this?
Thanks in advance.
This works great for me:
https://github.com/swisspol/GCDWebServer
Initialize it as shown in the sample code (give the document directory as the root).
Then, to access a local file, copy it into the Documents folder and access it via http://localhost:8080/download?path=new.jpeg
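For reference, a minimal Swift sketch of that setup (assuming GCDWebServer has been added to the project; "video.mp4" is a placeholder file name). Note that the /download?path= URL above comes from the library's GCDWebUploader sample; a plain directory handler like the one below serves files at their own paths:
import Foundation
import GCDWebServer
// Copy the bundled video into Documents so the server can see it.
let docsPath = NSSearchPathForDirectoriesInDomains(.documentDirectory, .userDomainMask, true)[0]
if let bundled = Bundle.main.url(forResource: "video", withExtension: "mp4") {
    try? FileManager.default.copyItem(at: bundled, to: URL(fileURLWithPath: docsPath).appendingPathComponent("video.mp4"))
}
// Serve Documents as the root; allowRangeRequests matters for video,
// since players seek using HTTP Range requests.
let server = GCDWebServer()
server.addGETHandler(forBasePath: "/", directoryPath: docsPath, indexFilename: nil, cacheAge: 3600, allowRangeRequests: true)
_ = server.start(withPort: 8080, bonjourName: nil)
// The file is now reachable at http://localhost:8080/video.mp4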

iOS "Cannot Decode" m3u8

I have a live stream that in the past was playable on iOS devices (using the URL of the m3u8 file). Now, when I try to view the live stream on an iOS device, I get a message that says "Cannot Decode". I am still able to play this stream on Android devices, though. Does anyone have any idea why iOS devices would not be able to play it?
The live stream is encoded by Adobe Flash Media Live Encoder 3.2, and we are using Adobe Media Server 5. I followed the steps here to get everything set up initially (when it was working). Once it stopped working on iOS, I verified that none of the settings had changed.
iOS does not support Flash, so it cannot decode an FLV stream; it only supports HLS (HTTP Live Streaming).
You can set up the Live Encoder as follows:
Preset: H.264
Video Format (H.264): click the spanner on the right and set Profile (Main), Level (3.1), Keyframe Frequency (4 seconds)
FMS URL: rtmp://yourserver/livepkgr and Stream: livestream?adbe-live-event=liveevent
Then open Safari on the iDevice and go to http://yourserver/hls-live/livepkgr/_definst_/liveevent/livestream.m3u8, and it will play.
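If you also want to play the stream inside a native app rather than Safari, here is a minimal AVPlayerViewController sketch (same placeholder server name as above):
import AVKit
import AVFoundation
// Same placeholder URL as above; replace "yourserver" with your host.
let url = URL(string: "http://yourserver/hls-live/livepkgr/_definst_/liveevent/livestream.m3u8")!
let controller = AVPlayerViewController()
controller.player = AVPlayer(url: url)
// From a presenting view controller:
// present(controller, animated: true) { controller.player?.play() }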
