A/V sync issue - Apple's HTTP Live Streaming - iPad

When I stream TS chunks generated by 3rd-party multiplexers (MainConcept/Elecard) in the Safari browser on iPad 2.0/1.0, I always see an audio/video synchronization issue develop over time.
But the same clips play fine in a standard media player on a Windows PC or a MacBook.
I also observe that there is no issue on the iPad when I stream TS chunks generated by the Media File Segmenter tool on a MacBook.
What is the iPad expecting from 3rd-party multiplexers?
For example: when I stream a set of TS chunks on the iPad with an overall duration of 5 min 35 sec (including all TS chunks), I observe the audio going out of sync after 2 min 40 sec.
The following is the media pipeline used to generate the TS chunks:
Video.mp4 (source) -> MainConcept MPEG4 Demultiplexer -> MainConcept MPEG Multiplexer -> MainConcept Sink Filter (generates TS chunks based on time)
Can someone share some pointers on the iPad's HLS behaviour? Does the iPad expect some additional parameters for synchronization?
Thanks.

In the MainConcept Multiplexer settings, enable "optimized packing". This resolves the A/V sync issue.

Related

Video.JS Recommended Encoder Setting for Cell Streaming

I'm on the final steps of a video.js-heavy website and am running into one problem that I really don't want to leave to experimentation. I was wondering if there are any recommendations for streaming over 4G/3G.
The problem occurs specifically when streaming HLS using the hls-contrib tech over 4G/3G. Both Android (Lollipop) and iOS 9 phones immediately pick up audio but never get video, or only get it several minutes later (a normal user would stop watching by that point). When I plug in an Android or iOS device for console debugging, there are no console errors (the symptoms persist), and turning on WiFi gets me both audio and video, which leads me to believe I'm just dealing with an encoder-settings problem. Their signal chain is an encoder (settings below) to a Wowza server, and out to ScaleEngine as a CDN. They're of course using the CDN HLS and RTMP links for public playback.
Encoder Settings:
Video: H.264 @ 1,250 Kbps CBR, 720x480 @ 30i
Audio: AAC @ 96 Kbps Stereo
Profile: Main
Level: 3.1
B-Frames: 0
Don't see any other pertinent info.
Appreciate the help/opinions/general information. Thanks.

AVFoundation, limiting video streaming playback to WiFi only

I'm developing a player using AVFoundation. I have no control over the source videos, but they certainly violate:
https://developer.apple.com/streaming/
REQUIREMENTS FOR APPS
...If your app delivers video over cellular networks, and the video exceeds either 10 minutes duration or 5 MB of data in a five minute period, you are required to use HTTP Live Streaming. (Progressive download may be used for smaller clips.)...
I'm talking about MP4 videos that are 100 MB+ for 3-to-5-minute clips.
Again, I have zero control over the source material, but I have to play them.
Looking at AVFoundation, none of the classes I'm using (AVPlayer, AVPlayerItem, AVQueuePlayer, and so on) has a property similar to NSURLSessionConfiguration's .allowsCellularAccess (at least that I can see).
So the agreement with the client was to limit streaming to WiFi only, but at the moment I see no way to enforce that.
Any pointers on how to get around it?
Any help is MUCH appreciated.
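One possible direction, sketched below under the assumption of an iOS 10+ deployment target: AVURLAsset accepts an AVURLAssetAllowsCellularAccessKey option when it is created, which is the closest AVFoundation equivalent to NSURLSessionConfiguration's .allowsCellularAccess. The URL below is a placeholder, not something from the question.

    import AVFoundation

    // Placeholder URL; substitute the real (progressive-download) source.
    let streamURL = URL(string: "https://example.com/big-clip.mp4")!

    // Assumption: iOS 10+, where AVURLAsset honours this option.
    // With allowsCellularAccess set to false the asset only loads its media
    // over WiFi; on a cellular-only connection loading fails instead of
    // silently pulling 100 MB+ over the air.
    let asset = AVURLAsset(
        url: streamURL,
        options: [AVURLAssetAllowsCellularAccessKey: false]
    )

    let item = AVPlayerItem(asset: asset)
    let player = AVPlayer(playerItem: item)
    player.play()

On earlier iOS versions there is no per-asset switch, so the usual fallback is to check the network path (for example with a reachability check) before handing the URL to the player at all.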

AVPlayer seekToTime downloads an insane amount of media segment files, consuming a lot of data

I'm working on an app where I can play an HLS m3u8 playlist of a streaming radio station (audio only) without any problem, using an instance of AVPlayer. Using Charles I can see the playlist being updated properly at a normal pace (every 9-10 seconds, which fetches one media segment file). When I perform a seekToTime: (back in time), the player succeeds in playing the stream from the point I want, but in Charles I observe the player start downloading a huge number of media segment files, consuming a lot of data. It seems that the player downloads all the media segment files up to that time and then returns to the normal behaviour.
I understand that the correct behaviour would be to download the media segment file for the time I'm seeking to, start playing it, and then keep downloading 1 or 2 media segment files every 9-10 seconds, as it does when I play the stream without a timeshift.
My question is whether this is normal behaviour, or whether something could be wrong with my m3u8 playlist or the client implementation. Could anyone help me clarify this?
UPDATED: I can confirm this doesn't happen in iOS 7, so it seems to be a bug introduced by iOS 8.
I've been told by Apple that this is not a bug, but a feature. They've made the buffer bigger since iOS 8.
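There doesn't seem to be a switch for the timeshift buffer itself, but here is a rough sketch of one mitigation, assuming an iOS 10+ target where AVPlayerItem exposes preferredForwardBufferDuration (the playlist URL below is a placeholder):

    import AVFoundation

    // Placeholder playlist URL; substitute the real radio stream.
    let playlistURL = URL(string: "https://example.com/radio/playlist.m3u8")!

    let item = AVPlayerItem(url: playlistURL)

    // Assumption: iOS 10+. Ask the player to keep at most ~30 seconds of
    // media buffered ahead of the playhead (0 means "let the system decide").
    item.preferredForwardBufferDuration = 30

    let player = AVPlayer(playerItem: item)

    // Seek back in time; with a smaller forward buffer the player has less
    // reason to pre-fetch segments far beyond the new playhead position.
    let target = CMTime(seconds: 600, preferredTimescale: 1)
    player.seek(to: target) { _ in
        player.play()
    }

This is only a hint; the system is still free to buffer more aggressively, so it may not fully remove the behaviour described above.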

iOS: Is it possible to record compressed video and uncompressed audio simultaneously?

I'm writing an iOS app which should record video, using the front camera, and audio of the user working with the app. Later I want to analyse the user's behavior offline. This app should run on an iPad 3.
Remark: the observed users will be people from my office. The code and data are only needed for the development process and won't be included in the final app.
My requirements: video and audio should be uncompressed; at the very least, audio must be uncompressed. I think uncompressed video recording without skipping frames is not possible on an iPad (see: "Where can I find an uncompressed video recording from iPhone 3G/3GS/4"), but uncompressed audio is possible.
Here are my questions:
Is it possible to record a video (compressed) and audio (uncompressed / kAudioFormatLinearPCM) simultaneously?
Is it possible to save video and audio in separate files?
If the answer to either question is YES, what should I change in the AVCam example (http://developer.apple.com/library/ios/#samplecode/AVCam/Introduction/Intro.html) to solve my problem? :-)
Thank you all in advance!
The AVCam sample code isn't flexible enough to do what you want. You need to use AVAssetWriter to write out the media. I'm not 100% sure about the uncompressed audio bit, but the VideoSnake sample code from WWDC 2012 session 520 is a great place to start with AVAssetWriter. I can't speak to performance, but you could have two AVAssetWriters, one for video and one for audio; just modify that code to vend the sample buffers to the appropriate writer.
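A rough sketch of that suggestion: one capture session feeding two AVAssetWriters, one writing H.264 video and one writing linear PCM audio to a separate file. The output settings, file names, and sizes below are illustrative assumptions, not values from the question.

    import AVFoundation

    // Illustrative URLs and settings; adjust to the real capture format.
    func makeWriters() throws -> (video: AVAssetWriter, audio: AVAssetWriter,
                                  videoInput: AVAssetWriterInput, audioInput: AVAssetWriterInput) {
        let videoFileURL = URL(fileURLWithPath: NSTemporaryDirectory() + "capture.mov")
        let audioFileURL = URL(fileURLWithPath: NSTemporaryDirectory() + "capture.caf")

        // Writer #1: compressed (H.264) video in a QuickTime movie.
        let videoWriter = try AVAssetWriter(outputURL: videoFileURL, fileType: .mov)
        let videoInput = AVAssetWriterInput(mediaType: .video, outputSettings: [
            AVVideoCodecKey: AVVideoCodecType.h264,
            AVVideoWidthKey: 640,
            AVVideoHeightKey: 480
        ])
        videoInput.expectsMediaDataInRealTime = true
        videoWriter.add(videoInput)

        // Writer #2: uncompressed (linear PCM) audio in a CAF container.
        let audioWriter = try AVAssetWriter(outputURL: audioFileURL, fileType: .caf)
        let audioInput = AVAssetWriterInput(mediaType: .audio, outputSettings: [
            AVFormatIDKey: kAudioFormatLinearPCM,
            AVSampleRateKey: 44_100,
            AVNumberOfChannelsKey: 1,
            AVLinearPCMBitDepthKey: 16,
            AVLinearPCMIsFloatKey: false,
            AVLinearPCMIsBigEndianKey: false,
            AVLinearPCMIsNonInterleaved: false
        ])
        audioInput.expectsMediaDataInRealTime = true
        audioWriter.add(audioInput)

        return (videoWriter, audioWriter, videoInput, audioInput)
    }

    // In the AVCaptureVideoDataOutput / AVCaptureAudioDataOutput delegate
    // callback, route each CMSampleBuffer to the matching writer input, e.g.:
    //
    //   if output is AVCaptureVideoDataOutput, videoInput.isReadyForMoreMediaData {
    //       videoInput.append(sampleBuffer)
    //   } else if output is AVCaptureAudioDataOutput, audioInput.isReadyForMoreMediaData {
    //       audioInput.append(sampleBuffer)
    //   }

Whether the iPad 3 keeps up at full frame rate is something you would have to measure; setting expectsMediaDataInRealTime on each input is the main hint the writers need for live capture.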

How to avoid the audio-only screen when using Apple's HTTP Live Streaming for video?

I'm working with Apple's HTTP Live Streaming protocol which, for apps submitted to the App Store, requires that there be an audio-only stream as part of the multiplex. As a result, the first segment (10 seconds) is always audio-only and the image below is shown instead of the beginning of the video stream, regardless of the amount of bandwidth that's actually available.
I know I can show a static image instead, but I'm wondering if there's a way to delay the start of the stream until it's determined whether there's enough bandwidth to go straight to the video stream.
The order of the bitrates in the manifest file is key, as the device tries to play the bitrate variants in order. We recommend listing the bitrates from highest to lowest, to avoid starting the video off with the worst bitrate and then switching only once iOS has detected sufficient bandwidth.
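As an illustration of that ordering, a hypothetical master playlist could look like the following, with the audio-only variant listed last so it isn't the first thing the player tries (the URIs, bandwidths, and codec strings are made up):

    #EXTM3U

    # Variants listed from highest to lowest bandwidth, per the answer above.
    #EXT-X-STREAM-INF:BANDWIDTH=2500000,RESOLUTION=1280x720,CODECS="avc1.4d401f,mp4a.40.2"
    high/prog_index.m3u8
    #EXT-X-STREAM-INF:BANDWIDTH=1200000,RESOLUTION=960x540,CODECS="avc1.4d401e,mp4a.40.2"
    mid/prog_index.m3u8
    #EXT-X-STREAM-INF:BANDWIDTH=600000,RESOLUTION=640x360,CODECS="avc1.42e015,mp4a.40.2"
    low/prog_index.m3u8

    # Audio-only fallback required for App Store review, listed last.
    #EXT-X-STREAM-INF:BANDWIDTH=64000,CODECS="mp4a.40.2"
    audio/prog_index.m3u8

The player starts with the first variant in the list and only switches once it has measured the available bandwidth, which is why the position of the audio-only entry matters.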

Resources