Using AVPlayer and SKVideoNode, code that worked fine on iOS 9.x no longer plays video on the iOS 10 betas. The file appears to be playing, but I get the error message
Failed to create IOSurface image (texture)
This happens for all formats that worked before (MP4, HLS, etc.).
The error message seems to come from somewhere under the hood.
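For context, a minimal sketch of the kind of setup being described, an AVPlayer-backed SKVideoNode in a SpriteKit scene. The URL and scene size here are placeholders, not the actual app code:

```swift
import SpriteKit
import AVFoundation

// Placeholder URL; any MP4 or HLS stream reproduces the setup.
let url = URL(string: "https://example.com/sample.mp4")!
let player = AVPlayer(url: url)

// Wrap the player in a video node and add it to a scene.
let videoNode = SKVideoNode(avPlayer: player)
let scene = SKScene(size: CGSize(width: 1280, height: 720))
videoNode.position = CGPoint(x: scene.size.width / 2, y: scene.size.height / 2)
videoNode.size = scene.size
scene.addChild(videoNode)

// On the iOS 10 betas, starting playback here is where the
// "Failed to create IOSurface image (texture)" message appears.
videoNode.play()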
Related
I am unable to play HLS video with a 5.1 audio track on Chromecast 2 and Chromecast Ultra. The same files worked just fine on firmware releases before 1.28.100555.
To avoid any Objective-C programming errors on my part, all the tests were done using a slightly modified "CastVideos-ios" (the official sample from Google's Cast SDK GitHub).
The only modification I made is to "media_list_1_url", so that the application loads a custom JSON file pointing to my video sample files.
The JSON list is available at this link:
http://144.76.13.14/GoogleCastTestList.json
When I try to play the 5.1-audio version on Chromecast 2 or Chromecast Ultra, the video simply does not load. Both test files play without problems on Chromecast 1.
Can someone help me work out whether this is a Chromecast firmware issue, or whether I must modify the Cast sender app code to play 5.1 audio files?
I am having issues with downloading and playing an HLS URL at the same time.
I have followed the WWDC 2016 video and also the link below.
But I can either play the audio or download it; I am not able to do both at the same time.
The download also only works on iOS 11 beta releases, not on iOS 10 devices.
Am I missing something, or is there some other way of achieving this?
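For reference, the play-while-downloading approach from that WWDC session can be sketched roughly as below (the session identifier and asset title are placeholders). The key point is that the AVPlayerItem is built from the download task's own urlAsset, which is what lets playback and download share one asset:

```swift
import AVFoundation

final class HLSDownloader: NSObject, AVAssetDownloadDelegate {
    private var session: AVAssetDownloadURLSession!
    private var player: AVPlayer?

    func downloadAndPlay(hlsURL: URL) {
        // Downloads must run on a background session.
        let config = URLSessionConfiguration.background(withIdentifier: "hls-download")
        session = AVAssetDownloadURLSession(configuration: config,
                                            assetDownloadDelegate: self,
                                            delegateQueue: .main)

        let asset = AVURLAsset(url: hlsURL)
        guard let task = session.makeAssetDownloadTask(asset: asset,
                                                       assetTitle: "Sample",   // placeholder
                                                       assetArtworkData: nil,
                                                       options: nil) else { return }
        task.resume()

        // Build the player item from the *same* asset the download task is
        // using, so playback and download proceed concurrently.
        player = AVPlayer(playerItem: AVPlayerItem(asset: task.urlAsset))
        player?.play()
    }

    // Persist the location the system chose for the downloaded asset.
    func urlSession(_ session: URLSession, assetDownloadTask: AVAssetDownloadTask,
                    didFinishDownloadingTo location: URL) {
        UserDefaults.standard.set(location.relativePath, forKey: "assetPath")
    }
}
```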
There are a lot of changes related to HLS and FairPlay Streaming (FPS) in iOS 11.
There are also two examples from Apple for developers:
FairPlay Streaming Server SDK beta - only for Xcode 9
FairPlay Streaming Server SDK - for Xcode 8
As for me: on iOS 10 I can play and download HLS+FPS without any issues, but the same code does not work on iOS 11 (as I wrote, there are a lot of new classes and workflows in it).
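To illustrate what "new classes and workflows" means here: iOS 11 introduced AVContentKeySession for FairPlay key delivery. A minimal, hedged sketch of that workflow (delegate body elided, queue label is a placeholder):

```swift
import AVFoundation

final class FPSKeyManager: NSObject, AVContentKeySessionDelegate {
    // New in iOS 11: a dedicated session object for content-key delivery.
    let keySession = AVContentKeySession(keySystem: .fairPlayStreaming)

    func prepare(asset: AVURLAsset) {
        keySession.setDelegate(self, queue: DispatchQueue(label: "fps.keys"))
        // The asset now receives its keys through this session instead of
        // the old AVAssetResourceLoaderDelegate path.
        keySession.addContentKeyRecipient(asset)
    }

    // Called when the asset needs a content key.
    func contentKeySession(_ session: AVContentKeySession,
                           didProvide keyRequest: AVContentKeyRequest) {
        // Here one would call
        // keyRequest.makeStreamingContentKeyRequestData(forApp:contentIdentifier:options:completionHandler:)
        // to build the SPC, send it to the key server, and respond with the CKC.
    }
}
```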
I have an audio streaming app that streams music from a network resource; the audio files are MP4s encoded as AAC using Azure's Media Encoder. I'm using the open-source libraries https://github.com/muhku/FreeStreamer and https://github.com/douban/DOUAudioStreamer, both of which use Apple's stream parser from the AudioToolbox framework, but on iOS 9 the framework throws a kAudioFileStreamError_InvalidFile error. Interestingly, apps compiled with the iOS 9 SDK continue to stream the same file perfectly on iOS 7/8 devices, but not on iOS 9.
I can't figure out whether Apple broke something in iOS 9 or we have the files encoded incorrectly on our end, given that they play just fine on iOS 7/8. Here is a sample file from our server that plays fine on iOS 7/8 but not 9: http://patarims.streaming.mediaservices.windows.net/858959cc-a7c2-4053-9f41-9ad32b406a8b/eyhqG1mC_AAC_und_ch2_128kbps.mp4
If anyone could point me in the right direction, how do I go about debugging the issue? Do we need to re-encode our files in some specific way?
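One way to start debugging this is to feed the raw file bytes straight into Apple's stream parser and compare the returned status codes across OS versions, which separates a parser change from an encoding problem. A rough sketch, with the file path as a placeholder:

```swift
import AudioToolbox
import Foundation

// Open a parser stream, hinting the MPEG-4 container type.
// The callbacks are stubs; we only care about the status codes here.
var streamID: AudioFileStreamID?
let openStatus = AudioFileStreamOpen(nil,
                                     { _, _, _, _ in },       // property listener (unused)
                                     { _, _, _, _, _ in },    // packets callback (unused)
                                     kAudioFileMPEG4Type,
                                     &streamID)
assert(openStatus == 0)

// Placeholder path to a local copy of the problem file.
let data = try! Data(contentsOf: URL(fileURLWithPath: "/path/to/sample.mp4"))
let parseStatus = data.withUnsafeBytes { buffer in
    AudioFileStreamParseBytes(streamID!, UInt32(buffer.count), buffer.baseAddress!, [])
}

// On iOS 9 this reportedly comes back as kAudioFileStreamError_InvalidFile;
// the same bytes on iOS 7/8 should parse cleanly, which would point at the
// parser rather than the encoder.
print("parse status:", parseStatus)
```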
We are running a Wi-Fi network with a captive portal, and users need to watch a video before they get Wi-Fi access. The problem is that the video will not play in the iPhone websheet (the page that pops up when you connect to the Wi-Fi network). It shows the following error:
"The video could not be loaded, either because the server or network failed or because the format is not supported"
The video is in MP4 format with the H.264 codec at 30 fps. The encoder profile is Baseline, level 1.3. More details as follows:
x264 Unparse: level=1.3:bframes=0:cabac=0:8x8dct=0:weightp=0:vbv-bufsize=2000:vbv-maxrate=768
We are using JWPlayer 5.10.
The iPhone that showed the error was an iPhone 5c running iOS 8.4.1. The error only occurs in the websheet; the video plays without any problem in Safari. The video also plays successfully in both the websheet and Safari on an iPhone 6.
Any idea and help would be much appreciated!
It is documented all over the web that iOS does not support multiple simultaneous HTML5 audio streams. I ran a test, and here are the observations:
iOS 5 (iPad 1) - can play only one audio stream at a time.
iOS 6 (iPad 2) - can play multiple audio streams.
iOS 7 (iPad 3) - can play multiple audio streams.
It looks like on iOS 6 and 7 we can play multiple HTML5 audio elements, but to my surprise I could not find any discussion of this on the web. Apple's documentation also still says, "Currently, all devices running iOS are limited to playback of a single audio or video stream at any time."
I am not sure whether multiple HTML5 audio elements will work across all devices that support iOS 6 and iOS 7, or whether Apple will continue to support this in the next version of iOS. Does anyone have any idea about this?