NetStream HTTP video not playing on iOS device

I am trying to play a video on an iPad; my code is below:
public function init_RTMP():void
{
    videoURL = "http://rest************_iphone_high.mp4";
    vid = new Video();
    nc = new NetConnection();
    nc.addEventListener(NetStatusEvent.NET_STATUS, onConnectionStatus);
    nc.connect(null);
}
private function onConnectionStatus(e:NetStatusEvent):void
{
    if (e.info.code == "NetConnection.Connect.Success")
    {
        trace("Creating NetStream");
        netStreamObj = new NetStream(nc);
        metaListener = new Object();
        metaListener.onMetaData = received_Meta;
        netStreamObj.client = metaListener;
        netStreamObj.play(videoURL);
        vid.attachNetStream(netStreamObj);
        addChild(vid);
    }
}
When I play it on my computer it works fine, but when I build an iOS app from it and install it on the device, it shows a blank white screen.
If anyone has had the same problem or has any ideas, please share.

As VC.One pointed out, AIR for iOS does not play most (though not all; it will occasionally play a very specific encode type) h.264-encoded videos. There are three solutions:
As VC.One said, you can encode as FLV. I would not recommend this. FLV is not hardware accelerated (unless things have changed recently and I have missed the updates) and will run entirely off the CPU, meaning your app will run slowly and will eat battery much faster than normal.
Use StageWebView, in which case you just plug in the URL of the video and it will play using the native video player. The downside is that you cannot skin the player and you cannot control it. Once it begins playing, you have no control over it except for unloading the page. It works very well, however, and is fairly easy to implement, though the video will appear on top of the stage (it is not in the Display List).
The last option is to use StageVideo. This plays videos using the native framework, so you can easily play h.264 and it will be hardware accelerated. Additionally, it is just a NetStream player, so you have full control over it. Best of all, it has no chrome, so you can build a player around the video screen. However, like StageWebView, StageVideo is not in the Display List. Unlike StageWebView, it is rendered directly on the stage, below everything else, so the app itself will cover the video. You can get around this by creating a class to mask your app around the video, but it is incredibly difficult to pull off properly. It took me about 12 hours to create my StageVideo player and the masking class, plus another half day later on fixing issues with how the masking class handles DPI changes (hint: do NOT set applicationDPI if you are using Flex). A minimal sketch of the basic StageVideo setup follows at the end of this answer.
As always, make sure your AIR SDK is up to date as well. AIR 3.5-3.7 all added a ton of new features and bug fixes for iOS applications, so updating to AIR 3.7 might actually solve your issue or at least make it less of a problem (I don't think it will, but it is always worth a shot, right?).
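Here is a minimal sketch of the basic StageVideo wiring (not a full player): it assumes videoURL is the same progressive MP4 from the question, that the device reports StageVideo as available, and it leaves out all error handling and the masking class.
import flash.events.NetStatusEvent;
import flash.events.StageVideoAvailabilityEvent;
import flash.geom.Rectangle;
import flash.media.StageVideo;
import flash.media.StageVideoAvailability;
import flash.net.NetConnection;
import flash.net.NetStream;

var sv:StageVideo;
var nc:NetConnection;
var ns:NetStream;

stage.addEventListener(StageVideoAvailabilityEvent.STAGE_VIDEO_AVAILABILITY, onAvailability);

function onAvailability(e:StageVideoAvailabilityEvent):void
{
    if (e.availability != StageVideoAvailability.AVAILABLE)
        return;                                            // fall back to Video or StageWebView here
    sv = stage.stageVideos[0];
    sv.viewPort = new Rectangle(0, 0, stage.stageWidth, stage.stageHeight);

    nc = new NetConnection();
    nc.addEventListener(NetStatusEvent.NET_STATUS, onStatus);
    nc.connect(null);
}

function onStatus(e:NetStatusEvent):void
{
    if (e.info.code != "NetConnection.Connect.Success")
        return;
    ns = new NetStream(nc);
    ns.client = { onMetaData: function(info:Object):void {} };
    sv.attachNetStream(ns);                                // StageVideo renders BELOW the Display List,
    ns.play(videoURL);                                     // so keep the area above the viewport clear
}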

See this link:
Netstream video not playing on iPad
Basically, it was fixed by encoding the video file as FLV instead of MP4.

Related

YTPlayerView: play video in full screen on iPad

I am using "YTPlayerView" to play video in iOS. When I am using this in iPhone it automatically play video in full screen but when I am using same thing in the iPad it does not switch to full screen. I also tried the below parameters but no success.
NSDictionary *playerVars = @{ @"playsinline" : @0 };
[self.playerView loadWithVideoId:videoIDURL playerVars:playerVars];
We developed a very video-intensive app for both iPhone and iPad in the past, and actually had the opposite problem: we wanted to use YTPlayerView for displaying videos inline (non-fullscreen) on both devices, but weren't able to get it working on the iPhone. We ended up using XCDYouTubeVideoPlayerViewController, which resolved most of our issues. This is not really the recommended approach, since it pretty much breaks YouTube's terms of use (it parses the HTML page, finds the .mp4 URL, and plays it in MPMoviePlayer), but it gives you all of the flexibility you need for full screen, loading times, and other notifications.
Throughout the process of trying to find a way to work with YTPlayerView, I contacted engineers from Google who work on the YouTube helper framework, and they recommended working directly with the iframe player API, which lets you customize the controls and receive events about the video. This means you'll have to use your own web view instead of the YouTube helper. The communication between the JavaScript and Objective-C parts can be tricky (and that is exactly what the helper library tries to spare the developer from), but it will give you more flexibility.
This can also help when using a UIWebView to display YouTube videos: Playing YouTube Videos in a WebView Inline.
I know it's not exactly what you asked, but I hope it can help lead you in the right direction. I've had a lot of issues working with YouTube videos on iOS, so I know how frustrating it can be. Good Luck!

AVComposition breaks on AirPlay

I have a video composition which I'd like to play over AirPlay (without mirroring). The app works as expected when using normal AirPlay mirroring, but I'd like to get the speed, reliability, and resolution bump you get from using AirPlay video instead.
The problem is that when I set
player.usesAirPlayVideoWhileAirPlayScreenIsActive = YES;
...the player goes blank.
Notes:
Since I don't create separate windows for each display, they are both trying to use the same AVPlayer.
My AVVideoComposition contains different files and adds opacity ramps between them.
This unanswered question suggests that the problem is more likely due to the fact that I'm playing an AVComposition than the use of a shared player: AVComposition doesn't play via Airplay Video
Two questions:
Do I have to get rid of the player on the iPad?
Can an AVVideoComposition ever be played over AirPlay?
I can't comment yet, so I had to post this as an answer, although it might not fully answer the questions.
I had a similar issue, and in the end I found out that when AVPlayer plays an AVComposition it simply doesn't display anything on the external display. That's why I had to handle it myself by listening to the UIScreen connection notifications.
I have to say that it all worked pretty well. I first check whether there is more than one screen; if there is, I simply move the AVPlayer to that screen, while displaying a simple message on the device's own screen saying that the content is being played on the AirPlay device (with its name). This way I can put whatever I want on the external display, and it is not very complicated. I do the same thing when I receive UIScreenDidConnectNotification.
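In rough outline, that looks something like this (a sketch only, not the exact code from this answer; the player and externalWindow properties and the on-screen message are illustrative):
#import <AVFoundation/AVFoundation.h>

- (void)observeScreenNotifications
{
    [[NSNotificationCenter defaultCenter] addObserver:self
                                             selector:@selector(screensChanged:)
                                                 name:UIScreenDidConnectNotification
                                               object:nil];
    [[NSNotificationCenter defaultCenter] addObserver:self
                                             selector:@selector(screensChanged:)
                                                 name:UIScreenDidDisconnectNotification
                                               object:nil];
}

- (void)screensChanged:(NSNotification *)note
{
    if ([UIScreen screens].count > 1) {
        UIScreen *external = [UIScreen screens][1];
        self.externalWindow = [[UIWindow alloc] initWithFrame:external.bounds];
        self.externalWindow.screen = external;
        AVPlayerLayer *layer = [AVPlayerLayer playerLayerWithPlayer:self.player];
        layer.frame = self.externalWindow.bounds;
        [self.externalWindow.layer addSublayer:layer];
        self.externalWindow.hidden = NO;
        // show a "Playing on <AirPlay device name>" label on the device's own screen here
    } else {
        self.externalWindow = nil;   // external screen gone: tear down and play locally again
    }
}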
That was fine until I noticed that the composition plays back really choppily on the external display, even if it consists of only one video without any complex edits or overlays. The same video plays perfectly if I save it to the Camera Roll or if I use MPMoviePlayerController.
I've tried many things like lowering resolutions, lowering renderScale and so on but with no success.
One thing that bothers me more is how Apple actually does this in iMovie: if you have AirPlay enabled and you play a project (note that it is still not rendered, so it must be using a composition to display it), right after tapping the play button it opens a player that plays the content really smoothly on the external monitor. If, however, you activate AirPlay from the player, it closes and starts rendering the project. After that it plays it, I think by using MPMoviePlayerController.
I'm still trying to find a solution and will post back if I have any success.
So, for the two questions:
I don't see why you would have to get rid of it.
Yes, it can be played, but with a different technique and, obviously, with issues.
In the app's .plist, create a new item called:
Required background modes
and add a new array element to it called:
App plays audio or streams audio/video using AirPlay
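For reference, those Xcode labels correspond to the raw UIBackgroundModes key with the audio value in the plist source, roughly:
<key>UIBackgroundModes</key>
<array>
    <string>audio</string>
</array>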
Not sure if you have already tried this, but you don't mention it in your post.
Cheers!

Mobile Safari: Audio + cache manifest

I have a small web app that plays really short sound bits on the click of several buttons. It explicitly targets mobile Safari on iOS (iPad).
After reading here and elsewhere about the several "shortcomings" of HTML5 audio in this context on mobile Safari, and trying a few "hacks" and tricks, I'm stuck in a situation where Safari seems, for lack of a better word, simply broken:
I can play sound A (it takes a long time to start; I'm assuming it's downloading [again]?) on the click of button A. After that, clicking button B will immediately play sound A again. Same for button C. In some cases it will play a different sound, sometimes even the right one, but mostly sound A. The format in use was .aiff; it is now .m4a.
After writing a few tiny versions myself, I decided to go with the Buzz library to handle the sound loading/playing/etc..
Funnily enough, their demo includes a game, which does pretty much exactly what I need and triggers the same faulty behavior. I even ended up in a situation where any audio player in mobile Safari in any tab would play a certain sound out of the Buzz demo game (!).
I was hoping a cache manifest might help overcome Apple's preloading limitations and force the app to play the sound right after hitting the button in offline mode. But after confirming that the whole app had been cached, I can't play/hear any sound in offline mode.
Has anyone managed to get something like this to work somehow? (Having seen how Apple handles certain things, I don't expect much of a response, though…)
Update 1:
The example in this answer causes the same effect: How to synthesize audio using HTML5/Javascript on iPad
Update 2:
Updating iOS (and with it Safari) seems to resolve the audio bug. The cache manifest doesn't seem to affect audio files, though; those files are just not available at all.
After removing the cache manifest the app works okay, but adding it to the "home screen" and reloading it prevents the audio from playing as well.
I wish I could tell you there's a magic formula to get all your HTML5 media to work perfectly, but there isn't. Mobile support for HTML5 audio/video is pretty poor right now, much further behind than its desktop predecessors. To make matters worse, each of the different platforms handles it differently, and most of them only support it in semi-recent versions.
However, there are some tricks you can employ to get media files to work correctly in mobile Safari. In order to explain them, you'll need to understand some of its shortcomings.
1) You can't load multiple audio / video files
It's been my experience that the browser will only load one file at a time. If you play one file, go and play another, and then come back to the first, it'll just load it all over again. And, although I didn't try it myself, I don't believe a cache manifest will help you here.
What I had to do was combine all my audio files into one large audio file. Then, depending on which track was requested, I'd move the play position to the appropriate starting position and play that track. I'd use a setInterval to examine the playback every few milliseconds and determine whether the current play position had hit the end of the track; once it did, I paused it. Plus, I gave myself a few seconds (2-3) between each track, just in case the phone's CPU was under a lot of load and the check ran a little too late.
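A rough sketch of that single-file "audio sprite" idea, assuming one audio element that contains all the sounds back to back; the element id and the track offsets below are made-up placeholders:
// one <audio id="sprite" src="all-sounds.m4a"></audio> holding every sound back to back
var sprite = document.getElementById('sprite');
var tracks = {                                   // hypothetical start/end offsets in seconds
    soundA: { start: 0.0, end: 1.2 },
    soundB: { start: 4.0, end: 5.5 }
};
var watcher = null;

function playTrack(name) {
    var t = tracks[name];
    sprite.currentTime = t.start;                // jump to the track's start position
    sprite.play();                               // on iOS this must originate from a click event
    clearInterval(watcher);
    watcher = setInterval(function () {
        if (sprite.currentTime >= t.end) {       // reached the end of this track
            sprite.pause();
            clearInterval(watcher);
        }
    }, 30);                                      // poll every few milliseconds
}

// e.g. document.getElementById('buttonA').onclick = function () { playTrack('soundA'); };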
2) You can't auto-play audio / video files
Apple built into their HTML5 media tag technology the limitation that these tracks would only load and play in response to a user click event. That way, developers couldn't auto-play annoying tracks when you went to their websites.
When I was using audio/video tags, I was building a rich-media advertisement, so I hooked my audio/video track loading to the banner click event, i.e. the click that expands the advertisement.
What I'd suggest you do is have a small lightbox popup come up, asking the user whether they want to turn sound on or off. You can attach your load functions to the click event of that popup, regardless of whether the user turns the sound on or off.
Personally, I didn't have much luck using the load() function. I'd run that function to load the audio, then click play, and it would just load it again. It could well be that I just didn't do it right, so feel free to prove me wrong and get that working. What I did instead was tell the track to play, so that it would start loading, and then use a setInterval to see whether the current play time had incremented by a few milliseconds. Once I noticed that the track had started to play, I'd immediately pause it.
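A sketch of that play-then-pause preloading trick, wired to a user click as iOS requires; the element id here is illustrative:
function primeAudio() {
    var audio = document.getElementById('track');
    audio.play();                                // starts the download on iOS
    var check = setInterval(function () {
        if (audio.currentTime > 0) {             // playback has genuinely started
            audio.pause();
            audio.currentTime = 0;
            clearInterval(check);
        }
    }, 50);
}
// document.getElementById('soundToggle').onclick = primeAudio;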
3) Audio/Video tags are only supported in iOS 4.0 and higher
There's no trick to getting around this. It's just the facts.
Here are a few sites that helped when I was developing with audio/video tags:
http://www.w3.org/wiki/HTML/Elements/video
http://dev.w3.org/html5/spec-author-view/video.html#media-elements
Good luck!

Interactive Flash content on iOS

I want to play interactive (user inputs/actions) Flash content and videos on iOS devices. I have FLV files in which the user can provide input, such as option selection, page turns, etc.
I have two approaches in mind for this functionality. Please correct me if I am wrong.
1. Adobe AIR can be used on iPad devices. Does it have the ability to parse Flash content at run time (i.e., use Flash content as resources in the bundle)?
2. With the help of the FFmpeg library, Flash files/videos will work, but will it provide user actions/interactions?
No to both. Adobe AIR can only be used to create applications; you can't actually play Flash files with it. FFmpeg will only play Flash video; it will not allow interaction.
Basically, if you have interactive Flash content that you want to display on the iPhone, you are going to have to think of a different way to present it.
Since AIR 3.8 it has become much easier to run SWF content on iOS, but it is still a pain if your content is external, yes...
Also, I was able to play dynamic video using Starling and FLV in the background, at 12 fps on an iPod and 24 fps on iPhones and iPads.
The key there is how you upload the bitmapData to the GPU:
flash.display3D.textures.Texture(videoLayer.texture.base).uploadFromBitmapData(bitmapData);
Where videoLayer is a Starling Image and bitmapData is a BitmapData drawn from the video.
The rest of the code is trivial stuff that you can find easily online, but Texture.uploadFromBitmapData is what really made a performance difference.
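A rough sketch of the surrounding draw-and-upload step, under the assumptions that ns is the NetStream playing the FLV, videoLayer is the Starling Image (wrapping a texture of the same size as the BitmapData), and uploadVideoFrame() is called once per frame; those names are illustrative:
import flash.display.BitmapData;
import flash.display3D.textures.Texture;
import flash.media.Video;

var video:Video = new Video(512, 512);
video.attachNetStream(ns);                               // ns: the NetStream playing the FLV
var bitmapData:BitmapData = new BitmapData(512, 512, false, 0x000000);

function uploadVideoFrame():void
{
    bitmapData.draw(video);                              // grab the current video frame
    // upload straight into the texture already used by the Starling Image:
    Texture(videoLayer.texture.base).uploadFromBitmapData(bitmapData);
}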
See it in action in the app store: https://itunes.apple.com/app/id723967141?mt=8

How do I make my HTML5 video player autoplay on iPad, like filmon.com?

I'm trying to create an HTML5 video player that automatically starts streaming video. I have searched a lot, but I haven't achieved my goal.
Then I found www.filmon.com, where all videos start to play automatically on iPad.
Does anyone know how they did it? I looked at their JS files, but I cannot make mine start automatically.
Apple has specifically disabled every method and workaround for autoplaying video on iPads and iPhones (the "autoplay" attribute, and JavaScript solutions like triggering a hidden link's "onclick" event).
I have yet to find a way to autoplay on iPads and it looks like Apple is continuing to squash all efforts to do it. They state, "In Safari on iPhone OS (for all devices, including iPad), where the user may be on a cellular network and be charged per data unit, autobuffering and autoplay are disabled. No data is loaded until the user initiates it."
As a quick update, I just checked Filmon.com and the videos there no longer seem to autoplay on an iPad. Example: http://demand.filmon.com/distant-roads-173-cnd-ontario-ca-1 autoplays in Chrome, but not on the iPad.
I don't think the iPhone or iPad plays streams automatically, due to the high traffic that would cause.
Why don't you play it manually using a script at document ready?
Something like this:
window.onload = function() {
    var audio = document.getElementById('audio');
    audio.play();
};
