I'm currently trying to get an MP4 to play in my app. The MP4 plays fine in VLC (a basic check to make sure the file itself isn't broken).
The code I'm using looks like this:
private void startAnimation()
{
    using (var pool = new NSAutoreleasePool())
    {
        InvokeOnMainThread(() =>
        {
            player = new MPMoviePlayerController(NSUrl.FromFilename("Graphics/videos/noaudio-data-download.mp4"))
            {
                AllowsAirPlay = true,
                Fullscreen = true,
                ScalingMode = MPMovieScalingMode.Fill,
                RepeatMode = MPMovieRepeatMode.One,
                SourceType = MPMovieSourceType.File,
                ShouldAutoplay = true,
                ControlStyle = MPMovieControlStyle.Fullscreen,
            };
            player.View.Frame = View.Bounds;
            View.AddSubview(player.View);
            View.BringSubviewToFront(player.View);
            player.PrepareToPlay();
            player.Play();
        });
    }
}
private void stopAnimation()
{
    player.Stop();
    player.Dispose();
}
All I get is a black screen and this incredibly unhelpful error:
2014-01-05 22:00:44.995 ftrack2ios[85614:80b] _itemFailedToPlayToEnd: {
    kind = 1;
    new = 2;
    old = 0;
}
From what I've read here and on other forums, this error can be caused by a pile of different things; most of them seem to come down to resizing.
The error occurs on a device as well as on the simulator. I'm not sure if this is an iOS 7 issue, as it used to work on iOS 6 after some playing around.
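One thing worth ruling out is whether the file actually ends up in the app bundle at that path, since this error also shows up when the item simply cannot be loaded; a minimal check, assuming the mp4 is added to the project with Build Action "BundleResource":
var path = NSBundle.MainBundle.PathForResource("Graphics/videos/noaudio-data-download", "mp4");
if (path == null)
{
    // File missing from the bundle: check the Build Action and the relative path.
    Console.WriteLine("Video not found in bundle");
    return;
}
player = new MPMoviePlayerController(NSUrl.FromFilename(path));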
I have the same problem, and I found these:
https://discussions.apple.com/message/23203292#23203292
http://www.techisky.com/how-to/play-mp4-on-ios-7-ipad.html
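Files that play in VLC but not on iOS usually come down to the codec: the built-in players only handle a narrow set of formats, with H.264 or MPEG-4 video plus AAC audio being the safe choice. As a rough check from code, AVFoundation can be asked whether it considers the asset playable; a small sketch, reusing the path from the question:
// Sketch: ask AVFoundation whether the bundled file is playable on this device.
var asset = AVAsset.FromUrl(NSUrl.FromFilename("Graphics/videos/noaudio-data-download.mp4"));
Console.WriteLine("Playable: " + asset.Playable); // false usually points at an unsupported codec or profile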
I have a Xamarin.iOS app in which I want to stream videos from an API endpoint that supports HTTP range requests. I've reviewed many similar questions here on SO, but I cannot seem to get the AVPlayer to start playing the video file before it is downloaded fully, no matter what I try.
I've tried:
KVO on playbackLikelyToKeepUp, playbackBufferEmpty and playbackBufferFull to play the video as soon as it is ReadyToPlay
set AutomaticallyWaitsToMinimizeStalling = false on the AVPlayer
set CanUseNetworkResourcesForLiveStreamingWhilePaused = true, and PreferredForwardBufferDuration = 1 on the AVPlayerItem
called PlayImmediatelyAtRate(1) on the AVPlayer
But still the file is downloaded fully before the video starts to play, which causes a delay for the user.
Is it possible to get AVPlayer to start playing a video file before it has completed downloading it, similar to how the HTML video tag does it?
Here is my current code:
private void SetUpPlayer()
{
    if (ViewModel.VideoStreamUrl == null)
    {
        return;
    }

    // See https://stackoverflow.com/questions/38865797/how-to-play-video-with-avplayerviewcontroller-avkit-in-xamarin-ios
    _aVPlayerItem = new AVPlayerItem(ViewModel.VideoStreamUrl)
    {
        CanUseNetworkResourcesForLiveStreamingWhilePaused = true,
        PreferredForwardBufferDuration = 1,
    };

    // See https://stackoverflow.com/questions/38867190/how-can-i-check-if-my-avplayer-is-buffering/38867386#38867386
    _playbackLikelyToKeepUpObserver?.Dispose();
    _playbackLikelyToKeepUpObserver = (NSObject)_aVPlayerItem.AddObserver("playbackLikelyToKeepUp",
        NSKeyValueObservingOptions.New,
        AVPlayerItem_BufferUpdated);

    _playbackBufferEmptyObserver?.Dispose();
    _playbackBufferEmptyObserver = (NSObject)_aVPlayerItem.AddObserver("playbackBufferEmpty",
        NSKeyValueObservingOptions.New,
        AVPlayerItem_BufferUpdated);

    _playbackBufferFullObserver?.Dispose();
    _playbackBufferFullObserver = (NSObject)_aVPlayerItem.AddObserver("playbackBufferFull",
        NSKeyValueObservingOptions.New,
        AVPlayerItem_BufferUpdated);

    _aVPlayer = new AVPlayer(_aVPlayerItem)
    {
        AutomaticallyWaitsToMinimizeStalling = false,
    };

    var playerViewController = new AVPlayerViewController
    {
        Player = _aVPlayer,
    };

    AddChildViewController(playerViewController);
    View.AddSubview(playerViewController.View);
    playerViewController.View.Frame = View.Frame;
    playerViewController.ShowsPlaybackControls = true;

    _aVPlayer.PlayImmediatelyAtRate(1);
}

private void AVPlayerItem_BufferUpdated(NSObservedChange obj)
{
    ReportVideoBuffering();
}

private void ReportVideoBuffering()
{
    bool isBufferEmpty = _aVPlayerItem != null && _aVPlayerItem.PlaybackBufferEmpty;
    Console.WriteLine($"Buffer empty? {isBufferEmpty}");
    Console.WriteLine($"Player status? {_aVPlayer.Status}");

    if (_aVPlayer.Status == AVPlayerStatus.ReadyToPlay)
    {
        Console.WriteLine($"Playing video.");
        _aVPlayer.Play();
    }
}
The simple answer to the question is no: AVPlayer doesn't support streaming a plain file using HTTP range requests and partial-content (206) responses. In our case, we decided to use Azure Media Services to provide a streaming endpoint, which we can then use in the iPad app as well as on the web.
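For completeness, once a streaming (HLS) endpoint is in place, pointing the player at the manifest URL is all that is needed for playback to begin before the whole file has been transferred; a minimal sketch, where the URL is only a placeholder for whatever the streaming endpoint exposes:
// Sketch: play from an HLS manifest instead of the raw file; segments are fetched on demand.
var hlsUrl = NSUrl.FromString("https://example-streaming-endpoint/video/manifest(format=m3u8-aapl)");
_aVPlayerItem = new AVPlayerItem(hlsUrl);
_aVPlayer = new AVPlayer(_aVPlayerItem);
_aVPlayer.Play(); // starts once a few segments have buffered; no PlayImmediatelyAtRate tricks needed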
I have an Ionic app that is a metronome. Using the Web Audio API I got everything working with the oscillator, but when I switch to a wav file no audio plays on a real device (iPhone).
When testing in the browser with ionic serve (Chrome), the audio plays fine.
Here is what I have:
function snare(e) {
    var audioSource = 'assets/audio/snare1.wav';
    var request = new XMLHttpRequest();
    request.open('GET', audioSource, true);
    request.responseType = 'arraybuffer';

    // Decode asynchronously
    request.onload = function() {
        audioContext.decodeAudioData(request.response, function(theBuffer) {
            buffer = theBuffer;
            playSound(buffer);
        });
    };
    request.send();
}

function playSound(buffer) {
    var source = audioContext.createBufferSource();
    source.buffer = buffer;
    source.connect(audioContext.destination);
    source.start(0);
}
The audio sample is in www/assets/audio.
Any ideas where this could be going wrong?
I believe iOS devices require a user gesture of some sort before they allow audio to play; the audio context has to be "unlocked" by starting a sound from inside a touch or click handler.
It's July 2017, iOS 10.3.2, and we're still seeing this issue in Safari on iPhones. Interestingly, Safari on a MacBook is fine.
@Raymond Toy's general observation still appears to be true. But @padenot's approach (via https://gist.github.com/laziel/7aefabe99ee57b16081c) did not work for me in a situation where I wanted to play a sound in response to some external event/trigger.
Using the original poster's code, I've had some success with this:
var buffer; // added to make it work with OP's code

// keep the original function snare()

function playSound() { // dropped the argument for simplicity
    var source = audioContext.createBufferSource();
    source.buffer = buffer;
    source.connect(audioContext.destination);
    source.start(0);
}

function triggerSound() {
    function playSoundIos(event) {
        document.removeEventListener('touchstart', playSoundIos);
        playSound();
    }

    if (/iPad|iPhone/.test(navigator.userAgent)) {
        document.addEventListener('touchstart', playSoundIos);
    } else { // Android etc., or Safari, but not on iPhone
        playSound();
    }
}
Now calling triggerSound() will produce the sound immediately on Android and will produce the sound on iOS after the browser page has been touched.
Still not ideal, but better than no sound at all...
I had a similar issue on current iOS (15). I tried to play base64-encoded binary data, which worked in all browsers but not on iOS.
Finally, reordering the statements solved my issue:
let buffer = Uint8Array.from(atob(base64), c => c.charCodeAt(0));
let context = new AudioContext();

// these lines were within "play()" before
audioSource = context.createBufferSource();
audioSource.connect(context.destination);
audioSource.start(0);
// ---

context.decodeAudioData(buffer.buffer, play, (e) => {
    console.warn("error decoding audio", e);
});

function play(audioBuffer) {
    audioSource.buffer = audioBuffer;
}
Also see this commit in my project.
I assume that calling audioSource.start(0) inside the play() method was somehow too late, because it sits in a callback after context.decodeAudioData() and is therefore maybe "too far away" from the user interaction for iOS's standards.
How does one loop a video using AVPlayer under Xamarin.iOS? The Objective-C solution suggests using a notification observer, but it's not clear how to do this with C# syntax or the Xamarin binding of AVPlayer.
You can see the Objective-C question and answer here: Looping a video with AVFoundation AVPlayer?
Sorry, @Pandalink, we should have updated the answer here once we figured it out...
We found something quite simple:
NSNotificationCenter.DefaultCenter.AddObserver(AVPlayerItem.DidPlayToEndTimeNotification, (notify) =>
{
    player.Seek(CoreMedia.CMTime.Zero);
    notify.Dispose();
});
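If the player pauses at the end (the default ActionAtItemEnd), the seek alone may not keep the loop going; a slightly fuller variation, where player is the existing AVPlayer and _loopToken is just an assumed NSObject field kept so the observer can be removed later:
_loopToken = NSNotificationCenter.DefaultCenter.AddObserver(
    AVPlayerItem.DidPlayToEndTimeNotification,
    notification =>
    {
        player.Seek(CMTime.Zero); // rewind to the start
        player.Play();            // resume, since the default ActionAtItemEnd pauses the player
    },
    player.CurrentItem);          // only react to this player's item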
This worked for me:
AVAsset videoAsset;
AVPlayerItem videoPlayerItem;
AVPlayer videoPlayer;
AVPlayerLayer videoPlayerLayer;
NSObject videoEndNotificationToken;

public override void ViewDidLoad()
{
    videoAsset = AVAsset.FromUrl(NSUrl.FromFilename("video.mp4"));
    videoPlayerItem = new AVPlayerItem(videoAsset);
    videoPlayer = new AVPlayer(videoPlayerItem);
    videoPlayerLayer = AVPlayerLayer.FromPlayer(videoPlayer);
    videoPlayerLayer.Frame = View.Frame;
    View.Layer.AddSublayer(videoPlayerLayer);
    videoPlayer.Play();

    // Subscribe to video end notification
    videoPlayer.ActionAtItemEnd = AVPlayerActionAtItemEnd.None;
    videoEndNotificationToken = NSNotificationCenter.DefaultCenter.AddObserver(AVPlayerItem.DidPlayToEndTimeNotification, VideoDidFinishPlaying, videoPlayerItem);
}

private void VideoDidFinishPlaying(NSNotification obj)
{
    Console.WriteLine("Video Finished, will now restart");
    videoPlayer.Seek(new CMTime(0, 1));
}
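To avoid leaking the observer, it can be removed when the view goes away; a small addition that reuses the videoEndNotificationToken field from above:
public override void ViewDidDisappear(bool animated)
{
    base.ViewDidDisappear(animated);
    if (videoEndNotificationToken != null)
    {
        // Unsubscribe from the end-of-playback notification registered in ViewDidLoad.
        NSNotificationCenter.DefaultCenter.RemoveObserver(videoEndNotificationToken);
        videoEndNotificationToken = null;
    }
}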
The following code in Xamarin.iOS was working fine prior to the Xamarin.iOS update to v2.0.50727.
This is the code in a custom renderer in a Xamarin.Forms app:
class WatchVideoRenderer : PageRenderer
{
    MPMoviePlayerController moviePlayer;

    protected override void OnElementChanged(VisualElementChangedEventArgs e)
    {
        base.OnElementChanged(e);

        var url = new NSUrl("http://192.168.12.4:8085/MediaUploads/1/211/520140731170618/DPM202.mp4");
        moviePlayer = new MPMoviePlayerController();
        moviePlayer.ContentUrl = url;
        moviePlayer.View.Frame = new CGRect((float)((NativeView.Bounds.Width - 600) / 2), (float)((NativeView.Bounds.Height - 450) / 2), 600, 400);

        MPMoviePlayerController.Notifications.ObserveLoadStateDidChange(OnLoadStateChanged);
        MPMoviePlayerController.Notifications.ObservePlaybackDidFinish(OnPlaybackComplete);

        View.AddSubview(moviePlayer.View);
        moviePlayer.PrepareToPlay();
        moviePlayer.ShouldAutoplay = true;
        moviePlayer.Play();
    }

    private void OnLoadStateChanged(object sender, NSNotificationEventArgs e)
    {
        if (moviePlayer.LoadState == MPMovieLoadState.Playable)
        {
        }
    }

    private void OnPlaybackComplete(object sender, MPMoviePlayerFinishedEventArgs e)
    {
    }
}
As I said, this was working until the day before yesterday, after which I installed two Xamarin.iOS updates, and it is now failing. All I see is a black canvas and the video never loads.
No notifications from the MPMoviePlayerController are ever raised.
There is a release of this app scheduled for next week, and this last-minute bug is causing me headaches. Any help is really appreciated.
I resolved a similar issue by pushing a new image context, and that did the trick. I modified the code to include BeginImageContext() and EndImageContext():
UIGraphics.BeginImageContext(new CGSize(1,1));
var url = new NSUrl("http://192.168.12.4:8085/MediaUploads/1/211/520140731170618/DPM202.mp4");
moviePlayer = new MPMoviePlayerController();
moviePlayer.ContentUrl = url;
moviePlayer.View.Frame = new CGRect((float)((NativeView.Bounds.Width - 600) / 2), (float)((NativeView.Bounds.Height - 450) / 2), 600, 400);
UIGraphics.EndImageContext();
Try switching to AVPlayerViewController instead of MPMoviePlayerController. Throw this code inside your OnElementChanged method.
My renderer was a view renderer and not a page renderer, so you might have to tweak it a bit.
if (Control == null)
{
    AVPlayerViewController avpvc;
    AVPlayer avp;

    var url = NSUrl.FromString("SOME URL HERE"); // or NSUrl.FromFilename for a local file
    avp = new AVPlayer(url);

    avpvc = new AVPlayerViewController();
    avpvc.Player = avp;
    avpvc.ShowsPlaybackControls = true;

    avp.Play();
    this.SetNativeControl(avpvc.View);
}
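Since the renderer in the question is a PageRenderer (which is a UIViewController), another option is to present the controller instead of embedding its view; a minimal sketch, assuming it runs at the end of OnElementChanged:
// Present the player full screen instead of calling SetNativeControl.
// Depending on timing, this may need to be deferred until the page is actually on screen.
PresentViewController(avpvc, true, null);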
I'm using Xamarin with MvvmCross to create an iPad application. In this application I use the PictureChooser plugin to take a picture with the camera, in the way shown in the related YouTube video.
The code to accomplish this is fairly simple and can be found below. However, when testing on the actual device, the camera can end up rotated.
private readonly IMvxPictureChooserTask _pictureChooserTask;

public CameraViewModel(IMvxPictureChooserTask pictureChooserTask)
{
    _pictureChooserTask = pictureChooserTask;
}

private IMvxPictureChooserTask PictureChooserTask { get { return _pictureChooserTask; } }

private void TakePicture()
{
    PictureChooserTask.TakePicture(400, 95,
        async (stream) =>
        {
            using (var memoryStream = new MemoryStream())
            {
                stream.CopyTo(memoryStream);
                var imageBytes = memoryStream.ToArray();
                if (imageBytes == null)
                    return;

                filePath = ProcessImage(imageBytes, FileName);
            }
        },
        () =>
        {
            /* no action - we don't do cancellation */
        }
    );
}
This leads to unwanted behavior: the camera should stay steady and be prevented from rotating within the app. I have been trying some things out, like blocking rotation in the ShouldAutorotate override while in camera mode, but unfortunately without any results.
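For reference, the rotation lock I tried lives in the view controller that hosts the camera and looks roughly like this (a sketch; the _cameraIsShowing flag is something I set around TakePicture and is not part of the plugin):
public override bool ShouldAutorotate()
{
    // Block rotation while the camera sheet is up.
    return !_cameraIsShowing;
}

public override UIInterfaceOrientationMask GetSupportedInterfaceOrientations()
{
    return _cameraIsShowing ? UIInterfaceOrientationMask.Portrait : UIInterfaceOrientationMask.All;
}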
Is there any setting that I forgot to set on the PictureChooser, or is the override method the place where I should perform some magic?
Thanks in advance.
The answer to this question was given in the comments by user3455363, many thanks for that! It eventually turned out to be a bug in iOS 8; the iOS 8.1 update fixed the issue in my app!