I'm streaming O365 videos using Azure Media Player in a web app that is meant to be used only on mobile devices. It works on Windows Phone and Android, but the player gets stuck on iOS.
This is my code:
// Authenticate against the O365 Video REST API and ask for JSON responses
client.DefaultRequestHeaders.Add("Authorization", "Bearer " + bearer);
client.DefaultRequestHeaders.Accept.Add(new MediaTypeWithQualityHeaderValue("application/json"));
// First call: get the playback (manifest) URL
var response = await client.GetAsync($"{url}/GetPlaybackUrl('1')");
var content = await response.Content.ReadAsStringAsync();
var firstVal = JsonConvert.DeserializeObject<VideoToken>(content);
// Second call: get the AES streaming key access token for the player
response = await client.GetAsync($"{url}/GetStreamingKeyAccessToken");
content = await response.Content.ReadAsStringAsync();
var secondVal = JsonConvert.DeserializeObject<VideoToken>(content);
Client Side
<video id="newsVideoAMP" class="azuremediaplayer amp-default-skin amp-big-play-centered" tabindex="0"></video>
var initVideoPlayer = function (playbackUrl, streamingKeyAccessToken) {
    try {
        var myOptions = {
            "nativeControlsForTouch": false,
            controls: true,
            autoplay: false,
            techOrder: ["azureHtml5JS", "flashSS", "html5FairPlayHLS", "silverlightSS", "html5"],
            logo: { enabled: false }
        };
        newsVideoPlayer = amp("newsVideoAMP", myOptions, function () {
            // Surface any player error in an alert and append it to the #log list
            this.addEventListener(amp.eventName.error, function () {
                window.alert('error');
                console.log('Error: amp init');
                var errorDetails = newsVideoPlayer.error();
                window.alert(errorDetails);
                var code = errorDetails.code;
                var message = errorDetails.message;
                $("#log").append("<li><span>code: " + code + " - detail: " + message + "</span></li>");
            });
        });
        // Smooth Streaming source protected with AES; the token comes from GetStreamingKeyAccessToken
        newsVideoPlayer.src([
            {
                "src": playbackUrl,
                "type": "application/vnd.ms-sstr+xml",
                "protectionInfo": [
                    {
                        "type": "AES",
                        "authenticationToken": streamingKeyAccessToken
                    }
                ]
            }]);
    }
    catch (err) {
        console.log(err);
    }
};
I think the issue is related to video encoding, so I tried using GetPlaybackUrl('0') (and skipping the second token request), but then the player stops working on Windows Phone and Android and still doesn't work on iOS.
The logger in the callback function doesn't tell me anything useful, and I have also tried changing the tech order.
Is there a console to manage video encoding so I can avoid the AES algorithm and the decryption token? This doc explains that iOS works with the HTML5 tech with no token request (roughly the kind of source sketched below). How can I solve this? Thanks
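For reference, this is roughly the kind of token-free source I would expect the plain HTML5 tech on iOS to need; the (format=m3u8-aapl) suffix is the Azure Media Services HLS convention, and I don't know whether the O365 Video playback URL can actually be requested this way, so treat it as a sketch only:

// Hypothetical: a clear (no AES, no token) HLS source for the html5 tech on iOS.
// Assumes the streaming endpoint can serve an HLS-packaged manifest.
newsVideoPlayer.src([{
    src: playbackUrl.replace("/manifest", "/manifest(format=m3u8-aapl)"),
    type: "application/vnd.apple.mpegurl"
}]);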
I found a workaround to play videos on iOS devices.
Instead of using the REST API, I put an iframe element in my page with the video embedded.
Like this (using MVC Razor):
@{
    var url = "{YourWebsiteUrl}/portals/hub/_layouts/15/VideoEmbedHost.aspx?chId={YourChannelId}&vId={YourVideoId}&width=853&height=480&autoPlay=false&showInfo=false";
}
<iframe width=853 height=480 id="videoframe" src="@url" allowfullscreen data-spresponsive style='position: absolute; top: 0; left: 0; right: 0; bottom: 0; height: 100%; max-width: 100%;'></iframe>
I got this code from the popup of the "Embed" menu on the video page in Office 365 Video.
If somebody else knows another (and better) method, please let me know. Thanks
I am trying to get audio capture from the microphone working in Safari on iOS 11, after support was recently added.
However, the onaudioprocess callback is never called. Here's an example page:
<html>
<body>
<button onclick="doIt()">DoIt</button>
<ul id="logMessages">
</ul>
<script>
function debug(msg) {
if (typeof msg !== 'undefined') {
var logList = document.getElementById('logMessages');
var newLogItem = document.createElement('li');
if (typeof msg === 'function') {
msg = Function.prototype.toString(msg);
} else if (typeof msg !== 'string') {
msg = JSON.stringify(msg);
}
var newLogText = document.createTextNode(msg);
newLogItem.appendChild(newLogText);
logList.appendChild(newLogItem);
}
}
function doIt() {
var handleSuccess = function (stream) {
var context = new AudioContext();
var input = context.createMediaStreamSource(stream)
var processor = context.createScriptProcessor(1024, 1, 1);
input.connect(processor);
processor.connect(context.destination);
processor.onaudioprocess = function (e) {
// Do something with the data, i.e Convert this to WAV
debug(e.inputBuffer);
};
};
navigator.mediaDevices.getUserMedia({audio: true, video: false})
.then(handleSuccess);
}
</script>
</body>
</html>
On most platforms, you will see items being added to the messages list as the onaudioprocess callback is called. However, on iOS, this callback is never called.
Is there something else that I should do to try and get it called on iOS 11 with Safari?
There are two problems. The main one is that Safari on iOS 11 seems to automatically suspend new AudioContexts that aren't created in response to a tap. You can resume() them, but only in response to a tap.
(Update: Chrome mobile also does this, and Chrome desktop will have the same limitation starting in version 70 / December 2018.)
So, you have to either create it before you get the MediaStream, or else get the user to tap again later.
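If you go with the "tap again later" route, a minimal sketch looks like this (resumeButton is a hypothetical element; the context was created earlier, outside a tap, and is therefore suspended):

// Hedged sketch: resume a suspended AudioContext in response to a user tap.
resumeButton.addEventListener('click', function () {
    if (context.state === 'suspended') {
        context.resume().then(function () {
            console.log('AudioContext resumed, state:', context.state);
        });
    }
});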
The other issue with your code is that AudioContext is prefixed as webkitAudioContext in Safari.
Here's a working version:
<html>
<body>
<button onclick="beginAudioCapture()">Begin Audio Capture</button>
<script>
function beginAudioCapture() {
var AudioContext = window.AudioContext || window.webkitAudioContext;
var context = new AudioContext();
var processor = context.createScriptProcessor(1024, 1, 1);
processor.connect(context.destination);
var handleSuccess = function (stream) {
var input = context.createMediaStreamSource(stream);
input.connect(processor);
var receivedAudio = false;
processor.onaudioprocess = function (e) {
// This will be called multiple times per second.
// The audio data will be in e.inputBuffer
if (!receivedAudio) {
receivedAudio = true;
console.log('got audio', e);
}
};
};
navigator.mediaDevices.getUserMedia({audio: true, video: false})
.then(handleSuccess);
}
</script>
</body>
</html>
(You can set the onaudioprocess callback sooner, but then you get empty buffers until the user approves of microphone access.)
Oh, and one other iOS bug to watch out for: Safari on the iPod touch (as of iOS 12.1.1) reports that it does not have a microphone (it does). So, getUserMedia will incorrectly reject with an "Invalid constraint" error if you ask for audio there.
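If you want to surface that rejection instead of failing silently, a minimal sketch is to add a catch to the getUserMedia promise:

navigator.mediaDevices.getUserMedia({ audio: true, video: false })
    .then(handleSuccess)
    .catch(function (err) {
        // On an affected iPod touch this fires even though the device has a microphone.
        console.log('getUserMedia rejected:', err.name, err.message);
    });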
FYI: I maintain the microphone-stream package on npm that does this for you and provides the audio in a Node.js-style ReadableStream. It includes this fix, if you or anyone else would prefer to use that over the raw code.
Tried it on iOS 11.0.1, and unfortunately this problem still isn't fixed.
As a workaround, I wonder if it makes sense to replace the ScriptProcessor with a function that takes the stream data from a buffer and then processes it every x milliseconds. But that's a big change to the functionality.
Just wondering... do you have the setting enabled in Safari settings? It comes enabled by default in iOS11, but maybe you just disabled it without noticing.
I have a problem playing local video on iOS in my Cordova-based app. First, I want to stress that this problem happens only when I'm using WKWebView; if UIWebView is used, the video plays fine. This is the scenario I have:
- User comes to a screen to which the video URL is passed
- Via FileTransfer I download the video to the phone and store it at the desired location
- Using JS, the video is loaded into the <video> tag and played.
Basically I'm doing everything as described in answer to this SO question.
The problem with UIWebView was that if a relative path was set as src, the video for some reason couldn't be loaded (no matter which combination I used), so this solution worked great for me, because it is based on this line of code:
entry.toURL()
This returns the full path of the downloaded video, which is great, at least for UIWebView.
The problem for WKWebView is that entry.toURL() returns something like this:
file:///var/mobile/Containers/Data/Application/3A43AFB5-BEF6-4A0C-BBDB-FC7D2D98BEE9/Documents/videos/Dips.mp4
And WKWebView doesn't work with the file:// protocol. Also, WKWebView doesn't work with relative paths either :(
Can anyone help me fix this?
I got this working today with the following, but only when deployed to my device in Release mode. When deploying the app to my device in Debug mode it would not work.
iOS 9.3.2
Cordova 4.0.0 (iOS 3.8.0)
Telerik WKWebView Polyfill 0.6.9
Video list load method:
var path = window.cordova.file.documentsDirectory, //iTunes File Sharing directory
href = 'http://localhost:12344/Documents', //WKWebView default server url to documents
list = [];
function fsSuccess(dir) {
var reader = dir.createReader();
reader.readEntries(function (entries) {
for (var i = 0; i < entries.length; i++) {
list.push({ name: entries[i].name, path: href + entries[i].fullPath });
}
});
}
function fsError(error) {
console.log('error', error)
}
window.resolveLocalFileSystemURL(path, fsSuccess, fsError);
Video list click handler:
var video = $('#video')[0],
source = $('#source');
function play(index) {
source.attr('src', list[index].path);
video.load();
video.play();
}
Video player markup:
<video id="video" autoplay controls loop webkit-playsinline>
<source id="source" type="video/mp4" />
</video>
I was banging my head on my desk à la Ren Höek while debugging until I attempted a release build and it worked.
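To tie this back to the original entry.toURL() problem: assuming the same Telerik polyfill server at http://localhost:12344 and a file stored under Documents, a hedged sketch of mapping the file:// URL onto the local server looks like this:

// Hedged sketch: rewrite a file:///.../Documents/... URL to the polyfill's local HTTP server.
function toLocalServerUrl(entry) {
    var fileUrl = entry.toURL(); // e.g. file:///var/mobile/.../Documents/videos/Dips.mp4
    var marker = '/Documents/';
    var idx = fileUrl.indexOf(marker);
    if (idx === -1) { return fileUrl; } // not under Documents; leave it unchanged
    return 'http://localhost:12344/Documents/' + fileUrl.substring(idx + marker.length);
}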
A sample snippet that uses the Cordova file opener plugin to open the downloaded file from the device (not tested in WKWebView, though):
var fileTransfer = new FileTransfer();
var cdr;
if (sessionStorage.platform.toLowerCase() == "android") {
window.resolveLocalFileSystemURL(cordova.file.externalRootDirectory, onFileSystemSuccess, onError);
} else {
// for iOS
window.requestFileSystem(LocalFileSystem.PERSISTENT, 0, onFileSystemSuccess, onError);
}
function onError(e) {
navigator.notification.alert("Error : Downloading Failed");
};
function onFileSystemSuccess(fileSystem) {
var entry = "";
if (sessionStorage.platform.toLowerCase() == "android") {
entry = fileSystem;
} else {
entry = fileSystem.root;
}
entry.getDirectory("Cordova", {
create: true,
exclusive: false
}, onGetDirectorySuccess, onGetDirectoryFail);
};
function onGetDirectorySuccess(dir) {
cdr = dir;
dir.getFile(filename, {
create: true,
exclusive: false
}, gotFileEntry, errorHandler);
};
function gotFileEntry(fileEntry) {
// URL in which the pdf is available
var documentUrl = "http://localhost:8080/testapp/test.pdf";
var uri = encodeURI(documentUrl);
fileTransfer.download(uri, cdr.nativeURL + "test.pdf",
function(entry) {
// Logic to open file using file opener plugin
openFile();
},
function(error) {
navigator.notification.alert(ajaxErrorMsg);
},
false
);
};
function openFile() {
cordova.plugins.fileOpener2.open(
cdr.nativeURL + "test.pdf",
"application/pdf", //mimetype
{
error: function(e) {
navigator.notification.alert("Error Opening the File.Unsupported document format.");
},
success: function() {
// success callback handler
}
}
);
};
I embed YouTube videos in my Angular app using two directives which make use of the YouTube IFrame API. The first loads the library asynchronously:
angular.module('myApp')
.service('youTubeService', function($rootScope, $window) {
var self = this;
self.ready = false;
$window.onYouTubeIframeAPIReady = function () {
self.ready = true;
console.log("Youtube service ready");
$rootScope.$broadcast('youTubeServiceReady', true);
};
var tag = document.createElement('script');
tag.src = '//www.youtube.com/iframe_api';
var firstScriptTag = document.getElementsByTagName('script')[0];
firstScriptTag.parentNode.insertBefore(tag, firstScriptTag);
});
I then embed the video using the JavaScript library:
angular.module('myApp')
.directive('youtube', function (youTubeService) {
return {
link: function (scope, element, attrs) {
var player;
var playerReady = false;
var playerState;
var callback;
var carouselScope = element.parent().parent().scope();
function createPlayer() {
player = new YT.Player(element[0], {
height: attrs.height,
width: attrs.width,
videoId: attrs.youtube,
playerVars: { 'start' : attrs.starttime, 'end' : attrs.endtime, 'origin': 'https://', showinfo: 0, modestbranding: 1 },
events: {
onReady: function () {
playerReady = true;
// if (callback !== null) {
// callback();
// }
},
onStateChange: function (event) {
//console.log("Time:" + getCurrentTime() + ", Duration:" + getDuration() );
playerState = event.data;
if (playerState === YT.PlayerState.PAUSED) {
carouselScope.play();
}
}
}
});
}
if (youTubeService.ready) {
createPlayer();
} else {
scope.$on('youTubeServiceReady', function () {
createPlayer();
});
}
...
This was working for months up until yesterday, but now I get an error video as my embed in all desktop browsers, as documented here:
https://support.google.com/youtube/answer/6098135?hl=en-GB
My problem is that I can't figure out what I should be changing, because as far as I understand the IFrame API is the correct one to use. Does anyone know what I should be changing?
So we were having the exact same issue with our site.
It turns out that our client, which uses code very similar to yours above, is functioning correctly. Our problem ended up being the way in which we were adding videos and video metadata to our database.
This might not be your issue, but we were using
http://gdata.youtube.com/feeds/api/videos/<video id>?v=2&alt=json
to add videos to our system. As this turns out to be a deprecated endpoint, we had to upgrade to the v3 system which is explained here: https://developers.google.com/youtube/v3/docs/videos/list
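For reference, a minimal sketch of fetching the same metadata from the v3 endpoint (YOUR_VIDEO_ID and YOUR_API_KEY are placeholders, and the key must have the YouTube Data API enabled):

// Hedged sketch: v3 replacement for the old gdata video feed.
var request = new XMLHttpRequest();
request.open('GET', 'https://www.googleapis.com/youtube/v3/videos' +
    '?part=snippet,contentDetails&id=YOUR_VIDEO_ID&key=YOUR_API_KEY', true);
request.onload = function () {
    var data = JSON.parse(request.responseText);
    // data.items[0].snippet holds the title, description, thumbnails, etc.
    console.log(data.items && data.items[0] && data.items[0].snippet);
};
request.send();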
On iOS 7.1, I keep getting a buzzing / noisy / distorted sound when playing back audio using the Web Audio API. It sounds distorted, instead of playing back normally.
The same files are fine when using HTML5 audio. Everything works fine on desktop (Firefox, Chrome, Safari).
EDIT:
The audio is distorted in the iOS Simulator versions iOS 7.1, 8.1, 8.2. The buzzing sound often starts before I even playback anything.
The audio is distorted on a physical iPhone running iOS 7.1, in both Chrome and Safari.
The audio is fine on a physical iPhone running iOS 8.1, in both Chrome and Safari.
i.e., the buzzing audio occurs on iOS 7.1 only.
Howler.js is not the issue. The problem is still there using pure JS like so:
var context;
var sound;
var extension = '.' + ( new Audio().canPlayType( 'audio/ogg' ) !== '' ? 'ogg' : 'mp3');
/** Test for WebAudio API support **/
try {
// still needed for Safari
window.AudioContext = window.AudioContext || window.webkitAudioContext;
// create an AudioContext
context = new AudioContext();
} catch(e) {
// API not supported
throw new Error( 'Web Audio API not supported.' );
}
function loadSound( url ) {
var request = new XMLHttpRequest();
request.open( 'GET', url, true );
request.responseType = 'arraybuffer';
request.onload = function() {
// request.response is encoded... so decode it now
context.decodeAudioData( request.response, function( buffer ) {
sound = buffer;
}, function( err ) {
throw new Error( err );
});
}
request.send();
}
function playSound(buffer) {
var source = context.createBufferSource();
source.buffer = buffer;
source.connect(context.destination);
source.start(0);
}
loadSound( '/tests/Assets/Audio/En-us-hello' + extension );
$(document).ready(function(){
$( '#clickme' ).click( function( event ) {
playSound(sound);
});
}); /* END .ready() */
A live version of this code is available here: Web Audio API - Hello world
Googling did not bring up any results about such a distorted-sound issue on iOS 7.1.
Has anyone else run into it? Should I file a bug report to Apple?
I believe the issue is caused by the audioContext.sampleRate property being reset, which seems to happen after the browser/OS plays something recorded at a different sampling rate.
I've devised the following workaround, which silently plays a short WAV file recorded at the sampling rate that the device currently plays back at:
"use strict";
var getData = function( context, filePath, callback ) {
var source = context.createBufferSource(),
request = new XMLHttpRequest();
request.open( "GET", filePath, true );
request.responseType = "arraybuffer";
request.onload = function() {
var audioData = request.response;
context.decodeAudioData(
audioData,
function( buffer ) {
source.buffer = buffer;
callback( source );
},
function( e ) {
console.log( "Error with decoding audio data" + e.err );
}
);
};
request.send();
};
module.exports = function() {
var AudioContext = window.AudioContext || window.webkitAudioContext,
context = new AudioContext();
getData(
context,
"path/to/short/file.wav",
function( bufferSource ) {
var gain = context.createGain();
gain.gain.value = 0;
bufferSource.connect( gain );
gain.connect( context.destination );
bufferSource.start( 0 );
}
);
};
Obviously, if some of the devices have different sampling rates, you would need to detect and use a specific file for every rate.
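A rough way to do that detection (the file paths are placeholders for short silent WAVs recorded at each rate) is to read sampleRate off the context and pick the matching file:

// Hedged sketch: choose the warm-up file based on the device's current sample rate.
var AudioContextCtor = window.AudioContext || window.webkitAudioContext;
var probe = new AudioContextCtor();
var warmupFile = probe.sampleRate === 48000
    ? "path/to/silence-48000.wav"
    : "path/to/silence-44100.wav";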
It looks like iOS 6+ Safari defaults to a sample rate of 48000. If you type this into the developer console when you first open Mobile Safari, you'll get 48000:
var ctx = new window.webkitAudioContext();
console.log(ctx.sampleRate);
Further Reference: https://forums.developer.apple.com/thread/20677
Then if you close the initial context on load with ctx.close(), the next created context will use the sample rate most other browsers use (44100), and sound will play without distortion.
Credit to this for pointing me in the right direction (and in case the above no longer works in the future): https://github.com/Jam3/ios-safe-audio-context/blob/master/index.js
function as of post date:
function createAudioContext (desiredSampleRate) {
var AudioCtor = window.AudioContext || window.webkitAudioContext
desiredSampleRate = typeof desiredSampleRate === 'number'
? desiredSampleRate
: 44100
var context = new AudioCtor()
// Check if hack is necessary. Only occurs in iOS6+ devices
// and only when you first boot the iPhone, or play a audio/video
// with a different sample rate
if (/(iPhone|iPad)/i.test(navigator.userAgent) &&
context.sampleRate !== desiredSampleRate) {
var buffer = context.createBuffer(1, 1, desiredSampleRate)
var dummy = context.createBufferSource()
dummy.buffer = buffer
dummy.connect(context.destination)
dummy.start(0)
dummy.disconnect()
context.close() // dispose old context
context = new AudioCtor()
}
return context
}
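Usage is then just a matter of creating the context through the helper and checking the rate, for example:

// Usage sketch: on affected iOS devices the helper recreates the context at the desired rate.
var audioCtx = createAudioContext(44100)
console.log(audioCtx.sampleRate) // expected to be 44100 after the reset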
I have set up GitHub's Electron with ReactJS, so I have a BrowserWindow and a React app playing nicely in that window. What I'm trying to achieve is to authenticate with GitHub: when a user presses the "Login with GitHub" button, a new BrowserWindow opens and goes to the GitHub authorize-app URL. The issue I have has to do with the callback and how I will get the code returned from it. I've done this with Apache Cordova and the InAppBrowser, but it was different since I was able to use localhost as a callback.
What I've done so far with electron is opening the new BrowserWindow but after the authorization I cannot get the code from the callback.
var authWindow = new BrowserWindow({ width: 800, height: 600, show: true, 'always-on-top': true });
var githubUrl = 'https://github.com/login/oauth/authorize?';
var authUrl = githubUrl + 'client_id=' + options.client_id + '&scope=' + options.scope;
authWindow.loadUrl(authUrl);
authWindow.setVisibleOnAllWorkspaces(true);
authWindow.setResizable(false);
authWindow.addListener('page-title-updated', function(stream) {
console.log("LOADED");
console.log(JSON.stringify(stream));
console.log(stream);
var url = (typeof stream.url !== 'undefined' ? stream.url : stream.originalEvent.url),
raw_code = /code=([^&]*)/.exec(stream.url) || null,
code = (raw_code && raw_code.length > 1) ? raw_code[1] : null,
error = /\?error=(.+)$/.exec(stream.url);
if (code || error) {
authWindow.close();
}
// If there is a code in the callback, proceed to get token from github
if (code) {
// requestToken(code);
} else if (error) {
alert("Oops! Couldn't log authenticate you with using Github.");
}
});
Where I'm doing console.log(JSON.stringify(stream)); I get {}, so is it something to do with the event listener? Any ideas or better approaches?
So what I was missing was the right event. The correct approach is:
// Build the OAuth consent page URL
var authWindow = new BrowserWindow({ width: 800, height: 600, show: false, 'node-integration': false });
var githubUrl = 'https://github.com/login/oauth/authorize?';
var authUrl = githubUrl + 'client_id=' + options.client_id + '&scope=' + options.scopes;
authWindow.loadUrl(authUrl);
authWindow.show();
// Handle the response from GitHub
authWindow.webContents.on('did-get-redirect-request', function(event, oldUrl, newUrl) {
var raw_code = /code=([^&]*)/.exec(newUrl) || null,
code = (raw_code && raw_code.length > 1) ? raw_code[1] : null,
error = /\?error=(.+)$/.exec(newUrl);
if (code || error) {
// Close the browser if code found or error
authWindow.close();
}
// If there is a code in the callback, proceed to get token from github
if (code) {
requestGithubToken(options, code);
} else if (error) {
alert("Oops! Something went wrong and we couldn't log you in using Github. Please try again.");
}
});
// Reset the authWindow on close
authWindow.on('close', function() {
authWindow = null;
}, false);
I also wrote a tutorial that describes the full implementation which can be found at http://manos.im/blog/electron-oauth-with-github/
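For completeness, a hedged sketch of what requestGithubToken could look like. The token exchange needs the client secret, so in a real app it belongs on a server rather than in the Electron client; the field names below mirror GitHub's OAuth token endpoint:

// Hedged sketch: exchange the OAuth code for an access token.
// Do this server-side in production so client_secret is never shipped to users.
function requestGithubToken(options, code) {
    var request = new XMLHttpRequest();
    request.open('POST', 'https://github.com/login/oauth/access_token', true);
    request.setRequestHeader('Content-Type', 'application/json');
    request.setRequestHeader('Accept', 'application/json');
    request.onload = function () {
        var response = JSON.parse(request.responseText);
        // response.access_token can now be used for GitHub API calls
        console.log('Got a GitHub access token');
    };
    request.send(JSON.stringify({
        client_id: options.client_id,
        client_secret: options.client_secret,
        code: code
    }));
}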