The YouTube API v3 allows you to request information about a video, such as its title, description, etc.
Is there a way to determine whether the video supports HD resolution?
A workaround could be to look for a maxres thumbnail:
var checkURL = "https://www.googleapis.com/youtube/v3/videos?key=XYZ&part=snippet&fields=items(snippet(thumbnails))&id=" + uid;
$.getJSON(checkURL, function(data) {
if (data.items.length > 0) {
/* Verify this video is HD */
if (data.items[0].snippet.thumbnails.maxres == undefined) {
alert("This video does not support HD")
}
}
});
but is there a better approach?
You can check if a video supports HD by reading contentDetails.definition in a video resource:
contentDetails.definition (string)
Indicates whether the video is available in high definition (HD) or only in standard definition.
Valid values for this property are: hd, sd
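For example, reusing the $.getJSON pattern from the question (same placeholder key and video id), you could request part=contentDetails and check the definition field directly:
var checkURL = "https://www.googleapis.com/youtube/v3/videos?key=XYZ&part=contentDetails&fields=items(contentDetails(definition))&id=" + uid;
$.getJSON(checkURL, function(data) {
    if (data.items.length > 0) {
        /* definition is either "hd" or "sd" */
        if (data.items[0].contentDetails.definition === "hd") {
            alert("This video supports HD");
        }
    }
});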
I'm trying to play a video from an app using Flash Builder 4.7, AIR SDK 31.0 and iOS 12.
private function init():void {
    holder.addChild(video);
    this.addElement(holder);
    nc.connect(null);
    ns = new NetStream(nc);
    ns.client = {};
    ns.client.onMetaData = ns_onMetaData;
    ns.client.onCuePoint = ns_onCuePoint;
    video.attachNetStream(ns);
    ns.play("Videos/video.mp4");
    ns.addEventListener(NetStatusEvent.NET_STATUS, statusNet);
}
This works in simulators and on Android devices, but not on iOS devices. I've seen a couple of similar questions, but they are trying to stream an mp4 from an "http" address, whereas mine uses a local file.
I've been asked to stick to mp4 format, although I have read that using an FLV file should work.
Special considerations for H.264 video in AIR 3.0 for iOS
For H.264 video, the iOS APIs for video playback accept only a URL to a file or stream. You cannot pass in a buffer of H264 video data to be decoded.
So do I need to find a new way of playing the video other than NetStream, or am I best to swap to a different file type?
As a side note, Adobe says to write your mp4 URLs like this:
("mp4:samples/myvideo.mp4");
My app can't find the file with "mp4:" at the front of the URL.
If you want to play videos that are packaged with your iOS app, it's important to ensure you are actually including them when you compile your app.
Untested, but something like this should work.
import flash.events.NetStatusEvent;
import flash.filesystem.File;
import flash.media.Video;
import flash.net.NetConnection;
import flash.net.NetStream;

var _dFile:File;
var _ns:NetStream;
var _nc:NetConnection;
var _customClient:Object;
var _video:Video;

_customClient = new Object();
_customClient.onMetaData = metaDataHandler;

_nc = new NetConnection();
_nc.connect(null);

_ns = new NetStream(_nc);
_ns.client = _customClient;

// This is the important bit for finding files packaged within the .ipa bundle:
// bundled assets live in applicationDirectory (applicationStorageDirectory is the
// app's writable storage area, not the install bundle).
_dFile = File.applicationDirectory.resolvePath("nameOfYourVideoDirectory/nameOfVideo.mp4");
_ns.play(_dFile.url);

_video = new Video(480, 340);
_video.attachNetStream(_ns);

_ns.addEventListener(NetStatusEvent.NET_STATUS, onNSComplete, false, 0, true);
private function metaDataHandler(infoObject:Object):void {
    trace("Length of video", infoObject.duration);
}

private function onNSComplete(e:NetStatusEvent):void {
    if (e.info.code == "NetStream.Buffer.Empty") {
        // do something
    }
}
However, I would highly recommend using an ANE to play video on mobile via the native media player. Take a look at Distriqt MediaPlayer ANE.
I'm looping my streamed videos (not a live stream) via an .m3u8 playlist, and each time the video restarts it goes through the same bitrate adaptation that occurs the first time you watch it (bad quality -> good quality). Is there a way to refresh the stream quality each time the video loops, so that the beginning is seamlessly replaced with the higher-bitrate rendition instead of just re-playing what was initially loaded?
Apple's AVPlayer attempts to load the first stream listed in the HLS playlist. So if you want the highest quality stream to be loaded first by default, you need to specify it as the first stream in the playlist file.
With that in mind, one way of achieving what you need to achieve is to have a different m3u8 file for each of your streams.
For example, if you have a three variant stream playlist, you would have three .m3u8 playlists.
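As an illustration, the master playlist meant to favor the 2500 kbps rendition would list that variant first (the URIs, bandwidths and resolutions below are made up):
#EXTM3U
#EXT-X-STREAM-INF:BANDWIDTH=2500000,RESOLUTION=1280x720
video_2500.m3u8
#EXT-X-STREAM-INF:BANDWIDTH=1200000,RESOLUTION=960x540
video_1200.m3u8
#EXT-X-STREAM-INF:BANDWIDTH=600000,RESOLUTION=640x360
video_600.m3u8
The other two playlists would contain the same variants reordered so that their preferred rendition comes first.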
Then in the view controller where you are using your AVPlayer, you need to keep a reference to the last observed bitrate and the most recent bitrate:
var lastObservedBitrate: Double = 0
var mostRecentBitrate: Double = 0
You would then need to register a notification observer on your player with notification name: AVPlayerItemNewAccessLogEntryNotification
NSNotificationCenter.defaultCenter().addObserver(self, selector: #selector(MyViewController.accessLogEvent(_:)), name: AVPlayerItemNewAccessLogEntryNotification, object: nil)
Whenever the access log is updated, you can then inspect the bitrate and the stream being used with the following code:
func accessLogEvent(notification: NSNotification) {
    guard let item = notification.object as? AVPlayerItem,
        accessLog = item.accessLog() else {
            return
    }
    accessLog.events.forEach { lastEvent in
        let bitrate = lastEvent.indicatedBitrate
        lastObservedBitrate = lastEvent.observedBitrate
        // mostRecentBitrate is a non-optional Double, so compare it directly
        if bitrate != mostRecentBitrate {
            mostRecentBitrate = bitrate
        }
    }
}
Whenever your player loops, you can load the appropriate m3u8 file based on your lastObservedBitrate. So if your lastObservedBitrate is 2500 kbps, you would load the m3u8 file that has the 2500 kbps stream at the top of the file.
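A minimal sketch of that selection step (the playlist names and base URL are hypothetical, kept in the same Swift 2 style as the code above):
func playlistURL(forObservedBitrate bitrate: Double) -> NSURL {
    // observedBitrate is reported in bits per second; pick the master
    // playlist whose top variant best matches what the network sustained.
    let name: String
    if bitrate >= 2_500_000 {
        name = "master_2500"
    } else if bitrate >= 1_200_000 {
        name = "master_1200"
    } else {
        name = "master_600"
    }
    return NSURL(string: "https://example.com/streams/\(name).m3u8")!
}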
Shameless plug: we've designed something similar in our video API. All you need to do is request the m3u8 file with your connection type (wifi or cellular) and lastObservedBitrate, and our API will vend you the best possible stream for that bitrate, while still retaining the ability to downgrade/upgrade the stream if network conditions change.
If you are interested in checking it out visit: https://api.storie.com or https://github.com/Storie/StorieCloudSDK
I'm using WebApi to serve videos on a website. I've tested this on all major desktop browsers and the HTML5 Video tag plays the video as expected. However, I can't get this to work on iPhones (Mobile Safari). The Get() method is never called even after pressing the play button.
WEBAPI CODE
// _mediaType is assumed to be a field on the controller, e.g.:
// private static readonly MediaTypeHeaderValue _mediaType = new MediaTypeHeaderValue("video/mp4");
public HttpResponseMessage Get()
{
    var path = System.Web.Hosting.HostingEnvironment.MapPath("~/Content/testbw2.mp4");
    var stream = new FileStream(path, FileMode.Open, FileAccess.Read, FileShare.Read);

    if (Request.Headers.Range != null)
    {
        try
        {
            HttpResponseMessage partialResponse = Request.CreateResponse(HttpStatusCode.PartialContent);
            partialResponse.Content = new ByteRangeStreamContent(stream, Request.Headers.Range, _mediaType);
            return partialResponse;
        }
        catch (InvalidByteRangeException invalidByteRangeException)
        {
            return Request.CreateErrorResponse(invalidByteRangeException);
        }
    }
    else
    {
        HttpResponseMessage fullResponse = Request.CreateResponse(HttpStatusCode.OK);
        fullResponse.Content = new StreamContent(stream);
        fullResponse.Content.Headers.ContentType = _mediaType;
        return fullResponse;
    }
}
HTML
<video controls>
    <source src="http://localhost/WebApplication24/api/range" type="video/mp4">
    Your browser does not support the video tag.
</video>
If I change the video src to point directly to the file...
src="http://localhost/WebApplication24/Content/testbw2.mp4"
It works, so I know this isn't an encoding issue.
Is there something I am doing wrong? I get the feeling Mobile Safari won't request the video src if the URL doesn't end with .mp4.
Here are some possible reasons your file does not work, with possible solutions:
Safari fully expects an actually-named MP4; no other combination of file and MIME type works (see the example after this list). The other browsers opt for the WEBM file first, especially Chrome.
On iOS prior to 7.0.2, Safari does support HTML5 video, but QuickTime Player has to be installed for it to work.
There are also issues where streaming doesn't work on iOS 7 whereas local files do.
For some reason videos will not play on iOS unless the controls attribute is set.
iOS doesn't support all the profiles that H.264 provides. You have to encode your H.264 with the baseline profile for it to be playable on iPhone/iPad. Encoding with Miro Video Converter might help.
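Following the first point, a quick check is to expose the action at a URL that actually ends in .mp4 (the api/range/testbw2.mp4 route here is hypothetical, e.g. added via attribute routing) and reference it that way:
<video controls>
    <source src="http://localhost/WebApplication24/api/range/testbw2.mp4" type="video/mp4">
    Your browser does not support the video tag.
</video>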
I am developing an iOS application in Titanium Appcelerator and using the built-in camera to record a 2-5 minute video. Once the video is recorded, it needs to be passed to a PHP web application via a REST API. However, the size of this video is too big (almost 100 MB) to be transferred successfully. I am looking for a way to either compress the video or reduce its size so that it can be uploaded successfully. Your suggestions are valuable, so kindly let me know the best way forward.
Thank you for your time.
You could try a module I developed: ti.ios.trim (originally built to do video trimming, but it also supports video compression). You could leave out the startTime and endTime parameters and do something like the following:
Ti.Media.showCamera({
    mediaTypes: [Titanium.Media.MEDIA_TYPE_VIDEO],
    success: function(e) {
        // _.random assumes the underscore library (bundled with Alloy) is available
        var tempFile = Ti.Filesystem.getFile(Ti.Filesystem.tempDirectory, new Date().getTime() + '-' + _.random(0, 1000) + '.mov');
        tempFile.write(e.media);
        compressVideo(tempFile.resolve());
    }
});
function compressVideo(pathToVideoFile) {
    var trimmer = require('ti.ios.trim');
    trimmer.trimVideo({
        input: pathToVideoFile,
        quality: 1, // use 1 for high compression or 2 for medium compression
        success: function(e) {
            Ti.API.info('SUCCESS:');
            Ti.API.info('path to the compressed file: ' + e.videoURL);
        },
        error: function(e) {
            Ti.API.error('ERROR:');
            Ti.API.info(JSON.stringify(e));
        }
    });
}
I've been running into an issue for a while now where, on some iOS devices, my WebAudio system only seems to work with headphones, whereas on other devices (exact same OS, model, etc.) the audio plays perfectly fine through the speakers or headphones. I've searched for a solution but haven't found anything on this exact issue. The only thing I can think of is that maybe it's an audio channel issue or something.
How can I fix this?
@Alastair is correct: the mute toggle switch does mute WebAudio, but it does not mute HTML5 audio tags. Thanks to his work I managed to find a workaround for the web which enables WebAudio to play even when the mute toggle switch is on. I'd post this as a comment on his reply, but I don't have the reputation.
In order to play WebAudio you must also play at least one WebAudio sound source node and one HTML5 audio tag during a user action. It is fine if these sounds are short bits of silence. I found that this self-contained code works without any extra files needed:
EDIT 11/29/19:
Removed vestigial TypeScript typedefs. Thanks @Joep. I also realized the code below is woefully out of date and janky. Just consider it an example. Editing this post prompted me to create an open source solution for this. You can see a demo of it here: https://spencer-evans.com/share/github/unmute/ and check out the repo here: https://github.com/swevans/unmute
/**
 * PLEASE DON'T USE THIS AS IT IS, THIS IS JUST EXAMPLE CODE.
 * If you want a drop-in solution I have a script on GitHub.
 * Demo:
 * @see https://spencer-evans.com/share/github/unmute/
 * Github Repo:
 * @see https://github.com/swevans/unmute
 */
// myContext was assumed by the original snippet; create one if you don't already have it
var myContext = new (window.AudioContext || window.webkitAudioContext)();
var isWebAudioUnlocked = false;
var isHTMLAudioUnlocked = false;

function unlock() {
    if (isWebAudioUnlocked && isHTMLAudioUnlocked) return;

    // Unlock WebAudio - create a short silent buffer and play it
    // This will allow us to play web audio at any time in the app
    var buffer = myContext.createBuffer(1, 1, 22050); // a single sample of silence
    var source = myContext.createBufferSource();
    source.buffer = buffer;
    source.connect(myContext.destination);
    source.onended = function() {
        console.log("WebAudio unlocked!");
        isWebAudioUnlocked = true;
        if (isWebAudioUnlocked && isHTMLAudioUnlocked) {
            console.log("WebAudio unlocked and playable w/ mute toggled on!");
            window.removeEventListener("mousedown", unlock);
        }
    };
    source.start();

    // Unlock HTML5 Audio - load a data url of short silence and play it
    // This will allow us to play web audio when the mute toggle is on
    var silenceDataURL = "data:audio/mp3;base64,//MkxAAHiAICWABElBeKPL/RANb2w+yiT1g/gTok//lP/W/l3h8QO/OCdCqCW2Cw//MkxAQHkAIWUAhEmAQXWUOFW2dxPu//9mr60ElY5sseQ+xxesmHKtZr7bsqqX2L//MkxAgFwAYiQAhEAC2hq22d3///9FTV6tA36JdgBJoOGgc+7qvqej5Zu7/7uI9l//MkxBQHAAYi8AhEAO193vt9KGOq+6qcT7hhfN5FTInmwk8RkqKImTM55pRQHQSq//MkxBsGkgoIAABHhTACIJLf99nVI///yuW1uBqWfEu7CgNPWGpUadBmZ////4sL//MkxCMHMAH9iABEmAsKioqKigsLCwtVTEFNRTMuOTkuNVVVVVVVVVVVVVVVVVVV//MkxCkECAUYCAAAAFVVVVVVVVVVVVVVVVVVVVVVVVVVVVVVVVVVVVVVVVVVVVVV";
    var tag = document.createElement("audio");
    tag.controls = false;
    tag.preload = "auto";
    tag.loop = false;
    tag.src = silenceDataURL;
    tag.onended = function() {
        console.log("HTMLAudio unlocked!");
        isHTMLAudioUnlocked = true;
        if (isWebAudioUnlocked && isHTMLAudioUnlocked) {
            console.log("WebAudio unlocked and playable w/ mute toggled on!");
            window.removeEventListener("mousedown", unlock);
        }
    };
    var p = tag.play();
    if (p) p.then(function() { console.log("play success"); }, function(reason) { console.log("play failed", reason); });
}
window.addEventListener("mousedown", unlock);
This is likely because the iPhone's side switch is on "mute". It's very confusing - HTML5 <audio> tags still play fine when the phone is muted, but WebAudio does not. Why? Who knows. But it's a restriction I currently haven't found a way around.
If the iPhone mute switch is on, meaning that the iPhone is muted, anything played through the Web Audio API will be muted.
Unfortunately there is no way to check through JavaScript whether that physical switch (located on the left edge towards the top of the iPhone) is on or off.
This issue is completely independent from the fact that in iOS Safari the audio has to be started by a user action to be unmuted. There are some tricks that can be done to overcome that, including the one suggested here by Spencer, where you use "any action or a specific action" started by the user to play a silent audio file, which allows subsequently played audio files to play unmuted.
I had the same issue and finally understood the problem.
Indeed, a WebView doesn't play sound on the internal speakers if the phone is muted.
When I dug deeper I found a workaround :)
Original post => https://stackoverflow.com/a/37874619/8064246
do {
    // The Playback category plays audio even when the ring/silent switch is set to silent
    try AVAudioSession.sharedInstance().setCategory(AVAudioSessionCategoryPlayback)
    do {
        try AVAudioSession.sharedInstance().setActive(true)
    } catch let error as NSError {
        print(error.localizedDescription)
    }
} catch let error as NSError {
    print(error.localizedDescription)
}