Efficiently get every video of every playlist of a channel - youtube-api

I am using C# .NET with the YouTube API NuGet package, but that shouldn't matter much, since the concepts apply in any language.
I need to fetch all playlists of a channel (specifically the logged-in channel, using the "mine" parameter) and every video of each playlist quite often. I already try to do this as rarely as possible, but it still has to happen frequently.
The problem is that it takes a long time: I first have to get all playlists (1 API call per 50 playlists) and then, per playlist, get every video (1 API call per 50 videos per playlist).
So this multiplies quickly. For 39 playlists, all with fewer than 50 videos, it takes a few seconds every time.
So my question is: is there a better/more efficient way?
My optimizations so far:
Just try to do this as rarely as possible.
Don't include "Liked Videos", "Uploads" and similar unneeded playlists in the second step (I would exclude them explicitly, but for some reason they aren't returned in the first place).
Code example of how I currently do it:
private async Task InitPlaylists()
{
    var playlists = new Dictionary<string, string>();
    var page = "";
    while (true)
    {
        var request = _youTubeService.Playlists.List("snippet");
        request.Mine = true;
        request.PageToken = page;
        request.MaxResults = 50;
        var result = await request.ExecuteAsync();
        foreach (var playlist in result.Items) playlists.Add(playlist.Id, playlist.Snippet.Title);
        if (result.NextPageToken == null) break;
        page = result.NextPageToken;
    }
    foreach (var (id, title) in playlists)
    {
        var videos = new List<PlaylistItem>();
        page = "";
        while (true)
        {
            var listRequest = _youTubeService.PlaylistItems.List("id,snippet");
            listRequest.PageToken = page;
            listRequest.MaxResults = 50;
            listRequest.PlaylistId = id;
            var listResult = await listRequest.ExecuteAsync();
            videos.AddRange(listResult.Items);
            if (listResult.NextPageToken == null) break;
            page = listResult.NextPageToken;
        }
        Playlists.Add(new Playlist(id, title, videos.ToDistinctDictionary()));
    }
}
EDIT: Maybe it helps if I explain why I need this; perhaps someone has an idea how to cut some calls that way:
I want to be able to add and remove videos from playlists, like on the YouTube Studio edit page.
So I need (A) every playlist of the channel and (B) every video of every playlist, because I need to know whether a video is already in a playlist.

If you only manipulate your YouTube channel through your script, then you can keep track of the current state of the playlists' videos yourself. If that's not the case, only parallelism can make your check faster, by using a thread (or task) per playlist. There is also a quota-free and possibly faster way of getting every video ID of a playlist, using youtube-dl -j --flat-playlist.
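If you go the parallel route, here is a minimal sketch of the idea (reusing _youTubeService and the paging pattern from the question; Task.WhenAll simply starts the per-playlist downloads together instead of awaiting them one by one):
private async Task<Dictionary<string, List<PlaylistItem>>> GetAllPlaylistItemsAsync(IEnumerable<string> playlistIds)
{
    var tasks = playlistIds.Select(async id =>
    {
        var videos = new List<PlaylistItem>();
        var page = "";
        while (true)
        {
            var request = _youTubeService.PlaylistItems.List("id,snippet");
            request.PlaylistId = id;
            request.PageToken = page;
            request.MaxResults = 50;
            var result = await request.ExecuteAsync();
            videos.AddRange(result.Items);
            if (result.NextPageToken == null) return (id: id, videos: videos);
            page = result.NextPageToken;
        }
    });
    // All playlists are fetched concurrently; total time is roughly that of the slowest playlist.
    var all = await Task.WhenAll(tasks);
    return all.ToDictionary(t => t.id, t => t.videos);
}
Note this doesn't reduce the quota cost (the same number of calls is made); it only cuts the wall-clock time.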

Related

How to count number of Audio/Subtitle Tracks in given AVURLAsset in Swift?

I want to count the number of audio and subtitle tracks in a given AVURLAsset. How do I do that using Swift?
For the manifest below, the expected answer is 2 subtitle tracks and 3 audio tracks:
#EXTM3U
#EXT-X-MEDIA:TYPE=SUBTITLES,GROUP-ID="subtitle",NAME="#1 Fre",DEFAULT=YES,FORCED=NO,LANGUAGE="fre",URI="subtitles/planete_interdite_subtitle3_fre_vtt.m3u8"
#EXT-X-MEDIA:TYPE=SUBTITLES,GROUP-ID="subtitle",NAME="#3 Eng",DEFAULT=NO,FORCED=NO,LANGUAGE="eng",URI="subtitles/planete_interdite_subtitle5_eng_vtt.m3u8"
#EXT-X-MEDIA:TYPE=AUDIO,GROUP-ID="hdready",NAME="#2 Eng",DEFAULT=NO,AUTOSELECT=YES,LANGUAGE="eng",URI="hdready/planete_interdite_4160_n264_720p_audio2_eng.m3u8"
#EXT-X-MEDIA:TYPE=AUDIO,GROUP-ID="fullhd",NAME="#1 Fre",DEFAULT=YES,AUTOSELECT=YES,LANGUAGE="fre",URI="fullhd/planete_interdite_8256_n264_1080p_audio1_fre.m3u8"
#EXT-X-MEDIA:TYPE=AUDIO,GROUP-ID="fullhd",NAME="#2 Eng",DEFAULT=NO,AUTOSELECT=YES,LANGUAGE="eng",URI="fullhd/planete_interdite_8256_n264_1080p_audio2_eng.m3u8"
#EXT-X-STREAM-INF:PROGRAM-ID=1,BANDWIDTH=314000,CODECS="avc1.66.30,mp4a.40.2",RESOLUTION=256x144,AUDIO="low",SUBTITLES="subtitle"
low/planete_interdite_228_h264_144p.m3u8
#EXT-X-STREAM-INF:PROGRAM-ID=1,BANDWIDTH=638000,CODECS="avc1.66.30,mp4a.40.2",RESOLUTION=426x240,AUDIO="medium",SUBTITLES="subtitle"
medium/planete_interdite_500_h264_240p.m3u8
#EXT-X-STREAM-INF:PROGRAM-ID=1,BANDWIDTH=1942000,CODECS="avc1.66.30,mp4a.40.2",RESOLUTION=640x360,AUDIO="high",SUBTITLES="subtitle"
high/planete_interdite_1228_q264_360p.m3u8
#EXT-X-STREAM-INF:PROGRAM-ID=1,BANDWIDTH=3274000,CODECS="avc1.66.30,mp4a.40.2",RESOLUTION=854x480,AUDIO="veryhigh",SUBTITLES="subtitle"
veryhigh/planete_interdite_2080_q264_480p.m3u8
#EXT-X-STREAM-INF:PROGRAM-ID=1,BANDWIDTH=4814000,CODECS="avc1.4d001f,mp4a.40.2",RESOLUTION=1280x720,AUDIO="hdready",SUBTITLES="subtitle"
hdready/planete_interdite_4160_n264_720p.m3u8
#EXT-X-STREAM-INF:PROGRAM-ID=1,BANDWIDTH=9501000,CODECS="avc1.640028,mp4a.40.2",RESOLUTION=1920x1080,AUDIO="fullhd",SUBTITLES="subtitle"
fullhd/planete_interdite_8256_n264_1080p.m3u8
Thanks
It looks like you will need to use the function AVAsset.tracks(withMediaType:):
func tracks(withMediaType mediaType: AVMediaType) -> [AVAssetTrack]
You can then pass any of the media types Apple provides here; for your case that would be AVMediaType.audio and AVMediaType.subtitle.
i.e. to get the number of audio tracks:
let audioTracks = yourAVURLAsset.tracks(withMediaType: .audio)
let nAudioTracks = audioTracks.count
or the number of subtitle tracks:
let subtitleTracks = yourAVURLAsset.tracks(withMediaType: .subtitle)
let nSubtitleTracks = subtitleTracks.count
Note: this code is untested, so it may need to be adjusted a bit, but hopefully it gets you going in the right direction.

How to add external WebVTT subtitles into HTTP Live Stream on iOS client

We have videos encoded via bitmovin.com and provided as HTTP Live Streams (FairPlay HLS). The subtitles, although in WebVTT format, are exposed separately as direct URLs for the whole file, not as individual segments, and are not part of the HLS m3u8 playlist.
I am looking for a way to take an external .vtt file, downloaded separately, and still include it in the HLS stream so that it is available as a subtitle in AVPlayer.
I know Apple's recommendation is to include segmented VTT subtitles in the HLS playlist, but I can't change the server implementation right now, so I want to clarify whether it is even possible to hand the subtitle to AVPlayer to play along with the HLS stream.
The only valid post on this subject claiming it is possible is this: Subtitles for AVPlayer/MPMoviePlayerController. However, its sample code loads a local mp4 file from the bundle, and I am struggling to make it work for an m3u8 playlist via AVURLAsset. In fact, I can't even get the video track from the remote m3u8 stream: asset.tracks(withMediaType: AVMediaTypeVideo) returns an empty array. Any idea whether this approach can work for a real HLS stream? Or is there any other way to play a separate WebVTT subtitle with an HLS stream without including it in the HLS playlist on the server? Thanks.
func playFpsVideo(with asset: AVURLAsset, at context: UIViewController) {
    let composition = AVMutableComposition()
    // Video
    let videoTrack = composition.addMutableTrack(withMediaType: AVMediaTypeVideo, preferredTrackID: kCMPersistentTrackID_Invalid)
    do {
        let tracks = asset.tracks(withMediaType: AVMediaTypeVideo)
        // ==> The code breaks here, tracks is an empty array
        guard let track = tracks.first else {
            Log.error("Can't get first video track")
            return
        }
        try videoTrack.insertTimeRange(CMTimeRangeMake(kCMTimeZero, asset.duration), of: track, at: kCMTimeZero)
    } catch {
        Log.error(error)
        return
    }
    // Subtitle, some test from the bundle..
    guard let subsUrl = Bundle.main.url(forResource: "subs", withExtension: "vtt") else {
        Log.error("Can't load subs.vtt from bundle")
        return
    }
    let subtitleAsset = AVURLAsset(url: subsUrl)
    let subtitleTrack = composition.addMutableTrack(withMediaType: AVMediaTypeText, preferredTrackID: kCMPersistentTrackID_Invalid)
    do {
        let subTracks = subtitleAsset.tracks(withMediaType: AVMediaTypeText)
        guard let subTrack = subTracks.first else {
            Log.error("Can't get first subs track")
            return
        }
        try subtitleTrack.insertTimeRange(CMTimeRangeMake(kCMTimeZero, asset.duration), of: subTrack, at: kCMTimeZero)
    } catch {
        Log.error(error)
        return
    }
    // Prepare item and play it
    let item = AVPlayerItem(asset: composition)
    let player = AVPlayer(playerItem: item)
    let playerViewController = AVPlayerViewController()
    playerViewController.player = player
    self.playerViewController = playerViewController
    context.present(playerViewController, animated: true) {
        playerViewController.player?.play()
    }
}
I figured this out. It took forever and I hated it. I'm putting my explanation and source code on GitHub, but I'll put everything here too in case the link dies for whatever reason: https://github.com/kanderson-wellbeats/sideloadWebVttToAVPlayer
I'm dropping this explanation here to try to save some future people a lot of pain. Lots of what I found online was wrong, or left out confusing pieces, or had a bunch of extra irrelevant information, or a mixture of all three. On top of that, I saw lots of people asking for help and trying to do the same thing, with nobody providing clear answers.
So to begin I'll describe what I'm trying to do. My backend is Azure Media Services, and it's been really great for streaming different resolutions of video as needed, but it just doesn't really support WebVTT. Yes, you can host a file on there, but it seems it cannot give us a master playlist that includes a reference to the subtitle playlist (as Apple requires). It seems both Apple and Microsoft decided what they were going to do with subtitles back in about 2012 and haven't touched it since. Either they didn't talk to each other or they deliberately went in opposite directions, but the two happen to have poor intercompatibility, and now devs like us are left stretching across the gap between the behemoths. Many of the resources online covering this topic address things like optimized caching of arbitrary streamed data, but I found those more confusing than helpful. All I want to do is add subtitles to on-demand videos played in AVPlayer, served by Azure Media Services over the HLS protocol, when I have a hosted WebVTT file, nothing more and nothing less. I'll describe everything in words first, then put the actual code at the end.
Here is the extremely condensed version of what you need to do:
Intercept the requests for the master playlist and return an edited version of it that references the subtitle playlists (multiple for multiple languages, or just one for one language)
Select a subtitle to show (well documented on https://developer.apple.com/documentation/avfoundation/media_playback_and_selection/selecting_subtitles_and_alternative_audio_tracks )
Intercept requests to the subtitle playlists that will come through (after you've selected a subtitle to show) and return playlists you've built on the fly that reference the WebVtt files on the server
That's it. Not too much, except there are many complications that got in the way, which I had to discover myself. I'll describe each briefly first, then in greater detail.
Brief complication explanations:
Many requests will come through, but you should only (and can only) handle a couple of them yourself; the others need to be allowed to pass through untouched. I will describe which ones need handling, which ones don't, and how to handle the ones that do.
Apple decided a simple HTTP request was not good enough and decided to obscure things by translating it into a weird double-identity AVAssetResourceLoadingRequest thing that has a DataRequest property (AVAssetResourceLoadingDataRequest) and a ContentInformationRequest property (AVAssetResourceLoadingContentInformationRequest). I still don't understand why this was necessary or what benefit it brings, but what I've done here with them is working. Some promising blogs/resources seem to suggest you have to mess with the ContentInformationRequest but I find that you can simply ignore the ContentInformationRequest, and in fact messing with it more often than not just breaks things.
Apple suggests you segment your VTT file into small pieces, but you simply can't do that client-side (Apple disallows it); luckily, it also seems you don't actually have to, it's merely a suggestion.
INTERCEPTING REQUESTS
To intercept requests, you have to subclass/extend AVAssetResourceLoaderDelegate, and the method of interest is ShouldWaitForLoadingOfRequestedResource. To make use of the delegate, instantiate your AVPlayer with an AVPlayerItem, and hand the AVPlayerItem an AVUrlAsset, whose ResourceLoader you assign the delegate to. All the requests will come through the ShouldWaitForLoadingOfRequestedResource method, so that's where all the business happens, except for one sneaky complication: the method will only be invoked if requests begin with something other than http/https. So my advice is to stick a constant string on the front of the URL you use to create your AVUrlAsset, which you can then just shave off after the request comes in to your delegate; let's call that "CUSTOMSCHEME". This part is described in a couple of places online, but it can be super frustrating if you don't know you have to do it, because it will seem like nothing is happening at all.
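For illustration, the wiring looks roughly like this in Xamarin C# (CustomResourceLoaderDelegate and its subtitle list come from the full code at the end of this answer; the video URL is a placeholder):
// The custom scheme prefix makes AVFoundation route requests to the delegate.
var url = NSUrl.FromString(CustomResourceLoaderDelegate.LoaderInterceptionWorkaroundUrlPrefix +
    "https://yourendpoint.streaming.media.azure.net/YOUR-ENCODED-ASSET.ism/manifest(format=m3u8-aapl)");
var asset = new AVUrlAsset(url, new AVUrlAssetOptions());
_loaderDelegate = new CustomResourceLoaderDelegate(subtitles); // keep a strong reference, the loader won't
asset.ResourceLoader.SetDelegate(_loaderDelegate, new DispatchQueue("resource-loader"));
var player = new AVPlayer(new AVPlayerItem(asset));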
INTERCEPTING - TYPE A) redirecting
Ok so now we're intercepting requests, but you don't want to (/can't) handle them all yourself. Some of the requests you just want to allow to pass through. You do this by doing the following:
create a new NSUrlRequest to the CORRECTED Url (shave off that "CUSTOMSCHEME" part from earlier) and set it to the Redirect property on the LoadingRequest
create a new NSHttpUrlResponse with that same corrected Url and a 302 code and set it to the Response property on the LoadingRequest
call FinishLoading on the LoadingRequest
return true
With those steps you can add in breakpoints and stuff to debug and inspect all the requests that will come through, but they'll proceed normally so you won't break anything. However, this approach isn't just for debugging, it's also a necessary thing to do for several requests even in the finished project.
INTERCEPTING - TYPE B) editing/faking response
When some requests come in, you'll want to do a request of your own so the response to your request (with some tweaking) can be used to fulfill the LoadingRequest. So do the following:
create an NSUrlSession and call the CreateDataTask method on the session (with a corrected URL - remove the "CUSTOMSCHEME")
call Resume on the DataTask (outside of the callback on the DataTask)
return true
in the DataTask's callback you'll have the response data, so (after making your edits) you call Respond on the LoadingRequest's DataRequest property with the edited data, followed by calling FinishLoading on the LoadingRequest
INTERCEPTING - which requests get which type of treatment
Lots of requests will come in, some need to be redirected, some need to be given manufactured/altered data responses. Here are the types of requests you'll see in the order they'll come in and what to do with each:
a request to the master playlist, but the DataRequest's RequestedLength is 2 - just redirect (TYPE A)
a request to the master playlist, but the DataRequest's RequestedLength matches the (unedited) length of the master playlist - do your own request to the master playlist so you can edit it and return the edited result (TYPE B)
a request to the master playlist, but the DataRequest's RequestedLength is humongous - do the same thing as for the previous one (TYPE B)
lots of requests will come through for fragments of audio and video - all these requests need to be redirected (TYPE A)
once you get the master playlist edited correctly (and a subtitle selected) a request will come through for the subtitle playlist - edit this one to return a manufactured subtitle playlist (TYPE B)
HOW TO EDIT THE PLAYLISTS - master playlist
The master playlist is easy to edit. There are two changes:
each video resource has its own line, and each needs to be told about the subtitle group (for each line that starts with #EXT-X-STREAM-INF, I'm appending ,SUBTITLES="subs")
new lines need to be added for each subtitle language/type, all belonging to the subtitle group, each with its own URL (so for each type, add a line like #EXT-X-MEDIA:TYPE=SUBTITLES,GROUP-ID="subs",LANGUAGE="!!!yourLanguageHere!!!",NAME="!!!yourNameHere!!!",AUTOSELECT=YES,URI="!!!yourCustomUrlHere!!!")
The !!!yourCustomUrlHere!!! you use in step 2 will have to be detected by you when it comes in as a request, so that you can return the manufactured subtitle playlist as the response; set it to something unique. That URL will also have to use the "CUSTOMSCHEME" prefix so that it reaches the delegate. You can also check out this streaming example to see how the manifest should look: https://developer.apple.com/streaming/examples/basic-stream-osx-ios5.html (sniff the network traffic with the browser debugger to see it).
HOW TO EDIT THE PLAYLISTS - subtitle playlist
The subtitle playlist is a little more complicated. You have to make the whole thing yourself. The way I've done it is to grab the WebVTT file myself inside the DataTask callback, parse it to find the end of the very last timestamp sequence, convert that to an integer number of seconds, and then insert that value in a couple of places in a big string. Again, you can use the example listed above and sniff the network traffic to see a real example for yourself. It looks like this:
#EXTM3U
#EXT-X-TARGETDURATION:!!!thatLengthIMentioned!!!
#EXT-X-VERSION:3
#EXT-X-MEDIA-SEQUENCE:0
#EXT-X-PLAYLIST-TYPE:VOD
#EXTINF:!!!thatLengthIMentioned!!!
!!!absoluteUrlToTheWebVttFileOnTheServer!!!
#EXT-X-ENDLIST
Note that the playlist does NOT segment the VTT file as Apple recommends, because that can't be done client-side (source: https://developer.apple.com/forums/thread/113063?answerId=623328022#623328022 ). Also note that I do NOT put a comma at the end of the "EXTINF" line, even though Apple's example here says to do that, because it seems to break things: https://developer.apple.com/videos/play/wwdc2012/512/
Now the actual code:
public class CustomResourceLoaderDelegate : AVAssetResourceLoaderDelegate
{
    public const string LoaderInterceptionWorkaroundUrlPrefix = "CUSTOMSCHEME"; // a scheme other than http(s) needs to be used for AVUrlAsset's URL or ShouldWaitForLoadingOfRequestedResource will never be called
    private const string SubtitlePlaylistBoomerangUrlPrefix = LoaderInterceptionWorkaroundUrlPrefix + "SubtitlePlaylist";
    private const string SubtitleBoomerangUrlSuffix = "m3u8";
    private readonly NSUrlSession _session;
    private readonly List<SubtitleBundle> _subtitleBundles;

    public CustomResourceLoaderDelegate(IEnumerable<WorkoutSubtitleDto> subtitles)
    {
        _subtitleBundles = subtitles.Select(subtitle => new SubtitleBundle { SubtitleDto = subtitle }).ToList();
        _session = NSUrlSession.FromConfiguration(NSUrlSessionConfiguration.DefaultSessionConfiguration);
    }

    public override bool ShouldWaitForLoadingOfRequestedResource(AVAssetResourceLoader resourceLoader,
        AVAssetResourceLoadingRequest loadingRequest)
    {
        var requestString = loadingRequest.Request.Url.AbsoluteString;
        var dataRequest = loadingRequest.DataRequest;

        if (requestString.StartsWith(SubtitlePlaylistBoomerangUrlPrefix))
        {
            var uri = new Uri(requestString);
            var targetLanguage = uri.Host.Split(".").First();
            var targetSubtitle = _subtitleBundles.FirstOrDefault(s => s.SubtitleDto.Language == targetLanguage);
            Debug.WriteLine("### SUBTITLE PLAYLIST " + requestString);
            if (targetSubtitle == null)
            {
                loadingRequest.FinishLoadingWithError(new NSError());
                return true;
            }
            var subtitlePlaylistTask = _session.CreateDataTask(NSUrlRequest.FromUrl(NSUrl.FromString(targetSubtitle.SubtitleDto.CloudFileURL)),
                (data, response, error) =>
                {
                    if (error != null)
                    {
                        loadingRequest.FinishLoadingWithError(error);
                        return;
                    }
                    if (data == null || !data.Any())
                    {
                        loadingRequest.FinishLoadingWithError(new NSError());
                        return;
                    }
                    MakePlaylistAndFragments(targetSubtitle, Encoding.UTF8.GetString(data.ToArray()));
                    loadingRequest.DataRequest.Respond(NSData.FromString(targetSubtitle.Playlist));
                    loadingRequest.FinishLoading();
                });
            subtitlePlaylistTask.Resume();
            return true;
        }

        if (!requestString.ToLower().EndsWith(".ism/manifest(format=m3u8-aapl)") || // lots of fragment requests will come through, we're just going to fix their URL so they can proceed normally (getting bits of video and audio)
            (dataRequest != null &&
             dataRequest.RequestedOffset == 0 && // this catches the first (of 3) master playlist requests. the thing sending out these requests and handling the responses seems unable to be satisfied by our handling of this (just for the first request), so that first request is just let through. if you mess with request 1 the whole thing stops after sending request 2. although this means the first request doesn't get the same edited master playlist as the second or third, apparently that's fine.
             dataRequest.RequestedLength == 2 &&
             dataRequest.CurrentOffset == 0))
        {
            Debug.WriteLine("### REDIRECTING REQUEST " + requestString);
            var redirect = new NSUrlRequest(new NSUrl(requestString.Replace(LoaderInterceptionWorkaroundUrlPrefix, "")));
            loadingRequest.Redirect = redirect;
            var fakeResponse = new NSHttpUrlResponse(redirect.Url, 302, null, null);
            loadingRequest.Response = fakeResponse;
            loadingRequest.FinishLoading();
            return true;
        }

        var correctedRequest = new NSMutableUrlRequest(new NSUrl(requestString.Replace(LoaderInterceptionWorkaroundUrlPrefix, "")));
        if (dataRequest != null)
        {
            var headers = new NSMutableDictionary();
            foreach (var requestHeader in loadingRequest.Request.Headers)
            {
                headers.Add(requestHeader.Key, requestHeader.Value);
            }
            correctedRequest.Headers = headers;
        }

        var masterPlaylistTask = _session.CreateDataTask(correctedRequest, (data, response, error) =>
        {
            Debug.WriteLine("### REQUEST CARRIED OUT AND RESPONSE EDITED " + requestString);
            if (error == null)
            {
                var dataString = Encoding.UTF8.GetString(data.ToArray());
                var stringWithSubsAdded = AddSubs(dataString);
                dataRequest?.Respond(NSData.FromString(stringWithSubsAdded));
                loadingRequest.FinishLoading();
            }
            else
            {
                loadingRequest.FinishLoadingWithError(error);
            }
        });
        masterPlaylistTask.Resume();
        return true;
    }

    private string AddSubs(string dataString)
    {
        var tracks = dataString.Split("\r\n").ToList();
        for (var ii = 0; ii < tracks.Count; ii++)
        {
            if (tracks[ii].StartsWith("#EXT-X-STREAM-INF"))
            {
                tracks[ii] += ",SUBTITLES=\"subs\"";
            }
        }
        tracks.AddRange(_subtitleBundles.Select(subtitle => "#EXT-X-MEDIA:TYPE=SUBTITLES,GROUP-ID=\"subs\",LANGUAGE=\"" + subtitle.SubtitleDto.Language + "\",NAME=\"" + subtitle.SubtitleDto.Title + "\",AUTOSELECT=YES,URI=\"" + SubtitlePlaylistBoomerangUrlPrefix + "://" + subtitle.SubtitleDto.Language + "." + SubtitleBoomerangUrlSuffix + "\""));
        var finalPlaylist = string.Join("\r\n", tracks);
        return finalPlaylist;
    }

    private void MakePlaylistAndFragments(SubtitleBundle subtitle, string vtt)
    {
        var noWhitespaceVtt = vtt.Replace(" ", "").Replace("\n", "").Replace("\r", "");
        var arrowIndex = noWhitespaceVtt.LastIndexOf("-->");
        var afterArrow = noWhitespaceVtt.Substring(arrowIndex);
        var firstColon = afterArrow.IndexOf(":");
        var period = afterArrow.IndexOf(".");
        var timeString = afterArrow.Substring(firstColon - 2, period /*(+ 2 - 2)*/);
        var lastTime = (int)TimeSpan.Parse(timeString).TotalSeconds;
        var resultLines = new List<string>
        {
            "#EXTM3U",
            "#EXT-X-TARGETDURATION:" + lastTime,
            "#EXT-X-VERSION:3",
            "#EXT-X-MEDIA-SEQUENCE:0",
            "#EXT-X-PLAYLIST-TYPE:VOD",
            "#EXTINF:" + lastTime,
            subtitle.SubtitleDto.CloudFileURL,
            "#EXT-X-ENDLIST"
        };
        subtitle.Playlist = string.Join("\r\n", resultLines);
    }

    private class SubtitleBundle
    {
        public WorkoutSubtitleDto SubtitleDto { get; set; }
        public string Playlist { get; set; }
    }

    public class WorkoutSubtitleDto
    {
        public int WorkoutID { get; set; }
        public string Language { get; set; }
        public string Title { get; set; }
        public string CloudFileURL { get; set; }
    }
}
If you are using a streaming service where you can edit the streaming manifest and upload other files alongside your encoded media, then with a little manual work (which could be scripted out), you can put the subtitles in the manifest the way iOS expects. I was able to get this to work with Azure Media Services, although it is a little hacky.
Since Azure Media Services (which I'll call AMS from now on) creates the streaming manifest on the fly at its streaming endpoints, I couldn't just add the necessary changes to a file. Instead, I created a new master playlist based on AMS's generated playlist. #SomeXamarinDude explains in his answer the changes that are needed in the master playlist, but I'm including an example for completeness.
Let's say the AMS-generated master playlist from a streaming endpoint with the URL
https://mediaservicename-use2.streaming.media.azure.net/d36754c2-c8cf-4f0f-b73f-dafd21fff50f/YOUR-ENCODED-ASSET.ism/manifest(format=m3u8-aapl)
looks like this:
#EXTM3U
#EXT-X-VERSION:4
#EXT-X-MEDIA:TYPE=AUDIO,GROUP-ID="audio",NAME="aac_eng_2_128079_2_1",LANGUAGE="eng",DEFAULT=YES,AUTOSELECT=YES,URI="QualityLevels(128079)/Manifest(aac_eng_2_128079_2_1,format=m3u8-aapl)"
#EXT-X-STREAM-INF:BANDWIDTH=623543,RESOLUTION=320x180,CODECS="avc1.640015,mp4a.40.2",AUDIO="audio"
QualityLevels(466074)/Manifest(video,format=m3u8-aapl)
#EXT-X-I-FRAME-STREAM-INF:BANDWIDTH=623543,RESOLUTION=320x180,CODECS="avc1.640015",URI="QualityLevels(466074)/Manifest(video,format=m3u8-aapl,type=keyframes)"
#EXT-X-STREAM-INF:BANDWIDTH=976825,RESOLUTION=480x270,CODECS="avc1.64001e,mp4a.40.2",AUDIO="audio"
QualityLevels(811751)/Manifest(video,format=m3u8-aapl)
...
Then the manually created playlist, which I'll name manually-created-playlist.m3u8, will need to look like this:
#EXTM3U
#EXT-X-VERSION:4
#EXT-X-MEDIA:TYPE=SUBTITLES,GROUP-ID="subs",NAME="English",LANGUAGE="en",AUTOSELECT=YES,URI="https://mediaservicename-use2.streaming.media.azure.net/d36754c2-c8cf-4f0f-b73f-dafd21fff50f/subtitle-playlist.m3u8"
#EXT-X-MEDIA:TYPE=AUDIO,GROUP-ID="audio",NAME="aac_eng_2_128079_2_1",LANGUAGE="eng",DEFAULT=YES,AUTOSELECT=YES,URI="YOUR-ENCODED-ASSET.ism/QualityLevels(128079)/Manifest(aac_eng_2_128079_2_1,format=m3u8-aapl)"
#EXT-X-STREAM-INF:SUBTITLES="subs",BANDWIDTH=623543,RESOLUTION=320x180,CODECS="avc1.640015,mp4a.40.2",AUDIO="audio"
YOUR-ENCODED-ASSET.ism/QualityLevels(466074)/Manifest(video,format=m3u8-aapl)
#EXT-X-I-FRAME-STREAM-INF:BANDWIDTH=623543,RESOLUTION=320x180,CODECS="avc1.640015",URI="YOUR-ENCODED-ASSET.ism/QualityLevels(466074)/Manifest(video,format=m3u8-aapl,type=keyframes)"
#EXT-X-STREAM-INF:SUBTITLES="subs",BANDWIDTH=976825,RESOLUTION=480x270,CODECS="avc1.64001e,mp4a.40.2",AUDIO="audio"
YOUR-ENCODED-ASSET.ism/QualityLevels(811751)/Manifest(video,format=m3u8-aapl)
...
Note the path changes I had to make to the various bitrate playlists.
This manual playlist will then need to be uploaded to the same Azure Storage Container that contains the rest of your encoded media assets.
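The rewrite itself could be scripted. Here is a rough C# sketch under the assumptions of this example (the endpoint URL, asset name, and the insert position in the manifest are specific to my setup, so adjust as needed):
private static async Task RewriteMasterPlaylistAsync()
{
    var http = new HttpClient();
    var baseUrl = "https://mediaservicename-use2.streaming.media.azure.net/d36754c2-c8cf-4f0f-b73f-dafd21fff50f/";
    var assetPrefix = "YOUR-ENCODED-ASSET.ism/";
    var manifest = await http.GetStringAsync(baseUrl + assetPrefix + "manifest(format=m3u8-aapl)");
    var lines = manifest.Replace("\r\n", "\n").Split('\n').ToList();
    for (var i = 0; i < lines.Count; i++)
    {
        if (lines[i].StartsWith("#EXT-X-STREAM-INF:"))
            lines[i] = lines[i].Replace("#EXT-X-STREAM-INF:", "#EXT-X-STREAM-INF:SUBTITLES=\"subs\","); // join the subtitle group
        else if (lines[i].StartsWith("#EXT-X-MEDIA:") || lines[i].StartsWith("#EXT-X-I-FRAME-STREAM-INF:"))
            lines[i] = lines[i].Replace("URI=\"", "URI=\"" + assetPrefix); // re-root URIs at the asset
        else if (lines[i].StartsWith("QualityLevels"))
            lines[i] = assetPrefix + lines[i]; // re-root the bitrate playlist paths too
    }
    // Reference the manually uploaded subtitle playlist (subtitle-playlist.m3u8, see below).
    lines.Insert(2, "#EXT-X-MEDIA:TYPE=SUBTITLES,GROUP-ID=\"subs\",NAME=\"English\",LANGUAGE=\"en\",AUTOSELECT=YES,URI=\"" + baseUrl + "subtitle-playlist.m3u8\"");
    File.WriteAllLines("manually-created-playlist.m3u8", lines);
}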
I also had to create and upload a file called subtitle-playlist.m3u8 and a transcript.vtt to the same Azure Storage Container. My subtitle playlist looked like this:
#EXTM3U
#EXT-X-TARGETDURATION:61
#EXT-X-ALLOW-CACHE:YES
#EXT-X-PLAYLIST-TYPE:VOD
#EXT-X-VERSION:3
#EXT-X-MEDIA-SEQUENCE:1
#EXTINF:61.061000
https://mediaservicename-use2.streaming.media.azure.net/d36754c2-c8cf-4f0f-b73f-dafd21fff50f/transcript.vtt
#EXT-X-ENDLIST
Note that some of the subtitle playlist values depend on the length of the WebVTT file.
At this point, you should be able to point an HLS player at the following URL and enable closed captions:
https://mediaservicename-use2.streaming.media.azure.net/d36754c2-c8cf-4f0f-b73f-dafd21fff50f/manually-created-playlist.m3u8
I hope this helps someone. Apparently there is a ticket in the works for fixing this on AMS' side.
Thank you to #SomeXamarinDude for your answer; I would have been totally lost with this issue if it weren't for all the groundwork you put in.

Spotify iOS SDK returning songs that do not exist? - Swift

So I'm building playlist and song retrieval into my app at the moment, and I'm really confused by some of the results I'm getting back from the API. It seems to be returning songs that no longer exist on Spotify or were removed from a playlist long ago.
Retrieving a user's list of playlists works fine, but just in case this problem arises from the way I fetch a playlist's tracks, here is the code I use to get them:
SPTPlaylistSnapshot.playlistWithURI(uri, accessToken: session.accessToken) { (error, playlistSnapshotOb) -> Void in
    if let playlistSnapshot = playlistSnapshotOb as? SPTPlaylistSnapshot {
        let itemz = playlistSnapshot.firstTrackPage.items //tracksForPlayback()
        for item in itemz {
            let track = item as! SPTPlaylistTrack
            let splice = "\(track.uri)"
            let trackURI = splice.stringByReplacingOccurrencesOfString("spotify:track:", withString: "")
            var displayArtist = String()
            let artistz = track.artists
            if artistz.count > 1 {
                for i in 0...(artistz.count - 1) {
                    let itz = artistz[i] as! SPTPartialArtist
                    if i > 0 {
                        displayArtist += ", \(itz.name)"
                    } else {
                        displayArtist += "\(itz.name)"
                    }
                }
                self.tracks.append(track.name)
                self.ArtistObjects.append(displayArtist)
                self.uriS.append(trackURI)
            } else {
                let singularArtist = artistz[0] as! SPTPartialArtist
                displayArtist = singularArtist.name
                self.tracks.append(track.name)
                self.ArtistObjects.append(displayArtist)
                self.uriS.append(trackURI)
            }
        } // (closing braces restored; they were cut off in the original post)
    }
}
Additionally, below is a screenshot of the desktop Spotify app showing the real content of the playlist I am pulling:
Spotify per Desktop
You'll see that the songs "Big Bank Dank" and "Light Day Remix" are not actually on this playlist, but for some reason, when my app below pulls this playlist, it lists these songs:
Spotify In My App
(Apparently I can't post an actual image because of my rep - apologies)
Any idea why it's doing this?
The tracks are probably just no longer available, for some unspecified reason. This is quite common. By default, the Spotify client does not show unavailable tracks in playlists, but there is a toggle in the settings you can flip so that they are shown greyed out instead.
I don't know about the iOS SDK, but there should be either an attribute telling you the available markets for the track, or whether it is playable or not, depending on the country of the logged-in user.
This is how it works in the Web API, which should be similar:
https://developer.spotify.com/web-api/track-relinking-guide/
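For illustration, the relevant Web API shape (the track ID is a placeholder): request a track with a market, e.g.
GET https://api.spotify.com/v1/tracks/{track-id}?market=US
and the returned track object carries an "is_playable" field (instead of "available_markets") that you can use to filter out unavailable tracks.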

How to get Multiple photos from graph api

I have uploaded 2 photos on the web. When I parse the feed, I am getting only one photo.
The story property shows that there are 2 photos:
story = "Vineesh TP added 2 new photos.";
How can I get all the photos that I have uploaded?
This is the JSON response I am getting:
{
    actions = (
        {
            link = "https://www.facebook.com/100001846436204/posts/758373487567525";
            name = Comment;
        },
        {
            link = "https://www.facebook.com/100001846436204/posts/758373487567525";
            name = Like;
        }
    );
    "created_time" = "2014-10-21T05:55:53+0000";
    from = {
        id = 100001846436204;
        name = "Vineesh TP";
    };
    icon = "https://fbstatic-a.akamaihd.net/rsrc.php/v2/yz/r/StEh3RhPvjk.gif";
    id = "100001846436204_758373487567525";
    link = "https://www.facebook.com/photo.php?fbid=758373457567528&set=pcb.758373487567525&type=1&relevant_count=2";
    "object_id" = 758373457567528;
    picture = "https://fbcdn-sphotos-g-a.akamaihd.net/hphotos-ak-xpa1/v/t1.0-9/s130x130/10420431_758373457567528_1356675492237188571_n.jpg?oh=4f274cc1e68e7222b98d5db14146d4bf&oe=54BC3A91&__gda__=1424398967_7542f5c26057b6294968d9ef0d67a1bf";
    privacy = {
        allow = "";
        deny = "";
        description = Public;
        friends = "";
        networks = "";
        value = EVERYONE;
    };
    "status_type" = "mobile_status_update";
    story = "Vineesh TP added 2 new photos.";
    "story_tags" = {
        0 = (
            {
                id = 100001846436204;
                length = 10;
                name = "Vineesh TP";
                offset = 0;
                type = user;
            }
        );
    };
    type = photo;
    "updated_time" = "2014-10-21T05:55:53+0000";
}
Add this field:
?fields=attachments
and you will get all the photos related to the post.
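For example, with {post-id} and {access-token} as placeholders, the request looks like:
GET https://graph.facebook.com/v2.1/{post-id}?fields=attachments&access_token={access-token}
Multi-photo posts then list each photo under attachments.data, with the individual images in its subattachments collection.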
Understanding how stories work in relation to photos
You might need some clarification to understand the reason why.
story = "Vineesh TP added 2 new photos.";
This story indicates that you have added 2 photos. If no album was specified, these photos were added to the Timeline album, otherwise to the specified one (obviously).
Note that if you comment on or like this "... added 2 new photos" story, those comments and likes go on the album to which the photos were added. There is no story object with the text "added x new photos" that you can comment on or like; it is just a proxy/shortcut entry in your feed/stream that redirects any comment/like/action to the album the pictures were added to.
Understanding this clarifies what you are parsing over there, and why you only get one picture: the API gives you just the tip of the iceberg about what happened. Now you can tell the user that two new pictures were added, and show ONE of them as a thumbnail. (Imagine you added 10 pictures; it would still give you one, or maybe a few more, but never all of them.)
How can I get the photos that I have uploaded?
This shouldn't be that difficult. First, remember which album you posted the pictures to. If you didn't specify an album, they are uploaded by default to the "Timeline Photos" album.
What you should do is query the album edge, i.e. the album the pictures were added to.
Depending on the publish time, or a list of IDs you keep, you can retrieve the added photos.
How can I get ALL the photos that I have uploaded?
For truly ALL pictures: check the photos edge of the Facebook API.
For pictures uploaded by default upload actions: query the Timeline album.
References
https://developers.facebook.com/docs/graph-api/reference/v2.1/album
https://developers.facebook.com/docs/graph-api/reference/v2.1/photo
Quote from Facebook:
/v2.1/{post-id} will now return all photos attached to the post. In previous versions of the API only the first photo was returned with a post. Any apps that expect only one photo to be returned should upgrade to possibly receive more than one.

access netStream or movieClip from a loop AS2

I've got a load of videos:
var ns1:NetStream = new NetStream(nc);
container1.compMa.theVideo.attachVideo(ns1);
ns1.play("sukh_diesel.flv", 1);
//
var ns2:NetStream = new NetStream(nc);
container2.compMa.theVideo.attachVideo(ns2);
ns2.play("sukh_beneath.flv", 1);
//and 4 more, which I've left out to be concise
I want to pause them all with:
function pauseVid() {
    //this.ns1.pause();
    for (i = 1; i < 7; i++) {
        this["ns" + i].pause();
    }
}
The commented-out line
this.ns1.pause()
works on its own, but when I try it in the loop it can't access the streams?
Have you tried using eval to access your stream by name?
function pauseVid() {
    var localStream:NetStream;
    for (i = 1; i <= 6; i++) { // 6 streams: ns1 through ns6
        localStream = eval("ns" + i);
        localStream.pause();
    }
}
I would recommend keeping track of your objects with an array of streams, for example. That way you avoid having to hard-code the upper limit of your for statement.
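A rough sketch of that approach (untested, assuming the same nc connection and container clips):
var streams:Array = [];
function addStream(container:MovieClip, flvName:String):Void {
    var ns:NetStream = new NetStream(nc);
    container.compMa.theVideo.attachVideo(ns);
    ns.play(flvName, 1);
    streams.push(ns); // remember the stream so it can be paused later
}
addStream(container1, "sukh_diesel.flv");
addStream(container2, "sukh_beneath.flv");
// ...and so on for the rest
function pauseVid():Void {
    for (var i:Number = 0; i < streams.length; i++) {
        streams[i].pause();
    }
}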
