YouTube Data API: retrieve view counts - youtube-api

I implemented the YouTube API in my .NET project.
This is my code:
var youtubeService = new YouTubeService(new BaseClientService.Initializer()
{
    ApiKey = "MY_API_KEY",
    ApplicationName = "MY_APPLICATION_NAME"
});

var searchListRequest = youtubeService.Search.List("snippet");
searchListRequest.Q = SearchText;
searchListRequest.MaxResults = 50;
searchListRequest.Order = SearchResource.ListRequest.OrderEnum.ViewCount;

var searchListResponse = await searchListRequest.ExecuteAsync();

foreach (var searchResult in searchListResponse.Items)
{
    if (searchResult.Id.Kind == "youtube#video")
    {
    }
}
The searchResult does not contain statistics (for example, view counts).
How can I get them?

Because search.list does not have a statistics part, you need to call the API twice.
First, make a search.list request.
From the results you get the ID of the channel.
Then make a second call to channels.list with the channel ID and the parameter part=statistics.
That response contains viewCount.
The docs can help:
https://developers.google.com/youtube/v3/docs/search/list
https://developers.google.com/youtube/v3/docs/channels/list
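Since the question asks for per-video view counts, here is a minimal, untested sketch of the same two-call pattern; it reuses the youtubeService and searchListResponse variables from the question's code and, as an assumption, calls videos.list with part=statistics for the video IDs from the search (channels.list with part=statistics, as described above, returns channel-level totals instead).

// Sketch: collect the video IDs from the search response, then request their
// statistics from the Videos endpoint (search.list alone never returns them).
// Requires "using System.Linq;" for Where/Select.
var videoIds = searchListResponse.Items
    .Where(item => item.Id.Kind == "youtube#video")
    .Select(item => item.Id.VideoId)
    .ToList();

var videoRequest = youtubeService.Videos.List("statistics");
videoRequest.Id = string.Join(",", videoIds); // up to 50 IDs per request
var videoResponse = await videoRequest.ExecuteAsync();

foreach (var video in videoResponse.Items)
{
    Console.WriteLine(video.Id + ": " + video.Statistics.ViewCount + " views");
}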

I came across the same problem using their JavaScript-based API search functionality.
It looks like they do not have a built-in "views" option for their search-based API.
https://developers.google.com/youtube/v3/docs/search#snippet
However, you can use their JSON-based API and create an AJAX-based search box, which returns a JSON response with an option for view count!
https://developers.google.com/youtube/2.0/developers_guide_jsonc
I created this myself, check it out:
$(document).ready(function() {
    var q = $('#query');
    $('#search-button').click(function(e) {
        // encodeURIComponent keeps spaces and special characters in the query from breaking the URL
        var url = "https://gdata.youtube.com/feeds/api/videos?q=" + encodeURIComponent(q.val()) + "&v=2&alt=jsonc";
        $.getJSON(url, function(response) {
            for (var i = 0; i < response.data.items.length; i++) {
                var tr = "<tr>",
                    title = "<td>" + response.data.items[i].title + "</td>",
                    views = "<td>" + response.data.items[i].viewCount + "</td>",
                    likes = "<td>" + response.data.items[i].likeCount + "</td>",
                    dislikes = "<td>" + (response.data.items[i].ratingCount - response.data.items[i].likeCount) + "</td>",
                    endtr = "</tr>";
                $('#search-container').append(tr + title + views + likes + dislikes + endtr);
            }
        });
        e.preventDefault();
    });
});
http://jsfiddle.net/19m9tLo3/1/

Related

How to authenticate with Twitter from a Firefox plugin

Echofon abandoned their Firefox Twitter plugin around April 2013, but it has been maintained on GitHub until some recent changes to the Twitter API broke it.
In normal use, authentication should follow PIN-based authentication, but instead the request to https://api.twitter.com/oauth/request_token returns {"errors":[{"code":32,"message":"Could not authenticate you."}]} with status 401.
I think the problem is in the TwitterClient.buildOAuthHeader function:
TwitterClient.buildOAuthHeader = function (user, method, url, param)
{
    var ts = Math.ceil(Date.now() / 1000);
    var diff = EchofonUtils.timestampDiff();
    if (diff != 0) {
        EchofonUtils.debug("local timestamp " + ts + " / server timestamp " + (ts + diff));
        ts += diff;
    }

    var converter = Cc["@mozilla.org/intl/scriptableunicodeconverter"].createInstance(Ci.nsIScriptableUnicodeConverter);
    converter.charset = "UTF-8";
    var result = {};
    var data = converter.convertToByteArray(user + Date.now() + url + Math.random(), result);
    var ch = Cc["@mozilla.org/security/hash;1"].createInstance(Ci.nsICryptoHash);
    ch.init(ch.MD5);
    ch.update(data, data.length);
    var hash = ch.finish(false);
    var s = convertToHexString(hash);

    var oauthparam = {"oauth_consumer_key" : OAUTH_CONSUMER_KEY,
                      "oauth_timestamp" : ts,
                      "oauth_signature_method" : "HMAC-SHA1",
                      "oauth_nonce" : s + Math.random(),
                      "oauth_version" : "1.0"};

    if (user.oauth_token) {
        oauthparam["oauth_token"] = EchofonAccountManager.instance().get(user.user_id).oauth_token;
    }

    var dict = {};
    for (var key in param) dict[key] = param[key];
    for (var key in oauthparam) dict[key] = oauthparam[key];

    var paramStr = encodeURLParameter(dict);
    var base = [method, RFCEncoding(url), RFCEncoding(paramStr)].join("&");

    var secret = user.oauth_token_secret ? EchofonAccountManager.instance().get(user.user_id).oauth_token_secret : "";
    var signature = EchofonSign.OAuthSignature(base, secret);
    oauthparam['oauth_signature'] = signature;

    var headers = [];
    for (var key in oauthparam) {
        headers.push(key + '="' + RFCEncoding(oauthparam[key]) + '"');
    }
    headers.sort();
    return headers.join(",");
}
I've registered a new application at dev.twitter.com and I'm using the consumer key from that instead of the one in the repository.
Also, I've added the oauth_callback attribute to the oauthparam object, with the value set to "oob" as detailed in the PIN-based authentication link above, but the plugin is not authenticating correctly with the API.
What needs to be changed in the authorization header to correct this?
This issue has been resolved.
Instructions on how to install a patched version of the plugin here - https://github.com/echofox-team/echofon-firefox-unofficial/issues/85#issuecomment-581843812

YouTube: This video contains content from Vevo?

I am trying to play a YouTube video in my application. Everything works fine. However, when I try to watch a video that contains content from Vevo, it fails.
I have also tried passing el=vevo to get_video_info:
http://www.youtube.com/get_video_info?video_id=uuZE_IRwLNI&el=vevo&ps=default&eurl=&gl=US&hl=en
stream
{
"fallback_host" = "tc.v12.cache7.googlevideo.com";
itag = 22;
quality = hd720;
s = "8E6E5D13EB65FB653B173B94CB0BCC3A20853F5EDE8.5E2E87DF33EEDE165FEA90109D3C7D5DADA06B6BB60";
type = "video/mp4; codecs=\"avc1.64001F, mp4a.40.2\"";
url = "http://r7---sn-cvh7zn7r.googlevideo.com/videoplayback?pcm2fr=yes&sver=3&expire=1393773646&itag=22&id=bae644fc84702cd2&upn=SjZd81MudQs&sparams=gcr%2Cid%2Cip%2Cipbits%2Citag%2Cpcm2fr%2Cratebypass%2Csource%2Cupn%2Cexpire&ms=au&gcr=in&mt=1393747698&source=youtube&ratebypass=yes&ipbits=0&fexp=935620%2C919120%2C912523%2C932288%2C914084%2C916626%2C937417%2C937416%2C913434%2C932289%2C936910%2C936913%2C902907&mv=m&key=yt5&ip=103.250.162.79";
}
When I use the url, it does not play. Is there any solution?
get_video_info works only for videos that are allowed to be viewed as embedded videos on other websites. I struggled a lot with get_video_info but could not find any solution for Vevo. However, I was able to make it work by retrieving the actual video page: from that page you have to grab the player version, then hit a URL (specified in the code) to grab the stream links and the actual signatures.
YouTube might change this in the future, but today the following solution is working great for me.
It's C#; you should know how to convert it into Objective-C. The entry point of the following code is the ExtractUrls function; remember to pass it the HTML of the video page,
e.g. the HTML content of http://www.youtube.com/watch?v=J5iS3tULXMQ&nomobile=1
private static List<string> ExtractUrls(string html)
{
    string Player_Version = Regex.Match(html, @"""\\/\\/s.ytimg.com\\/yts\\/jsbin\\/html5player-(.+?)\.js""").Groups[1].ToString();
    string Player_Code = new WebClient().DownloadString("http://s.ytimg.com/yts/jsbin/" + "html5player-" + Player_Version + ".js");
    html = Uri.UnescapeDataString(Regex.Match(html, @"""url_encoded_fmt_stream_map"":\s+""(.+?)""", RegexOptions.Singleline).Groups[1].ToString());
    var Streams = Regex.Matches(html, @"(^url=|(\\u0026url=|,url=))(.+?)(\\u0026|,|$)");
    var Signatures = Regex.Matches(html, @"(^s=|(\\u0026s=|,s=))(.+?)(\\u0026|,|$)");
    List<string> urls = new List<string>();
    for (int i = 0; i < Streams.Count - 1; i++)
    {
        string URL = Streams[i].Groups[3].ToString();
        if (Signatures.Count > 0)
        {
            string Sign = Sign_Decipher(Signatures[i].Groups[3].ToString(), Player_Code);
            URL += "&signature=" + Sign;
        }
        urls.Add(URL.Trim());
    }
    return urls;
}

public static string Sign_Decipher(string s, string Code)
{
    string Function_Name = Regex.Match(Code, @"signature=(\w+)\(\w+\)").Groups[1].ToString();
    var Function_Match = Regex.Match(Code, "function " + Function_Name + @"\((\w+)\)\{(.+?)\}", RegexOptions.Singleline);
    string Var = Function_Match.Groups[1].ToString();
    string Decipher = Function_Match.Groups[2].ToString();
    var Lines = Decipher.Split(';');
    for (int i = 0; i < Lines.Length; i++)
    {
        string Line = Lines[i].Trim();
        if (Regex.IsMatch(Line, Var + "=" + Var + @"\.reverse\(\)"))
        {
            char[] charArray = s.ToCharArray();
            Array.Reverse(charArray);
            s = new string(charArray);
        }
        else if (Regex.IsMatch(Line, Var + "=" + Var + @"\.slice\(\d+\)"))
        {
            s = Slice(s, Convert.ToInt32(Regex.Match(Line, Var + "=" + Var + @"\.slice\((\d+)\)").Groups[1].ToString()));
        }
        else if (Regex.IsMatch(Line, Var + @"=\w+\(" + Var + @",\d+\)"))
        {
            s = Swap(s, Convert.ToInt32(Regex.Match(Line, Var + @"=\w+\(" + Var + @",(\d+)\)").Groups[1].ToString()));
        }
        else if (Regex.IsMatch(Line, Var + @"\[0\]=" + Var + @"\[\d+%" + Var + @"\.length\]"))
        {
            s = Swap(s, Convert.ToInt32(Regex.Match(Line, Var + @"\[0\]=" + Var + @"\[(\d+)%" + Var + @"\.length\]").Groups[1].ToString()));
        }
    }
    return s;
}

private static string Slice(string Input, int Length)
{
    // Drop the first 'Length' characters (a Substring(Length, Input.Length - 1) call would overflow here).
    return Input.Substring(Length);
}

private static string Swap(string Input, int Position)
{
    var Str = new StringBuilder(Input);
    var SwapChar = Str[Position];
    Str[Position] = Str[0];
    Str[0] = SwapChar;
    return Str.ToString();
}
Credit goes to the comments under this Code Project article.
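For reference, a small usage sketch of the ExtractUrls entry point described above; it is an assumption-based example that uses WebClient and the sample watch-page URL from this answer, with error handling and disposal omitted:

// Download the watch-page HTML and hand it to ExtractUrls, as described above.
string html = new WebClient().DownloadString("http://www.youtube.com/watch?v=J5iS3tULXMQ&nomobile=1");
List<string> streamUrls = ExtractUrls(html);
foreach (string streamUrl in streamUrls)
{
    Console.WriteLine(streamUrl);
}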
Certain videos have a domain-level whitelist or blacklist applied to them. This is done at the discretion of the content owner.
If there is a whitelist or a blacklist, and the domain of the embedding site can't be determined (perhaps because of there not being a real referring domain in the case of your native application), then the default behavior is to block playback.
This blog post has a bit more detail as well: http://apiblog.youtube.com/2011/12/understanding-playback-restrictions.html
That specific video can only be played when it's embedded on a real website with a real referring URL, due to the way domain whitelisting/blacklisting works. And we don't expose those lists via the API; it's a longstanding feature request.
A YouTube video URL should contain a signature (which is included in the 's' field); to use this URL, you need to decrypt the signature first and add it to the URL.
The signature decryptor can be found on the web page of the video (i.e. youtube.com/watch?v=VIDEO_ID).
I can't provide more info as it would be against the YouTube terms of service :).

"Authorization is required to perform that action" in Google Scripts with no way to give authorization

I am trying to create a sort of middleware so some legacy software I am working on can consume Twitter feeds.
Since Twitter has made API 1.0 obsolete, 1.1 requires OAuth, and for this project I only have client-side scripting available to me, I opted to use a Google Apps Script to perform the OAuth negotiation:
Source: http://www.labnol.org/internet/twitter-rss-feeds/27931/
function start() {
  // Get your Twitter keys from dev.twitter.com
  var CONSUMER_KEY = "-----";
  var CONSUMER_SECRET = "----";
  // Ignore everything after this line
  initialize(CONSUMER_KEY, CONSUMER_SECRET);
}
function initialize(key, secret) {
  ScriptProperties.setProperty("TWITTER_CONSUMER_KEY", key);
  ScriptProperties.setProperty("TWITTER_CONSUMER_SECRET", secret);
  var url = ScriptApp.getService().getUrl();
  if (url) {
    connectTwitter();
    var msg = "";
    msg += "Sample RSS Feeds for Twitter\n";
    msg += "============================";
    msg += "\n\nTwitter Timeline of user @labnol";
    msg += "\n" + url + "?action=timeline&q=labnol";
    msg += "\n\nTwitter Favorites of user @labnol";
    msg += "\n" + url + "?action=favorites&q=labnol";
    msg += "\n\nTwitter List labnol/friends-in-india";
    msg += "\n" + url + "?action=list&q=labnol/friends-in-india";
    msg += "\n\nTwitter Search for New York";
    msg += "\n" + url + "?action=search&q=new+york";
    msg += "\n\nYou should replace the value of 'q' parameter in the URLs as per requirement.";
    msg += "\n\nFor help, please refer to http://www.labnol.org/?p=27931";
    MailApp.sendEmail(Session.getActiveUser().getEmail(), "Twitter RSS Feeds", msg);
  }
}
function doGet(e) {
  var a = e.parameter.action;
  var q = e.parameter.q;
  var feed;

  switch (a) {
    case "timeline":
      feed = "https://api.twitter.com/1.1/statuses/user_timeline.json?screen_name=" + q;
      break;
    case "search":
      feed = "https://api.twitter.com/1.1/search/tweets.json?q=" + encodeString(q);
      break;
    case "favorites":
      feed = "https://api.twitter.com/1.1/favorites/list.json?screen_name=" + q;
      break;
    case "list":
      var i = q.split("/");
      feed = "https://api.twitter.com/1.1/lists/statuses.json?slug=" + i[1] + "&owner_screen_name=" + i[0];
      break;
    default:
      feed = "https://api.twitter.com/1.1/statuses/user_timeline.json";
      break;
  }

  var id = Utilities.base64Encode(feed);
  var cache = CacheService.getPublicCache();
  var rss = cache.get(id);

  if ((!rss) || (rss == "undefined")) {
    rss = JSONtoRSS(feed, a, q);
    cache.put(id, rss, 3600);
  }

  return ContentService.createTextOutput(rss)
    .setMimeType(ContentService.MimeType.RSS);
}
function JSONtoRSS(json, type, key) {
  oAuth();
  var options = {
    "method": "get",
    "oAuthServiceName": "twitter",
    "oAuthUseToken": "always"
  };
  try {
    var result = UrlFetchApp.fetch(json, options);
    if (result.getResponseCode() === 200) {
      var tweets = Utilities.jsonParse(result.getContentText());
      if (type == "search")
        tweets = tweets.statuses;
      if (tweets) {
        var len = tweets.length;
        var rss = "";
        if (len) {
          rss = '<?xml version="1.0"?><rss version="2.0">';
          rss += ' <channel><title>Twitter ' + type + ': ' + key + '</title>';
          rss += ' <link>' + htmlentities(json) + '</link>';
          rss += ' <pubDate>' + new Date() + '</pubDate>';
          for (var i = 0; i < len; i++) {
            var sender = tweets[i].user.screen_name;
            var tweet = htmlentities(tweets[i].text);
            rss += "<item><title>" + sender + ": " + tweet + "</title>";
            rss += " <author>" + tweets[i].user.name + " (@" + sender + ")</author>";
            rss += " <pubDate>" + tweets[i].created_at + "</pubDate>";
            rss += " <guid isPermaLink='false'>" + tweets[i].id_str + "</guid>";
            rss += " <link>https://twitter.com/" + sender + "/statuses/" + tweets[i].id_str + "</link>";
            rss += " <description>" + tweet + "</description>";
            rss += "</item>";
          }
          rss += "</channel></rss>";
          return rss;
        }
      }
    }
  } catch (e) {
    Logger.log(e.toString());
  }
}
function connectTwitter() {
  oAuth();
  var search = "https://api.twitter.com/1.1/application/rate_limit_status.json";
  var options = {
    "method": "get",
    "oAuthServiceName": "twitter",
    "oAuthUseToken": "always"
  };
  try {
    var result = UrlFetchApp.fetch(search, options);
  } catch (e) {
    Logger.log(e.toString());
  }
}
function encodeString(q) {
  var str = encodeURIComponent(q);
  str = str.replace(/!/g, '%21');
  str = str.replace(/\*/g, '%2A');
  str = str.replace(/\(/g, '%28');
  str = str.replace(/\)/g, '%29');
  str = str.replace(/'/g, '%27');
  return str;
}
function htmlentities(str) {
  str = str.replace(/&/g, "&amp;");
  str = str.replace(/>/g, "&gt;");
  str = str.replace(/</g, "&lt;");
  str = str.replace(/"/g, "&quot;");
  str = str.replace(/'/g, "&#39;");
  return str;
}
function oAuth() {
  var oauthConfig = UrlFetchApp.addOAuthService("twitter");
  oauthConfig.setAccessTokenUrl("https://api.twitter.com/oauth/access_token");
  oauthConfig.setRequestTokenUrl("https://api.twitter.com/oauth/request_token");
  oauthConfig.setAuthorizationUrl("https://api.twitter.com/oauth/authorize");
  oauthConfig.setConsumerKey(ScriptProperties.getProperty("TWITTER_CONSUMER_KEY"));
  oauthConfig.setConsumerSecret(ScriptProperties.getProperty("TWITTER_CONSUMER_SECRET"));
}
I have followed all of the instructions prescribed in the guide, including running 'start' twice... and all that does is send my email account an email with the various URLs. When I try to access the URLs provided in the email (the plan is to drop one of those URLs into our legacy JavaScript that currently points to the Twitter 1.0 API), I get the error "Authorization is required to perform that action".
I have confirmed countless times that it is set up to "Execute the app as [me]" and "Anyone, even anonymous, can access app".
I am not sure what I am missing or what got screwed up.
Turns out, I forgot to set up the callback URL within the Twitter app setup as specified in the source instructions. Oops!
This would explain why it wasn't working even though everything on the Google side was correct.

Is OAuth a must before using the Google+ API?

I'm trying to use the get and list methods with Google+ comments. The official site says "All API calls require either an OAuth 2.0 token or an API key." I have tried sending a GET request without the OAuth step and it works; it returns JSON-format data. My question is: is OAuth required before using the Google+ API?
It depends on exactly what data you're trying to get.
https://developers.google.com/+/api/oauth documents the benefits of using OAuth, but in general, if you want to get private profile data, or if you wish to use the /me/ URL shortcut, you will need to use OAuth and may, if you wish, use an App Key in addition. If all you're interested in is public data, you can use the App Key.
The short answer to whether you can do it is that you can get comments from Google+ without OAuth.
As for how you would do this: I'm not sure which language you're using, but the following code shows how it's done in JavaScript.
The API calls used here can be experimented with in the API explorer:
Listing Activities
Listing Comments
A demo of this code is here.
You will need an API key (the simple key) for a project with the Google+ APIs from the Google APIs console. When you set up the project, you will only need to enable the Google+ API from the services section.
First, grab the activities using the public data API:
// Gets the activities for a profile
function getActivities(profileID){
  var activities = null;
  var URL = "https://www.googleapis.com/plus/v1/people/" + profileID + "/activities/public?alt=json&key=" + key;
  var request = new XMLHttpRequest();
  request.open('GET', URL, false);
  request.send(); // because of "false" above, will block until the request is done
                  // and status is available. Not recommended, however it works for simple cases.
  if (request.status === 200) {
    if (debug) console.log("retrieved activities \n\n");
    activities = jQuery.parseJSON(request.responseText).items;
    console.log("Discovered " + activities.length + " activities");
  } else {
    handleRequestIssue(request);
  }
  return activities;
}
The following code loops through the activities
for (var i = 0; i < activities.length; i++) {
  console.log("trying to do something with an activity: " + i);
  var activity = activities[i];
  console.log(activity.id);
}
Next, you can use the activity IDs to retrieve the comments per activity:
function getCommentsForActivity(activityID){
  var comments = "";
  var URL = "https://www.googleapis.com/plus/v1/activities/" + activityID + "/comments?alt=json&key=" + key;
  var request = new XMLHttpRequest();
  request.open('GET', URL, false);
  request.send(); // because of "false" above, will block until the request is done
                  // and status is available. Not recommended, however it works for simple cases.
  if (request.status === 200) {
    if (debug) console.log(request.responseText);
    comments = jQuery.parseJSON(request.responseText).items;
    if (debug){
      for (comment in comments){
        console.log(comment);
      }
    }
  } else {
    handleRequestIssue(request);
  }
  return comments;
}

function manualTrigger(){
  var activities = getActivities("109716647623830091721");
}
The following code brings it all together and retrieves activities and comments for a specific post:
$(document).ready(function () {
  var renderMe = "";
  var activities = getActivities("109716647623830091721");
  console.log("activities retrieved: " + activities.length);
  for (var i = 0; i < activities.length; i++) {
    console.log("trying to do something with an activity: " + i);
    var activity = activities[i];
    renderMe += "<br/><div class=\"article\"><p>" + activity.title + "</p>";
    console.log(activity.id);
    // get comments
    var comments = getCommentsForActivity(activity.id);
    for (var j = 0; j < comments.length; j++){
      renderMe += "<br/><div class=\"comment\">" + comments[j].object.content + "</div>";
    }
    renderMe += "</div>";
  }
  console.log("I'm done");
  document.getElementById("ac").innerHTML = renderMe;
});

Displaying Twitter stream on webpage?

I want to display a Twitter feed of a user on my website. What is the simplest way to do this? I guess JavaScript. What I want specifically is for the last 5 tweets to load and then, when another tweet is made, for that to automatically appear at the top of the tweets. It needs to cover pretty much the whole website, apart from the header and footer. Any suggestions/code to do that?
Cheers, help greatly appreciated!
Loading new data without refreshing will need to be AJAX. To get the data, use the Twitter API: http://apiwiki.twitter.com/. The API will allow you to get the data in the format of your choice (XML, JSON, etc.), which you can then parse and return either as data or HTML to the page that submitted the AJAX call. That should give you a push in the right direction.
The simplest way would be to add the Twitter profile widget: http://twitter.com/goodies/widget_profile. It updates with new tweets automatically (using AJAX, I think). You can set the dimensions too.
Use any Twitter wrapper class, for example http://emmense.com/php-twitter/, to get the statuses and display them. Then use a JavaScript timer; inside the timer callback, make an AJAX call to the PHP script and prepend the latest tweet to the top of your container.
You can use jQuery for the DOM update:
$('#dividhere').prepend('Bla bla bla');
Use jQuery. Sorry for the Czech in my code, but I like our Czech language.
<script type="text/javascript">
jQuery(document).ready(function($){
    $.getJSON('http://api.twitter.com/1/statuses/user_timeline/##USERNAME##.json?count=2&callback=?', function(zpravicky){
        $("#twitter").html(formatujExtSocialniProfil(zpravicky));
    });
});
</script>
In an external JavaScript file, the code looks like this:
function formatujExtSocialniProfil(twitters) {
    var statusHTML = [];
    for (var i = 0; i < twitters.length; i++){
        var username = twitters[i].user.screen_name;
        var status = twitters[i].text.replace(/((https?|s?ftp|ssh)\:\/\/[^"\s\<\>]*[^.,;'">\:\s\<\>\)\]\!])/g, function(url) {
            return ''+url+'';
        }).replace(/\B#([_a-z0-9]+)/ig, function(reply) {
            return reply.charAt(0)+''+reply.substring(1)+'';
        });
        statusHTML.push('<li><span>'+status+'</span> <br/><b>'+relative_time(twitters[i].created_at)+'</b></li>');
    }
    return statusHTML.join('');
}
function relative_time(time_value) {
    var values = time_value.split(" ");
    time_value = values[1] + " " + values[2] + ", " + values[5] + " " + values[3];
    var parsed_date = Date.parse(time_value);
    var relative_to = (arguments.length > 1) ? arguments[1] : new Date();
    var delta = parseInt((relative_to.getTime() - parsed_date) / 1000);
    delta = delta + (relative_to.getTimezoneOffset() * 60);
    if (delta < 60) {
        return 'seconds ago';
    } else if (delta < 120) {
        return 'a minute ago';
    } else if (delta < (60*60)) {
        return (parseInt(delta / 60)).toString() + ' minutes ago';
    } else if (delta < (120*60)) {
        return 'an hour ago';
    } else if (delta < (24*60*60)) {
        return (parseInt(delta / 3600)).toString() + ' hours ago';
    } else if (delta < (48*60*60)) {
        return 'yesterday';
    } else {
        return (parseInt(delta / 86400)).toString() + ' days ago';
    }
}
