What is the cutoff for videoDuration <4m? [Youtube-API] - youtube

So I was wondering what the cutoff is for the <4m videoDuration filter on the YouTube API before I upload drafts of empty videos to YouTube to test it.
EDIT: SOLVED!
239.9s counts as 4m - https://youtu.be/px-L_zHykRI
239.4s counts as 4m - https://youtu.be/P0_1koVVUrM
239s counts as <4m - https://youtu.be/S4uc8FHFaQA
You can verify this yourselves by searching for the id (e.g. S4uc8FHFaQA) and then filtering for <4m!
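For reference, the same check can be reproduced through the Data API itself. Below is a minimal sketch assuming google-api-python-client and an API key (YOUR_API_KEY is a placeholder, not something from the post); videoDuration="short" is the API's under-4-minutes bucket.
from googleapiclient.discovery import build

youtube = build("youtube", "v3", developerKey="YOUR_API_KEY")

# Search for the test video id while filtering on the "< 4 minutes" bucket;
# videoDuration requires type="video".
response = youtube.search().list(
    part="id",
    q="S4uc8FHFaQA",        # the 239s test video from the post
    type="video",
    videoDuration="short",
    maxResults=5,
).execute()

found = any(item["id"]["videoId"] == "S4uc8FHFaQA"
            for item in response.get("items", []))
print("Counted as <4m:", found)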

Related

Microsoft Graph "messages" delta request truncates too many results with date filter

I think I've found a bug with the date filtering on the delta API.
I'm finding that on one of the email accounts I'm working with through the Office 365 Graph API, the "messages" delta request returns a different number of items than are actually in the folder for the expected time range. There are 150,000 items covering 10 years in the folder, but delta only returns the last 5,000-ish items covering the last 60 or so days.
Paging Works Fine
When querying the Graph API for the folder "Inbox", it reports 154,045 total items and 57,456 unread items.
IUserMailFoldersCollectionPage foldersPage =
    await client.Users[mailboxid].MailFolders.Request().GetAsync();
I can skip over 10,000, 50,000 or more messages using paging.
model.messages = await client.Users[mailboxid].MailFolders[folderid].Messages.Request().Top(top)
    .Skip(skip).GetAsync();
Delta with Date Filter doesn't work
But when looping with nextTokens and deltaTokens, the deltaToken appears after 5,000 or so email messages. Basically it seems like it only returns results for the last couple of months, even though the filter asks for messages from the last 20 years.
Here is an example of how we generate the delta request. The time is hardcoded here, but in reality it is a variable.
var sFilter = $"receivedDateTime ge {DateTimeOffset.UtcNow.AddYears(-20).ToString("yyyy-MM-dd")}";
model.messages = await client.Users[mailboxid].MailFolders[folderid].Messages.Delta().Request()
    .Header("Prefer", "odata.maxpagesize=" + maxpagesize)
    .Filter(sFilter)
    .OrderBy("receivedDateTime desc")
    .GetAsync();
And then on each paging operation I do the following. "nexttoken" is either the next link or the delta link, depending on what came back from the previous request.
model.messages = new MessageDeltaCollectionPage();
model.messages.InitializeNextPageRequest(client, nexttoken);
model.messages = await model.messages.NextPageRequest
    .Header("Prefer", "odata.maxpagesize=" + maxpagesize)
    .GetAsync();
Delta without Filter works
If I do the exact same code for delta above but remove the "Filter" operation on date, then I get all the messages in the folder.
This isn't a great solution, since I normally only need messages from the last year or two, and if there are 15 years of messages it's a huge waste to query everything.
Update on 12/3/2019
I'm still getting this issue. I recently switched back to trying to use delta again; before that I was querying everything from the server even though I might only need the last month of data, which is super wasteful.
This code works fine for most mailboxes but sometimes I encounter a mailbox with this issue.
My code looks like this.
string sStartingTime = startingTime.ToString("yyyy'-'MM'-'dd'T'HH':'mm':'ss") + "Z";
var messageCollectionPage = await client.Users[mailboxsource.GetMailboxIdFromAccountID()].MailFolders[folder.Id].Messages.Delta().Request()
    .Filter("receivedDateTime+ge+" + Uri.EscapeDataString(sStartingTime))
    .Select(select)
    .Header("Prefer", "odata.maxpagesize=" + preferredPageSize)
    .OrderBy("receivedDateTime desc")
    .GetAsync(cancellationToken);
At around 5000 results the Delta request just stops returning results even though there are 66K items in the folder.
Paul, my peers confirmed there is indeed a 5000-item limit if you apply $filter to a delta query of the message resource.
Within the next day, the docs will also be updated with this information. Thank you for your patience and support!
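Given that limit, one fallback consistent with the "Delta without Filter works" observation above is to drop $filter from the delta query and apply the date cutoff client-side. Here is a rough Python sketch against the raw REST endpoint using the requests library; the token, ids, page size, and one-year cutoff are placeholders and assumptions, and everything is still transferred once, which is exactly the waste the question describes.
import requests
from datetime import datetime, timedelta, timezone

ACCESS_TOKEN = "..."  # placeholder
MAILBOX_ID = "..."    # placeholder
FOLDER_ID = "..."     # placeholder

# Unfiltered delta sync with a client-side receivedDateTime cutoff.
cutoff = (datetime.now(timezone.utc) - timedelta(days=365)).strftime("%Y-%m-%dT%H:%M:%SZ")
url = ("https://graph.microsoft.com/v1.0/users/%s/mailFolders/%s/messages/delta"
       % (MAILBOX_ID, FOLDER_ID))
headers = {"Authorization": "Bearer " + ACCESS_TOKEN,
           "Prefer": "odata.maxpagesize=200"}

recent, delta_link = [], None
while url:
    page = requests.get(url, headers=headers).json()
    for msg in page.get("value", []):
        # ISO-8601 UTC timestamps compare correctly as plain strings.
        if msg.get("receivedDateTime", "") >= cutoff:
            recent.append(msg)
    delta_link = page.get("@odata.deltaLink", delta_link)  # keep for the next sync round
    url = page.get("@odata.nextLink")                      # None once the round is finished

print(len(recent), "messages within the cutoff")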

twitter API limiting tweets to one day, tweepy

I'm trying to pull data from Twitter over a month or so for a project. There are <10,000 tweets over this time period with this hashtag, but I only seem to be getting tweets from the current day. I got 68 yesterday and 80 today; both sets were timestamped with the current day.
api = tweepy.API(auth)
igsjc_tweets = api.search(q="#igsjc", since='2014-12-31', count=100000)
ipdb> len(igsjc_tweets)
80
I know for certain there should be more than 80 tweets. I've heard that Twitter rate-limits to 1500 tweets at a time, but does it also rate-limit to a certain day? Note that I've also tried the Cursor approach with
igsjc_tweets = tweepy.Cursor(api.search, q="#igsjc", since='2015-12-31', count=10000)
This also only gets me 80 tweets. Any tips or suggestions on how to get the full data would be appreciated.
Here's the official tweepy tutorial on Cursor. Note: you need to iterate through the Cursor, as shown below. Also, there is a max count that you can pass to .items(), so it's probably a good idea to pull month-by-month or something similar, and probably a good idea to sleep in between calls. HTH!
igsjc_tweets_jan = [tweet for tweet in tweepy.Cursor(
    api.search, q="#igsjc", since='2016-01-01', until='2016-01-31').items(1000)]
First, tweepy cannot retrieve very old data through its search API. I don't know the exact limit, but the index only goes back a short window (roughly a week or two).
Anyway, you can use this piece of code to get tweets. I ran it to get tweets from the last few days and it works for me.
Notice that you can refine it and add geocode information; I left an example commented out for you.
flag = True
last_id = None
while flag:
    flag = False
    for status in tweepy.Cursor(api.search,
                                # q='geocode:"37.781157,-122.398720,1mi" since:'+since+' until:'+until+' include:retweets',
                                q="#igsjc",
                                since='2015-12-31',
                                max_id=last_id,
                                result_type='recent',
                                include_entities=True,
                                monitor_rate_limit=False,
                                wait_on_rate_limit=False).items(300):
        tweet = status._json   # raw tweet payload as a dict
        print(tweet)
        flag = True            # there is still more data to collect
        last_id = status.id    # remember the oldest id for the next pass
Good luck

Youtube-v3-Api PlayListItems: list - video that is in the list is not returned when filtered by video id

I am trying to use the PlaylistItems: list method in my Java code.
The code itself is irrelevant here, as the problem can be replicated with the API explorer on the documentation page:
https://developers.google.com/youtube/v3/docs/playlistItems/list
The problem:
Although a video with a specific id exists in the playlist, querying playlistItems.list filtered by that videoId returns an empty list instead of the video whenever maxResults is smaller than the video's position in the playlist.
For example:
https://developers.google.com/youtube/v3/docs/playlistItems/list
part = snippet,id
playlistId = PL6894BC5B5D452193
videoId = xEsC1tw-pOw (the 11th video in the list)
maxResults = 5 (default value)
The result is an empty list.
But if I search for
part = snippet,id
playlistId = PL6894BC5B5D452193
videoId = m1V1SjMD1lo (the 1st video in the list)
the video is found.
As far as I can tell, the reason is the value of the maxResults parameter.
However, in a real-world scenario my list contains more than 100 items, and the max allowed value for maxResults is 50.
So is there a way to find the correct video using the list method?
Is this a bug or am I missing something?
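For what it's worth, one workaround is to skip the videoId filter entirely and page through the playlist with pageToken until the wanted video turns up. The sketch below uses google-api-python-client with a placeholder API key (an assumption; the question's Java client exposes the same playlistId, maxResults, and pageToken parameters).
from googleapiclient.discovery import build

youtube = build("youtube", "v3", developerKey="YOUR_API_KEY")

def find_in_playlist(playlist_id, video_id):
    page_token = None
    while True:
        response = youtube.playlistItems().list(
            part="snippet,id",
            playlistId=playlist_id,
            maxResults=50,          # the API maximum per page
            pageToken=page_token,
        ).execute()
        for item in response.get("items", []):
            if item["snippet"]["resourceId"]["videoId"] == video_id:
                return item         # found the playlist item for this video
        page_token = response.get("nextPageToken")
        if not page_token:
            return None             # reached the end without a match

print(find_in_playlist("PL6894BC5B5D452193", "xEsC1tw-pOw"))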

twython : get followers list

Using twython I am trying to retrieve the list of all followers of a particular id which has more than 40k followers, but I am running into the error below:
"Twitter API returned a 429 (Too Many Requests), rate limit exceeded."
How do I overcome this issue?
Below is the snippet; I am printing the user name and time zone information.
next_cursor = -1
while next_cursor:
    search = twitter.get_followers_list(screen_name='ndtvgadgets', cursor=next_cursor)
    for result in search['users']:
        time_zone = result['time_zone'] if result['time_zone'] is not None else "N/A"
        print result["name"].encode('utf-8') + ' ' + time_zone.encode('utf-8')
    next_cursor = search["next_cursor"]
Change the search line to:
search = twitter.get_followers_list(screen_name='ndtvgadgets',count=200,cursor=next_cursor)
Then import the time module and insert time.sleep(60) between each API call.
It'll take ages for a user with 41K followers (around three and a half hours for the ndtvgadgets account), but it should work. With the count increased to 200 (the maximum) you're effectively requesting 200 results every minute. If there are other API calls in your script in addition to twitter.get_followers_list you might want to pad the sleep time a little or insert a sleep call after each one.
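Putting that together, the adjusted loop might look like this sketch; it assumes the same twitter client object as the question, and the 60-second sleep keeps you at roughly one call per minute, comfortably inside the 15-requests-per-15-minutes window for followers/list.
import time

followers = []
next_cursor = -1
while next_cursor:
    search = twitter.get_followers_list(screen_name='ndtvgadgets',
                                        count=200,        # maximum page size for followers/list
                                        cursor=next_cursor)
    followers.extend(search['users'])
    next_cursor = search['next_cursor']
    time.sleep(60)  # one call per minute stays under the rate limit

print(len(followers))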

youtube api - show the most viewed from specific channels

I have a question about the YouTube API. I'm using the CodeIgniter YouTube API Library by jimdoescode: https://github.com/jimdoescode/CodeIgniter-YouTube-API-Library.
Imagine that you have two channels, channel X and channel Y.
I need to run PHP code which shows me the most viewed videos per week from these two channels ONLY, in ASC or DESC order.
** The channels are not yours; they can belong to any user.
Example:
channel x has:
video1 - 3 watchers
video2 - 1 watchers
video3 - 6 watchers
channel y has:
video4 - 9 watchers
video5 - 3 watchers
video6 - 2 watchers
the PHP code should output the following:
video4
video3
video1
video5
video6
video2
I've searched the YouTube API Developer's Guide; can you help me with some hints, please?
I don't believe you can pull feeds from two channels at once from the YouTube API.
You would have to pull two feeds, merge the data in PHP, and then sort.
As you say you are using the CI YouTube API Library, you would need to add the parameter orderby with a value of viewCount to the function you are using to pull the feed.
For example, if you are using getUserUploads() you would want something like:
$resultX = $this->youtube->getUserUploads('channelX',array('orderby'=>'viewCount'));
$resultY = $this->youtube->getUserUploads('channelY',array('orderby'=>'viewCount'));
How you parse the XML response results and convert them to an array for sorting I will leave up to you.
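For the merge-and-sort step itself, once each feed has been parsed into (title, view count) pairs the logic is just to concatenate and sort by view count descending. A quick Python sketch of the idea (the sample data mirrors the question's example and is a placeholder, not output from the CodeIgniter library; the PHP version has the same shape):
channel_x = [("video1", 3), ("video2", 1), ("video3", 6)]
channel_y = [("video4", 9), ("video5", 3), ("video6", 2)]

# Merge the two channels and sort by view count, highest first.
merged = sorted(channel_x + channel_y, key=lambda v: v[1], reverse=True)
for title, views in merged:
    print(title)   # prints video4, video3, video1, video5, video6, video2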
http://gdata.youtube.com/feeds/api/users/USERNAME/uploads?orderby=viewCount&max-results=5
Replace USERNAME with the username of the user you want to check for.
Replace 5 with the number of videos you want returned (sorted by view count).
