If I understand correctly, a call to a YouTube API costs 'units', and I have a default daily quota of 10,000 units, which resets every day around midnight (Pacific Time).
On https://developers.google.com/youtube/v3/determine_quota_cost, these costs are mentioned for every resource and method. However, I cannot find the Live Streaming resources on that page.
I'm particularly interested in the 'liveBroadcast' and 'liveStream' resources and their methods to list, insert, update, and delete them, including 'bind' to bind a liveStream to a liveBroadcast.
Is there a quota page for the YouTube Live resources and methods?
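For context, the calls I'm making look roughly like this (a minimal sketch assuming the google-api-python-client and an already authorized `youtube` client; titles, timestamps, and settings are just placeholders):

    # Sketch only: assumes `youtube` is an authorized googleapiclient.discovery
    # Resource built with a scope that allows live streaming management.

    # liveBroadcasts.insert
    broadcast = youtube.liveBroadcasts().insert(
        part="snippet,status,contentDetails",
        body={
            "snippet": {
                "title": "Example broadcast",                  # placeholder
                "scheduledStartTime": "2024-01-01T12:00:00Z",  # placeholder
            },
            "status": {"privacyStatus": "private"},
        },
    ).execute()

    # liveStreams.insert
    stream = youtube.liveStreams().insert(
        part="snippet,cdn",
        body={
            "snippet": {"title": "Example stream"},            # placeholder
            "cdn": {
                "ingestionType": "rtmp",
                "resolution": "variable",
                "frameRate": "variable",
            },
        },
    ).execute()

    # liveBroadcasts.bind: attach the stream to the broadcast
    youtube.liveBroadcasts().bind(
        id=broadcast["id"],
        part="id,contentDetails",
        streamId=stream["id"],
    ).execute()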
Related
I have a very simple setup using the v3 YouTube Data API to get the list of comments. I am just fetching the list of videos and then fetching the comments (at a frequency of 5 seconds) to get updated messages, using the page token as needed to minimize the load and computation.
Today, after some time of internally testing the application, I started getting the quota exceeded exception. I know YouTube provides 10,000 units by default, and since reading the comments (and the videos as well) costs just 1 unit per request, I would expect to see similar numbers.
However, the usage seems to be calculated incorrectly.
The request details are as follows:
As you can see, there are 2,895 total requests to LiveChatMessages -> List.
However, when I go to IAM -> Quotas, it showed 14k earlier, then 12.6k in quota usage.
There seems to be some problem either with the computation or with the documentation that defines the units for the queries. Can someone help, please?
PS: I'm just using the two APIs mentioned in the screenshot above. Both are list calls.
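For reference, the polling loop is essentially this (a minimal sketch assuming the google-api-python-client, an authorized `youtube` client, and a `live_chat_id` taken from the video's liveStreamingDetails):

    import time

    # Sketch of the 5-second polling described above. `youtube` is an
    # authorized googleapiclient.discovery Resource; `live_chat_id` comes
    # from videos.list(part="liveStreamingDetails").
    page_token = None
    while True:
        params = {"liveChatId": live_chat_id, "part": "snippet"}
        if page_token:
            params["pageToken"] = page_token   # only fetch new messages
        response = youtube.liveChatMessages().list(**params).execute()

        for message in response.get("items", []):
            print(message["snippet"].get("displayMessage"))

        page_token = response.get("nextPageToken")
        time.sleep(5)   # poll every 5 seconds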
As you can see, there are 2,895 total requests to LiveChatMessages -> List. However, when I go to IAM -> Quotas, it showed 14k earlier, then 12.6k in quota usage.
Yes, I can see that there are 2,895 requests, but how do you know what the quota costs are for those requests? You are using the YouTube Live Streaming API for those requests, not the YouTube Data API.
There is no documentation of the quota cost for the YouTube Live Streaming API calls. If Google says you have used all your quota, then you probably have.
I would post an issue over on the issue forum asking them to document the quota cost for these calls.
I recently began using the YouTube Data v3 API for a program that I'm writing, which is purely for personal use. To give a brief summary of what it does: it checks the live chat from my most recent (usually ongoing) livestream and performs actions based on certain keywords entered in chat (essentially commands for people to use from live chat). In order to do that, however, I have to constantly send requests to get a refreshed live chat. As it is now, it sends requests at 1-second intervals. I recently did a livestream to test out my program, and it only took about 25 minutes for me to reach the daily quota limit of 10,000 units/day.
The request is: youtube.liveChatMessages().list(liveChatId=liveChatId, part="snippet")
It seems like every request I make costs 6 units, according to the math. I want to be able to host livestreams at lengths of up to 3 hours, which would require a significant quota increase. I'm aware that there is an option to fill out a form to request additional quota. However, it asks for business information such as a business name, business website, business mailing address, etc. Like I said before, I'm doing this for my own use only. I'm in no way part of a business, and just made my program as a personal project. Does anyone know if there's any way to apply for additional quota as an individual/hobbyist? If not, do you think just putting n/a in those fields would be acceptable? I did find another post where someone else had the exact same problem, but no one was able to give a helpful answer. Any advice would be greatly appreciated.
Unfortunately, and although this is only tangentially related, it seems Google is after the money here. I also tried to do something similar myself (a very basic chat bot that just reads the chat messages), and although other users on the net got somewhat different results, they all have one thing in common: following the documentation on how it should be done, they all poll at an interval of about once a second (that's the timeout you get back as part of the response to a poll for new messages). I, along with a few others, got at most about 5 minutes out of polling once a second; some others, like you, got a few more minutes out of it. I changed the interval by hand in increments of 5 seconds: 5, 10, 15, and so on, you get the picture. I can't remember which value I finally settled on, but I was only able to get about 2 1/2 hours' worth with a rather long polling interval of once every 10 seconds or so, which is still more than enough for a simple chat bot that just reads the chat. Replying as well would have at least doubled the usage and hence halved the time.
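To put rough numbers on it (a back-of-the-envelope sketch assuming about 6 units per liveChatMessages.list call, which is only what the observed usage suggests, not a documented figure, and ignoring any other calls the bot makes):

    # Back-of-the-envelope: how long the default daily quota lasts at a
    # given polling interval, assuming ~6 units per poll (observed, not
    # officially documented).
    DAILY_QUOTA = 10000
    UNITS_PER_POLL = 6

    def streamable_hours(poll_interval_seconds):
        polls_per_hour = 3600 / poll_interval_seconds
        return DAILY_QUOTA / (polls_per_hour * UNITS_PER_POLL)

    print(streamable_hours(1))    # ~0.46 h (~28 min) -- roughly the ~25 min you saw
    print(streamable_hours(5))    # ~2.3 h
    print(streamable_hours(10))   # ~4.6 h

In practice you should also wait at least the pollingIntervalMillis value returned in each liveChatMessages.list response before polling again.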
It's already a pain to get it working as an individual, as just setting up the required OAuth authentication requires you to provide at least basic information like a fixed callback URL and some legal and policy details. I always ended up having it rejected with the standard reply "Your project seems to be for internal use only." I even managed to get G Suite working (before it required payment) to set up an "internal" project (only possible if the account belongs to a G Suite organization), but after I set up the OAuth login I got an error that the private account I wanted to run the bot on was not part of the organization and hence couldn't be used. TL;DR: just a useless waste of time.
Having been at this for several months now, I've found there's just no way to get it done as a private individual for personal use. Yes, you can just set it up and have the required verification rejected (as it uses the YouTube Data API scopes), but you're still stuck with that 10,000 units/day quota. Building your own, more powerful tool capable of doing more than polling once every 10 to 30 seconds with a minimum of interaction doesn't get you any further than a few minutes, maybe one or two hours if you're lucky. If you want more, you have to set up a business and pay for it. In short: Google wants you to pay for that service.
As Mixer has officially been announced to shut down on July 22nd, you have exactly these two options:
Use one of the publicly available services like Streamlabs, Nightbot, etc. They're backed by their respective "businesses" and therefore don't seem to have those quota limits (although I just found some complaints about Streamlabs from April, about one month before you posted this question, in which they admitted to having reached their limits; I don't know whether they have solved that yet).
Don't use YouTube for streaming; use Twitch instead, as Twitch doesn't have these limits and anybody is free to set up an API token either on the main account or on a second bot account (which is also explicitly explained in their docs). The downsides are, of course, the objective sacrifices you have to accept: a) viewers only get the streamer's source quality until you reach at least affiliate status, b) capped at a maximum of 1080p60 with only 6,000 kbit/s, and c) only short-term VOD storage.
I myself wanted to use YouTube as my main platform (and currently do, though without my own tooling at the moment), including my own bot and such, since streaming on YouTube has some advantages over Twitch. But as YouTube wants me to pay for what others (namely Twitch) offer for free (although at overall lower quality), it's an easy decision to make. Mixer looked promising, as it also offered some neat features (overall better quality than Twitch, lower latency), but the requirements to get partner status were very high (2,000 followers along with another insanely high number to reach), and Mixer itself was just a tiny platform (for fun I counted all the streamers and viewers: only a few hundred streamers with just a few tens of thousands of viewers; the whole platform had less than some big Twitch channels on their own). And now it's been announced that it will soon shut down anyway.
I hope this gives you some insight into what a small streamer has to consider and put up with when choosing a platform. After everything I've experienced, my conclusion is this: either do it like everyone else and stream on Twitch, using YouTube as an archive to export to from Twitch (although Twitch STILL hasn't implemented an auto-export of the latest VOD, I guess that could be done with a small script), or, if you want to stay on YouTube, use an existing bot like Nightbot or one of the other services like Streamlabs.
If you get any other information on how to convince Google to increase the limit as an individual, please let us know.
While the subscription count in
www.googleapis.com/youtube/v3/channels?part=statistics
seems to be updated instantly, the view count only updates about once a day.
A workaround that I found was to list all videos in the "uploaded" playlist with
www.googleapis.com/youtube/v3/playlistItems?part=contentDetails
and iterate through them, calling
www.googleapis.com/youtube/v3/videos?part=statistics
for each. This seems to get the most accurate results, though it requires more than 3 credits for every uploaded video, thus using my quota up relatively fast.
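For reference, the whole workaround boils down to something like this (sketched in Python for readability rather than as ESP8266 code; the API key and uploads-playlist ID are placeholders):

    import requests

    API_KEY = "YOUR_API_KEY"        # placeholder
    UPLOADS_PLAYLIST_ID = "UU..."   # the channel's "uploads" playlist, placeholder

    # Collect all video IDs from the uploads playlist (50 per page).
    video_ids = []
    page_token = None
    while True:
        params = {"part": "contentDetails", "playlistId": UPLOADS_PLAYLIST_ID,
                  "maxResults": 50, "key": API_KEY}
        if page_token:
            params["pageToken"] = page_token
        data = requests.get("https://www.googleapis.com/youtube/v3/playlistItems",
                            params=params).json()
        video_ids += [item["contentDetails"]["videoId"] for item in data["items"]]
        page_token = data.get("nextPageToken")
        if not page_token:
            break

    # One videos.list call per video -- this is where the quota goes.
    total_views = 0
    for video_id in video_ids:
        data = requests.get("https://www.googleapis.com/youtube/v3/videos",
                            params={"part": "statistics", "id": video_id,
                                    "key": API_KEY}).json()
        total_views += int(data["items"][0]["statistics"]["viewCount"])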
Is there a faster way around the problem?
I would like to implement it on an ESP8266 so it would be preferable not to require a lot of storage or processing power.
You can get the viewer count from liveStreamingDetails; the liveStreamingDetails object contains metadata about a live video broadcast. The object will only be present in a video resource if the video is an upcoming, live, or completed live broadcast. Under it, you will find concurrentViewers, which shows the number of viewers currently watching the broadcast. The property and its value will only be present if the broadcast has current viewers and the broadcast owner has not hidden the view count for the video.
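A rough sketch of reading it (the API key and video ID are placeholders):

    import requests

    API_KEY = "YOUR_API_KEY"   # placeholder
    VIDEO_ID = "VIDEO_ID"      # ID of the (upcoming, live, or completed) broadcast

    data = requests.get(
        "https://www.googleapis.com/youtube/v3/videos",
        params={"part": "liveStreamingDetails", "id": VIDEO_ID, "key": API_KEY},
    ).json()

    details = data["items"][0].get("liveStreamingDetails", {})
    # concurrentViewers is only present while the broadcast has current viewers
    # and the owner has not hidden the view count.
    print(details.get("concurrentViewers"))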
EDIT
Specific to your use case, I believe a two-part API call would help with your inquiry.
I'm thinking you could issue a Search query to retrieve all videos of the channel. Each Search result will have an id.videoId, which you concatenate into the Videos: list call. That will give you the statistics.viewCount of each video, which you then add up to get the total channel view count.
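Something along these lines (a rough sketch; the channel ID and API key are placeholders, and only the first page of search results is shown, so you'd follow nextPageToken for more than 50 videos):

    import requests

    API_KEY = "YOUR_API_KEY"   # placeholder
    CHANNEL_ID = "UC..."       # placeholder

    # Part 1: search.list to get the channel's video IDs.
    search = requests.get(
        "https://www.googleapis.com/youtube/v3/search",
        params={"part": "id", "channelId": CHANNEL_ID, "type": "video",
                "maxResults": 50, "key": API_KEY},
    ).json()
    video_ids = [item["id"]["videoId"] for item in search["items"]]

    # Part 2: videos.list with the concatenated IDs, then sum statistics.viewCount.
    videos = requests.get(
        "https://www.googleapis.com/youtube/v3/videos",
        params={"part": "statistics", "id": ",".join(video_ids), "key": API_KEY},
    ).json()
    total_views = sum(int(v["statistics"]["viewCount"]) for v in videos["items"])
    print(total_views)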
Hopefully this helps with your inquiry.
Happy coding!
We are a content owner who has multiple channels rolled up into our analytics.
We are pulling each day's earnings from the YouTube API to store in our database, but we want the data available per video and not just per channel.
The only way I've found to get a single video's earnings for a particular day is to do it for each individual video, like so:
GET https://www.googleapis.com/youtube/analytics/v1/reports?ids=contentOwner%3D%3D<contentownerid>&start-date=2013-10-01&end-date=2013-10-01&metrics=views%2Cearnings&dimensions=day&filters=video%3D%3D<videoid>&sort=day&key={YOUR_API_KEY}
This, obviously, is not ideal. We have over 5000 videos we're trying to keep track of, and the number will only grow, so making thousands of API calls every morning is not a good solution.
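For illustration, the daily pull currently amounts to a loop like this (a Python sketch of the request above; the content owner ID, access token, and video IDs are placeholders, and the real list has 5,000+ entries):

    import requests

    ACCESS_TOKEN = "OAUTH_ACCESS_TOKEN"       # placeholder
    CONTENT_OWNER_ID = "CONTENT_OWNER_ID"     # placeholder
    video_ids = ["VIDEO_ID_1", "VIDEO_ID_2"]  # in reality, 5,000+ tracked videos

    # One reports call per video per day -- the part that does not scale.
    for video_id in video_ids:
        data = requests.get(
            "https://www.googleapis.com/youtube/analytics/v1/reports",
            params={
                "ids": "contentOwner==" + CONTENT_OWNER_ID,
                "start-date": "2013-10-01",
                "end-date": "2013-10-01",
                "metrics": "views,earnings",
                "dimensions": "day",
                "filters": "video==" + video_id,
                "sort": "day",
            },
            headers={"Authorization": "Bearer " + ACCESS_TOKEN},
        ).json()
        rows = data.get("rows", [])
        # store (day, views, earnings) rows for this video in our database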
Is there a way to get earnings reports for the whole channel, separated by video, for a given date range with a single API call? I would think I could set "dimensions" to video, but that won't work with anything involving the monetary fields like earnings.
For v2 of the YouTube Data API, what are the exact quota limits?
I am aware that this is a frequent question, however I am yet to find any concrete answers.
Reason for Question:
I am going to be querying a large pool of videos for their comments on a regular basis and would like to know when I am coming close to my quota limit, so the system can slow down. In v3 of the YouTube API, the quota limits are clearly documented. However, I'm unable to use v3 of the API as it does not support the retrieval of comments (side note: does anyone know why?).
In v2 of the Data API, the quota was not a fixed number per day as it is in v3, but instead a limit that prevented too many requests within a short period of time. Unfortunately, I don't believe any firm documentation exists as to how many requests that would be, or how short the period of time is, either; generally, YouTube has always stated that if you get a quota error while making a call to v2 of the Data API, you should wait "a few minutes" before trying again. Here's the only official statement:
https://developers.google.com/youtube/2.0/developers_guide_protocol_error_responses?hl=en#Quota_errors
It is possible that one of the reasons for this lack of direct documentation is that there isn't a hard-and-fast number, but rather that it changes in response to the current load.
In answer to your side question, there haven't been any official statements from the YouTube team about why comment retrieval hasn't yet been implemented, but it likely will be in time (as will other pieces of data retrievable via v2 but not yet via v3).