I have encountered some problems using the YouTube API recently, and I would like to ask whether anyone has run into the same issue and found a solution.
Previously I had quotas of 100 million and 50 million units per day, but I just found out that the quotas of some keys with lower usage have dropped sharply (a key with 500 million decreased to 300K, and keys with 1 million decreased to 600K and 10K).
All the information I could find relates to projects from 2016, where the quotas were all 10K, or to cases where the whole project had been shut down and the quota was 0. None of these match the problem we have encountered. Does anybody know why this happens and how we can prevent or resolve it? Thanks a lot!
The default quota limit is now 10,000 units:
Projects that enable the YouTube Data API have a default quota
allocation of 10 thousand units per day, an amount sufficient for the
overwhelming majority of our API users...
https://developers.google.com/youtube/v3/getting-started#quota
Previously they gave 1 million units to new accounts. My own APIs were each reduced from 1 million to 10K as well, because I never use even 5K units. You can ask for more units if you reach the quota limit, inside your Developer Console, under IAM & Admin > Quotas > EDIT QUOTAS:
The only way to increase your quota is to fill out this form https://support.google.com/youtube/contact/yt_api_form and submit your request to YouTube. Then expect to wait at least two weeks.
Be careful: if your app doesn't respect the YouTube TOS, they will terminate it.
Related
I need data on 500K videos for my research project. How can I increase the request limit? Are there any paid services for YouTube data?
You will need to apply for an extension via the YouTube API Services - Audit and Quota Extension Form.
Maths
The standard quota you get with the YouTube API is 10,000 quota units per day.
The Search.list method returns a maximum of 50 results per request and costs 100 quota units per request.
500,000 videos needed / 50 per request = 10,000 requests.
10,000 requests × 100 units each = 1,000,000 quota units.
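As a sanity check, here is that arithmetic as a minimal Python sketch (the figures are the ones above; the "days at default quota" line just divides the total by the 10,000-unit daily allocation):

    # Estimate the YouTube Data API quota needed to page through search results.
    VIDEOS_NEEDED = 500_000
    RESULTS_PER_PAGE = 50     # search.list returns at most 50 results per call
    COST_PER_CALL = 100       # quota cost of one search.list call
    DAILY_QUOTA = 10_000      # default daily allocation

    requests_needed = VIDEOS_NEEDED // RESULTS_PER_PAGE   # 10,000 requests
    units_needed = requests_needed * COST_PER_CALL        # 1,000,000 units
    days_at_default = units_needed / DAILY_QUOTA          # 100 days

    print(f"{requests_needed:,} requests -> {units_needed:,} units "
          f"(~{days_at_default:.0f} days at the default quota)")

In other words, at the default quota the crawl would take about 100 days, which is why an extension to at least a million units makes sense.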
I would try applying for at least a million quota units.
I would love to hear if they approve it.
Is it always true that the cost of a video upload is 1600?
I can see on this page that the videos.insert call costs 1,600 units, but I was wondering whether it could cost more if I upload a very large video.
Yes, videos.insert costs 1,600 quota units. The size, length, and quality of the video do not matter. This is stated at the top of the quota cost page:
The table below shows the quota cost for calling each API method. All API requests, including invalid requests, incur a quota cost of at least one point.
All costs are incurred per call to the method in question.
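To put that flat cost in context, here is a quick sketch (assuming only uploads count against the default 10,000-unit daily quota):

    # How many uploads fit in the default daily quota at a flat 1,600 units per call?
    DAILY_QUOTA = 10_000
    VIDEOS_INSERT_COST = 1_600   # flat cost, independent of file size

    uploads_per_day = DAILY_QUOTA // VIDEOS_INSERT_COST   # 6 uploads
    leftover_units = DAILY_QUOTA % VIDEOS_INSERT_COST     # 400 units to spare

    print(f"{uploads_per_day} uploads/day, {leftover_units} units left over")

So with the default quota you can upload at most 6 videos per day, no matter how small they are.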
I'm creating a highly picture-oriented app that might end up using a lot of CKAssets. But I read that there is a 25MB limit on daily data transfer per user. My question is: is this allowance transferable? If one user uses 0MB, can another person use 50MB?
A 25MB limit on data transfer seems very small: at 100KB per picture, a user could only work with 250 pictures per day at most. It just seems like such a drastic limitation. Thank you.
The data transfer limits for CloudKit are monthly and are based on the number of active users. You get 50MB/month per user with a minimum of 2GB.
The 50MB/month/user is only used to calculate the free quota; it is not an actual per-user limit, so if some users transfer 150MB and some transfer 0, that is fine. You only pay if your total transfer for all users exceeds 50MB × the number of users (or 2GB if you have fewer than 40 users).
In your question you quote 25MB/day, but the limit is actually monthly, so if every user used 50MB a month, at 100KB per image that would mean they could transfer about 16 images per day.
Extra data is fairly inexpensive, though. Say you had 40 users who each transferred 50 images per day; that would be about 6GB per month, which would cost you $0.40.
Note that the maximum free transfer is 200TB/month, so above 4,000,000 active users the 50MB/user no longer applies. The available transfer is less on a per-user basis, but the 200TB is still applied as an aggregate across all users.
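A minimal sketch of that quota arithmetic (the $0.10/GB overage rate is back-calculated from the $0.40 figure above, so treat it as an assumption rather than the current price):

    # CloudKit free transfer: 50MB/month per active user, floored at 2GB, capped at 200TB.
    def free_transfer_gb(active_users: int) -> float:
        per_user_gb = active_users * 50 / 1024            # 50MB per user, in GB
        return min(max(per_user_gb, 2.0), 200 * 1024)     # clamp to [2GB, 200TB]

    def monthly_overage_cost(active_users: int, total_gb: float, rate_per_gb: float = 0.10) -> float:
        overage_gb = max(total_gb - free_transfer_gb(active_users), 0.0)
        return overage_gb * rate_per_gb

    # The example above: 40 users x 50 images/day x 100KB x 30 days ~= 5.9GB/month.
    transfer_gb = 40 * 50 * 0.1 * 30 / 1024
    print(f"${monthly_overage_cost(40, transfer_gb):.2f}")   # ~$0.39, i.e. about $0.40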
I have been having trouble with a page crashing on my website, and I have worked out that it is because the memory limit is too low. I read an article (http://codingcyber.com/how-to-increase-php-memory-limit-on-godaddy-hosting-882/#), decided to buy more RAM, and am about to increase the memory limit.
Before I do, I just want to know what will happen if multiple users are using the site all at once. I guess a clearer way to explain my question is with an analogy.
If I have 2048MB of RAM and memory_limit = 256MB, what happens if 20 users all log in at once and each uses 50MB of RAM? I imagine that since no one has exceeded the 256MB limit and the total RAM used (50MB × 20 users = 1000MB) is well under the 2048MB total, the site should be OK, but I just want to confirm that this is correct (I've never done anything like this before).
Thanks for confirming or correcting.
Just spoke with GoDaddy and they explained that it's not a problem to have extra people as long as the TOTAL RAM used stays below the server's limit. If one person goes over the per-request limit, it will only impact them. Only if everyone together goes over the limit as a whole (unlikely in my case) will we have problems...
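To make the two limits concrete, here is a small sketch using the figures from the question (memory_limit applies per request; the RAM ceiling is shared across all requests):

    # Two separate ceilings: PHP's memory_limit (per request) vs. total server RAM (shared).
    SERVER_RAM_MB = 2048
    PHP_MEMORY_LIMIT_MB = 256   # php.ini memory_limit; enforced per request

    def check(concurrent_users: int, mb_per_request: int) -> str:
        if mb_per_request > PHP_MEMORY_LIMIT_MB:
            return "that request is killed: it exceeds memory_limit"
        if concurrent_users * mb_per_request > SERVER_RAM_MB:
            return "server-wide memory pressure: total RAM exhausted"
        return "fine"

    print(check(20, 50))    # fine: 20 x 50MB = 1000MB, under both ceilings
    print(check(20, 300))   # single request killed (300MB > 256MB), others unaffected
    print(check(41, 50))    # 2050MB total > 2048MB RAM: trouble for everyone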
I'm choosing an analytics service for my iOS app. I want to track quite a lot of events, and the app I'm developing is going to be used outdoors, so there will be no Wi-Fi connection available, and even the cellular connectivity can be of poor quality.
Analytics is the only thing in my app that requires network connectivity. Recently I checked how much traffic it consumes, and it was much more than I expected: about 500KB for Google Analytics and about 2MB for Flurry, just for a 2-minute session with a few hundred events. That seems very inefficient to me. (Flurry logs slightly more parameters, but definitely not 4 times more.)
Has anybody compared other popular analytics solutions for their bandwidth consumption? Which one is the slimmest?
Thank you
If you don't need real-time data (and with an outdoor app you probably don't), you can get the best network compression for Analytics by dispatching more hits at once, to benefit from batching and compression. To do that, set the dispatch interval to 30 minutes.
The maximum size of an uncompressed hit that Analytics will accept is about 8KB, so you should be sending less than that per hit. With compression, an individual hit of mostly ASCII data comes down to roughly 25% of its original size. To generate 500KB of data you would have to be sending a few hundred hits individually. With batching and compression the hits shrink more efficiently: a batch of 20 hits will usually compress to less than 10% of its uncompressed size, or about 800 bytes per hit at most. For further network savings, just send less data per event, or fewer events.
By the way, Analytics has a rate limit of 60 tokens, replenished at a rate of 1 hit every 2 seconds. If you are sending a few hundred events in a short period of time, your data is likely getting rate limited.
https://developers.google.com/analytics/devguides/collection/ios/limits-quotas#ios_sdk
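A rough back-of-the-envelope sketch of the savings described above, using the compression ratios from this answer (the average hit size is an assumption, not an SDK guarantee):

    # Rough traffic estimate for Analytics hits: individual vs. batched dispatch.
    AVG_HIT_BYTES = 2_000      # assumed average uncompressed hit, well under the ~8KB cap
    INDIVIDUAL_RATIO = 0.25    # an individual hit compresses to ~25% (mostly ASCII)
    BATCHED_RATIO = 0.10       # a batch of ~20 hits compresses to under ~10%

    def traffic_kb(num_hits: int, batched: bool) -> float:
        ratio = BATCHED_RATIO if batched else INDIVIDUAL_RATIO
        return num_hits * AVG_HIT_BYTES * ratio / 1024

    hits = 300   # "a few hundred events" in one session
    print(f"individual dispatch: ~{traffic_kb(hits, False):.0f} KB")  # ~146 KB
    print(f"batched dispatch:    ~{traffic_kb(hits, True):.0f} KB")   # ~59 KB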