Do Parse.com cloud calls count as queries? - iOS

I am using Parse.com Cloud Code and calling this method:
[PFCloud callFunctionInBackground:@"getTime" withParameters:@{} block:^(NSString *result, NSError *error) { /* use result */ }];
which calls this Cloud Code function:
Parse.Cloud.define("getTime", function(request, response) {
response.success(new Date().getTime().toString());
});
My question: does this call cost money? I mean, does it count as a query/call that needs to be paid for? (Because you get 30 queries or calls per second, and after that you start paying for every additional call.)
thanks

That's not true; you're not charged for extra calls.
In simple words: if you buy a machine worth $100 that packs 40 packets per second, and you pile 400 packets in front of it, it will take 10 seconds to pack them all. The manufacturer will not charge you extra just because you gave the machine 400 packets in a single second.
If your account is free, you are allowed only 30 calls/sec, which means that if you make 40 calls in one second, the remaining 10 will be handled in the next second, causing a delay in the response.
And if you have the $100 per month pricing plan, you are allowed 40 calls/sec, which means that if you make 50 calls in a second, only 40 will be handled in the first second and the remaining 10 in the next. You won't be charged for the extra calls per second.
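To make that queueing behaviour concrete, here is a minimal Python sketch (my own illustration, not Parse's actual implementation) of how a per-second cap defers a burst of calls instead of billing for them:

def drain_seconds(burst_size, calls_per_sec=30):
    # Seconds needed to process a burst when excess calls are
    # queued into the following seconds rather than rejected.
    seconds = 0
    pending = burst_size
    while pending > 0:
        pending -= calls_per_sec  # at most calls_per_sec handled this second
        seconds += 1
    return seconds

print(drain_seconds(40))                     # 40 calls at 30/sec -> 2 seconds
print(drain_seconds(400, calls_per_sec=40))  # the 400-packet analogy -> 10 seconds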
The whole pricing question comes down to how much response time you are willing to trade off: the more users you have, the more throughput you need. Check the pricing plans and see how the number of background jobs and requests per second increases with each plan.
The kind of per-unit overage charge you were thinking of does exist for push notifications, though. In that case they will charge you
$0.05/1000 extra recipients

Related

What is the time unit for "datadelay" attribute on GoogleFinance?

I'm using the GOOGLEFINANCE() function on a Google spreadsheet to keep track of my stocks. With the "datadelay" attribute I can check how long ago the data was last updated. But it only returns a raw number, like "54000" for one ticker and "15" for another. What time unit is that supposed to be? Minutes? Seconds? Milliseconds?
When I checked the documentation for Google Finance, I saw a page explaining that there might be a delay of up to 20 minutes. It also mentions that they use different exchanges to retrieve market data, and those exchanges may have different data delays, which can explain the differences in the "datadelay" column.
As for the unit of this column, my assumption is that it is something shorter than seconds, since 54000 seconds = 900 minutes, which is far higher than the maximum delay stated on the help page. But I am not sure what the value of this column would be when you query on non-trading days.
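A quick sanity check of that unit hypothesis, given the 20-minute maximum from the help page:

# If 54000 were seconds, the delay would be 900 minutes,
# far beyond the documented 20-minute maximum.
print(54000 / 60)    # 900.0 (minutes)
# If it were milliseconds, it would be 54 seconds, which is plausible.
print(54000 / 1000)  # 54.0 (seconds)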
The page shows delays for each exchange.
The GOOGLEFINANCE() function updates every minute (if it is set to do so), but keep in mind that results may be delayed by up to 20 minutes. So the answer is between 1 and 20 minutes.

Twilio API "IncomingPhoneNumber.list()" is slow on Master account (And I only need the total amount of numbers)

I have a Flask app that uses Twilio. I display the total number of incoming phone numbers in an account/subaccount to the user.
Here is what I'm currently doing to get the total:
from twilio.rest import Client

client = Client(accountsid, accounttoken)
# Fetch the full list of incoming phone numbers
pnlist = client.incoming_phone_numbers.list()
# Get the length of the list
total = len(pnlist)
This takes upwards of 19 seconds just to get the numbers in the Master account. On top of that I have to repeat this for all subaccounts.
Is there a better way to get just the total number of phone numbers for an account/subaccount?
Thanks in advance!
Twilio developer evangelist here.
I would recommend that, rather than listing all the numbers every time you want to count them, you cache the count within your own database against the account.
You can then update the count either when you know it changes, if you are purchasing/releasing numbers via the API in your app, or periodically (say, once a day) with a background job.
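A minimal sketch of that caching approach, assuming a SQLite table and helper function names of my own invention (only Client and incoming_phone_numbers.list() are the real Twilio API):

import sqlite3
from twilio.rest import Client

conn = sqlite3.connect("counts.db")
conn.execute(
    "CREATE TABLE IF NOT EXISTS number_counts (account_sid TEXT PRIMARY KEY, total INTEGER)"
)

def refresh_count(account_sid, auth_token):
    # Run this from a daily background job, or right after
    # purchasing/releasing a number via the API.
    client = Client(account_sid, auth_token)
    total = len(client.incoming_phone_numbers.list())  # the slow call, done rarely
    conn.execute(
        "INSERT OR REPLACE INTO number_counts VALUES (?, ?)", (account_sid, total)
    )
    conn.commit()

def get_count(account_sid):
    # Fast lookup for the Flask view instead of hitting the API.
    row = conn.execute(
        "SELECT total FROM number_counts WHERE account_sid = ?", (account_sid,)
    ).fetchone()
    return row[0] if row else None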

Change the delay from 20 minutes to 1 minute

Quotes are not sourced from all markets and may be delayed up to 20
minutes. Information is provided 'as is' and solely for informational
purposes, not for trading purposes or advice.
This notice appears when I use the GOOGLEFINANCE() function in my spreadsheet. It is unfortunate that the data is delayed by up to 20 minutes.
What is the best way to get real-time stock prices? Suppose my budget is around $50 per month.
Be aware that I trade only US equities, i.e. no bonds, no cryptocurrencies, and so on.
UPDATE
Here is a sample version of my portfolio spreadsheet: https://docs.google.com/spreadsheets/d/1hIfCuupmc_OZ6514DXFe_NrDCX1Ix6tcvySP_VolppI/edit#gid=42667785. It is important for me to get prices in real time, not delayed by up to 20 minutes.
Is there a way to fix that?
The GOOGLEFINANCE formula is not consistent with the delays. Different stocks can be delayed by different times. You can get an estimate of the delay by using GOOGLEFINANCE("TICKER","DATADELAY").
This is at least somewhat helpful, but not ideal, because you'll have a price on your sheet without knowing exactly when that price is from, just an estimate of how old it might be. And forget about pre-market or after-hours: once the market closes, all bets are off and you'll have no idea when the price is from (i.e. an after-hours quote or the regular session close).
If you want accurate real-time quotes, you're going to need an add-on. You said your budget is $50. That doesn't leave you a lot of options. For $9 per month you can use the Market Data Add-on and get real-time stock prices along with historical intraday prices. There is also a free tier that gives you 100 free daily prices.
Market Data's STOCKDATA formula is a drop-in replacement for GOOGLEFINANCE, so it follows the same syntax. It will accomplish what you need. For example, STOCKDATA("SPY","ALL") will produce an output like this:
Date               Bid     Bid Size  Mid     Ask     Ask Size  Last    Volume
5/19/2022 9:09:48  388.36  1400      388.38  388.41  1400      388.37  2715229
Note that the date and time of the quote is included in the output, so you know exactly when the quote was fetched. There is no doubt as to whether the quote is from the previous day or from the pre-market session (which is the case in this example). If you compare against the current time using NOW(), you'll find the Market Data quotes are delayed by about 1-2 seconds, due to network latency between your Google Sheet and the servers.
It's important to notice the word "may" in the first sentence:
...and may be delayed up to 20 minutes...
Usually it's way under 20 minutes (around 1 to 1.5 minutes), but there can be times when some delay occurs.
And to answer your question: no, it's not possible to force it under 1 minute.
If you want to go full pro mode with Google Sheets, then try: =CRYPTOFINANCE()
The documentation links from player0 indicate that ONLY crypto exchanges are supported. Data is NOT available from stock exchanges (NASDAQ, NYSE, etc).

Prepping Data For Usage Clustering

Dataset: I'm given the number of minutes individual customers use a product each day and am trying to cluster this data in order to find common usage patterns.
My question: How can I format the data so that, for example, a power user with high levels of use for a year looks the same as a different power user who has only been able to use the device for a month before I ended data collection?
So far I've turned each customer into an array where each cell is the number of minutes used that day. This array starts when the user first uses the product and ends after the user's first year of use. All entries in the cells must be double values (e.g. 200.0 minutes used) for the clustering model. I've considered setting all cells/days after the last day of data collection to either -1.0 or NULL. Is either of these a valid approach? If not, what would you suggest?
For the problem where you want both users (one who used the product heavily every day for a year, and another who used it just as heavily but only for a month) to look the same, create a new feature for each user:
avg_usage per time_bin
time_bin can be a month, a day, or whatever bin best fits your needs.
This way, a user who uses the product, say, 200 minutes per day for one year will get:
200 * 30 * 12 / 12 = 6000 minutes per month
and the other user, who joined only last month but has exactly the same usage, will also get:
200 * 30 * 1 / 1 = 6000 minutes per month.
This way it doesn't matter when a user started using the product; the only thing that matters is the usage rate.
One thing you might take into consideration is that products may go unused for stretches of time. For example, if I go away on vacation, the days I didn't use my computer don't (necessarily) say anything about my general usage of the product. So, based on your data, your product, and your intuition, you might consider removing gaps like that and leaving them out of the calculation.
How long a user has used your product could itself be a signal, but if a user started only recently and is still active today, that is something to take into account, and this average-binning technique can help with it.
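A minimal sketch of this normalization in Python (the bin size and the no-padding convention are my own assumptions): rather than padding unobserved days with -1.0 or NULL, it averages only over the days that were actually observed.

def avg_usage_per_bin(daily_minutes, bin_size=30):
    # daily_minutes: one float per observed day, starting at first use.
    # Unobserved days are simply absent instead of padded with -1.0/NULL.
    bins = []
    for start in range(0, len(daily_minutes), bin_size):
        chunk = daily_minutes[start:start + bin_size]
        bins.append(sum(chunk) / len(chunk) * bin_size)  # minutes per full bin
    return sum(bins) / len(bins)

year_user = [200.0] * 360   # heavy user observed for about a year
month_user = [200.0] * 30   # equally heavy user observed for one month
print(avg_usage_per_bin(year_user))   # 6000.0 minutes per month
print(avg_usage_per_bin(month_user))  # 6000.0 minutes per month

Both users come out at the same usage rate, so the clustering model sees them as similar regardless of how long they were observed.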

How can I measure the total time a user spends online with InfluxDB?

I'm measuring how long users are logged into a service. Every minute, for each user, their new total online time is sent to InfluxDB. I'd like to graph, in Grafana, the cumulative online time for all users.
What kind of query would I need to do that? I initially thought I'd want sum(onlineTime) grouped by time(1m), but I realized that sums the values within each timeframe rather than summing the latest totals across all users, so whenever a user wasn't logged in, the total would drop, because there were no data points for them.
I'm a bit confused about what I'm doing now. If I'm sending the wrong data, I can change that too.
This depends on what time data you send back to InfluxDB.
If each point is the total time spent up to that instant: take the "last" value for each user and add those up across all users.
If each point is a small increment: add up all the incremental values over the period of time.
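For the first case (cumulative totals), an InfluxQL subquery can take each user's latest total and then sum across users. A minimal sketch using the legacy influxdb Python client, where the database, measurement, field, and tag names (metrics, usage, onlineTime, user) are assumptions based on the question:

from influxdb import InfluxDBClient

client = InfluxDBClient(host="localhost", port=8086, database="metrics")

# Take the most recent cumulative total per user, then sum across users.
result = client.query(
    'SELECT sum("last_total") FROM '
    '(SELECT last("onlineTime") AS "last_total" FROM "usage" GROUP BY "user")'
)
print(list(result.get_points()))

Subqueries like this require InfluxDB 1.2 or later; for a Grafana graph over time, you can add a GROUP BY time(...) clause to both the inner and outer queries.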
