Is it possible for users to interact without querying the backend? (iOS)

I am essentially building an app that enables people to score a given image. This scoring happens over an hour, during which a UI label showing the average score is updated after each user scores. Is there a way to enable this interactive space where people can score, see what others have scored, and have the label update accordingly, and then save to Parse only when the time is up, as opposed to constantly saving data and querying the backend? That seems unnecessarily clunky. Thanks.
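
For the "accumulate locally, save once" half of this, a minimal sketch (using the Parse JavaScript SDK for brevity; the iOS SDK has equivalent calls, and the ScoringSession and ImageScore names are made up):

import Parse from 'parse';

// Placeholder credentials; configure once at app startup.
Parse.initialize('YOUR_APP_ID', 'YOUR_JS_KEY');

class ScoringSession {
  private count = 0;
  private sum = 0;

  // Called each time a user submits a score; updates the UI label
  // without touching the backend.
  addScore(score: number, updateLabel: (avg: number) => void): void {
    this.count += 1;
    this.sum += score;
    updateLabel(this.sum / this.count);
  }

  // Called once when the hour is up: a single write to Parse.
  async saveToParse(imageId: string): Promise<void> {
    const result = new Parse.Object('ImageScore');
    result.set('imageId', imageId);
    result.set('averageScore', this.sum / this.count);
    result.set('voteCount', this.count);
    await result.save();
  }
}

Note that this only covers scores submitted on the local device; for users to see what others have scored before the hour is up, some live channel (e.g. Parse LiveQuery or periodic polling) is still needed.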

Related

How long does Google PageSpeed Insights take to update Field Data?

I'm working on my website's performance and testing its page speed with Google PSI. I've made every correction I can and I'm getting a 90+ score in PSI; all Lab Data attributes are green. But the Field Data is still not updating. I'm wondering how long Google PageSpeed Insights takes to update the Field Data. Also, once the Field Data does update, will it be the same as the Lab Data?
[PageSpeed Insights screenshot]
The data displayed is aggregated daily, so it should change from day to day.
However, the reason you do not see the results of any improvements instantly is that the data is taken over a rolling 28-day period.
After about 7 days you should start to see your score improve; after 28 days the report data will fully reflect the changes you made today.
Because of this delay I would recommend taking your own metrics using the Web Vitals library so you can see the results in real time. This also lets you verify that the changes you made work at all screen sizes and across browsers.
Either that or you could query the CrUX data set yourself on shorter timescales.
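
As a minimal sketch of that real-time approach, assuming the web-vitals npm package and a hypothetical /analytics collection endpoint:

import { onCLS, onINP, onLCP } from 'web-vitals';

// Forward each metric to your own endpoint; sendBeacon survives page
// unload, with fetch(keepalive) as a fallback.
function sendToAnalytics(metric: { name: string; value: number; id: string }): void {
  const body = JSON.stringify({ name: metric.name, value: metric.value, id: metric.id });
  if (!navigator.sendBeacon('/analytics', body)) {
    fetch('/analytics', { method: 'POST', body, keepalive: true });
  }
}

onCLS(sendToAnalytics); // Cumulative Layout Shift
onINP(sendToAnalytics); // Interaction to Next Paint
onLCP(sendToAnalytics); // Largest Contentful Paint

Collected this way, you see the field numbers immediately instead of waiting out the 28-day window.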

Getting Granular Data from Google Analytics to enable Machine Learning applications

In the context of Google Analytics, I wonder if I can get granular data for an account in the form of a table (or multiple tables that could be joined) containing all relevant information collected per user and then per session.
For each user there should be rows describing in detail the activities and outcomes, micro and macro, of each session. Features would include source, time of visit, duration of visit, pages visited, time per page, goal conversions, etc.
Having the raw data in this granular form would enable me to apply machine learning algorithms to explore the data and optimize decisions (web design, budget allocation, bidding).
This is possible, however not by default. You will need to set up custom dimensions to be able to identify individual clients, sessions, and timestamps, so that you get row-level user data rather than pre-aggregated data. A good place to start is https://www.simoahava.com/analytics/improve-data-collection-with-four-custom-dimensions/
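
A minimal sketch of that setup, in the spirit of the linked post, using the classic analytics.js API (the dimension indexes must match what you configure under Admin > Custom Definitions, and the property ID is a placeholder):

// Capture the client ID and a timestamp in custom dimensions so hits
// can later be stitched back together per user and per session.
declare const ga: (...args: unknown[]) => void;

ga('create', 'UA-XXXXX-Y', 'auto');
ga((tracker: { get(field: string): string; set(field: string, value: string): void }) => {
  tracker.set('dimension1', tracker.get('clientId'));  // user identifier
  tracker.set('dimension2', new Date().toISOString()); // set once per page load;
                                                       // a true per-hit stamp needs customTask
});
ga('send', 'pageview');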
There is no way to collect all data per user in one simple query. You will need to run multiple queries, pivot the results, and merge them to get the full dataset you are envisaging.
Beyond that, there is also the problem of downloading the data:
1) There is a 10,000-row limit, so you will need a loop to download all available rows.
2) Depending on your traffic, you are likely to encounter sampled data, so you will need to download the data per day, or per hour, to avoid Google Analytics sampling. A sketch of such a download loop follows.
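
A minimal sketch of that loop, assuming the Core Reporting API v3 (where the 10,000-row limit applies), a valid OAuth access token, and made-up metric/dimension choices:

const API = 'https://www.googleapis.com/analytics/v3/data/ga';

// Downloads one day of data, paging past the 10,000-row limit with
// start-index. Querying a single day at a time also reduces sampling.
async function downloadDay(viewId: string, day: string, token: string): Promise<string[][]> {
  const rows: string[][] = [];
  let startIndex = 1;
  while (true) {
    const params = new URLSearchParams({
      ids: `ga:${viewId}`,
      'start-date': day,
      'end-date': day,
      metrics: 'ga:sessions,ga:goalCompletionsAll',
      dimensions: 'ga:dimension1,ga:dimension2,ga:pagePath', // your custom dimensions
      'start-index': String(startIndex),
      'max-results': '10000',
    });
    const res = await fetch(`${API}?${params}`, { headers: { Authorization: `Bearer ${token}` } });
    const data = await res.json();
    rows.push(...(data.rows ?? []));
    if (startIndex + 10000 > (data.totalResults ?? 0)) break; // no more pages
    startIndex += 10000;
  }
  return rows;
}

Joining the per-day downloads on the client-ID dimension then gives you the per-user, per-session table you are after.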

Query high-frequency Firebase data at a lower frequency

We're currently logging measurements to a Firebase database every 3 seconds.
But I want to graph the data over different periods. Sometimes that is 5 minutes, in which case the 3-second resolution is fine (~100 points). However, if I want to see how it changes over 12 hours at the 3-second resolution, I'll have 14,400 points.
For the longer time periods, I want to drop the resolution to reduce the data points.
As we're using Firebase there's no backend to query the database and then filter the data; the UI queries the DB directly, so the filtering has to happen in the query.
Are there any standard methodologies for handling this (or common names for it that I can search on)?
Does anyone know a Firebase-specific solution for this while querying?
Is it best to record whether this is 1 min, 5 min, 10 min, or 1 hr data when the data is first saved? (This is a less preferred solution, as the data being sent to Firebase comes from a small ESP8266 microcontroller without a huge amount of memory.)
Many thanks in advance
In the end I went with a variant of option three, pushing the data into three different locations depending on the time interval: 3 sec, 1 min, and 10 min.
This means I'm not sending up any extra information, just sending it to a different location.
When doing the queries, while I'm limited to those fixed intervals, I can just query whichever of the three locations fits.
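
A minimal sketch of that fan-out-on-write idea, using the modular Firebase JS SDK (the path names and sampling ratios are made up to match the 3 sec / 1 min / 10 min split):

import { initializeApp } from 'firebase/app';
import { getDatabase, ref, push, query, orderByChild, startAt, onValue } from 'firebase/database';

const app = initializeApp({ databaseURL: 'https://your-project.firebaseio.com' });
const db = getDatabase(app);
let sampleCount = 0;

// Every sample goes to the 3-second feed; every 20th also goes to the
// 1-minute feed and every 200th to the 10-minute feed.
function logSample(value: number): void {
  const sample = { value, ts: Date.now() };
  push(ref(db, 'readings/3s'), sample);
  if (sampleCount % 20 === 0) push(ref(db, 'readings/1m'), sample);
  if (sampleCount % 200 === 0) push(ref(db, 'readings/10m'), sample);
  sampleCount += 1;
}

// The UI picks the feed matching the graph's window, e.g. the last
// 12 hours from the 10-minute feed (~72 points instead of 14,400).
const twelveHoursAgo = Date.now() - 12 * 60 * 60 * 1000;
const q = query(ref(db, 'readings/10m'), orderByChild('ts'), startAt(twelveHoursAgo));
onValue(q, (snapshot) => {
  snapshot.forEach((child) => { console.log(child.val()); });
});

On the ESP8266 side the same fan-out is just an occasional extra REST write to a second or third path, so no buffering is needed on the device.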

How to keep track of daily data?

In my Rails application, I want to keep several daily metrics in order to see how this data changes over time. For example, I want to see how many times a user logged in on a particular date (and therefore be able to accumulate this data over time).
Some of this data I can figure out through queries, such as the number of posts a user made on a particular day (because the post model includes a date). However, there are many different daily metrics I want to keep track of.
I thought of creating a DataPlayers model which has data for every player, with a new instance created every day, but I don't think that is the best approach.
Are there best practices for this type of data collection?
You could use a gem like SqlMetrics to track events as they happen.
It stores the events in your own database, so it's easy to query them via SQL.
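
Whatever library you pick, the underlying pattern is a daily-counter table plus an upsert. A sketch with node-postgres, purely to show the SQL (table and column names are hypothetical; in Rails the same statement can run from a model or callback):

import { Client } from 'pg';

const client = new Client({ connectionString: process.env.DATABASE_URL });
// await client.connect() once at startup.

// One row per (user, metric, day); requires a unique index on those
// three columns so ON CONFLICT can bump the counter in place.
async function trackEvent(userId: number, metric: string): Promise<void> {
  await client.query(
    `INSERT INTO daily_metrics (user_id, metric, day, count)
     VALUES ($1, $2, CURRENT_DATE, 1)
     ON CONFLICT (user_id, metric, day)
     DO UPDATE SET count = daily_metrics.count + 1`,
    [userId, metric],
  );
}

Calling trackEvent(42, 'login') on every successful sign-in makes "how many logins on a given date" a single indexed lookup, with no per-metric model like DataPlayers required.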

Calculating lots of statistics on database user data: optimizing performance

I have the following situation (in Rails 3): my table contains financial transactions for each user (users can buy and sell products). Since lots of such transactions occur, I present statistics related to the current user on the website, e.g. current balance, overall profit, how many products were sold/bought overall, averages, etc. (the same also on a per-month/per-year basis instead of overall). Parts of this information are displayed on many forms/pages so that the user can always see his current account information (different bits of the statistics are displayed on different pages, though).
My question is: how can I optimize database performance (and is it worth it)? Surely, if the user is just browsing, there is no need to re-calculate all of the values every time a new page is loaded, unless a change to the underlying database has been made?
My first solution would be to store these statistics in their own table and update them once a financial transaction has been added/edited (in Rails maybe using :after_update?). Taking this further, if, for example, a new transaction has been made, then I can just modify the average instead of re-calculating the whole thing (for a new value x, the running average over n transactions becomes avg + (x - avg) / n).
My second idea would be to use some kind of caching (if this is possible?), or to store these values in the session object.
Which one is the preferred/recommended way, or is all of this a waste of time, as the current largest number of financial transactions is in the range of 7000-9000?
You probably want to investigate summary tables, also known as materialized views.
This link may be helpful:
http://wiki.postgresql.org/wiki/Materialized_Views
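
A minimal sketch of that idea in PostgreSQL, driven through node-postgres only to keep these examples in one language (the same SQL works from a Rails migration plus an after-save callback; table and column names are hypothetical):

import { Client } from 'pg';

const client = new Client({ connectionString: process.env.DATABASE_URL });

// One-time setup: precompute the per-user statistics.
async function setup(): Promise<void> {
  await client.query(`
    CREATE MATERIALIZED VIEW user_stats AS
    SELECT user_id,
           COUNT(*)    AS transaction_count,
           SUM(amount) AS balance,
           AVG(amount) AS average_amount
    FROM financial_transactions
    GROUP BY user_id`);
}

// After a transaction is added or edited (e.g. from :after_update),
// refresh the view; at 7000-9000 rows this is cheap.
async function refreshStats(): Promise<void> {
  await client.query('REFRESH MATERIALIZED VIEW user_stats');
}

// Page loads then read one precomputed row instead of re-aggregating:
//   SELECT * FROM user_stats WHERE user_id = $1;

The per-month/per-year variants are just additional views grouped by date_trunc('month', created_at) and so on.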
