How long does Google PageSpeed Insights take to update Field Data? - lighthouse

I'm working on boosting my website's performance, and I'm testing my page speed with Google PSI. I've made every correction I can and I'm getting a 90+ score in PSI. All Lab Data attributes are green, but the Field Data is still not being updated. I'm just wondering how long Google PageSpeed Insights takes to update the Field Data. Also, once the Field Data does update, will it be the same as the Lab Data?
[PageSpeed Insights screenshot]

The field data displayed is aggregated daily, so it should change from day to day.
However, you will not see the results of any improvements you make instantly, because the data is taken over a rolling 28-day period.
After about 7 days you should start to see your score improve; after 28 days the report data will fully reflect the changes you made today.
Because of this delay, I would recommend gathering your own metrics using the Web Vitals library so you can see the results in real time. This also lets you verify that your changes work at all screen sizes and across browsers.
Either that or you could query the CrUX data set yourself on shorter timescales.
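As a rough illustration of the first suggestion, here is a minimal sketch of collecting your own field metrics with the web-vitals library; the `/analytics` endpoint is a hypothetical collection URL you would host yourself:

```typescript
// Minimal sketch: report real-user Core Web Vitals to your own endpoint
// so you don't have to wait 28 days for the CrUX field data to catch up.
import { onCLS, onINP, onLCP } from 'web-vitals';

function sendToAnalytics(metric: { name: string; value: number; id: string }) {
  const body = JSON.stringify(metric);
  // sendBeacon survives page unload; fall back to fetch with keepalive.
  if (!navigator.sendBeacon('/analytics', body)) {
    fetch('/analytics', { method: 'POST', body, keepalive: true });
  }
}

onCLS(sendToAnalytics); // Cumulative Layout Shift
onINP(sendToAnalytics); // Interaction to Next Paint
onLCP(sendToAnalytics); // Largest Contentful Paint
```

Aggregating these reports (for example, taking the 75th percentile per metric per day) gives you a day-by-day view of the same numbers the Field Data section will eventually show.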

Related

Lighthouse doesn't calculate the right results

I'm optimizing a website using Lighthouse, but I can see that it doesn't really calculate the results correctly.
Here are some points where I think the calculation is wrong.
First, although the website loads in around 200ms and, as you can see, the site renders on the first frame, in Lighthouse the values of "First Contentful Paint", "Time to Interactive", "Speed Index", and "Largest Contentful Paint" are always 1.8 seconds for every test (like a fake number), which is far too high compared with the actual load time.
The "Avoid long main-thread tasks" audit also shows times that are very large compared with the load time.
Second, I have kept the number of resources to a minimum and tried to minimize their size as much as possible, but Lighthouse still shows suggestions for this.
How can I pass these audits?
After reading Graham's comment, I have some answers already.
However, even after I removed all the JS code and all the images, "First Contentful Paint" only dropped by 0.1s and the score is still 99. I'm not sure the Google team considers a website containing only a single line of text typical of modern websites. And of course, PWAs are not the standard yet, and even with a PWA we still have to load the "full state" as well.
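A likely explanation for the fixed ~1.8s figures is that Lighthouse applies simulated throttling by default (a slow 4G connection and a CPU slowdown), so the metrics describe a simulated mid-range mobile device rather than your real ~200ms load. As a quick check, here is a minimal sketch, assuming Node with the `lighthouse` and `chrome-launcher` packages, that runs the performance audit with throttling turned off:

```typescript
// Minimal sketch: run Lighthouse with throttling disabled to see whether
// the constant 1.8s figures come from the default simulated throttling.
import lighthouse from 'lighthouse';
import * as chromeLauncher from 'chrome-launcher';

async function auditUnthrottled(url: string) {
  const chrome = await chromeLauncher.launch({ chromeFlags: ['--headless'] });
  const result = await lighthouse(url, {
    port: chrome.port,
    onlyCategories: ['performance'],
    throttlingMethod: 'provided', // use real network/CPU conditions, no simulation
  });
  console.log('FCP:', result?.lhr.audits['first-contentful-paint'].displayValue);
  await chrome.kill();
}

auditUnthrottled('https://example.com'); // replace with the page under test
```

If the unthrottled First Contentful Paint comes out close to your observed load time, the default numbers aren't fake; they are an estimate of how the page behaves on a throttled mobile device.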

Query high-frequency Firebase data at a lower frequency

We're currently logging measurements to a Firebase database every 3 seconds.
But I want to graph the data over different periods. Sometimes that's 5 minutes, in which case the 3-second resolution is fine (~100 points). However, if I want to see how it changes over 12 hours at the 3-second resolution, I'll have 14,400 points.
For the longer time periods, I want to drop the resolution to reduce the number of data points.
As we're using Firebase, there's no backend to query the database and then filter the data; the UI queries the DB directly, so the filtering has to happen in the query.
Are there any standard methodologies for handling this? (Or common names for it that I can search on?)
Does anyone know a Firebase-specific solution that works at query time?
Is it best to record whether a point is 1 min, 5 min, 10 min, or 1 hr data when it is first saved? (This is my less preferred solution, as the data is being sent to Firebase from a small ESP8266 microcontroller without a huge amount of memory.)
Many thanks in advance
In the end I went with a variant on option three: pushing the data into three different locations depending on the time interval (3 sec, 1 min, 10 min).
This means I'm not sending up any extra information, just sending it to different locations.
While queries are limited to those fixed intervals, I can simply query whichever of the three locations matches the resolution I need.
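For reference, here is a minimal sketch of the write side of that approach using the Firebase JS SDK (v9 modular API); the `readings/...` paths and field names are placeholders, and on the actual ESP8266 the equivalent writes would go through Firebase's REST interface:

```typescript
// Minimal sketch: fan each sample out to per-resolution locations so the UI
// can later query only the coarseness it needs for a given graph window.
import { initializeApp } from 'firebase/app';
import { getDatabase, ref, set, serverTimestamp } from 'firebase/database';

const app = initializeApp({ databaseURL: 'https://your-project.firebaseio.com' });
const db = getDatabase(app);

let lastMinuteWrite = 0;
let lastTenMinuteWrite = 0;

async function logMeasurement(value: number) {
  const now = Date.now();
  // Every 3-second sample goes to the high-resolution location.
  await set(ref(db, `readings/3sec/${now}`), { value, ts: serverTimestamp() });
  // Down-sample: only write to the coarser locations when enough time has passed.
  if (now - lastMinuteWrite >= 60_000) {
    lastMinuteWrite = now;
    await set(ref(db, `readings/1min/${now}`), { value, ts: serverTimestamp() });
  }
  if (now - lastTenMinuteWrite >= 600_000) {
    lastTenMinuteWrite = now;
    await set(ref(db, `readings/10min/${now}`), { value, ts: serverTimestamp() });
  }
}
```

On the read side, the UI then queries whichever location matches the window, e.g. `query(ref(db, 'readings/1min'), orderByKey(), limitToLast(720))` for a 12-hour graph at one-minute resolution.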

Is it possible for users to interact without querying the backend?

I am essentially building an app that enables people to score a given image. This scoring happens over an hour, during which the UI label showing the average score is updated after each user scores. Is there a way to enable this interactive space where people can score, see what others have scored, and have the label update accordingly, and then save the result to Parse when the time is up, as opposed to constantly saving data and querying the backend? That seems unnecessarily clunky. Thanks.
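A minimal sketch of the "save once at the end" idea described above, assuming the Parse JS SDK; the `ImageScore` class and its fields are hypothetical, and note that showing users each other's live scores would still require some shared channel or backend:

```typescript
// Minimal sketch: keep the running average in memory while the hour runs,
// update the UI label locally, and write a single record to Parse at the end.
import Parse from 'parse';

Parse.initialize('APP_ID', 'JS_KEY'); // placeholder credentials
Parse.serverURL = 'https://your-parse-server.example.com/parse'; // placeholder URL

let sum = 0;
let count = 0;

function recordScore(score: number) {
  sum += score;
  count += 1;
  updateLabel(sum / count); // refresh the on-screen average immediately
}

function updateLabel(average: number) {
  console.log(`Average so far: ${average.toFixed(2)}`);
}

// When the hour is up, persist the final aggregate in one request.
async function saveFinalAverage(imageId: string) {
  const ImageScore = Parse.Object.extend('ImageScore'); // hypothetical class
  const row = new ImageScore();
  row.set({ imageId, average: sum / count, votes: count });
  await row.save();
}
```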

SiteCatalyst conversion rate as trendline

I want to analyze data for an account signup process on my web site. I'm using SiteCatalyst for web tracking.
I've set up a Fallout report with the 3 pages that make up the signup process. In this report I can see how many visitors I'm losing at every step, plus a "Total Conversion" and "Total Fallout" at the end.
I would like to plot the Total Conversion as a trendline over time. Is that possible?
I would also like to plot another version of conversion, where the rate is the number of visits to the last page of the signup process divided by the total number of visits to my site.
Thanks
Mike M
Yes, you can. You need to trend a success event or a calculated metric, though, so make sure you've got an event being set (or a metric being calculated) when the conversion occurs. You can then trend that metric over time. Your second version of conversion can be built as a calculated metric that divides visits to the last signup page by total site visits.
Tim
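To make this concrete, here is a minimal sketch of setting a success event on the final signup page using the classic SiteCatalyst/AppMeasurement tracking object; `event1` is a placeholder for whichever event number is reserved for signup completion:

```typescript
// Minimal sketch: fire a success event on the signup confirmation page so
// the conversion can be trended (or used in a calculated metric) over time.
declare const s: { events: string; t: () => void }; // AppMeasurement object already on the page

s.events = 'event1'; // placeholder success event for "signup complete"
s.t();               // send the page view with the event attached
```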

Tracking impressions/visits per web page

I have a site with several pages for each company, and I want to show how their pages are performing in terms of the number of people visiting each profile.
We have already made sure that bots are excluded.
Currently, we record each hit in a DB with either an insert (for the first request to a profile in a day) or an update (for subsequent requests to that profile the same day). But given that requests have grown from a few thousand per day to tens of thousands per day, these inserts/updates are causing major performance issues.
Assuming no JS solution, what will be the best way to handle this?
I am using Ruby on Rails, MySQL, Memcache, Apache, HaProxy for running overall show.
Any help will be much appreciated.
Thx
http://www.scribd.com/doc/49575/Scaling-Rails-Presentation-From-Scribd-Launch
You should start reading from slide 17.
I don't think performance is a problem here, given that it was possible to build a solution like this for a website as big as Scribd.
Here are 4 ways to address this, from easy estimates to complex and accurate:
Track only a percentage (10% or 1%) of users, then multiply to get an estimate of the count.
After the first 50 counts for a given page, start updating the count only 1/13th of the time, by an increment of 13. This helps when a few pages generate most of the counts, while keeping small counts exact. (Use 13 because it's hard to notice that the increment isn't 1.)
Keep exact counts in a cache layer like memcache or local server memory, and flush them to the database when they reach 10 counts or have been in the cache for a certain amount of time (see the sketch after this list).
Build a separate counting layer that 1) always has the current count available in memory, 2) persists the count to its own tables/database, and 3) provides calls that update both places.
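Here is a minimal sketch of the third approach; it's in TypeScript for consistency with the other examples rather than the Rails stack in the question, and `flushToDatabase` is a hypothetical persistence helper standing in for a batched SQL update:

```typescript
// Minimal sketch: buffer exact hit counts in memory and flush them in
// batches, instead of issuing one insert/update per request.
const FLUSH_THRESHOLD = 10;       // flush once a profile accumulates 10 hits
const FLUSH_INTERVAL_MS = 60_000; // sweep so low-traffic pages still persist

const counts = new Map<string, number>();

function recordHit(profileId: string) {
  const next = (counts.get(profileId) ?? 0) + 1;
  if (next >= FLUSH_THRESHOLD) {
    flushToDatabase(profileId, next); // one batched write instead of ten
    counts.delete(profileId);
  } else {
    counts.set(profileId, next);
  }
}

// Periodic sweep: persist whatever is left, however small.
setInterval(() => {
  for (const [profileId, n] of counts) {
    flushToDatabase(profileId, n);
  }
  counts.clear();
}, FLUSH_INTERVAL_MS);

function flushToDatabase(profileId: string, delta: number) {
  // Placeholder: in the real system this would be something like
  // "UPDATE profiles SET daily_hits = daily_hits + ? WHERE id = ?".
  console.log(`flushing ${delta} hits for profile ${profileId}`);
}
```

The trade-off is that a server crash loses at most the unflushed counts, which is usually acceptable for page-view statistics.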
