I want to analyze data for an account signup process on my web site. I'm using SiteCatalyst for web tracking.
I've set up a Fallout report with the three pages that make up the signup process. In this report I can see how many visitors I'm losing at each step, plus a "Total Conversion" and "Total Fallout" at the end.
I would like to plot the Total Conversion as a trendline over time. Is that possible?
I would also like to plot another version of Conversion, where the rate is the number of visits to the last page of the signup process divided by the total number of visits to my site.
Thanks
Mike M
Yes you can. You need to trend a success event or a calculated metric, though - so make sure you have an event being set or calculated when the conversion occurs. You can then trend that metric over time.
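Once you can export daily counts (from a success event on the final signup page, or from a Pages report), both versions of the conversion rate are simple ratios. A minimal sketch, with made-up page names and numbers:

```python
# Sketch: computing the two conversion rates per day from exported counts.
# The field names and numbers are hypothetical; in SiteCatalyst you would
# pull these from a success event or a Pages report export.
daily = [
    {"day": "day 1", "signup_complete_visits": 120, "signup_start_visits": 400, "site_visits": 5000},
    {"day": "day 2", "signup_complete_visits": 150, "signup_start_visits": 450, "site_visits": 5200},
]

for d in daily:
    # Total Conversion for the funnel: completions over funnel entries.
    funnel_conversion = d["signup_complete_visits"] / d["signup_start_visits"]
    # The second variant: completions over all visits to the site.
    site_conversion = d["signup_complete_visits"] / d["site_visits"]
    print(d["day"], round(funnel_conversion, 3), round(site_conversion, 3))
```

Trending either series over time is then just plotting one number per day.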
Tim
I have reached the Daily Quota Limit, and have submitted the Quota Increase Form.
I saw the confirmation notice for my submission, but I have not heard back or received an email from them since.
Is there any other solution to this issue? How long does it usually take for them to get back?
All things considered, we may have to increase the daily quota to 100,000.
Is there a way to retrieve data for multiple videos in a single request, to save quota?
My website mainly involves collecting view counts of videos through video IDs.
I have submitted the YouTube API Services - Audit and Quota Extension Form.
Thank you in advance
The time to get the quota increase varies greatly; it depends on how backlogged the team is.
In the beginning, when they reduced the default to 10k and I applied for mine, it took more than three months.
These days I think you should get something in less than two weeks, but don't hold me to that - I don't work for YouTube, this is just my experience.
Oh, and check now and then: they may apply the increase before they actually send you an email saying they are going to apply it.
I'm working on boosting my website's performance, and I'm testing my page speed with Google PSI. I've made every correction I can and I'm getting a 90+ score in PSI. All lab data attributes are green, but the field data still hasn't updated. I'm wondering how long Google PageSpeed Insights takes to update the field data. Also, once the field data does update, will it be the same as the lab data?
PageSpeed Insights screenshot
The data displayed is aggregated daily, so the data should change day to day.
However, the reason you do not see the results of your improvements instantly is that the data is taken over a rolling 28-day period.
After about 7 days you should start to see your score improve; after 28 days the report data will fully reflect the changes you made today.
Because of this delay I would recommend taking your own metrics using the Web Vitals Library so you can see the results in real time. This also lets you verify that the changes you made work at all screen sizes and across browsers.
Either that or you could query the CrUX data set yourself on shorter timescales.
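To see why the reported field value moves gradually rather than jumping, here is a toy simulation (all numbers made up) of a metric averaged over a trailing 28-day window after a fix ships:

```python
# Toy illustration of the rolling 28-day window described above.
# Suppose LCP was 4.0s before a fix and 2.0s after; the reported value
# averages the trailing 28 days, so it improves gradually, not instantly.
before, after, window = 4.0, 2.0, 28

def reported_lcp(days_since_fix):
    # Days of "new" data inside the window, capped at the window size.
    new_days = min(days_since_fix, window)
    old_days = window - new_days
    return (old_days * before + new_days * after) / window

for day in (0, 7, 14, 28):
    print(day, reported_lcp(day))
```

After 7 days the reported value is only a quarter of the way to the new number; only at day 28 does the window contain purely post-fix data.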
What metrics are generally collected as part of performance testing of cloud applications?
To be more specific, are there any industry-standard values for the following metrics?
Maximum allowed API response time - How fast my back end API returns results
User perception - What is the maximum acceptable time it should take to load a page completely for the user in the browser.
To give more context, my application is a cloud based real time application, where latency hits customer satisfaction.
The best metric to measure User Perceived Performance is Speed Index.
You can read the detailed explanation of how it's calculated here:
https://sites.google.com/a/webpagetest.org/docs/using-webpagetest/metrics/speed-index
TLDR; "The Speed Index is the average time at which visible parts of the page are displayed. It is expressed in milliseconds and dependent on size of the view port."
Time to first paint, time to DOM content ready, etc, are all fundamentally flawed in that they measure a single point and do not convey the actual user experience.
Speed Index actually measures the speed that most of the UI is shown to the user which will give you a good measurement for how fast your users perceive your page to load.
You can use the webpagetest.org tool to run web performance tests on your app (if it's publicly accessible), choosing from a wide range of configuration parameters (location, network speed and latency, browser, viewport, etc.).
It can even create a video of your web app loading!
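The definition quoted above reduces to a simple integral: Speed Index is the area under the "fraction of the viewport not yet painted" curve over time. A minimal sketch, with a made-up visual-completeness timeline (real tools derive it from video frames):

```python
# Sketch of the Speed Index calculation: integrate, over time, the
# fraction of the viewport that is NOT yet visually complete.
# The (time_ms, visual_completeness) samples are hypothetical.
samples = [(0, 0.0), (500, 0.4), (1000, 0.8), (2000, 1.0)]

def speed_index(samples):
    si = 0.0
    for (t0, vc0), (t1, _) in zip(samples, samples[1:]):
        # Treat completeness as constant between samples (step function).
        si += (t1 - t0) * (1 - vc0)
    return si

print(speed_index(samples))  # in milliseconds; lower is better
```

This is why Speed Index rewards pages that paint most of their content early, while single-point metrics like time-to-first-paint cannot distinguish that.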
I've been using Google Analytics very successfully for measuring apps. We learn a lot from the data. There's just one thing I can't find any information about, and that's the 'time on site' metric. It does measure something, but in my opinion the time is often too long for average usage.
Can anyone explain what this metric measures? Is it the time:
from opening app till going to background
from opening app till really terminating
Something else?
Many thanks in advance!
Greetings from Holland,
Sonja Kamst
Time on Site: Time on site is one way of measuring visit quality. If visitors spend a long time visiting your site, they may be interacting extensively with it. However, Time on site can be misleading because visitors often leave browser windows open when they are not actually viewing or using your site.
The average duration of visits (sessions) for the selected time frame. Session time is calculated by adding up time on page for each page in the session except for the last page in the session. The average time on site is determined by dividing the total time on site by the number of sessions for the selected time frame.
Calculations do not include the amount of time that visitors spend on the last page in the session, because there is no way to determine how long the visitor spent on the last page.
To illustrate, assume there are 3 visits (sessions) for the day, and you want to know the average visit duration for that day. Let's assume the three visits look like this:
Page 1 (10:00 a.m.) --> Page 2 (10:05) --> Page 3 (10:06) --> Exit
Page 1 (9:00 a.m.) --> Page 2 (9:01) --> Page 3 (9:06) --> Exit
Page 1 (2:00 p.m.) --> Page 2 (2:15) --> Page 3 (2:16) --> Exit
In each of these sessions, it is possible to use the time stamps on the subsequent pages to calculate the time spent on pages 1 and 2 (5 + 1, 1 + 5, and 15 + 1 minutes respectively). However, it is not possible to calculate how much time the visitors spent on the last page in the session, because there is no data available to Analytics that indicates when the visitor left.
In this example, the calculations would be:
Total time on site: 28 minutes
Average time on site (28/3): 9:20
from http://support.google.com/analytics/bin/answer.py?hl=en&answer=1006253
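A quick way to check that arithmetic yourself from the timestamps: since time on the last page is unknowable, a session's measurable duration is simply its last recorded page view minus its first.

```python
# The session-duration arithmetic described above, as a sketch.
# Each session is a list of page-view timestamps in minutes-of-day;
# only the gaps between recorded views count, so the measurable
# duration is last timestamp minus first.
sessions = [
    [10 * 60, 10 * 60 + 5, 10 * 60 + 6],    # 10:00, 10:05, 10:06
    [9 * 60, 9 * 60 + 1, 9 * 60 + 6],       # 9:00, 9:01, 9:06
    [14 * 60, 14 * 60 + 15, 14 * 60 + 16],  # 2:00, 2:15, 2:16
]

total = sum(s[-1] - s[0] for s in sessions)
average = total / len(sessions)
print(total, average)
```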
The time it measures is from opening the app to actually terminating it. So, for instance, if the app is kept in a background state and you haven't included logic to end the session, your time-in-app numbers will be artificially inflated if/when it's brought back into the foreground.
I can't speak to the specifics of monitoring an application state, but there is some info on time on site issues for iOS here on SO:
Why are my iOS app's session lengths 30 min + in Google Analytics?
I have a site with several pages for each company, and I want to show how each page is performing in terms of the number of people visiting that profile.
We have already made sure that bots are excluded.
Currently, we are recording each hit in a DB with either an insert (for the first request of the day to a profile) or an update (for subsequent requests that day). But, given that requests have gone from a few thousand per day to tens of thousands per day, these inserts/updates are causing major performance issues.
Assuming no JS solution, what will be the best way to handle this?
I am using Ruby on Rails, MySQL, Memcache, Apache, HaProxy for running overall show.
Any help will be much appreciated.
Thx
http://www.scribd.com/doc/49575/Scaling-Rails-Presentation-From-Scribd-Launch
You should start reading from slide 17.
I don't think performance is a problem, if it's possible to build a solution like this for a website as big as Scribd.
Here are 4 ways to address this, from easy estimates to complex and accurate:
Track only a percentage (10% or 1%) of users, then multiply to get an estimate of the count.
After the first 50 counts for a given page, update the count only 1/13th of the time, incrementing by 13. This helps when a few pages account for most of the counts, while keeping small counts accurate. (Use 13 because it's hard to notice that the increment isn't 1.)
Save exact counts in a cache layer like memcache or local server memory, and flush them to disk when they hit 10 counts or have been in the cache for a certain amount of time.
Build a separate counting layer that 1) always has the current count available in memory, 2) persists the count to its own tables/database, and 3) has calls that adjust both places.
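Option 3 (buffer in cache, flush periodically) can be sketched in a few lines. This is a toy in-process version; `persist` is a stand-in for the real DB write (e.g. a single `UPDATE ... SET hits = hits + n`), and in production the buffer would live in memcache so all app servers share it:

```python
# Sketch of the buffered-counter idea: accumulate hits in memory and
# write to the database only once per FLUSH_EVERY hits per page.
from collections import defaultdict

FLUSH_EVERY = 10
buffer = defaultdict(int)     # in-memory (or memcache) counts
persisted = defaultdict(int)  # placeholder for the DB table

def persist(page_id, n):
    # Real code: one UPDATE ... SET hits = hits + n
    persisted[page_id] += n

def record_hit(page_id):
    buffer[page_id] += 1
    if buffer[page_id] >= FLUSH_EVERY:
        persist(page_id, buffer.pop(page_id))

for _ in range(25):
    record_hit("company-42")
print(persisted["company-42"], buffer["company-42"])
```

This turns ten inserts/updates into one, at the cost of counts lagging by up to FLUSH_EVERY hits (a time-based flush covers low-traffic pages).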