I am using the Google Analytics add-on for Sheets to transfer data from Google Analytics to Sheets.
I want to transfer user retention cohort data, but I am unable to figure out the right metrics and dimensions for it.
I tried the metric ga:cohortRetentionRate
and the dimensions ga:cohort, ga:cohortNthDay, and some others.
But I keep getting the error "Selected dimensions and metrics cannot be queried together", even though I checked in the dev tools that they can be queried together.
Would really appreciate some help!
It is possible to compose a cohort request by using the Request Composer tool, on the Cohort Request tab.
Programmatically, to make use of these new dimensions and metrics you must construct a V4 cohort request: https://developers.google.com/analytics/devguides/reporting/core/v4/advanced#cohorts
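As a minimal sketch of what such a V4 request body looks like (the view ID is a placeholder, and note that cohort requests define their date ranges inside each cohort rather than at the top level, which is a common source of the "cannot be queried together" error):

```python
import json

# Sketch of a Reporting API v4 cohort request body. "XXXX" is a
# placeholder view ID; the field names follow the v4 batchGet schema.
def build_cohort_request(view_id):
    return {
        "reportRequests": [{
            "viewId": view_id,
            # Cohort-specific dimensions and metrics
            "dimensions": [{"name": "ga:cohort"},
                           {"name": "ga:cohortNthDay"}],
            "metrics": [{"expression": "ga:cohortActiveUsers"},
                        {"expression": "ga:cohortRetentionRate"}],
            # Date ranges live inside each cohort, not at the top level
            "cohortGroup": {
                "cohorts": [{
                    "name": "cohort 1",
                    "type": "FIRST_VISIT_DATE",
                    "dateRange": {"startDate": "2015-08-01",
                                  "endDate": "2015-08-01"},
                }]
            },
        }]
    }

print(json.dumps(build_cohort_request("XXXX"), indent=2))
```

The Sheets add-on uses the older v3 API, which is why these cohort dimensions and metrics fail there even though the Dimensions & Metrics Explorer lists them.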
I need help: I integrated Firebase Analytics in my Flutter project. I wrote some custom events and want to work with the data from these events, which is why I created metrics and dimensions, but it doesn't work because I hit data thresholds and get this message:
Data in a report or exploration may be withheld when Google signals is enabled and you have a low user count in the specified date range.
So, my question is: how can I turn off this data threshold?
My app currently has only 5-10 users, and I understand that is not enough for analytics, but I read the docs and found that it is possible to turn off Google signals so that everything should work. Can someone help me with it?
By default, Google Analytics uses a "User-ID" for tracking users across devices and sessions, which must be provided by the developer. If you are not tracking users by ID, however, it appears (based on experience) that events will always carry a warning triangle with "thresholding applied", even with a large event count over a long time, say 50k events over a year.
One solution is to switch to device-only reporting, which is not subject to thresholding. Of course, if you do track users with a persistent ID, this will not apply to you; in that case you may want to check that the user ID is being set properly, so that Google Analytics can determine that the reporting thresholds are met with enough users.
If device-only reporting is an option for you, you can change this setting like so:
Log in to Google Analytics
Go to Admin in the lower left corner with a gear icon (not the Configure menu!)
Select your "Property" in the top right area, meaning the project for which you want to make this change. You may only have one which is already selected.
Select Reporting Identity
Select Show all, if visible
Select Device-based
Save your change
Google also provides some instructions on how to do this here.
Finally, I have seen this setting change from device-based to something else as Google changes how data is collected, so you may need to revisit this setting and re-confirm device-based.
There is no way to change the thresholds that Google Analytics uses in its reports.
Your only option to get access to this data is to enable the integration between Analytics and BigQuery in your Firebase project, and then perform your own reporting on the BigQuery data.
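Once the BigQuery export is enabled, your own reporting runs against the daily `events_*` tables. A minimal sketch of building such a query (the project/dataset name is a placeholder; the column names follow the standard Firebase Analytics export schema):

```python
# Sketch: build a query over the Firebase Analytics BigQuery export.
# "my-project.analytics_123456" is a placeholder dataset id; events_*
# are the standard daily export tables, filtered via _TABLE_SUFFIX.
def event_count_query(dataset, event_name, start, end):
    return (
        f"SELECT event_date, COUNT(*) AS events, "
        f"COUNT(DISTINCT user_pseudo_id) AS users "
        f"FROM `{dataset}.events_*` "
        f"WHERE event_name = '{event_name}' "
        f"AND _TABLE_SUFFIX BETWEEN '{start}' AND '{end}' "
        f"GROUP BY event_date ORDER BY event_date"
    )

print(event_count_query("my-project.analytics_123456",
                        "my_custom_event", "20230101", "20230131"))
```

Because you query the raw rows yourself, no Google-signals thresholding is applied to the results.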
I have an app that shows events in the Firebase console, and I have linked the project to BigQuery, but BigQuery only shows the firebase_crashlytics and firebase_messaging datasets. How can I get the event data into BigQuery?
Here are my events: (screenshot)
My dataset in BigQuery: (screenshot)
From the documentation: https://support.google.com/firebase/answer/6318765?hl=en
Once an app is linked to BigQuery, a corresponding dataset is created in the associated BigQuery project upon the first daily export of events. Each day, raw event data for each linked app populates a new table in the associated dataset, and raw event data is streamed into an intraday BigQuery table in real time. Data prior to linking to BigQuery is not available for import.
Apart from that:
check that the connection is not disabled;
if you linked recently, it may take at least one day before the data appears in BigQuery.
Finally, I solved the problem. I did the following:
Enable the integration with Google Analytics
Create a dummy app
Upgrade to the Blaze plan
Create a BigQuery dataset
Set up the export (enable the integration with BigQuery)
The full step-by-step guide that I followed:
Enable BigQuery Export
In my iOS app, users upload files. I am logging a custom event called "upload_time" because I would like to see approximately how long uploads take.
// Log one event per upload; val is the upload duration,
// rounded up to the nearest 10 seconds.
FIRAnalytics.logEvent(withName: "upload_time", parameters: [
    kFIRParameterItemID: "upload_time_\(Constants.versionNumber)",
    kFIRParameterItemName: val
])
I would like to be able to filter by the version number of the app and see the percentages of upload times. I have divided up times in 10s brackets so "val" is just rounded up to the nearest 10.
Just like how the select_content default event allows you to filter by content_type and then item_id, I would also like to be able to filter by version number and see the percentages for the different brackets of times in the console. At the moment, it seems that what I have set up is just adding up all the values for each day.
How I set up parameters in the console
Would greatly appreciate any help.
There's no way to configure ad hoc reports in the Firebase console.
If you want reports other than those provided in the console, then your best bet here would be to export the results to BigQuery and use a visualization tool.
Once you have these set up, the sky is the limit :)
Firebase Performance Monitoring sounds like a better fit if you're trying to measure upload times. Check out the getting started guide here. Performance Monitoring actually captures a bunch of network data automatically.
In addition, Performance Monitoring lets you filter by a number of parameters, such as device type, OS version, app version, and more. It's still in beta, so if there's some functionality that you'd like to have that isn't there yet, feel free to file a feature request.
Another way to make this work in Google Analytics for Firebase is to export the Firebase data to BigQuery and run a query that calculates the percentage of upload times from all your app instances, filtered by version.
Take a look at Step 6 of this doc for a sample query on BigQuery data gathered by Firebase.
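As a sketch of the aggregation such a query performs (the data here is hypothetical; the bracket values mirror the 10-second buckets logged as `val` above), the percentage per bracket for one app version is just a grouped count divided by the total:

```python
from collections import Counter

# Hypothetical exported rows: (app_version, upload_time_bracket_seconds)
rows = [("1.2", 10), ("1.2", 10), ("1.2", 20), ("1.2", 30), ("1.3", 10)]

def bracket_percentages(rows, version):
    """Percentage of uploads per 10s bracket for one app version."""
    brackets = Counter(b for v, b in rows if v == version)
    total = sum(brackets.values())
    return {b: 100 * n / total for b, n in sorted(brackets.items())}

print(bracket_percentages(rows, "1.2"))
# {10: 50.0, 20: 25.0, 30: 25.0}
```

In BigQuery the same computation would be a GROUP BY on the bracket parameter with a ratio over COUNT(*), filtered by the version parameter.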
In the context of Google Analytics, I wonder if I can get granular data for an account in the form of a table, or multiple tables that could be joined, containing all relevant information collected per user and then per session.
For each user there should be rows describing in detail the activities and outcomes, micro and macro, of each session. Features would include source, time of visit, duration of visit, pages visited, time per page, goal conversions, etc.
Having the raw data in granular form would enable me to apply machine learning algorithms to explore the data and optimize decisions (web design, budget allocation, bidding).
This is possible, however not by default. You will need to set up custom dimensions to be able to identify individual clients, sessions, and timestamps, so that you get row-level user data rather than pre-aggregated data. A good place to start is https://www.simoahava.com/analytics/improve-data-collection-with-four-custom-dimensions/
There is no way to collect all data per user in one simple query. You will need to run multiple queries, pivot tables, etc., and merges to get the full dataset you are currently envisaging.
Beyond the problem you currently have, there is also then the problem of downloading the data.
1) There is a 10,000 row limit, so you will need to make a loop to download all available rows.
2) Depending on your traffic, you are likely to encounter sampled data, so you will need to download the data per day or per hour to avoid Google Analytics sampling.
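The loop from point 1 can be sketched as follows (`fetch_page` is a hypothetical stand-in for the Core Reporting API call, which takes a 1-based start index and a page size):

```python
# Sketch of paging past the 10,000-row limit. `fetch_page` is a
# hypothetical stand-in for a Core Reporting API call that accepts
# start_index (1-based) and max_results and returns a list of rows.
PAGE_SIZE = 10_000

def download_all_rows(fetch_page):
    rows, start_index = [], 1
    while True:
        page = fetch_page(start_index=start_index, max_results=PAGE_SIZE)
        rows.extend(page)
        if len(page) < PAGE_SIZE:   # last (partial or empty) page
            return rows
        start_index += PAGE_SIZE

# Fake backend with 25,000 rows to demonstrate the loop.
data = list(range(25_000))
fake = lambda start_index, max_results: data[start_index - 1:start_index - 1 + max_results]
print(len(download_all_rows(fake)))  # 25000
```

Combined with point 2, you would run this loop once per day (or per hour) of your date range and concatenate the results.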
I've been playing around with the Twitter API using Twitter4j. I am trying to pull data given a keyword and date; an example of a query I would run using the REST API would be
bagels since:2014-12-27
which would give me all tweets containing the keyword 'bagels' since 2014-12-27.
This works in theory, but I quickly exceeded the rate limits, since each query returns at most 100 results and only 180 queries are allowed within a 15-minute interval. Many keywords return more than 18k results.
Is there a better way to pull large amounts of data from Twitter? I looked at the Streaming API but I don't know if I can pull data from a certain date range.
There are a few things you can do to improve your rates:
Make sure your count is maxed at 100, which it looks like you're doing.
Use Application-Only authorization - it increases your rate limit to 450.
Use the max_id and since_id parameters to page through data and avoid querying for results you've already received. See the Working with Timelines docs to see what I mean.
Consider using Gnip if you're willing to pay to remove rate limits.
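The max_id technique from point 3 boils down to: after each page, request only tweets strictly older than the oldest ID you received, so pages never overlap. A language-agnostic sketch of the logic (shown in Python rather than Twitter4j; `search` is a hypothetical stand-in for the REST search call, returning tweet IDs in descending order):

```python
# Sketch of max_id paging: each request asks only for tweets with
# id <= max_id, so no result is fetched twice. `search` is a
# hypothetical stand-in for the REST search call.
def collect_all(search, count=100):
    ids, max_id = [], None
    while True:
        page = search(count=count, max_id=max_id)
        if not page:
            return ids
        ids.extend(page)
        max_id = min(page) - 1   # next page: strictly older tweets

# Fake backend over tweet ids 250..1 to demonstrate the loop.
all_ids = list(range(250, 0, -1))
def fake_search(count, max_id):
    eligible = [i for i in all_ids if max_id is None or i <= max_id]
    return eligible[:count]

print(len(collect_all(fake_search)))  # 250
```

In Twitter4j the same idea maps to calling `Query.setMaxId` with the lowest ID from the previous `QueryResult` minus one before issuing the next search.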