Adobe SiteCatalyst capturing success with events vs PageViews - adobe-analytics

On our website, which uses Adobe Analytics, we have functionality where a user submits a service request. When the user submits the request, the pageName we send is "myapp:service-request-submitted". We are also capturing this in an event, say event6.
As per my understanding, each time the user submits a service request, the "Occurrences" or "Page Views" metric should increase by 1, and the same should happen with the event count. Is this assumption correct?
Unfortunately, when I checked the report suite, there is a big difference between these numbers: Page Views is much higher (about 30% more) than the event6 count.
Is there any logical reason for this behavior, or is my understanding of how event and page view counts are calculated incorrect?

Your understanding is correct... if every time that pageName gets sent on a beacon, you are also setting event6, then I would expect the numbers to match.
Some things that may be happening here:
Does event6 have any sort of serialization turned on? In the Adobe Analytics Admin Console, you can specify if an event should "always be captured" (the default option), count "once per visit", or count "once per unique ID" (so you could have it tied to a confirmation number). I'd check on that.
Is there a chance that pageName is getting set on beacons without that event6 set? Are there other beacons firing on that page (beyond just the pageload)?
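For reference, here is a minimal sketch of what the submission beacon would look like with the two lined up, assuming a standard client-side AppMeasurement s object (the loose TypeScript declaration below exists only so the sketch stands alone):

// Loose declaration of the AppMeasurement s object, just so the sketch compiles on its own.
declare const s: { pageName: string; events: string; t: () => void };

s.pageName = "myapp:service-request-submitted";
s.events = "event6"; // set on the SAME beacon that carries this pageName
s.t();               // every s.t() call counts as a page view

// If any other code path fires a beacon with this pageName but without event6
// (a refresh handler, a second beacon on the same page, etc.), Page Views will
// keep climbing while the event6 count does not.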

Related

ActionCable subscriber does not get an update

Consider this scenario in the app:
A user submits an answer and is then taken to a page where they are subscribed to an ActionCable channel.
Every time a user submits an answer, a broadcast also happens so that subscribed users are updated with the new user and their answer.
Problem: when two users submit an answer at almost the same time (within 1 second),
one user ends up seeing two answers, theirs and the other user's;
the other user ends up seeing only their own answer.
Question:
Is there a way to make it so that both players see each other's answers?
I've been searching for solutions but have not found any. I may have stumbled upon the cause of the issue on rubyonrails.org:
If you're not streaming a broadcasting at the very moment it sends out an update, you will not get that update, even if you connect after it has been sent.
Is there a way to make it so that both players see each other's answers?
Load the answers from the database on page load, and only use broadcasts to "update" what gets seen on-screen in real-time.
If an update is broadcast for data that is already on-screen, either do nothing or use it to replace the answer already shown.
Subscribing to or listening to WebSocket messages is like tuning into a radio station. You have no idea what may have broadcasted before you started listening, so you'll need to get that data from elsewhere.
Alternatively, if you'd prefer to start with a blank slate, you could have the client-side code make an API call (or ActionCable request) after it is subscribed to fetch all the relevant data it needs, and then use broadcasts to "update" that data.
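Here is a minimal client-side sketch of that pattern in TypeScript, using the @rails/actioncable package. The channel name AnswersChannel, the payload shape, and the #answers list element are assumptions for illustration. The page renders the existing answers from the database on load, and the received callback only appends or replaces, so a broadcast for an answer already on-screen cannot create a duplicate.

import { createConsumer } from "@rails/actioncable";

// Hypothetical payload shape; match it to whatever your broadcast actually sends.
interface AnswerPayload {
  id: number;
  user: string;
  body: string;
}

const consumer = createConsumer();

consumer.subscriptions.create("AnswersChannel", {
  received(data: AnswerPayload) {
    // Answers rendered from the database on page load carry their ids, so a
    // broadcast for an answer we already show replaces it instead of duplicating it.
    const existing = document.querySelector<HTMLElement>(`[data-answer-id="${data.id}"]`);
    if (existing) {
      existing.textContent = `${data.user}: ${data.body}`;
      return;
    }
    const item = document.createElement("li");
    item.dataset.answerId = String(data.id);
    item.textContent = `${data.user}: ${data.body}`;
    document.getElementById("answers")?.appendChild(item);
  },
});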

Adobe Analytics - PurchaseID set with Timestamp Issue

We have a lot of transactions on the site, so we are recycling our booking confirmation numbers / order ID numbers, which are set into our purchaseID on the confirmation screen. Since we are reusing booking confirmation numbers, we append a timestamp to the purchaseID variable with a pipe delimiter to make it unique. The formula looks like:
purchaseID = order_id + '|' + timestamp (current date).
My concern is this: let's say I make a booking today and my purchaseID looks like
purchaseID = 5747118 | 6-7-2019
Now I access my confirmation screen again tomorrow, then after 2 days, 3 days, and so on, and I see Adobe calls firing. Because I accessed my confirmation page on different dates, the timestamp changes, so a new purchaseID is generated even though I am viewing the same booking confirmation page. Does this mean that every time I view my confirmation screen on a different day, my booking/revenue is counted again? If yes, what's the best way to tackle this issue?
So it sounds like someone can go to your site, make a purchase and see the confirmation page, and then later on, go back to the same confirmation page without actually making another purchase. Maybe they bookmarked the page and come back to it later for reference. Or maybe they refreshed the page, because reasons.
Does your site charge their credit card for accessing the page again? I sure hope not. Your site/coding should be structured in a way that does not keep charging the customer more money every time they view the page again.
And your code logic for outputting Adobe Analytics should be structured the same way: only output the purchase event and related variables (e.g. purchaseID) when a purchase actually occurs.
In practice, this sometimes isn't easy to do because of how the site is structured. Part of why purchaseID exists is to de-duplicate purchases, so that if the purchase event and data are re-popped, they get de-duped. But it only works if you output the same purchaseID when the visitor refreshes the page or otherwise comes back to it later on (where they aren't actually making another purchase).
Which it sounds like you were doing with the original booking confirmation number you pushed to purchaseID. But things went south when you decided to throw a current datestamp into the mix because you started recycling booking confirmation numbers. Well, you can't do that. You can use a dynamic value such as the current date/timestamp as part of the value, but you must remember it and output that same value in the future.
Maybe this involves adding an extra column to your database with the date/timestamp of purchase (which I have to assume you surely already have), and then pulling that value when you pull the booking confirmation number.
Or maybe the solution involves stepping back and rethinking the fact you are recycling booking confirmation numbers. This seems like a bad idea to me. It's definitely a bad idea for your Adobe Analytics implementation, as you have seen for yourself. But is this not a bad idea in general? What happens if a customer buys something today and has # 12345 as proof of transaction to reference, and then tomorrow, a week, a year or whatever from now, some other customer gets the same number?
It stands to reason that you will end up with a mess on your hands, trying to sort out which customer bought what. Transaction IDs by their nature are supposed to be forever unique to the transaction. So my very first recommended solution to you would be to stop recycling your booking confirmation numbers. Move to a different format if you need to (e.g. UUID).
Failing that, my next recommendation would be what I said a couple of paragraphs up: store the date/timestamp in a column at the actual time of purchase (which surely you already have), and then grab that stored value and use it along with the booking confirmation # as the delimited value, instead of generating the current date on the fly (which absolutely does not work).
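A minimal sketch of that second approach, assuming the order record and the field names below (confirmationNumber, purchasedAt) come from your own backend. The key point is that purchasedAt is stored once at the actual time of purchase and reused on every later render, never generated on the fly:

// Loose declaration of the AppMeasurement s object, just so the sketch stands alone.
declare const s: { events: string; purchaseID: string; t: () => void };

interface Order {
  confirmationNumber: string; // the (recycled) booking confirmation number
  purchasedAt: string;        // e.g. "6-7-2019", saved once when the order was actually placed
}

function trackConfirmation(order: Order): void {
  s.events = "purchase";
  // The same two stored inputs are used every time this page is viewed, so a
  // refresh or return visit re-sends an identical purchaseID and Adobe can de-dupe it.
  s.purchaseID = `${order.confirmationNumber}|${order.purchasedAt}`;
  s.t();
}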

Revulytics data not showing in Dashboard

I am using Revulytics SDK to track feature usage and came across the below problem.
I am sending feature usage after properly setting up the SDK configuration, etc., using the EventTrack() method like this:
GenericReturn grTest = telemetryObj.EventTrack("FeatureUsage", textBoxName.Text.ToString(), null, false);
This returns OK, and usually I can see the usage data in the dashboard. However, after multiple tests, the data I am sending does not show up on the dashboard.
Can anyone give me a hint on how to debug this? Thanks for any help!
I hit a similar issue when first working with this SDK.
I was able to address this as soon as I understood the following:
There are event quotas for the incoming events;
Event names are used for making the distinction.
So when I was sending dummy test data, it made it there, but when I sent some demo data for stakeholders, it was not showing up.
I think the same is happening here. You're getting the event name from textBoxName.Text... pretty sure that varies every time you run the code.
Here are the things to keep in mind when testing your code:
the server has a mechanism to discard / consider events;
implicitly, it allows the first xx events, depending on the quota;
if you are sending more than xx events, they will not show up in reports.
So, you must control which events to discard and which to consider (there are a couple of levels you can configure, and based on them you can get the events into various types of reports).
Find the "Tracked Events Whitelist Management" page. You will be able to control these things from there.
This blog helped me (it is not SDK documentation): https://www.revulytics.com/blog/getting-started-with-usage-intelligence-part2-event-tracking
Good luck!

Twitter - public Stream handling deletion notices

I am using the Twitter public stream API to search for some keywords. I am writing my script in Java, and therefore I use twitter4j. Now I have stumbled upon this information about status deletion notices:
Status deletion notices (delete)
These messages indicate that a given Tweet has been deleted. Client code must honor these messages by clearing the referenced Tweet from memory and any storage or archive, even in the rare case where a deletion message arrives earlier in the stream than the Tweet it references.
https://dev.twitter.com/docs/streaming-apis/messages#Status_deletion_notices_delete
So I created methods to remove records from my database when such a notice occurs. Unfortunately, such a notice never arrives. I searched to find out what I am doing wrong and found some posts in the Twitter developer section concerning the same problem:
https://dev.twitter.com/discussions/17393
https://dev.twitter.com/discussions/19943
https://dev.twitter.com/issues/1355
https://dev.twitter.com/discussions/12836
but unfortunately none of these discussions got an answer. So it seems to me that I made no mistake in my code, but twitter4j never sends me a deletion notice.
I want to respect the privacy of Twitter users - at least for legal reasons. So my questions are:
What can I do to respect the privacy of the users?
What do I have to do to satisfy my legal duties?
One alternative seems to be to periodically iterate through all the saved Tweets in my database and request them from Twitter to see whether I still get a result back (if not, they were deleted). But this doesn't seem practicable, because the data will keep growing and at some point I will hit limitations (time, allowed Twitter requests, ...). So what should I do?
Thanks in advance! Your help is greatly appreciated.
Ludwig
twitter4j v.3.0.6
Given the volume of tweets, it's unreasonable to expect you to check whether every saved tweet is still there. You should make sure that you properly act on a delete notice from Twitter; the onus is on them to actually send the delete notification.
That being said, I do receive delete notifications from Twitter. However, we aren't using the public stream; we are using Site Streams, which relies on authorizing specific social accounts and streaming all updates for those accounts (e.g. favorites, follows, blocks, tweets, retweets, etc.) to us in real time.
If you are doing a stream with filters, for example, it's probably not feasible (or at least very taxing) for Twitter to run all deleted items through the same filtering pipeline as new items, or to guess which tweets you were sent based on the times your filter was running.
As noted in the issue you linked to, the public streaming API will not necessarily send them out. I'd endeavor to handle them, and possibly provide a tool to manually remove tweets if a request comes in through another channel, but not worry too much about it, given that Twitter doesn't provide a proper facility for being notified of such instances.

Eventbrite API: What does event_list_attendees modified_after compare to?

I looked through the existing questions and found one that was somewhat related, but it wasn't the same.
If you use the event_list_attendees API call, you get back a list of attendees. Those attendees have a modified field. One of the possible parameters of the API call is modified_after.
My question is about what triggers the modified field to update. Is this a user-profile-related field, or is it related to this particular event ticket purchase? The API describes the two as follows:
modified_after: Return only attendees whose "modified" value is equal to or after this date/time (e.g., "2013-01-28 00:00:00").
modified: The date and time the event was last modified, in ISO 8601 format (e.g., "2007-12-31 23:59:59").
Perhaps I should explain why I am wondering what triggers modified to update. The goal is to create a small, one-day-use mobile website that will allow users to see who has shown up so far for a local event I am working with. I know the API does not directly support this functionality; in my case, however, "close enough" is "good enough". If someone's ticket being scanned at the door triggers the modified field, that would be sufficient.
So, does it?
Great question!
The modified attribute relates only to the individual attendee in the order, so it won't be triggered by account-wide profile changes for that user. However, if a user logs in to Eventbrite and changes information that specifically relates to this event (for example, they correct the spelling of their last name for this specific order), the modified value will update.
Alternatively, you can actually use /event_list_attendees and set "display_full_barcodes" to "true" to see the status of the barcodes. When the barcode is used, you'll know that someone has been scanned in.
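For example, here is a rough polling sketch in TypeScript. The base URL and the app_key / user_key parameters reflect the old v1-style JSON endpoint and are assumptions to verify against the docs for whichever API version you're on; display_full_barcodes=true is the flag mentioned above.

const EVENT_ID = "YOUR_EVENT_ID";
const APP_KEY = "YOUR_APP_KEY";
const USER_KEY = "YOUR_USER_KEY";

async function fetchAttendeesWithBarcodes(): Promise<unknown> {
  const url =
    "https://www.eventbrite.com/json/event_list_attendees" +
    `?app_key=${APP_KEY}&user_key=${USER_KEY}&id=${EVENT_ID}` +
    "&display_full_barcodes=true";

  const response = await fetch(url);
  if (!response.ok) {
    throw new Error(`event_list_attendees failed: ${response.status}`);
  }
  // Inspect each attendee's barcode status in the returned JSON: a barcode that is
  // marked as used means that ticket has been scanned at the door.
  return response.json();
}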
If you come up with a cool hack, then we'd love to check it out!
Hope that helps!
