Analytics events not being logged to Firebase - iOS

I'm attempting to log simple analytics events to Firebase, but nothing ever shows up online.
Here is how I'm logging the event:
FIRAnalytics.logEventWithName("spot_view", parameters: [
    "spot_name": spotName,
    "is_private": isPrivate
])
I have the runtime arguments enabled to see the Firebase debug output, and I get this:
<FIRAnalytics/DEBUG> Event logged. Event name, event params: spot_view, {
"_o" = app;
"is_private" = 1;
"spot_name" = TestLogSpotView;
}
So the event is being triggered. I also get this showing that data is actually being uploaded:
2016-06-09 12:12:13.567 [60279:] <FIRAnalytics/DEBUG> Measurement data sent to network. Timestamp (ms), data: 1465488733550, <ACPMeasurementBatch 0x7de7bb60>: {
bundles {
  protocol_version: 1
  events {
    params {
      name: "_c"
      int_value: 1
    }
    params {
      name: "_o"
      string_value: "auto"
    }
    name: "_f"
    timestamp_millis: 1465488710347
  }
  events {
    params {
      name: "_et"
      int_value: 1
    }
    params {
      name: "_o"
      string_value: "auto"
    }
    name: "_e"
    timestamp_millis: 1465488710347
  }
  events {
    params {
      name: "_o"
      string_value: "app"
    }
    params {
      name: "is_private"
      int_value: 1
    }
    params {
      name: "spot_name"
      string_value: "TestLogSpotView"
    }
    name: "spot_view"
    timestamp_millis: 1465488710411
  }
  events {
    params {
      name: "content_type"
      string_value: "cont"
    }
    params {
      name: "_o"
      string_value: "app"
    }
    params {
      name: "item_id"
      string_value: "1"
    }
    name: "select_content"
    timestamp_millis: 1465488710411
  }
  user_attributes {
    set_timestamp_millis: 1465488710347
    name: "_fot"
    int_value: 1465491600000
  }
  upload_timestamp_millis: 1465488733550
  start_timestamp_millis: 1465488710347
  end_timestamp_millis: 1465488710411
  platform: "ios"
  os_version: "9.3"
  device_model: "x86_64"
  user_default_language: "en-us"
  time_zone_offset_minutes: -240
  app_store: "manual_install"
  app_id: "——"
  app_version: "0.0.0"
  gmp_version: 3200
  uploading_gmp_version: 3200
  resettable_device_id: "———"
  limited_ad_tracking: false
  app_instance_id: "———"
  bundle_sequential_index: 1
  gmp_app_id: "———"
  firebase_instance_id: "———"
  app_version_major: 106
}
}
2016-06-09 12:12:13.568[60279:] <FIRAnalytics/DEBUG> Uploading data. Host: https://app-measurement.com/a
2016-06-09 12:12:13.595[60279:] <FIRAnalytics/DEBUG> Received SSL challenge for host. Host: https://app-measurement.com/a
2016-06-09 12:12:13.731[60279:] <FIRAnalytics/DEBUG> Successful upload. Got network response. Code, size: 204, 0
I ran this same code a couple of days ago and still nothing has shown up in Firebase.
I've also tried logging an event copied straight from Firebase which is this:
FIRAnalytics.logEventWithName(kFIREventSelectContent, parameters: [
    kFIRParameterContentType: "cont",
    kFIRParameterItemID: "1"
])
Which you can see in the log console output posted above.
The only other thing I could think of that might be wrong is in GoogleService-Info.plist. There is an entry for IS_ANALYTICS_ENABLED which is set to NO. I just flipped it to YES and am going to try again, although I don't believe this is the fix; I think this entry only applies to Google Analytics.
Also, other data such as device type and user sessions IS being logged, so it's only event logging that doesn't work.

This may be one reason for custom events not appearing: event names must contain only letters, numbers, or underscores.
My events were not logged because I was using a space in my event name.
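That rule can be checked up front before logging. A small sketch in Go (the leading-letter requirement and 40-character cap are Firebase's documented limits for event names; worth confirming against the current docs):

```go
package main

import (
	"fmt"
	"regexp"
)

// validEventName encodes the rule quoted above: letters, numbers, or
// underscores only. Firebase additionally documents that an event name must
// start with a letter and be at most 40 characters long.
var validEventName = regexp.MustCompile(`^[A-Za-z][A-Za-z0-9_]{0,39}$`)

func main() {
	fmt.Println(validEventName.MatchString("spot_view")) // true
	fmt.Println(validEventName.MatchString("spot view")) // false: contains a space
}
```

Running a check like this in a debug build catches silently-dropped events before they reach the SDK.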

One quick thing to check -- make sure your date range encompasses the date on which this event was logged. For example, if the event was logged today, you should change the date range to explicitly include Today. Ranges like "Last 30 Days" or "Last 7 Days" do not include Today.
If you don't see your reports update, you should contact support in order to get to the bottom of it more quickly.

Just for those unfortunate souls (like myself) who get to this question while no longer seeing ANY Firebase Analytics output (no events in the Firebase console online and no local Firebase debug messages): for whatever reason, by accident, I had removed the code that used to call
FIRApp.configure()
No matter what other flags I set, Analytics events that had always worked fine before wouldn't show up. The logging code still ran without any complaints -- just apparently not doing anything. The configure() call was the last thing I came to check.

This might be a late answer, but I double-checked the other solutions and still had a similar issue.
I fixed it by looking at the parameters sent with the event: make sure all parameters sent have a value.
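One defensive way to enforce that is to drop empty parameters before handing them to the logger. A sketch in Go (pruneParams is a hypothetical helper, not part of any Firebase SDK):

```go
package main

import "fmt"

// pruneParams returns a copy of params with nil values removed, so every
// parameter that actually reaches the logger has a value.
func pruneParams(params map[string]interface{}) map[string]interface{} {
	out := make(map[string]interface{}, len(params))
	for k, v := range params {
		if v != nil {
			out[k] = v
		}
	}
	return out
}

func main() {
	fmt.Println(pruneParams(map[string]interface{}{
		"spot_name": "TestLogSpotView",
		"notes":     nil, // dropped before logging
	}))
}
```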

Related

Is it possible to get a list of all streams using the spring cloud dataflow java rest client?

I'm using the spring cloud dataflow java rest client (https://docs.spring.io/spring-cloud-dataflow/docs/current/api/) and want to use it to retrieve all currently deployed streams.
It's easy enough to get a StreamOperations object and get a list of streams from it:
val template = DataFlowTemplate(<someUri>)
val streamOperations = template.streamOperations()
val streamDefinitionResources = streamOperations.list()
The streamDefinitionResources above is actually a PagedModel<StreamDefinitionResource> that holds the first page of results, using a page size of 2000.
I don't, however, see any way to iterate through all the pages to get all the streams using the java rest client (i.e. there's no paging support available via the StreamOperations or StreamDefinitionResource classes).
Is it possible to get all the streams using only the java rest client? Am I missing something?
Summary
The PagedModel<StreamDefinitionResource> has a getNextLink() method that you can use to manually traverse the "next" page of results.
Details
The underlying Dataflow REST API supports paging via page number and size request parameters and returns HAL responses that include _links to the next and previous pages.
For example, given 10 stream definitions this HTTP request:
GET localhost:9393/streams/definitions?page=0&size=2
returns the following response:
{
  _embedded: {
    streamDefinitionResourceList: [
      {
        name: "ticktock1",
        dslText: "time | log",
        originalDslText: "time | log",
        status: "undeployed",
        description: "",
        statusDescription: "The app or group is known to the system, but is not currently deployed",
        _links: {
          self: {
            href: "http://localhost:9393/streams/definitions/ticktock1"
          }
        }
      },
      {
        name: "ticktock2",
        dslText: "time | log",
        originalDslText: "time | log",
        status: "undeployed",
        description: "",
        statusDescription: "The app or group is known to the system, but is not currently deployed",
        _links: {
          self: {
            href: "http://localhost:9393/streams/definitions/ticktock2"
          }
        }
      }
    ]
  },
  _links: {
    first: {
      href: "http://localhost:9393/streams/definitions?page=0&size=2"
    },
    self: {
      href: "http://localhost:9393/streams/definitions?page=0&size=2"
    },
    next: {
      href: "http://localhost:9393/streams/definitions?page=1&size=2"
    },
    last: {
      href: "http://localhost:9393/streams/definitions?page=4&size=2"
    }
  },
  page: {
    size: 2,
    totalElements: 10,
    totalPages: 5,
    number: 0
  }
}
The Dataflow Java REST client exposes this HAL response in the PagedModel<StreamDefinitionResource> response which provides a getNextLink() method.
Caveat 1) However, the current implementation (as you pointed out) is hardcoded to a page size of 2000. This means you would not see this behavior until you had more than 2000 stream definitions.
Caveat 2) Another point to note is that traversal of the link to the "next" page is not automatically handled and you would need to manually invoke the links URL to retrieve the next page.
Assuming StreamOperations.list accepted a page-size parameter, the code could look something like this:
int pageSize = 2;
PagedModel<StreamDefinitionResource> pageOfStreamDefs = streamOperations().list(pageSize);
pageOfStreamDefs.getNextLink()
        .ifPresent((link) -> someFunctionToInvokeAndProcessNextPage(link.getHref()));
More details on the REST API parameters can be found here.
I guess I'm a bit late, but I had the same issue and found a workaround. As onobc said, the PagedModel of any resource has a getNextLink() method that returns a Link with the next page address.
You can use the same RestTemplate from DataflowTemplate to handle these next requests:
PagedModel<StreamDefinitionResource> streamDefPage = dataflowTemplate.streamOperations().list();
// Process page here
if (streamDefPage.getNextLink().isPresent()) {
    Link link = streamDefPage.getNextLink().get();
    PagedModel<StreamDefinitionResource> streamDefNextPage = dataflowTemplate.getRestTemplate().getForObject(link.getHref(), StreamDefinitionResource.Page.class);
    // Process next page here
}
And so on.
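The "and so on" above can be wrapped in a loop. A minimal sketch of the traversal in Go (page and fetch are stand-ins for PagedModel and the RestTemplate call, not the Dataflow client's actual API):

```go
package main

import "fmt"

// page is a minimal stand-in for a HAL PagedModel: some items plus an
// optional link to the next page (empty string = last page).
type page struct {
	items []string
	next  string
}

// collectAll follows "next" links until they run out, mirroring the manual
// traversal shown above. fetch is whatever performs the HTTP GET (in the
// Java client, dataflowTemplate.getRestTemplate().getForObject).
func collectAll(first page, fetch func(href string) page) []string {
	all := append([]string{}, first.items...)
	p := first
	for p.next != "" {
		p = fetch(p.next)
		all = append(all, p.items...)
	}
	return all
}

func main() {
	pages := map[string]page{
		"page1": {items: []string{"ticktock3"}},
	}
	first := page{items: []string{"ticktock1", "ticktock2"}, next: "page1"}
	fmt.Println(collectAll(first, func(href string) page { return pages[href] }))
	// [ticktock1 ticktock2 ticktock3]
}
```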
Hope this helps!

SendInBlue trackEvent returns 204 but does not show event in the console

I am trying to send an event using the SendInBlue API here.
When I send the event, it correctly returns a 204, but no events show up there, and an automation flow I created that is triggered by the event does not send.
const axios = require("axios");
const url = 'https://in-automate.sendinblue.com/api/v2/trackEvent';
(async () => {
  try {
    const event = await axios.post(
      url,
      { email: 'myemail@emailprovider.co', event: 'USER_SUBSCRIBED' },
      {
        headers: {
          Accept: 'application/json',
          'Content-Type': 'application/json',
          'ma-key': 'xkeysib-MY_v3_API_KEY'
        }
      },
    );
    console.log(event);
  } catch (err) {
    console.log(JSON.stringify(err));
  }
})();
Is there a way I can see the events from this call coming in on the console?
The ma-key is not the same as the API key. You should use the ma-key for automation instead of your current API key.
After a couple of emails and a phone call, I figured out where the ma-key is:
Log in at Sendinblue, click Automation (top menu), then Config (left tab bar), then something like 'see tracking code'. You will then see a JS snippet; there is a key in it. This is your key.
My panel is in Spanish so maybe the words are not the same. Cheers.
As far as I know you can't really see the events in the console.
If you just want to make sure it's working, you can:
go to Automation
start a workflow
select a trigger: Website activities => An event happens
If you can select your event, it means it worked.
Sendinblue is more a marketing automation tool and not an event analytics. So I'm not surprised you can't see the event in the GUI. If you want to see the events, try something like Mixpanel.
As @hector said, pay attention to the API key: you're using the v3 campaigns (emails, contacts...) key, and the tracking API key is different.
Also, if you want to add event data, apparently you absolutely need to add a random unique ID. I struggled to find this as their docs are not super clear about it. So the body should look something like this:
body: jsonEncode(<String, dynamic>{
  'eventdata': {
    'id': '123456',
    'data': {
      'event_data1': value1,
      'event_data2': value2,
    }
  },
  'email': 'example@mail.com',
  'event': eventName
}),

Microsoft Graph API: "responseRequested" Attribute in "Update Event" does not work

As written in the Microsoft Graph documentation, the Event update endpoint allows responseRequested as one of the input properties. It says:
Set to true if the sender would like a response when the event is
accepted or declined.
I tried setting it to false, expecting behavior similar to the "Request responses" button in the UI. Unfortunately, it does not behave as I expect.
For example, in the web UI, if you turn "Request responses" off, no notification will be sent to the attendees and a message that shows no attendance response is required.
UI Screenshot
UI Screenshot - Expected behavior
For the code itself:
type UpdateEventRequest struct {
	ResponseRequested bool              `json:"responseRequested,omitempty"`
	End               *DateTimeTimeZone `json:"end,omitempty"`
}
type DateTimeTimeZone struct {
	DateTime string `json:"dateTime,omitempty"`
	TimeZone string `json:"timeZone,omitempty"`
}
func NewDateTimeTimeZone(t time.Time) *DateTimeTimeZone {
	return &DateTimeTimeZone{
		DateTime: t.Format("2006-01-02T15:04:05.999999"),
		TimeZone: t.Location().String(),
	}
}
When I tried to update an event to the following:
&UpdateEventRequest{
	ResponseRequested: false,
	End:               NewDateTimeTimeZone(newEventEndTime),
}
Event end time is updated properly to newEventEndTime. However, ResponseRequested doesn't seem to update anything.
Am I missing something here? My initial goal is to change the event end time via the API without requiring attendees to submit a "Yes/No" response. Thanks.

Microsoft Graph putting null to /users/:id/photo/$value causing 500

I am using the microsoft graph node.js sdk, making an app-only request to PUT to /users/f580eac0-9ece-413a-a26f-7964df1f2025/photo/$value, and I get the following:
{ statusCode: 500,
code: 'ErrorInternalServerError',
message: 'Value cannot be null.\r\nParameter name: userPrincipalName',
requestId: 'a5f0429d-ad3f-4f83-9e50-f540e5c8f9b8',
date: 2017-07-20T23:17:17.000Z,
body:
{ code: 'ErrorInternalServerError',
message: 'Value cannot be null.\r\nParameter name: userPrincipalName',
innerError:
{ 'request-id': 'a5f0429d-ad3f-4f83-9e50-f540e5c8f9b8',
date: '2017-07-20T23:17:17' } } }
How can I clear the photo for this user?
Unfortunately deleting a photo (or setting to null) is not supported currently. This is something that is being worked on, but I don't have a clear ETA and it may be some time before it's available.
The only (nasty) workaround I'm aware of is to PUT a 1x1 transparent photo.
Hope this helps,

Can server side logic be added to firebase to implement an automatically incremental key?

I am new to Firebase and am trying to implement an iOS chat app. I am wondering if there is a way to add an incremental id to the received message.
For example:
I send the following message to Firebase:
{date: "2015-10-14T04:30:43", name: "Jacob", text: "Hi", userId: "y8jFdNwRAX"}
Is it possible for Firebase to add a messageId key to it:
{msgId: 1, date: "2015-10-14T04:30:43", name: "Jacob", text: "Hi", userId: "y8jFdNwRAX"}
and, if I send another message, to add msgId and increment it by 1:
{msgId: 2, date: "2015-10-14T04:31:40", name: "Jacob", text: "morning", userId: "y8jFdNwRAX"}
I'm not sure whether Firebase can do this. Any help is appreciated. Thank you in advance.
So the answer to the question is no: it's not going to happen with some kind of automated server-side logic, and that's a bad idea in general.
There are ways to emulate a counter, but they can be really tricky to work with and there are so many ways they can go wrong; it's just not good code.
So I would suggest looking for another solution:
Perhaps each message could have a child node that tracks whether it's been read?
message_id_1
    timestamp: "2015-10-14T04:30:43"
    name: "Jacob"
    text: "Hi"
    userId: "y8jFdNwRAX"
    read: "yes"
message_id_2
    timestamp: "2015-10-14T04:30:50"
    name: "Bob"
    text: "Hi Back At Ya"
    userId: "y9jaksjk"
    read: "no"
You could even have a 'read' node and an 'unread' node
read_messages
    message_id_1
        timestamp: "2015-10-14T04:30:43"
        name: "Jacob"
        text: "Hi"
        userId: "y8jFdNwRAX"
unread_messages
    message_id_2
        timestamp: "2015-10-14T04:30:50"
        name: "Bob"
        text: "Hi Back At Ya"
        userId: "y9jaksjk"
And here's a tricky one: Store messages in their own node and a reference to those unread ones in the users node
all_messages
    message_id_1
        timestamp: "2015-10-14T04:30:43"
        from_userId: "y9jaksjk"
        text: "This is message 1"
    message_id_2
        timestamp: "2015-10-14T04:30:50"
        from_userId: "y9jaksjk"
        text: "this is message 2"
users
    "y8jFdNwRAX"
        my_unread_messages:
            message_id_1: true
(message_id_1: true saved as a child of the user is a reference that indicates the message is for that user and has not been read. When read, remove the reference.)
All of this is conjecture as we don't know the scope of the app and the use of the messages.
You may want to visit the docs a bit more and review some of the sample code provided for other options.
In case someone has the same question: I used the server-side timestamp as a sort of "messageId" to track ordering (probably not a good idea, but it solves what I need), and you can add '.indexOn' for this "messageId" in the security rules to get better performance.
Here is an example for the security rule:
{
  "rules": {
    "Messages": {
      "$Message": {
        ".indexOn": "messageId"
      }
    }
  }
}
Setup code in iOS:
msg[@"messageId"] = kFirebaseServerValueTimestamp;
Note: there is a possible bug in Firebase: the timestamp you receive from FEventTypeChildAdded sometimes seems different from the one saved in the Firebase DB (by a couple hundred ms).
I had the same problem for another reason.
Incremented keys can be achieved with transactions:
https://www.firebase.com/docs/ios/api/#firebase_runTransactionBlock
An example in Swift:
let messagesRef = ...
let counterRef = ... // you need to keep the count of messages
counterRef.runTransactionBlock({ (currentData) -> FTransactionResult! in
    var value = currentData.value as? Int
    if value == nil { value = 0 }
    currentData.value = value! + 1
    return FTransactionResult.successWithValue(currentData)
}) { (error, committed, snap) -> Void in
    if error != nil {
        print(error)
    }
    if committed {
        // the increment succeeded, so set the message under the new counter value
        let messageInfo = ...
        messagesRef.childByAppendingPath("\(snap.value!)").setValue(messageInfo)
    }
}
If the only goal is ordering, the childByAutoId function would be enough, since it creates chronologically ordered keys:
messagesRef.childByAutoId().setValue(messageInfo)
Hope this helps someone.
