In my database, I have three entities: User, List, and UserList (which represents a many-to-many relationship between User and List). In my app, I have initialized SyncContext with the StoreTrackingOptions.NotifyLocalAndServerOperations tracking option.
await Client.SyncContext.InitializeAsync(_store, StoreTrackingOptions.NotifyLocalAndServerOperations);
When my app is running for a given User, when I add a new List association for that user (by inserting a linkage record into UserList), I am able to detect this change:
var subscription = Client.EventManager.Subscribe<StoreOperationCompletedEvent>(async (storeEvent) => await StoreChangedEventHandler(storeEvent));
protected async Task StoreChangedEventHandler(StoreOperationCompletedEvent storeEvent) {..}
Note that creating the linkage will pull the UserList record for the User as well as the List record referenced by UserList.
When I delete this linkage record though, there is no notification of that coming to my client.
Questions: Is such notification (of deleted records) possible? If so, how do I make it happen?
I have solved this issue by enabling soft delete on the server (Azure Mobile Server SDK). With soft delete enabled, soft-deleted records are pulled back to the client, and I can filter them out for presentation. This works for me but may not work for everyone.
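For anyone taking this route, the server-side change is small. Here is a sketch of a table controller with soft delete enabled; the controller and context names are placeholders for your own types:

```csharp
using System.Web.Http.Controllers;
using Microsoft.Azure.Mobile.Server;

public class UserListController : TableController<UserList>
{
    protected override void Initialize(HttpControllerContext controllerContext)
    {
        base.Initialize(controllerContext);
        var context = new MobileServiceContext(); // placeholder for your DbContext
        // enableSoftDelete: true marks records as deleted instead of removing them,
        // so incremental pulls can propagate deletions to clients.
        DomainManager = new EntityDomainManager<UserList>(context, Request, enableSoftDelete: true);
    }
}
```

With this in place, deleted records come down on the next pull flagged as deleted, and the client can exclude them from its views.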
When doing a PurchaseOrderQuery in qbXML, I am trying to get QuickBooks to only return purchase orders that are not yet processed (i.e. "IsFullyReceived" == false). The response object contains the IsFullyReceived flag, but the query object doesn't seem to have a filter for it?
This means I have to get every single Purchase Order whether or not it's received, then do the filtering logic in my application - which slows down Web Connector transactions.
Any ideas?
Thanks!
You can't.
The response object contains the IsFullyReceived flag, but the query object doesn't seem to have a filter for it??
Correct, there is no filter for it.
You can see this in the docs:
https://developer-static.intuit.com/qbSDK-current/Common/newOSR/index.html
This means I have to get every single Purchase Order whether or not it's received, then do the filtering logic in my application - which slows down Web Connector transactions.
Yep, probably.
Any ideas?
Try querying only for Purchase Orders added or modified (ModifiedDateRangeFilter) since the last time you synced.
Or, instead of pulling every single PO, keep track of a list of POs that you think may not have been received yet, and then only query for those specific POs based on RefNumber.
Or, watch the ItemReceipt and BillPayment objects, and use them to implement logic about which POs may have been recently filled, since BillPayment and ItemReceipt objects should get created as the PO is fulfilled/received.
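As a sketch of the first suggestion, a qbXML request filtered to recently modified POs might look like the following (the qbXML version and the date are placeholders; adjust to your environment and last sync time):

```xml
<?xml version="1.0"?>
<?qbxml version="13.0"?>
<QBXML>
  <QBXMLMsgsRq onError="stopOnError">
    <PurchaseOrderQueryRq>
      <!-- Only POs added or modified since the last sync -->
      <ModifiedDateRangeFilter>
        <FromModifiedDate>2023-01-01T00:00:00</FromModifiedDate>
      </ModifiedDateRangeFilter>
      <!-- Include line items so IsFullyReceived can be evaluated per line too -->
      <IncludeLineItems>true</IncludeLineItems>
    </PurchaseOrderQueryRq>
  </QBXMLMsgsRq>
</QBXML>
```

You would still filter on IsFullyReceived in your application, but only over the set of POs that actually changed, which keeps Web Connector transactions small.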
I have a simple question. I'm developing a chat system service, but I came across something interesting.
I'm currently operating this way :
First table :
[Messages] :
SenderID(user),
RecipientID(user),
Content(string)
Of course, every time a user sends a message to another user, I add a row to the table. But I started thinking: if the table grows to millions of rows, it could become a mess.
So I thought about another way being :
First Table :
[Conversation]
ConversationID,
USER1(user),
USER2(user)
Second table :
[Messages] in which I have
ConversationID,
Content(string)
So basically, I'm asking, which configuration should I use?
The approach below should be able to sort you out. This is a good basis for both chat and messaging, where with chat you can poll recent messages from the client side and slap on an intuitive UI.
Message
Message {
MessageId,
FromId, -- Foreign key User.UserId
ToId, -- Foreign key User.UserId
Subject,
Content,
Attachment, -- can be null or default to a 0
DateReceived, -- can be null or default to 1901 or sumin'
DateRead
...
}
User
User {
UserId
UserName
...
}
Queries
Inbox = Message where ToId = Me.UserId
Sent = Message where FromId = Me.UserId
Conversation = Group by Subject
Attachment = (Simple = Path to attachment file. || Better = DocumentId)
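The pseudo-queries above can be sketched in SQL along these lines (a sketch only; `@MyUserId` is a placeholder for the current user's id, and types/column names follow the Message table above):

```sql
-- Inbox: everything addressed to me, newest first
SELECT m.*
FROM Message m
WHERE m.ToId = @MyUserId
ORDER BY m.DateReceived DESC;

-- Sent: everything I authored
SELECT m.*
FROM Message m
WHERE m.FromId = @MyUserId;

-- Conversation view: group my messages by Subject
SELECT m.Subject,
       COUNT(*)           AS MessageCount,
       MAX(m.DateReceived) AS LastActivity
FROM Message m
WHERE m.ToId = @MyUserId OR m.FromId = @MyUserId
GROUP BY m.Subject;
```

An index on (ToId, DateReceived) and one on (FromId) would keep the inbox and sent queries fast even at millions of rows.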
Attachment
Document {
int DocumentId,
int DocTypeId,
virtual DocumentType DocumentType,
string FileName,
int UserId,
string mimeType,
float fileSize,
string storagePath,
int OrganizationId,
string fileHash,
string ipAddress,
DateTime DateCreated = DateTime.Now;
}
Then you run into the issue with group chats. Do you send a message to each recipient of the group or do you create a single message that each recipient has access to?
But we're keeping it simple.
Here is another idea: store the notion of a "message", whether it be an email, text, SMS, or sound file. This is where you would store the message text and other metadata. Think of a "message" as everything transmitted between pressing and releasing the talk button on a walkie-talkie. A message could also be tied to other messages, since a user may be replying to a "message" from a prior party.
If you wanted to make the whole relationship easy, you could log the "messages" sent to a user ("inbox") and the messages sent from a user ("outbox"). This is synonymous with today's messaging mentality.
Message
MessageID,
Subject,
Content
...
MessageReceived
MessageReceivedID,
UserID,
FromUserID,
MessageID,
DateReceived,
DateRead
MessageSent
MessageSentID,
UserID,
MessageID,
DateSent,
DeletedStatusID
Both solutions work for simple messaging. The question here really is if there is ever more than one conversation context of messages that should exist between the same two users.
If two users always talk within the scope of the same conversational context/history every time they begin chatting, your first solution is sufficient. Think of this scenario like a Skype chat: it's just one long conversation over time.
If conversational context changes when they begin chatting (i.e. their chat history never really persists between two different days worth of conversations), solution two makes more sense. Think of this scenario like an email between two users. The next day I could write another email to the same person, but it is a different conversation.
For solution two, you would also need to add a USER column to the second table to track which user sent each message in the conversation:
First Table :
[Conversation]
ConversationID,
USER1(user),
USER2(user)
Second table :
[Messages]
ConversationID,
Content(string),
USER(user)
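As DDL, the two-table design might be sketched like this (types and the existence of a `Users` table are assumptions; adapt to your database):

```sql
CREATE TABLE Conversation (
    ConversationID INTEGER PRIMARY KEY,
    User1ID        INTEGER NOT NULL REFERENCES Users(UserID),
    User2ID        INTEGER NOT NULL REFERENCES Users(UserID)
);

CREATE TABLE Messages (
    MessageID      INTEGER PRIMARY KEY,
    ConversationID INTEGER NOT NULL REFERENCES Conversation(ConversationID),
    SenderID       INTEGER NOT NULL REFERENCES Users(UserID), -- who sent it
    Content        TEXT    NOT NULL,
    SentAt         DATETIME NOT NULL
);
```

An index on Messages(ConversationID, SentAt) makes "load this conversation in order" cheap regardless of total table size.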
In summary, just because a table would have millions of lines doesn't mean it's not the correct way of doing it. In this case, it just depends on the requirements of your application.
I am making a call to:
https://www.eventbrite.com/xml/organizer_list_events?id=MYID
and getting a collection of events returned. I am integrating this data into an internal events system.
However, periodically I will get a venue without an ID. This means I can't add the venue into the system, as I have no way of checking for duplicates before it is imported.
Should the VenueID always be returned? If not, under what circumstances would it not be returned?
The venue Id will not be returned if there is no venue, e.g. if the event is a webinar / online-only event or the venue has not yet been chosen.
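Given that, the import side has to treat the venue id as optional. A hypothetical defensive loop (all names here are placeholders, not the Eventbrite response schema):

```csharp
// Skip or flag events with no venue (e.g. online-only events)
// instead of assuming a VenueID is always present.
foreach (var ev in events)
{
    if (ev.Venue == null || string.IsNullOrEmpty(ev.Venue.Id))
    {
        // Online-only, or venue not yet chosen: import the event
        // without a venue, or queue it for manual review.
        continue;
    }

    // Safe to deduplicate on the venue id before importing.
    ImportVenueIfNew(ev.Venue.Id, ev.Venue);
}
```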
I am working on a custom email notification for a WSS 3.0 solution. I am using a custom class inheriting from IAlertNotifyHandler to generate the email. There is a great example here that shows how this is done for an Immediate alert. Here is some of the code related to the SPAlertHandlerParams, which is used to get information about the alert and the item that triggered the alert.
SPAlertHandlerParams ahp; // in practice, passed into IAlertNotifyHandler.OnNotification
int id = ahp.eventData[0].itemId; // gets the itemId of the item triggering the notification
SPListItem myItem = list.GetItemById(id);
For immediate alerts, this works great, as the item I want is always at the [0] position of the eventData array. For a digest event, I thought I could just loop through all of the items in ahp.eventData. There are two problems with this.
First, it gives me all of the events for which notifications are being sent, not just the ones for me. Second, eventData[0].itemId no longer points to a valid ID on the list; it is a 6-7 digit number instead of a 3 digit number.
Does anyone know the correct way to get to the alert information for digest emails?
Please let me know if you have any additional questions about this.
Thanks for your help!
For my project, I created a custom timer job (using the post by Andrew Connell) that mimics the Alert functionality. It runs overnight and queries for any users subscribed to my list with daily alerts. It then packages all of the new tasks into a custom email message.
I left the custom alert handler in place to suppress the system's daily notifications: I just return true so that alerts are not sent for tasks assigned to only one person. Looking back on this, I suppose I could have run the query code in the custom alert handler and not needed a separate timer job.
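For anyone following the same route, the timer-job skeleton looks roughly like this (a sketch following the SPJobDefinition pattern from Andrew Connell's post; the job name, site URL, and list name are placeholders, and the CAML query and email composition are omitted):

```csharp
using System;
using Microsoft.SharePoint;
using Microsoft.SharePoint.Administration;

public class DailyDigestJob : SPJobDefinition
{
    public DailyDigestJob() : base() { }

    public DailyDigestJob(string jobName, SPWebApplication webApp)
        : base(jobName, webApp, null, SPJobLockType.Job) { }

    public override void Execute(Guid targetInstanceId)
    {
        using (SPSite site = new SPSite("http://server/site")) // placeholder URL
        using (SPWeb web = site.OpenWeb())
        {
            SPList list = web.Lists["Tasks"]; // placeholder list name
            // 1. Query items created/modified in the last 24 hours (CAML).
            // 2. Find users subscribed to the list with daily alerts.
            // 3. Package each user's new tasks into one digest email and send it.
        }
    }
}
```

The job is then installed against the web application with a daily schedule (e.g. SPDailySchedule) so it runs overnight, as described above.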