Twilio completed room - can it be rejoined/recreated?

I'm using the REST API to create rooms. In the docs, it says a room will end when empty for 5 minutes. I would like to know what this means when trying to reuse the same room. Can participants rejoin that room again?
My use case is that users make a request to join a room via a server endpoint. A check by UniqueName is performed to see if the room exists; if it does not, it is created using that UniqueName.
There will certainly be cases where a room is created but is empty for more than 5 minutes, and I would like to re-use the same UniqueName to join the room (empty, completed or otherwise). But it's not clear to me that this can be done once a room is completed.
E.g., if another user tries to join the completed room, will the logic above still work, or will it break because the room is in completed status and cannot be joined or re-created? My goal is to always have access to a room with the same UniqueName.
Please advise, thanks.
Edit
I just had a thought to instead retrieve a list of rooms by UniqueName, which would show me a room that has already been created, regardless of its status. However, I would still need to be able to use the same room if its status is completed.
Can the status be updated from completed to in-progress?
Update
It would seem that the status cannot be updated from completed to in-progress. So, when a room is completed, how can I continue to use the same UniqueName for another room if one already exists with that unique name?

Answering my own question here as I think I have a solution.
After tinkering with the Twilio quickstart app, I discovered that the UniqueName is only unique in the context of rooms that are in-progress. I verified this by joining and leaving a room multiple times within a few seconds; a new room was created every time I left. I wasn't aware of this, but rooms created via the client SDK close instantly after all participants leave. As stated in the docs, rooms created via the REST API remain open for 5 minutes.
When I say "created", I mean that a new room with a new SID is created and can be viewed in the Twilio console, even if the UniqueName has not changed. So you can in fact have multiple rooms with the same UniqueName, but only the room with status in-progress is evaluated against when making REST API calls.
Thus, the answer would be to simply try to create a room with the unique name. Doing so when an in-progress room with that name already exists fails with error 53113 ('Room exists'), as documented. I.e.:
{ [Error: Room exists]
  status: 400,
  message: 'Room exists',
  code: 53113,
  moreInfo: 'https://www.twilio.com/docs/errors/53113',
  detail: undefined }
However, if a room with that unique name was already completed, a new room with the same unique name can be created. The simplest approach would be to try/catch on creating a room and handle the error.
let room;
try {
  // Attempt to create the room; succeeds whenever no in-progress room uses this UniqueName.
  room = await client.video.rooms.create({uniqueName: 'test'});
} catch (e) {
  // Error 53113 means an in-progress room with this UniqueName already exists.
  console.log(e);
}

Related

Unable to detect a Delete on the server from the client

In my database, I have three entities: User, List and UserList (which represents a many-to-many relationship between User and List). In my app, I have initialized the SyncContext with the StoreTrackingOptions.NotifyLocalAndServerOperations tracking option.
await Client.SyncContext.InitializeAsync(_store, StoreTrackingOptions.NotifyLocalAndServerOperations);
While my app is running for a given User, when I add a new List association for that user (by inserting a linkage record into UserList), I am able to detect this change:
var subscription = Client.EventManager.Subscribe<StoreOperationCompletedEvent>(async (storeEvent) => await StoreChangedEventHandler(storeEvent));
protected async Task StoreChangedEventHandler(StoreOperationCompletedEvent storeEvent) {..}
Note that creating the linkage will pull the UserList record for the User as well as the List record referenced by UserList.
When I delete this linkage record, though, no notification of the deletion reaches my client.
Questions: Is such notification (of deleted records) possible? If so, how do I make it happen?
I have solved this issue by enabling soft delete on the server (Azure Mobile Server SDK). By doing that, all soft-deleted records are pulled back to the client and I can filter them out for presentation. Works for me but may not work for everyone else.
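For reference, here is roughly what enabling soft delete looks like in a .NET Azure Mobile Apps backend table controller; the controller and context names below are placeholders, not code from my project:
using System.Web.Http.Controllers;
using Microsoft.Azure.Mobile.Server;

public class UserListController : TableController<UserList>
{
    protected override void Initialize(HttpControllerContext controllerContext)
    {
        base.Initialize(controllerContext);
        var context = new MobileServiceContext(); // the app's EF DbContext (placeholder name)
        // enableSoftDelete: true marks records as deleted instead of removing them,
        // so deletions reach the client on the next pull and can be filtered out there.
        DomainManager = new EntityDomainManager<UserList>(context, Request, enableSoftDelete: true);
    }
}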

How do I filter Purchase Order query in QBXML to only return records that are not fully received?

When doing a PurchaseOrderQuery in QBXML I am trying to get Quickbooks to only return purchase orders that are not yet processed (i.e. "IsFullyReceived" == false). The response object contains the IsFullyReceived flag, but the query object doesn't seem to have a filter for it??
This means I have to get every single Purchase Order whether or not it's received, then do the filtering logic in my application - which slows down Web Connector transactions.
Any ideas?
Thanks!
You can't.
The response object contains the IsFullyReceived flag, but the query object doesn't seem to have a filter for it??
Correct, there is no filter for it.
You can see this in the docs:
https://developer-static.intuit.com/qbSDK-current/Common/newOSR/index.html
This means I have to get every single Purchase Order whether or not it's received, then do the filtering logic in my application - which slows down Web Connector transactions.
Yep, probably.
Any ideas?
Try querying for only Purchase Orders changed or modified (ModifiedDateRangeFilter) since the last time you synced.
Or, instead of pulling every single PO, keep track of a list of POs that you think may not have been received yet, and then only query for those specific POs based on RefNumber.
Or, watch the ItemReceipt and BillPayment objects, and use those to implement logic about which POs may have been recently filled, since BillPayment and ItemReceipt objects should get created as the PO is fulfilled/received.
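As a rough illustration of the ModifiedDateRangeFilter suggestion, a .NET Web Connector app could build a request like the following; the qbXML version and date are placeholders, and lastSync would come from your own sync bookkeeping:
// Sketch: only ask QuickBooks for POs modified since the last sync instead of pulling every PO.
string lastSync = "2016-01-01T00:00:00"; // placeholder
string request = $@"<?xml version=""1.0""?>
<?qbxml version=""13.0""?>
<QBXML>
  <QBXMLMsgsRq onError=""stopOnError"">
    <PurchaseOrderQueryRq>
      <ModifiedDateRangeFilter>
        <FromModifiedDate>{lastSync}</FromModifiedDate>
      </ModifiedDateRangeFilter>
      <IncludeLineItems>true</IncludeLineItems>
    </PurchaseOrderQueryRq>
  </QBXMLMsgsRq>
</QBXML>";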

How to model a Messaging/Chatting system in C# with Entity Framework

I have a simple question. I'm developing a chat system service, but I came across something interesting.
I'm currently operating this way:
First table:
[Messages]:
SenderID(user),
RecipientID(user),
Content(string)
Of course, every time a user sends a message to the other user, I add it to the table. But I thought about the fact that if a table has a million lines, it would become a mess.
So I thought about another way:
First Table:
[Conversation]
ConversationID,
USER1(user),
USER2(user)
Second table:
[Messages] in which I have
ConversationID,
Content(string)
So basically, I'm asking, which configuration should I use?
The approach below should be able to sort you out. This is a good basis for both chat and messaging, where with chat you can poll recent messages from the client side and slap on an intuitive UI.
Message
Message {
MessageId,
FromId, -- Foreign key User.UserId
ToId, -- Foreign key User.UserId
Subject,
Content,
Attachment, -- can be null or default to a 0
DateReceived, -- can be null or default to 1901 or something
DateRead
...
}
User
User {
UserId
UserName
...
}
Queries
Inbox = Message where ToId = Me.UserId
Sent = Message where FromId = Me.UserId
Conversation = Group by Subject
Attachment = (Simple = Path to attachment file. || Better = DocumentId)
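For illustration, the Inbox/Sent/Conversation queries above in EF LINQ form, assuming a DbContext named db with a Messages set and me holding the current user's id (all names hypothetical):
// Inbox: messages addressed to me, newest first; Sent: messages I wrote.
var inbox = db.Messages.Where(m => m.ToId == me).OrderByDescending(m => m.DateReceived);
var sent = db.Messages.Where(m => m.FromId == me);
// Conversations: group my mail by Subject, as suggested above.
var conversations = db.Messages
    .Where(m => m.ToId == me || m.FromId == me)
    .GroupBy(m => m.Subject);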
Attachment
Document {
int DocumentId,
int DocTypeId,
virtual DocumentType DocumentType,
string FileName,
int UserId,
string mimeType,
float fileSize,
string storagePath,
int OrganizationId,
string fileHash,
string ipAddress,
DateTime DateCreated = DateTime.Now;
}
Then you run into the issue with group chats. Do you send a message to each recipient of the group or do you create a single message that each recipient has access to?
But we're keeping it simple.
Here is another idea: store the notion of a "message", whether it be an email, text, SMS or sound file. This is where you would store the message text and/or other metadata. Think of the "message" like a walkie-talkie transmission: everything between pressing the talk button and releasing it to end the transmission is stored in the "message". A message could be tied to other messages, for example because the user was replying to a "message" from the other party's prior transmission.
If you wanted to make the whole relationship easy, you could log the "messages" sent to a user (an "inbox") and the messages sent from a user (an "outbox"). This is in line with today's messaging mentality.
Message
MessageID,
Subject,
Content
...
MessageReceived
MessageReceivedID,
UserID,
FromUserID,
MessageID,
DateReceived,
DateRead
MessageSent
MessageSentID,
UserID,
MessageID,
DateSent,
DeletedStatusID
Both solutions work for simple messaging. The question here is really whether more than one conversation context should ever exist between the same two users.
If two users are always talking within the scope of the same conversational context/history every time they begin chatting, your first solution is sufficient. Think of this scenario like a Skype chat: it's just one long conversation over time.
If conversational context changes when they begin chatting (i.e. their chat history never really persists between two different days worth of conversations), solution two makes more sense. Think of this scenario like an email between two users. The next day I could write another email to the same person, but it is a different conversation.
For solution two, you would also need to add USER to the second table to track which user sent each message in the conversation:
First Table:
[Conversation]
ConversationID,
USER1(user),
USER2(user)
Second table:
[Messages]
ConversationID,
Content(string),
USER(user)
In summary, just because a table would have millions of lines doesn't mean it's not the correct way of doing it. In this case, it just depends on the requirements of your application.
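For completeness, a minimal EF sketch of solution two with the sender column included; class and property names are illustrative, not prescriptive:
using System;
using System.Collections.Generic;

public class Conversation
{
    public int ConversationId { get; set; }
    public int User1Id { get; set; } // USER1
    public int User2Id { get; set; } // USER2
    public virtual ICollection<Message> Messages { get; set; }
}

public class Message
{
    public int MessageId { get; set; }
    public int ConversationId { get; set; }
    public int SenderId { get; set; } // USER: which participant sent this message
    public string Content { get; set; }
    public DateTime SentAt { get; set; }
    public virtual Conversation Conversation { get; set; }
}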

Eventbrite VenueID is missing

I am making a call to:
https://www.eventbrite.com/xml/organizer_list_events?id=MYID
and getting a collection of events returned. I am integrating this data into an internal event system.
However, periodically I will get a venue without an ID. This means I can't add the venue into the system, as I have no way of checking for duplicates before it is imported.
Should the VenueID always be returned? If not, under what circumstances would it not be returned?
The venue ID will not be returned if there is no venue, e.g. if the event is a webinar / online-only event, or the venue has not yet been chosen.

Custom email notifications for "Digest" type of alert subscription

I am working on a custom email notification for a WSS 3.0 solution. I am using a custom class inheriting from IAlertNotifyHandler to generate the email. There is a great example here that shows how this is done for an Immediate alert. Here is some of the code related to the SPAlertHandlerParams, which is used to get information about the alert and the item that triggered the alert.
SPAlertHandlerParams ahp; // passed into IAlertNotifyHandler.OnNotification
int id = ahp.eventData[0].itemId; // gets the itemId of the item triggering the notification
SPListItem myItem = list.GetItemById(id);
For immediate alerts, this works great as the item I want is always at the [0] position of the eventData object. For a digest event, I thought I could just loop through all of the items in the ahp.eventData. Two problems with this.
First, it gives me all of the events for which it is sending notifications, not just the ones for me. Second, the eventData[0].itemId no longer points to a valid id on the list; it is a 6-7 digit number instead of a 3-digit number.
Does anyone know the correct way to get to the alert information for digest emails?
Please let me know if you have any additional questions about this.
Thanks for your help!
For my project, I created a custom timer job (using the post by Andrew Connell) that mimics the Alert functionality. It runs overnight and queries for any users subscribed to my list with daily alerts. It then packages all of the new tasks into a custom email message.
I left the custom alert in place to suppress any daily notifications from the system; I just return 'True' so that alerts are not sent for tasks assigned to only one person. Looking back on this, I suppose I could have run the query code in the custom alert and not needed a separate timer job.
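For anyone going the same route, here is a rough sketch of such a timer job; the site URL, list name and helper method are placeholders, not the exact code from my project:
using System;
using Microsoft.SharePoint;
using Microsoft.SharePoint.Administration;
using Microsoft.SharePoint.Utilities;

public class DailyTaskDigestJob : SPJobDefinition
{
    public DailyTaskDigestJob() : base() { }

    public DailyTaskDigestJob(string name, SPWebApplication webApp)
        : base(name, webApp, null, SPJobLockType.Job) { }

    public override void Execute(Guid targetInstanceId)
    {
        using (SPSite site = new SPSite("http://server/sites/mysite")) // placeholder URL
        using (SPWeb web = site.OpenWeb())
        {
            SPList tasks = web.Lists["Tasks"]; // placeholder list name
            foreach (SPAlert alert in web.Alerts)
            {
                // Only users with a daily subscription to this list.
                if (alert.AlertFrequency != SPAlertFrequency.Daily || alert.ListID != tasks.ID)
                    continue;

                // Build a digest of new/changed tasks for this user (CAML query omitted)
                // and mail it, since the built-in daily alert is suppressed.
                string body = BuildDigestBody(tasks, alert.User);
                SPUtility.SendEmail(web, false, false, alert.User.Email, "Daily task digest", body);
            }
        }
    }

    private string BuildDigestBody(SPList tasks, SPUser user)
    {
        // Placeholder: query items created/modified in the last 24 hours that are
        // relevant to this user and format them as HTML.
        return "...";
    }
}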
