Microsoft Teams team creation via REST API takes too long - microsoft-graph-api

I am using a standard template to create a Microsoft Teams team, including installing an app, via the https://graph.microsoft.com/beta/teams REST API.
Because the call returns a teamsAsyncOperation (https://learn.microsoft.com/en-us/graph/api/resources/teamsasyncoperation?view=graph-rest-beta), I periodically poll the operation URL from the Location header to get the status.
The documentation says to check periodically, at an interval of more than 30 seconds.
That is too long to wait, and these days I often still receive a 'notStarted' status even after 20 minutes.
Why does Microsoft Teams take that long to create the team? Is there a better way to create a team (including installing an app, creating a channel, and adding members) quickly, within 10 seconds?
Does Microsoft provide any paid subscription that gives team creation higher priority so it completes faster?
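For reference, the create-then-poll flow described above can be sketched as follows. This is a minimal sketch: token acquisition and the team payload are assumed to be handled elsewhere, the helper names are mine, and the 30-second interval follows the documentation's guidance.

```python
import json
import time
import urllib.request

GRAPH = "https://graph.microsoft.com/beta"

def is_terminal(status):
    # A teamsAsyncOperation reaches a terminal state on "succeeded" or
    # "failed"; "notStarted" and "inProgress" mean keep polling.
    return status in ("succeeded", "failed")

def graph_request(url, token, body=None):
    data = json.dumps(body).encode() if body is not None else None
    req = urllib.request.Request(url, data=data, headers={
        "Authorization": f"Bearer {token}",
        "Content-Type": "application/json",
    })
    return urllib.request.urlopen(req)

def create_team_and_wait(token, team_payload, interval=30.0, timeout=1800.0):
    resp = graph_request(f"{GRAPH}/teams", token, team_payload)
    # 202 Accepted: the Location header holds a relative URL to the
    # teamsAsyncOperation, e.g. /teams('{id}')/operations('{opId}').
    op_url = GRAPH + resp.headers["Location"]
    deadline = time.monotonic() + timeout
    while time.monotonic() < deadline:
        with graph_request(op_url, token) as op_resp:
            op = json.load(op_resp)
        if is_terminal(op["status"]):
            return op
        time.sleep(interval)
    raise TimeoutError("team creation still pending after timeout")
```

Polling faster than the documented interval does not speed up the server-side provisioning; it only risks throttling.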


Microsoft Graph API getSchedule all-day events

I am attempting to create a vacation leave calendar for my team using the Microsoft Graph API. I have successfully used the /me/calendar/getSchedule endpoint as described in https://learn.microsoft.com/en-us/graph/outlook-get-free-busy-schedule.
However, the response does not include all-day events. Most people book leave for multiple days at a time; I would expect to see "oof" events (status 3) covering whole 24-hour periods, but instead there are none.
Is there an option I am missing or another API that can provide this?
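Since getSchedule returns a merged free/busy view rather than full events, one possible workaround (my suggestion, not from the thread) is to query each user's calendarView, which returns event objects that include the isAllDay and showAs properties. A URL-building sketch, with user lookup and authentication assumed to be handled elsewhere:

```python
from urllib.parse import urlencode

GRAPH = "https://graph.microsoft.com/v1.0"

def calendar_view_url(user_id, start_iso, end_iso):
    # GET /users/{id}/calendarView expands recurring events and returns
    # full event objects, so all-day bookings are visible.
    params = urlencode({
        "startDateTime": start_iso,
        "endDateTime": end_iso,
        "$select": "subject,start,end,isAllDay,showAs",
    })
    return f"{GRAPH}/users/{user_id}/calendarView?{params}"

def is_all_day_oof(event):
    # Keep only all-day events marked out-of-office.
    return bool(event.get("isAllDay")) and event.get("showAs") == "oof"
```

This requires per-user calendar read permission rather than the free/busy-only access that getSchedule needs, which is a trade-off to weigh.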

Webhooks vs. polling for all users' drives within an org (multi-tenant) - MS Graph

We are building a multi-tenant application that will scan files for every user in each of multiple organizations. We would like to be notified whenever a user uploads or changes a file in their drive. There are at least two options: store a delta link per user and poll it periodically for changes, or subscribe with webhooks to be notified on change.
With 10k+ users, I am not sure the first option is feasible. For the latter, my main concern with webhooks is whether I have to register one for each user separately - i.e., does the resource need to be /users//drive/root, or can it just be /drive/root? Since there is a limit on the number of webhooks per app/tenant, I am not sure creating a webhook per user is the right approach.
Please advise.
The limit applies to user/group objects (i.e., if you wanted to subscribe to users or groups being updated), not to drives.
Yes, you need to subscribe to each drive individually, and to renew each subscription individually as well. To reduce the number of round trips, you can group those operations into a batch.
You can also combine delta with webhooks: do the initial sync with delta and store the token, register your webhook, and then trigger a query of the delta link whenever you receive a change notification. That way you get the best of both features and avoid regularly polling delta when there may be no changes.
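The per-drive subscriptions plus batching suggested above can be sketched like this. The notification URL, expiry, and client state are placeholders; Microsoft Graph's JSON batching endpoint ($batch) accepts at most 20 requests per call.

```python
def drive_subscription(user_id, notification_url, expiry_iso, client_state):
    # One subscription body per user's drive root; the resource path is
    # user-specific, which is why each drive needs its own subscription.
    return {
        "changeType": "updated",
        "notificationUrl": notification_url,
        "resource": f"/users/{user_id}/drive/root",
        "expirationDateTime": expiry_iso,
        "clientState": client_state,
    }

def batch_payload(subscription_bodies):
    # Wrap up to 20 subscription creations into one JSON batch;
    # POST the result to https://graph.microsoft.com/v1.0/$batch.
    if len(subscription_bodies) > 20:
        raise ValueError("Graph $batch accepts at most 20 requests")
    return {
        "requests": [
            {
                "id": str(i),
                "method": "POST",
                "url": "/subscriptions",
                "headers": {"Content-Type": "application/json"},
                "body": body,
            }
            for i, body in enumerate(subscription_bodies, start=1)
        ]
    }
```

For 10k+ users that is still ~500 batch calls per renewal cycle, so spreading renewals over time (staggered expiry) is worth considering.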

Places Room List API returns cached data

I have removed 4 existing room lists using Remove-DistributionGroup cmdlet.
I have added 1 new room list using New-DistributionGroup cmdlet.
However when calling the API
https://graph.microsoft.com/beta/places/microsoft.graph.roomlist
(using cURL)
with the application permission Places.ReadAll, the API returns the old room lists and not the new one.
However, when calling EWS via the EWS Java SDK, the output is as expected, i.e. it only shows the new room list.
The Office 365 web app also shows the old room lists in the "Browse for more rooms" option.
Is the places resource returning cached data? If so, what is the refresh interval?
Update (Feb '20): the issue is no longer happening; changes to room lists are reflected instantly.
This can take up to 60 hours to show up in the Places API and the web app. We are working on making this instant and should have it ready by the end of January 2020.

How can I poll the OneNote API frequently without hitting rate limits?

I have a use case where I need to poll the OneNote API approximately every minute in order to respond to text added to pages by the user.
(Aside: I'd LOVE to use webhooks to get notifications only when something changes, but that's only supported for consumer notebooks at this time, as far as I can tell.)
Polling at this frequency works for a few users (5 or so), but with more users authorized against the same Microsoft application, the app seems to hit an application-level rate limit and begins receiving 429 Too Many Requests responses.
How can I ensure polling will still work as the number of users grows? And are there any rate limits that can be made public or shared in confidence for valid use cases?
It is possible to register webhooks on SharePoint notebooks as OneDrive items: when a notebook page is updated, the notificationUrl fires, and you can then use delta calls to determine which sections (section .one files) have been updated.
I would then use the OneNote API to get the pages in the updated notebook sections: GET https://www.onenote.com/api/v1.0/me/notes/sections/{id}/pages
An alternative would be to treat the SharePoint drive as a WebDAV server and use the PROPFIND method with the getlastmodified property to poll the drive and determine which sections of the various notebooks have been updated.
But I agree it would be easier if OneNote webhooks were extended to SharePoint.
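The delta-then-pages flow above could look roughly like this. Filtering delta items by the .one extension is my reading of the answer, and mapping a drive item to its OneNote section id is left out of the sketch:

```python
def updated_section_ids(delta_items):
    # From a OneDrive delta response, keep only OneNote section files
    # (*.one): these are the sections whose pages may have changed.
    return [item["id"] for item in delta_items
            if item.get("name", "").lower().endswith(".one")]

def pages_url(section_id):
    # Pages endpoint from the answer above. Note: section_id here is the
    # OneNote section id, not the drive item id; translating between the
    # two is an extra step not shown in this sketch.
    return f"https://www.onenote.com/api/v1.0/me/notes/sections/{section_id}/pages"
```

Only the sections flagged by delta need to be fetched, which is what keeps this approach under the rate limit as the user count grows.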

SurveyMonkey Long Lived Access Token Lifespan

I am working on a project that requires an automated SSIS package to connect to the SurveyMonkey data store via API and incrementally download survey results for the day (or a specified time period) for custom reporting and low-scoring task assignment.
Via OAuth I can collect a long-lived access token, but given the automated and open-ended nature of my project's lifespan, I cannot manually initiate OAuth2 token refreshes or complete manual re-authentication cycles.
Is there another method to automatically export this data on a scheduled basis?
Additionally, for clarification: how long is a long-lived access token valid? 60 days?
Miles from surveymonkey.com support responded with a great answer. I hope it can help someone down the line.
Hi Rob,
Currently our tokens should not expire - this is not guaranteed and may change in future, but we will send out an update well ahead of time if this does ever change. The token you receive on completion of OAuth lets you know how long it will last without user intervention; currently it returns 'null' in the 'expires_in' field.
There is no other automated way to schedule the data to be exported currently, however it sounds like our current setup should suit your needs.
In addition to Miles's reply: it is very straightforward to pull diffs from SurveyMonkey using modified dates. We keep a "last sync" timestamp per survey in our database and update it after each successful data pull.
Use the REST API directly, or (if you're using PHP) try https://github.com/oori/php-surveymonkey. We run it in production.
*Note: you're actually interested in setting the start_modified_date option for the "getRespondentList" function, but in general, see the API docs - the modified-date filter is available in more functions.
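The per-survey "last sync" pattern can be sketched as follows. The start_modified_date parameter name follows the API version mentioned above; the helper names and the cursor format are my own assumptions.

```python
def export_params(last_sync_iso):
    # Filter parameters for an incremental pull: only responses modified
    # since the stored cursor are requested.
    return {"start_modified_date": last_sync_iso}

def advance_cursor(last_sync_iso, responses):
    # After a successful pull, move the per-survey cursor forward to the
    # newest date_modified seen; keep the old cursor if the pull returned
    # nothing. ISO-style timestamps compare correctly as plain strings.
    dates = [r["date_modified"] for r in responses]
    return max(dates + [last_sync_iso])
```

Advancing the cursor only after a successful pull means a failed run is simply retried from the same point, at the cost of occasionally re-reading a few responses.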
