I am looking at the events API here
https://asana.com/developers/api-reference/events
I see that requests need a sync token which, I assume, is analogous to git's SHA1 changeset hash. However, I didn't see how to ask for the first sync token, or how to get back a sync token so that I know what state the data was in at that point. Also, is the events API stable enough to use? I noticed it says: "Note: The complete list of available events is still in flux."
thx, alan
The events API is not only unstable, it's currently disabled for stability reasons. Once it's back, the simple answer is that the first request you make errors, but also sends back a sync token you can use for the next request. Each response contains the sync token for the next request. And if it ever becomes invalid (expires, for example) you'll get another error and a new sync token. The error basically signals that you need to re-fetch the full state if you're trying to do something sync-ish. If you're only interested in reacting to events, you can skip that part, but you might have missed some in the meantime.
The documentation will be more fleshed out when it's ready for public consumption, and those answers should be more readily apparent.
Generate the first sync token by providing the resource (project) ID:
curl -H "Authorization: Bearer authtoken" \
"https://app.asana.com/api/1.0/events?resource=145619319717806&sync=1"
In response, you will get the first token to be used for event subscription:
{"errors":[{"message":"Sync token invalid or too old. If you are
attemping to keep resources in sync, you must fetch the full dataset
for this query now and use the new sync token for the next
sync."}],"sync":"token"}
Use the sync token provided in the previous response, like:
curl -H "Authorization: Bearer authtoken" \
"https://app.asana.com/api/1.0/events?resource=145619319717806&sync=<sync-token>"
In response, you will get events and the next sync token. Use this sync token to fetch further events.
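If you are driving this from code rather than curl, the same flow can be sketched in Python roughly as below (the token and project ID are the placeholders from the curl examples above; error handling is minimal):

import time
import requests

BASE_URL = "https://app.asana.com/api/1.0/events"
HEADERS = {"Authorization": "Bearer authtoken"}   # same token as in the curl calls
RESOURCE = "145619319717806"                      # project id from the example

def poll_events(sync_token):
    # One request; returns (events, next_sync_token).
    params = {"resource": RESOURCE, "sync": sync_token}
    body = requests.get(BASE_URL, headers=HEADERS, params=params).json()
    if "errors" in body:
        # Bootstrap or expired token: no events, but a fresh sync token.
        # If you are mirroring state, re-fetch the full dataset here.
        return [], body["sync"]
    return body.get("data", []), body["sync"]

# The first call with a dummy token returns the error shown above plus the real token.
events, sync_token = poll_events("1")
while True:
    time.sleep(60)
    events, sync_token = poll_events(sync_token)
    for event in events:
        print(event)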
Reference: Asana Events API
According to this documentation (Synchronize messages API), users can synchronize messages with pretty simple skipToken mechanics, and it works well for fetching new messages in folders.
But what I'm also interested in is how to sync flags and status, such as whether a message has been read or not.
For example, I synced all messages from the Inbox folder. After that, the user goes to his Outlook account, reads a message, and sets some flag on it.
How can I get this info? Should I resync all messages to get only those changes?
Also, how do I find out about message removal? If a user deleted a message from the inbox, how do I know which message was deleted without fetching all messages again?
You should use the Microsoft Graph API and move away from the Outlook REST API, as it has been deemphasized, with no more effort being put into the developer experience.
Use Microsoft Graph to synchronize and track changes by using the Delta query feature.
An initial sync call will happen with a Delta query. Make sure you select the properties you care about:
GET https://graph.microsoft.com/v1.0/me/mailfolders/AQMkADNkNAAAgEMAAAA/messages/delta?$select=subject,sender,isRead
Prefer: odata.maxpagesize=50
If the response has an @odata.nextLink in the response JSON object, GET that URL to retrieve the next page of results.
If the response has an @odata.deltaLink in the response JSON object, cache that URL until the next time you want to check for changes.
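As a rough illustration of that loop in Python (using requests, and assuming you already have a valid Graph access token; the folder ID is the one from the example URL above):

import requests

access_token = "<graph-access-token>"   # obtained separately via OAuth
folder_id = "AQMkADNkNAAAgEMAAAA"       # folder id from the example above

headers = {
    "Authorization": "Bearer " + access_token,
    "Prefer": "odata.maxpagesize=50",
}

def sync_folder(url):
    # Walk @odata.nextLink pages; return (changed_messages, delta_link).
    changes = []
    while True:
        page = requests.get(url, headers=headers).json()
        changes.extend(page.get("value", []))
        if "@odata.nextLink" in page:
            url = page["@odata.nextLink"]             # more pages in this round
        else:
            return changes, page["@odata.deltaLink"]  # cache for the next round

# initial sync
initial_url = ("https://graph.microsoft.com/v1.0/me/mailfolders/"
               + folder_id + "/messages/delta?$select=subject,sender,isRead")
messages, delta_link = sync_folder(initial_url)

# Later, call sync_folder(delta_link) again: messages whose selected properties
# changed (for example isRead flipping) come back with their new values, and
# deleted messages come back as small entries carrying an "@removed" annotation.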
I am able to generate an OAuth2 access token (from a refresh token), which I believe should give me the ability to access the YouTube Data API programmatically to delete/upload content.
Using Python and the YouTube API, I need to delete and upload a (new) video to YouTube periodically, say hourly.
All the Google Python samples I've found seem to invoke the "DENY/ALLOW" screen, which requires a copy/paste back into the calling app.
I can do this occasionally but otherwise want the process to be automated. I've read about service accounts which, according to the linked post, are not supported by the YouTube API. Offline access et al. is also mentioned, but in somewhat abstract terms, i.e. no concrete Python examples (that I have yet found). Another source mentioned an HTTP GET like below:
"GET access_token=ya29.GlxBBS89....ast987&part=snippet&mine=true"
but the following in Python doesn't seem to work, returning "response [400]" (bad request):
url = 'https://www.googleapis.com/youtube/v3/channels'
args = 'access_token: ' + token_var + ', part: snippet, mine: true'
get_token = requests.get(url, data = args)
I have used "Can we use google youtube data api without OAuth" (and others) to get to this stage, but need clarification for the next step.
**********************************Update*********************************
I have found that I can only generate access tokens for client credentials configured as web apps. I am writing a desktop app, so I may be barking up the wrong tree.
Or learning Django...
I found examples at https://developers.google.com/youtube/v3/guides/auth/installed-apps that helped.
curl -H "Authorization: Bearer <access_token>" https://www.googleapis.com/youtube/v3/channels?part=snippet&mine=true
curl https://www.googleapis.com/youtube/v3/channels?access_token=<access_token>&part=snippet&mine=true
The curl samples especially provided confirmation that I'm reaching the endpoint and returned helpful debugging info. They have exposed other issues which I'll ask in another question.
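For reference, here is a rough Python equivalent of those curl calls using requests (the token placeholder is illustrative). The key points are that the access token belongs in an Authorization header, or an access_token query parameter, and that part/mine go in the query string rather than in data=:

import requests

access_token = "<access_token>"   # placeholder, as in the curl examples

response = requests.get(
    "https://www.googleapis.com/youtube/v3/channels",
    headers={"Authorization": "Bearer " + access_token},
    params={"part": "snippet", "mine": "true"},
)
print(response.status_code)
print(response.json())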
UPDATE: All calls to the API receive the following response:
failed [500] An error has occurred
Weirdly, my auth process (documented here) works perfectly, while all calls to the API (documented here) fail.
The Bigcommerce API is in transition from basic auth to OAuth. The documentation is consequently a little confusing.
I am trying to create a webhook using the new OAuth methodology. The documentation states that I need an OAuth access_token for the relevant store, which I have obtained.
The documentation also includes sample http request data:
{
  "scope": "store/order/*",
  "headers": {
    "X-Custom-Auth-Header": "{secret_auth_password}"
  },
  "destination": "https://app.example.com/orders",
  "is_active": true
}
In this context, I am assuming that {secret_auth_password} refers to the store's access_token. However, when I include the access_token here I get the following error:
failed [401] You are not authorized.
Thinking that this might be a scoping/permission issue, I have given my app the highest possible level of access through the app settings, but this did not work either.
Thanks in advance for any pointers.
Well after a couple of days of serious head-scratching (not to mention hair out-tearing) I worked out that this was all down to an error in my SSL intermediate certificate, which I have now fixed.
It was the old API returning an error of “UNABLE_TO_VERIFY_LEAF_SIGNATURE” that put me on the right track – the new API just returned:
500 – there is an error
or
404 – you are not authorized.
If you are using PHP I would recommend using the Webhooks pull request combined with the OAuth pull request. They are both working fine together (I personally use them).
Webhooks pull - https://github.com/bigcommerce/bigcommerce-api-php/pull/101
OAuth pull - https://github.com/bigcommerce/bigcommerce-api-php/pull/88
Then to create a webhook you can just call createWebhook($object)
The object needs to include scope and destination.
Also, a side note: are you using SSL for the destination address? It won't work otherwise. You can use a self-signed cert to get around this though.
This will only work for setting up the webhooks though.
To actually receive them you need a valid certificate (else you get nothing).
Hope this helps.
I came across this same part of the documentation and was also confused by it. The proper headers to send for webhooks are the following:
"X-Auth-Client":"[YOUR_APPS_CLIENT_ID]",
"X-Auth-Token":"[OAUTH_ACCESS_TOKEN]"
In addition to using the headers that @FlyingL123 suggested, also take note of the requirements as noted by BigCommerce:
Requirements
The following properties of the webhooks are required. The request won't be fulfilled unless these properties are valid.
scope
destination
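Putting the two answers above together, a webhook-creation call ends up looking roughly like this (a Python sketch; it assumes the OAuth-style v2 hooks endpoint at api.bigcommerce.com, and the store hash, client ID, token, and destination URL are placeholders):

import requests

store_hash = "<store_hash>"            # placeholder
client_id = "<your_apps_client_id>"    # placeholder
access_token = "<oauth_access_token>"  # placeholder

payload = {
    "scope": "store/order/*",
    "destination": "https://app.example.com/orders",  # must be HTTPS
    "is_active": True,
}

response = requests.post(
    "https://api.bigcommerce.com/stores/" + store_hash + "/v2/hooks",
    headers={
        "X-Auth-Client": client_id,
        "X-Auth-Token": access_token,
        "Content-Type": "application/json",
    },
    json=payload,
)
print(response.status_code, response.json())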
I am working on a project that requires an automated SSIS package to connect to the SurveyMonkey data store via API to incrementally download survey results for the day or a specified time period, for custom reporting and low-scoring task assignment.
Via OAuth I can collect a long-lived access token, but due to the automated and infinite nature of my project's lifespan, I cannot manually initiate OAuth2 token refreshes or complete manual re-authentication cycles.
Is there another method to automatically export this data upon a scheduled request?
Additionally, for clarification: how long is a long-lived access token valid? 60 days?
Miles from surveymonkey.com support responded to me with a great answer. I hope it can help someone down the line.
Hi Rob,
Currently our tokens should not expire - this is not guaranteed and may change in future, but we will send out an update well ahead of time if this does ever change. The token you receive on completion of OAuth lets you know how long the token will last for without user intervention; currently it returns 'null' in the 'expires_in' field.
There is no other automated way to schedule the data to be exported currently, however it sounds like our current setup should suit your needs.
In addition to Miles's reply, it is very straightforward to pull diffs from SurveyMonkey using modified dates. We keep a "last sync" timestamp per survey in our database and update it after each successful data pull.
Use the REST API directly, or (if you're using PHP) try https://github.com/oori/php-surveymonkey. We run it in production.
Note: actually, you're interested in setting the start_modified_date option for the "getRespondentList" function, but in general, see the API docs; the modified-date filter is available in more functions.
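To make that concrete, here is a rough Python sketch of the incremental pull. It assumes the v2 REST API's get_respondent_list method and its start_modified_date option mentioned above; the endpoint URL, API key, and response shape are assumptions to check against the API docs for your version:

import json
import requests
from datetime import datetime, timezone

api_key = "<api_key>"                       # placeholder
access_token = "<long_lived_access_token>"  # placeholder
survey_id = "<survey_id>"                   # placeholder

# "last sync" timestamp kept per survey; in production this lives in the database
last_sync = "2015-01-01 00:00:00"

response = requests.post(
    "https://api.surveymonkey.net/v2/surveys/get_respondent_list",
    params={"api_key": api_key},
    headers={
        "Authorization": "bearer " + access_token,
        "Content-Type": "application/json",
    },
    data=json.dumps({
        "survey_id": survey_id,
        "start_modified_date": last_sync,  # only respondents modified since the last pull
    }),
)
respondents = response.json().get("data", {}).get("respondents", [])

# after a successful pull, advance the stored timestamp
last_sync = datetime.now(timezone.utc).strftime("%Y-%m-%d %H:%M:%S")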
I'm using the one-time code flow with my Google+ sign-in button implementation, but the user_id in the response from the tokeninfo endpoint doesn't match the id_token in the object my JavaScript callback receives from the sign-in button.
In the sample code in the documentation, the user_id in the tokeninfo object is checked against a request parameter called gplus_id, but the sample JavaScript doesn't send this parameter, so I have no idea if I'm checking against the right thing.
So, to be clear about the particular sections of code I'm talking about:
The one-time code is processed on the server using this sample code, and it uses a request parameter called gplus_id.
The code in this section sends the one-time code to the server, but as far as I can see, it doesn't send a gplus_id.
It looks like step 6 on the example page is incomplete, and is supposed to be sending the gplus_id, but isn't.
Take a look at the connectServer function (and the function that calls it) in https://github.com/googleplus/gplus-quickstart-java/blob/master/index.html for a more complete example of how to get the user's ID and pass it to the server for verification.
(And I'll try to ping the people responsible for the documentation to get it updated and consistent across the platforms in the quickstart examples. You can also track bug 573 to see progress on them fixing the documentation.)
NOTE: It is worth noting, however, that sending the gplus_id is a bit redundant. You're already trusting the code sent from the client, and you're getting the ID through steps derived from the code. So while passing and checking the gplus_id is a nice sanity check, it really doesn't gain you any additional security.
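For what it's worth, that sanity check boils down to something like the following rough Python sketch (the quickstart's server code is Java; the function and variable names here are illustrative, and gplus_id is the value the client would POST alongside the one-time code):

import requests

def verify_gplus_id(id_token, gplus_id):
    # Ask the tokeninfo endpoint who this ID token belongs to.
    resp = requests.get(
        "https://www.googleapis.com/oauth2/v1/tokeninfo",
        params={"id_token": id_token},
    )
    info = resp.json()
    if resp.status_code != 200 or "error" in info:
        raise ValueError("id_token is invalid or expired")
    # The sanity check: the token's user_id should match the claimed gplus_id.
    if info.get("user_id") != gplus_id:
        raise ValueError("tokeninfo user_id does not match the claimed gplus_id")
    return info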