I am trying to use the Slack API, specifically the conversations.history endpoint, to pull just the last 14 days of messages, but I cannot figure out how to express that through the "oldest" parameter. I want this to be automated so that every day it returns just the most recent 14 days, rather than hard-coding a Unix ts from 14 days ago. Is this possible?
I have tried manipulating the ts, but nothing seems to work other than a plain number as the input. That won't work for me, as I always want to pull in the latest 14 days.
The oldest parameter requires an absolute timestamp. You cannot use relative times like "last 14 days", so your app needs to calculate the correct timestamp each time it calls the API.
Also take note of the correct format: it includes fractions of a second, e.g. 1512085950.000216.
Here is the documentation for reference.
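For example, a minimal Python sketch that computes the timestamp at call time, so every run automatically covers the most recent 14 days (the token and channel ID below are placeholders, and cursor pagination is left out):

import time
import requests

# Placeholders -- replace with your own bot token and channel ID.
SLACK_TOKEN = "xoxb-your-token"
CHANNEL_ID = "C0123456789"

# Unix timestamp for 14 days before "now", computed at call time.
fourteen_days_ago = time.time() - 14 * 24 * 60 * 60
oldest = f"{fourteen_days_ago:.6f}"  # Slack's format includes fractional seconds

response = requests.get(
    "https://slack.com/api/conversations.history",
    headers={"Authorization": f"Bearer {SLACK_TOKEN}"},
    params={"channel": CHANNEL_ID, "oldest": oldest, "limit": 200},
)
messages = response.json().get("messages", [])
print(f"Fetched {len(messages)} messages from the last 14 days")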
I worked with Zapier a few years ago and remember that they expect a minimum number of triggers and actions to go live on the app directory. However, I'm not finding that document right now (or maybe it never existed and I'm mistaken?).
Does anyone have context on the minimum number of triggers and actions needed to go live with a Zapier app?
There's no hard and fast rule here. 1 is the minimum. Initially it's recommended to do no more than 5 total. From the docs:
We recommend your Zapier integration have no more than 5 of each (trigger, action, or search) at first; we suggest starting with your most popular 2-3 use cases.
You shouldn't pad your integration; it should have enough important functionality to be useful.
After some research, I read in several places that Apple does not let applications run in the background except for some specific categories, but I need a solution and could not find help in any documentation or research.
I need an automatic task to run at 10:45 pm that subtracts the existing value of one variable from another variable, basically a discount calculation. The user can choose which days of the week it repeats, but the time it runs will always be 10:45 pm. Would anyone have a solution?
I'm writing some software to do charting and analysis of intraday stock data, and so far the only free (or even affordable) feed I've found which gives 15 minute data for the past week or so is Google Finance. But something I've noticed, which I don't understand and has caused many headaches, is that the responses from the API for 15 minute intervals seem to be very inconsistent.
So far I haven't seen this problem with the 30-minute interval; in that case the response is always correct. But if I specify an interval of 15 minutes (900 seconds), I get anywhere from 70 to more than 200 quotes back. The data is correct, but the responses seem to pretty much ignore the number of days I'm specifying. This also happens for individual stocks, so it isn't a case of some stocks having missing data. Here's an example of an API request I'm sending:
https://www.google.com/finance/getprices?i=900&p=8d&f=d,o,h,l,c&q=INTC
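Here is a small Python sketch that issues the same request and counts the rows that come back; the response layout (header lines followed by one CSV row per bar) is my own assumption from inspecting the undocumented output:

import requests

URL = "https://www.google.com/finance/getprices"
params = {"i": "900", "p": "8d", "f": "d,o,h,l,c", "q": "INTC"}

text = requests.get(URL, params=params).text
rows = [
    line for line in text.splitlines()
    # Data rows appear to start with an absolute timestamp ("a<unix ts>") or an interval offset.
    if line and (line[0] == "a" or line[0].isdigit())
]
print(f"Received {len(rows)} quote rows")  # varies wildly at i=900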
If anyone could help I'd appreciate it; this API doesn't seem to be documented, so it's been difficult to find any help with it.
Yes, Google is not consistent in providing stock data. For that reason I switched over to the Yahoo API; their data is pretty consistent compared to Google's.
I'm seeing issues where adding multiple entries to a playlist in a short amount of time seems to fail regularly without any error responses.
I'm using the json-c format with version 2.1 of the API. If I send POST requests to add 7 video entries to a playlist, only between 3 and 5 of them actually end up on the playlist.
I am getting back a 201 Created response from the API for all requests.
Here's what a request looks like:
{"data":{"position":0,"video":{"duration":0,"id":"5gYXlTe0JTk","itemsPerPage":0,"rating":0,"startIndex":0,"totalItems":0}}}
and here's the response:
{"apiVersion":"2.1","data":{"id":"PLL_faWZNDjUU42ieNrViacdvqvG714P4QjvSDgGRg1kc","position":4,"author":"Lance Andersen","video":{"id":"5gYXlTe0JTk","uploaded":"2012-08-16T19:27:19.000Z","updated":"2012-09-28T20:20:39.000Z","uploader":"usanahealthsciences","category":"Education","title":"What other products does USANA offer?","description":"Discover USANA's other high-quality products: the Sens skin and hair care line, USANA Foods, the RESET weight-management program, and Rev3 Energy.","thumbnail":{"sqDefault":"http://i.ytimg.com/vi/5gYXlTe0JTk/default.jpg","hqDefault":"http://i.ytimg.com/vi/5gYXlTe0JTk/hqdefault.jpg"},"player":{"default":"http://www.youtube.com/watch?v=5gYXlTe0JTk&feature=youtube_gdata_player","mobile":"http://m.youtube.com/details?v=5gYXlTe0JTk"},"content":{"5":"http://www.youtube.com/v/5gYXlTe0JTk?version=3&f=playlists&d=Af8Xujyi4mT-Oo3oyndWLP8O88HsQjpE1a8d1GxQnGDm&app=youtube_gdata","1":"rtsp://v6.cache3.c.youtube.com/CkgLENy73wIaPwk5JbQ3lRcG5hMYDSANFEgGUglwbGF5bGlzdHNyIQH_F7o8ouJk_jqN6Mp3Viz_DvPB7EI6RNWvHdRsUJxg5gw=/0/0/0/video.3gp","6":"rtsp://v7.cache7.c.youtube.com/CkgLENy73wIaPwk5JbQ3lRcG5hMYESARFEgGUglwbGF5bGlzdHNyIQH_F7o8ouJk_jqN6Mp3Viz_DvPB7EI6RNWvHdRsUJxg5gw=/0/0/0/video.3gp"},"duration":72,"aspectRatio":"widescreen","rating":5.0,"likeCount":"6","ratingCount":6,"viewCount":1983,"favoriteCount":0,"commentCount":0,"accessControl":{"comment":"allowed","commentVote":"allowed","videoRespond":"moderated","rate":"allowed","embed":"allowed","list":"allowed","autoPlay":"allowed","syndicate":"allowed"}},"canEdit":true}}
The problem doesn't change if I set the position attribute.
If I send them sequentially with a 5-second delay between them, the results are more reliable, with 6 of the 7 usually making it onto the playlist.
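For illustration, here's roughly what that sequential approach looks like as a Python sketch; the endpoint and headers follow the GData v2 json-c conventions as I understand them, and the playlist ID, token, developer key, and video list are placeholders:

import json
import time
import requests

# Placeholders -- real values come from OAuth and the developer console.
PLAYLIST_ID = "PLL_faWZNDjUU42ieNrViacdvqvG714P4QjvSDgGRg1kc"
ACCESS_TOKEN = "your-oauth-token"
DEVELOPER_KEY = "your-developer-key"
VIDEO_IDS = ["5gYXlTe0JTk"]  # the 7 video IDs to add

url = f"https://gdata.youtube.com/feeds/api/playlists/{PLAYLIST_ID}?v=2&alt=jsonc"
headers = {
    "Content-Type": "application/json",
    "Authorization": f"Bearer {ACCESS_TOKEN}",
    "X-GData-Key": f"key={DEVELOPER_KEY}",
}

for video_id in VIDEO_IDS:
    body = {"data": {"video": {"id": video_id}}}
    resp = requests.post(url, headers=headers, data=json.dumps(body))
    print(video_id, resp.status_code)  # 201 even when the entry later goes missing
    time.sleep(5)  # pause between inserts to reduce the apparent race condition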
It seems like there is a race condition happening on the API server side.
I'm not sure how to handle this problem since I am seeing zero errors in the api call responses.
I have considered doing batch processing, but can't find any documentation on it for the json-c format. I'm not sure if that would make a difference anyway.
Is there a way to reliably add entries to a playlist?
This was fixed in an update to the YouTube Data APIs around the 25th of October.
For a research project, I need to download the top 100 most used words on Twitter, multiple times per hour. However, as far as I can tell, the Twitter API only supports downloading the top 10 most used words ("trends").
My questions therefore are:
Am I missing something in the API? Is there another way to fetch more than 10 trends?
If there isn't, does anybody know of a workaround for this problem?
Put ?count=50 at the end of the URL to get the top 50. I haven't been able to get more than 50.
http://api.twitter.com/1/trends/current.json?count=50
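A quick sketch of calling it from Python; the response shape (a list of trends keyed by timestamp) is from memory of the old v1 format, so treat that part as an assumption:

import requests

# Legacy v1 endpoint from the answer above; count seems to cap out at 50.
url = "http://api.twitter.com/1/trends/current.json"
data = requests.get(url, params={"count": 50}).json()

for trends in data.get("trends", {}).values():
    for trend in trends:
        print(trend["name"])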
You could monitor your timeline, collect all the tweets, save them in a database, analyze them via NLP, and extract words (for example, person names), then aggregate and count them, for example "Obama 50 times, Java 10 times, Linux 5 times".
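A rough sketch of that aggregation step (the tweet list and the naive tokenization here are placeholders; real NLP, e.g. named-entity recognition, would replace the simple split):

import re
from collections import Counter

# Placeholder input -- in practice these would be tweets pulled from the
# timeline and stored in a database.
tweets = [
    "Obama speaks about the economy",
    "New Java release announced",
    "Running Linux on everything",
]

counts = Counter()
for tweet in tweets:
    # Naive tokenization; swap in proper NLP here.
    words = re.findall(r"[a-z]+", tweet.lower())
    counts.update(words)

# Top 100 most used words, analogous to "Obama 50 times, Java 10 times, ..."
for word, count in counts.most_common(100):
    print(word, count)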