Zapier sending Python JSON dict and skipping null values - zapier

From the Limo Anywhere trigger, I am getting values in my Node.js code action. Suppose we are getting the following value from the trigger:
null,the four seasons hotel
Zapier passes only "the four seasons hotel" to the code step and drops the null. Is there a way in Zapier to get the raw JSON and parse it in the code?
My code:
let rows = "";
const toLocations = inputData.to_location.split(",");
const firstNames = inputData.first_name.split(",");
const lastNames = inputData.last_name.split(",");
const pickupDates = inputData.pickup_date.split(",");
const fromLocations = inputData.from_location.split(",");
// Count records only after firstNames is declared; referencing it earlier
// throws a ReferenceError because of the const temporal dead zone.
const totalRecords = firstNames.length;
for (let i = 0; i < totalRecords; i++) {
  rows += `${firstNames[i]} ${lastNames[i]} ${fromLocations[i]} ${toLocations[i]} ${pickupDates[i]}`;
  if (i !== (totalRecords - 1)) {
    rows += ",";
  }
}
return {
  rows: rows
};

Following is a somewhat involved workaround for the above problem:
1. Make a separate Zap with a "Catch Raw Webhook" trigger in Zapier.
2. Take the webhook URL from step 1 and add a webhook action to the existing Limo Anywhere Zap. This webhook action will send the JSON as a POST under the "raw" parameter. Choose Custom Request, set the method to POST, and set Data Pass-Through to Yes.
3. Add a Code action after the webhook trigger to parse the Python dictionary string in Zapier:
import json, ast
# ast.literal_eval turns the Python-dict string into a dict; json.dumps re-serializes it as JSON
output = {'res': json.dumps(ast.literal_eval(input_data["raw"]))}
This now gives you proper JSON that JavaScript can consume, and from here on you can work with the data.
I had to use Python in step 3 because the raw payload was a Python dictionary string rather than valid JSON, and it looked like the following:
{u'due_date': u'2019-03-22T00:00:00', u'terms': u'due_upon_receipt'}
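As an illustration only (this is not part of the original answer), a follow-up Code by Zapier step in JavaScript could then parse the res output from step 3. The field mapping is an assumption: inputData.res must be wired to the res output of the Python step.
// Hypothetical follow-up JavaScript code step; inputData.res is assumed to be
// mapped from the 'res' output of the Python step above.
const record = JSON.parse(inputData.res);

// Null values now survive as real nulls instead of being silently dropped.
const dueDate = record.due_date; // e.g. "2019-03-22T00:00:00"
const terms = record.terms;      // e.g. "due_upon_receipt"

return { due_date: dueDate, terms: terms };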

Related

Script fails when running normally but in debug it's fine

I'm developing a Google spreadsheet that automatically requests information from a site; the code is below. The variable 'tokens' is an array consisting of about 60 different 3-letter unique identifiers. The problem I have been getting is that the code keeps failing to request all of the information from the site. Instead it falls back (at random) on the validation part and fills the array with "ERROR!" strings. Sometimes it's row 5, then 10-12, then 3, then multiple rows, etc. When I run it in debug mode everything is fine; I can't seem to reproduce the problem.
I already tried adding a sleep (100 ms), but that fixed nothing. I also looked at the amount of traffic the API accepts (10 requests per second, 1,200 per minute, 100,000 per day); it shouldn't be a problem.
Runtime is limited, so I need it to be as efficient as possible. I'm thinking it is an issue of computational power after I push all the values from the JSON request into the 'tokens' array. Is there a way to let the script wait as long as necessary for the changes to be committed?
function newGetOrders() {
  var starttime = new Date().getTime().toString();
  var refreshTime = new Date();
  var tokens = retrieveTopBin();
  var sheet = SpreadsheetApp.openById('aaafFzbXXRzSi-eXBu9Xh81Ne2r09vM8rLFkA4fY').getSheetByName("Sheet37");
  sheet.getRange('A2:OL101').clear();
  for (var i = 0; i < tokens.length; i++) {
    var request = UrlFetchApp.fetch("https://api.binance.com/api/v1/depth?symbol=" + tokens[i][0] + "BTC", {muteHttpExceptions: true});
    var json = JSON.parse(request.getContentText());
    tokens[i].push(refreshTime);
    Utilities.sleep(100);
    for (var k in json.bids) {
      tokens[i].push(json.bids[k][0]);
      tokens[i].push(json.bids[k][1]);
    }
    for (var k in json.asks) {
      tokens[i].push(json.asks[k][0]);
      tokens[i].push(json.asks[k][1]);
    }
    // Pad short rows so setValues() always receives 402 columns
    if (tokens[i].length < 402) {
      for (var x = tokens[i].length; x < 402; x++) {
        tokens[i].push("ERROR!");
      }
    }
  }
  sheet.getRange(2, 1, tokens.length, 402).setValues(tokens);
}
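One possible direction, offered here only as a hedged sketch rather than a fix from the original thread: because muteHttpExceptions: true suppresses errors, a transient non-200 response still reaches JSON.parse, the parsed object then has no bids or asks, and the row falls through to the "ERROR!" padding. Retrying the fetch before giving up might look like this, using the same Binance endpoint as above:
// Hypothetical helper (not in the original post): retry the depth request a few
// times, backing off between attempts, before returning null.
function fetchDepthWithRetry(symbol, maxAttempts) {
  for (var attempt = 1; attempt <= maxAttempts; attempt++) {
    var response = UrlFetchApp.fetch(
        "https://api.binance.com/api/v1/depth?symbol=" + symbol + "BTC",
        {muteHttpExceptions: true});
    if (response.getResponseCode() === 200) {
      return JSON.parse(response.getContentText());
    }
    Utilities.sleep(500 * attempt); // simple linear backoff
  }
  return null; // caller decides how to mark the row (e.g. "ERROR!")
}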

Making a variable number of parallel HTTP requests with Gatling?

I am trying to model a server-to-server REST API interaction in Gatling 2.2.0. There are several interactions of the type "request a list and then request all items on the list in parallel", but I can't seem to model this in Gatling. Code so far:
def groupBy(dimensions: Seq[String], metric: String) = {
  http("group by")
    .post(endpoint)
    .body(...).asJSON
    .check(
      ...
      .saveAs("events")
    )
}

scenario("Dashboard scenario")
  .exec(groupBy(dimensions, metric)
    .resources(
      // a http() for each item in session("events"), plz
    )
  )
I have gotten as far as figuring out that parallel requests are performed by .resources(), but I don't understand how to generate a list of requests to feed it. Any input is appreciated.
The approach below is working for me. The Seq of HttpRequestBuilder will be executed concurrently:
val numberOfParallelReq = 1000

val scn = scenario("Some scenario")
  .exec(
    http("first request")
      .post(url)
      .resources(parallelRequests: _*)
      .body(StringBody(firstReqBody))
      .check(status.is(200))
  )

def parallelRequests: Seq[HttpRequestBuilder] =
  (0 until numberOfParallelReq).map(i => generatePageRequest(i))

def generatePageRequest(id: Int): HttpRequestBuilder = {
  val body = "Your request body here...."
  http("page")
    .post(url)
    .body(StringBody(body))
    .check(status.is(200))
}
I'm not very sure of your question, but it seems like you need to send parallel requests, which can be done with:
setUp(scn.inject(atOnceUsers(NO_OF_USERS)))
Refer to this: http://gatling.io/docs/2.0.0-RC2/general/simulation_setup.html

I need to get more than 100 pages in my query

I want to get all the video information possible from YouTube for my project. I know that the page limit is 100.
I wrote the following code:
ArrayList<String> videos = new ArrayList<>();
int videosTotales = 0;
for (int i = 1; i <= 100; i++) {
    String peticion = "http://gdata.youtube.com/feeds/api/videos?category=Comedy&alt=json&max-results=50&page=" + i;
    URL oracle = new URL(peticion);
    URLConnection yc = oracle.openConnection();
    BufferedReader in = new BufferedReader(new InputStreamReader(yc.getInputStream()));
    // Read the whole response instead of skipping every other line
    StringBuilder respuesta = new StringBuilder();
    String inputLine;
    while ((inputLine = in.readLine()) != null) {
        respuesta.append(inputLine);
    }
    in.close();
    JSONObject jsonObj = new JSONObject(respuesta.toString());
    JSONObject jsonFeed = jsonObj.getJSONObject("feed");
    JSONArray jsonArr = jsonFeed.getJSONArray("entry");
    for (int j = 0; j < jsonArr.length(); j++) {
        videos.add(jsonArr.getJSONObject(j).getJSONObject("id").getString("$t"));
        System.out.println("Numero " + videosTotales + " " + jsonArr.getJSONObject(j).getJSONObject("id").getString("$t"));
        videosTotales++;
    }
}
When the program finishes, I have 5000 videos per category, but I need much, much more, and the limit is page = 100.
So, how can I get more than 10 million videos?
Thank you!
Are those 5000 also unique IDs?
I see the use of max-results=50, but not a start-index parameter in your URL.
There is a limit on the results you can get per request. There is also a limit on the number of requests that you can send within some time interval. By checking the status code of the response and any error message, you can find these limits, as they may change over time.
Besides the category parameter, use some other parameters too. For instance, you may vary the q parameter (used with some keywords) and/or the order parameter to get a different result set.
See the documentation for available parameters.
Note that you are using API version 2, which is deprecated. There is an API version 3.

Node.js url.parse result back to string

I am trying to do some simple pagination.
To that end, I'm trying to parse the current URL, then produce links to the same query, but with incremented and decremented page parameters.
I've tried doing the following, but it produces the same link, without the new page parameter.
var parts = url.parse(req.url, true);
parts.query['page'] = 25;
console.log("Link: ", url.format(parts));
The documentation for the URL module seems to suggest that format is what I need but I'm doing something wrong.
I know I could iterate and build up the string manually, but I was hoping there's an existing method for this.
If you look at the latest documentation, you can see that url.format behaves in the following way:
search will be used in place of query
query (object; see querystring) will only be used if search is absent.
And when you modify query, search remains unchanged and format uses it. So to force format to use query, simply remove search from the object:
var url = require("url");
var parts = url.parse("http://test.com?page=25&foo=bar", true);
parts.query.page++;
delete parts.search;
console.log(url.format(parts)); //http://test.com/?page=26&foo=bar
Make sure you're always reading the latest version of the documentation, this will save you a lot of trouble.
It seems to me like it's a bug in Node. You might try:
// in requires
var url = require('url');
var qs = require('querystring');
// later
var parts = url.parse(req.url, true);
parts.query['page'] = 25;
parts.query = qs.stringify(parts.query);
console.log("Link: ", url.format(parts));
The other answer is good, but you could also do something like this. The querystring module is used to work with query strings.
var querystring = require('querystring');
var qs = querystring.parse(parts.query);
qs.page = 25;
parts.search = '?' + querystring.stringify(qs);
var newUrl = url.format(parts);
To DRY out the code and get at URL variables without needing to require('url') in every handler, I used:
/*
Used the url module to parse and place the parameters into req.urlparams.
Follows the same pattern used for swagger API path variables that load
into the req.params scope.
*/
app.use(function(req, res, next) {
  var url = require('url');
  var queryURL = url.parse(req.url, true);
  req.urlparams = queryURL.query;
  next();
});
var myID = req.urlparams.myID;
This will parse the URL variables and place them into req.urlparams. It runs early in the request workflow, so it is available for all Express.js paths.
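Tying this back to the original pagination question, here is a minimal sketch (the helper name and example URLs are illustrative, not taken from the answers above) that builds previous/next links from the current request URL using the delete-parts.search trick from the first answer:
var url = require('url');

// Hypothetical helper: build a link to the same query with the page shifted by delta.
function pageLink(reqUrl, delta) {
  var parts = url.parse(reqUrl, true);
  var current = parseInt(parts.query.page, 10) || 1;
  parts.query.page = current + delta;
  delete parts.search; // force url.format() to use the updated query object
  return url.format(parts);
}

// Usage:
// pageLink('/items?page=3&sort=name', 1)   -> '/items?page=4&sort=name'
// pageLink('/items?page=3&sort=name', -1)  -> '/items?page=2&sort=name'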

Send HTTPService request in Flex 3 with '-' in the URL parameters to get Google feeds

I am developing an application in Flex 3 which interacts with Google feeds to produce my results. The URL to which I want to send the request is something like this:
http://books.google.com/books/feeds/volumes?q=football+-soccer&start-index=11&max-results=10
Now I can send and receive results with the q parameter, but the next two parameters contain a '-' (start-index and max-results). I am using HTTPService to send the request like this:
SearchService.url = "http://books.google.com/books/feeds/volumes";
SearchService.method = "GET";
SearchService.contentType = "application/x-www-form-urlencoded";
Here SearchService is the HTTPService
var params:Object = new Object();
params.q = searchText;
params.start-index = 11;
params.max-results = 100;
service.SearchService.send(params);
Now my Flex IDE throws me an error stating 'Cannot assign a non-reference value'. Only if I can send the request with these parameters can I add pagination to my application. So how can I send an HTTPService request with '-' in the URL parameters?
You can do:
var params:Object = new Object();
params["q"] = searchText;
params["start-index"] = 11;
params["max-results"] = 100;
service.SearchService.send(params);
Validated and tested to work properly! :)
