Does Karate Support Def Variable update for latest data [duplicate] - bdd

This question already has an answer here:
Karate API Testing - Reusing variables in different scenarios in the same feature file
(1 answer)
Closed 1 year ago.
I am writing a test case where one API is called and its response data is used by the next API call, all as part of one Scenario.
I am passing test data through an Examples table with 4 records. Within the one Scenario, the output of the first Given API call is passed to the second Given API call. To compare the results, I need the first API call's output data so I can compare it with the second API call's results.
So is there any way to capture the first API call's data for all four test records in one variable (updating the variable each time)?
example:
* def var = 'hello'
* def var = var + 'world'
Please help.

Please read the docs, copied below for convenience: https://github.com/intuit/karate#script-structure
Variables set using def in the Background will be re-set before every Scenario. If you are looking for a way to do something only once per Feature, take a look at callonce. On the other hand, if you are expecting a variable in the Background to be modified by one Scenario so that later ones can see the updated value - that is not how you should think of them, and you should combine your 'flow' into one scenario. Keep in mind that you should be able to comment-out a Scenario or skip some via tags without impacting any others. Note that the parallel runner will run Scenario-s in parallel, which means they can run in any order.
So please don't expect a variable in one Scenario to be update-able by another Scenario.
But within a Scenario if you want to "collect" data, there are many ways. For example try appending to a list - refer: https://github.com/intuit/karate#json-transforms
* def init = []
# make the first API call here, then collect its response
* karate.appendTo(init, response)
# repeat after each call - within one Scenario, init will hold every collected response for later comparison

Related

GET or POST? Openapi 3.0.0 to retrieve element with complex set of conditions [duplicate]

This question already has answers here:
Design RESTful query API with a long list of query parameters [closed]
(4 answers)
How to send a huge parameter list to a GET request
(3 answers)
Rest POST VS GET if payload is huge
(5 answers)
How to design RESTful advanced search/filter
(1 answer)
Closed 6 months ago.
I'm trying to design a (simple?) REST API using OpenAPI 3.0 and the Swagger tools (and I'm sort of new to this, I should add).
One of the endpoints will retrieve a set of items (from a database of several thousand items) based on one or more 'filters', or set of conditions.
paths:
  /item/filter:
    get:
Here is my first doubt: I have to pass a (potentially large) set of conditions to my query.
For example:
[
  {"propertyName":"height","queryType":"EqualOrGreaterThan","targetvalue":"1200"},
  {"propertyName":"color","queryType":"Equal","targetvalue":"green"},
  {"propertyName":"weight","queryType":"LessThan","targetvalue":"1.2"}
]
Does using the GET method imply that everything will be encoded into the URL itself? How will OAS/Swagger know that my URLs won't get too long? Or is this some parameter that I can tweak in the codegen part?
Alternatively, I could use the POST method, but wouldn't this be a betrayal of good API design, as this query would not be modifying the status of the system/database?
Also, GET or POST method aside, I will still need to model the complex query. Should I go with defining a 'filter' schema and have my request define an array of items with the 'filter' schema? Or am I missing a better API design strategy for a case like this?
Also, what about the backend? Will the format of the data the server receives be the same regardless of POST/GET and, above all, regardless of the programming language used for the backend? (.NET, Python Flask, Java, PHP,...)?
During tutorials it looks easy, but I'm sort of confused right now.
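For what it's worth, the filter payload above is plain JSON, so the structure the backend receives is the same regardless of GET/POST and of the backend language; only the parsing library differs. Here is a minimal sketch, assuming a Java backend with Jackson, of deserializing such a 'filter' array (the class and field names simply mirror the JSON above and are illustrative, not part of any spec):
import com.fasterxml.jackson.databind.ObjectMapper;
import java.util.List;

// Hypothetical model for one entry of the 'filter' schema described above.
class Filter {
    public String propertyName;
    public String queryType;
    public String targetvalue;
}

public class FilterParsingSketch {
    public static void main(String[] args) throws Exception {
        // The same JSON shape shown above, however it arrives (query string or request body).
        String body = "[{\"propertyName\":\"height\",\"queryType\":\"EqualOrGreaterThan\",\"targetvalue\":\"1200\"},"
                    + "{\"propertyName\":\"color\",\"queryType\":\"Equal\",\"targetvalue\":\"green\"}]";

        ObjectMapper mapper = new ObjectMapper();
        List<Filter> filters = mapper.readValue(body,
                mapper.getTypeFactory().constructCollectionType(List.class, Filter.class));

        for (Filter f : filters) {
            System.out.println(f.propertyName + " " + f.queryType + " " + f.targetvalue);
        }
    }
}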

Integromat Scenario: One-shot module after iterating through a loop

I have created a scenario where I iterate through multiple modules with an array of data. This works fine.
After this completes, I want to run a module once before the scenario completes.
How do I add a module that won't get called in the loop?
There are a few ways to achieve this:
Use a Router to create a new route that will be triggered after the first route is complete.
Trigger a new scenario via webhooks after you are done with the current scenario.
If you are working with an array, then using the Array Aggregator or other aggregators will allow you to first complete the iteration and then trigger the module you want to use.
I am not sure exactly what you want to do after the iteration is complete, but setting up the scenarios as described below should help you get started on this.
Using Router
For this you can create a router. The upper route of the router is always executed first, so the iterator and other operations will be done there. After that, the next route will be executed, which will be the module you want to trigger last.
However, if you want to pass some values from the first route to the last one, then you will need to set a variable on the first route and fetch it on the second route. See details here: https://www.integromat.com/en/help/converger
Using Aggregator Module
You can use the Array, Text or Numeric Aggregator to aggregate all the iteration operations and then trigger the module that you want to run last.
As far as my knowledge goes, there is no default Integromat module that can be configured to run just before the scenario ends. We may be able to leverage the Integromat API, currently in development, to do so in the future.
I found a filter to be the easiest way of doing this: essentially checking if the current bundle position is equal to the total number of bundles.
If you're interested in doing something on the last iteration only, you can use a filter to check whether the current bundle is equal to the total number of bundles.
last bundle filter
They won't let me paste pics sigh

ADO API: Builds-List incomplete list

I'm calling this API method:
https://learn.microsoft.com/en-us/rest/api/azure/devops/build/builds/list?view=azure-devops-rest-6.0#response
My API url (with placeholder names):
https://dev.azure.com/MyOrgName/MyProjName/_apis/build/builds?api-version=6.1-preview.6
The results are mostly appropriate, except that I get a filtered list of builds and I can't seem to get all the builds I want. In particular, builds from several pipelines are simply missing, and there's no discernible reason why some builds are included and some are not.
The filter options describe ways I could reduce the list further, but that's not my goal. I want to retrieve builds which I am otherwise not getting, and I can't tell which option I'm missing that would get me the results I care about.
As you have already noticed, there is a maximum number of objects that can be listed in the response body of each API call. Normally, if the objects you want to list are too many, they are returned in multiple pages.
In the response of each call there is generally a parameter 'continuationToken' (see here). You can access the next response page by calling the API again with this parameter.
GET https://dev.azure.com/{organization}/{project}/_apis/build/builds?continuationToken={continuationToken}&api-version=6.1-preview.6
For example:
the first call returns the list in the first page;
then run the second call with the parameter 'continuationToken' returned in the response of the first call to get the second page;
then get the third page using the 'continuationToken' returned in the second response;
. . .
until the last page.
If you want to traverse all the pages, you may need to call the API in a loop.
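For example, a minimal sketch of such a loop in Java (using java.net.http and authenticating with a personal access token; the AZDO_PAT environment variable name is just illustrative, and where the token comes back can vary, so this sketch assumes the x-ms-continuationtoken response header as one common place and should be adjusted to wherever your responses actually carry it):
import java.net.URI;
import java.net.URLEncoder;
import java.net.http.HttpClient;
import java.net.http.HttpRequest;
import java.net.http.HttpResponse;
import java.nio.charset.StandardCharsets;
import java.util.Base64;

public class ListAllBuilds {
    public static void main(String[] args) throws Exception {
        String baseUrl = "https://dev.azure.com/MyOrgName/MyProjName/_apis/build/builds?api-version=6.1-preview.6";
        // Basic auth with a personal access token (user name left empty).
        String auth = Base64.getEncoder()
                .encodeToString((":" + System.getenv("AZDO_PAT")).getBytes(StandardCharsets.UTF_8));
        HttpClient client = HttpClient.newHttpClient();

        String continuationToken = null;
        do {
            String url = baseUrl;
            if (continuationToken != null) {
                url += "&continuationToken=" + URLEncoder.encode(continuationToken, StandardCharsets.UTF_8);
            }
            HttpRequest request = HttpRequest.newBuilder(URI.create(url))
                    .header("Authorization", "Basic " + auth)
                    .GET()
                    .build();
            HttpResponse<String> response = client.send(request, HttpResponse.BodyHandlers.ofString());

            // Process the builds in this page here (response.body() is the JSON for one page).
            System.out.println(response.body());

            // Token for the next page; assumed here to arrive in the x-ms-continuationtoken header.
            continuationToken = response.headers().firstValue("x-ms-continuationtoken").orElse(null);
        } while (continuationToken != null);
    }
}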

Using Data Movement SDK, want to remove collection from docs in real time in MarkLogic?

Actually I am new to the Data Movement SDK. I want to know how we can use the Data Movement SDK to remove a collection from docs that match a specific condition, in real time, in MarkLogic.
Yes, DMSK can reprocess documents in the database, including modifying the collections on the documents.
The most efficient way to change document collections on the server might be to take an approach similar to the out-of-the-box ApplyTransformListener (as summarized by https://docs.marklogic.com/guide/java/data-movement#id_51555) but to execute a custom module instead of a transform.
Summarizing the main points (a sketch tying them together follows the list):
Write an SJS (Server-Side JavaScript) module that declares a variable (using the JavaScript var statement) to receive the document URIs sent by the client and modifies the collections on those documents using a function such as
https://docs.marklogic.com/xdmp.documentSetCollections
Install the SJS module in the modules database as described here
https://docs.marklogic.com/guide/java/resourceservices#id_13008
Create a QueryBatcher to get the document URIs either from a query on the database or from a client iterator as described here:
https://docs.marklogic.com/guide/java/data-movement#id_46947
Supply a lambda function for the QueryBatcher.onUrisReady() method - see
https://docs.marklogic.com/javadoc/client/com/marklogic/client/datamovement/QueryBatcher.html#onUrisReady-com.marklogic.client.datamovement.QueryBatchListener-
In the lambda function, construct and execute a ServerEvaluationCall to the SJS module, assigning the variable to the URIs passed to the lambda function - see:
https://docs.marklogic.com/guide/java/resourceservices#id_84134
Be sure to register failure listeners using the QueryBatcher.onQueryFailure() and ApplyTransformListener.onFailure() methods to log errors or otherwise respond to the unexpected.
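Here is a minimal sketch of those steps in Java, assuming a hypothetical SJS module installed at /ext/update-collections.sjs that receives an external variable named uris, plus an illustrative collection query and credentials - none of these names come from the question, so adjust the query, module path and collections to your own condition:
import com.marklogic.client.DatabaseClient;
import com.marklogic.client.DatabaseClientFactory;
import com.marklogic.client.datamovement.DataMovementManager;
import com.marklogic.client.datamovement.QueryBatcher;
import com.marklogic.client.query.StructuredQueryBuilder;
import com.marklogic.client.query.StructuredQueryDefinition;

public class UpdateCollectionsSketch {
    public static void main(String[] args) {
        DatabaseClient client = DatabaseClientFactory.newClient("localhost", 8000,
                new DatabaseClientFactory.DigestAuthContext("admin", "admin"));
        DataMovementManager dmm = client.newDataMovementManager();

        // Query selecting the documents whose collections should change (illustrative condition).
        StructuredQueryDefinition query = new StructuredQueryBuilder().collection("needs-cleanup");

        QueryBatcher batcher = dmm.newQueryBatcher(query)
                .withBatchSize(100)
                .withThreadCount(4)
                .onUrisReady(batch -> {
                    // Send the batch of URIs to the custom SJS module instead of a transform.
                    batch.getClient().newServerEval()
                            .modulePath("/ext/update-collections.sjs")   // hypothetical module path
                            .addVariable("uris", String.join(",", batch.getItems()))
                            .evalAs(String.class);
                })
                .onQueryFailure(failure -> failure.printStackTrace());

        dmm.startJob(batcher);
        batcher.awaitCompletion();
        dmm.stopJob(batcher);
        client.release();
    }
}
The matching SJS module would, roughly, call declareUpdate(), declare var uris;, split the comma-separated string, and call xdmp.documentSetCollections() (or xdmp.documentRemoveCollections() if you only want to remove a collection) for each URI.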
Hoping that helps,

Split datetime value received from external API in Rails app

I have a datetime value which comes from the API in this format: 2015-07-07T17:30:00+00:00. I simply want to split it into its date and time values at this point. I am not using an Active Record model and I would prefer not to use an SQL database if I can avoid it.
The way I have set up the app means that the value is "stored" like this in my view: #search.dining_date_and_time
I have tried two approaches to solving this problem:
Manually, based on this previous Stack Overflow question from 2012: Using multiple input fields for one attribute - but the error I get is that the attribute is "nil", even though I put in a "try".
Using this gem, https://github.com/ccallebs/split_date_time, which is a bit more recent and seems to be a more elegant solution. But after closely following the doc, I get this error saying my Search model is not initialized and there is no method: undefined method `dining_date' for #<Search not initialized>
This is when I instead put #search.dining_date in the view, which seems to be the equivalent of the doc's example (it's not that clear). The doc also says the method will be automatically generated.
Do I need to alter my model so I receive the data from the API in another way? ie. not get the variable back as #search.dining_date_and_time from the Search model for any of this to work?
Do I need an Active Record model so that before_filter or before_save logic works - so I can (re)concatenate after splitting and the data is sent back to the API in a format it understands? Can I avoid this? It seems a bit of overkill to restructure the whole app and put in a full database just so I can split and join date/time values as needed.
Happy to provide further details, code snippets if required.
As I am not using a conventional Rails DB like MySQL, SQLite or PostgreSQL, I found that the best solution to the problem was to use this jQuery dateFormat plugin: https://github.com/phstc/jquery-dateFormat to split the date and time values for display when I get the data back from the API.
The GitHub docs were not too expansive, but once I simply put the library file in my Rails JavaScript assets folder, I just had to write a few lines of jQuery to get the result and format I wanted:
$(function() {
  var rawDateTime = $('#searchDiningDateTime').html();
  // console.log(rawDateTime);
  var cleanDate = $.format.date(rawDateTime, "ddd, dd/MM/yyyy");
  // console.log(cleanDate);
  $('#searchDiningDateTime').html(cleanDate);
  var cleanTime = $.format.date(rawDateTime, "HH:mm");
  // console.log(cleanTime);
  $('#searchTime').html(cleanTime);
});
Next challenge: rejoin the values on submit, so the API can read the data by sending/receiving a valid request/response. (The values can't be split like this when sent to the remote service).
