GraphQL Resolvers on client side for Apollo-iOS

How do I write a resolver for making multiple API calls to fulfil a GraphQL query using Apollo-iOS on the client side (in my Swift project)?
For example: to construct a Person object from a query with name and age fields, I have to fetch name from one service call and age from another, then stitch them together to form the Person object. What would the resolver look like, and where should I write it?
Any help is appreciated.

I believe GraphQL resolvers are a server-side concept only. The client simply asks for the data; the server is supposed to resolve the query the client sends, so your resolvers should make the necessary calls to the different services to resolve the query completely behind the single endpoint.
A more in-depth explanation of resolvers in GraphQL: https://medium.com/paypal-engineering/graphql-resolvers-best-practices-cd36fdbcef55
You could write your own Swift code to grab data from two separate GraphQL endpoints/services if needed.
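
For illustration, here is a minimal sketch of that stitching in plain Swift (async/await, iOS 15+), with no Apollo involvement. The endpoint URLs and the NameResponse/AgeResponse payload shapes are hypothetical placeholders for your real services:

import Foundation

struct Person {
    let name: String
    let age: Int
}

// Hypothetical payloads returned by the two backing services.
struct NameResponse: Decodable { let name: String }
struct AgeResponse: Decodable { let age: Int }

// Fetch and decode a JSON payload of the given type.
func fetchJSON<T: Decodable>(_ type: T.Type, from url: URL) async throws -> T {
    let (data, _) = try await URLSession.shared.data(from: url)
    return try JSONDecoder().decode(T.self, from: data)
}

func fetchPerson() async throws -> Person {
    // Placeholder endpoints for the two service calls.
    let nameURL = URL(string: "https://name-service.example.com/person/1")!
    let ageURL = URL(string: "https://age-service.example.com/person/1")!

    // Fire both requests concurrently, then stitch the results
    // into the single Person the caller asked for.
    async let nameResponse = fetchJSON(NameResponse.self, from: nameURL)
    async let ageResponse = fetchJSON(AgeResponse.self, from: ageURL)
    return try await Person(name: nameResponse.name, age: ageResponse.age)
}

Apollo-iOS would only come into play if you later front those services with a GraphQL server; the stitching itself is ordinary networking code.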

Related

OData in Data Factory

I have a task to get some data from an external supplier.
They have a REST OData API. I have to connect using a subscription key (API key).
When creating the OData linked service, I add an auth header "subscription-key" and enter my key in the value field. After saving, I create a new dataset, and the OData linked service provides me with the remote tables. I can choose the table I want, and after that I create a pipeline to copy data from that table to my Azure SQL Server.
This works fantastic :-)
However, after closing my browser and reopening it, the subscription key I entered earlier on the linked service is replaced with stars, as it is a secure string. When I now run my pipeline, it treats the ten stars that replaced my real key as the key itself.
What am I doing wrong here ?
Also, I would prefer to get my value from Key Vault, but it seems that this is not possible on OData connections.
Hope someone is able to provide some insight here :-)
BR Tom
From my testing, I did not get any error on re-running. As for dynamic keys, however, I was not able to achieve this using the OData linked service.
Alternatively, if you can hit the OData endpoint with the REST/HTTP connector:
Have a Web activity get the key from Key Vault and set it in a variable.
Web activity URL: https://<your-keyvault-name>.vault.azure.net/secrets/<your-secret-name>?api-version=7.0
Access the output of the Web activity using @activity('Web1').output.value and store it in a variable.
Reference this variable as the subscription key in the subsequent steps in the REST/HTTP dataset.
You can pass it along in the additional headers, as sketched below.
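
As a sketch, the two expressions involved would look like this, assuming the Web activity is named Web1 and the variable is named subscriptionKey (both names are placeholders):

Set Variable activity, value:
@activity('Web1').output.value

Additional header on the REST/HTTP request:
subscription-key: @{variables('subscriptionKey')}

The @{...} form performs string interpolation, so the header carries the secret fetched from Key Vault at run time instead of the masked value stored on the linked service.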

Using Parameters of One Request to Dynamically Change the Response of Another

I have been using response templating to give dynamic responses, given that all the request and query parameters are associated with that request itself. However, I want to make a POST request with several parameters and later use those parameters in a stubbed GET method's body response via response templating. Is this possible to do in WireMock? Any input is greatly appreciated, thank you!
Storing state between requests is not a default feature of WireMock outside of mocking the behavior through Stateful Behaviour, which is different from being actually stateful.
Without a custom extension, sharing information between several requests is therefore not possible. The WireMock documentation has a section on how to create such a plugin yourself. With a little development experience this is certainly doable.
On GitHub there are several plugins that provide a storage mechanism for this kind of information:
WireMockCsv: store and retrieve information using HSQL Database.
wiremock-redis-extension does something similar using Redis.
An alternative to these approaches is to create the mappings/data just before the test starts, for example by generating all the responses beforehand and using a templated bodyFileName to retrieve the just-in-time created file. Another way of achieving this is to use the Admin API to create the mappings themselves directly, as sketched below.
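
As a sketch of that last route, a stub can be created just-in-time by POSTing a mapping to the running server's Admin API; the port, URL, and response data below are illustrative placeholders:

POST http://localhost:8080/__admin/mappings
{
  "request": { "method": "GET", "url": "/person" },
  "response": {
    "status": 200,
    "jsonBody": { "name": "Alice", "age": 30 }
  }
}

The values captured from the earlier POST are baked into the stub at creation time, which sidesteps the need to share state between requests inside WireMock itself.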

How to manually update Relay store without querying server?

Let's say I have some data that I obtained through a non-GraphQL endpoint, for example from a third-party server (Firebase).
How do I put the data into the local relay store?
Is there an easy way to add / edit / overwrite data to relay store directly without going through query or mutation?
A non-public RelayStoreData field is accessible from the Relay.Store instance, and it gives you direct access to the records contained in the store. I haven't done anything with this myself, but you could try modifying the cache directly like this:
RelayStore._storeData._cachedStore._records[recordId][fieldName]=newValue
I would use Relay without a server, defining your GraphQL schema locally and making your API requests from your GraphQL schema the same way you would query a database in your schema.
https://github.com/relay-tools/relay-local-schema

Is there a way to use Swagger just for validation without using the whole framework?

Suppose I have an existing Java service implementing a JSON HTTP API, and I want to add a Swagger schema and automatically validate requests and responses against it without retooling the service to use the Swagger framework / code generation. Is there anything providing a Java API that I can tie into and pass info about the requests / responses to validate?
(Just using a JSON schema validator would mean manually implementing a lot of the additional features in Swagger.)
I don't think there's anything ready-made to do this alone, but you can get there with the following:
Grab the SchemaValidator from the Swagger Inflector project. You can use this to validate inbound and outbound payloads.
Assign a schema portion to your request/response definitions. That means you'll need to assign a specific section of the JSON schema to your operations.
Create a filter for your API to grab the payloads and validate them against the schema.
That will let you easily see if the payloads match the expected structure.
Of course, this is all done for you automatically with Inflector, but there should be enough of the raw components to help you do this inside your own implementation.

simple bpel workflow : select query return multiple rows

I have to implement a simple BPEL workflow which only executes a select query on a database. I have been able to create a Data Service WSDL file. Its flow is attached to this question as an image; please have a look at that first. As the image shows, I somehow ended up with a complex structure for the parameter "Name" (WSDL code auto-generated by the WSO2 Data Services Server). It has a complex element called "Customer" which has two string values, "Name" and "nid". I have also copied the WSDL file in case you need to see it, here: http://pastebin.com/QTKZbdzn
I believe I am not sending any input parameters, yet when I try to invoke the Data Service directly without a Receive activity, it gives an error saying "No Start activity has been defined for the process".
If anyone has implemented a similar BPEL workflow for a Data Service, please let me know. The data service itself works fine; I have tested it separately. Thanks!
UPDATE
I ended up making a BPM like this:
I have to change the DSS also, so that I provide some input to the BPM: rather than "select * from customer", I am now implementing "select * from customer where nid = ?". It proved to be pretty successful. Thanks for helping me out, joergl & vimesh. But if you still figure out how a query with no WHERE clause would work, please update it here.
I have made a BPEL flow with a data service.
The very first thing we need to do is add a receive element to the BPEL flow. It lets us send a request to the BPS, and at the same time BPS creates a new instance for that request.
After that you can do whatever you wish, like invoking ESB proxies, DSS services, etc. While invoking the external service you can pass parameters to that request. Even though you are not sending any input parameters to the DSS service, you should make the request to the DSS inside BPS in the correct format (I mean the body part).
You can simply start from the BPEL samples available in this and then move on to the DSS integrations.
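
For reference, a minimal sketch of such a receive element; the partner link, port type, operation, and variable names are placeholders for whatever your WSDL actually defines:

<!-- createInstance="yes" marks this receive as the process's start activity -->
<receive name="start"
         partnerLink="client"
         portType="tns:DataServicePortType"
         operation="process"
         variable="input"
         createInstance="yes"/>

Without a receive (or pick) carrying createInstance="yes", the engine has no start activity, which is exactly the "No Start activity has been defined for the process" error quoted above.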
