I defined a GraphQL Mutation using graphql-relay but am having issues figuring out how to submit a mutation to it.
Here is the relevant schema:
const userType = new GraphQLObjectType({
name: 'User',
description: 'user',
fields: () => ({
id: {
type: new GraphQLNonNull(GraphQLString),
description: 'The UUID for the user.',
resolve(user) {
return user.uuid;
},
},
})
});
const registerUser = mutationWithClientMutationId({
name: 'registerUser',
inputFields: {
},
outputFields: {
user: {
type: userType,
resolve: (payload) => {
  // return the promise so GraphQL waits for the lookup
  return models.user.findById(payload.userId);
}
},
},
mutateAndGetPayload: (args) => {
  // save() returns a promise, so resolve the payload once the new user exists
  return models.user.build().save().then((newUser) => ({
    userId: newUser.id,
  }));
}
});
const rootMutation = new GraphQLObjectType({
name: 'RootMutationType',
fields: {
registerUser: registerUser,
},
});
const schema = new GraphQLSchema({
query: rootQuery,
mutation: rootMutation,
});
What should an HTTP call look like to register a new user and get back the userId?
Thanks!
First, I want to point out that your mutation declares no input fields - how will it know the new user's details? You'll probably need some parameters on that mutation eventually; they would be available to mutateAndGetPayload as its first argument. (Not every mutation needs parameters, but this one probably does.)
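Just as an illustration (email and name are made-up fields, and the Sequelize-style calls simply mirror your existing code), the mutation might grow into something like:
const { GraphQLNonNull, GraphQLString } = require('graphql');
const { mutationWithClientMutationId } = require('graphql-relay');

const registerUser = mutationWithClientMutationId({
  name: 'registerUser',
  // Hypothetical input fields -- replace with whatever a new user actually needs.
  inputFields: {
    email: { type: new GraphQLNonNull(GraphQLString) },
    name: { type: GraphQLString },
  },
  outputFields: {
    user: {
      type: userType,
      resolve: (payload) => models.user.findById(payload.userId),
    },
  },
  // The first argument receives the mutation's `input` object.
  mutateAndGetPayload: ({ email, name }) => {
    return models.user.build({ email, name }).save()
      .then((newUser) => ({ userId: newUser.id }));
  },
});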
If you're using Relay, the official documentation has some pretty good information on how to use your mutations from Relay, particularly the section at the bottom that shows the various mutator configs. If you're using connections, you may want RANGE_ADD to add the new account to the Relay store manually; if you'd rather perform a broader refetch, you can use FIELDS_CHANGE. You said you need the new user's id after the mutation finishes - if you're using Relay, you may want to look into REQUIRED_CHILDREN to specify that, regardless of the query Relay computes, that id is always fetched.
The output of your mutation is a userType, so you'd be able to access it with a fragment on the payload type (which would probably be RegisterUserPayload). That might look something like:
fragment on RegisterUserPayload {
user {
id
}
}
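For reference, here's a rough sketch of how that fragment could be wired into a classic Relay.Mutation using REQUIRED_CHILDREN (the class name and fat query are illustrative, and the payload type name depends on what graphql-relay generated for you):
import Relay from 'react-relay'; // classic Relay (react-relay < 1.0)

class RegisterUserMutation extends Relay.Mutation {
  getMutation() {
    return Relay.QL`mutation { registerUser }`;
  }
  getVariables() {
    return {}; // no input fields yet, per the schema above
  }
  getFatQuery() {
    return Relay.QL`
      fragment on RegisterUserPayload {
        user
      }
    `;
  }
  getConfigs() {
    // REQUIRED_CHILDREN forces these fields to be fetched even though
    // nothing in the Relay store references the new user yet
    return [{
      type: 'REQUIRED_CHILDREN',
      children: [Relay.QL`
        fragment on RegisterUserPayload {
          user { id }
        }
      `],
    }];
  }
}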
Now, that's all assuming you're using Relay. If you'd like to try this out manually via GraphiQL, the GraphQL mutation docs include a direct example of how you'd query your mutation.
Last, since you asked how to issue the HTTP request yourself at a low level, the express-graphql documentation explains how to query the endpoint over HTTP.
I figured out a mutation format that worked:
mutation RootMutationType {
registerUser(input:{clientMutationId:"123"}){
clientMutationId, user { id }
}
}
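For completeness, the actual HTTP call is just a POST with a JSON body whose query field contains that mutation string. A minimal sketch with fetch, assuming an express-graphql endpoint at http://localhost:3000/graphql (both details depend on your server setup):
const mutation = `
  mutation RootMutationType {
    registerUser(input: { clientMutationId: "123" }) {
      clientMutationId
      user { id }
    }
  }
`;

fetch('http://localhost:3000/graphql', {
  method: 'POST',
  headers: { 'Content-Type': 'application/json' },
  body: JSON.stringify({ query: mutation }),
})
  .then((res) => res.json())
  // the new user's id comes back under data.registerUser.user.id
  .then((json) => console.log(json.data.registerUser.user.id));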
I can't find any good documentation on how to execute GraphQL APIs from F# with Http.fs.
Please share the correct syntax if you have it, or point me to the right documentation. I was trying with the Star Wars API described here: https://www.rithmschool.com/blog/an-introduction-to-graphql-queries
URL: https://swapi.graph.cool
Header: 'Content-Type': 'application/json'
JSON Body:
query {
Film (title:"A New Hope" ) {
director
characters {
name
}
}
}
Expected Response same as: https://swapi.graph.cool/
I'm not familiar with Http.fs, but here is a small working example of calling the API using the F# Data Http utility:
Http.RequestString
( "https://swapi.graph.cool",
httpMethod="POST", headers=[ HttpRequestHeaders.ContentType("application/json") ],
body=TextRequest("{\"query\": \"{ allFilms { title } }\"}") )
The main thing is that the body needs to be a JSON value where the actual query is a string stored in a record with a field named "query", i.e. {"query": "...."}.
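So for the Film query from the question, the raw request body would need to be wrapped the same way, something like this (assuming the query itself is valid against that endpoint):
{ "query": "{ Film(title: \"A New Hope\") { director characters { name } } }" }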
I have been using Vuex with vue-apollo.
But now I know that vue-apollo stores data in its cache, so I can use local state through that cache.
When I read the official documentation on local state,
I learned how to get and set data in local state, but there is no mention of how to get data from a remote server.
I saw this code in the documentation; it just writes temporary data to the cache.
cache.writeData({
data: {
todoItems: [
{
__typename: 'Item',
id: 'dqdBHJGgjgjg',
text: 'test',
done: true,
},
],
},
});
So I think I can get remote data by using a vue-apollo query like the one below.
apollo: {
world: {
query: gql`query {
hello
}`,
update: data => data.hello
}
}
After I get the server data like that, I can query and mutate it from local state.
But this is only my guess - is it correct?
I found an example on GitHub. My guess was right.
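For anyone else looking for this, here is a minimal sketch of the idea (my own assumptions: apollo-client 2.x with local state enabled, and a made-up localHello field; this is not copied from the linked example):
import gql from 'graphql-tag';

export default {
  apollo: {
    world: {
      // normal remote query
      query: gql`query { hello }`,
      update: data => data.hello,
      // result() fires whenever the server responds; copy the value into the cache
      // so purely local (@client) queries can read it later
      result({ data }) {
        this.$apollo.getClient().writeData({
          data: { localHello: data.hello },
        });
      },
    },
  },
};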
While performing a read operation on a V4 ODataModel, I'm getting an error saying:
oModel.read is not a function
Please let me know how to correct this if I did something wrong.
This error is expected.
The read method does not exist on the OData V4 model.
See below:
read is not a function in V4
However, you can do the same thing with OData V2 (the recommended approach for working with OData, as V4 is still missing some features). See also:
Restrictions with OData V4
OData V2 vs OData V4
Nevertheless, if you need to bind the response items to a table later, you can do it like this:
var oModel = new sap.ui.model.odata.v4.ODataModel({
groupId: "$auto",
serviceUrl: "url",
synchronizationMode: "None",
operationMode: "Server"
}),
oSettings = new sap.ui.model.json.JSONModel({
bOnlyLarge: false,
bFilterGermany: false
});
var oTable = new sap.ui.table.Table({
columns: [{
label: "ProductName",
template: new sap.m.Text({
text: "{ProductName}"
}),
sortProperty: "ProductName"
}]
});
oTable.setModel(oModel);
oTable.bindRows({
path: "/Products"
});
var oModel = new sap.ui.model.odata.v4.ODataModel({
/* send requests directly. Use $auto for batch requests, which will be sent automatically before rendering */
groupId : "$direct",
/* I'll just quote the API documentation:
Controls synchronization between different bindings which refer to the same data for the case data changes in one binding.
Must be set to 'None' which means bindings are not synchronized at all; all other values are not supported and lead to an error.
*/
synchronizationMode : "None",
/*
Root URL of the service to request data from.
*/
serviceUrl : "http://services.odata.org/TripPinRESTierService/",
/*
Optional: the group ID that is used for update requests. If no update group ID is specified, mParameters.groupId is used, e.g.:
updateGroupId : "$direct"
*/
});
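And if you need to read data programmatically (the closest thing to the V2 read), here is a sketch under the assumption of a reasonably recent UI5 version where ODataListBinding#requestContexts is available:
// no oModel.read in V4 -- create a list binding and request its contexts instead
var oListBinding = oModel.bindList("/People");
oListBinding.requestContexts(0, 10).then(function (aContexts) {
    aContexts.forEach(function (oContext) {
        // getObject() returns the plain entity data for this context
        console.log(oContext.getObject());
    });
}).catch(function (oError) {
    console.log(oError.message);
});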
I want to track my custom processes through Zabbix (v2.4.8). I am generating the following JSON object and sending it through UserParameter=service.value[*],/usr/lib/zabbix/externalscripts/custom1.bash:
{
"data":[
{
"{#NAME}":"ntp",
"{#VALUE}":"1"
},
{
"{#NAME}":"mysql",
"{#VALUE}":"1"
},
{
"{#NAME}":"prometheus",
"{#VALUE}":"0"
},
{
"{#NAME}":"apache2",
"{#VALUE}":"0"
}
]
}
I also created an item prototype and a graph prototype inside a new template, with a new discovery rule, using the following settings:
Discovery rule name: Service Graph
Type: Zabbix Agent
key: service.value
Item Prototype name: Service {#NAME} Graph
Type: Zabbix Agent
key: service.value[{#NAME},{#VALUE}]
Type of info: Numeric(Unsigned) & Decimal
When I apply these settings, the items keep giving the following error:
Not supported: Received value [{ "data":[ { "{#NAME}":"ntp", "{#VALUE}":"1" }, { "{#NAME}":"mysql", "{#VALUE}":"1" }, { "{#NAME}":"prometheus", "{#VALUE}":"0" }, { "{#NAME}":"apache2", "{#VALUE}":"0" } ]}] is not suitable for value type [Numeric (unsigned)] and data type [Decimal]
I have to create a graph prototype with these settings, so I cannot set the type to "Text", for obvious reasons.
Another question: the graphs generated this way are not clickable at all, unlike the other existing graphs.
Please let me know where I am going wrong.
If your service.value key generates that JSON, it should be used by the LLD (discovery) rule only; you should not send any values in it. The key used in the item prototypes should be like any normal key that only returns the value it was asked for - do not reuse the LLD-generating key there.
Your current JSON looks like you might be able to use the built-in items for process monitoring, but that is hard to be sure about without additional detail.
Also note that [*] in the UserParameter definition is not needed if you do not pass parameters to this key.
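For example, a rough sketch of how the agent side could be split (the second script and the service.discovery/service.status key names are made up, not taken from your setup):
# Discovery key: prints the {#NAME}/{#VALUE} JSON shown above; used only by the LLD rule
UserParameter=service.discovery,/usr/lib/zabbix/externalscripts/custom1.bash
# Value key: prints a single 0/1 for the service passed as the first parameter,
# so the item prototype key can be service.status[{#NAME}]
UserParameter=service.status[*],/usr/lib/zabbix/externalscripts/service_status.bash $1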
In a schema with optional values such as code in the example:
'code': {
'type': 'string',
},
'name': {
'type': 'string',
'required': True,
},
'email': {
'type': 'string',
'required': True
}
Let's say there's an inserted document with a value for code. Can I unset the code key, the way MongoDB's $unset does, using Eve somehow?
One way to achieve this is to set up a default projection for the endpoint.
Limiting the Fieldset Exposed by the API Endpoint
By default API responses to GET requests will include all fields defined by the corresponding resource schema. The projection setting of the datasource resource keyword allows you to redefine the fields.
people = {
'datasource': {
'projection': {'username': 1}
}
}
The above setting will expose only the username field to GET requests, no matter the schema defined for the resource.
Another option is to leverage the MongoDB Aggregation Framework itself. Just set up the endpoint so that an aggregation is performed before data is returned to the client. The following should work (see the docs for details):
posts = {
'datasource': {
'aggregation': {
'pipeline': [{"$unset": "code"}]
}
}
}
You need Eve v0.7 for aggregation support.
I doubt you can do it with a PATCH request, but a PUT request should do.
import requests
# get original doc
resp = requests.get(document_url)
# prepare a new document
doc = resp.json()
new_doc = {k: v for k, v in doc.items() if not k.startswith('_')}
del new_doc['code']
# overwrite the complete document
resp = requests.put(document_url, json=new_doc, headers={'If-Match': doc['_etag']})