SAPUI5 OData V2 Batch Operations with multiple deferred groups

I'm trying to do a batch operation with multiple group IDs in a single batch request, and I need to get the results in two different batch responses.
But when I initiate the request, only one group is submitted and I get only one batch response.
Here is my code:
oModel = this.getOwnerComponent().getModel("mymodel");
$.sap.itemArray.forEach(function (entry) {
    if (p < $.sap.itemArray.length) {
        var oData = {
            "AUTO_ID": entry.Id,
            "VALUE": entry.Value
        };
        mParameters.groupId = "createGroup1";
        oModel.create("/Table1", oData, mParameters);
    }
    p++;
});
for (p = 0; p < $.sap.itemArray2.length; p++) {
    var oData = {
        "Item2ID": $.sap.itemArray2[p].ItemsId,
        "Value": $.sap.itemArray2[p].Value
    };
    mParameters.groupId = "createGroup2";
    oModel.create("/Table2", oData, mParameters);
}
oModel.setDeferredGroups(["createGroup1", "createGroup2"]);
oStyleSizeModel.submitChanges({
    success: function (recievedObject) {
        var responses = recievedObject.__batchResponses;
    },
    error: function (oError) {
        var oBody = oError.responseText;
    }
});
Only the first group's requests were executed and only one batch response was returned.
How do I execute multiple batch requests with different group IDs and get each group's responses?

I believe you want to group each change separately. For that you need the concept of a change set (not a group).
Use the createEntry method instead of create.
For each call, use a different changeSetId but the same groupId.
Then call submitChanges, passing the groupId.
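Something like this (a rough sketch only; aItems1, aItems2 and the change-set IDs are illustrative names, not taken from your code):
var oModel = this.getOwnerComponent().getModel("mymodel");
// Defer the group before creating entries so nothing is sent until submitChanges.
oModel.setDeferredGroups(["createGroup"]);
aItems1.forEach(function (entry, i) {
    oModel.createEntry("/Table1", {
        properties: { AUTO_ID: entry.Id, VALUE: entry.Value },
        groupId: "createGroup",
        changeSetId: "table1Change" + i   // a separate change set per entry
    });
});
aItems2.forEach(function (entry, i) {
    oModel.createEntry("/Table2", {
        properties: { Item2ID: entry.ItemsId, Value: entry.Value },
        groupId: "createGroup",
        changeSetId: "table2Change" + i
    });
});
// One $batch request; each change set is applied and reported separately.
oModel.submitChanges({
    groupId: "createGroup",
    success: function (oData) {
        var aResponses = oData.__batchResponses; // one entry per change set
    },
    error: function (oError) {
        var sBody = oError.responseText;
    }
});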

Related

How to do a POST batch request in the Microsoft Graph .NET SDK

I'm trying to do a batch request using the MS Graph .NET SDK as shown here: https://learn.microsoft.com/en-us/graph/sdks/batch-requests?tabs=csharp
The only problem is that when I run the code, nothing happens.
I'm trying to move a set of emails (stored in a list) to another mail folder.
Am I missing anything?
The move request is documented here: https://learn.microsoft.com/en-us/graph/api/message-move?view=graph-rest-1.0&tabs=http
It works when used as a single request, but not when batching.
Below you will find the code; in this case I'm looping to 20 just to test, as 20 is the maximum number of queries per batch.
Thanks in advance.
for (int i = 0; i < 20; i++)
{
    var mail = invalidMessages[i];
    var userRequest = client.Me.Messages[mail.Id]
        .Move(failureFolderID)
        .Request();
    requestID = batchRequestContent.AddBatchRequestStep(userRequest);
}
var returnedResponse = await client.Batch.Request().PostAsync(batchRequestContent);
EDIT: I tried to change the method to POST
userRequest.Method = System.Net.Http.HttpMethod.Post;
but I get a ServiceException: 'Code: BadRequest
Message: Write request id : fe23b1c1-663d-4499-829a-291d04a12b48 does not contain Content-Type header or body.'
The Microsoft Graph message-move API call you are attempting to use is a POST request, and the Microsoft Graph batch API handles POST requests differently than the other API methods. As per https://learn.microsoft.com/en-us/graph/sdks/batch-requests?tabs=csharp, POST requests are handled a bit differently: the SDK request builders generate GET requests, so you must get the HttpRequestMessage and convert it to a POST.
To have a successful POST with the batch API, you need to create an HttpRequestMessage and provide a value for its Content property, which holds the POST request's payload.
Applying this to your code, I would first create a class to represent the POST payload for the message-move API. As per https://learn.microsoft.com/en-us/graph/api/message-move?view=graph-rest-1.0&tabs=http, the POST body has a single property, destinationId:
destinationId - The destination folder ID, or a well-known folder name. For a list of supported well-known folder names, see the mailFolder resource type.
public class MailMovePayload
{
    public string destinationId { get; set; }
}
Then I would use an instance of this class in this modified version of your code:
for (int i = 0; i < 20; i++)
{
    var mail = invalidMessages[i];
    // Get the HttpRequestMessage object from your request builder
    var userRequestMessage = client.Me.Messages[mail.Id]
        .Move(failureFolderID)
        .GetHttpRequestMessage();
    // The builder generates a GET request, so set the method to POST
    userRequestMessage.Method = HttpMethod.Post;
    // Create the payload; I am assuming failureFolderID is
    // the ID of the folder where the mail will be moved to
    var payloadData = new MailMovePayload { destinationId = failureFolderID };
    // Make the JSON payload for the request message
    userRequestMessage.Content = new StringContent(JsonConvert.SerializeObject(payloadData), Encoding.UTF8, "application/json");
    requestID = batchRequestContent.AddBatchRequestStep(userRequestMessage);
}
var returnedResponse = await client.Batch.Request().PostAsync(batchRequestContent);

Avoid "Service invoked too many times in one day" message in Google Sheets

I tried to get the redirect URL in Google Sheets with the cache service to avoid the error message "Service invoked too many times in one day". However, the following code returns an error message. May I know how to fix it?
function getRedirects(url) {
  var params = {
    'followRedirects': false,
    'muteHttpExceptions': true
  };
  var followedUrls = [url];
  var cache = CacheService.getScriptCache();
  var properties = PropertiesService.getScriptProperties();
  try {
    let res = cache.get(url);
    while (true) {
      var res = UrlFetchApp.fetch(url, params);
      if (res.getResponseCode() < 300 || res.getResponseCode() > 399) {
        return followedUrls;
      }
      var url = res.getHeaders()['Location'];
      followedUrls.push(url);
    }
  }
}
You are reaching the Apps Script quota limit because of the while (true) loop.
As per the documentation, quotas are set at different levels for consumer users (such as gmail.com accounts). The one you are hitting is URL Fetch calls: 20,000 / day.
I'd replace the while (true) loop and work out beforehand how many requests will be made. Every call to UrlFetchApp.fetch(url, params) consumes one request from that daily quota, and keep in mind that unless you own a Workspace account you won't be able to raise the limit.
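As a rough sketch only (the function name, cache key and the 10-redirect bound are my own assumptions, not part of your code), you could cache the resolved chain so repeated calls for the same URL don't consume URL Fetch quota:
function getRedirectsCached(url) {
  var cache = CacheService.getScriptCache();
  var cached = cache.get(url);
  if (cached !== null) {
    return JSON.parse(cached); // served from cache, no UrlFetchApp call
  }
  var params = { followRedirects: false, muteHttpExceptions: true };
  var followedUrls = [url];
  // Bound the loop instead of while (true): follow at most 10 redirects.
  for (var i = 0; i < 10; i++) {
    var res = UrlFetchApp.fetch(url, params);
    var code = res.getResponseCode();
    if (code < 300 || code > 399) break;
    url = res.getHeaders()['Location'];
    followedUrls.push(url);
  }
  // Cache the result for 6 hours (21600 seconds, the maximum allowed).
  cache.put(followedUrls[0], JSON.stringify(followedUrls), 21600);
  return followedUrls;
}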
Reference
Quotas for Google Services

How do I get a continuation token for a bulk INSERT on Azure Cosmos DB?

I want to upload a CSV file that represents 10k documents to be added to my Cosmos DB collection in a manner that's fast and atomic. I have a stored procedure like the following pseudo-code:
function createDocsFromCSV(csv_text) {
    function parse(txt) { /* ... parsing code here ... */ }
    var collection = getContext().getCollection();
    var response = getContext().getResponse();
    var docs_to_create = parse(csv_text);
    for (var ii = 0; ii < docs_to_create.length; ii++) {
        var accepted = collection.createDocument(collection.getSelfLink(),
            docs_to_create[ii],
            function (err, doc_created) {
                if (err) throw new Error('Error' + err.message);
            });
        if (!accepted) {
            throw new Error('Timed out creating document ' + ii);
        }
    }
}
When I run it, the stored procedure creates about 1200 documents before timing out (and therefore rolling back and not creating any documents).
Previously I had success updating (instead of creating) thousands of documents in a stored procedure using continuation tokens and this answer as guidance: https://stackoverflow.com/a/34761098/277504. But after searching documentation (e.g. https://azure.github.io/azure-documentdb-js-server/Collection.html) I don't see a way to get continuation tokens from creating documents like I do for querying documents.
Is there a way to take advantage of stored procedures for bulk document creation?
It’s important to note that stored procedures have bounded execution: all operations must complete within the server-specified request timeout duration. If an operation does not complete within that time limit, the transaction is automatically rolled back.
To simplify development around this time limit, all CRUD (Create, Read, Update, and Delete) operations return a Boolean value that indicates whether the operation will complete. This Boolean can be used as a signal to wrap up execution and to implement a continuation-based model to resume execution. For more details, please refer to the documentation.
The bulk-insert stored procedure below implements the continuation model by returning the number of documents successfully created.
pseudo-code:
function createDocsFromCSV(csv_text, count) {
    function parse(txt) { /* ... parsing code here ... */ }
    var collection = getContext().getCollection();
    var response = getContext().getResponse();
    var docs_to_create = parse(csv_text);
    for (var ii = count; ii < docs_to_create.length; ii++) {
        var accepted = collection.createDocument(collection.getSelfLink(),
            docs_to_create[ii],
            function (err, doc_created) {
                if (err) throw new Error('Error' + err.message);
            });
        if (!accepted) {
            // Not accepted: the procedure is about to hit its time limit,
            // so report how many documents have been created and stop.
            response.setBody(ii);
            return;
        }
    }
    // Everything was accepted in this run.
    response.setBody(docs_to_create.length);
}
Then you could check the returned document count on the client side and re-run the stored procedure with that count as the count parameter, repeating until the returned count reaches the number of documents parsed from csv_text.
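A rough client-side sketch of that loop, assuming the @azure/cosmos Node.js SDK; the endpoint, key, database/container names and totalDocs (the number of rows you expect from the CSV) are placeholders:
const { CosmosClient } = require("@azure/cosmos");
const client = new CosmosClient({ endpoint: "https://<account>.documents.azure.com", key: "<key>" });
const container = client.database("<database>").container("<collection>");
async function bulkInsertFromCsv(csvText, totalDocs, partitionKeyValue) {
    let created = 0;
    while (created < totalDocs) {
        // Re-run the sproc, resuming from the number of documents already created.
        const { resource: count } = await container.scripts
            .storedProcedure("createDocsFromCSV")
            .execute(partitionKeyValue, [csvText, created]);
        created = count;
    }
    return created;
}
Keep in mind that each execution of the stored procedure is its own transaction, so a rollback on timeout only undoes the documents of that particular run.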
Hope it helps you.

Creating multiple entities in single request in Microsoft Dynamics CRM (OData)

I know how to create a single entity in a single request. However, one requirement wants me to create multiple entities (in my case, multiple entries in ContactSet). I tried posting an array to
POST /XRMServices/2011/OrganizationData.svc/ContactSet
[{
    "MobilePhone": "+0012 555 555 555",
    "YomiFullName": "Demo User 1",
    "GenderCode": {
        "Value": 1
    }
    .....
    <data removed for sanity>
    .....
},
{
    "MobilePhone": "+0012 555 555 111",
    "YomiFullName": "Demo User 2",
    "GenderCode": {
        "Value": 1
    }
    .....
    <data removed for sanity>
    .....
}]
However, this does not work and I could not find any documentation explaining how to achieve this. Any help would be greatly appreciated.
You need to use an ExecuteMultipleRequest. I don't believe this is available in the REST service, but it is available in the SOAP service.
// Get a reference to the organization service.
using (_serviceProxy = new OrganizationServiceProxy(serverConfig.OrganizationUri, serverConfig.HomeRealmUri, serverConfig.Credentials, serverConfig.DeviceCredentials))
{
    // Enable early-bound type support to add/update entity records required for this sample.
    _serviceProxy.EnableProxyTypes();

    #region Execute Multiple with Results
    // Create an ExecuteMultipleRequest object.
    var requestWithResults = new ExecuteMultipleRequest()
    {
        // Assign settings that define execution behavior: continue on error, return responses.
        Settings = new ExecuteMultipleSettings()
        {
            ContinueOnError = false,
            ReturnResponses = true
        },
        // Create an empty organization request collection.
        Requests = new OrganizationRequestCollection()
    };

    // Create several (local, in memory) entities in a collection.
    EntityCollection input = GetCollectionOfEntitiesToCreate();

    // Add a CreateRequest for each entity to the request collection.
    foreach (var entity in input.Entities)
    {
        CreateRequest createRequest = new CreateRequest { Target = entity };
        requestWithResults.Requests.Add(createRequest);
    }

    // Execute all the requests in the request collection using a single web method call.
    ExecuteMultipleResponse responseWithResults =
        (ExecuteMultipleResponse)_serviceProxy.Execute(requestWithResults);

    // Display the results returned in the responses.
    foreach (var responseItem in responseWithResults.Responses)
    {
        // A valid response.
        if (responseItem.Response != null)
            DisplayResponse(requestWithResults.Requests[responseItem.RequestIndex], responseItem.Response);
        // An error has occurred.
        else if (responseItem.Fault != null)
            DisplayFault(requestWithResults.Requests[responseItem.RequestIndex],
                responseItem.RequestIndex, responseItem.Fault);
    }
    #endregion
}
ExecuteMultipleRequest is a good way, but not the only one. If you use CRM 2016 you can use batch operations, which are available in the new Web API. Check the article that describes them: https://msdn.microsoft.com/en-us/library/mt607719.aspx
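For illustration only, here is a rough sketch of such a Web API $batch call built by hand (the organization URL, boundaries, API version and access token are placeholders, not taken from the question):
// Sketch: create two contacts in one change set via the Web API $batch endpoint.
const orgUrl = "https://myorg.crm.dynamics.com";   // placeholder
const batchId = "batch_ABC123";
const changesetId = "changeset_DEF456";
const contacts = [
    { mobilephone: "+0012 555 555 555", yomifullname: "Demo User 1", gendercode: 1 },
    { mobilephone: "+0012 555 555 111", yomifullname: "Demo User 2", gendercode: 1 }
];
// Each entry in the change set is an embedded HTTP POST request.
const parts = contacts.map((contact, i) =>
    "--" + changesetId + "\r\n" +
    "Content-Type: application/http\r\n" +
    "Content-Transfer-Encoding: binary\r\n" +
    "Content-ID: " + (i + 1) + "\r\n\r\n" +
    "POST " + orgUrl + "/api/data/v8.0/contacts HTTP/1.1\r\n" +
    "Content-Type: application/json;type=entry\r\n\r\n" +
    JSON.stringify(contact) + "\r\n"
).join("");
const body =
    "--" + batchId + "\r\n" +
    "Content-Type: multipart/mixed;boundary=" + changesetId + "\r\n\r\n" +
    parts +
    "--" + changesetId + "--\r\n" +
    "--" + batchId + "--";
fetch(orgUrl + "/api/data/v8.0/$batch", {
    method: "POST",
    headers: {
        "Content-Type": "multipart/mixed;boundary=" + batchId,
        "OData-MaxVersion": "4.0",
        "OData-Version": "4.0",
        "Accept": "application/json",
        "Authorization": "Bearer <access token>"   // placeholder
    },
    body: body
}).then(r => r.text()).then(console.log);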
You can also use a Web API action (see MSDN) to execute an ExecuteTransactionRequest, as described here. The subject of the example on MSDN is the WinOpportunityRequest, but it should work with any supported request, including custom actions.

Passing variables between iOS and a Mobile Services custom API with invokeAPI

I can do a quick SQL query within a custom API using the invokeAPI method on iOS, but how would I pass a variable from iOS to the SQL query within the custom API, so that I can choose the row ID, for example?
When you call invokeAPI you can pass a JSON object over as the body. In your server-side script, you can then pull that value out and use it as a parameter for your SQL query. So on the client side you'd do something like this:
NSDictionary *postValues = @{ @"myParam": @"myValue" };
[self.client invokeAPI:@"myAPI" body:postValues HTTPMethod:@"POST" parameters:nil headers:nil
    completion:^(id result, NSHTTPURLResponse *response, NSError *error) { /* handle result or error */ }];
Then on the server side in your custom API script, you'd do something like this:
// Custom API script on the server (e.g. api/myAPI.js)
exports.post = function (request, response) {
    var value = request.body.myParam;
    var sql = "SELECT * FROM Table WHERE Column = ?";
    var mssql = request.service.mssql;
    mssql.queryRaw(sql, [value], {
        success: function (results) {
            // do stuff for success
        },
        error: function (error) {
            // do stuff for error
        }
    });
};
