entity framework, wait for results - asp.net-mvc

I'm writing some code where, most commonly, no results will be returned from a query against Entity Framework. The request is submitted by some jQuery code, and if I reply with "no results", it will just turn around and make the same request again. So I'd like to hold off responding until either some results are available or a reasonable amount of time (e.g. 30 seconds) has passed. (I don't want to cache results for 30 seconds; 30 seconds is simply a reasonable amount of time to delay the response to the query. If results become available, I want them sent "immediately".)
How do I best go about this? I tried sleeping between re-queries, but it a) doesn't seem to work (every request that starts with no results waits the full 30 seconds), and b) ties up an ASP.NET thread.
So how do I convert my code to not tie up ASP.NET threads, and to respond as soon as results are available?
[HttpGet]
public ActionResult LoadEventsSince(Guid lastEvent, int maxEvents)
{
    maxEvents = Math.Min(50, maxEvents); // No more than 50
    using (var dbctxt = new DbContext())
    {
        var evt = dbctxt.Events.Find(lastEvent);
        var afterEvents = (from et in evt.Session.Events
                           where et.OccurredAt > evt.OccurredAt
                           orderby et.OccurredAt
                           select new { EventId = et.EventId, EventType = et.EventType, Control = et.Control, Value = et.Value }).Take(maxEvents);
        var cycles = 30;
        while (afterEvents.Count() == 0 && cycles-- > 0)
        {
            System.Threading.Thread.Sleep(1000);
        }
        return Json(afterEvents.ToArray(), JsonRequestBehavior.AllowGet);
    }
}

Check out this MIX 11 session: "Pragmatic JavaScript, jQuery & AJAX with ASP.NET".
At the very end of it (about 40-45 minutes into the session) there is a demo that is right for you.
I'm pretty sure you'll say wow.
Damian Edwards promised to post more about the technique on his blog, but we have yet to see it.

See > Reverse ajax Comet/Polling implementation for ASP.NET MVC?.
You need to go with long polling. The client sends a request to the server, and the server just keeps it in a queue. It accumulates the pending requests, and as soon as some data becomes available it sends a response to each of the queued requests.
EDIT: Also this is interesting > Comet implementation for ASP.NET?
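To make the "hold the request until data arrives or a timeout expires" idea concrete, here is a minimal sketch in plain JavaScript (not ASP.NET; the function name, parameters, and defaults are all made up for illustration). Instead of blocking a thread in a sleep loop, it schedules the next check and resolves as soon as the query yields something or the deadline passes:

```javascript
// Minimal long-polling wait loop (illustrative sketch): re-run `check`
// every `intervalMs` until it returns a non-empty array or `timeoutMs`
// elapses, then resolve with whatever we have (possibly an empty array).
function waitForResults(check, { intervalMs = 1000, timeoutMs = 30000 } = {}) {
  const deadline = Date.now() + timeoutMs;
  return new Promise((resolve) => {
    const attempt = () => {
      const results = check();           // re-run the query on every cycle
      if (results.length > 0 || Date.now() >= deadline) {
        resolve(results);                // data arrived, or we timed out
      } else {
        setTimeout(attempt, intervalMs); // yield instead of blocking
      }
    };
    attempt();
  });
}
```

The key point, whatever the platform, is that the check must actually hit the data source again on each cycle; polling a result set that was loaded once will wait out the full timeout even when new data exists.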

Related

Wait Until OData Request Completed

I have a requirement to call an OData service inside the onExit hook method. I have written the code below, but unfortunately the OData service is not getting called because the view is destroyed immediately. Is there any way to wait, or to delay the destruction of the view, until the OData read request is complete?
window.addEventListener("beforeunload", (event) => {
    this.fnUnlockGremienVersion();
    var s;
    event = event || window.event;
    if (this._PreviousGreVersion) {
        s = "Your most recent changes are still being saved. " +
            "If you close the window now, they may not be saved.";
        event.returnValue = s;
        return s;
    }
});
UI5 has no support for such a workflow, and it's not just UI5: web pages in general can't block or trap a user within a page, for good reasons. Just remember the annoying un-closable web pages from the 2000s. Browsers dropped support for such things, except for a very simple pop-up API, which UI5 does support.
I assume you are running in the shell.
Use the dirty flag properly (set it as long as there are unsaved changes or a request is still running) and the user will get a popup.
https://sapui5.hana.ondemand.com/sdk/#/api/sap.ushell.services.Container%23methods/setDirtyFlag
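To illustrate the suggested approach, here is a small sketch of a dirty-flag tracker in plain JavaScript. The helper name and its methods are invented for illustration; the actual shell call is sap.ushell.Container.setDirtyFlag (see the link above), which needs the UI5 runtime and is therefore only passed in as a callback here:

```javascript
// Keep the shell's dirty flag in sync: the app is "dirty" while any
// request is still in flight or there are unsaved changes, and "clean"
// once everything has settled. `setDirtyFlag` is the injected shell call.
function createDirtyTracker(setDirtyFlag) {
  let pendingRequests = 0;
  let unsavedChanges = false;
  const update = () => setDirtyFlag(pendingRequests > 0 || unsavedChanges);
  return {
    requestStarted()  { pendingRequests++; update(); },
    requestFinished() { pendingRequests = Math.max(0, pendingRequests - 1); update(); },
    setUnsaved(flag)  { unsavedChanges = flag; update(); },
  };
}

// In a real UI5 app (assumption, requires the shell):
// const tracker = createDirtyTracker((f) => sap.ushell.Container.setDirtyFlag(f));
```

With the flag set while the OData read is pending, the shell, not your code, prompts the user before navigating away.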

Best way to continue to query Quickbooks data

I have a problem with the qbxml.
I'm trying to migrate the QB customers, items, etc. to Zoho Books.
I want to grab 50 customers from QuickBooks first and call the Zoho Books APIs to create contacts there, then grab another 50 customers from QuickBooks and send them to Zoho Books, and so on.
The problem is I'm not sure how I can continue to query after calling the Zoho Books APIs.
When I tried to use the same iteratorID from the first query response, I got nothing from QB.
I'm building a desktop app using .NET; please advise me on the best way to track the migration and where I am in it.
Assume I have 150 customers and for some reason the migration stopped after 100 customers; in this case, how can I get the last 50 customers next time?
public string customerQueryXml()
{
    XmlDocument inputXMLDoc = new XmlDocument();
    inputXMLDoc.AppendChild(inputXMLDoc.CreateXmlDeclaration("1.0", null, null));
    inputXMLDoc.AppendChild(inputXMLDoc.CreateProcessingInstruction("qbposxml", "version=\"1.0\""));
    XmlElement qbXML = inputXMLDoc.CreateElement("QBPOSXML");
    inputXMLDoc.AppendChild(qbXML);
    XmlElement qbXMLMsgsRq = inputXMLDoc.CreateElement("QBPOSXMLMsgsRq");
    qbXML.AppendChild(qbXMLMsgsRq);
    qbXMLMsgsRq.SetAttribute("onError", "stopOnError");
    XmlElement customerQueryRq = inputXMLDoc.CreateElement("CustomerQueryRq");
    qbXMLMsgsRq.AppendChild(customerQueryRq);
    //customerQueryRq.SetAttribute("requestID", "1");
    //customerQueryRq.SetAttribute("iterator", "Start");
    customerQueryRq.SetAttribute("requestID", "2");
    customerQueryRq.SetAttribute("iterator", "Continue");
    customerQueryRq.SetAttribute("iteratorID", "{A1601C19-C6DC-43C0-AE43-6F45088C39F2}");
    // for test only, read 50 customers at a time
    XmlElement MaxReturned = inputXMLDoc.CreateElement("MaxReturned");
    customerQueryRq.AppendChild(MaxReturned).InnerText = "50";
    XmlElement ownerID = inputXMLDoc.CreateElement("OwnerID");
    customerQueryRq.AppendChild(ownerID).InnerText = "0";
    XmlElement timeModifiedRangeFilter = inputXMLDoc.CreateElement("TimeModifiedRangeFilter");
    customerQueryRq.AppendChild(timeModifiedRangeFilter);
    XmlElement fromTimeModified = inputXMLDoc.CreateElement("FromTimeModified");
    timeModifiedRangeFilter.AppendChild(fromTimeModified).InnerText = "1980-01-01T00:00:00";
    XmlElement toTimeModified = inputXMLDoc.CreateElement("ToTimeModified");
    timeModifiedRangeFilter.AppendChild(toTimeModified).InnerText = DateTime.Now.ToString("yyyy-MM-ddTHH:mm:ss");
    return inputXMLDoc.OuterXml;
}
EDIT:
I noticed that I have to use the iteratorID within the same request session. By the way, I have no problem with the qbXML itself.
My question is how I can continue to query the customers, items, or whatever in another request:
ProcessRequest (first time)
migrate the XML data to another system
and after that, for whatever reason, I stop the request
at this point, can I continue the query in another ProcessRequest?
Iterators have to be used within a single session. E.g. this will work:
Connect to QuickBooks (establish a session)
Do a request to create an iterator and get the first page of records
Do another request to continue the iterator
Do another request to continue the iterator
While this will not work, and is not something supported by QuickBooks:
Connect to QuickBooks (establish a session)
Do a request to create an iterator and get the first page of records
Disconnect
Do a request to create an iterator and get the first page of records
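As a sketch of that request flow (plain JavaScript, with an invented function name and a placeholder iteratorID): the first request of the session starts the iterator, and every later request in the same session continues it using the iteratorID returned by the previous response.

```javascript
// Build the CustomerQueryRq attributes for one page of an iterated query.
// First call (no iteratorId yet): start the iterator.
// Later calls in the SAME session: continue it with the returned iteratorID.
function buildIteratorAttrs(requestId, iteratorId) {
  if (iteratorId === undefined) {
    return { requestID: String(requestId), iterator: "Start" };
  }
  return { requestID: String(requestId), iterator: "Continue", iteratorID: iteratorId };
}
```

Within one session the loop is: send Start, read the iteratorID from the response, and keep sending Continue until no records remain. After a disconnect the iteratorID is invalid, so you have to start a new iterator (or narrow the result set yourself, e.g. with a time-modified filter, to avoid re-migrating records).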

Receiving a Google Speech streaming result when SingleUtterance equals true

I am using the Google Speech API to transcribe streaming audio, and I would like to receive the resulting transcript. I set MaxAlternatives = 0 so I won't receive interim results, only the final transcription result, and I also set SingleUtterance = true. My understanding is that I will get a first response with SpeechEventType = EndOfSingleUtterance, and then the second and final response will contain the results, where I can access the transcription.
Below is the code I have tested (slightly adapted from the docs: https://cloud.google.com/speech-to-text/docs/streaming-recognize). However, the two foreach loops are not very elegant, and I imagine they would only be necessary if InterimResults = true. So how can I receive the single transcription result from a streaming request as described above? I notice there is a delay of about three seconds between when I finish talking and when the transcript is received, so I think there is something wrong with how I am reading the results.
while (await streamingCall.ResponseStream.MoveNext(default(CancellationToken)))
{
    foreach (var result in streamingCall.ResponseStream.Current.Results)
    {
        foreach (var alternative in result.Alternatives)
        {
            // Do stuff with alternative.Transcript
        }
    }
}
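For what it's worth, the read pattern described above (consume the stream, skip everything until a final result arrives, then take its transcript) can be sketched generically. This is plain JavaScript, not the Google client library; the property names (results, isFinal, alternatives, transcript) merely mirror the shape of the streaming responses:

```javascript
// Consume an async stream of streaming-recognize responses and return
// the first final transcript, ignoring event-only and interim responses.
async function firstFinalTranscript(responses) {
  for await (const response of responses) {
    for (const result of response.results || []) {
      if (result.isFinal && result.alternatives && result.alternatives.length > 0) {
        return result.alternatives[0].transcript;
      }
    }
  }
  return null; // stream ended without a final result
}
```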

What is the maximum HTTP GET request length for a YouTube API?

I want to use the YouTube video:list API to get the details of multiple videos in a single request. As per the API documentation, I can send a comma-separated videoId list as the id parameter. But what is the maximum length possible?
I know the GET request limit depends on both the server and the client. In my case I am making the request from the server side and not from a browser, so the maximum length could be configured on my end. But what is the maximum length acceptable to YouTube?
UPDATE: Though I couldn't find official documentation, the current limit is 50 ids, based on the tests Tempus explained. I am adding code below with 51 different video ids (one is commented out) for those who want to check this in the future.
var key = prompt("Please enter your key here");
if (!key) {
    alert("No key entered");
} else {
    var videoIds = ["RgKAFK5djSk",
        "fRh_vgS2dFE",
        "OPf0YbXqDm0",
        "KYniUCGPGLs",
        "e-ORhEE9VVg",
        "nfWlot6h_JM",
        "NUsoVlDFqZg",
        "YqeW9_5kURI",
        "YQHsXMglC9A",
        "CevxZvSJLk8",
        "09R8_2nJtjg",
        "HP-MbfHFUqs",
        "7PCkvCPvDXk",
        "0KSOMA3QBU0",
        "hT_nvWreIhg",
        "kffacxfA7G4",
        "DK_0jXPuIr0",
        "2vjPBrBU-TM",
        "lp-EO5I60KA",
        "5GL9JoH4Sws",
        "kOkQ4T5WO9E",
        "AJtDXIazrMo",
        "RBumgq5yVrA",
        "pRpeEdMmmQ0",
        "YBHQbu5rbdQ",
        "PT2_F-1esPk",
        "uelHwf8o7_U",
        "KQ6zr6kCPj8",
        "IcrbM1l_BoI",
        "vjW8wmF5VWc",
        "PIh2xe4jnpk",
        "QFs3PIZb3js",
        "TapXs54Ah3E",
        "uxpDa-c-4Mc",
        "oyEuk8j8imI",
        "ebXbLfLACGM",
        "kHSFpGBFGHY",
        "CGyEd0aKWZE",
        "rYEDA3JcQqw",
        "fLexgOxsZu0",
        "450p7goxZqg",
        "ASO_zypdnsQ",
        "t4H_Zoh7G5A",
        "QK8mJJJvaes",
        "QcIy9NiNbmo",
        "yzTuBuRdAyA",
        "L0MK7qz13bU",
        "uO59tfQ2TbA",
        "kkx-7fsiWgg",
        "EgqUJOudrcM",
        // "60ItHLz5WEA" // 51st videoId. Uncomment it to see the error
    ];
    var url = "https://www.googleapis.com/youtube/v3/videos?part=statistics&key=" + key + "&id=" + videoIds.join(",");
    var xmlHttp = new XMLHttpRequest();
    xmlHttp.onreadystatechange = function() {
        (xmlHttp.readyState == 4) && alert("HTTP Status code: " + xmlHttp.status);
    };
    xmlHttp.open("GET", url, true);
    xmlHttp.send(null);
}
The answer is 50. The reason is that that's all you will get back.
As some calls can have quite a few results depending on the search criteria and available results, they have capped maxResults at 50.
The exception to this is CommentThreads, which go up to 100.
This is (as you can work out) to speed up page loads and call times.
EDIT:
This can be tested HERE in the "Try API" part.
You will need to put 50 video IDs into the "id" field, separated by commas.
Then add one more ID to get 51 and test again. You should receive a "400" response.
P.S. They do not need to be unique IDs, so take a few and then copy and paste them as many times as needed ;-)
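Given the 50-id cap, a caller with more ids simply has to batch them. A minimal sketch (plain JavaScript; the helper name is made up):

```javascript
// Split a list of video ids into batches of at most `size` (50 for
// the videos endpoint), one API call per batch.
function chunk(ids, size = 50) {
  const batches = [];
  for (let i = 0; i < ids.length; i += size) {
    batches.push(ids.slice(i, i + size));
  }
  return batches;
}
```

Each batch then becomes one videos list call, joined with join(",") into the id parameter exactly as in the question's snippet.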

Facebook connect graph status objects have comments capped at 25

Does anyone know why, no matter how many comments a given Graph status update object has, the comments are capped at 25? I have a feeling it only returns a 'sample' of the actual comments on the object. How do I force it to get them all without using the FQL APIs?
This is just the way the Graph API works. Take a look at the API docs. You get 25 at a time and have to loop through them. You can use the timestamp (created_time) of the last comment in the batch as a parameter in the next Graph API call, or you can use the offset parameter, which is what I've been doing; I was running into some screwiness using created_time.
This is an example from my C# test app. Ignore the references to the PostComment object; that's just a data structure I created to hold the data I'm pulling. The magic (and the process I'm referencing) is in the parameters being passed to the Graph API call:
parameters.Add("offset", numPostComments);
parameters.Add("limit", 25);
I'm fairly certain you can set the "limit" to anything 25 or below.
do
{
    foreach (var comment in comments.data)
    {
        numPostComments++;
        PostComment pc = new PostComment();
        pc.Post_ID = p.Id;
        pc.Facebook_ID = comment.id;
        pc.From = comment.from.name;
        if (comment.likes != null)
            pc.Likes = (int)comment.likes;
        pc.CommentDate = DateTime.Parse(comment.created_time);
        pc.CommentText = comment.message;
        p.Comments.Add(pc);
    }
    // Create a new parameters object for the call to the API
    Dictionary<string, object> parameters = new Dictionary<string, object>();
    parameters.Add("offset", numPostComments);
    parameters.Add("limit", 25);
    // Call the API to get the next block of 25
    comments = client.Get(string.Format("{0}/comments", p.Facebook_ID), parameters);
} while (comments.data.Count > 0);
