I need to save temporary user data (500 users). I am currently using:
DataTable myData = new DataTable();
myData.Columns.Add("id", typeof(int));
myData.Columns.Add("name", typeof(string));
...
myData.Rows.Add("1", "name1", ...);
myData.Rows.Add("2", "name2", ...);
...
myData.Rows.Add("500", "name500" ,...);
New rows are added or edited frequently, e.g. 50 times per second from one user, and every minute the data is sent to a MySQL database.
Is this method sufficiently stable and fast for adding/editing/deleting a large amount of temporary data?
I was also thinking about saving to an XML file, but I think my current approach is faster.
Thank you for any advice.
Edit:
I have a server, and the users (clients) are connected to it; they send data to the server and the server sends data back to them.
For example, when a client sends a message to the other clients.
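Roughly, what I have in mind is the sketch below. It is only a sketch: FlushToMySql is a placeholder for the actual MySQL write, and the lock and Timer are my assumptions about how to guard the shared table, since DataTable is not thread-safe.
using System;
using System.Data;
using System.Threading;

class TempDataBuffer
{
    private readonly DataTable myData = new DataTable();
    private readonly object sync = new object();
    private readonly Timer flushTimer;

    public TempDataBuffer()
    {
        myData.Columns.Add("id", typeof(int));
        myData.Columns.Add("name", typeof(string));
        myData.PrimaryKey = new[] { myData.Columns["id"] };

        // Flush the buffered rows to MySQL once per minute.
        flushTimer = new Timer(_ => Flush(), null,
                               TimeSpan.FromMinutes(1), TimeSpan.FromMinutes(1));
    }

    // Called from the client-handling code, potentially many times per second.
    public void AddOrUpdate(int id, string name)
    {
        lock (sync)
        {
            DataRow row = myData.Rows.Find(id);
            if (row == null)
                myData.Rows.Add(id, name);
            else
                row["name"] = name;
        }
    }

    private void Flush()
    {
        DataTable snapshot;
        lock (sync)
        {
            // Copy so the MySQL write doesn't block incoming updates.
            snapshot = myData.Copy();
        }
        FlushToMySql(snapshot); // placeholder: bulk insert/update into MySQL
    }

    private void FlushToMySql(DataTable batch)
    {
        // e.g. MySqlConnection + data adapter, or batched INSERT ... ON DUPLICATE KEY UPDATE
    }
}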
I have a problem with qbXML.
I'm trying to migrate the QuickBooks customers, items, etc. to Zoho Books.
I want to grab the first 50 customers from QuickBooks and call the Zoho Books APIs to create contacts there, then grab the next 50 customers from QuickBooks and push them to Zoho Books, and so on.
The problem is that I'm not sure how I can continue the query after calling the Zoho Books APIs.
When I tried to use the same iteratorID from the first query response, I got nothing back from QB.
I'm building a desktop app using .NET; please advise me on the best way to track the migration and where I am in it.
Assume I have 150 customers and for some reason the migration stopped after 100 customers; in that case, how can I get the last 50 customers next time?
public string customerQueryXml()
{
    XmlDocument inputXMLDoc = new XmlDocument();
    inputXMLDoc.AppendChild(inputXMLDoc.CreateXmlDeclaration("1.0", null, null));
    inputXMLDoc.AppendChild(inputXMLDoc.CreateProcessingInstruction("qbposxml", "version=\"1.0\""));

    XmlElement qbXML = inputXMLDoc.CreateElement("QBPOSXML");
    inputXMLDoc.AppendChild(qbXML);

    XmlElement qbXMLMsgsRq = inputXMLDoc.CreateElement("QBPOSXMLMsgsRq");
    qbXML.AppendChild(qbXMLMsgsRq);
    qbXMLMsgsRq.SetAttribute("onError", "stopOnError");

    XmlElement customerQueryRq = inputXMLDoc.CreateElement("CustomerQueryRq");
    qbXMLMsgsRq.AppendChild(customerQueryRq);

    //customerQueryRq.SetAttribute("requestID", "1");
    //customerQueryRq.SetAttribute("iterator", "Start");
    customerQueryRq.SetAttribute("requestID", "2");
    customerQueryRq.SetAttribute("iterator", "Continue");
    customerQueryRq.SetAttribute("iteratorID", "{A1601C19-C6DC-43C0-AE43-6F45088C39F2}");

    // limit how many customers are returned per request (for testing)
    XmlElement MaxReturned = inputXMLDoc.CreateElement("MaxReturned");
    customerQueryRq.AppendChild(MaxReturned).InnerText = "50";

    XmlElement ownerID = inputXMLDoc.CreateElement("OwnerID");
    customerQueryRq.AppendChild(ownerID).InnerText = "0";

    XmlElement timeModifiedRangeFilter = inputXMLDoc.CreateElement("TimeModifiedRangeFilter");
    customerQueryRq.AppendChild(timeModifiedRangeFilter);

    XmlElement fromTimeModified = inputXMLDoc.CreateElement("FromTimeModified");
    timeModifiedRangeFilter.AppendChild(fromTimeModified).InnerText = "1980-01-01T00:00:00";

    XmlElement toTimeModified = inputXMLDoc.CreateElement("ToTimeModified");
    timeModifiedRangeFilter.AppendChild(toTimeModified).InnerText = DateTime.Now.ToString("yyyy-MM-ddTHH:mm:ss");

    return inputXMLDoc.OuterXml;
}
EDIT:
I noticed that I have to use the iteratorID in the same request. By the way, I have no problem with the qbXML itself.
My question is: how can I continue to query the customers, items, or whatever in another request?
ProcessRequest (first time)
migrate the XML data to the other system
then, for whatever reason, I stop the request
At this point, can I continue the query with another ProcessRequest?
Iterators have to be used within a single session. For example, this will work:
Connect to QuickBooks (establish a session)
Do a request to create an iterator and get the first page of records
Do another request to continue the iterator
Do another request to continue the iterator
While this will not work, and is not something supported by QuickBooks:
Connect to QuickBooks (establish a session)
Do a request to create an iterator and get the first page of records
Disconnect
Reconnect to QuickBooks (establish a new session)
Do a request to continue the iterator created in the previous session
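In code, the supported flow looks roughly like the sketch below. processRequest stands in for however you actually send qbXML over the open connection (e.g. the SDK's ProcessRequest call), BuildCustomerQueryXml is essentially the customerQueryXml() method above with the iterator attributes driven by a parameter, and PushPageToZohoBooks is a placeholder for the Zoho Books API calls:
// Sketch only: page through all customers with one iterator, inside one open session.
public void MigrateAllCustomers(Func<string, string> processRequest)
{
    string iteratorId = null;
    int remaining;

    do
    {
        // First pass: iterator="Start". Later passes: iterator="Continue" plus the iteratorID.
        string requestXml = BuildCustomerQueryXml(iteratorId);
        string responseXml = processRequest(requestXml);

        var doc = new XmlDocument();
        doc.LoadXml(responseXml);
        var rs = (XmlElement)doc.GetElementsByTagName("CustomerQueryRs")[0];

        // The response attributes tell you how to ask for the next page.
        iteratorId = rs.GetAttribute("iteratorID");
        int.TryParse(rs.GetAttribute("iteratorRemainingCount"), out remaining);

        PushPageToZohoBooks(rs); // placeholder: create the contacts in Zoho Books
    }
    while (remaining > 0);
}
Because the iterator can't be picked up again in a later session, if the migration stops partway (e.g. after 100 of 150 customers) you would keep your own progress marker, such as the ListID or TimeModified of the last customer you pushed, and start a fresh query from there next time.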
I want to persist some parts of the data from the Relay store and load them again in later sessions for a better user experience. (For context, I am using Relay with React Native.)
The data can be relatively large (up to a few thousand records) and doesn't need to be 100% in sync with the server.
I want to persist the records across sessions because I don't want to refetch the data every time the user opens the app; that would be both a burden on the server and a bad user experience (loading time).
You have access to Relay's full store in the environment file you create when setting up Relay. If you console-log recordSource you should see your entire store, and it should update every time Relay processes a new operation (query/mutation/subscription), so maybe all you have to do is store that in your persisted storage (e.g. localStorage).
Example:
// your-app-name/src/RelayEnvironment.js
import {Environment, Network, RecordSource, Store} from 'relay-runtime';
import fetchGraphQL from './fetchGraphQL';
async function fetchRelay(params, variables) {
  console.log(`fetching query ${params.name} with ${JSON.stringify(variables)}`);
  return fetchGraphQL(params.text, variables);
}

const recordSource = new RecordSource();
console.log(recordSource);

// Store `recordSource` in persisted storage (e.g. localStorage) here.
if (typeof window !== "undefined") { // optional if you're not doing SSR
  window.localStorage.setItem("relayStore", JSON.stringify(recordSource));
}

export default new Environment({
  network: Network.create(fetchRelay),
  store: new Store(recordSource),
});
Given a list of Keys, I want to pull out multiple values from Azure Redis Cache.
How do we perform multiple operations at the same time with Azure Redis Cache?
Our data is an int/ComplexObject pair, and it lives in SQL Server. We currently get the list by converting our List<int> of keys into an XElement object and passing that into a stored procedure, but our key set is quite small (3000 keys), so the same data is being accessed again and again by multiple users.
It would be great if we could just cache the 3000 key/value pairs once and then access them with something like: cache.GetValues(List<int> keys)
There is nothing special about Azure Redis Cache here. You would want to use the transaction operations supported by Redis, as shown here: http://redis.io/topics/transactions
If you are using the StackExchange.Redis client, you can refer to this page: https://github.com/StackExchange/StackExchange.Redis/blob/master/Docs/Transactions.md
Look at the MGET (http://redis.io/commands/mget) and MSET (http://redis.io/commands/mset) functionality that Redis has. These are supported by the StackExchange.Redis client.
private static void MGet(CancellationToken cancellationToken)
{
    var pairs = new KeyValuePair<RedisKey, RedisValue>[] {
        new KeyValuePair<RedisKey, RedisValue>("key1", "value1"),
        new KeyValuePair<RedisKey, RedisValue>("key2", "value2"),
        new KeyValuePair<RedisKey, RedisValue>("key3", "value3"),
        new KeyValuePair<RedisKey, RedisValue>("key4", "value4"),
        new KeyValuePair<RedisKey, RedisValue>("key5", "value5"),
    };
    var keys = pairs.Select(p => p.Key).ToArray();

    // MSET: write all key/value pairs in one round trip.
    Connection.GetDatabase().StringSet(pairs);

    // MGET: read all keys back in one round trip.
    var values = Connection.GetDatabase().StringGet(keys);
}
You will want to keep in mind that getting or setting too many keys on a single command can lead to performance problems.
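If the key list grows, one option is to split the MGET into smaller chunks, roughly along the lines of the GetValues helper the question asks for. This is only a sketch; the batch size of 500 is an arbitrary example value.
// Sketch: fetch many keys in several smaller MGET round trips instead of one huge command.
private static RedisValue[] GetValues(IDatabase db, List<int> ids, int batchSize = 500)
{
    var keys = ids.Select(id => (RedisKey)id.ToString()).ToList();
    var results = new List<RedisValue>(keys.Count);

    for (int i = 0; i < keys.Count; i += batchSize)
    {
        var chunk = keys.Skip(i).Take(batchSize).ToArray();
        results.AddRange(db.StringGet(chunk)); // one MGET per chunk
    }

    return results.ToArray();
}
Called with Connection.GetDatabase() and the 3000 ids, this returns the values in the same order as the keys.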
I've signed up for the free month trial of Azure, and I have created a Mobile Service. I'm using iOS, so I downloaded the sample Todo app for iOS.
I am now trying to use Table Storage in the back end instead of a MSSQL store; I have found instructions on using Table Storage here: http://azure.microsoft.com/en-us/documentation/articles/storage-nodejs-how-to-use-table-storage/
However, my app is still storing todo items in the MSSQL storage. I've been told that I don't need to do anything in the client to make the switch, so I assume everything I need to do must be done in the node.js scripts. But I'm clearly missing something.
One thing that confuses me is that after I downloaded the generated node.js script for the Todo app, I didn't see anything in it that seemed to be explicitly talking to the MSSQL database.
Any pointers would be greatly appreciated.
EDIT:
Here's my todoitem.insert.js:
var azure = require('azure-storage');
var tableSvc = azure.createTableService();

function insert(item, user, request) {
    // request.execute();
    console.log('Request received');
    console.log(request);

    var entGen = azure.TableUtilities.entityGenerator;
    var task = {
        PartitionKey: entGen.String('learningazure'),
        RowKey: entGen.String('1'),
        description: entGen.String('add something to TS'),
        dueDate: entGen.DateTime(new Date(Date.UTC(2014, 11, 5))),
    };

    tableSvc.insertEntity('codedelphi', task, {echoContent: true}, function (error, result, response) {
        if (!error) {
            // Entity inserted
            console.log('No error on table insert: task created.');
            request.respond(statusCodes.SUCCESS, 'OK.');
        } else {
            console.log('Houston, we have a problem. Entity not added to table.');
            console.log(error);
        }
    });

    console.log(JSON.stringify(item, null, 4));
}

tableSvc.createTableIfNotExists('codedelphi', function (error, result, response) {
    if (!error) {
        // Table exists or created
        console.log('No error, table should exist');
    } else {
        console.log('We have a problem.');
        console.log(error);
    }
});
Mobile Services has the built-in capability to handle talking to your SQL database for you. When your script calls "request.execute()", that triggers whatever the request is (insert, update, delete, select) to be run against the SQL database. Talking to Table Storage instead of SQL requires you to edit those scripts to explicitly talk to Table Storage (i.e. perform your inserts, updates, deletes, and reads yourself). Today there is no magic switch that will change your "request.execute()" from talking to SQL to talking to Table Storage.
If you've already edited your scripts to talk to Table Storage and it's not working / you still see data stored in your SQL database, I would suspect that you are either still calling "request.execute()" in your scripts, or you haven't pushed them to your Mobile Service (if you've pulled them down locally, you then need to push them back to your service). If you've done all of the above, update your question with the Node.js script in question so we can see it.
As Chris pointed out, you are most likely still calling request.execute() from your table scripts. By design, this will explicitly talk to the MSSQL database you configured your application with. You will have to edit your table scripts to not perform "request.execute()" and instead interact with the TableService object.
If you follow the tutorial, and do the following:
1. Import the package.
2. Create the table service object.
3. Create an entity (and modify the variables to store the data you need)
4. Write the entity to your table service.
You should see data being written to table storage rather than SQL database.
Give it a shot and ping back, we'll help you out.
I'm creating a page that outputs a list of 1000-3000 records. The current flow is:
User loads a page
jQuery hits the server for all the records and injects them into the page.
The problem here is that for some users those records can take 3+ seconds to return, which is a horrible UX.
What I would like to do is the following:
1. User loads a page
2. jQuery hits the server and gets at most 100 records. Then keeps hitting the server in a loop until the records loaded equal the max records.
The idea here is that the user gets to see records quickly and doesn't think something broke.
So it's not really infinite scroll, since I don't care about the scroll position, but it seems like a similar flow.
How can I hit the server in a loop with jQuery? And how can I query in Rails, taking into account an offset and limit?
Thank you
You can simply query the server for a batch of data over and over again.
There are numerous APIs you can implement, for example:
client: GET request /url/
server: {
    data: [ ... ],
    rest: resturl
}
client: GET request resturl
repeat.
Or you can get the client to pass in parameters saying you want resource 1-100, then 101-200 and do this in a loop.
All the while you will render the data as it comes in.
Your server either needs to let you pass in parameters saying you want records i to i + n, or your server needs to get all the data, store it somewhere, and then return a chunk of the data along with some kind of unique id or URL to request the next chunk, repeating until you're done.
// pseudo jQuery code
function next(data) {
    render(data.records);
    $.when(getData(data.uniqueId)).then(next);
}

function getData(id) {
    return $.ajax({
        type: "GET",
        url: ...,
        data: {
            // when id is undefined, get the server to load all the data
            // when id is defined, get the server to send the subset of data stored at id
            id: id
        },
        ...
    });
}

$.when(getData()).then(next);