Considerations while using Azure offline Sync - asp.net-mvc

Here are a few details in my current setup to keep in mind for this discussion to be meaningful.
Client App: Xamarin Forms app, uses Azure offline Sync
Web Access: There is an ASP.NET-based web front end that also accesses the Mobile Service to read/write data in the underlying tables
Backend: Managed .NET-based App Service hosted on Azure
My Question: As of now I pass a null value for the incremental sync parameter (the query ID in PullAsync) to skip incremental sync and pull all data from the server. This, however, is inefficient: load time grows as the volume of data increases.
If I use a constant value for the incremental sync parameter (query ID) for the table under consideration, this is what it does:
Each time a pull operation returns a set of results, the latest updatedAt timestamp from that result set is stored in the SDK local system tables. Subsequent pull operations will only retrieve records after that timestamp.
as read here
The problem is that my web front end also updates the data, and it also goes through the Azure App Service backend to do so. If I use the incremental sync parameter when pulling data, and the data is then changed from the web app, will the next Push/Pull from the mobile app still get the latest changes made from the web?
My understanding is that incremental sync only tracks data the sync context is aware of; because the web app is also writing data that the context knows nothing about, some of that data is missed and the mobile app never downloads it.
Is there a way to fix this behavior, so that I can start using incremental sync and still be sure the app always gets all the data from the server without missing anything?
Update:
@Adrian, it's not working for me. Maybe the way my tables are structured, or the way I am dealing with them, is not correct.
I have a User table and a Department table; I also have a UserDepartment table.
When I sync the app, the data in this UserDepartment table gets duplicated on the app.
The reason could be that when I edit a user on the web to assign departments to them, I delete all previous entries for that user from the UserDepartment table and recreate new ones based on the current selection. Do you think this could be causing issues with the Azure client SDK and thus duplicating some data for me?

If your web app and mobile app update through the same Mobile App API (i.e. using the .NET SDK and the JS SDK), then updatedAt is set appropriately for pushes from both sides, and the other side should pick up the change. I've done this myself several times and it works.
On the web side (using the JS SDK), you just use an online table. On the mobile side (with the .NET SDK), you use incremental sync.

Related

CloudKit - no server-side logic?

With CloudKit, you can focus on your client-side app development and let iCloud eliminate the need to write server-side application logic. CloudKit provides you with Authentication, private and public database, structured and asset storage services — all for free with very high limits.
You cannot upload any code to run on Apple's servers?
I've heard it being compared to Google App Engine and other cloud computing platforms, but without the ability to run your own code, isn't the whole thing pretty limited and not really comparable?
For example, if I want to build a news app which periodically pushes stories on topics that the user is interested in, then this can't be done using CloudKit alone, because I would need scheduled jobs and data processing on the server.
Any thoughts?
Server-side
As you said, CloudKit doesn't allow server-side code.
But there are possibilities.
Crons
You don't want to connect to the iCloud Dashboard every day to perform the push by adding a record. One solution is to write an app on a Mac server (I guess the Mac mini as a server will become more popular with CloudKit) that adds a new Daily CKRecord every day.
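For illustration, here is a minimal Swift sketch of what that daily Mac-side job could look like, assuming a record type named Daily with made-up fields and ignoring the CloudKit entitlement/signing setup a real Mac app needs; a launchd or cron job would run it once a day:

```swift
import CloudKit
import Foundation

// Minimal sketch: a tool run once a day that adds one "Daily" record
// to the public CloudKit database.
let database = CKContainer.default().publicCloudDatabase

let record = CKRecord(recordType: "Daily")            // assumed record type
record["date"] = Date() as NSDate                     // assumed field
record["headline"] = "Story of the day" as NSString   // assumed field

let semaphore = DispatchSemaphore(value: 0)
database.save(record) { _, error in
    if let error = error {
        print("Failed to save Daily record: \(error)")
    } else {
        print("Daily record saved.")
    }
    semaphore.signal()
}
semaphore.wait()   // keep the tool alive until the save completes
```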
Subscriptions
The idea behind subscriptions is that the client registers for specific updates. You can create a record type called Daily, for instance, and have users subscribe to it. You should check the Apple documentation and the WWDC14 videos (even if subscriptions are not covered in detail, they are a good starting point).
The good thing is that push notifications are tied to the subscription concept. So basically you say: send me a notification for each new CKRecord of type Daily that is added.
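A minimal Swift sketch of such a subscription (the record type and subscription ID are placeholders, not something defined by CloudKit itself):

```swift
import CloudKit

// Ask CloudKit to push a notification whenever a new "Daily" record is created.
let subscription = CKQuerySubscription(
    recordType: "Daily",                   // assumed record type
    predicate: NSPredicate(value: true),   // match every new record
    subscriptionID: "new-daily-records",   // placeholder ID
    options: .firesOnRecordCreation
)

let info = CKSubscription.NotificationInfo()
info.alertBody = "A new Daily record was added."
subscription.notificationInfo = info

CKContainer.default().publicCloudDatabase.save(subscription) { _, error in
    if let error = error {
        print("Could not save subscription: \(error)")
    }
}
```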
BaaS party
What is the point of using CloudKit (vs. Parse and others)?
Price: CloudKit has really nice pricing
Ready to go: two clicks inside Xcode and you are ready to go
User consistency: you get free user login across all of a user's devices through their iCloud account, with a very good privacy system, and you can discover relationships between users with a smart system.
But:
You are stuck on the Apple platform. We don't even know if we could export the data.
Only data-centered for now (no server-side code)
The CloudKit dashboard is too limited
The future
CloudKit is still pretty new. At WWDC, some of the people behind it made me understand that they are still working heavily on it. My bet is that they are working on two important points:
Server side code execution through remote scheduled tasks
CloudKit for Analytics (Visualization side)
Edit: the Apple folks are fully aware of, and concerned about, the lack of web access to the data. It means that one day it may be accessible from other platforms. I read in a comment that Apple probably would have bought Parse if CloudKit weren't better; AFAIK they did try to buy Parse (an acqui-hire, it's said, but we don't really know).
Update WWDC15
CloudKit is now available in JS, and new dashboards are available. Wait and see.
Update February 2016
CloudKit Now Supports Server-to-Server Web Service Requests
Web Services Reference
In some cases we do not need server-side logic, and just storing static data can cover all the usage scenarios.
In those cases it is very helpful to have free, accessible storage where you can put your data. CloudKit provides exactly that, rather than a full service platform.
Yes, it is limited, but it can still be useful for some people. For example, your case can actually be supported by CloudKit. Though CloudKit is essentially static storage, it supports subscriptions, which monitor a set of conditions and push an event notification to the client. It's fortunate that the only background-job feature CloudKit supports is exactly what you need.
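As a rough sketch of the "static storage" side in Swift (the Story record type and title field are assumptions chosen to match the news-app example):

```swift
import CloudKit

// Read all "Story" records from the public database and print one field.
let query = CKQuery(recordType: "Story",               // assumed record type
                    predicate: NSPredicate(value: true))

CKContainer.default().publicCloudDatabase.perform(query, inZoneWith: nil) { records, error in
    if let error = error {
        print("Query failed: \(error)")
        return
    }
    for record in records ?? [] {
        print(record["title"] as? String ?? "(untitled)")   // assumed field
    }
}
```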
Anyway, if you need more than that, you might need to consider full-fledged servers. Simple web services with only basic server-side code execution are usually limited as well.
You cannot upload any code to run on Apple's servers?
You can and you can't. You can't upload code or SOAP-based web services to the server; instead you can store observers, called subscriptions, on the server.
whole thing pretty limited and not really comparable?
I would say that in CloudKit, and in MBaaS generally, the client communicates with the server through a narrower, more robust interface: you cannot upload an exotic web service that parses XML, manipulates the database, and triggers push notifications based on the result, but the RESTful architecture lets you perform the four basic operations on the data store, and with subscriptions the client can be notified about INSERT / UPDATE / DELETE operations performed on tables.
I think MBaaS is just the next step in the evolution of client-server architecture. At first it seems limiting, but you can do everything you could in the SOAP-based web services world. Development is extremely fast, scalable, and comfortable, things like permissions are easier to control, and server setup, maintenance, and security need almost no effort.
Believe it or not, you can actually get REALLY far with this approach.
I've not used CloudKit, but I can describe for you my application stack:
AngularJS (or your favorite client side HTML rendering framework): A single page will host a series of templates/controllers selected by the router and driven by users changing the anchor to select which page they're on.
Firebase.io (or your favorite cloud storage): Any dynamic data goes into the cloud document store. The controller needs to load the data and render the template on the client, and when the data changes, send the data back. This also provides authentication and authorization, since you can limit access to the data.
Now you need a place to serve the HTML/CSS/JS/images... which requires no 'server side code execution', just a web server where you can put the assets.
Using this technique you could store all the user's topics in the database for that user, and when the page loads, go and aggregate all the sources for those topics (also stored in the database) completely client side. There's nothing in your example application which actually requires server side execution that I can see, so long as you have cloud storage which will provide you with authentication and authorization services, and a 'dumb' web server for serving up static assets.
CloudKit isn't a full-fledged web hosting service. Instead, it's an SDK for iCloud. You shouldn't be putting a web site up there, just storing user data that you may want to use in multiple applications or platforms.
iCloud APIs enable your apps to store app data in iCloud, keeping your apps up to date automatically. Use iCloud to give your users a consistent and seamless experience across iCloud-enabled devices.

iOS app database and cloud storage

If I want to set up a database for an iOS app, and kind of make the app work in a sort of MVC style, is this possible through iOS alone, or do I need to create a web application to handle database interaction etc.?
I would really like to be able to set up cloud storage/databasing directly from an iOS app. Is this possible? Does Apple have a platform for this, or do I need my own server?
In iOS, working with web-based services is very easy, but you do need a "connection file" on the server.
It works like this:
The iPhone has the info and sends it to a PHP/ASP/ASP.NET server-side page; that page takes the submitted info (via GET or POST), inserts it into the database, saves files on the server, and so on.
For receiving content from a web-based service you also need a server-side file to handle the traffic to the database, but in that direction the output you get back is JSON.
The app receives an array (the server-side file sends the array back), and you then use that array to handle the content you just got.
It may sound very hard, but it is not as hard as it seems. As for your question: there is no way to connect directly from the app to a cloud database.
Now, about owning a server: there are not many companies that provide a cloud database alone, but even if you get a cloud database you can always host a domain somewhere cheap, and the server-side pages are only a few KB anyway.
Those files are what connect to the database (the middlemen for the app).
That is a short explanation of connecting to web-based services in iOS development.
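A small Swift sketch of the receiving side described above, assuming a hypothetical server-side page at https://example.com/getItems.php that returns a JSON array:

```swift
import Foundation

// Ask the server-side page for content; it answers with a JSON array,
// which we turn into dictionaries the app can use.
let url = URL(string: "https://example.com/getItems.php")!   // placeholder URL

URLSession.shared.dataTask(with: url) { data, _, error in
    guard let data = data, error == nil else {
        print("Request failed: \(String(describing: error))")
        return
    }
    if let json = try? JSONSerialization.jsonObject(with: data),
       let items = json as? [[String: Any]] {
        for item in items {
            print(item["name"] ?? "")   // assumed field name
        }
    }
}.resume()
```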

The reason not to access MS SQL directly from Xcode?

I am planning to build an iOS app using a database (MS SQL).
However, people recommend not accessing the DB directly from Xcode.
They recommend using PHP or ASP to access the DB through a web page.
I want to know the reason.
Also, I am going to use the DB only for viewing (SELECT only; no INSERT, UPDATE, or DELETE).
So is it possible to access the DB directly for viewing purposes only?
Thank you
It's generally bad for an application (mobile, web, any client) to have direct access to any database, for security reasons. Clients really should not be accessing the database, which often holds very private/secure data; direct access opens up vulnerabilities (e.g. SQL injection attacks).
A set of web services written in PHP, Java, or some other back-end technology is a more secure, scalable system. These web services connect to the database and retrieve the data, and your iOS application calls the web services and receives the data, for example as XML or JSON.
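For example, the iOS side might call such a web service and decode its JSON with Codable; the endpoint and fields below are hypothetical:

```swift
import Foundation

// Model matching the JSON the web service is assumed to return.
struct Item: Codable {
    let id: Int
    let title: String
}

let url = URL(string: "https://example.com/api/items")!   // placeholder endpoint

URLSession.shared.dataTask(with: url) { data, _, error in
    guard let data = data, error == nil else { return }
    if let items = try? JSONDecoder().decode([Item].self, from: data) {
        print("Loaded \(items.count) items")
    }
}.resume()
```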

What Data System do companies like Yell.com use in their apps?

I currently develop an iOS app for a local business directory, and I use SQLite. This sadly means I must do several hours of data entry when new businesses are added and push the updated DB out, because the desktop site uses the Joomla CMS.
Obviously companies that provide directory services don't have to worry about such things. How do they do it? Core Data accompanied by a screen scraper?
PS. I apologise if this question is inappropriate to be asked on StackOverflow, I didn't know where else to ask.
Generally these companies have a client/server architecture where the data lives on a centralised server and the mobile apps pull the data through an exposed API over the internet.
To replicate this yourself, you would have a server holding all the data and expose it through an API/web service (so you'd need to think about authentication and security). Your mobile app would either pull from it when it needs to update the local database, or send each query to the web service and get the appropriate results back so the database never lives on the iOS device at all. The downside of the first approach (updating the local DB) is that the user has to wait for the DB to fully update before they can use the application; the downside of the second is that the client needs an active internet connection to make queries.
The first thing you'd want to look at is if/how you can expose the data stored in the Joomla CMS through an API (XML/JSON?)

How to migrate multiple users' Access db's to one single SQLServer db

UPDATED 2010-11-25
A legacy stand-alone application (A1) is being re-created as a web application (A2).
A1 is written in Delphi 7 and uses a MS Access database to store the data. A1 has been distributed to ~1000 active users that we have no control over during the build of A2.
The database has ~50 tables, some which contain user data, some which contain template data (which does not need to be copied); 3-4 of these user tables are larger (<5000 records), the rest is small (<100).
Once A2 is 'live', users of A1 should be able to migrate to A2. I'm looking for a comparison of scenario's to do so.
One option is to develop a stand-alone 'update' tool for these users, and have this update tool talk to the A2 database through webservices.
Another option is to allow users to upload their Access db (~15 MB) database to our server, run some kind of SSIS package (overnight, perhaps) to get this into A2 for that user, and delete the Access db afterward.
Am I missing options? Which option is 'best' (I understand this may be somewhat subjective, but hopefully the pros and cons of the scenarios can at least be made clear)?
I'll gladly make this a community wiki if so demanded.
UPDATE 2010-11-23: it has been suggested that a variant of scenario 1 would be to have the update tool/application talk directly to the production database. Is this feasible?
UPDATE 2011-11: By now, this has been taken into production. Users upload the .zip file the .mdb is in, which is unpacked and placed in a secure location. A nightly SSIS scheduled job comes along and moves the data to staging tables, which are then moved into production through SP's.
I would lean toward uploading the complete database and running the conversion on the server.
In either case you need to write a conversion program. The real question is how much of the conversion you deploy and run on the customers' computers. I would keep that part as simple as possible, i.e. just the upload. That way if you find any bugs or unexpected data during the conversion you can simply update the server and not need to re-deploy your conversion program.
The total amount of data you are talking about is not too large to upload, and it sounds like the majority of it would need to be uploaded in any case.
If you install a conversion program locally it would need a way to recover from a conversion that stopped part way through. That can be a lot more complicated than simply restarting an upload of the access database.
Also you don't indicate there would be any need for the web services after the conversions are done. The effort to put those services together, and keep them running and secure during the conversions would be far more than a simple upload application or web form.
Another factor is how quickly your customers would convert. If some of them will run the current application for some time period you may need to update your conversion application as the server database changes over time. If you upload the database and run the conversion on the server then only the server conversion program would need to be updated. There would not be any risk of a customer downloading the conversion program but not running it until after the server databases were updated.
We have a similar case where we chose to run the conversion on the server. We built a web page for the user to upload their files. In that case there is nothing to deploy for the new application. The only downside we found is getting the user to select the correct file. If you use a web form for the upload you can't pre-select the file name for the user because of security restrictions. In our case we knew where the file was located but the customers did not. We provide directions on the upload page to help them out. You could avoid this by writing a small desktop application to perform the upload for the users.
The only downside I see to writing a server based conversion is some of your template data will be uploaded that is un-needed. That is a small amount of data anyway.
Server Pros:
- No need to re-deploy the conversion due to bugs, unexpected data, or changes to the server database
- Easier to secure (possibly), there is only one access point - the upload. Of course you are accepting customer data in the form of an access database so you still can't trust anything in it.
Server Cons:
- Upload un-needed template data
Desktop Pros:
- ? I'm having trouble coming up with any
Desktop Cons:
- May need multiple versions deployed
As for talking to a server database directly: I have one application that talks to a hosted database directly to avoid creating web services. It works OK, but if given the chance I would not take that route again. The internet connection drops on a regular basis and the SQL providers do not recover very well. We have trained our clients to just try again when that happens. We did this to avoid creating web services for our desktop application; we just reference the IP address in the server connection string. There is an entire list of security reasons not to take this route - we were comfortable with our security setup and the possible risks. In the end, the trade-off of using the desktop application with no modifications was not worth having an unstable product.
Since the new database server is likely to be one of the standard database engines in the industry, why not consider linking the Access application to this database server? That way you can simply send your data up to SQL Server.
I'm not really sure why you'd even suggest using a set of web services to reach a database engine when Access supports an ODBC link to that engine. One potential upgrade path would be to simply issue a new Access application that has to be placed in the same directory as each user's existing data file (and application). On startup, this application can simply re-link all of its tables to the existing database, and it would ship with a pre-linked set of tables pointing at the database server. This is far less work than building some kind of web services approach. Part of this depends on where the database server is going to be hosted, but in most cases, during the migration period, you would run the database server somewhere everyone can reach it, and a good many web providers allow external links to their databases now.
It's also not clear whether you're going to create separate databases on the server for each user or, as your title suggests, place everything into one database. Since it is all going into one database, then during the upsizing an additional column that identifies the user (or location, or however you plan to distinguish each source database) will need to be added to distinguish each user's set of data.
How easy this type of migration will be depends on the schema and database layout that the developers are using for the new system. Hopefully, and obviously, it has provisions for each user or location, or however you plan to distinguish each individual user of the system. So I don't suggest web services, but I do suggest linking tables from the Access application to the instance of SQL Server (or whatever server you run).
How best to do this will depend on the referential integrity and business rules that must be enforced, if there are any. For example, is there the possibility of duplicates when the databases are merged? I gather they are being merged from your somewhat cryptic statement: "And yes, one database for all, aspnet membership for user id's".
If you have no control of the 1000+ users of A1, how are you going to get them all to convert to A2?
Have you considered giving them an SQL Server Express DB to upgrade to, and letting them host the Web App on their own servers?
