Microsoft has added differential sync to OneDrive, so the sync client only uploads the changes to a file rather than uploading the entire file again. This is very useful, since it saves a lot of time and bandwidth. I would like to use the Microsoft Graph APIs to sync the data in my app via OneDrive, so users can access it on multiple platforms. However, with the APIs Microsoft provides, when a file is modified locally the entire file needs to be uploaded again to sync it to the cloud. This is quite time consuming (especially on slow network connections) and wastes unnecessary bandwidth. Is there a way to use differential sync with the Microsoft Graph APIs to solve this issue?
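Graph's upload sessions are chunked and resumable rather than truly differential, but for reference, here is a minimal Swift sketch of that endpoint (the access token and file path are placeholders):

```swift
import Foundation

// Minimal sketch: create a Graph upload session, then upload one chunk.
// "ACCESS_TOKEN" and the drive path are placeholders, not real values.
let token = "ACCESS_TOKEN"
let sessionURL = URL(string:
    "https://graph.microsoft.com/v1.0/me/drive/root:/backups/data.bin:/createUploadSession")!

var request = URLRequest(url: sessionURL)
request.httpMethod = "POST"
request.setValue("Bearer \(token)", forHTTPHeaderField: "Authorization")

URLSession.shared.dataTask(with: request) { data, _, _ in
    guard let data = data,
          let json = try? JSONSerialization.jsonObject(with: data) as? [String: Any],
          let uploadUrl = json["uploadUrl"] as? String else { return }

    // Upload a single chunk; Graph expects chunk sizes in multiples of 320 KiB.
    let chunk = Data(repeating: 0, count: 320 * 1024)  // placeholder payload
    var put = URLRequest(url: URL(string: uploadUrl)!)
    put.httpMethod = "PUT"
    put.setValue("\(chunk.count)", forHTTPHeaderField: "Content-Length")
    put.setValue("bytes 0-\(chunk.count - 1)/\(chunk.count)",
                 forHTTPHeaderField: "Content-Range")
    URLSession.shared.uploadTask(with: put, from: chunk).resume()
}.resume()
```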
Thanks for reading this.
I've been searching for an easy way to upload a CSV file generated in the user's iPhone app to a secure storage drive.
For a research project, we are looking to ask participants to answer a couple of questions that will be compiled into a CSV file.
All of those files need to be uploaded, anonymised, to one secure storage location for further processing of the results.
I have looked into Google Drive (didn't work for iOS) and SoftLayer Object Storage, but there doesn't seem to be an easy out-of-the-box solution.
I have been jumping through customer support hoops for two weeks.
In an ideal world:
CSV file created in the iPhone app
User agrees to share the file
CSV file uploaded through the iPhone app into secure storage (see the sketch below)
Ideal solution:
HIPAA-compliant or otherwise secure solution
ready-made wrappers for implementing in Xcode
Half a day of work max
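Purely as an illustration of that flow, a minimal Swift sketch (the endpoint URL is hypothetical; the consent UI and anonymisation step are omitted):

```swift
import Foundation

// Sketch of the flow above: build a CSV, write it to a temp file, upload it.
// The endpoint URL is hypothetical; consent UI and anonymisation are omitted.
let rows = [["participant_id", "q1", "q2"],
            ["anon-1234", "yes", "no"]]
let csv = rows.map { $0.joined(separator: ",") }.joined(separator: "\n")

let fileURL = FileManager.default.temporaryDirectory
    .appendingPathComponent("answers.csv")
try? csv.write(to: fileURL, atomically: true, encoding: .utf8)

var request = URLRequest(url: URL(string: "https://example.org/study/upload")!)
request.httpMethod = "POST"
request.setValue("text/csv", forHTTPHeaderField: "Content-Type")

// uploadTask streams from disk instead of loading the file into memory.
URLSession.shared.uploadTask(with: request, fromFile: fileURL) { _, response, error in
    if let error = error { print("Upload failed: \(error)") }
    else { print("Status: \((response as? HTTPURLResponse)?.statusCode ?? -1)") }
}.resume()
```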
Thanks again,
Best,
Joseph
Maybe this can help you:
http://blog.softlayer.com/2014/softlayer-security-questions-and-answers
"5: How is my data kept private? How can I confirm that SoftLayer can’t read my confidential data?
A: This question is common among customers who deal with sensitive workloads such as HIPAA-protected documentation, employee records, case files, and so on.
SoftLayer customers are encouraged to deploy a gateway device (e.g. Vyatta appliance) on which they can configure encryption protocols. Because the gateway device is the first hop into SoftLayer’s network, it provides an encrypted tunnel to traverse the VLANs that reside on SoftLayer. When securing compute and storage resources, customers can deploy single tenant dedicated storage devices to establish isolated workloads, and they can even encrypt their hard drives from the OS level to protect data at rest. Encrypting the hard drive helps safeguard data even if SoftLayer were to replace a drive or something similar."
It seems that you will need to set up your own server to store your data, and deploy a gateway device as well.
On the other hand, regarding object storage, I did not find any information on whether it supports HIPAA; but as SoftLayer's object storage is based on OpenStack, I think that if OpenStack supports HIPAA, SoftLayer's object storage should support it as well. Just in case, here is some documentation about how to work with object storage using REST:
https://sldn.softlayer.com/blog/waelriac/managing-softlayer-object-storage-through-rest-apis
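If it helps, uploading an object over that REST API is essentially a single authenticated PUT; a rough Swift sketch (the endpoint, container name, and auth token are placeholders):

```swift
import Foundation

// Rough sketch of an OpenStack Swift-style object upload.
// The account endpoint, container name, and auth token are placeholders.
let objectURL = URL(string:
    "https://dal05.objectstorage.softlayer.net/v1/AUTH_account/container/answers.csv")!
var request = URLRequest(url: objectURL)
request.httpMethod = "PUT"
request.setValue("AUTH_TOKEN", forHTTPHeaderField: "X-Auth-Token")

let payload = Data("id,q1\nanon-1,yes\n".utf8)
URLSession.shared.uploadTask(with: request, from: payload) { _, response, error in
    if let error = error {
        print("Upload failed: \(error)")
    } else if let status = (response as? HTTPURLResponse)?.statusCode {
        print("Status: \(status)")  // 201 Created means the object was stored
    }
}.resume()
```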
Regards
We finally settled on the Google Firebase iOS SDK. The Firebase Storage functionality is fantastic.
It was very easy to integrate and the support was excellent.
We used a file upload function to Firebase Storage.
Highly recommended!
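For reference, the upload itself is only a few lines with the Firebase iOS SDK; a minimal Swift sketch (the storage path is just an example, and FirebaseApp.configure() is assumed to have run at launch):

```swift
import FirebaseStorage

// Minimal sketch: upload a local CSV file to Firebase Storage.
// "surveys/answers.csv" is an example path, not a required layout.
// Assumes FirebaseApp.configure() has already run at app launch.
func uploadCSV(localFile: URL) {
    let ref = Storage.storage().reference().child("surveys/answers.csv")
    ref.putFile(from: localFile, metadata: nil) { metadata, error in
        if let error = error {
            print("Upload failed: \(error)")
        } else {
            print("Uploaded \(metadata?.size ?? 0) bytes")
        }
    }
}
```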
I am working on an app that collects user data including photos. It's mandated that this app should work in offline mode - meaning that the user can complete surveys and take photos without an internet connection and that data should sync back to a remote database. How is this generally handled? Do I create a local database with Core Data and write an additional layer to manage saving/reading from a server? Are there any frameworks that help facilitate that syncing?
I have also been looking into backend services such as Firebase that include iOS SDKs that appear to handle a lot of the heavy lifting of offline support, but it does not appear to support offline syncing of image files through the Firebase Storage SDK.
Can anyone recommend the least painful way to handle this?
Couchbase Mobile / Couchbase Lite is probably the best solution I've come across so far.
It allows offline data storage including binary data, and online syncing with a CouchDB compatible server. It works best with their Couchbase Server / Sync Gateway combination, but if you don't need to use filtered replication or 'channels' (e.g. for syncing data specific to a single user with a shared database), you can use Cloudant which saves you having to set up your own server.
It's also available across most platforms.
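As a rough illustration, local writes plus background sync look like this with the Couchbase Lite 2.x Swift API (the Sync Gateway URL and database name are placeholders):

```swift
import CouchbaseLiteSwift

// Sketch against the Couchbase Lite 2.x Swift API; the Sync Gateway URL
// and database name are placeholders.
func storeAndSync(photoData: Data) throws {
    let database = try Database(name: "surveys")

    // Writes work fully offline; binary data is stored as a blob.
    let doc = MutableDocument()
    doc.setString("survey-response", forKey: "type")
    doc.setBlob(Blob(contentType: "image/jpeg", data: photoData), forKey: "photo")
    try database.saveDocument(doc)

    // A continuous replicator pushes/pulls whenever connectivity returns.
    let target = URLEndpoint(url: URL(string: "ws://sync.example.org:4984/surveys")!)
    var config = ReplicatorConfiguration(database: database, target: target)
    config.replicatorType = .pushAndPull
    config.continuous = true
    Replicator(config: config).start()
}
```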
Generally, for images it is best to use NSFileManager and save your images in either the documents directory or the caches directory, depending on the types of images you are storing. Core Data and Firebase are databases better suited to structured data than to images, although they do support arbitrary data storage.
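A minimal Swift sketch of that approach (the file name is arbitrary):

```swift
import Foundation

// Save image bytes under Documents (persisted, backed up) or Caches
// (purgeable) depending on how durable the images need to be.
func saveImage(_ data: Data, named name: String) throws -> URL {
    let documents = FileManager.default.urls(for: .documentDirectory,
                                             in: .userDomainMask)[0]
    let fileURL = documents.appendingPathComponent(name)
    try data.write(to: fileURL, options: .atomic)
    return fileURL
}
```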
You can also try SDWebImage, which has a lot of features around loading and storing images.
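With SDWebImage, a cached load is essentially a one-liner; for example (the image URL is a placeholder):

```swift
import SDWebImage
import UIKit

// SDWebImage handles the download, memory/disk caching, and cancellation.
func configure(_ imageView: UIImageView) {
    imageView.sd_setImage(with: URL(string: "https://example.org/photo.jpg"),
                          placeholderImage: UIImage(named: "placeholder"))
}
```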
We are considering using Azure offline data sync for our app, which usually has very sporadic connectivity (in most cases users sync their data once a day). The thing is that the mobile app needs to hold a lot of data (tens of thousands of products). Currently we have our own sync solution, which works fine with SQLite.
My question is, do you have any experience or thoughts about performance of Azure offline data sync? Will it be able to handle really large datasets?
Thank you
Azure Mobile Services is the cloud version of the popular Microsoft Sync Framework. It is a lightweight JSON API which tracks changes between the local and remote data stores. It transfers only changed rows, so data traffic is kept to a minimum. But the very first sync, when you have huge data, might be a problem.
You could overcome this problem by carefully designing your database structure. The Azure SDK provides an API to sync table by table, which gives you enough flexibility to choose what to sync and what not to.
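This isn't the actual Azure SDK surface, but conceptually each table pull only asks for rows changed since the last sync; a hypothetical Swift sketch (the endpoint and query format are assumptions):

```swift
import Foundation

// Conceptual sketch of incremental pull: fetch only rows changed since the
// last sync, one table at a time. Endpoint and query format are hypothetical.
func pullChanges(table: String, since lastSync: Date,
                 completion: @escaping (Data?) -> Void) {
    let iso = ISO8601DateFormatter().string(from: lastSync)
    var components = URLComponents(string: "https://myapp.example.net/tables/\(table)")!
    components.queryItems = [URLQueryItem(name: "$filter",
                                          value: "updatedAt gt '\(iso)'")]
    URLSession.shared.dataTask(with: components.url!) { data, _, _ in
        completion(data)  // upsert the returned rows into the local SQLite store
    }.resume()
}
```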
I want to provide a download link in an Azure MVC web site for files that are stored in blob storage. I do not want users to see my blob storage URL, and I want to provide my own download link that also sets the name of the file.
I think this can be done by passing (forwarding) the stream. I found many similar questions here on SO, e.g. here: Download/Stream file from URL - asp.net.
The problem I see is this: imagine 1000 users start downloading one file simultaneously. This will totally kill my server, as there is a limited number of threads in the pool, right?
I should say that the files I want to forward are about 100 MB, so one request can take about 10 minutes.
Am I right, or can I do this with no risk? Would an async method in MVC 5 help? Thanks!
Update: My Azure example is here only to give some background. I am actually interested in the theoretical problem of long streaming methods in MVC.
In your situation, Lukas, I'd actually recommend you look at using the local, temporary storage area for the blob and serve it up from there. This will result in a delay in delivering the file the first time, but all subsequent requests will be faster (in my experience) and result in fewer Azure storage transaction calls. It also eliminates the risk of running into throttling on the Azure storage account or blob. Your throughput limits would then be based on the outbound bandwidth of the VM instance and the number of connections it can support. I have a sample of this type of approach at: http://brentdacodemonkey.wordpress.com/2012/08/02/local-file-cache-in-windows-azure/
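The pattern itself is stack-agnostic; a Swift sketch of check-cache-then-fetch-then-serve, with the blob fetch and response writing stubbed out as placeholders:

```swift
import Foundation

// Sketch of the local-cache pattern: serve from disk if present,
// otherwise fetch from blob storage once, cache, then serve.
// fetchBlob(_:) and serve(_:) are placeholders for your stack's I/O.
func serveCached(blobName: String,
                 fetchBlob: (String) throws -> Data,
                 serve: (URL) -> Void) throws {
    let cacheDir = URL(fileURLWithPath: "/var/cache/blobs", isDirectory: true)
    let localURL = cacheDir.appendingPathComponent(blobName)

    if !FileManager.default.fileExists(atPath: localURL.path) {
        // The first request pays the download cost; later ones hit local disk.
        let data = try fetchBlob(blobName)
        try FileManager.default.createDirectory(at: cacheDir,
                                                withIntermediateDirectories: true)
        try data.write(to: localURL, options: .atomic)
    }
    serve(localURL)  // stream from local disk; no storage transaction needed
}
```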
I'm thinking about writing a RESTful service which is able to upload and stream large video files (GBs); in the future it might not only be videos but could also be large documents.
From my research so far, what really makes sense to me could be:
WCF Data Services, implementing IDataServiceStreamProvider; on the back end I want to store the large files in SQL Server 2008 using the new FILESTREAM SQL type. It also looks like I'd have to use some Win32 API to access the file system: SafeFileHandle handle = SqlNativeClient.OpenSqlFilestream
Since WCF Data Services likes to play with Entity Framework or LINQ to SQL, which of them can be the streaming implementation, and is there support for the SQL Server FILESTREAM type?
This is the plan, but I don't know how to assemble it all together... I thought about chunking the large files so that uploads can be resumed and cancelled (see the sketch below).
For the upload: I am not sure whether to use the Silverlight upload control or some other nifty AJAX tool.
Can anyone point me in the right direction here... or do you think this is the way to go? Thoughts, links? Would be great...
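For the chunking and resume part, one common client-side approach (not a specific WCF contract; the endpoint and Content-Range convention here are assumptions) looks like this Swift sketch:

```swift
import Foundation

// Sketch of resumable chunked upload: send the file in fixed-size slices
// identified by Content-Range; persist `offset` (e.g. in UserDefaults) so a
// cancelled or failed upload can resume. Endpoint and headers are assumptions.
func uploadChunks(file: URL, to endpoint: URL, resumeFrom offset: UInt64 = 0) throws {
    let chunkSize = 4 * 1024 * 1024  // 4 MB slices
    let handle = try FileHandle(forReadingFrom: file)
    let attrs = try FileManager.default.attributesOfItem(atPath: file.path)
    let total = (attrs[.size] as? NSNumber)?.uint64Value ?? 0
    var position = offset
    handle.seek(toFileOffset: position)

    while position < total {
        let chunk = handle.readData(ofLength: chunkSize)
        var request = URLRequest(url: endpoint)
        request.httpMethod = "PUT"
        request.setValue("bytes \(position)-\(position + UInt64(chunk.count) - 1)/\(total)",
                         forHTTPHeaderField: "Content-Range")
        // Synchronous for clarity; production code would use async tasks and
        // only advance `position` after the server acknowledges the slice.
        let semaphore = DispatchSemaphore(value: 0)
        URLSession.shared.uploadTask(with: request, from: chunk) { _, _, _ in
            semaphore.signal()
        }.resume()
        semaphore.wait()
        position += UInt64(chunk.count)
    }
}
```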
I did something where I was sending huge data files. I used these two examples to help write my code:
http://msdn.microsoft.com/en-us/library/ms751463.aspx
http://www.codeproject.com/KB/WCF/WCFDownloadUploadService.aspx
This is a very important number to know: 2147483647 (Int32.MaxValue), the value typically used for WCF quotas such as maxReceivedMessageSize when transferring large files.
silverfighter:
On IIS 6, I could not configure WCF Data Services to send more than a 30 MB stream over the network. I believe it is not built for large streaming transactions. Just try to upload a 27 MB file and monitor the relevant w3wp process; you will be surprised by the amount of memory consumed.
The solution was to create a WCF Service Application hosted under its own w3wp process, responsible only for download/upload over WCF. I recommend you use the following project: http://www.codeproject.com/Articles/166763/WCF-Streaming-Upload-Download-Files-Over-HTTP
Hope the above helps.
Not related to the question, but related to the answer of @Houssam Hamdan:
The 30 MB limit is not because of WCF Data Services; it's an IIS limitation (the default maxAllowedContentLength for request filtering in IIS 7+ is 30,000,000 bytes) that can be changed through the config file and IIS settings, along with catching some exceptions thrown by IIS.