I have an iOS application that allows users to download content and then use it. During development I used Dropbox for data storage and downloads. Now I'm looking for a deployment solution: a service that can handle many concurrent downloads at decent speed, provides sufficient storage space, is secure, and is not very expensive. Optionally, download tracking/monitoring features would be a plus.
Amazon S3 looks like a viable option. What other choices do I have?
So, we ended up using Rackspace, which we think is more appropriate for our situation. Regarding Dropbox's bandwidth limitation, here's what their representative said:
We automatically ban public links when they are responsible for an
uncommonly large amount of traffic. These include all of the sharing
links, not just the Public folder links.
The limit is 10GB/day for free accounts. Paid accounts have a much
higher limit of 250GB/day. While we want you to be able to share your
files with your friends we can't be a content delivery network for the
entire Internet.
Links are banned temporarily (3 days for the first time) and accounts
will eventually restore their public links.
So, Dropbox was not an option.
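For completeness, the client-side download code is the same regardless of which provider ends up hosting the files. A minimal Swift sketch using URLSession (the URL is a placeholder, not our actual Rackspace/CDN endpoint):

```swift
import Foundation

/// Downloads a single content file over HTTPS and moves it into the caches
/// directory. The URL passed in is whatever the hosting provider / CDN exposes.
func downloadContent(from url: URL, completion: @escaping (Result<URL, Error>) -> Void) {
    let task = URLSession.shared.downloadTask(with: url) { tempURL, _, error in
        if let error = error {
            completion(.failure(error))
            return
        }
        guard let tempURL = tempURL else { return }
        do {
            let caches = try FileManager.default.url(for: .cachesDirectory,
                                                     in: .userDomainMask,
                                                     appropriateFor: nil,
                                                     create: true)
            let destination = caches.appendingPathComponent(url.lastPathComponent)
            try? FileManager.default.removeItem(at: destination) // replace any older copy
            try FileManager.default.moveItem(at: tempURL, to: destination)
            completion(.success(destination))
        } catch {
            completion(.failure(error))
        }
    }
    task.resume()
}

// Usage with a placeholder URL:
// downloadContent(from: URL(string: "https://cdn.example.com/packs/lesson1.zip")!) { result in
//     print(result)
// }
```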
I am looking for a mechanism to accomplish two-way storage mirroring.
I have two storage systems, both used for reads and writes at the same time.
Any file written to one of these storages should be available for reading from the other one as soon as possible (within no more than a few seconds).
If one storage goes down, the other should already hold a full copy and be able to serve any file requested.
New files should be synced to the failed storage once it is back up.
For a better understanding, here is my use case:
I am deploying an ASP.NET application to two sites (Site-A | Site-B), with a load balancer in front.
Each site has its own NAS storage (Storage-A | Storage-B).
When a user uploads a file to the application, it is saved to the storage linked to the site that handled the request; let's assume that was Storage-A.
Later, another user needs to download the file, but his request is handled by Site-B, which means the file will be looked for in Storage-B, so it should already be available there through the two-way mirroring.
Further information:
There is a 5-kilometer distance between the sites; it is all a private network with no internet access.
Network speed is 1 Gbps but can be increased if needed.
The OS is Windows Server 2019.
I've searched a lot, but all the solutions I found involved cloud services or clustering with one-way mirroring.
I'm happy to hear any suggestions, and pardon my delivery, as this is my first question here.
Thanks for reading this.
I've been searching for an easy way to upload a CSV file generated in the user's iPhone app to a secure storage location.
For a research project, we are looking to ask participants to answer a couple of questions that will be compiled into a CSV file.
All those files need to be uploaded, anonymised, to one secure storage location for further processing of the results.
I have looked into Google Drive (which didn't work for iOS) and SoftLayer Object Storage, but there doesn't seem to be an easy out-of-the-box solution.
I have been going back and forth with customer support for two weeks.
In an ideal world:
CSV file created in the iPhone app
User agrees to share the file
CSV file uploaded through the iPhone app to secure storage
Ideal solution:
HIPAA-compliant or otherwise secure solution
Ready-made wrappers for integrating in Xcode
Half a day of work max
Thanks again,
Best,
Joseph
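The first step of that flow (building the CSV in the app and writing it to a temporary file that can then be handed to whichever upload SDK is chosen) might look roughly like this in Swift; the function and column names are purely illustrative:

```swift
import Foundation

/// Builds an anonymised CSV from a participant's answers and writes it to a
/// temporary file. The identifier, column names, and file name are illustrative.
func writeAnswersCSV(participantID: String, answers: [(question: String, answer: String)]) throws -> URL {
    var rows = ["question,answer"]
    for entry in answers {
        // Quote fields so commas inside answers don't break the CSV.
        rows.append("\"\(entry.question)\",\"\(entry.answer)\"")
    }
    let csv = rows.joined(separator: "\n")

    let fileURL = FileManager.default.temporaryDirectory
        .appendingPathComponent("\(participantID).csv")
    try csv.write(to: fileURL, atomically: true, encoding: .utf8)
    return fileURL
}
```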
Maybe this can help you:
http://blog.softlayer.com/2014/softlayer-security-questions-and-answers
"5: How is my data kept private? How can I confirm that SoftLayer can’t read my confidential data?
A: This question is common among customers who deal with sensitive workloads such as HIPAA-protected documentation, employee records, case files, and so on.
SoftLayer customers are encouraged to deploy a gateway device (e.g. Vyatta appliance) on which they can configure encryption protocols. Because the gateway device is the first hop into SoftLayer’s network, it provides an encrypted tunnel to traverse the VLANs that reside on SoftLayer. When securing compute and storage resources, customers can deploy single tenant dedicated storage devices to establish isolated workloads, and they can even encrypt their hard drives from the OS level to protect data at rest. Encrypting the hard drive helps safeguard data even if SoftLayer were to replace a drive or something similar."
It seems that you will need to set up your own server to store your data and deploy a gateway device as well.
On the other hand, regarding object storage, I did not find any information on whether it supports HIPAA, but since SoftLayer object storage is based on OpenStack, I think that if OpenStack supports HIPAA, SoftLayer's object storage should support it as well. Just in case, here is some documentation about how to work with object storage using REST:
https://sldn.softlayer.com/blog/waelriac/managing-softlayer-object-storage-through-rest-apis
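If you go the REST route, the upload step from the app might look roughly like the following Swift sketch. The X-Auth-Token header and the storage-URL/container/object layout follow the OpenStack Swift v1 API that SoftLayer object storage is based on; treat them as assumptions and verify against the documentation linked above.

```swift
import Foundation

/// Uploads a local file to an OpenStack-Swift-style object store using a
/// previously obtained token. `storageURL` and `authToken` are assumed to come
/// from the provider's auth endpoint (X-Storage-Url / X-Auth-Token headers).
func uploadObject(fileURL: URL,
                  storageURL: URL,
                  authToken: String,
                  container: String,
                  objectName: String,
                  completion: @escaping (Error?) -> Void) {
    var request = URLRequest(url: storageURL
        .appendingPathComponent(container)
        .appendingPathComponent(objectName))
    request.httpMethod = "PUT"
    request.setValue(authToken, forHTTPHeaderField: "X-Auth-Token")

    URLSession.shared.uploadTask(with: request, fromFile: fileURL) { _, _, error in
        completion(error)
    }.resume()
}
```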
Regards
We finally settled on the Google Firebase iOS SDK. The Firebase Storage feature is fantastic.
It was very easy to integrate and the support was excellent.
We used a file upload call to Firebase Storage.
Highly recommended!
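A minimal sketch of such an upload with the Firebase iOS SDK (the "responses/" storage path is illustrative, and FirebaseApp.configure() is assumed to have been called at app launch):

```swift
import FirebaseStorage

/// Uploads a local CSV file to Firebase Storage. The "responses/" path is
/// illustrative; security rules on the bucket control who can read it.
func uploadCSV(at localURL: URL, completion: @escaping (Error?) -> Void) {
    let storageRef = Storage.storage().reference()
        .child("responses/\(localURL.lastPathComponent)")

    storageRef.putFile(from: localURL, metadata: nil) { _, error in
        completion(error)
    }
}
```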
I want to provide a download link in an Azure MVC web site for files that are stored in Blob storage. I do not want users to see my blob storage URL, and I want to provide my own download link so that I can also control the file name.
I think this can be done by passing (forwarding) the stream. I found many similar questions here on SO, e.g. here: Download/Stream file from URL - asp.net.
The problem I see is this: imagine 1000 users start downloading one file simultaneously. This will totally kill my server, as there is a limited number of threads in the pool, right?
I should say that the files I want to forward are about 100 MB, so one request can take about 10 minutes.
Am I right, or can I do it without risk? Would an async method in MVC 5 help? Thanks!
Update: My Azure example is here only to give some background. I am actually interested in the theoretical problem of long-running streaming methods in MVC.
In your situation, Lukas, I'd actually recommend you look at using the local, temporary storage area for the blob and serving it up from there. This will result in a delay in delivering the file the first time, but all subsequent requests will be faster (in my experience) and result in fewer Azure Storage transaction calls. It also eliminates the risk of running into throttling on the Azure Storage account or blob. Your throughput limits would then be based on the outbound bandwidth of the VM instance and the number of connections it can support. I have a sample of this type of approach at: http://brentdacodemonkey.wordpress.com/2012/08/02/local-file-cache-in-windows-azure/
The intention is to keep the initial app bundle small but provide some documents that can be downloaded later for free.
Is there a way to do this with the Apple hosted content feature?
I agree this would be a nice usage scenario, but it is not possible at the moment. You will have to host the content yourself. Amazon S3 provides a very stable and cheap solution for this scenario with low-to-zero maintenance cost.
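As a sketch of the client side of that approach, the app can fetch each document on demand from a plain HTTPS URL (an S3 object URL, for example). The bucket URL and file name below are placeholders, and async/await requires iOS 15+:

```swift
import Foundation

/// Fetches one of the extra documents from wherever it is hosted and stores it
/// in the app's Documents directory, skipping the download if it already exists.
func fetchDocument(named name: String, from baseURL: URL) async throws -> URL {
    let documents = try FileManager.default.url(for: .documentDirectory,
                                                in: .userDomainMask,
                                                appropriateFor: nil,
                                                create: true)
    let destination = documents.appendingPathComponent(name)

    // Already downloaded on a previous run.
    if FileManager.default.fileExists(atPath: destination.path) {
        return destination
    }

    let (tempURL, _) = try await URLSession.shared.download(from: baseURL.appendingPathComponent(name))
    try FileManager.default.moveItem(at: tempURL, to: destination)
    return destination
}

// Usage with a placeholder bucket URL:
// let docURL = try await fetchDocument(named: "manual.pdf",
//                                      from: URL(string: "https://my-bucket.s3.amazonaws.com/docs")!)
```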
I need to design a system where we have a central Rails website for maintaining product information, some of which is rich media (photos, movies etc.) and we need a way to efficiently access this central information from a series of information kiosks. The central system will be used to update and control access to the information and the kiosks will primarily display this with no editing required. The only traffic which is likely to move back from kiosk to central site is usage information which is not bandwidth constrained.
My initial thought is to run a separate Rails server on each kiosk and 'somehow' (e.g. a scheduled rake task) synchronise the relevant content from the central server to each kiosk. Note that the kiosks won't all have the same content on them, as it will be location dependent. We might need to employ something like Amazon S3 storage to host the content.
Another option would be to employ some sort of advanced caching (i.e. more advanced than standard browser caching) on each kiosk to minimise network bandwidth requirements and speed things up. I've used Squid before, but only as a general-purpose site cache server; I don't know if it can step up to what I need here.
So, my question is whether anyone out there has attempted anything like this before and what sort of architecture they found to work. I'd be interested in hearing if there are any Rails plugins and/or smart caching servers relevant to my requirements.
Many thanks,
Craig.
I know it's not possible for every application, but you could generate a static cache of the content and use a scheduled task to update each kiosk from that cache. Then you don't have to maintain Rails servers on each one.
Depending on what you're running on the kiosks, if you need a bit more interactivity, you could run a Sinatra or Camping app; those are a fair bit lighter weight than Rails. You can communicate through XML. If you're running a Flash app on the kiosk, look at the RubyAMF library.