Test download speed from image storage in an app and find the maximum download speed - ios

I am using Amazon S3 as the image storage platform for my app, but I am finding that the download speed is quite slow. I download each image with Alamofire using its S3 link. I want to test the download speed from Amazon S3 (if it is too slow, I need to consider another service), on my phone over both Wi-Fi and cellular, and also find out the maximum download speed the device can achieve.
My images are approximately 200 KB to 400 KB each and are displayed in table view cells. I am looking for an accurate benchmark to compare against what Amazon would expect me to get. Is there a smart way, or a framework, to do this?
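One simple approach is to time the download of one known object yourself and compute the effective throughput. A minimal sketch using plain URLSession, which keeps library overhead out of the measurement (the URL is a placeholder for one of your own S3 links):

import Foundation

// Rough throughput check: time one download and compute the speed.
let imageURL = URL(string: "https://your-bucket.s3.amazonaws.com/sample.jpg")!
let start = Date()

URLSession.shared.dataTask(with: imageURL) { data, _, error in
    guard let data = data, error == nil else {
        print("Download failed: \(error?.localizedDescription ?? "unknown error")")
        return
    }
    let seconds = Date().timeIntervalSince(start)
    let mbits = Double(data.count) * 8 / 1_000_000
    print("\(data.count) bytes in " + String(format: "%.2f s = %.2f Mbit/s", seconds, mbits / seconds))
}.resume()

Run it a few times on both Wi-Fi and cellular, and compare the numbers against a general speed-test app: if the speed test is fast but the S3 download is slow, the bottleneck is the service (or the bucket's region) rather than the connection.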

The AWS iOS SDK can be used to facilitate your use of S3.
There is a downloadToURL method that will help download files.
https://github.com/aws/aws-sdk-ios/blob/master/AWSS3/AWSS3TransferUtility.m#L358
You can experiment with accelerated mode (S3 Transfer Acceleration) to see if this solves your problem.
https://github.com/aws/aws-sdk-ios/blob/master/AWSS3/AWSS3TransferUtility.m#L392
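For example, a minimal download with the transfer utility might look like this, assuming a default service configuration has already been registered and using placeholder bucket/key values:

import AWSS3

// Download one object to a local file; bucket and key are placeholders.
let transferUtility = AWSS3TransferUtility.default()
let localURL = FileManager.default.temporaryDirectory.appendingPathComponent("image.jpg")

transferUtility.download(to: localURL, bucket: "your-bucket", key: "images/sample.jpg", expression: nil) { task, location, data, error in
    if let error = error {
        print("Download failed: \(error)")
    } else {
        print("Downloaded to \(location?.path ?? localURL.path)")
    }
}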

Related

How to compress images/video with Google Cloud Storage create_upload_url

I have an iOS app in which users upload images directly to Google Cloud Storage. I achieve this by using the function blobstore.create_upload_url in my Python Google App Engine backend, and sending the URL to my app.
However, displaying these images back to the user is taking too long as the images are too large, so I realize I need to compress these images in Google Cloud Storage.
Do you know how this can be achieved programmatically? I know I could execute terminal commands to compress and re-upload the images to Google Cloud Storage, but I need to do this programmatically within my app.
I'm not entirely sure what type of compression you are looking for, but App Engine provides an Images service that allows you to resize images.
From their docs, you can use the Images service with the Blobstore as a source -
The Images service can use a Blobstore value as the source of a transformation. The source image can be as large as the maximum size for a Blobstore value. The Images service still returns the transformed image to the application, so the transformed image must be smaller than 32 megabytes. This is useful for making thumbnail images of large photographs uploaded by users.
Something along these lines -
from google.appengine.api import images

# Load the stored blob as the transform source, resize it, then
# re-encode the result as JPEG.
image = images.Image(blob_key=YOUR_BLOB_KEY)
image.resize(width=YOUR_WIDTH, height=YOUR_HEIGHT)
compressed_image = image.execute_transforms(output_encoding=images.JPEG)
Here's an overview of the API and the API reference -
https://cloud.google.com/appengine/docs/standard/python/images/
https://cloud.google.com/appengine/docs/standard/python/refdocs/google.appengine.api.images

Firebase Storage: How to reduce requests? (iOS)

I'm developing a chat app with Firebase and am currently still in the development phase.
Profile pictures of test users are uploaded to Firebase Storage and downloaded on the home screen (with all the pictures). I realized that with this I very quickly used up storage download requests (I easily hit 3,000 requests in one night and reached the free plan quota!).
What are some best practices I could use to minimize download requests? Just to be sure I'm doing it right: I'm sending a GET request directly to the Firebase Storage URL, https://firebasestorage.googleapis.com/..., to download the image. Is that the right way to do it?
Two suggestions that might help:
Cache your images! If you keep requesting the same images over and over again over the network, that's going to use up your quota pretty fast, not to mention your users' battery and network traffic. After you retrieve an image from the network, save it locally, and the next time you need an image, look for it locally before you make another network request (a minimal cache sketch follows these suggestions). Or consider using a library like PINRemoteImage that does most of the work for you, on both the retrieving and the caching side.
Consider uploading smaller versions of your images if you think you might be using them often. If your chat app, for instance, saves profile pictures as 1024x768 images but then spends most of its time showing them as 66x50 thumbnails, you're probably downloading a lot of data you don't need. Consider saving both the original image and a thumbnail, and then grabbing the larger one only if you need it.
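A minimal sketch of the caching idea, using an in-memory NSCache keyed by URL (the helper name is illustrative; an on-disk cache would additionally survive app relaunches):

import UIKit

// In-memory cache checked before every network request, so repeated
// displays of the same profile picture don't count against the quota.
let imageCache = NSCache<NSURL, UIImage>()

func loadImage(from url: URL, completion: @escaping (UIImage?) -> Void) {
    if let cached = imageCache.object(forKey: url as NSURL) {
        completion(cached)   // served locally, no billed download
        return
    }
    URLSession.shared.dataTask(with: url) { data, _, _ in
        let image = data.flatMap(UIImage.init(data:))
        if let image = image {
            imageCache.setObject(image, forKey: url as NSURL)
        }
        DispatchQueue.main.async { completion(image) }
    }.resume()
}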
Hope that helps...

How to achieve lightning-fast image download like Instagram

I am making an app that displays images in table view cells, similar to Instagram, using Firebase as the text database and Amazon S3 as the storage for image files. I use Alamofire to download images in my app.
I was looking at the Instagram app and realized that they can display so many photos so quickly. I was wondering what the general approach is to achieve something like that, taking into consideration things like:
When to start downloading the image?
Should the download stop at some point if required?
Are there ways to speed up the process with Firebase, S3, and Alamofire, or is this combination doomed to be capped at an unsatisfactory speed?
Caching images after download (already doing this) and loading from the cache when possible
How many parallel downloads should be going on at one time?
Note: I believe Instagram images are 200 KB~300 KB each.
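On the first two points, one common technique is table view prefetching: start downloads just before rows come on screen and cancel them when they leave the prefetch window. A rough sketch, where imageURLs stands in for your data source (the requests here only warm URLSession's cache; a real implementation would also store the decoded images):

import UIKit

class FeedViewController: UITableViewController, UITableViewDataSourcePrefetching {
    var imageURLs: [URL] = []
    private var inFlight: [IndexPath: URLSessionDataTask] = [:]

    override func viewDidLoad() {
        super.viewDidLoad()
        tableView.prefetchDataSource = self
    }

    // Start downloading before the rows are actually displayed.
    func tableView(_ tableView: UITableView, prefetchRowsAt indexPaths: [IndexPath]) {
        for indexPath in indexPaths {
            let task = URLSession.shared.dataTask(with: imageURLs[indexPath.row])
            task.resume()
            inFlight[indexPath] = task
        }
    }

    // Stop downloads that are no longer needed.
    func tableView(_ tableView: UITableView, cancelPrefetchingForRowsAt indexPaths: [IndexPath]) {
        for indexPath in indexPaths {
            inFlight[indexPath]?.cancel()
            inFlight[indexPath] = nil
        }
    }
}

As for parallelism, URLSessionConfiguration's httpMaximumConnectionsPerHost already caps concurrent connections to a single host, so the system largely manages that for you.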

Uploading big file (video size > 10 MB) using grails

Working with the Grails 3.* framework.
I am trying to upload a video file whose size will be greater than 10 MB.
Below is the code, which works perfectly for storing image files on the server when they are uploaded from the client browser:
File fileIns = new File("/opt/a.jpg")
fileIns << params.image
Can I use the same code for saving videos? If so, params.video may consume a huge amount of memory; how can I optimize that?
Or is there another method to stream the upload part by part and save it as a video file?
How do WhatsApp and Telegram transfer video files to their servers?
Any suggestions are appreciated.
If you are dealing with large data, I'd suggest going full Java.
Check this link about using streaming.
Another option is using HttpClient. Read this.

iOS Upload Photo in Parts

Let's say I want to preserve the full resolution of a photo on the iPhone and then upload it to a web service for storage. Quality is critical. Unfortunately, a 3200x2400 photo taken with the iPhone camera is approximately 10-12 MB as a PNG and 1-3 MB as a JPG (as of my latest tests).
Here we have a dilemma: on a 3G connection, a 12 MB upload is an eternity (relatively speaking, of course). I've explored a few options, including streams/chunking and background uploading, but it's still not ideal; I'd like the upload to be as fast as possible. See edit.
So my question is this: would it be possible to split an image into separate data chunks, upload them all concurrently using multiple asynchronous connections, and then re-assemble them server side? Does an implementation exist for this?
EDIT: So speed is capped by bandwidth as has been discussed in the comments. But there are other uses for chunking/splitting that I would like to explore. So the question still stands.
What you can do is split the image into several pieces, upload each one, and then reassemble them server side.
A benefit of that would be keeping the pieces that did arrive when a connection fails, then uploading only the remaining pieces afterwards.
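A rough sketch of the client side, assuming a hypothetical endpoint that accepts a chunk index and total count as query parameters (the URL and parameter names are made up; the server would reassemble the chunks by index):

import Foundation

// Split the image data into fixed-size chunks and upload them concurrently.
func uploadInChunks(_ data: Data, chunkSize: Int = 512 * 1024) {
    let total = (data.count + chunkSize - 1) / chunkSize
    for index in 0..<total {
        let range = (index * chunkSize)..<min((index + 1) * chunkSize, data.count)
        // Hypothetical endpoint; index/total let the server reorder chunks.
        var request = URLRequest(url: URL(string: "https://example.com/upload?index=\(index)&total=\(total)")!)
        request.httpMethod = "POST"
        request.httpBody = data.subdata(in: range)
        URLSession.shared.dataTask(with: request) { _, _, error in
            // A failed chunk can be retried alone instead of resending the whole photo.
            if let error = error { print("Chunk \(index) failed: \(error)") }
        }.resume()
    }
}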
