iOS app file storage: YouTube, parse.com, or local?

I have an iOS app that uses a lot of pictures and videos. Currently, I use parse.com as the backend and store the pictures in the parse.com database. Imagine I have 200 tasks, each task with two pictures and one video. Now, each user will have his own set of tasks; for example, user A may have 20 tasks, user B 40, and user C 50, all drawn from those 200.
Now, I'm worried about app performance, not so much server storage and other structure details...
I don't think including all the pictures in the app bundle is a good idea, because a user will only use about 40% of the pictures at a time, so it wouldn't be wise to bundle them all. Instead, I keep the pictures in the database, and when the user needs one, the app makes a request and fetches it. Am I right about this? Is this the best performance?
Regarding the videos, should I store them on YouTube or in my parse.com database? I think YouTube is the best option because I can stream the video, whereas with parse.com I would need to download the video first. Does that make sense?
If there are any other services or better way to handle pictures and videos in an iOS app, please let me know.

Regarding images - the best performance here is to store them remotely on a server, and from what it seems, you're doing it right.
Storing videos on YouTube is not the best idea, but it's definitely better than just storing them on your server. If you have the ability to upload multiple YouTube videos from lots of users, and if the Google API allows that, I'd go for this option.
In general, I would store as little as possible on the device, apart from perhaps an image cache. Nevertheless, even an image cache sometimes grows to hundreds of MBs, so you have to think about that as well.
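To keep an on-device image cache from growing unbounded, iOS lets you cap it explicitly. A minimal sketch using Foundation's URLCache, assuming plain URLSession image loads (the 20 MB / 100 MB limits are illustrative, not prescriptive):

```swift
import Foundation
#if canImport(FoundationNetworking)
import FoundationNetworking  // URLCache/URLSession live here on Linux
#endif

// Cap the cache so image responses don't silently grow to hundreds of MB.
let imageCache = URLCache(memoryCapacity: 20 * 1024 * 1024,
                          diskCapacity: 100 * 1024 * 1024,
                          diskPath: "imageCache")

let config = URLSessionConfiguration.default
config.urlCache = imageCache
// Serve cached bytes when present; only hit the network on a miss.
config.requestCachePolicy = .returnCacheDataElseLoad
let session = URLSession(configuration: config)
```

Requests made through `session` will then reuse cached image data within the stated limits, which also cuts repeat downloads from the backend.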

Related

Rails 6 - Cost risks involved while making a large video Public, hosted on AWS S3 and distributed on AWS CloudFront

I have already asked this question on the AWS Developer Forum but don't have any answer, hence posting the same question here to get some help.
I have a quite well-organized and fast Rails 6 app where users can upload large videos (4 GB) and images and also make them public to others. It uses the AWS SDK for S3 uploads and CloudFront to distribute and make the content available globally. All uploaded videos are transcoded into MP4 HD/Full HD videos using an Input S3 bucket -> MediaConvert -> Lambda -> Output S3 bucket -> CloudFront workflow.
Now my query is -
As users are allowed to upload up to 4 GB of video and can also make it public, will this feature of making large videos public also increase the cost/billing? Since the video is public, more and more people will watch it, leading to more incoming requests to CloudFront. Can someone correct me here?
If the above point is correct, what are the ways I can make videos public without affecting the billing/cost, for example using the CloudFront cache or any other way to minimize the increasing cost?
What are the ways I can allow users to share uploaded videos with others, without increasing the AWS billing?
There is no legitimate way to avoid the data transfer cost increase for the use case you described on AWS. Even if CloudFront caches your data, you still need to pay for CloudFront outbound data transfer.
As users are allowed to upload up to 4 GB of video and can also make it public, will this feature of making large videos public also increase the cost/billing? Since the video is public, more and more people will watch it, leading to more incoming requests to CloudFront. Can someone correct me here?
You are correct. You will be paying for all these outgoing data transfers.
If the above point is correct and will happen, what are the ways I can make videos public without affecting the billing/cost, for example using the CloudFront cache or any other way to minimize the increasing cost?
The only way is to earn money from such a service. So you either charge the users to view the videos, you charge the uploader for the upload and subsequent data transfers, or you earn through advertisements on your portal. Upload and download stay free, but you somehow make your users go through a bunch of advertisement links to compensate. There are other ways to monetize a website, depending on how popular your website becomes, e.g. collecting user data and selling it.
What are the ways I can allow users to share uploaded videos with others, without increasing the AWS billing?
See point two. You have to monetize your website, or simply change its architecture. Instead of storing all the files yourself, let users exchange torrent links. Then the files are not stored under your account, nor do you incur any cost associated with data transfers.
I truly appreciate the time and effort of all those who tried to help me.
I didn't get an answer, though I was able to get quite close to what I needed via the link below - a blog post explaining the parameters and other relevant areas that need to be considered when working with large videos, the costs involved, and how much total bit rate is viewed.
https://aws.amazon.com/blogs/media/frequently-asked-questions-about-the-cost-of-live-streaming/
Hope it helps someone looking for more specific answers.

Firebase Storage: How to reduce requests? (iOS)

I'm developing a chat app with Firebase. I'm currently still in the development phase.
Profile pictures of test users are uploaded to Firebase Storage and are downloaded on the home screen (with all the pictures). I realized that with this I very quickly used up storage download requests (I easily hit 3,000 requests in one night, and hit the free plan quota!).
What are some best practices I could use to minimize download requests? Just to be sure I'm doing it right - I'm sending a GET request directly to the Firebase Storage URL: https://firebasestorage.googleapis.com/... to download the image. Is that the right way to do it?
Two suggestions that might help:
Cache your images! If you keep requesting the same images over and over again over the network, that's going to use up your quota pretty fast. Not to mention your users' battery and network traffic. After you retrieve an image from the network, save it locally, and the next time you need an image, look for it locally before making another network request. Or consider using a library like PINRemoteImage that does most of the work for you (both on the retrieving and the caching side).
Consider uploading smaller versions of your images if you think you might be using them often. If your chat app, for instance, saves profile pictures as 1024x768 images but then spends most of its time showing them as 66x50 thumbnails, you're probably downloading a lot of data you don't need. Consider saving both the original image and a thumbnail, and grab the larger one only when you need it.
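A minimal sketch of the look-locally-first pattern from the first suggestion, assuming a plain directory-based cache; the helper names and the URL-to-filename mapping are illustrative, not part of the Firebase API:

```swift
import Foundation

// Hypothetical cache location; a real app might use the Caches directory.
let cacheDirectory = FileManager.default.temporaryDirectory
    .appendingPathComponent("imageCache", isDirectory: true)

// Return locally cached bytes for a remote URL, or nil on a cache miss.
func cachedData(for remoteURL: URL) -> Data? {
    let local = cacheDirectory.appendingPathComponent(remoteURL.lastPathComponent)
    return FileManager.default.contents(atPath: local.path)
}

// Persist freshly downloaded bytes so the next request never hits the network.
func store(_ data: Data, for remoteURL: URL) throws {
    try FileManager.default.createDirectory(at: cacheDirectory,
                                            withIntermediateDirectories: true)
    let local = cacheDirectory.appendingPathComponent(remoteURL.lastPathComponent)
    try data.write(to: local)
}
```

Before issuing a GET to the storage URL, call `cachedData(for:)`; only download (and then `store`) when it returns nil. Each avoided request is one fewer against the quota.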
Hope that helps...

How to achieve lightning-fast image download like Instagram

I am making an app that displays images in table view cells, similar to Instagram, using Firebase as the text-based database and Amazon S3 as storage for the image files. I use Alamofire to download images in my app.
I was looking at the Instagram app and realized that they can display so many photos so quickly. I was wondering what the general approach is to achieve something like that, taking into consideration things like:
When should the image download start?
Should the download stop at some point if required?
Are there ways to speed up the process with Firebase, S3, and Alamofire, or is this combination doomed to be capped at an unsatisfiable speed?
Caching images after download (already doing this) and loading from the cache when possible
How many parallel downloads should be going on at one time?
Note: I believe Instagram images are 200-300 KB each.

Watch videos as they are being uploaded

Is it possible to implement a feature that allows users to watch videos as they are uploaded to the server by others? Is HTML5 suitable for this task? What about Flash? Are there any ready-to-go solutions? I don't want to reinvent the wheel. The application will be hosted on a dedicated server.
Thanks.
Of course it is possible - the data is there, isn't it?
However, it will be very hard to implement.
Also, I am not so into Python and am not aware of a library or service suiting your requirements, but I can cover the basics of video streaming.
I assume you are talking about video files that are uploaded, not streams, because for streams there are obviously thousands of solutions out there...
In the simplest case, the video being uploaded is already ready to be served to your clients and has a so-called "faststart atom". These atoms are container-format specific, and there are sometimes a bunch of them. The most common is the moov atom. It contains a lot of data and is very complex; however, for our use case, in a nutshell, it holds the data that enables the client to begin playing the video right away, using the data available from the beginning.
You need that if you have progressive-download videos (YouTube...), meaning where a file is served from a web server. You obviously have not downloaded the full file, yet the player can already start playing.
If the faststart atom were not present, that would not be possible.
Sometimes playback still works, but the player, for example, cannot display a progress bar, because it doesn't know how long the file is.
Having that covered, the file can be uploaded. You will need an upload solution that writes the data directly to a buffer or a file (a file will be easier...).
This is almost always the case; for example, PHP creates a file in the tmp_dir. You can also specify the location if you want to find the video while it's being uploaded.
Well, now you can start reading that file byte by byte and write that data to a connection to another client. Just be sure not to read ahead of what has already been received and written. You would probably initiate your upload with metadata held in memory that tracks the currently received byte position and the location of the file.
Anyone who requests the file after the upload has started can just receive the entire file, or, if the upload is not yet finished, get it from your application.
You will have to throttle the data delivery or pause it when data runs short. This will appear to the client almost like a "slow connection". However, you will have to send some data from time to time to prevent the connection from closing. But if your upload doesn't stall (and why should it?), that shouldn't be a problem.
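The "never read past what has been received" rule above can be sketched as follows. This is an illustrative fragment, not a full server: `receivedBytes` stands in for the in-memory counter the upload handler updates, and a real implementation would synchronize it and wait when the reader catches up:

```swift
import Foundation

// Serve the next chunk of a still-uploading file, capped at the byte
// position the uploader has confirmed as written.
func serveAvailableBytes(of fileURL: URL,
                         from offset: UInt64,
                         receivedBytes: UInt64,
                         chunkSize: Int = 64 * 1024) throws -> Data {
    // Nothing new yet: return empty so the caller can throttle/stall.
    guard offset < receivedBytes else { return Data() }
    let handle = try FileHandle(forReadingFrom: fileURL)
    defer { try? handle.close() }
    try handle.seek(toOffset: offset)
    // Never read beyond the received position, even if the OS has
    // already allocated more of the file.
    let safeLength = min(UInt64(chunkSize), receivedBytes - offset)
    return handle.readData(ofLength: Int(safeLength))
}
```

The caller loops, advancing `offset` by the returned chunk's length, and sends a keep-alive when it gets an empty chunk back.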
Now if you want something like on-the-fly transcoding of various input formats into your desired output format, things get interesting.
AFAIK, ffmpeg has neat APIs which let you deal directly with data streams.
HandBrake is also a very good tool; however, you would need to take the long road using external executables.
I am not really aware of your requirements; however, if your clients are already tuned in, for example on a Red5 streaming server, feeding data into a stream should also work fine.
Yes, take a look at Qik, http://qik.com/
"Instant Video Sharing ... Videos can be viewed live (right as they are being recorded) or anytime later."
Qik provides developer APIs, including ones like these:
qik.stream.subscribe_public_recent -- Subscribe to the videos (live and recorded)
qik.user.following -- Provides the list of people the user is following
qik.stream.public_info -- Get public information for a specific video
It is most certainly possible to do this, but it won't be trivial. And no, I don't think you will find an "out of the box" solution that will require little effort on your behalf.
You say you want to let:
users watch videos as they are uploaded to server by others
Well, this could be interpreted two different ways:
Do you mean that you don't want a user to have to refresh the page before seeing new videos that other users have just finished uploading?
Or do you mean that you want one user to be able to watch a partially uploaded video (aka another user is still in the process of uploading it and right now the server only contains a partial upload of the video)?
Implementing #1 wouldn't be hard at all. You would just need an AJAX script to check for newly uploaded videos, and those videos could then be served to the user in whatever way you choose. HTML5 vs. Flash isn't really a consideration here.
The second scenario, on the other hand, would require quite a bit of effort. I am guessing that HTML5 might not be mature enough to handle this type of situation. If you are not looking to reinvent the wheel and don't have a lot of time to dedicate to this feature, then I would say you are out of luck. You may be able to use ffmpeg to parse partial video files and feed them to a Flash player, but I would consider this a large task.

Best way to collect data for an iPad app that also has an offline mode

My client states in an iPad app brief that the data (i.e. products and images) must be taken from an online source and saved. However, the app must also have an offline mode which shows this same data from when the app was previously online for times when internet access is not available (kind of like an offline reader). What would be the best way to tackle this? Any help greatly appreciated.
Download the data when the device is online and store it locally using whatever mechanism seems most appropriate (SQLite, Core Data, property lists, your own file format, etc.). Use this cached data when offline, and when online too unless it has changed. Create some mechanism that you can use to detect and download updates (preferably just the changes) when online.
This will be a big help for your users not just when they're offline, but online too. 3G data plans for the iPad are usually limited, so the better you can avoid repeat downloads of large resources like images, the better for your users.
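The download-then-cache mechanism described above can be sketched with a small Codable-backed store. The `Product` type, field names, and file location are assumptions for illustration; Core Data or SQLite would fill the same role for larger data sets:

```swift
import Foundation

// Illustrative model: a product record plus the path of its locally
// saved image, so both survive going offline.
struct Product: Codable, Equatable {
    let id: Int
    let name: String
    let imagePath: String
}

// Writes the latest snapshot to disk when online; reads it back offline.
struct OfflineStore {
    let fileURL: URL

    func save(_ products: [Product]) throws {
        try JSONEncoder().encode(products).write(to: fileURL, options: .atomic)
    }

    // Returns the last saved snapshot, or an empty list on first launch.
    func load() -> [Product] {
        guard let data = try? Data(contentsOf: fileURL) else { return [] }
        return (try? JSONDecoder().decode([Product].self, from: data)) ?? []
    }
}
```

On launch, show `load()` immediately, then refresh from the network and `save(_:)` the result when a connection is available; the same cached copy also spares online users repeat downloads.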
