I am currently trying to build a web app for the iPad which needs to be used in places with no internet connectivity. The app includes some video files and PDF, Doc, and PPT files, so the total application size will be around 100MB.
Initially I was planning to use HTML5's offline manifest caching to sync the assets to the iPad's memory when internet connectivity was available before going on the road, but unfortunately there appears to be a restriction (at least in iOS 3.2) that the cache can total no more than 5MB.
I am aware that the HTML5 specification places no limit on the cache manifest, but in practice most browsers impose a total cache limit of about 5MB (it varies from browser to browser), and this would not be enough for my needs.
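For reference, this is roughly the kind of manifest I was trying (the file names are just examples):

```
CACHE MANIFEST
# v1 - assets to make available offline (example paths)
videos/intro.mp4
docs/brochure.pdf
docs/specs.doc
slides/pitch.ppt

NETWORK:
*
```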
Could you please help me to understand the best way to get this application done?
Will HTML5 with Web SQL Database / IndexedDB allow me to do this?
Or is native application development the only option?
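To make the IndexedDB idea concrete, here is the kind of thing I was imagining (a rough sketch of my own, not from any library; `chunkRanges` splits a large download into byte ranges so a failed transfer could resume, and the IndexedDB part is browser-only):

```javascript
// Sketch: fetch large assets in chunks and store them in IndexedDB.
// All names here (DB_NAME, saveChunk, etc.) are hypothetical.

const DB_NAME = 'offline-assets';

// Split a byte count into [start, end] ranges for HTTP Range requests,
// so a 100MB file can be fetched and stored piece by piece.
function chunkRanges(totalBytes, chunkSize) {
  const ranges = [];
  for (let start = 0; start < totalBytes; start += chunkSize) {
    ranges.push([start, Math.min(start + chunkSize, totalBytes) - 1]);
  }
  return ranges;
}

// Browser-only part: open the database and store one chunk as a Blob.
function openDb() {
  return new Promise((resolve, reject) => {
    const req = indexedDB.open(DB_NAME, 1);
    req.onupgradeneeded = () => req.result.createObjectStore('chunks');
    req.onsuccess = () => resolve(req.result);
    req.onerror = () => reject(req.error);
  });
}

function saveChunk(db, url, range, blob) {
  return new Promise((resolve, reject) => {
    const tx = db.transaction('chunks', 'readwrite');
    tx.objectStore('chunks').put(blob, `${url}#${range[0]}`);
    tx.oncomplete = resolve;
    tx.onerror = () => reject(tx.error);
  });
}
```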
Related
We are using IBM MobileFirst 7.1 for our hybrid app.
We have noticed that on the iPad, direct update fails when the file size is around 400MB.
Why is there such a limit in place, and is there a way to get around it?
Technically there is no limit on direct update size; the MFP server can serve direct update requests at up to 250MB/second.
However, many factors need to be considered: your server's performance, the network, free space on the device, and so on. Most important of all is why your direct update is in the range of 400+MB, and by extension how large the application is on your device and on the server.
Note that direct update is a way to quickly update your web resources (JavaScript/CSS/HTML) over the air. Details here.
If your direct update archive is 400MB+, your actual application size will be much larger, which puts enormous I/O strain on your MFP server and runtime DB. There can also be runtime synchronization issues in loading your application content every time the server reboots.
If a 400MB direct update has to be served to all your connecting devices, the server will be starved of resources.
Moreover, for such a huge file it is more than likely the network connection will not hold until the download completes. End users may have to resume the download multiple times, and all the while the application cannot be used.
Finally, the end user's device must have enough free space to keep the downloaded archive and enough space to unarchive it.
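To put the size in perspective, a back-of-envelope calculation (the bandwidth figures below are illustrative assumptions, not measurements):

```javascript
// Back-of-envelope: how long a 400MB direct update takes at various
// sustained download speeds. Speeds are illustrative assumptions.
function downloadSeconds(sizeMB, mbitPerSec) {
  return (sizeMB * 8) / mbitPerSec; // MB -> megabits, then divide by rate
}

const archiveMB = 400;
console.log(downloadSeconds(archiveMB, 2));  // ~2 Mbit/s 3G: 1600 s (~27 min)
console.log(downloadSeconds(archiveMB, 20)); // ~20 Mbit/s Wi-Fi: 160 s
```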
The direct update failure you have seen is only a symptom. You should probably reconsider your application design.
Specifically:
a) Why is a hybrid application so big (possibly 500MB+)? Consider the time it will take to download your application from the App Store.
b) Is your server sufficiently tuned to handle the massive load?
Performance tuning
Optimization and tuning of MobileFirst Server
Optimization of MobileFirst Server project databases
c) Are you embedding audio/video content into your application?
d) You can try minification of your JS and CSS files to reduce the size:
Minification of JS and CSS files
As a rule of thumb, try to keep the direct update size to a few tens of MBs. If it is close to 100MB or more, consider distributing through the App Store or Play Store instead.
If you would still like to avoid application resubmission, you can use CDN to serve your direct update:
Serving direct update requests from a CDN
Note that this only takes the strain off your MFP server in serving direct update requests from all end users. The free space and network considerations on the end user's side do not change, and runtime synchronization issues at the MFP server are still a possibility.
I'm doing rails development locally (i.e. WEBrick 1.3.1). To test mobile, I put my iPhone 4s (iOS 8.1) on the same local network and load from the appropriate IP address.
It's really slow.
I watch the console and I can't figure out what the bottleneck is. I don't get the same behavior when I'm running the desktop browser locally. Of course, there's supposed to be some latency since the packets have to go over the wire, but it's unbelievably slow, on the order of more than 7 seconds. Sometimes not all the resources are loaded.
How can I improve load times for iOS/iPhone/mobile? Has anyone else run into this? For example, I thought perhaps it might be because we're loading fonts. Do local fonts (i.e. fonts that are on the system already) get optimized for rendering? This would explain some of the slow-down since we send a custom font.
If you have launched it online, make sure the performance issue is not coming from your hosting provider. If you are on a shared server, it may be heavily overloaded by content from other users. When I first launched my application built with PhoneGap, I experienced the same server sluggishness; after moving to another hosting provider, my app became much faster.
I'm looking to find a way to stream a user's desktop LIVE (through some piece of software, such as Open Broadcaster Software) to a web application.
I'm assuming I should use a CDN to deliver the live-streamed video to my web application, but how (and with what software) do I get the user's desktop to a streaming service? Should I use a service such as Red5 or an AWS service? Or, if only a few viewers are watching, should I host the service myself?
Although I have built my share of web applications, I have never dealt with live media streaming before, and I would appreciate any assistance anyone could lend.
By far the best resource for video on Rails is OpenTok
Our own demo here: http://bvc-video.herokuapp.com/broadcasts/1
--
Streaming
Video streaming is a tough one
The problem really depends on what you're trying to stream. If it's "live" video, i.e. captured and sent directly to the viewers, you'll have to use some sort of server to process the video.
Although I don't have huge experience with this, the main issue we've found is the compression / distribution of the feed. It's actually very simple to achieve video streaming on iOS - all the software / hardware is the same (just use the same API / drivers).
This often negates the requirement for a central server, although one is highly recommended (almost required) in many cases. Problems arise when you try to beam to multiple clients on multiple systems, as you'll run into compatibility issues.
--
Solutions
The solutions we've found are thus:
The most stable part of the app is to take the stream & send to a server
The wizardry will then be to beam that stream to multiple clients
The way to do this is typically to use a flash widget & pull the stream from the server
WebRTC is becoming the standard (OpenTok is built on this)
I'm not sure about video compression / distribution. Akamai is an industry heavyweight, but I've never used it. Brightcove too.
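On the capture side, modern browsers expose screen capture directly through WebRTC's `getDisplayMedia` API. A minimal sketch (getting the resulting stream to a server still needs something like OpenTok, your own `RTCPeerConnection`, or a `MediaRecorder` upload; the options shown are assumptions, not requirements):

```javascript
// Sketch: capture the user's desktop in the browser via the WebRTC
// screen-capture API, then hand the MediaStream to whatever transport
// you use (OpenTok session, RTCPeerConnection, MediaRecorder, ...).
const captureOptions = {
  video: { frameRate: { ideal: 30 } }, // cap frame rate to save bandwidth
  audio: false,                        // desktop audio capture is patchy
};

async function startDesktopCapture(onStream) {
  // Browser-only: prompts the user to pick a screen/window to share.
  const stream = await navigator.mediaDevices.getDisplayMedia(captureOptions);
  onStream(stream);
  return stream;
}
```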
We've just pushed out a new ASP.NET MVC based web-app that works fine on all desktop connections, and all mobile devices such as iPhone, etc. However, when certain pages are viewed over a 3G connection (either via a 3G dongle on a laptop, or directly on a mobile device), a blank, white page is served with no content of any kind. It appears as if we've returned an empty request.
On certain proxied networks, we're getting a similar issue whereby they say our request size is too large. This sort of makes sense, as it only affects certain pages, and I would assume the mobile network providers operate all manner of proxy servers on their side.
However, I've not been able to find any information on what would constitute a request that was too large. I've profiled one of the pages in question, here are some of the stats I thought might be relevant:
HTML content size: 33.04KB compressed, 50.65KB uncompressed
Total size of all stylesheets (4 files): 32.39KB compressed, 181.65KB uncompressed
Total size of all external JS (24 files): 227.82KB compressed, 851.46KB uncompressed
To me, the compressed size of the content is not excessive, but perhaps I'm wrong. Can anyone advise on what I can do to solve this problem, as I've had a really hard time finding any definitive information on this.
As far as MVC is concerned, 3G networks are no different from Wi-Fi. However, there is a limit on the size of the files a mobile browser can cache; files over that limit will be requested from the server with each postback.
Since some of the pages do work, I think it's a good idea to isolate the problem to a specific point of failure rather than searching in the wild.
You can debug problems easily with a 3G dongle and Firebug in Firefox, or with the Chrome developer tools.
Make sure there are no JavaScript errors causing the problem in the first place.
Confirm the JavaScript/CSS/HTML files are indeed delivered to the client (Firebug on the client). On the server, check the IIS logs, use MS Network Monitor, or set up an HTTP proxy where you can monitor the traffic - whichever you are comfortable with.
You have nearly 30 requests just for CSS/JavaScript/HTML, and the count may be higher if you have images. Completing all those requests may take forever on 3G. Try bundling the JavaScript files to reduce the request count. Browsers limit the number of simultaneous requests they can make, which adds to the overall time (Firefox, I believe, can make around 10 simultaneous requests).
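A rough model of why request count dominates on a high-latency link (the round-trip time and connection limit below are illustrative assumptions):

```javascript
// Rough model: with a per-host limit on parallel connections, total load
// time grows roughly with ceil(requests / parallel) * roundTripSeconds.
// RTT and connection-limit values are illustrative assumptions.
function loadTimeSeconds(requests, parallel, rttSeconds) {
  return Math.ceil(requests / parallel) * rttSeconds;
}

// ~30 requests, 6 parallel connections, 0.5 s round trip on 3G:
console.log(loadTimeSeconds(30, 6, 0.5)); // 2.5
// Bundled down to 5 requests:
console.log(loadTimeSeconds(5, 6, 0.5));  // 0.5
```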
We did actually resolve this issue; it came down to the size and number of Set-Cookie items in the response header. We found a bug whereby there were nearly 100 Set-Cookie items in the header. Fixing this bug and reducing the size of the cookie values resolved the problem.
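For anyone hitting something similar, a quick way to sanity-check total response-header size (a sketch; the 8KB threshold is only a commonly seen proxy default, not a documented limit):

```javascript
// Sketch: estimate total response-header size from a headers object,
// counting every Set-Cookie entry. The 8KB warning threshold is only a
// commonly seen proxy default, not a documented limit.
function headerBytes(headers) {
  let total = 0;
  for (const [name, values] of Object.entries(headers)) {
    for (const value of [].concat(values)) {
      total += name.length + 2 + value.length + 2; // "Name: value\r\n"
    }
  }
  return total;
}

// Hypothetical response resembling our bug: ~100 Set-Cookie entries.
const headers = {
  'Content-Type': 'text/html; charset=utf-8',
  'Set-Cookie': Array.from({ length: 100 }, (_, i) => `pref${i}=${'x'.repeat(64)}`),
};

console.log(headerBytes(headers), headerBytes(headers) > 8192);
```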
I have an iOS application that allows users to download content and then use it. During development I used Dropbox for data storage and downloads. Now I'm looking for a deployment solution: a service that can handle many concurrent downloads with decent download speed, provides sufficient storage space, is secure, and is not very expensive. Optionally, download tracking/monitoring features would be a plus.
Amazon S3 looks like a viable option. What other choices do I have?
So, we ended up using Rackspace which we think is more appropriate for our situation. And regarding the bandwidth limitation of Dropbox, here's what their representative said:
We automatically ban public links when they are responsible for an uncommonly large amount of traffic. These include all of the sharing links, not just the Public folder links.
The limit is 10GB/day for free accounts. Paid accounts have a much higher limit of 250GB/day. While we want you to be able to share your files with your friends, we can't be a content delivery network for the entire Internet.
Links are banned temporarily (3 days for the first time) and accounts will eventually restore their public links.
So, dropbox was not an option.