Accessing shared DB using iOS and Django

I'm just starting to learn about iOS development, and I figure the best way to get started is to build a simple (but non-trivial) app. My idea is this: have a web interface where a user can create a survey, and then access those surveys through the app and send responses back to the server. The web design part probably won't be terribly difficult -- I've done similar things with Django before. The part that will require learning/effort is the iPhone app.
I've got enough Objective-C that the data structures (model) won't be hard to code, and the UI (view, controller) part shouldn't be bad either. I predict that the interface between web and phone will be difficult, though. In particular, how will I be able to access the database on the server from the phone? I'd like to have a single DB that both web and phone apps use.
What I'd really like to have is a general, broad-strokes description of what I'll need to do to get this all up and running. Am I right in believing that the networking will be the hardest part? Are there any other possible snags? Any advice, or pointers to good resources on the subject, would be greatly appreciated.

Networking will probably not be the hardest part here; it only seems that way because that aspect is unfamiliar to you. For example, you can use NSURLConnection to take care of pretty much all the details of the server connection, and NSJSONSerialization to convert your data to and from a format that is suitable for sending over the wire.
Basically what you might do is:
The mobile app sends an HTTP GET request to the server for the survey info.
Server responds with a JSON description of the survey.
User fills out survey.
When done, the app sends the responses back in JSON format as an HTTP POST to the server.
Server stores the results in the database.
One of the key points here is that the app on the phone does not try to access the database directly. All requests go through your Django web app.
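To make that exchange concrete, here is a rough sketch of the round trip, written in TypeScript for brevity (the endpoint paths and payload shapes are invented; in the actual app you'd perform the equivalent requests with NSURLConnection and NSJSONSerialization as described above):

interface Survey {
  id: number;
  title: string;
  questions: { id: number; text: string }[];
}

// GET the survey definition as JSON (endpoint is hypothetical).
async function fetchSurvey(surveyId: number): Promise<Survey> {
  const res = await fetch(`https://example.com/api/surveys/${surveyId}/`);
  if (!res.ok) throw new Error(`GET failed with status ${res.status}`);
  return res.json();
}

// POST the user's answers back as JSON once the survey is filled out.
async function submitResponses(surveyId: number, answers: Record<number, string>): Promise<void> {
  const res = await fetch(`https://example.com/api/surveys/${surveyId}/responses/`, {
    method: "POST",
    headers: { "Content-Type": "application/json" },
    body: JSON.stringify(answers),
  });
  if (!res.ok) throw new Error(`POST failed with status ${res.status}`);
}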

Related

Keeping users in sync with each other in a social network app?

I am wondering what the best way is to keep users in sync with each other in a social network. The stack in question is an iOS app with a NodeJS backend. Let me give you an example:
Say X and Y are friends on a social network. Y's posts appear in X's feed, and as such, Y is cached somewhere on X's phone. This morning, however, Y decided to change profile pictures. All is well, and the new picture is uploaded to the server, but how do we go about letting X know about the change of profile picture?
My possible solution: create a route /<UID>/updates containing a stack of "cookies" that lets the client know who and what changed since the last time it made a GET request to the route.
This seems elegant enough, but what worries me is what happens on the client side (am I supposed to make a GET request every 2 minutes during my app's uptime?). Are there any other solutions?
One solution is indeed to poll the server, but that's not very elegant. A better way is to make use of websockets:
WebSockets is an advanced technology that makes it possible to open an interactive communication session between the user's browser and a server. With this API, you can send messages to a server and receive event-driven responses without having to poll the server for a reply.
They provide a two-way connection between client and server, allowing the server to notify the client of any changes. This is the underlying technology used in the Meteor framework, for example.
Take a look at this blog post for an example of how to use websockets between an iOS client and a NodeJS backend. It makes use of the open-source SocketRocket iOS library.
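For a feel of the server side, here is a minimal NodeJS broadcast sketch in TypeScript, assuming the ws npm package (the message shape and function name are made up):

import { WebSocketServer, WebSocket } from "ws";

// Track connected clients so profile changes can be pushed to them.
const wss = new WebSocketServer({ port: 8080 });
const clients = new Set<WebSocket>();

wss.on("connection", (ws) => {
  clients.add(ws);
  ws.on("close", () => clients.delete(ws));
});

// Call this when Y uploads a new profile picture: every connected
// friend (X) hears about it immediately, with no polling involved.
function notifyProfilePictureChange(userId: string, pictureUrl: string): void {
  const message = JSON.stringify({ type: "profile_picture_changed", userId, pictureUrl });
  for (const client of clients) {
    if (client.readyState === WebSocket.OPEN) client.send(message);
  }
}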

Solution for Web Application with Unreliable Internet Connection

We've developed a web application that is hosted on premises and made available to people on the shop floor via WiFi. However, the WiFi signal is not reliable, and it's not possible to use a wired network or improve the signal.
I am looking for a solution to handle this issue. Is there a way to put HTTP requests into a local queue and process them asynchronously in the background? If so, how would I do it? Or is there another approach?
Any thoughts are greatly appreciated.
I have the same problem in the company where I work: there are certain places the WiFi cannot reach, and the system needs to get information from the DB in order to show it to the user and then upload some new info.
Part of this system runs on iPads, so to solve the problem I use LocalStorage to store a JSON object that contains the info the user needs to work with. I store the info that is going to be uploaded in another JSON object, and when a connection is available, that info is uploaded.
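A minimal sketch of that pattern in TypeScript (the storage key and upload endpoint are assumptions):

// Queue outgoing data locally while offline.
function queueUpload(payload: object): void {
  const queue: object[] = JSON.parse(localStorage.getItem("uploadQueue") ?? "[]");
  queue.push(payload);
  localStorage.setItem("uploadQueue", JSON.stringify(queue));
}

// Flush the queue one item at a time, persisting progress after each send.
async function flushQueue(): Promise<void> {
  const queue: object[] = JSON.parse(localStorage.getItem("uploadQueue") ?? "[]");
  while (queue.length > 0) {
    const res = await fetch("/api/upload", { // hypothetical endpoint
      method: "POST",
      headers: { "Content-Type": "application/json" },
      body: JSON.stringify(queue[0]),
    });
    if (!res.ok) break; // still offline or server error: try again later
    queue.shift();
    localStorage.setItem("uploadQueue", JSON.stringify(queue));
  }
}

// The browser fires "online" when connectivity comes back.
window.addEventListener("online", () => void flushQueue());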
Hope it helps
I would recommend building the web app with AngularJS or another JavaScript framework of your choice. Once the user has loaded the site, you can perform asynchronous Ajax/HTTP requests to load the required data, and the web app will never reload the entire page.
If an HTTP request fails, you can have the web app simply try again (once, or as many times as you see fit) :)
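For example, a small retry wrapper along those lines (the retry count and delay are arbitrary):

// Retry a request a few times with a short pause between attempts.
async function fetchWithRetry(url: string, retries = 3, delayMs = 2000): Promise<Response> {
  for (let attempt = 1; attempt <= retries; attempt++) {
    try {
      const res = await fetch(url);
      if (res.ok) return res;
    } catch {
      // Network error (e.g. the WiFi dropped); fall through and retry.
    }
    if (attempt < retries) await new Promise((resolve) => setTimeout(resolve, delayMs));
  }
  throw new Error(`Request to ${url} failed after ${retries} attempts`);
}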

How to dynamically and efficiently pull information from database (notifications) in Rails

I am working on a Rails application, and below is a scenario that needs a solution.
I'm running some time-consuming processes in the background using Sidekiq and saving the related information in the database. When each process completes, we would like to show a notification in a separate area saying that the process has finished.
So the notifications area really needs to pull things from the back end (this area will be available on every page) and show them dynamically. I thought Ajax might be an option, but I don't know how to trigger it for a particular area only. Is there any other way for the client to fetch dynamic content from the server efficiently without creating much traffic?
I know this is a broad topic, but any relevant info would be greatly appreciated. Thanks :)
You're looking at a perpetual connection (using either SSEs or websockets), something Rails has started to address with ActionController::Live.
Live
You're looking for "live" connectivity:
"Live" functionality works by keeping a connection open between your app and the server. Rails is an HTTP request-based framework, meaning it only sends responses to requests. The way to send live data is to keep the response open (using a perpetual connection), which allows you to send updated data to your page on its own timescale.
The way to do this is to use a front-end method to keep the connection "live" and a back-end stack to serve the updates. The front end will need either SSEs or a websocket, which you'll connect to with JS.
SSEs and websockets basically give you access to the server outside the scope of "normal" requests (SSEs, for example, use the text/event-stream MIME type).
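On the front end, an SSE subscription is only a few lines; here is a sketch in TypeScript (the /notifications route is an assumption, and on the Rails side it would be an ActionController::Live action streaming text/event-stream):

// EventSource keeps the connection open and fires "message"
// every time the server writes to the stream.
const source = new EventSource("/notifications"); // hypothetical route

source.onmessage = (event: MessageEvent) => {
  const data = JSON.parse(event.data);
  const area = document.getElementById("notification-area"); // hypothetical element
  if (area) area.textContent = `Process ${data.id} completed`;
};

source.onerror = () => {
  // EventSource reconnects on its own; just log the hiccup.
  console.warn("SSE connection lost; the browser will retry");
};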
Recommendation
We use a service called Pusher.
This is basically a third-party websocket service to which you can push updates. Once the service receives an update, it sends it to any clients subscribed to the relevant channels; you can split what it broadcasts into channels using the pub/sub pattern.
I'd recommend using this service directly (they have a Rails gem, and I'm not affiliated with them); it also provides a super simple API.
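A client-side sketch using their pusher-js library (the app key, channel, and event names here are placeholders):

import Pusher from "pusher-js";

// Subscribe to a channel and react to events your server pushes to it.
const pusher = new Pusher("YOUR_APP_KEY", { cluster: "mt1" });
const channel = pusher.subscribe("notifications");

channel.bind("process-completed", (data: { message: string }) => {
  const area = document.getElementById("notification-area"); // hypothetical element
  if (area) area.textContent = data.message;
});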
Other than that, you should look at the ActionController::Live functionality of Rails.
The answer suggested in the comment by @h0lyalg0rithm is one way to go.
However, more primitive options are:
Use setInterval in JavaScript to perform a task every x seconds, i.e., polling.
Use jQuery or native Ajax to poll a controller/action via a route and have the controller return data as JSON.
Use document.getElementById or jQuery to update data on the page.
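Put together, a polling sketch in TypeScript (the route and the 10-second interval are assumptions):

// Poll a JSON endpoint on a fixed interval and refresh the notifications area.
setInterval(async () => {
  const res = await fetch("/notifications.json"); // hypothetical route
  if (!res.ok) return; // skip this tick on failure
  const notifications: { message: string }[] = await res.json();
  const area = document.getElementById("notification-area");
  if (area) area.textContent = notifications.map((n) => n.message).join("\n");
}, 10000);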

Secure data transfer between servers

I've got a slightly odd situation to develop for.
The MVC web system my team has to develop (the project is built with Rails) will rely on a login/password from another site.
The idea is that the user will have a login on the third-party site, and somewhere relevant there will be a link to our site. When the user clicks that link, we need to receive some of the user's data from that site.
We have no control over the third-party server and no direct access to their database. Plus, getting them to make any change to their application/infrastructure is a BIG DEAL, so I am searching for the solution with the least impact on them. (Of course they will have to change something, but it will be a political issue, so the less, the better.)
From our point of view, we need to be sure that the user really came from the third-party site (and only from there), and that we have not received a forged message from an attacker.
Their site has a valid SSL certificate. (No idea if my system will have one; it should.)
Not sure if it's relevant, but we think their server is an Oracle application server that connects to an Oracle database server on their internal network.
I first thought of using just SSL, but I'm not sure how to do it (what do I have to check? what do I have to change?) or whether it is safe enough.
My second thought is to use PGP keys: have them sign and encrypt the data before sending it to us, and have the link to our site make a POST to a controller on our server, which would verify and decrypt the data.
Anyone have any tips/pointers/thoughts that could help me?
If both servers are using SSL, and supposing the other server gives you at least a JSON or XML interface, it should be OK to simply make a secure request (using, for example, rest-client) and evaluate the response on your server.
Most likely you will want to cache user data on login on your own server, and only query the other server when the user/password isn't found locally; this will reduce the load.
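A sketch of that flow, written in TypeScript for brevity (in Rails you'd make the same HTTPS call with rest-client; the URL and response shape are invented):

// Look up the user locally first; fall back to the third-party site over HTTPS.
interface RemoteUser { name: string; email: string; }
const userCache = new Map<string, RemoteUser>();

async function lookupUser(login: string): Promise<RemoteUser> {
  const cached = userCache.get(login);
  if (cached) return cached;

  // HTTPS protects the request in transit; the endpoint is hypothetical.
  const res = await fetch(`https://third-party.example.com/users/${encodeURIComponent(login)}.json`);
  if (!res.ok) throw new Error(`Remote lookup failed: ${res.status}`);
  const user: RemoteUser = await res.json();
  userCache.set(login, user);
  return user;
}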

Rails app with no database and continually updated models

I'm wondering about the best way to go about developing a Rails application with the following features:
All of the data comes from a SOAP request to a 3rd party
A background task will make this soap request every ~10s
The background task will parse the response and then update an ActiveRecord model accordingly
The data isn't written to a database at all; if the app fails, the data will come from the SOAP request again when we start it back up
Users will make a request to the app, which will simply show data from the model (i.e., from the SOAP request).
The idea is to avoid making the SOAP request for every single user as the data won't change that frequently. Not using a database avoids reading and writing of data that only ever comes from the request anyway.
I imagine that all of this can be accomplished quite simply with a few gems, but I've had a bit of trouble sorting through what meets my requirements and what doesn't.
Thanks
I'm not sure what benefit you're getting from using ActiveRecord in this case.
Perhaps consider some other type of persistence for the SOAP calls?
If the results from the web service really don't change, I would recommend the Rails caching mechanism. Anywhere in your Rails app, you can do:
Rails.cache.fetch("a_unique_cache_key") do
  # Do your SOAP request here and return the result, e.g. with a
  # hypothetical Savon client: soap_client.call(:get_data).body
  soap_client.call(:get_data).body
end
This will do the work within the block just once and fetch the result from the Rails cache store on subsequent calls.
The cache store can be of various types (one of which is the memcached store). I usually go with the file store for medium-traffic sites, but you may choose another:
http://guides.rubyonrails.org/caching_with_rails.html
