I am creating an iOS application with a Ruby on Rails backend using ActiveRecord and MySQL.
The application can post various types of media (images, GIFs, videos).
Any user is allowed to post, and posts whose boolean flag is_private is false will show up in a global feed visible in the application (think Facebook news feed).
For this global feed I need to build pagination and pull-to-refresh.
On the application side I have built the network services and models, but I need to nail down the logic for how to fetch data for the global feed.
My questions are:
- What is the common structure for backend/application communication?
- So far I have a method which gets the initial page and a method which gets the next page, starting at the last current item.
- How do I deal with the fact that more items have been inserted at the head (top) of the data source while the user has been scrolling, breaking continuity?
Pagination with write consistency is harder than blind pagination. The basics of pagination are that you want to load an initial set and then be able to go down (typically back in time) from the point of the last fetch.
There are two types of pagination:
1. Offset-based: fetch the top of the data source, then fetch the next page, and then the next. The problem with this approach is that when items are inserted at the top of the data source, your definition of page 2 shifts by n items (n being the number of inserts since the last fetch).
2. Keyset-based: fetch the head of the list and store the last id in the list returned from the server. On the next request (when the user scrolls to the bottom of the list), send the last seen id and filter to the next m items after that last seen id.
The first request (GET items.json) returns:
response = [12,11,10,9,8]
Store the last id:
last_id = response.last
and send it with the next request (GET items.json?last_id=8):
response = [7,6,5,4]
and so on down the list.
Pull-to-refresh sends a request (GET items.json) to fetch the head of the list:
[19,18,17,16,15]
Then make another request (GET items.json?last_id=15) to fill in the gap between 15 and 12.
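The whole last_id scheme can be sketched in plain Ruby over an in-memory list of ids. In a real Rails controller this would be a scope like Item.where("id < ?", last_id).order(id: :desc).limit(per_page); the method and parameter names here are illustrative assumptions, not your actual code.

```ruby
# Sketch of keyset (last_id) pagination over ids sorted newest-first.
def fetch_page(ids, last_id: nil, per_page: 5)
  # Only look at items strictly older than the last seen id, so page
  # boundaries don't shift when new items arrive at the head.
  scoped = last_id ? ids.select { |id| id < last_id } : ids
  scoped.first(per_page)
end

all_ids = (1..19).to_a.reverse   # 19 is the newest item

page1 = fetch_page(all_ids)                      # => [19, 18, 17, 16, 15]
page2 = fetch_page(all_ids, last_id: page1.last) # => [14, 13, 12, 11, 10]
```

Note that even if ids 20 and 21 were inserted between the two calls, page2 would come back identical, which is exactly the continuity property the offset approach lacks.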
Interesting question!
The best information I have pertains to how to "receive" data from the backend in "real time". I'm not sure how you'll handle the JS scrolling mechanism.
--
Live
The "live" functionality of the system is basically handled by passing data through either an SSE or WebSocket (asynchronous) connection, to make it appear like your application is operating in "real time".
In reality, "live" applications are nothing more than those which are constantly "listening" to the server, allowing you to take any data it sends and put it on the page with JS.
If you wanted to keep the "feed" up to date perpetually, I'd recommend using either of these technologies:
SSEs
The most elemental way of doing this is to use server-sent events, an HTML5 technology which allows you to push data from your server to the DOM using the text/event-stream content type.
This is considered a "native" way of handling updates from the server:
// app/assets/javascripts/application.js
var source = new EventSource("your_end_point");
source.onmessage = function(event) {
  // your data here
};
This is complemented on the Rails side by ActionController::Live and its SSE class:
class MyController < ActionController::Base
  include ActionController::Live

  def index
    response.headers['Content-Type'] = 'text/event-stream'
    sse = SSE.new(response.stream, retry: 300, event: "event-name")
    sse.write({ name: 'John' })
    sse.write({ name: 'John' }, id: 10)
    sse.write({ name: 'John' }, id: 10, event: "other-event")
    sse.write({ name: 'John' }, id: 10, event: "other-event", retry: 500)
  ensure
    sse.close
  end
end
The problem with SSEs is that they behave much like AJAX long-polling: the browser keeps holding (and re-establishing) HTTP connections to your server. I don't like this.
--
WebSockets
WebSockets are the "right" way to connect and receive data in "real time":
They basically allow you to open a persistent connection between your front end and your server, meaning you won't have to send constant requests to your server. I don't have much experience with WebSockets, but I do with Pusher.
--
Pusher
I'd highly recommend Pusher; it's a third-party WebSocket system (I am not affiliated with them in any way).
Simply put, it allows you to send updates to the Pusher service and read them on your system. It takes all the hassle out of having to provide connectivity for your own WebSocket app.
You can read up on how it works, as well as study the pusher gem.
Here is my workflow:
Person clicks on my ScheduleOnce link and schedules a meeting
Upon completing the ScheduleOnce booking form, the person clicks the done button
When this done button is clicked, the person is redirected to a Node.js web app that displays an application page. This application page needs to be auto-populated with the information from the ScheduleOnce page.
Between steps 2 and 3 is where Zapier comes in. I am trying to use Zapier to capture the data from the ScheduleOnce booking, which it is. Then I am trying to use a Zap to send that data to the page the person is redirected to, to auto-populate some of the fields.
I thought using the Code (JavaScript) step would work, but it does not. So then I was thinking about using the StoreClient option or the API. I am just confused about how to get the flow to access the data and auto-populate the fields on the next redirected page.
Some help would be greatly appreciated.
Here is the code I have for the Javascript option:
var store = StoreClient("Secret");
store
  .setMany({firstName: inputData.firstName, lastName: inputData.lastName, email: inputData.email, mobilePhone: inputData.mobilePhone, otherPhone: inputData.otherPhone, businessWebsite: inputData.businessWebsite})
  .then(function() {
    return store.getMany('firstName', 'lastName', 'email', 'mobilePhone', 'otherPhone', 'businessWebsite');
  })
  .then(function() {
    callback();
  })
  .catch(callback);
David here, from the Zapier Platform team. This is a cool use case and is probably possible. Something you need to remember is that Zapier is running totally separately from the user, so interaction will have to be indirect. Zapier can't redirect your user anywhere, it can just store data in response to a button push.
In your case you can skip everything after the setMany, since you're not trying to use the values in the zap; you just need to store them (and verify that the action completed without errors).
var store = StoreClient("Secret");
store
  .setMany({firstName: inputData.firstName, lastName: inputData.lastName, email: inputData.email, mobilePhone: inputData.mobilePhone, otherPhone: inputData.otherPhone, businessWebsite: inputData.businessWebsite})
  .catch(callback);
You'll need to solve a couple of problems:
Speed. The user will reach your landing page before the zap completes (it has to make a couple of HTTP round trips and execute code). You'll want to show them a three-second loading gif, or put up a waiting message and allow them to refresh the destination.
Populating the page. I'm not sure what the nature of the destination is (best-case scenario is that it's a server you control), but something will need to make an HTTP request to store.zapier.com to retrieve the stored data and surface it in the view. This is easy if you control the server.
Identifying the user. You'll need some way to connect the user getting redirected to the data you stored in StoreClient. If two users fill out the form in quick succession, the second will currently overwrite the first. Plus, this seems to be semi-sensitive data that you don't want available to just anyone on your site. To that end, you'll probably want to store all of the data as a JSON string keyed by the user's email (or something else unique). That way, when I (the user) finish the form, I'm redirected to yoursite.com/landing?email=david@zapier.com, the backend knows to look for the david@zapier.com key in the store, and it can render a view with the correct info.
To that end, I'd tweak the code to the following:
var store = StoreClient("Secret");
store
  .set(inputData.email, JSON.stringify({firstName: inputData.firstName, lastName: inputData.lastName, email: inputData.email, mobilePhone: inputData.mobilePhone, otherPhone: inputData.otherPhone, businessWebsite: inputData.businessWebsite}))
  .catch(callback);
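On the landing page's backend, the lookup half of this flow might look roughly like the following Ruby sketch. It only builds the request; the store.zapier.com endpoint and X-Secret header follow Zapier's storage API as I understand it, so treat the exact URL and header as assumptions to verify against the current docs.

```ruby
require "net/http"
require "uri"

# Hedged sketch: build a lookup request for the record the zap stored
# under the visitor's email address (the key used in the set() call).
def build_lookup_request(email, secret)
  encoded = URI.encode_www_form_component(email)
  uri = URI("https://store.zapier.com/api/records?key=#{encoded}")
  req = Net::HTTP::Get.new(uri)
  req["X-Secret"] = secret  # the same secret passed to StoreClient
  req
end

req = build_lookup_request("david@zapier.com", "Secret")
# Sending req would return JSON whose value is the stringified record,
# which the view layer can parse to pre-fill the form fields.
```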
Hope that points you in the right direction. You're working with a pretty complicated workflow, but I bet you can do it!
I have been using yandex.tank for a few days to perform load tests.
I have set up the URL list in different ways, but I cannot achieve my goal.
I want to simulate a real visit (like a web browser):
request
html response
request of objects embedded in the code
I can create a grouped list of the objects embedded in the code, but the results are oriented to each request individually. For example:
My "home" tag in "Cumulative Cases Info" shows me:
4554 28.21% / avg 171.2 ms
171.2 ms is the average time of each of the objects. I want the average time for the full request (HTML and embedded objects).
Is it possible to perform a load test making requests like those indicated with yandex.tank? Or with another load-testing tool?
Yandex.tank (or rather Phantom, its default load generator) doesn't parse responses and therefore knows nothing about embedded resources. You'd better try JMeter as a load generator, since its HTTP Request sampler has an option to retrieve embedded resources: http://jmeter.apache.org/usermanual/component_reference.html#HTTP_Request
My requirement is to update a user session record in the database on each click that hits the server.
So I have written a filter for this:
allExceptLogin(controller: 'login', invert: true) {
    before = {
    }
}
This works fine, in that execution enters the filter where I can update the record, but the problem is that if a single click triggers more than one method call, it goes through this filter that many times.
For example, if I click on a page which calls 4 different methods from the same or different controllers, it will go through this filter 4 times, which will eventually update the record 4 times.
I need some condition which says: 1 click = 1 pass through this filter.
Is this possible or can this be achieved by any other way?
The server has no notion of "clicks", it only deals with requests. One possible approach is to have the client send a key param with each "click" that your filter could then process in one batch.
Another option is to set a timeout on the server (e.g. on the session object) to only process requests every x seconds. You may miss some related calls as well, but that may be OK.
The short of it is that Grails itself does not have a built-in mechanism to differentiate between related requests.
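The timeout idea above is language-agnostic; here is a small sketch in plain Ruby (a Grails filter would keep the same timestamp in the HTTP session). The class and method names are made up for illustration, and the injectable clock exists only to make the behavior easy to demonstrate.

```ruby
# Sketch of "only process one request per time window", the server-side
# approximation of 1 click = 1 update. Per-session state in a real app.
class ClickThrottle
  def initialize(window_seconds, clock: -> { Time.now.to_f })
    @window = window_seconds
    @clock = clock
    @last_processed = nil
  end

  # Returns true only for the first request inside each window, so several
  # near-simultaneous requests from one click count as a single update.
  def process?
    now = @clock.call
    return false if @last_processed && now - @last_processed < @window
    @last_processed = now
    true
  end
end

t = 0.0
throttle = ClickThrottle.new(2, clock: -> { t })
throttle.process?  # => true  (first request of the click)
t = 0.1
throttle.process?  # => false (second method call, same click)
t = 3.0
throttle.process?  # => true  (a later click)
```

The tradeoff is exactly the one noted above: two genuinely separate clicks inside the same window would also be collapsed into one update.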
What I'm trying to do:
Be able to have users subscribed to a number of different 'chat rooms' and use reverse AJAX / comet to send messages from a chat room to everyone logged into that room. (a bit more complicated but this is a similar use case).
What I'm doing:
Using Grails with JMS and Atmosphere. When a message is sent, I'm using JMS to send the message object which is received by a Grails service which is then broadcasted to the atmosphere URL (i.e. atmosphere/messages).
Obviously JMS is a bit redundant there, but I thought I could use it to help me filter who should receive the message, although that doesn't really look like it'll work (given that the subscriber is basically a singleton service...).
Anyway, what I need to be able to do is only send out a message to the correct subset of people listening to atmosphere/messages. A RESTful-type URL would be perfect here (i.e. atmosphere/messages/* where * is the room ID), however I have no idea how to do that with Atmosphere.
Any ideas / suggestions on how I can achieve what I want? Nothing is concrete at all here, so feel free to suggest almost anything. Based on the response to another question, I've even been wondering whether I could do something like send messages to a Node.js server and have that handle the reverse AJAX / Comet part.
If I understand your requirements correctly, the following should work (JAX-RS + Scala code):
1) Everyone who wants to get messages from a chat room registers for it:
@GET
@Path("choose/a/path")
def register(@QueryParam("chatroomId") chatroomId: Broadcaster) {
  // alternatively, the @Suspend annotation can be used
  new SuspendResponse.SuspendResponseBuilder[String]()
    .resumeOnBroadcast(false).broadcaster(chatroomId).scope(SCOPE.REQUEST)
    .period(suspendTimeout, TimeUnit.MINUTES)
    .addListener(new AtmosphereEventsLogger()).build
}
2) To broadcast a message for all the registered users, call the following method:
@POST
@Broadcast
@Path("choose/a/path/{chatroomId}")
def broadcast(@PathParam("chatroomId") id: String) {
  // first find your broadcaster with the BroadcasterFactory
  BroadcasterFactory.getDefault().lookupAll() // or maybe there is a find-by-id?
  broadcaster = ...
  broadcaster.broadcast(<your message>)
}
I also recommend reading the atmosphere whitepaper, have a look at the mailing list and at Jeanfrancois Arcand's blog.
Hope that helps.
There is a misunderstanding of the concept of Comet. It's just another publish/subscribe implementation. If you have multiple chat rooms, then you need multiple "topics", i.e. multiple channels the user can register to. E.g.:
broadcaster['/atmosphere/chatRoom1'].broadcast('Hello world!')
broadcaster['/atmosphere/chatRoom2'].broadcast('Hello world!')
So I would advise you to create multiple channels and not manually filter the set of users who should receive messages (which is definitely not the way it should be done). You do not need to create anything on the server side for this, since the user will just register for a specific channel and receive whatever messages anyone puts into it.
I would recommend you create an AtmosphereHandler for one URL like /atmosphere/chat-room and then use the AtmosphereResource to bind a BroadcastFilter with it; let's name it ChatRoomBroadcastFilter.
Whenever a user subscribes to a new chat room, a message is sent to the server (from the client) telling the server about the subscription. Once subscribed, maintain the user <> chat room bindings somewhere on the server.
Whenever a message is broadcast, include the chat room id with it. Then, in the ChatRoomBroadcastFilter (you probably need to make this a PerRequestBroadcastFilter), propagate the message to a user only if that user subscribed to the chat room. I am not sure if this clears it up. If you need a code example, please mention it in the comments; I'll put one up, but that needs some time so I'm not adding it right now ;).
I'm in the process of implementing this feature and wanted to get some suggestions.
When a user lands on this one page on my site, I get the user's connections from linkedin (via an API call). Now I need to display these connections on my page 10 connections at a time for some further processing. After loading the first 10 initially I want to give the user an option to load 10 more connections (by hitting a button), and then 10 more, etc.
I was thinking of implementing this in such a way that when user arrives on page, I make the API call, get all connections at one time, store them all into a JSON object and then display the first 10. If the user asks for the next 10, then I read the next 10 from the JSON object.
Is this the best way to achieve this paging ability? Is using a JSON object a good way of going about it? Note, the idea here is to avoid making multiple API calls, since LinkedIn has a daily limit. Thanks.
I like this method, especially for something like Autocomplete where there will be a server-side request every time the user makes a keystroke. I noticed that it was putting a lot of strain on my server (because the autocomplete gets used on the app almost 24/7). So I'll use an example from my code for you to adapt. It's for entering in playlists.
I have an action in my PlaylistEntriesController called playlist which responds to JS:
def playlist
  @all_playlist_entries = PlaylistEntry.scoped # just a dump of all the playlist entries for autocomplete
end
Then in playlist.js.erb:
<% cache 'playlist_entries', :expires_in => '1.day' do %>
  var titles = <%= raw @all_playlist_entries.map(&:title).uniq.to_json %>
  var artists = <%= raw @all_playlist_entries.map(&:artist).uniq.to_json %>
  var albums = <%= raw @all_playlist_entries.map(&:album).uniq.to_json %>
  var autocomplete_options = { delay: 100, autoFocus: true, minLength: 3 };
<% end %>
And then I simply pass those javascript variables as the "source" for the autocomplete. It's FAST (tested with 100,000 entries, database response time was something like 4 seconds and the actual autocomplete is still almost immediate), and even better, it only requires one call to the database each day, since it's cached, as opposed to THOUSANDS. The downside is, of course, that new entries won't be available until the next day, but in my case that is not a big issue.
So this is not exactly how you would implement it, but it will work. You can make one call to the LinkedIn API and store the results in a JavaScript array as JSON objects (using to_json, which you can override in the model if you need to). Then have a link that says "Show 10 more", which will do some jQuery magic to show the next ten (I can't do it off the top of my head, but it's a matter of finding out how many are showing, either with a JavaScript variable or by storing the number as a data attribute or id on each DIV, and then getting the next 10 in the array).
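The "next 10 from the cached result" logic itself is just array slicing. Here is a small Ruby sketch over a fake cached payload; the connection fields are made up, since the real LinkedIn response shape differs, and the same arithmetic applies whether you slice on the server or in client-side JS.

```ruby
require "json"

# Fake cached API response: 25 connections stored as a JSON string,
# the way you might keep the one-time LinkedIn result around.
cached_json = (1..25).map { |i| { "name" => "Connection #{i}" } }.to_json
connections = JSON.parse(cached_json)

# Return the nth page of ten connections (page numbering starts at 0).
def page_of(connections, page, per_page: 10)
  connections.slice(page * per_page, per_page) || []
end

page_of(connections, 0).length  # => 10 (initial display)
page_of(connections, 2).length  # => 5  (last, partial page after two "Show 10 more" clicks)
```

Each "Show 10 more" click just increments the page counter; no further API calls are needed until the cache expires.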