I'm in the process of implementing this feature and wanted to get some suggestions.
When a user lands on a particular page on my site, I get the user's connections from LinkedIn (via an API call). I then need to display these connections on my page, 10 at a time, for some further processing. After loading the first 10, I want to give the user an option to load 10 more connections (by clicking a button), then 10 more, and so on.
I was thinking of implementing this so that when the user arrives on the page, I make the API call, get all the connections at once, store them all in a JSON object, and then display the first 10. If the user asks for the next 10, I read the next 10 from the JSON object.
Is this the best way to achieve this paging ability? Is using a JSON object a good way of going about it? Note, the idea here is to prevent making multiple API calls, since LinkedIn has a daily limit. Thanks.
I like this method, especially for something like autocomplete, where there would otherwise be a server-side request on every keystroke. I noticed that this was putting a lot of strain on my server (because the autocomplete gets used on the app almost 24/7). So I'll use an example from my code for you to adapt. It's for entering playlists.
I have an action in my PlaylistEntriesController called playlist which responds to JS:
def playlist
  @all_playlist_entries = PlaylistEntry.scoped # a dump of all the playlist entries for autocomplete
end
Then in playlist.js.erb:
<% cache 'playlist_entries', :expires_in => 1.day do %>
  var titles = <%= raw @all_playlist_entries.map(&:title).uniq.to_json %>;
  var artists = <%= raw @all_playlist_entries.map(&:artist).uniq.to_json %>;
  var albums = <%= raw @all_playlist_entries.map(&:album).uniq.to_json %>;
  var autocomplete_options = { delay: 100, autoFocus: true, minLength: 3 };
<% end %>
And then I simply pass those javascript variables as the "source" for the autocomplete. It's FAST (tested with 100,000 entries, database response time was something like 4 seconds and the actual autocomplete is still almost immediate), and even better, it only requires one call to the database each day, since it's cached, as opposed to THOUSANDS. The downside is, of course, that new entries won't be available until the next day, but in my case that is not a big issue.
So this is not exactly how you would implement it, but the same approach will work. You can make one call to the LinkedIn API and store the results in a JavaScript array as JSON objects (using to_json, which you can override in the model if you need to). Then have a link that says "Show 10 more", which does a bit of jQuery magic to show the next ten. I can't write it off the top of my head, but it's a matter of finding out how many are showing (either with a JavaScript variable, or by storing the count as a data attribute or id on each DIV) and then getting the next 10 from the array.
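As a minimal sketch of that idea (the names are hypothetical, and the DOM rendering is left out): load the connections once, keep a counter of how many are showing, and slice off the next ten on each click.

```javascript
// All connections, fetched once from the LinkedIn API (e.g. via to_json).
// Rendering is up to you; nextPage() just returns the entries to display.
var connections = [];   // populated once from the API response
var shownCount = 0;     // how many are currently displayed
var PAGE_SIZE = 10;

function nextPage() {
  // Take the next slice and advance the counter.
  var page = connections.slice(shownCount, shownCount + PAGE_SIZE);
  shownCount += page.length;
  return page;
}
```

Each click of the "Show 10 more" button calls nextPage() and renders the returned slice; when it returns an empty array, you can hide the button.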
I am using the Instagram API. My query is:
api.instagram.com/v1/users/[USER_ID]/media/recent?count=3818
I get only 33 photos, ordered by date. But I need to load all of the user's photos that have the most likes.
What am I missing here?
You can use pagination. See http://instagram.com/developer/endpoints/ for information on pagination. You need to step through the result pages sequentially, each time requesting the next part with the next_url that the result specifies in the pagination object.
From the Instagram docs:
pagination
Sometimes you just can't get enough. For this reason, we've provided a
convenient way to access more data in any request for sequential data.
Simply call the url in the next_url parameter and we'll respond with
the next set of data.
{
    ...
    "pagination": {
        "next_url": "https://api.instagram.com/v1/tags/puppy/media/recent?access_token=fb2e77d.47a0479900504cb3ab4a1f626d174d2d&max_id=13872296",
        "next_max_id": "13872296"
    }
}
On views where pagination is present, we also support the "count"
parameter. Simply set this to the number of items you'd like to
receive. Note that the default values should be fine for most
applications - but if you decide to increase this number there is a
maximum value defined on each endpoint.
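That page-stepping loop can be sketched like this (fetchAll and fetchPage are hypothetical names; fetchPage(url) is a stand-in for whatever HTTP client you use, and is assumed to return the parsed JSON for a given URL):

```javascript
// Collects all items across pages by following pagination.next_url,
// which Instagram omits on the last page.
function fetchAll(firstUrl, fetchPage) {
  var items = [];
  var url = firstUrl;
  while (url) {
    var response = fetchPage(url);
    items = items.concat(response.data);
    url = response.pagination && response.pagination.next_url;
  }
  return items;
}
```

Since the endpoint orders by date, sorting by likes would then happen on your side once all pages are collected, e.g. `items.sort(function(a, b) { return b.likes.count - a.likes.count; })`.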
I am creating an iOS application with a Ruby on Rails backend using ActiveRecord and MySQL.
The application is able to post various types of media (images, GIFs, videos).
Any user is allowed to post, and posts that have the boolean flag is_private set to false will show up in a global feed visible in the application (think Facebook news feed).
For this global feed I need to build pagination and pull-to-refresh.
On the application side I have built the network services and models, but I need to harden the logic for fetching data for the global feed.
My questions are:
- What is the common structure for the backend and application communication?
- So far I have a method which gets the initial page and a method which gets the next page, starting at the last current item.
- How do I deal with the fact that more items have been put at the head (top) of the data source while the user has been scrolling, causing issues with continuity?
Pagination with write consistency is harder than blind pagination. The basics of pagination are that you want to load an initial set and then be able to go down (typically back in time) from the point of the last fetch.
Two types of pagination:
Fetch the top of the data source, then fetch the next page, and then the next page. The problem with this approach is that when items are inserted at the top of the data source, your definition of page 2 shifts by n items (where n is the number of inserts since the last fetch).
Fetch the head of the list and store the last id in the list returned from the server. On the next request (when the user scrolls to the bottom of the list), send the last seen id, and then filter to the next m items after that last seen id.
first request (GET items.json) returns
response = [12,11,10,9,8]
store the last id
last_id = response.last
send it with the next request (GET items.json?last_id=8)
response = [7,6,5,4]
and so on, on the way down
pull to refresh sends a request (GET items.json) to fetch the head of the list
[19,18,17,16,15]
then make another request (GET items.json?last_id=15) to fill in the gaps between 15 and 12
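That gap-filling step can be sketched like this (fillGap is a hypothetical helper; ids are assumed to be descending integers as in the example above, and fetchAfter(lastId) stands in for GET items.json?last_id=...):

```javascript
// Merge a freshly-fetched head with the list we already have, paging
// downward with last_id until the results overlap the known items.
function fillGap(newHead, oldList, fetchAfter) {
  if (oldList.length === 0) return newHead.slice();
  var newestOld = oldList[0];
  // Only keep items strictly newer than what we already have,
  // so an unchanged head doesn't produce duplicates.
  var merged = newHead.filter(function(id) { return id > newestOld; });
  while (merged.length > 0 && merged[merged.length - 1] > newestOld) {
    var page = fetchAfter(merged[merged.length - 1]);
    if (page.length === 0) break;          // server has nothing in between
    var reachedKnown = false;
    for (var i = 0; i < page.length; i++) {
      if (page[i] <= newestOld) { reachedKnown = true; break; }
      merged.push(page[i]);
    }
    if (reachedKnown) break;               // the gap is closed
  }
  return merged.concat(oldList);
}
```

With the numbers from the example, fillGap([19,18,17,16,15], [12,11,10,9,8], fetchAfter) fetches with last_id=15, keeps 14 and 13, and splices the two lists back together.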
Interesting question!
The best information I have pertains to how to "receive" data from the backend in "real time". I'm not sure how you'll handle the JS scrolling mechanism.
--
Live
The "live" functionality of the system is basically handled by passing data through either an SSE or Websocket (asynchronous connection), to make it appear like your application is operating in "real time".
In reality, "live" applications are nothing more than those which are constantly "listening" to the server - allowing you to take any data they send & put on the page with JS
If you wanted to keep the "feed" up to date perpetually, I'd recommend using either of these technologies:
SSEs
The most elemental way of doing this is to use server-sent events: an HTML5 technology which basically allows you to pass data from your server to your DOM using the text/event-stream content type.
This is what's considered a "native" way of handling updates from the server:
// app/assets/javascripts/application.js
var source = new EventSource("your_end_point");
source.onmessage = function(event) {
    // your data here
};
This is complemented on the Rails side by the ActionController::Live SSE class:
class MyController < ActionController::Base
  include ActionController::Live

  def index
    response.headers['Content-Type'] = 'text/event-stream'
    sse = SSE.new(response.stream, retry: 300, event: "event-name")
    sse.write({ name: 'John' })
    sse.write({ name: 'John' }, id: 10)
    sse.write({ name: 'John' }, id: 10, event: "other-event")
    sse.write({ name: 'John' }, id: 10, event: "other-event", retry: 500)
  ensure
    sse.close
  end
end
The problem with SSEs is that they are basically the same as Ajax long-polling, in that your front-end JS is constantly tied up talking to the server. I don't like this.
--
Websockets
Websockets are the "right" way to connect & receive data in "real time":
They basically allow you to open a perpetual connection between your front-end and your server, meaning you won't have to send constant requests to your server. I don't have much experience with websockets, but I do with Pusher
--
Pusher
I'd highly recommend Pusher: it's a third-party websocket service (I am not affiliated with them in any way).
Simply put, it allows you to send updates to the Pusher service and read them on your system. It takes all the hassle out of having to provide connectivity for your own websocket app.
You can read up on how it works, as well as study the pusher gem.
My application features a "main" page where most of the action happens: there are tags for filtering and a list of results in a (paginated) table, plus the possibility to select some or all results into a "shopping cart".
This page has to keep track of a whole lot of things: what tags are selected, what items are selected, and how the result table is sorted and what page it's on. Everything has to persist, so if I select a new tag, the page must partially reload but remember everything (sorting, what's selected).
Right now I'm handling everything with parameters, and for each action taken on the page, all links (select a tag/item, change page, sort table) are updated to include previous parameters + the relevant new addition. This works, obviously, but it feels kind of inefficient, as I have to reload more of the page than I want to. How is this situation normally handled? I can't find that much info on google at all, but it doesn't feel like a particularly uncommon case.
tl;dr: How to best make sure all links (to the same page) always include everything previously selected + the new action. There are a lot of links (one per tag to select/deselect, one per result item to select/deselect, one per sort option, one per page)
There are five ways to do that:
Method 1: By parameters
You mentioned this. I rarely choose it, as it's too troublesome. Still, it's a workable solution for very simple cases.
Method 2: By cookie
Save the settings to a cookie and read the cookie in controller to arrange layout settings.
Method 3: By LocalStorage
Similar to a cookie, but allows more space.
Method 4: By Session
If you are using ActiveRecord to store sessions, this could be the best solution for pure page loads. Save the user preferences into the session and load them on the next page load.
Method 5: Use Ajax
This is the best solution IMO. Instead of a whole-page load, use Ajax to refresh/retrieve just the changes you need. Used together with one of the methods above, a user can even resume their last preferences. This is the most powerful option and should be applicable to your case, which looks more like a web app than a website.
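Method 3 can be sketched like this (the helper names and the 'search_prefs' key are hypothetical; a storage object is passed in so the same code works with window.localStorage in the browser):

```javascript
// Persist the page state (selected tags, selected items, sort order,
// current page) as one JSON blob, so it survives partial reloads.
// In the browser, pass window.localStorage as `storage`.
function savePrefs(storage, prefs) {
  storage.setItem('search_prefs', JSON.stringify(prefs));
}

function loadPrefs(storage) {
  var raw = storage.getItem('search_prefs');
  return raw ? JSON.parse(raw)
             : { tags: [], selected: [], sort: 'name', page: 1 };
}
```

On page load, call loadPrefs and apply the stored state before rendering; on every selection or sort change, call savePrefs with the updated state.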
Have you tried creating a model for all those attributes, and always loading the latest one on page load? If you don't need them, you can always set a flag for that session.
I'm planning out how to track internal search data in Omniture/SiteCatalyst.
It's a fairly straightforward plan for a standard "enter a term and get a page of results" model: set sProps and eVars with the terms, the count of results, and the page searched from, then fire a success event for searching and another for clicking a search result.
For a type-ahead search--where the user is given search results as they type in a search bar--what's a good strategy for handling the timing of event submissions so that you don't end up with different events/entries for letters 4, 5, 6, and 7 of a search term's entry?
Our solution was to leverage a delay on the autocomplete to reduce the number of calls. From a tracking standpoint, if someone pauses for 1 second (or 500 ms, whatever), then they're probably actually waiting for the autocomplete results, and that constitutes a valid search.
From a technical standpoint, we leveraged the delay option on the jQuery UI widget.
The strategy I've always used is to not track the "auto-complete" search feature; put the tracking on the search results page, same as normal. Or are you saying the whole search results page is being output as the user types? If that is the case, one thing you could do is write some code to pop the Omniture code when the search field loses focus.
Another thing you can do: as the visitor is typing in the search bar, on each keypress, write the current value to a cookie. Then have some code that runs on page load to look for that cookie and, if it exists, pop the Omniture search variables and erase the cookie. Alternatively, you can keep track of the current value with a server-side session variable (since I assume this thing is Ajax-driven) and output the Omniture code with server-side code if the session variable exists. These methods mean that the search events and variables would not pop on the search results page itself. That probably isn't a big deal, unless you pop supporting variables, like an "internal search referrer" prop/eVar that keeps track of the page the visitor was on when they performed the search. You'll have to keep that in mind and carry that over as well.
Whenever you do a search, you may be aware that a query string parameter gets appended to the end of the URL.
Suppose www.stackoverflow.com is the website; when you perform a search on it, the URL becomes www.stackoverflow.com?q=yourname, where yourname is the search keyword. We can capture this keyword in SiteCatalyst.
You can see this on google.com when searching for sitecatalyst:
www.google.co.in/search?q=sitecatalyst
In the same way, we can use a query string parameter such as q=something.
After doing all this, we can use the getQueryParam plugin in the plugin section of the s_code library file to fetch that parameter and store it in a SiteCatalyst variable.
Example:
function s_doPlugins(s) {
    var one = s.getQueryParam("q");
    if (one) s.eVar1 = one;
}
s.doPlugins = s_doPlugins;
Insert the code below outside the plugin section:
/*
* Returns the value of a specified query string parameter, if found in the current page URL.
*/
s.getQueryParam=new Function("p","d","u",""
+"var s=this,v='',i,t;d=d?d:'';u=u?u:(s.pageURL?s.pageURL:s.wd.locati"
+"on);if(u=='f')u=s.gtfs().location;while(p){i=p.indexOf(',');i=i<0?p"
+".length:i;t=s.p_gpv(p.substring(0,i),u+'');if(t){t=t.indexOf('#')>-"
+"1?t.substring(0,t.indexOf('#')):t;}if(t)v+=v?d+t:t;p=p.substring(i="
+"=p.length?i:i+1)}return v");
s.p_gpv=new Function("k","u",""
+"var s=this,v='',i=u.indexOf('?'),q;if(k&&i>-1){q=u.substring(i+1);v"
+"=s.pt(q,'&','p_gvf',k)}return v");
s.p_gvf=new Function("t","k",""
+"if(t){var s=this,i=t.indexOf('='),p=i<0?t:t.substring(0,i),v=i<0?'T"
+"rue':t.substring(i+1);if(p.toLowerCase()==k.toLowerCase())return s."
+"epa(v)}return ''");
You will find that it captures your search keyword.
Please let me know if you need more clarification.
I have a search page which finds candidates.
From this page you can click view to find more information about the candidate.
When on the candidate view you can click edit or do a number of other actions, which would return you to the candidate view.
My problem is from the candidates view I need to add a button to go back to the search results.
I originally thought of using a JS button with history.go(-1), but because the user can do other actions from inside the view, this won't work.
I am still quite new to Rails, so I'm not sure of my options. I am thinking of some sort of caching of the results, and then maybe a hidden field to keep track of the location of the cache (I don't think this is the best solution, as keeping track of the hidden value could get a bit messy!).
Thanks, Alex
I would probably use a session variable to store this information.
First, make sure your form that posts to the search page is a GET operation, this way the search details are in your query string. Then in your search action, you can grab the request URL and store it in the session:
session[:search_results] = request.url
Now in your view for the results, you can do your "Back to search results" like this:
link_to "Back to search results", session[:search_results]
You have a couple of options:
Cache the results, as you've suggested. The potential downsides to this are that it takes memory, and if new valid records get added, you won't see them. You could store the cache in Session, or in the database (though in the latter case, you don't gain much).
I'd suggest just remembering the last search term, either in session or using hidden fields. You end up re-running the query when you go to the search results page, but in a properly indexed DB, that shouldn't be a big deal.
Good luck!
You can include the parameters for the query on the subpage. E.g.: /foo/search?q=stuff displays the search results. Each result then has a link like /foo/:id?q=stuff. On the subpage, you then have the parameter available to link back to the main page.
This solution doesn't use any server side state, which is generally accepted as the better way to build web applications. Not only does it mean that you browser will behave as expected, with respect to bookmarks, multiple tabs etc., but it also ensures that proper caching can be employed. Further, it lowers the complexity of your application, making it easier to debug and extend.
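That parameter pass-through can be sketched like this (a generic helper, not Rails-specific; the function names and the /foo paths are illustrative):

```javascript
// Build a link to a subpage that carries along the current search query,
// so the subpage can link back to the search with the same parameters.
function resultLink(basePath, id, query) {
  var url = basePath + '/' + encodeURIComponent(id);
  if (query) url += '?q=' + encodeURIComponent(query);
  return url;
}

// On the subpage, the back link just reuses the same parameter.
function backLink(searchPath, query) {
  return query ? searchPath + '?q=' + encodeURIComponent(query) : searchPath;
}
```

In a Rails view the same thing is just link_to with params[:q] merged into the link's options; the point is that the state lives entirely in the URL.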
You could put the search results in a "search_results" table keyed by the user id. Then when the user hits the page, always load from a query on that table.
If anybody does come across this page and you need a button that goes back to the previous page and still displays those search results (just like the browser's back button), just use :back:
<%= link_to(image_tag("back.svg"), :back, :class => 'back_btn') %>