I want to do pagination in UITableView.
For example, if I show 20 articles:
Do I need to send 20 HTTP requests
=> each returning 1 data row?
Or do I need to send 1 HTTP request
=> returning 20 data rows?
This is a little bit tricky. You may not know about 'lazy loading'; that is the right choice here. Before explaining why, let's look at the problems with the two options you proposed.
Consider the 2nd option: you request 20 articles at a time, so you have to wait for the data of all 20 articles. The user can only see an article once all 20 have downloaded successfully; otherwise the user has to wait for the download, which makes for a boring experience.
Consider the 1st option: it reduces the 2nd option's problem, but another problem remains. If the user scrolls the table to see a specific article, the cells will stall because downloading continues on a background thread, so the UI will hang.
3rd option: download items one by one with lazy loading (like Facebook does). When the data arrives the user sees the article; until then the cell stays empty, but the UI never stalls or hangs.
You can use this delegate method (UITableViewDelegate inherits it from UIScrollViewDelegate):
- (void)scrollViewDidScroll:(UIScrollView *)scrollView
When the scroll position reaches row 20 you can call the web service; repeat the same process when it reaches row 40, and so on.
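For example, here is a minimal Swift sketch of that idea; the articles array, the isLoading flag, the page size of 20, and the loadNextPage(offset:limit:completion:) helper are assumptions for illustration, not from the question:

import UIKit

// Assumed to live in the table view controller; `articles` backs the table,
// `isLoading` prevents duplicate requests, and `loadNextPage(offset:limit:completion:)`
// is a placeholder for your web-service call.
func scrollViewDidScroll(_ scrollView: UIScrollView) {
    // Distance from the current offset to the bottom of the content.
    let distanceToBottom = scrollView.contentSize.height -
        scrollView.contentOffset.y - scrollView.bounds.height

    // When the user is within one screen height of the end, fetch the next page.
    if distanceToBottom < scrollView.bounds.height, !isLoading {
        isLoading = true
        loadNextPage(offset: articles.count, limit: 20) { [weak self] newArticles in
            guard let self = self else { return }
            self.articles.append(contentsOf: newArticles)
            self.tableView.reloadData()
            self.isLoading = false
        }
    }
}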
Related
I have one problem: I fetch data from one URL and set it in a table. If the data is only about 10 to 15 values, the table is populated quickly.
But if the data is about 500 to 600 values, then one has to wait until all the data has come; since I have used a ProgressView, the user has to wait until the whole response arrives.
Is there any way to resolve this, so that some data is shown earlier and the rest is added afterwards as it arrives?
Any help will be appreciated.
You should use pagination support in your tableView and in your backend as well. Please see this example:
https://easyiostechno.wordpress.com/2014/11/14/pagination-of-table-views/
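As a rough Swift sketch of the table-view side of that pagination (the fetchPage(_:size:completion:) helper, the items array, and the page size of 20 are assumptions, not taken from the linked example):

import UIKit

// Assumed to live in the table view controller; `items`, `currentPage`,
// `isFetching`, and `fetchPage(_:size:completion:)` are placeholders.
func tableView(_ tableView: UITableView,
               willDisplay cell: UITableViewCell,
               forRowAt indexPath: IndexPath) {
    // When the last currently loaded row is about to appear, ask for the next page.
    guard indexPath.row == items.count - 1, !isFetching else { return }
    isFetching = true
    fetchPage(currentPage + 1, size: 20) { [weak self] newItems in
        guard let self = self else { return }
        self.currentPage += 1
        self.items.append(contentsOf: newItems)
        self.tableView.reloadData()
        self.isFetching = false
    }
}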
Basically, it's bad practice to fetch a large amount of data at once and keep the user waiting. Ideally you should fetch data only when it's necessary. For your case I would suggest using a paging mechanism.
Below is just a rough idea of paging which you can use:
When you load your data from the web service, send two parameters named PAGE_COUNT and PREVIOUS_PAGE_COUNT.
For the first request, send PAGE_COUNT = number_of_values_you_want_to_fetch_initially and PREVIOUS_PAGE_COUNT = 0.
When the user scrolls down, show a loader at the bottom of the table and hit the web service again, but with PREVIOUS_PAGE_COUNT = number_of_values_you_want_to_fetch_initially + PAGE_COUNT.
This approach will also need some modification on the back end, such as checking the initial page count and then fetching the next records from the database.
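As a rough Swift sketch of the client side of that scheme (the endpoint URL, the loadNextPage() name, and the decodeArticles(from:) helper are assumptions for illustration):

import UIKit

// Assumed to live in the table view controller; `pageCount`, `previousPageCount`,
// `articles`, and `decodeArticles(from:)` are placeholders, and the endpoint URL
// is made up.
func loadNextPage() {
    var components = URLComponents(string: "https://example.com/api/articles")!
    components.queryItems = [
        URLQueryItem(name: "PAGE_COUNT", value: String(pageCount)),
        URLQueryItem(name: "PREVIOUS_PAGE_COUNT", value: String(previousPageCount))
    ]

    URLSession.shared.dataTask(with: components.url!) { [weak self] data, _, error in
        guard let self = self, let data = data, error == nil else { return }
        let newRows = self.decodeArticles(from: data)   // decoding left to you
        DispatchQueue.main.async {
            self.articles.append(contentsOf: newRows)
            self.previousPageCount += self.pageCount    // advance the offset
            self.tableView.reloadData()
        }
    }.resume()
}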
I hope this will help you.
I am new to the YouTube API v3. I can get the videos of a user and display them on my site, but when the result is more than 50 videos I want to add paging for them. Let's say the result is 240 videos, so I have 5 pages. How do I make the request for page number 4 without going through pages 1, 2 or 3?
Here is my list request:
https://www.googleapis.com/youtube/v3/playlistItems?part=snippet&playlistId=UUdxi8d8qRsRyUi2ERYjYb-w&key={myKey}
There is a page token that allows me to go through the pages, but that means I must use next and previous requests.
So is there any way to load page 4 directly?
There is no way to load a specific page directly; you need to call the API with the nextPageToken for each page (so 4 times in your example).
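A minimal Swift sketch of that sequential walk, assuming 50 results per page as in the question's math; the fetchPage name and the bare-bones JSON handling are just for illustration:

import Foundation

// Walks forward page by page until the requested page number is reached,
// then hands back that page's raw response data. Error handling is kept
// to a minimum.
func fetchPage(_ target: Int, playlistId: String, apiKey: String,
               pageToken: String? = nil, current: Int = 1,
               completion: @escaping (Data?) -> Void) {
    var components = URLComponents(string: "https://www.googleapis.com/youtube/v3/playlistItems")!
    components.queryItems = [
        URLQueryItem(name: "part", value: "snippet"),
        URLQueryItem(name: "maxResults", value: "50"),
        URLQueryItem(name: "playlistId", value: playlistId),
        URLQueryItem(name: "key", value: apiKey)
    ]
    if let token = pageToken {
        components.queryItems?.append(URLQueryItem(name: "pageToken", value: token))
    }

    URLSession.shared.dataTask(with: components.url!) { data, _, _ in
        guard let data = data else { return completion(nil) }
        if current == target { return completion(data) }   // reached the page we want
        // Otherwise read nextPageToken from the JSON and keep walking.
        guard
            let json = (try? JSONSerialization.jsonObject(with: data)) as? [String: Any],
            let next = json["nextPageToken"] as? String
        else { return completion(nil) }
        fetchPage(target, playlistId: playlistId, apiKey: apiKey,
                  pageToken: next, current: current + 1, completion: completion)
    }.resume()
}

Calling fetchPage(4, playlistId: "UUdxi8d8qRsRyUi2ERYjYb-w", apiKey: yourKey) { data in ... } then gives you the 4th page after three intermediate requests.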
I do not recommend this approach but include it for completeness, as it provides a direct solution to the question whereas the other answer provides alternatives.
YouTube seems to use the same pageTokens regardless of playlist or date of access. E.g., the pageToken for the 2nd page is always CDIQAA, for the 3rd page CGQQAA, etc.
These aren't guaranteed to always remain the same, but based on Stack Overflow posts from as early as 2014, it hasn't changed in a long time, so it's a reasonable assumption.
So you could make the request for the first five pages as you normally do, then persist the page tokens (e.g. in a local cache). Then whenever you need e.g. the 240th item in the future, simply look up the 5th page token (which happens to be CMgBEAA) and access that page directly.
Here are the 1st 9 page tokens:
1 <blank>
2 CDIQAA
3 CGQQAA
4 CJYBEAA
5 CMgBEAA
6 CPoBEAA
7 CKwCEAA
8 CN4CEAA
9 CJADEAA
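If you do cache the tokens, a small Swift sketch of the direct lookup could look like this (the dictionary below just hard-codes the table above; treat it as an assumption, not a guarantee):

import Foundation

// Page tokens observed previously (see the table above); page 1 needs no token.
let cachedPageTokens = [1: "", 2: "CDIQAA", 3: "CGQQAA", 4: "CJYBEAA", 5: "CMgBEAA"]

func urlForPage(_ page: Int, playlistId: String, apiKey: String) -> URL? {
    guard let token = cachedPageTokens[page] else { return nil }   // not cached yet
    var components = URLComponents(string: "https://www.googleapis.com/youtube/v3/playlistItems")!
    components.queryItems = [
        URLQueryItem(name: "part", value: "snippet"),
        URLQueryItem(name: "maxResults", value: "50"),
        URLQueryItem(name: "playlistId", value: playlistId),
        URLQueryItem(name: "key", value: apiKey)
    ]
    if !token.isEmpty {
        components.queryItems?.append(URLQueryItem(name: "pageToken", value: token))
    }
    return components.url
}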
Given:
View A (a UITableView) displays all images after they are successfully pulled from a server via a request named getAllImages.
You can also upload a new photo in view A via a top-right button.
My goal:
Display the new set of images (the new image included) in the table.
What I am doing is:
Send the upload request to the server (I am using AFNetworking to do that).
Since the server side only returns "success" or "failure" without any other information, supposing it is a success I fire a request to get the new set of images via getAllImages.
Then I invoke reloadData to display the new set of data in the table.
I am not sure this is a good way to do it, and I am still looking for the best approach for this task. I don't know whether I should use Core Data here, or how to use it.
Please give me suggestions if you have experience with this kind of task. Any comments are appreciated.
Here is what I would do:
1 - Call getAllImages to show all N images.
2 - Take the new photo.
3 - Display the N images previously fetched via getAllImages, plus the 1 local image from step 2.
4 - Fire an asynchronous request (I don't specifically remember how we do that using AFNetworking) to upload the image from step 2.
5 - On a success code, keep the N+1 images. On a failure code, show only N images and remove the last one.
You can reload just a single row using reloadRowsAtIndexPaths:withRowAnimation:, without much of a performance hit.
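A small Swift sketch of steps 3 to 5, including the single-row reload; the images array and the uploadImage(_:completion:) helper are placeholders, and the real upload would be whatever AFNetworking call you already use:

import UIKit

// Sketch of steps 3-5, assumed to live in the table view controller.
// `images` backs the table and `uploadImage(_:completion:)` stands in for
// the actual upload request.
func addAndUpload(_ newImage: UIImage) {
    // Step 3: show the local image immediately (optimistic insert).
    images.append(newImage)
    let newIndexPath = IndexPath(row: images.count - 1, section: 0)
    tableView.insertRows(at: [newIndexPath], with: .automatic)

    // Step 4: upload asynchronously.
    uploadImage(newImage) { [weak self] success in
        guard let self = self else { return }
        DispatchQueue.main.async {
            if success {
                // Step 5 (success): keep the N+1 images; refresh just that row.
                self.tableView.reloadRows(at: [newIndexPath], with: .none)
            } else {
                // Step 5 (failure): roll back to N images.
                self.images.removeLast()
                self.tableView.deleteRows(at: [newIndexPath], with: .automatic)
            }
        }
    }
}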
I created a view in my ASP.NET MVC application.
The view class retrieves data of 6,100 rows.
That class is used in one of my views, which fills a grid with the data; the grid can also be sorted.
It loads nicely the first time and sorting also works fine, but when I click on the last page link it takes a long time and finally gives the following error:
The timeout period elapsed prior to completion of the operation or the server is not responding
Can anyone help me, please? I am not getting what the problem is.
Sounds like the request time is reaching the maximum allowed (the default in ASP.NET is 30 seconds). I don't think it's that particular link so much as the request randomly taking 29 seconds one time and 31 seconds another.
You may want to try a pagination approach to displaying your data instead of loading 6,100 rows at once - this will reduce the load times.
Check out this link for a pagination example.
Another idea would be to use page caching, but I would recommend pagination.
I have a page that has to render a huge set of query results - most of them with very, very small images. It is already paginated, so that won't solve my problem.
The query executes fine - it's very zippy, returns in about .0004 seconds, paginates itself out to the View - all is well in the land of Oz.
However, there is a big problem: ASP.NET MVC sends the page only when it is fully ready, not progressively as it loads. Is there any way around this?
I tried using jQuery to lace through div layers and draw partial views. This alleviated some of the problem, but the page still just 'hangs' until the whole thing is ready to be drawn.
I was looking around and found a few suggestions about using Response.Write - but I couldn't uncover anything relevant to my case. Any ideas? The structure is as follows...
PartialView
- Category
- IEnumerable<Models.Images> (List)
PartialView
- Page
- IEnumerable<Models.Images> (List) (Paginated View)
View
- Gallery
-- Index
--- Categories (Ajax Loaded on Demand, not on View render.)
---- ViewPage (No specific model passed)
The problem is clearly the images; I've tested it several times. If I remove the image tags from the code, it renders quickly with whatever data I give it. Each image is around 4 KB in size, so compressing them isn't likely to help.
Any help would be greatly appreciated.
There are a couple of things that you can do.
First, make sure the results themselves are not inside tables. IE (and perhaps other browsers) has to wait until a table has fully loaded before rendering it.
Secondly, there is a command called Response.Flush which will push the buffered output to the client. You can call this repeatedly. You may want to call it for every 10 items or so, for example. If you can incorporate this into your code it should do the trick for you.
About how many images are being loaded in a given request? As I'm sure you're aware, the issue is less the size of the files and more the quantity of them: it takes longer to move a bunch of small files than an equally sized large file.
One thing to consider would be to send the page down with a specific sized set of results already populated and then use JavaScript (and perhaps scroll events) to dynamically load the rest. Ideally you should try to minimize the size of the initial request so that the page doesn't block user interaction for long; after that initial loading period you could then start pulling in the rest of the results.