Optimize array data size during pagination - iOS

I know how to implement pagination with UITableView, but my question is that we always append the next page's data to the existing complete data array, so the array grows with every page.
For example: we get 50 records on the first page, request the next page, get 50 more records, and append them to the existing array, so the complete array now holds 100 records. I am requesting data across around 100 pages, so my array ends up holding 5,000 records. Holding on to the data from the earliest pages is not a good idea, since we rarely scroll back to the first pages after visiting 100 of them.
Is there any way to optimize the array size? Please help me with this; I have searched a lot but didn't find a good answer.
I would be very grateful for any help, and sorry for my bad English.

I think you can achieve that by writing the "old" data to local storage, then reading it back and inserting it into your array when needed.
So imagine you've already fetched, let's say, 200 items. When the user scrolls down and you fetch the next page (the next 20 items), you "cut" items 0 to 99 from your array and write them to a file. Now your array has 120 items. When the user keeps scrolling and the array again reaches 220 items (array.count >= 220), repeat the same logic, and so on.
Now the most interesting part: if the user scrolls back and the index of the top visible cell drops below 100, you read the previously written data from the file (and remove it from the file) and insert it into your array at position 0.
And of course it would be better to clear all files of that kind on app launch.
The numbers I used above are magic numbers; you should play with them to find the values that best fit your needs.
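
A minimal sketch of that sliding-window idea, assuming the items are Codable; the thresholds, chunk size, and file naming are just the magic numbers from above and would need tuning:

    import Foundation

    // Sliding-window store: keeps only the most recent pages in memory
    // and spills older items to a file on disk.
    final class PagedStore<Item: Codable> {
        private(set) var items: [Item] = []          // what the table view displays
        private var spilledChunks: [URL] = []        // oldest chunk first
        private let spillThreshold = 220             // spill when the array reaches this size
        private let chunkSize = 100                  // how many items to spill at once
        private let directory = FileManager.default.temporaryDirectory

        // Append a freshly fetched page, spilling old items if the array grew too large.
        func append(page: [Item]) {
            items.append(contentsOf: page)
            guard items.count >= spillThreshold else { return }

            let chunk = Array(items.prefix(chunkSize))
            items.removeFirst(chunkSize)

            let url = directory.appendingPathComponent("chunk-\(spilledChunks.count).json")
            if let data = try? JSONEncoder().encode(chunk) {
                try? data.write(to: url)
                spilledChunks.append(url)
            }
        }

        // Call when the top visible row gets close to index 0; restores the most
        // recently spilled chunk. Returns how many items were inserted so the
        // caller can adjust the scroll position.
        func restoreChunkIfNeeded(topVisibleIndex: Int) -> Int {
            guard topVisibleIndex < chunkSize,
                  let url = spilledChunks.popLast(),
                  let data = try? Data(contentsOf: url),
                  let chunk = try? JSONDecoder().decode([Item].self, from: data) else { return 0 }
            try? FileManager.default.removeItem(at: url)
            items.insert(contentsOf: chunk, at: 0)
            return chunk.count
        }
    }

After restoring a chunk you would also need to insert the corresponding rows at the top of the table view (or adjust the content offset) so the visible cells don't jump.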

Related

I can't figure out how to filter or query in Google Sheets without returning a bunch of blank strings appended to actual data

I'm at my wit's end trying to figure out why filtering/querying in Google Sheets is so broken. I have a sheet with some data about practice exams I'm taking, and I'm attempting to pull some data from that sheet to another sheet for calculating statistics. I've made a shareable document with the pertinent stuff so you can see what I mean.
My raw data is in the TestScores sheet and I made a TESTSTATS sheet to test different methods of pulling data from TestScores. In my example, I'm only trying to pull unique dates from range TestScores!B2:B, and I've added a few different methods to do so in TESTSTATS (I removed the equal sign from each one so each can be tested on its own by adding the equal sign back).
The methods I've tried:
=UNIQUE(TestScores!B2:B)
=UNIQUE(FILTER(TestScores!B2:B, TestScores!B2:B<>""))
=UNIQUE(FILTER(TestScores!B2:B, TestScores!B2:B<>0))
=UNIQUE(FILTER(TestScores!B2:B, NOT(ISBLANK(TestScores!B2:B))))
=UNIQUE(QUERY(TestScores!B2:B, "select B"))
=ARRAY_CONSTRAIN(UNIQUE(QUERY(TestScores!B2:B, "select B")), ROWS(UNIQUE(TestScores!B2:B))+1,5)
You'll see that each one, when activated by adding the = in front of the formula, returns the proper data but also appends 500 rows which look empty but are in fact blank strings (""). This makes it difficult to work with because there are a lot of calculations in my sheet that depend on one another. I also do not want to specify an explicit end to my ranges and would prefer to keep them open ended (B2:B instead of B2:B17) so everything updates automatically as new records are added.
What am I doing wrong? Why is the returned data appended with a bunch of empty cells, and why 500 specifically (which seems arbitrary considering my source data is 29 or 30 rows, depending on whether or not you include headers)?
Starting with only two rows in TESTSTATS, more rows have to be added somewhere to place the output. It seems Google chose to do so 500 rows at a time (from the last required cell). "Why?" would have to be a matter for Google.
If you know 14 rows are required for the output and increase the size of TESTSTATS to 16, no more rows will be added. Since you want room for expansion you can't stop at 16 and avoid further issues, but you could allow some room, say 30 rows, and delete the few extras; or, if 30 becomes insufficient (when the sheet shoots up to, say, 540 rows), delete the rows not required and set the sheet size to, say, 60 rows, and so on.

How to implement infinite scroll with multiple filter on data that get from Firebase in Swift?

I'm using Firebase for my iOS application and I'm having trouble implementing infinite scroll and filtering data together.
What I need to do is:
Display items ordered/filtered on multiple properties (location, category, status, ...)
Implement infinite scroll when the user has scrolled to the bottom of the screen.
I tried to think about some solutions:
First, I thought I would query the data with the necessary conditions and then limit the number of records using queryLimitedToFirst(N), increasing N when I need to load the next items. But Firebase can only filter on one property at a time, and it's also wasteful to reload the data, so I was thinking about a second solution.
Following the approach suggested by Frank van Puffelen (Query based on multiple where clauses in Firebase):
filter most on the server, do the rest on the client
Yes, exactly like that. I'll use queryOrderedByKey, queryStartingAtValue, and queryEndingAtValue to implement infinite scroll, pull down the remaining data, and filter it on the client. But there is one problem: I may not have enough items to display to the user if the filtering happens on the client.
For example: each time I run the query, I receive 10 items. After the data is filtered on the client, I might be left with only 5 (possibly even 0) items that meet the conditions to display to the user.
I don't want this because the user may think there is a problem.
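To make that keep-fetching idea concrete, here is a rough sketch of what I have in mind, assuming a Realtime Database node called items and a placeholder matchesFilters predicate (the path, page size, and names are only for illustration):

    import FirebaseDatabase

    // Rough sketch of "fetch key-ordered pages until the client-side filter
    // leaves enough items". The "items" path, page size, and matchesFilters
    // predicate are placeholders.
    final class FilteredPager {
        private let ref = Database.database().reference(withPath: "items")
        private let pageSize: UInt = 10
        private var lastKey: String?                     // last key seen on the previous page
        private(set) var displayed: [[String: Any]] = []

        func loadMore(wanted: Int,
                      matchesFilters: @escaping ([String: Any]) -> Bool,
                      completion: @escaping () -> Void) {
            var query: DatabaseQuery = ref.queryOrderedByKey().queryLimited(toFirst: pageSize)
            if let lastKey = lastKey {
                // Start at the last key we already have; fetch one extra to cover the overlap.
                query = ref.queryOrderedByKey()
                           .queryStarting(atValue: lastKey)
                           .queryLimited(toFirst: pageSize + 1)
            }

            query.observeSingleEvent(of: .value) { [weak self] snapshot in
                guard let self = self else { return }
                var newMatches: [[String: Any]] = []
                var pageCount = 0
                for case let child as DataSnapshot in snapshot.children {
                    if child.key == self.lastKey { continue }   // skip the overlapping item
                    pageCount += 1
                    self.lastKey = child.key
                    if let value = child.value as? [String: Any], matchesFilters(value) {
                        newMatches.append(value)
                    }
                }
                self.displayed.append(contentsOf: newMatches)

                // The filter may have dropped most of the page; keep fetching while
                // the server still returned data and we haven't collected enough matches.
                if newMatches.count < wanted && pageCount > 0 {
                    self.loadMore(wanted: wanted - newMatches.count,
                                  matchesFilters: matchesFilters,
                                  completion: completion)
                } else {
                    completion()
                }
            }
        }
    }

The recursion keeps the UI fed, but in the worst case (a very selective filter) it can pull down a lot of data, which is exactly what worries me.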
Can I please get some pointers on this? If I haven't structured the data properly, can I also get some tips on that?

How to display a bunch of data in a table view

In my app I have 800,000 records on the server which I have to display to the user. The user can also search through that data. I'm really confused about what to do here and how to achieve this functionality. I am trying to load the first 50 records into the table, and at the top there is a search bar from which the user can search the data; the user can also search by an approximate word (i.e. if the user types "bcd" it should return all records containing the "bcd" combination). Can anyone suggest something that will help me get out of this situation?
You have to do pagination here; without it you can't handle that much data, and if you try, your application will crash. Fetch some data from the server, like 30 or 40 records, and when you reach record 30, request the next 30. That way you can meet the application's needs.
You need to use pagination in your application. Without pagination, if you get 800,000 records in one shot, your application might crash.
Every time, send the search request to the server, e.g. "abc".
The server takes the first 10 records from the result and returns them.
For the second request, the server returns records 11 to 20 from the result set.
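
A minimal sketch of what the client side of that request cycle could look like, assuming a hypothetical /search endpoint with q, page, and limit parameters and a JSON array response:

    import Foundation

    // Sketch of the client side of the paged search described above.
    // The /search endpoint and its q/page/limit parameters are assumptions
    // about the server API.
    func fetchSearchPage(query: String, page: Int, pageSize: Int = 10,
                         completion: @escaping ([String]) -> Void) {
        var components = URLComponents(string: "https://example.com/search")!
        components.queryItems = [
            URLQueryItem(name: "q", value: query),
            URLQueryItem(name: "page", value: String(page)),
            URLQueryItem(name: "limit", value: String(pageSize))
        ]
        URLSession.shared.dataTask(with: components.url!) { data, _, _ in
            // Assume the server returns a JSON array of matching records.
            let results = data.flatMap { try? JSONDecoder().decode([String].self, from: $0) } ?? []
            DispatchQueue.main.async { completion(results) }
        }.resume()
    }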
I am a developer with SIMpalm. I would like to suggest the answer below.
Why not keep two arrays: one for displaying in the table view and another containing all results? When you search, search within the array that contains all results and add the matches to the array that is shown in the table.
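
A minimal sketch of that two-array idea; the String item type and the substring matching rule are placeholders:

    import Foundation

    // Sketch of the two-array approach: allItems holds every fetched record,
    // displayedItems backs the table view and is rebuilt on each search.
    final class SearchableList {
        var allItems: [String] = []
        private(set) var displayedItems: [String] = []

        func search(for text: String) {
            displayedItems = text.isEmpty
                ? allItems
                : allItems.filter { $0.localizedCaseInsensitiveContains(text) }
            // ...then reload the table view from displayedItems
        }
    }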
You will have to use pagination; I don't see any other way to do this without eating up a lot of memory and, in the worst case, getting sporadic crashes.
You can paginate both browsing and searching. To avoid delays for the user you can preload data, e.g. with pages of 200 records, when the user reaches record 150 you start fetching the data for the next page.
Also, if your local/web server is taking more than a minute to respond, you have a serious problem on the server that needs to be fixed. No user will wait a minute to reload or to get new data.
I am not an expert on servers/networking, but it should not take more than 10-15 seconds.
Think about the search logic as very similar to browsing all the data:
Search/browse both need paging
Browse returns all the data in pages
Search returns the matching data in pages
Search/browse preloads data after the user reaches a certain point (sketched below)
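
A minimal sketch of that paging-plus-preload pattern, assuming a placeholder fetchPage closure in place of the real server request (the page sizes are illustrative):

    import UIKit

    // Sketch: page-by-page loading with a preload trigger near the bottom.
    // fetchPage stands in for the real network call; sizes are illustrative.
    final class PagedListViewController: UIViewController, UITableViewDataSource, UITableViewDelegate {
        private let tableView = UITableView()
        private var items: [String] = []
        private var isLoadingPage = false
        private let pageSize = 50
        private let preloadOffset = 10        // start fetching 10 rows before the end

        // Placeholder for the real server request: (offset, limit, completion).
        var fetchPage: (Int, Int, ([String]) -> Void) -> Void = { _, _, done in done([]) }

        override func viewDidLoad() {
            super.viewDidLoad()
            tableView.frame = view.bounds
            tableView.dataSource = self
            tableView.delegate = self
            tableView.register(UITableViewCell.self, forCellReuseIdentifier: "Cell")
            view.addSubview(tableView)
            loadNextPage()
        }

        private func loadNextPage() {
            guard !isLoadingPage else { return }
            isLoadingPage = true
            fetchPage(items.count, pageSize) { [weak self] newItems in
                guard let self = self else { return }
                self.items.append(contentsOf: newItems)
                self.isLoadingPage = false
                self.tableView.reloadData()
                // A real implementation would also stop requesting once the
                // server reports that there is no more data.
            }
        }

        func tableView(_ tableView: UITableView, numberOfRowsInSection section: Int) -> Int {
            items.count
        }

        func tableView(_ tableView: UITableView, cellForRowAt indexPath: IndexPath) -> UITableViewCell {
            let cell = tableView.dequeueReusableCell(withIdentifier: "Cell", for: indexPath)
            cell.textLabel?.text = items[indexPath.row]
            return cell
        }

        func tableView(_ tableView: UITableView, willDisplay cell: UITableViewCell,
                       forRowAt indexPath: IndexPath) {
            // Preload the next page shortly before the user reaches the last loaded row.
            if indexPath.row >= items.count - preloadOffset {
                loadNextPage()
            }
        }
    }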

Data sorting and update of UICollectionViewCells. Is this a lost cause?

I have Core Data entries displayed in a collection view, sorted 1 2 3 ... n. New batches of entries are added as the user flips through the first n. The data is built from a JSON response obtained from a web server.
Because the first entry of the fetch request is associated with cell 0 (via the data source delegate), it's not possible to add a new batch at the bottom of the collection view. If it's added from cell 0, old cell contents are replaced by new ones; in short, the whole page seems to be replaced by new stuff, and the data the user was looking at is offset by the number of new entries. If the batch is large, it's simply buried. Furthermore, if the update is done from cell 0, all entries are made visible, which takes time and memory.
There are several options that I considered:
1) Data re-ordering, meaning instead of getting the fetch result as 1 2 3 4 ... n, I need the opposite, n ... 3 2 1, straight from the fetch request (nothing to do with a fetch using reverse order sorting). I'm not sure it's possible: is there a Core Data gotcha allowing the fetch result to be re-ordered before it is presented to the UICollectionViewDataSource delegate?
2) Change the index path/cell association in collectionView(_:cellForItemAt:), using (numberOfItemsInSection - indexPath.item); see the sketch after this list. This creates several edge cases, as entries can be removed/updated in the view (hence numberOfItemsInSection changes), so I'd rather avoid it if I can...
3) Adding new data from cell 0, ruled out for the reason I explained. There may be a solution: has anyone achieved a satisfactory result by setting a view offset? For example, if 20 new entries are added, the content of cell 0 moves to cell 20, so we would just need to tell the view controller to display from cell 20 onwards. Any image flipping or side effects I might expect?
4) Download a big chunk of the data and simply use the built-in Core Data faulting mechanism. But that's suboptimal, because I'm not sure exactly how much I should download (it's user dependent) and the initial request (JSON + Core Data) might take too long. That's what lazy fetching is there for anyway.
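
For clarity, here is roughly what I mean by option 2, sketched against a fetched results controller; the cell configuration is omitted, and note the extra -1 compared to the formula above:

    import UIKit
    import CoreData

    // Sketch of option 2: keep the fetch sorted 1...n but present it reversed,
    // so new batches land "below" what the user is currently reading.
    // The entity and the fetched-results-controller setup are assumed to exist elsewhere.
    final class ReversedFeedDataSource: NSObject, UICollectionViewDataSource {
        var frc: NSFetchedResultsController<NSManagedObject>!   // configured by the caller

        func collectionView(_ collectionView: UICollectionView,
                            numberOfItemsInSection section: Int) -> Int {
            frc.sections?[section].numberOfObjects ?? 0
        }

        func collectionView(_ collectionView: UICollectionView,
                            cellForItemAt indexPath: IndexPath) -> UICollectionViewCell {
            let cell = collectionView.dequeueReusableCell(withReuseIdentifier: "Cell", for: indexPath)
            let count = frc.sections?[indexPath.section].numberOfObjects ?? 0
            // Invert the mapping so item 0 shows the last fetched object and the
            // last item shows the first. Note the "- 1": (count - indexPath.item)
            // alone would be off by one.
            let reversedIndex = count - 1 - indexPath.item
            let object = frc.object(at: IndexPath(item: reversedIndex, section: indexPath.section))
            // ...configure the cell from `object` here...
            _ = object
            return cell
        }
    }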
Does anyone facing the same problem have any advice to share?
Thanks!

NSMutableDictionary caching and/or size limiting

I'm using an NSMutableDictionary to cache high scores that I pull from Game Center (storing scores with the GC rank as key). The pull happens as soon as the user views that row in a table view. If there are a million rows and the user views them all, the cache would fill up to a million entries...
OK, in practice I guess I'll be happy if a million people play my game, but still, to be on the safe side, I'd like to limit the number of entries that go into the NSMutableDictionary.
Does anyone have a simple approach here? Maybe a structure other than a dictionary would be useful. My idea was to remove the oldest entries from the dictionary that are outside the current table view.
Have you taken a look at NSCache? ・゜゚・:.。..。.:*・'(゚▽゚)'・:.。. .。.:・゜゚・*
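
A minimal sketch of swapping the dictionary for an NSCache keyed by rank; the count limit of 500 is just an illustrative number:

    import Foundation

    // Sketch: cache Game Center scores by rank with a bounded entry count.
    // NSCache evicts entries automatically under memory pressure and when
    // countLimit is exceeded (eviction order is not guaranteed, though).
    final class ScoreCache {
        private let cache = NSCache<NSNumber, NSNumber>()

        init(limit: Int = 500) {
            cache.countLimit = limit      // rough upper bound on cached rows
        }

        func setScore(_ score: Int64, forRank rank: Int) {
            cache.setObject(NSNumber(value: score), forKey: NSNumber(value: rank))
        }

        func score(forRank rank: Int) -> Int64? {
            cache.object(forKey: NSNumber(value: rank))?.int64Value
        }
    }

Note that countLimit is not a strict guarantee and the eviction order is undefined, so if you need strict "oldest and off-screen first" behaviour you would still have to track row indexes yourself.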
