Custom Listbox: Limit Maximum Item Count - silverlight-3.0

I have a Silverlight 3.0 project that has a ListBox data-bound to a list of items. What I want to do is limit the number of items displayed in the ListBox to at most 10. I originally accomplished this by limiting the data bound to the list to 10 items, doing a .Take(10) on my original data and data-binding the result.
The problem with the .Take(10) approach is that the original data source may change, and since .Take() returns a reference to (or a copy of, I'm not sure) the original data, I sometimes do not see changes in the data reflected in my UI.
I'm trying to figure out a better way of handling this than the .Take() approach. It seems you shouldn't 'filter' your data using LINQ functions if you have more than one UI element bound to the same data. My only thought on how to do this better is to make a custom container that limits the count, but that seems like it might be a mountain of work to make a custom StackPanel or equivalent.

Take(10) does not make a copy; it just appends another step to the LINQ query. All execution is still deferred until someone pulls the items of the query.
If you were setting the items statically, a copy would indeed be created by running the query once. But since you set the constructed query as the ItemsSource property of the list box, the list box can run it and update itself at any time, so it is the right approach.
The real reason why you sometimes do not see changes in the data reflected in the UI is that the list box has no way to determine when the data returned by the query has changed, and it certainly doesn't want to keep refetching the data constantly just in case it should update itself. You need to let it know.
How can you let it know? The documentation for ItemsSource says that "you should set the ItemsSource to an object that implements the INotifyCollectionChanged interface so that changes in the collection will be reflected (...)". Apparently the default way .NET does things does not work in your case.
There are some examples of how to implement that yourself, e.g. in this SO answer. If even the top-level source data collection (the one you are running the LINQ query over) does not support these notifications (which you would then just forward), you might need to update the list box manually from the other code that changes the underlying data.
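As a rough illustration of that idea, here is a minimal sketch, assuming the source data can be kept in an ObservableCollection (all class and property names here are made up for the example): bind the ListBox to a small collection and rebuild it whenever the full collection changes, so the change notifications reach the UI.

using System.Collections.ObjectModel;
using System.Linq;

public class TopItemsViewModel
{
    private const int MaxVisible = 10;

    // The full data set; replace with however the original data is stored.
    public ObservableCollection<string> SourceItems { get; private set; }

    // What the ListBox's ItemsSource binds to; ObservableCollection raises
    // CollectionChanged itself, so the ListBox sees every update.
    public ObservableCollection<string> TopItems { get; private set; }

    public TopItemsViewModel()
    {
        SourceItems = new ObservableCollection<string>();
        TopItems = new ObservableCollection<string>();
        SourceItems.CollectionChanged += (s, e) => RebuildTopItems();
    }

    private void RebuildTopItems()
    {
        // Keep only the first MaxVisible items in the bound collection.
        TopItems.Clear();
        foreach (var item in SourceItems.Take(MaxVisible))
            TopItems.Add(item);
    }
}

Binding the ListBox's ItemsSource to TopItems then keeps the UI at ten items while still reflecting every change made to SourceItems.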

How do I force the SmartTable to load all items?

My SAP UI5 view contains a SmartTable that is bound to an entity set in an ODataModel in read-only mode.
The table limits the number of items it displays to 100, verifiable by the ?$skip=0&$top=100 parameters it appends to the data query sent to the server.
However, my users would like to have the table load and display all items, without paging or having to press some "More" button.
Is there a way to override the SmartTable's default behavior? For example by setting some growingSize property to "Infinity"? Or by modifying the aggregation binding? Or by adding annotations to the OData service?
Since you did not specify the number of expected items or the table you're using, here are some general considerations and a few possible solutions.
The type of table
There are a few things to consider between the various types of tables you can use; there is some advice from SAP itself in the design guidelines:
Do not use the responsive table if: You expect the table to contain more than around 1,000 rows. Try using the analytical table or grid table instead; they are easier to handle, perform better, and are optimised for handling large numbers of items.
Ways around it
The first option I can think of: if you're using a responsive table and you expect fewer than 1,000 rows, the scroll-to-load feature might be of interest, which loads more entries when the user reaches the bottom of the current list.
There are ways to increase the default size of 100, either through the table, using the growingThreshold property, or through the declaration of the OData model, using sizeLimit.
If you're using a grid table, scroll-to-load works slightly differently, since it doesn't display all rows at the same time. Instead, it destroys the current lines to display new lines, which is why it's recommended for (very) large datasets.
Second, if none of those solutions work for your users, you could alternatively fetch the count of the current list (including filters) first, so you can set an accurate threshold on the table before displaying results. If done correctly, your OData service should return a count when queried via /myservice/MyEntity/$count?$filter=... CDS does this automatically; older services will need to implement it separately.
Last, if you know the list never exceeds a certain number of lines, you could set the growingThreshold parameter on the table to that number, and then you don't have to worry about fetching an accurate count first.
How all of this is implemented depends a bit on how you create the SmartTable (smart elements, manually, etc.), so it's hard to give code that will drop straight into your project, but a rough sketch of the settings mentioned above follows.
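A minimal sketch, assuming an XML view with a responsive sap.m.Table placed inside the SmartTable and an OData V2 model (entity set, service URL and threshold values are placeholders; namespace declarations are omitted):

<smartTable:SmartTable entitySet="MyEntitySet" tableType="ResponsiveTable">
    <!-- Inner responsive table: enable growing and raise the threshold -->
    <Table growing="true" growingThreshold="500" growingScrollToLoad="true"/>
</smartTable:SmartTable>

// In the Component or controller: raise the ODataModel's default size limit of 100
var oModel = new sap.ui.model.odata.v2.ODataModel("/my/odata/service/");
oModel.setSizeLimit(5000);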
You can achieve this by using a formatter function which returns how many entries are in your model.
<SmartTable growingThreshold="{path: 'yourModel>/', formatter: '.formatter.sizeCalculator'}">
In the formatter file, which is usually found in the model folder:
sizeCalculator: function (oData) {
    // oData is the object bound at 'yourModel>/'; count its entries
    var count = 0;
    for (var key in oData) {
        if (oData.hasOwnProperty(key)) {
            count++;
        }
    }
    return count;
}

With Realm, should I use a List object or Results object as the data source for a UITableView?

There are at least 2 main collection types used in Realm:
List
Results
The relevant description from the documentation on a Results object says:
Results is an auto-updating container type in Realm returned from object queries.
Because I want my UITableView to respond to any changes on the Realm Object Server, I really think I want my UITableView to be backed by a Results object. In fact, I think I would always want a Results object to back my UI for this reason. This is only reinforced by the description of a List object in the documentation:
List is the container type in Realm used to define to-many relationships.
Sure seems like a List is focused on data modeling... So, being new to Realm and just reading the API, I'm thinking the answer is to use the Results object, but the tutorial (Step 5) uses the List object while the RealmExamples sample code uses Results.
What am I missing? Should I be using List objects to back my UITableViews? If so, what are the reasons?
Short answer: use a List if one already exists that closely matches what you want to display in your table view, otherwise use a Results.
If the data represented by a List that's already stored in your Realm corresponds to what you want to display in your table view, you should certainly use that to back it. Lists have an interesting property in that they are implicitly ordered, which can sometimes be helpful, like in the tutorial you linked to above, where a user can reorder tasks.
Results contain the results of a query in Realm. Running this query typically has a higher runtime overhead than accessing a List, by how much depends on the complexity of the query and the number of items in the Realm.
That being said, mutating a List has performance implications too since it's writing to the file in an atomic fashion. So if this is something that will be changing frequently, a Results is likely a better fit.
You should use Results<>, since Results is auto-updating, to back your UITableView. List can be used to link child models in a Realm model, whereas Results is used to query the Realm objects. You should also add a Realm notification token so you know when the Results are updated and can take the necessary action (reload the table view, etc.). Look here for Realm notifications: https://realm.io/docs/swift/latest/#notifications
P.S. The data in that example is just static, so no changes are observed.
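As a rough sketch of that Results-plus-notification-token setup (a hypothetical Task object with name and createdAt properties is assumed, a cell with identifier "Cell" is assumed to be registered, and the observe API is from recent RealmSwift versions):

import UIKit
import RealmSwift

class TasksViewController: UITableViewController {
    let realm = try! Realm()
    var results: Results<Task>!
    var notificationToken: NotificationToken?

    override func viewDidLoad() {
        super.viewDidLoad()
        results = realm.objects(Task.self).sorted(byKeyPath: "createdAt")
        // The token keeps the subscription alive; invalidate it when done.
        notificationToken = results.observe { [weak self] changes in
            guard let tableView = self?.tableView else { return }
            switch changes {
            case .initial:
                tableView.reloadData()
            case .update(_, let deletions, let insertions, let modifications):
                // Apply fine-grained row updates instead of reloading everything.
                tableView.beginUpdates()
                tableView.insertRows(at: insertions.map { IndexPath(row: $0, section: 0) }, with: .automatic)
                tableView.deleteRows(at: deletions.map { IndexPath(row: $0, section: 0) }, with: .automatic)
                tableView.reloadRows(at: modifications.map { IndexPath(row: $0, section: 0) }, with: .automatic)
                tableView.endUpdates()
            case .error(let error):
                print("Realm notification error: \(error)")
            }
        }
    }

    override func tableView(_ tableView: UITableView, numberOfRowsInSection section: Int) -> Int {
        return results.count
    }

    override func tableView(_ tableView: UITableView, cellForRowAt indexPath: IndexPath) -> UITableViewCell {
        let cell = tableView.dequeueReusableCell(withIdentifier: "Cell", for: indexPath)
        cell.textLabel?.text = results[indexPath.row].name
        return cell
    }

    deinit {
        notificationToken?.invalidate()
    }
}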

ios UITableView with fetchedresultscontroller - add custom rows

I have a UITableView with data coming from NSFetchedResultsController.
Here is my tableView:
I need to add a row "All types". It also needs to be:
Sortable with all other items
Selectable (Design is now selected)
Selecting "All types" should deselect other rows
Give some indication that it is the "All types" row when it is selected
I've read Add extra row to a UITableView managed by NSFetchedResultsController and NSFetchedResultsController prepend a row or section. The approaches given there either make it impossible to sort the data, or look so hacky and produce so much hard-to-maintain code that it would become impossible to change the logic and maintain the code.
Are there any other good options?
P.S. I understand that my question may sound "broad" and doesn't contain code, but I think it's a very common problem.
I do not think this is a very common problem at all. I can see why it seems natural to do what you are trying, but let's analyse your situation: what you generally have are two arrays of objects which you wish to sort as a single array. Now that is quite a common situation, and I believe everyone knows how to solve it: you need to create a single array of objects and then sort it.
The way I see it, you have 3 options:
Fetch all the items, merge the two arrays, sort and present them. This is not a very good idea, since your memory consumption can get too large if there are a lot of items in the database.
Put the extra data into the database and use a fetched results controller as you normally would. This should work well, but you will probably need to mark these items so they can be removed later, or keep them in the database and ignore them wherever you do not wish to display them (see the sketch after this list).
Create a temporary database combining what needs to be fetched from the database and your additional data. This approach is great if the data are meant to be read-only in this list (which actually seems to be the case in what you posted). Still, it is best if you create some kind of link between the objects, for instance an ID of some kind; that way, when the user selects an object from the second database, you simply read the ID and fetch the object from the original database.
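A minimal sketch of the second option in Swift, assuming a hypothetical Category entity with name and isSynthetic attributes (none of these names come from the question):

import CoreData

// The synthetic "All types" row lives in the store like any other row, so the
// fetched results controller sorts and selects it together with the real ones.
func makeFetchedResultsController(context: NSManagedObjectContext) -> NSFetchedResultsController<Category> {
    let request: NSFetchRequest<Category> = Category.fetchRequest()
    request.sortDescriptors = [NSSortDescriptor(key: "name", ascending: true)]
    return NSFetchedResultsController(fetchRequest: request,
                                      managedObjectContext: context,
                                      sectionNameKeyPath: nil,
                                      cacheName: nil)
}

// Insert the extra row once, flagged so it can be recognised and cleaned up later.
func insertAllTypesRowIfNeeded(context: NSManagedObjectContext) throws {
    let request: NSFetchRequest<Category> = Category.fetchRequest()
    request.predicate = NSPredicate(format: "isSynthetic == YES")
    if try context.count(for: request) == 0 {
        let allTypes = Category(context: context)
        allTypes.name = "All types"
        allTypes.isSynthetic = true
        try context.save()
    }
}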

Limit serverside results from WebApi controller and ODATA w/ Nhibernate

I've created a WebApi project in VS 2012, using NHibernate as my ORM, and I intend to enable OData support on it. So I've created a test controller with a single Get method that returns a list of entities from a table in my database.
Everything works fine: I can use OData to filter and order my results, etc. The problem is I couldn't find a way to limit the amount of data that's being returned from the database to the controller, and this table has millions of records in it.
Using the PageSize property of the Queryable attribute only seems to limit the amount of data returned to the client, but not the amount of data returned from the DB.
I've tried applying a Take(n) on the IQueryable inside the Get method before returning it, and it limits the results brought back from the DB, but it breaks the OData filtering, since if you try to query an entity that's not in the first n results, it just returns an empty collection.
I know you can use the $top parameter in OData to accomplish this, but I would rather not depend on the client/consumer providing it, in order to ensure that I'm not unnecessarily bringing back thousands or even millions of records that I'm not going to use.
I've also tried manually checking whether the client provided a $top parameter in the query string, applying the OData transformation to my IQueryable and then applying the Take(n) method over the transformed query. This approach enabled me to filter for any entity through OData, but it breaks pagination, because if I use the $skip=n parameter it again returns an empty collection.
So, is there any way to reliably limit the results fetched from the DB while not breaking the OData support?
We recently found that too. We are not applying a Take(pageSize) when server-driven paging is enabled, as we have to figure out whether a next-page link should be generated or not. We just enumerate the result set for pageSize entities and check whether there are more. We thought that most providers generally bring back a partial set of results, as IQueryable is generally a lazy implementation. It turns out that is not true. Also, the database can optimize the query if it knows only pageSize results are required.
This is the issue that was opened for it. The good news is that Youssef has already fixed it :). This is the commit that fixed it. So, if you grab the nightly builds you should be good.
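For reference, a minimal sketch of the server-driven paging setup being discussed (controller, entity and session names are illustrative, and PageSize capping the database fetch assumes a build that includes the fix mentioned above):

using System.Linq;
using System.Web.Http;
using NHibernate;
using NHibernate.Linq;

public class ProductsController : ApiController
{
    private readonly ISession _session;

    public ProductsController(ISession session)
    {
        _session = session;
    }

    // PageSize caps how many entities are returned per request; with the fix
    // above it should also cap what is fetched from the database.
    [Queryable(PageSize = 100)]
    public IQueryable<Product> Get()
    {
        return _session.Query<Product>();
    }
}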

How can I copy a Delphi TTable including its calculated fields?

I have defined a Delphi TTable object with calculated fields, and it is used in a grid on a form. I would like to make a copy of the TTable object, including the calculated fields, open that copy, make some changes to the data in the copy, close the copy, and then refresh the original, and thus the grid view. Is there an easy way to get a copy of a TTable object to be used in such a way?
The ideal answer would be one that solves the problem as generically as possible, i.e., a way of getting something like this:
newTable:=getACopyOf(existingTable);
You can use the TBatchMove component to copy a table and its structure.
Set the Mode property to specify the desired operation. The Source and Destination properties indicate the datasets whose records are added, deleted, or copied. The online help has additional details.
(Although I reckon you should investigate a TClientDataSet approach - it's certainly more scalable and faster).
Let me propose several things:
Let us suppose that you want to make changes programmatically. You could then use the DisableControls and EnableControls methods of the TTable to suppress screen updates during that time.
If you want to have two screens with the same data (e.g. to compare data during online changes), you could actually create the same screen twice, with the TTable object placed on the screen itself. It will have the exact same configuration (but will not carry over changes previously made on the first screen; it reads the data from the database). Changes made on one screen will not be automatically refreshed on the other.
Another way: try using a TDataSetProvider with the TTable as its DataSet (source), feeding a TClientDataSet. ApplyUpdates would feed the changes back to the TTable. Since the calculated fields are read-only, they are not affected. (Untested, but it should work; a rough sketch follows.)
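A minimal sketch of that provider/client-dataset wiring, assuming components named DataSetProvider1, ClientDataSet1 and ExistingTable have been dropped on the form (untested, as noted above):

procedure TForm1.EditViaClientDataSet;
begin
  // Wire the provider to the original table and the client dataset to the provider.
  DataSetProvider1.DataSet := ExistingTable;
  ClientDataSet1.SetProvider(DataSetProvider1);
  ClientDataSet1.Open;

  // ... make changes to ClientDataSet1 here ...

  // Push the changes back through the provider to the TTable; calculated
  // fields are read-only and are left untouched.
  if ClientDataSet1.ApplyUpdates(0) = 0 then
    ExistingTable.Refresh;  // refresh the original so the grid shows the changes
end;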
I believe that the second approach (TClientDataSet) is probably the best method to use in this scenario. An alternative would be to use a memory table (kbmMemTable, for instance). Either way, you would clone your original table and then, after making your changes, loop through the memory version of your dataset and update your original table.
You should be able to select the table on the form, copy it using Ctrl-C, then paste it into any text editor. You will get the text version of the object's properties which you can then edit as needed. When you are done, select all the text again and you can copy it to the clipboard and paste it back onto a form.
