Silverlight / Data Challenging Issue - silverlight-3.0

I'm a little stumped on this one. Anyone have any ideas? I'll try to lay out the example as briefly as possible.
I'm creating a Silverlight 3.0 application against a SQL 2005 database, using RIA Services and Entity Framework for data access.
I need to be able to populate a grid from a table. However, my grid UI and my table structure are different: my grid needs to turn rows into columns (like a PIVOT table). Here are my challenges / assumptions:
I have no idea until runtime which columns I will have on the grid.
Silverlight 3 only supports binding to properties
Silverlight 3 does not allow you to add a row to the grid and manually populate data.
As we all know, Silverlight does not have the System.Data (mainly DataTable) namespace
So, how do I create an object with dynamic properties so that I can bind it to the grid? Every idea I've had (multi-dimensional arrays, hash tables, etc.) falls apart because Silverlight needs a property to bind to, I can't manually add/fill a data row, and I can't figure out a way to add dynamic properties. I've seen an article on a solution involving a linked list, but I'm looking for a better alternative. It may come down to making a special "Cody Grid" that is just a bunch of text boxes/labels. Doable for sure, but I'd lose some grid functionality that users expect.
The ONLY solution I have been able to come up with is to create a PIVOT query in SQL 2005 and use an entity based on that query/view (SQL 2008 would make that easier). I would prefer to do it in Silverlight, but if that is the last resort, so be it. If I go the PIVOT route, how do I handle a changing data structure in Entity Framework?
Data Sample.
Table:
Name  Date    Value
Cody  1/1/09  15
Cody  1/2/09  18
Mike  1/1/09  20
Mike  1/8/09  77
Grid UI should look like:
Name  1/1/09  1/2/09  1/3/09  ...  1/8/09
Cody  15      18      NULL    ...  NULL
Mike  20      NULL    NULL    ...  77
Cody

My team came up with a good solution. I'm not sure who deserves the credit, but it's somewhere in Google land. So far it works pretty well.
Essentially the solution comes down to using reflection to build a dynamic type based on this dynamic data. The function takes in a 2-dimensional array and turns it into a List of objects with properties that can be bound. We put this process in a WCF service and it seems to do exactly what we need so far.
Here is some of the code that builds the type using reflection:
AppDomain myDomain = AppDomain.CurrentDomain;
AssemblyName myAsmName = new AssemblyName("MyAssembly");
AssemblyBuilder myAssembly = myDomain.DefineDynamicAssembly(myAsmName, AssemblyBuilderAccess.Run);
ModuleBuilder myModule = myAssembly.DefineDynamicModule(myAsmName.Name);
TypeBuilder myType = myModule.DefineType("DataSource", TypeAttributes.Public);
string columnName = "whatever";
int counter = 0;
for (int j = 0; j <= array.GetUpperBound(1); j++)
{
    Type propertyType = typeof(T); // T = the value type of the array elements
    // Private backing field for the property
    FieldBuilder exField = myType.DefineField("_" + columnName + counter, propertyType, FieldAttributes.Private);
    // The following line is where I'm passing columnName + counter and getting errors with some strings but not others.
    PropertyBuilder exProperty = myType.DefineProperty(columnName + counter.ToString(), PropertyAttributes.None, propertyType, Type.EmptyTypes);
    // Get accessor
    MethodBuilder exGetMethod = myType.DefineMethod("get_" + columnName + counter, MethodAttributes.Public, propertyType, Type.EmptyTypes);
    ILGenerator getIlgen = exGetMethod.GetILGenerator();
    // IL for a simple getter:
    //   ldarg.0
    //   ldfld int32 SilverlightClassLibrary1.Class1::_Age
    //   ret
    getIlgen.Emit(OpCodes.Ldarg_0);
    getIlgen.Emit(OpCodes.Ldfld, exField);
    getIlgen.Emit(OpCodes.Ret);
    exProperty.SetGetMethod(exGetMethod);
    // Set accessor
    MethodBuilder exSetMethod = myType.DefineMethod("set_" + columnName + counter, MethodAttributes.Public, null, new Type[] { propertyType });
    ILGenerator setIlgen = exSetMethod.GetILGenerator();
    // IL for a simple setter:
    //   ldarg.0
    //   ldarg.1
    //   stfld int32 SilverlightClassLibrary1.Class1::_Age
    //   ret
    setIlgen.Emit(OpCodes.Ldarg_0);
    setIlgen.Emit(OpCodes.Ldarg_1);
    setIlgen.Emit(OpCodes.Stfld, exField);
    setIlgen.Emit(OpCodes.Ret);
    exProperty.SetSetMethod(exSetMethod);
    counter++;
}
Type finished = myType.CreateType();
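For completeness, here is a rough sketch of the other half of the process (this part is not from the original solution, so treat it as illustrative): once CreateType() has run, instances of the generated type are created and filled from the 2-dimensional array via reflection so the result can be returned as a bindable list.
List<object> rows = new List<object>();
PropertyInfo[] props = finished.GetProperties();
for (int i = 0; i <= array.GetUpperBound(0); i++)
{
    object row = Activator.CreateInstance(finished);
    for (int j = 0; j <= array.GetUpperBound(1); j++)
    {
        // Assumes the reflected property order matches the order the columns were defined in above
        props[j].SetValue(row, array[i, j], null);
    }
    rows.Add(row);
}
// "rows" is what the WCF service operation would return for the grid to bind to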

You can dynamically set columns with their associated bindings (ensuring that AutoGenerateColumns is off):
For instance, the name column:
DataGridTextColumn textColumn = new DataGridTextColumn();
textColumn.Header = "Name";
textColumn.Binding = new Binding("FirstName");
myDataGrid.Columns.Add(textColumn);
The ObservableCollection you use to store the queried data could possibly be overridden to support pivoting, making sure to change the bindings of the DataGrid columns, as shown above.
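As a rough sketch of that idea (the names myDataGrid and columnKeys are illustrative, not from the original post), the columns can be regenerated whenever the pivoted collection is rebuilt:
private void RebindPivotedGrid(IEnumerable<string> columnKeys, IEnumerable itemsSource)
{
    myDataGrid.Columns.Clear();
    foreach (string key in columnKeys)
    {
        // One text column per pivoted value; each row object is assumed to
        // expose a property named after the column key
        myDataGrid.Columns.Add(new DataGridTextColumn
        {
            Header = key,
            Binding = new Binding(key)
        });
    }
    myDataGrid.ItemsSource = itemsSource;
}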
Note: this is a fair amount of hand-waving, I'm sure (I haven't touched Silverlight for over a year), but I hope it's enough to formulate another strategy.

If you are working with a two-dimensional array, then adding columns dynamically as shown above will not work.
The problem is that Silverlight cannot bind columns to a list.
So we have to create a list of rows, with a row converter that represents our two-dimensional array.
This one worked for me:
http://www.scottlogic.co.uk/blog/colin/2010/03/binding-a-silverlight-3-datagrid-to-dynamic-data-via-idictionary-updated/
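The gist of that approach, sketched from memory (so the names, e.g. RowIndexConverter, are illustrative rather than the article's): each row is an IDictionary of column name to value, and a value converter with the column key as ConverterParameter pulls the right cell out for each dynamically generated column.
public class RowIndexConverter : IValueConverter
{
    public object Convert(object value, Type targetType, object parameter, CultureInfo culture)
    {
        // "value" is the whole row (an IDictionary); "parameter" is the column key
        var row = value as IDictionary<string, object>;
        if (row == null) return null;
        object cell;
        return row.TryGetValue((string)parameter, out cell) ? cell : null;
    }

    public object ConvertBack(object value, Type targetType, object parameter, CultureInfo culture)
    {
        throw new NotImplementedException();
    }
}

// Generate one column per dictionary key, binding each cell to the row itself
// (empty binding path) and letting the converter dig the value out:
foreach (string key in columnKeys)
{
    myDataGrid.Columns.Add(new DataGridTextColumn
    {
        Header = key,
        Binding = new Binding
        {
            Converter = new RowIndexConverter(),
            ConverterParameter = key
        }
    });
}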

Related

How to reduce Azure Table Storage latency?

I have a rather huge table on Azure (30 million rows, 5–100 KB each).
Each RowKey is a Guid and PartitionKey is a first Guid part, for example:
PartitionKey = "1bbe3d4b"
RowKey = "1bbe3d4b-2230-4b4f-8f5f-fe5fe1d4d006"
The table has 600 reads and 600 writes (updates) per second with an average latency of 60ms. All queries use both PartitionKey and RowKey.
BUT, some reads take up to 3000ms (!). On average, >1% of all reads take more than 500ms, and there's no correlation with entity size (a 100 KB row may be returned in 25ms and a 10 KB one in 1500ms).
My application is an ASP.Net MVC 4 web-site running on 4-5 Large instances.
I have read all MSDN articles regarding Azure Table Storage performance goals and already did the following:
UseNagle is turned Off
Expect100Continue is also disabled
MaxConnections for table client is set to 250 (setting 1000–5000 doesn't make any sense)
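For reference, those three tweaks are applied roughly like this (a sketch; the variable names are illustrative):
// Apply the settings to the table endpoint's ServicePoint before any requests are issued
var account = CloudStorageAccount.Parse(RoleEnvironment.GetConfigurationSettingValue("DataConnectionString"));
ServicePoint tableServicePoint = ServicePointManager.FindServicePoint(account.TableEndpoint);
tableServicePoint.UseNagleAlgorithm = false;   // UseNagle off
tableServicePoint.Expect100Continue = false;   // Expect100Continue disabled
tableServicePoint.ConnectionLimit = 250;       // MaxConnections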
Also I checked that:
Storage account monitoring counters show no throttling errors
There are "waves" of sorts in performance, though they do not depend on load
What could be the reason for such performance issues, and how can I improve them?
I use the MergeOption.NoTracking setting on the DataServiceContext.MergeOption property for extra performance if I have no intention of updating the entity anytime soon. Here is an example:
var account = CloudStorageAccount.Parse(RoleEnvironment.GetConfigurationSettingValue("DataConnectionString"));
var tableStorageServiceContext = new AzureTableStorageServiceContext(account.TableEndpoint.ToString(), account.Credentials);
tableStorageServiceContext.RetryPolicy = RetryPolicies.Retry(3, TimeSpan.FromSeconds(1));
tableStorageServiceContext.MergeOption = MergeOption.NoTracking;
tableStorageServiceContext.AddObject(AzureTableStorageServiceContext.CloudLogEntityName, newItem);
tableStorageServiceContext.SaveChangesWithRetries();
Another problem might be that you are retrieving the entire entity with all its properties even though you intend to use only one or two of them - this is of course wasteful but can't easily be avoided. However, if you use Slazure you can use query projections to retrieve only the entity properties you are interested in from table storage and nothing more, which gives you better query performance. Here is an example:
using SysSurge.Slazure;
using SysSurge.Slazure.Linq;
using SysSurge.Slazure.Linq.QueryParser;

namespace TableOperations
{
    public class MemberInfo
    {
        public string GetRichMembers()
        {
            // Get a reference to the table storage
            dynamic storage = new QueryableStorage<DynEntity>("UseDevelopmentStorage=true");

            // Build the table query and make sure it only returns members that earn
            // more than $60k/yr by using a "Where" query filter, and make sure that
            // only the "Name" and "Salary" entity properties are retrieved from the
            // table storage to make the query quicker.
            QueryableTable<DynEntity> membersTable = storage.WebsiteMembers;
            var memberQuery = membersTable.Where("Salary > 60000").Select("new(Name, Salary)");

            var result = "";

            // Cast the query result to dynamic so that we can access its dynamic properties
            foreach (dynamic member in memberQuery)
            {
                // Show some information about the member
                result += "LINQ query result: Name=" + member.Name + ", Salary=" + member.Salary + "<br>";
            }

            return result;
        }
    }
}
Full disclosure: I coded Slazure.
You could also consider pagination if you are retrieving large data sets, for example:
// Skip the first 50 members, then retrieve the next 50
var memberQuery = membersTable.Where("Salary > 60000").Skip(50).Take(50);
Typically, if a specific query requires scanning a large number of rows, it will take longer. Is the behavior you are seeing specific to a query / data? Or are you seeing the performance vary for the same data and query?

Best Data Access approach for asp.net mvc application with big data

We have a big challenge: the application is too slow because of the huge number of records :(
We are looking for the best data access pattern for a web application (ASP.NET MVC 3) with big data (a SQL Server view with more than 66 million records).
Customers have forms with search fields and see the results in a grid on the web page.
So far we have used both of these data access approaches:
1- Entity Framework with some features like compiled queries (see the sketch after this option's code)
--get result
GridModelsResult = VwLoanInquiryList.Skip(pageIndex * pageSize).Take(pageSize).ToList();
--get count
int totalPages = (int)Math.Ceiling((float)totalRecords / (float)pageSize);
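To illustrate the compiled-query idea mentioned in option 1 (a rough sketch only; the context, entity, and property names such as LoanEntities and CustomerNumber are made up for the example):
// The paged query is compiled once and reused, so EF does not re-translate
// the LINQ expression tree on every request.
static readonly Func<LoanEntities, string, int, int, IQueryable<VwLoanInquiry>> PagedLoans =
    CompiledQuery.Compile((LoanEntities ctx, string customerNumber, int pageIndex, int pageSize) =>
        ctx.VwLoanInquiryList
           .Where(v => v.CustomerNumber.Contains(customerNumber))
           .OrderBy(v => v.ID)
           .Skip(pageIndex * pageSize)
           .Take(pageSize));

// usage in the data access layer
GridModelsResult = PagedLoans(context, customerNumber, pageIndex, pageSize).ToList();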
2- SQL Server stored procedure for pagination, fetching only the current page's results
--get result
select t1.* into #t from
(select *
 FROM VwLoanInquiry
 where CustomerNumber like '%' + @CustomerNumber + '%'
) AS [t1]
WHERE (@PageIndex = -1 or (t1.ID BETWEEN (@PageIndex - 1) * @RecordPerPage AND ((@PageIndex) * @RecordPerPage) - 1))
--get count
select count(*) from #t
3- OLAP; we built a cube in Analysis Services (SSAS),
but here our problem is showing records with string columns within cube cells (it's impossible to define non-numeric fields as measures).
In addition, to get results faster, our view has an index and the column data types have been set.
Please help us determine which of these approaches is better, or whether there is another way; if so, what is it?
Thanks

ASP.NET MVC3 - Performance improvement through paging concept, I need an example?

I am working on an application built on ASP.NET MVC 3.0, displaying the data in an MVC WebGrid.
I am using LINQ to get the records from the entities into an EntityViewModel. In doing this I have to convert the records from the entity to the EntityViewModel.
I have 30K records to be displayed in the grid, and for each record there are 3 flags; for each flag it has to go to 3 other tables, check whether the record exists, paint it true or false, and display that in the grid.
I am displaying 10 records at a time, but it is very slow because I am getting all the records and storing them in my application.
The paging is in place (I mean to say only 10 records are being displayed in the web grid), but all the records are getting loaded into the application, which takes 15-20 seconds. I have checked where this time is being spent by the processor: it's happening in the painting step (where every record is being compared with 3 other tables).
I have converted the LINQ query to SQL and I can see my SQL query executes in under 2 seconds. From this I can say that I do not need to spend time on SQL indexing, as the speed of the SQL query is good enough.
I have two options to implement
1) Caching for MVC
2) Paging (where I should get only the first ten records).
I want to go with the paging technique for the performance improvement.
Now my question is: how do I pass the number 10 (the number of records) to the service method so that it brings up only ten records? And how do I get the next 10 records when clicking on the next page?
I would post the code, but I cannot do it as it has some sensitive data.
Any example of how to tackle this situation would be appreciated, many thanks.
If you're using SQL 2005+ you could use ROW_NUMBER() in your stored procedure:
http://msdn.microsoft.com/en-us/library/ms186734(v=SQL.90).aspx
or else, if you just want to do it in LINQ, try the Skip() and Take() methods.
As simple as:
int page = 2;
int pageSize = 10;
var pagedStuff = query.Skip((page - 1) * pageSize).Take(pageSize);
You should always, always, always be limiting the number of rows you get from the database. Unbounded reads kill applications. 30k turns into 300k and then you are just destroying your SQL server.
Jfar is on the right track with .Skip and .Take. The Linq2Sql engine (and most entity frameworks) will convert this to SQL that returns a limited result set. However, this doesn't preclude caching the results as well; I recommend doing that too. The fastest trip to SQL Server is the one you don't have to take. :) I do something like this, where my controller method handles paged or un-paged results and caches whatever comes back from SQL:
[AcceptVerbs("GET")]
[OutputCache(Duration = 360, VaryByParam = "*")]
public ActionResult GetRecords(int? page, int? items)
{
    int limit = items ?? defaultItemsPerPage;
    int pageNum = page ?? 0;
    if (pageNum <= 0) { pageNum = 1; }
    ViewBag.Paged = (page != null);
    // "var records = null" won't compile; a non-generic IList (or List<YourEntity>) works here
    IList records = null;
    if (page != null)
    {
        // Note: with Entity Framework, Skip/Take need an OrderBy earlier in the query
        records = myEntities.Skip((pageNum - 1) * limit).Take(limit).ToList();
    }
    else
    {
        records = myEntities.ToList();
    }
    return View("GetRecords", records);
}
If you call it with no params, you get the entire result set (/GetRecords). Calling it with params gets you the restricted set (/GetRecords?page=3&items=25).
You could extend this method further by adding .Contains or .StartsWith filtering.
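For example (just a sketch; the optional search parameter and the Name property are illustrative, not part of the original code):
public ActionResult GetRecords(int? page, int? items, string search)
{
    int limit = items ?? defaultItemsPerPage;
    int pageNum = page ?? 1;
    var query = myEntities.AsQueryable();
    if (!string.IsNullOrEmpty(search))
    {
        // Filter before paging so Skip/Take operate on the narrowed set
        query = query.Where(e => e.Name.StartsWith(search) || e.Name.Contains(search));
    }
    // Same paging as above, applied to the filtered query
    return View("GetRecords", query.Skip((pageNum - 1) * limit).Take(limit).ToList());
}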
If you do decide to go the custom stored procedure route, I'd recommend using "TOP" and "ROW_NUMBER" to restrict results rather than a temp table.
Personally I would create a custom stored procedure to do this and then call it through Linq to SQL. e.g.
CREATE PROCEDURE [dbo].[SearchData]
(
    @SearchStr NVARCHAR(50),
    @Page int = 1,
    @RecsPerPage int = 50,
    @rc int OUTPUT
)
AS
SET NOCOUNT ON
SET FMTONLY OFF

DECLARE @TempFound TABLE
(
    UID int IDENTITY NOT NULL,
    PersonId UNIQUEIDENTIFIER
)

INSERT INTO @TempFound
(
    PersonId
)
SELECT PersonId FROM People WHERE Surname LIKE '%' + @SearchStr + '%'

SET @rc = @@ROWCOUNT

-- Calculate the final offset for paging --
DECLARE @FirstRec int, @LastRec int
SELECT @FirstRec = (@Page - 1) * @RecsPerPage
SELECT @LastRec = (@Page * @RecsPerPage + 1)

-- Final select --
SELECT p.* FROM People p INNER JOIN @TempFound tf
ON p.PersonId = tf.PersonId
WHERE (tf.UID > @FirstRec) AND (tf.UID < @LastRec)
The @rc parameter is the total number of records found.
You obviously have to adapt it to your own table, but it should run extremely fast.
To bind it to an object in LINQ to SQL, you just have to make sure that the final select's fields match the fields of the object it is being bound to.
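For example (a sketch only; it assumes the procedure has been dragged onto a LINQ to SQL DataContext, here called MyDataContext, which generates a SearchData method):
using (var db = new MyDataContext())
{
    int? totalFound = null;
    // Page 2, 50 records per page; totalFound receives the @rc output parameter
    var people = db.SearchData("Smith", 2, 50, ref totalFound).ToList();
    int totalPages = (int)Math.Ceiling((totalFound ?? 0) / 50.0);
}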

Select certain number of records for batch processing

Hi, is it possible using Entity Framework and/or LINQ to select a certain number of rows? For example, I want to select rows 0 - 500000 and assign these records to the List VariableAList object, then select rows 500001 - 1000000 and assign those to the List VariableBList object, and so on.
The Numbers object has fields like ID, Number, DateCreated, DateAssigned, etc.
Sounds like you're looking for the .Take(int) and .Skip(int) methods
using (YourEntities db = new YourEntities())
{
var VariableAList = db.Numbers
.Take(500000);
var VariableBList = db.Numbers
.Skip(500000)
.Take(500000);
}
You may want to be wary of the size of these lists in memory.
Note: You also may need an .OrderBy clause prior to using .Skip or .Take--I vaguely remember running into this problem in the past.
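For instance (ID is one of the fields mentioned in the question; any stable column works as the ordering key):
// Entity Framework requires sorted input before Skip, so order explicitly first
var variableBList = db.Numbers
    .OrderBy(n => n.ID)
    .Skip(500000)
    .Take(500000)
    .ToList();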

Combining table, web service data in Grails

I'm trying to figure out the best approach to display combined tables based on matching logic and input search criteria.
Here is the situation:
We have a table of customers stored locally. The fields of interest are ssn, first name, last name and date of birth.
We also have a web service which provides the same information. Some of the customers from the web service are the same as the local file, some different.
SSN is not required in either.
I need to combine this data to be viewed on a Grails display.
The criteria for combination are 1) match on SSN. 2) For any remaining records, exact match on first name, last name and date of birth.
There's no need at this point for soundex or approximate logic.
It looks like what I should do is extract all the records from both inputs into a single collection, somehow making it a set keyed on SSN, then remove the records with a blank SSN.
This will handle the SSN matching (once I figure out how to make that a set).
Then, I need to go back to the original two input sources (cached in a collection to prevent a re-read) and remove any records that exist in the SSN set derived previously.
Then, create another set based on first name, last name and date of birth - again if I can figure out how to make a set.
Then combine the two derived collections into a single collection. The collection should be sorted for display purposes.
Does this make sense? I think the search criteria will limit the number of records pulled in, so I can do this in memory.
Essentially, I'm looking for some ideas on how the Grails code would look for achieving the above logic (assuming this is a good approach). The local customer table is a domain object, while what I'm getting from the WS is an array list of objects.
Also, I'm not entirely clear on how the maxresults, firstResult, and order used for the display would be affected. I think I need to read in all the records which match the search criteria first, do the combining, and display from the derived collection.
The traditional Java way of doing this would be to copy both the local and remote objects into TreeSet containers with a custom comparator, first for SSN, second for name/birthdate.
This might look something like:
def localCustomers = Customer.list()
def remoteCustomers = RemoteService.get()
TreeSet ssnFilter = new TreeSet(new ClosureComparator({c1, c2 -> c1.ssn <=> c2.ssn}))
ssnFilter.addAll(localCustomers)
ssnFilter.addAll(remoteCustomers)
TreeSet nameDobFilter = new TreeSet(new ClosureComparator({c1, c2 -> c1.firstName + c1.lastName + c1.dob <=> c2.firstName + c2.lastName + c2.dob}))
nameDobFilter.addAll(ssnFilter)
def filteredCustomers = nameDobFilter as List
At this point, filteredCustomers has all the records, except those that are duplicates by your two criteria.
Another approach is to filter the lists by sorting them and doing a fold (Groovy's inject), combining adjacent elements if they match. This way, you have an opportunity to combine the data from both sources.
For example:
def combineByNameAndDob(customers) {
customers.sort() {
c1, c2 -> (c1.firstName + c1.lastName + c1.dob) <=>
(c2.firstName + c2.lastName + c2.dob)
}.inject([]) { cs, c ->
if (cs && c.equalsByNameAndDob(cs[-1])) {
cs[-1].combine(c) //combine the attributes of both records
cs
} else {
cs << c
}
}
}

Resources