I have an ASP.NET MVC 3 project using .NET Framework 4.0 and LINQ to SQL. Because I have serious performance problems with some queries the first time they are executed, I decided to move to Entity Framework 5 to take advantage of its new automatic LINQ query compilation.
I have now installed VS 2012, .NET Framework 4.5, and Entity Framework 5, re-targeted my project to .NET 4.5, and updated my data context file to use EF5. The queries have exactly the same performance as before: some of them are so slow on first execution that I even get a timeout exception, while the second time everything is fine. I am wondering whether the migration was not done correctly and I am still using EF4, or whether it is just a problem with my query construction that prevents the auto-compile feature from kicking in for some unknown reason.
The EntityFramework DLL is version 5.0, and the System.Data.Entity DLL is version 4 - that is right, isn't it? Any suggestions?
I include the most problematic query below. It retrieves the paginated results for a grid (Telerik MVC grid), which I need to build with two queries (originally it was a single SQL statement with a subquery):
/// <summary>
/// Get the elements for the AppLog grid.
/// </summary>
public void GetAppLogElements(
int clientID, string language, int functionID, int errorLevel,
int numPage, int pageSize, IList<IFilterDescriptor> filterDescriptors, IList<SortDescriptor> sortDescriptors, IList<GroupDescriptor> groupDescriptors,
ref IQueryable<Model_AppLog> rows, ref int numRows)
{
string orderString = string.Empty;
var items =
from ap in objDataContext.applicationLogs
where
ap.clientID == clientID &&
ap.recordGroup != null
join alf in objDataContext.appLogFunctions on
new { functionID = ap.functionID.Value } equals new { functionID = alf.functionID }
join ale in objDataContext.appLogErrorLevels on
new { errorLevel = ap.errorLevel.Value } equals new { errorLevel = ale.errorLevelID }
join als in objDataContext.appLogSeverities on
new { severity = ap.severity.Value } equals new { severity = als.severityID }
group new { ap, alf, als } by new { ap.functionID, ap.recordGroup, ap.clerkID } into queryGrouped
select new Model_AppLog()
{
sequence = queryGrouped.Max(c => c.ap.sequence),
functionID = queryGrouped.Key.functionID,
recordGroup = queryGrouped.Key.recordGroup,
clerkID = queryGrouped.Key.clerkID,
date = queryGrouped.Min(c => c.ap.date),
errorLevel = queryGrouped.Max(c => c.ap.errorLevel) ?? 0,
severity = queryGrouped.Max(c => c.ap.severity)
};
if (errorLevel != -1)
items = items.Where(column => column.errorLevel >= errorLevel);
var _items =
from subSelect in items
join alf in objDataContext.appLogFunctions on
new { functionID = subSelect.functionID.Value } equals new { functionID = alf.functionID }
join alft in objDataContext.appLogFunctionTexts on
new { alf.functionID, language } equals new { alft.functionID, alft.language }
join ale in objDataContext.appLogErrorLevels on
new { errorLevel = subSelect.errorLevel.Value } equals new { errorLevel = ale.errorLevelID }
join alet in objDataContext.appLogErrorLevelTexts on
new { errorLevelID = subSelect.errorLevel.Value, language } equals new { alet.errorLevelID, alet.language }
join als in objDataContext.appLogSeverities on
new { severity = subSelect.severity.Value } equals new { severity = als.severityID }
join alst in objDataContext.appLogSeverityTexts on
new { als.severityID, language } equals new { alst.severityID, alst.language }
select new Model_AppLog()
{
sequence = subSelect.sequence,
functionID = subSelect.functionID,
recordGroup = subSelect.recordGroup,
clerkID = subSelect.clerkID,
date = subSelect.date,
errorLevel = subSelect.errorLevel,
severity = subSelect.severity,
functionDescription = alft.denotation,
errorLevelDescription = alet.denotation,
severityDescription = alst.denotation
};
//Apply filters
if (filterDescriptors != null && filterDescriptors.Any())
{
_items = _items.Where(ExpressionBuilder.Expression<Model_AppLog>(filterDescriptors));
}
if (functionID != -1)
_items = _items.Where(column => column.functionID == functionID);
//Apply sorting
if (sortDescriptors != null)
{
GlobalMethods objGlobalMethods = new GlobalMethods();
orderString = objGlobalMethods.GetOrderString(sortDescriptors);
}
//Apply ordering
if (orderString != string.Empty)
_items = _items.OrderBy(orderString);
else
_items = _items.OrderByDescending(x => x.date);
//Set total number of rows
numRows = _items.AsEnumerable<Model_AppLog>().Count();
//Get paginated results
_items = _items.Skip(pageSize * (numPage - 1)).Take(pageSize);
//Set data result
rows = _items;
}
Here's a paper about EF performance: MSDN
The most important thing you can do to improve start-up time is to precompile your views (there's a section near the end of the MSDN paper).
If you're using database first, there's a T4 template here you can use.
If you're using code first, there's a project here, although I haven't tried it myself.
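If neither of those fits, a cheap stopgap is to warm the context up when the application starts, so the first real request doesn't pay the model-building and view-generation cost. A minimal sketch for Global.asax.cs, assuming the usual System.Linq and System.Threading.Tasks usings (MyEntities is a placeholder for your context type, applicationLogs for any of its sets):
protected void Application_Start()
{
    // Fire-and-forget so application start-up itself isn't blocked.
    Task.Run(() =>
    {
        using (var ctx = new MyEntities())
        {
            // Any trivial query forces EF to build its model and generate
            // views once, here, instead of on the first user request.
            ctx.applicationLogs.Any();
        }
    });
}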
Note: when you upgrade your project from .NET 4.0 to .NET 4.5, you have to uninstall the EntityFramework NuGet package and reinstall it - there are different versions of EF 5 for the different runtimes.
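For example, from the Package Manager Console:
PM> Uninstall-Package EntityFramework
PM> Install-Package EntityFramework -Version 5.0.0
The -Version switch pins the EF 5 release; NuGet then picks the right build for your project's target framework.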
I'm new to EF, using 6.0 code-first.
using (EmpolyeeContext emp = new EmpolyeeContext())
{
var callrecordSession = new CallRecordingSession();
string ucid = new Random().Next(0, 5000).ToString();
callrecordSession.UCID = ucid;
callrecordSession.InstrumentationTransactionId = new Random().Next(0, 9000).ToString();
callrecordSession.ClientName = "IVR";
callrecordSession.ClientRequestObject = "{string.empty}";
callrecordSession.CallRecordingStatusName = "Started";
callrecordSession.ModifiedDatetime = DateTime.Now;
emp.CallRecordingSessions.Add(callrecordSession);
var itemToEdit = emp.CallRecordingSessions.Where(c => c.UCID == ucid).FirstOrDefault<CallRecordingSession>();
if (itemToEdit != null)
{
itemToEdit.CallRecordingStatusName = "Inprogress";
itemToEdit.ModifiedDatetime = DateTime.Now;
}
emp.SaveChanges();
}
What is wrong with this code? I'm always getting itemToEdit as null.
Entity Framework always queries the data source. This is by design: if it queried your local cache instead, you could get an ID that has already been inserted by a different user in the database. Entity Framework would also have to merge results from your memory with those from the database.
var itemToEdit = emp.CallRecordingSessions.Where(c => c.UCID == ucid).FirstOrDefault<CallRecordingSession>();
So all you have to do is call SaveChanges before you query the data back:
emp.CallRecordingSessions.Add(callrecordSession);
emp.SaveChanges();
var itemToEdit = emp.CallRecordingSessions.Where(c => c.UCID == ucid).FirstOrDefault<CallRecordingSession>();
if (itemToEdit != null)
{
itemToEdit.CallRecordingStatusName = "Inprogress";
itemToEdit.ModifiedDatetime = DateTime.Now;
}
emp.SaveChanges();
.SaveChanges() did the trick.
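As an aside, if you ever do need to find a just-added, not-yet-saved entity without a database round trip, the context's local cache is exposed through DbSet.Local. A small sketch, reusing the question's names:
// Local holds the entities this context is tracking, including ones in
// the Added state that a database query cannot return yet.
var pendingItem = emp.CallRecordingSessions.Local
                     .FirstOrDefault(c => c.UCID == ucid);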
I am using iTextSharp to create a PDF. I have 100k records, but I am getting the following exception:
An exception of type 'System.OutOfMemoryException' occurred in
itextsharp.dll but was not handled in user code
at the line:
bodyTable.AddCell(currentProperty.GetValue(lst, null).ToString());
Code is:
var doc = new Document(pageSize);
PdfWriter.GetInstance(doc, stream);
doc.Open();
//Get exportable count
int columns = 0;
Type currentType = list[0].GetType();
//PREPARE HEADER
//foreach visible column, check if the current object has the property
//else search in inner properties
foreach (var visibleColumn in visibleColumns)
{
if (currentType.GetProperties().FirstOrDefault(p => p.Name == visibleColumn.Key) != null)
{
columns++;
}
else
{
//check child property objects
var childProperties = currentType.GetProperties();
foreach (var prop in childProperties)
{
if (prop.PropertyType.BaseType == typeof(BaseEntity))
{
if (prop.PropertyType.GetProperties().FirstOrDefault(p => p.Name == visibleColumn.Key) != null)
{
columns++;
break;
}
}
}
}
}
//header
var headerTable = new PdfPTable(columns);
headerTable.WidthPercentage = 100f;
foreach (var visibleColumn in visibleColumns)
{
if (currentType.GetProperties().FirstOrDefault(p => p.Name == visibleColumn.Key) != null)
{
//headerTable.AddCell(prop.Name);
headerTable.AddCell(visibleColumn.Value);
}
else
{
//check child property objects
var childProperties = currentType.GetProperties();
foreach (var prop in childProperties)
{
if (prop.PropertyType.BaseType == typeof(BaseEntity))
{
if (prop.PropertyType.GetProperties().FirstOrDefault(p => p.Name == visibleColumn.Key) != null)
{
//headerTable.AddCell(prop.Name);
headerTable.AddCell(visibleColumn.Value);
break;
}
}
}
}
}
doc.Add(headerTable);
var bodyTable = new PdfPTable(columns);
bodyTable.Complete = false;
bodyTable.WidthPercentage = 100f;
//PREPARE DATA
foreach (var lst in list)
{
int col = 1;
foreach (var visibleColumn in visibleColumns)
{
var currentProperty = currentType.GetProperties().FirstOrDefault(p => p.Name == visibleColumn.Key);
if (currentProperty != null)
{
if (currentProperty.GetValue(lst, null) != null)
bodyTable.AddCell(currentProperty.GetValue(lst, null).ToString());
else
bodyTable.AddCell(string.Empty);
col++;
}
else
{
//check child property objects
var childProperties = currentType.GetProperties().Where(p => p.PropertyType.BaseType == typeof(BaseEntity));
foreach (var prop in childProperties)
{
currentProperty = prop.PropertyType.GetProperties().FirstOrDefault(p => p.Name == visibleColumn.Key);
if (currentProperty != null)
{
var currentPropertyObjectValue = prop.GetValue(lst, null);
if (currentPropertyObjectValue != null)
{
bodyTable.AddCell(currentProperty.GetValue(currentPropertyObjectValue, null).ToString());
}
else
{
bodyTable.AddCell(string.Empty);
}
break;
}
}
}
}
}
doc.Add(bodyTable);
doc.Close();
A back-of-the-envelope computation of the memory requirements, given the numbers you provided, yields 100000 * 40 * (2*20+4) = 167 MB. That is well within your memory limit, but it is only a lower bound. I imagine each cell object is pretty big; if each cell carried a 512-byte overhead, you could well be looking at 2 GB taken. I reckon it might be even more, as PDF is a complex beast.
So you might realistically be looking at a situation where you are actually running out of memory - if not your computer's, then at least the portion the .NET runtime has set aside for its own use.
I would do one thing first - check memory consumption like here. You might even do well to try with 10, 100, 1000, 10000, and 100000 rows and see up to what number of rows the program still works.
You could perhaps try a different thing altogether. If you're trying to print a nicely formatted table with a lot of data, perhaps you could output an HTML document, which can be done incrementally and which you can do by just writing stuff to a file, rather than using a third party library. You can then "print" that HTML document to PDF. StackOverflow to the rescue again with this problem.
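One more thing worth trying before switching formats: your code already sets bodyTable.Complete = false, which turns on iTextSharp's "large table" mode, but it only adds the table to the document once, at the very end, so every row is still buffered in memory. A rough sketch of the intended incremental usage (the 1000-row flush interval is an arbitrary choice):
int rowsBuffered = 0;
foreach (var lst in list)
{
    // ... the same AddCell calls as in the question ...
    if (++rowsBuffered % 1000 == 0)
        doc.Add(bodyTable);   // writes the buffered rows out and frees them
}
bodyTable.Complete = true;    // mark the table finished
doc.Add(bodyTable);           // flush the remaining rows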
I am having issues with my application. I have a db table for a print queue. When I read from that table in a loop, once I add a record to the view model I then want to delete it from the database... this would be the most efficient way to do it, but EF barks:
An entity object cannot be referenced by multiple instances of IEntityChangeTracker.
I've tried using multiple contexts... but that didn't seem to work either. I've seen articles like Rick Strahl's, but frankly it was above my level of understanding, and I'm not sure it addresses my issue; it seemed quite an in-depth solution for something as simple as this.
Is there a simple way to accomplish what I am trying to achieve here?
Here is my code:
public List<InventoryContainerLabelViewModel> CreateLabelsViewModel(int intFacilityId)
{
var printqRep = new Repository<InventoryContainerPrintQueue>(new InventoryMgmtContext());
var printqRepDelete = new Repository<InventoryContainerPrintQueue>(new InventoryMgmtContext());
IQueryable<InventoryContainerPrintQueue> labels =
printqRep.SearchFor(x => x.FacilityId == intFacilityId);
List<InventoryContainerLabelViewModel> labelsViewModel = new List<InventoryContainerLabelViewModel>();
if (labels.Count() > 0)
{
//Get printq record
foreach (InventoryContainerPrintQueue label in labels)
{
IEnumerable<InventoryContainerDetail> icDtls =
label.InventoryContainerHeader.InventoryContainerDetails;
//Get print details
foreach (InventoryContainerDetail icDtl in icDtls)
{
labelsViewModel.Add(new InventoryContainerLabelViewModel()
{
...
populate view model here
}
);//Add label to view model
} //for each IC detail
//Delete the printq record
printqRepDelete.Delete(label); <======== Error Here
} //foreach label loop
}//label count > 0
return labelsViewModel.ToList();
}
In the end, I added a status column to the printq table, updated it to processed in the loop, and then called a separate method to delete the processed records.
public List<InventoryContainerLabelViewModel> CreateLabelsViewModel(int intFacilityId)
{
InventoryMgmtContext dbContext = new InventoryMgmtContext();
var printqRep = new Repository<InventoryContainerPrintQueue>(dbContext);
IEnumerable<InventoryContainerPrintQueue> unprintedPrtqRecs =
printqRep.SearchFor(x => x.FacilityId == intFacilityId && x.Printed == false);
List<InventoryContainerLabelViewModel> labelsViewModel = new List<InventoryContainerLabelViewModel>();
if (unprintedPrtqRecs.Count() > 0)
{
//Get printq record
foreach (InventoryContainerPrintQueue unprintedPrtqRec in unprintedPrtqRecs)
{
IEnumerable<InventoryContainerDetail> icDtls =
unprintedPrtqRec.InventoryContainerHeader.InventoryContainerDetails;
//Get container details to print
foreach (InventoryContainerDetail icDtl in icDtls)
{
labelsViewModel.Add(new InventoryContainerLabelViewModel()
{
...
}
);//Get IC details and create view model
} //for each IC detail
unprintedPrtqRec.Printed = true;
printqRep.Update(unprintedPrtqRec, unprintedPrtqRec, false);
} //foreach label loop
//Commit updated to Printed status to db
dbContext.SaveChanges();
}//label count > 0
return labelsViewModel;
}
public ActionConfirmation<int> DeletePrintQRecs(int intFacilityId)
{
InventoryMgmtContext dbContext = new InventoryMgmtContext();
var printqRep = new Repository<InventoryContainerPrintQueue>(dbContext);
IEnumerable<InventoryContainerPrintQueue> printedPrtqRecs =
printqRep.SearchFor(x => x.FacilityId == intFacilityId && x.Printed == true);
foreach (InventoryContainerPrintQueue printedPrtqRec in printedPrtqRecs)
{
//Delete the printq record
printqRep.Delete(printedPrtqRec, false);
}
//Save Changes on all deletes
ActionConfirmation<int> result;
try
{
dbContext.SaveChanges();
result = ActionConfirmation<int>.CreateSuccessConfirmation(
"All Label Print Q records deleted successfully.",
1);
}
catch (Exception ex)
{
result = ActionConfirmation<int>.CreateFailureConfirmation(
string.Format("An error occurred attempting to {0}. The error was: {1}.",
"delete Label Print Q records",
ex.Message),
1
);
}
return result;
}
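For anyone who hits the same error: it generally means one context is already tracking the entity while you hand it to a second one. A minimal single-context sketch, assuming your Repository simply wraps whatever context it is given and that Delete defers saving when passed false:
using (var dbContext = new InventoryMgmtContext())
{
    var printqRep = new Repository<InventoryContainerPrintQueue>(dbContext);
    // ToList() closes the data reader before any deletes are issued.
    var labels = printqRep.SearchFor(x => x.FacilityId == intFacilityId).ToList();
    foreach (var label in labels)
    {
        // ... build the view model as above ...
        printqRep.Delete(label, false);   // same context that loaded the entity
    }
    dbContext.SaveChanges();
}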
I'm working with a legacy database while rebuilding the web application. I want to use Symfony 2.x, which of course has Doctrine as its ORM.
I have around 50 (MySQL) tables which have NO primary keys. When I try to generate models, Doctrine refuses and throws an exception: "No Primary Key on ... table".
Must I have primary keys on the tables to use Doctrine, or is there a way around it?
Any help would be great.
Thanks.
Doctrine requires every entity class to have an identifier/primary key.
Take a look at this page: http://www.doctrine-project.org/docs/orm/2.0/en/reference/basic-mapping.html#identifiers-primary-keys
But there is a way to generate mappings and entities from tables that do not have a primary key. A table with no primary key is unusual and bad database design, but such scenarios do exist in legacy databases.
Solution:
Note: All references below refer to Doctrine 2.0
1. Find the file DatabaseDriver.php (in Doctrine/ORM/Mapping/Driver/DatabaseDriver.php)
2. Find the method reverseEngineerMappingFromDatabase. Modify the code as stated below. The original code is:
private function reverseEngineerMappingFromDatabase()
{
if ($this->tables !== null) {
return;
}
$tables = array();
foreach ($this->_sm->listTableNames() as $tableName) {
$tables[$tableName] = $this->_sm->listTableDetails($tableName);
}
$this->tables = $this->manyToManyTables = $this->classToTableNames = array();
foreach ($tables as $tableName => $table) {
/* @var $table \Doctrine\DBAL\Schema\Table */
if ($this->_sm->getDatabasePlatform()->supportsForeignKeyConstraints()) {
$foreignKeys = $table->getForeignKeys();
} else {
$foreignKeys = array();
}
$allForeignKeyColumns = array();
foreach ($foreignKeys as $foreignKey) {
$allForeignKeyColumns = array_merge($allForeignKeyColumns, $foreignKey->getLocalColumns());
}
if ( ! $table->hasPrimaryKey()) {
throw new MappingException(
"Table " . $table->getName() . " has no primary key. Doctrine does not ".
"support reverse engineering from tables that don't have a primary key."
);
}
$pkColumns = $table->getPrimaryKey()->getColumns();
sort($pkColumns);
sort($allForeignKeyColumns);
if ($pkColumns == $allForeignKeyColumns && count($foreignKeys) == 2) {
$this->manyToManyTables[$tableName] = $table;
} else {
// lower-casing is necessary because of Oracle Uppercase Tablenames,
// assumption is lower-case + underscore separated.
$className = $this->getClassNameForTable($tableName);
$this->tables[$tableName] = $table;
$this->classToTableNames[$className] = $tableName;
}
}
}
The modified code is:
private function reverseEngineerMappingFromDatabase()
{
if ($this->tables !== null) {
return;
}
$tables = array();
foreach ($this->_sm->listTableNames() as $tableName) {
$tables[$tableName] = $this->_sm->listTableDetails($tableName);
}
$this->tables = $this->manyToManyTables = $this->classToTableNames = array();
foreach ($tables as $tableName => $table) {
/* @var $table \Doctrine\DBAL\Schema\Table */
if ($this->_sm->getDatabasePlatform()->supportsForeignKeyConstraints()) {
$foreignKeys = $table->getForeignKeys();
} else {
$foreignKeys = array();
}
$allForeignKeyColumns = array();
foreach ($foreignKeys as $foreignKey) {
$allForeignKeyColumns = array_merge($allForeignKeyColumns, $foreignKey->getLocalColumns());
}
$pkColumns=array();
if ($table->hasPrimaryKey()) {
$pkColumns = $table->getPrimaryKey()->getColumns();
sort($pkColumns);
}
sort($allForeignKeyColumns);
if ($pkColumns == $allForeignKeyColumns && count($foreignKeys) == 2) {
$this->manyToManyTables[$tableName] = $table;
} else {
// lower-casing is necessary because of Oracle Uppercase Tablenames,
// assumption is lower-case + underscore separated.
$className = $this->getClassNameForTable($tableName);
$this->tables[$tableName] = $table;
$this->classToTableNames[$className] = $tableName;
}
}
}
3. Find the method loadMetadataForClass in the same file. Modify the code as stated below.
Find the code stated below:
try {
$primaryKeyColumns = $this->tables[$tableName]->getPrimaryKey()->getColumns();
} catch(SchemaException $e) {
$primaryKeyColumns = array();
}
Modify it like this:
try {
$primaryKeyColumns = ($this->tables[$tableName]->hasPrimaryKey())?$this->tables[$tableName]->getPrimaryKey()->getColumns():array();
} catch(SchemaException $e) {
$primaryKeyColumns = array();
}
The above solution creates mappings (xml/yml/annotation) even for tables that don't have a primary key.
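With those modifications in place, the usual Symfony2 import workflow should run without the "No Primary Key" exception, along the lines of (bundle name and mapping format are placeholders):
php app/console doctrine:mapping:import "MyBundle" xml
php app/console doctrine:generate:entities "MyBundle"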
I am working in ASP.Net MVC 1.0 and SQL Server 2000 using Entity Framework.
My faulty controller code is given below:
int checkUser = id ?? 0;
string userNameFromNew, userNameToNew;
if (checkUser == 1)
{
userNameFromNew = "U" + Request.Form["usernameFrom"];
userNameToNew = "U" + Request.Form["usernameTo"];
}
else
{
userNameFromNew = "C" + Request.Form["usernameFrom"];
userNameToNew = "C" + Request.Form["usernameTo"];
}
var rMatrix = from Datas in repository.GetTotalRightData()
where Datas.UserName == userNameFromNew.Trim()
select Datas;
Right_Matrix RM = new Right_Matrix();
foreach(var Data in rMatrix)
{
RM.Column_Id = Data.Column_Id;
RM.ProMonSys_Right = Data.ProMonSys_Right;
RM.UserName = userNameToNew;
UpdateModel(RM);
this.repository.AddRightTransfer(RM);
}
return RedirectToAction("RightTransfer");
My faulty model code is given below:
public void AddRightTransfer(Right_Matrix RM)
{
context.AddObject("Right_Matrix", RM);
context.SaveChanges();
}
Once execution reaches the model code, it throws an error stating that the DataReader is already open and I need to close it first.
Please suggest a workaround.
Try moving the AddRightTransfer call out of the LINQ foreach and into a separate loop. I think the add is being executed while the database result set from the first query is still open. By moving the call to AddRightTransfer into a second foreach, you should avoid the problem, if that's what is happening.
Here's an example:
List<Right_Matrix> matrixes = new List<Right_Matrix>();
foreach (var Data in rMatrix)
{
Right_Matrix rm = new Right_Matrix();
rm.Column_Id = Data.Column_Id;
rm.ProMonSys_Right = Data.ProMonSys_Right;
rm.UserName = userNameToNew;
UpdateModel(rm);
matrixes.Add(rm);
}
foreach (var rm in matrixes)
{
this.repository.AddRightTransfer(rm);
}
The problem is that you are modifying the database while still iterating over it. Either finish the query before modifying, like this:
foreach(var Data in rMatrix.ToArray())
Or don't modify the database in your loop, like this:
foreach(var Data in rMatrix)
{
RM.Column_Id = Data.Column_Id;
RM.ProMonSys_Right = Data.ProMonSys_Right;
RM.UserName = userNameToNew;
UpdateModel(RM);
context.AddObject("Right_Matrix", RM);
}
context.SaveChanges();
Obviously, those two context calls would have to be made into methods on your repository, but you get the picture.
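Something along these lines, mirroring the AddRightTransfer method from the question (method names are illustrative):
public void AddRightMatrix(Right_Matrix rm)
{
    // Queue the insert; nothing is sent to the database yet.
    context.AddObject("Right_Matrix", rm);
}
public void Save()
{
    // One round trip for all queued inserts, after the loop completes.
    context.SaveChanges();
}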