EF4 Strip table prefixes on import

I am attempting to have table names automatically renamed to strip off a leading prefix in EF4. I know it can be done in the GUI; however, my company creates the DB schema in Visio and uses that to generate the DB creation script in SQL. We do this often, and sometimes have a lot of tables, so using the GUI is not an ideal solution.
Is there a way to modify the properties in the .edmx file to strip a defined prefix from the DB table names, so the entity classes are named the way we want?

The simplest solution we came up with was to make a small console application that does the low-level work of renaming everything in the EDMX file. The application can be added to the Tools menu of Visual Studio (Add External Tool) with 'arguments' = $(ItemPath), 'initial directory' = $(ItemDir), 'prompt for arguments' = true and 'use output window' = true, and can then be executed with the EDMX file selected. In our case, we needed to strip underscores and transform the names to camel case (interpreting the underscore as a word separator), in addition to stripping the prefix.
I'll omit the rest of the code (the Transform.Standard method, error handling and the stripping of additional parameters) because it's too badly written to share and it's too late at night for refactoring :P. It's easy enough to interpret the arguments after the first as strings to be stripped from the names; the main code below only covers the modifications needed on the EDMX file. If you use this as an External Tool from VS, you can specify those extra strings after $(ItemPath) in 'arguments'.
Please note this wasn't tested extensively, and there may be additional information in other EDMX files that we failed to account for (though I doubt it). Also, I got part of the code from somewhere else on the net but failed to write down where exactly, so sorry - credit should have been given accordingly. Naturally this would have been much better as an extension for VS2010, but I simply did not have the time to do it; if you build one, link it somewhere and I'll use that instead ;)
if (_args.Count < 1) return;
string file = _args.First();
if (!File.Exists(file))
{
wait("Could not find specified file.");
return;
}
if (Path.GetExtension(file) != ".edmx")
{
wait("This works only on EDMX files.");
return;
}
//processing:
Console.WriteLine("Creating backup: " + Path.ChangeExtension(file, ".bak"));
File.Copy(file, Path.ChangeExtension(file, ".bak"), true);
Console.WriteLine("Reading target document...");
XDocument xdoc = XDocument.Load(file);
const string CSDLNamespace = "http://schemas.microsoft.com/ado/2008/09/edm";
const string MSLNamespace = "http://schemas.microsoft.com/ado/2008/09/mapping/cs";
const string DiagramNamespace = "http://schemas.microsoft.com/ado/2008/10/edmx";
XElement csdl = xdoc.Descendants(XName.Get("Schema", CSDLNamespace)).First();
XElement msl = xdoc.Descendants(XName.Get("Mapping", MSLNamespace)).First();
XElement designerDiagram = xdoc.Descendants(XName.Get("Diagram", DiagramNamespace)).First();
//modifications for renaming everything, not just table names
//(the array below just documents the CSDL paths touched by the code; it is not otherwise used):
string[] CSDLpaths = new string[]
{
"EntityContainer/EntitySet.Name",
"EntityContainer/EntitySet.EntityType",
"EntityContainer/AssociationSet/End.EntitySet",
"EntityType.Name",
"EntityType/Key/PropertyRef/Name",
"EntityType/Property.Name",
"EntityType/NavigationProperty.Name",
"Association/End.Type",
"Association//PropertyRef.Name",
};
#region CSDL2
Console.WriteLine("Modifying CSDL...");
Console.WriteLine(" - modifying entity sets...");
foreach (var entitySet in csdl.Element(XName.Get("EntityContainer", CSDLNamespace)).Elements(XName.Get("EntitySet", CSDLNamespace)))
{
entitySet.Attribute("Name").Value = Transform.Standard(entitySet.Attribute("Name").Value);
entitySet.Attribute("EntityType").Value = Transform.Standard(entitySet.Attribute("EntityType").Value);
}
Console.WriteLine(" - modifying association sets...");
foreach (var associationSet in csdl.Element(XName.Get("EntityContainer", CSDLNamespace)).Elements(XName.Get("AssociationSet", CSDLNamespace)))
{
foreach (var end in associationSet.Elements(XName.Get("End", CSDLNamespace)))
{
end.Attribute("EntitySet").Value = Transform.Standard(end.Attribute("EntitySet").Value);
}
}
Console.WriteLine(" - modifying entity types...");
foreach (var entityType in csdl.Elements(XName.Get("EntityType", CSDLNamespace)))
{
entityType.Attribute("Name").Value = Transform.Standard(entityType.Attribute("Name").Value);
foreach (var key in entityType.Elements(XName.Get("Key", CSDLNamespace)))
{
foreach (var propertyRef in key.Elements(XName.Get("PropertyRef", CSDLNamespace)))
{
propertyRef.Attribute("Name").Value = Transform.Standard(propertyRef.Attribute("Name").Value);
}
}
foreach (var property in entityType.Elements(XName.Get("Property", CSDLNamespace)))
{
property.Attribute("Name").Value = Transform.Standard(property.Attribute("Name").Value);
}
foreach (var navigationProperty in entityType.Elements(XName.Get("NavigationProperty", CSDLNamespace)))
{
navigationProperty.Attribute("Name").Value = Transform.Standard(navigationProperty.Attribute("Name").Value);
}
}
Console.WriteLine(" - modifying associations...");
foreach (var association in csdl.Elements(XName.Get("Association", CSDLNamespace)))
{
foreach (var end in association.Elements(XName.Get("End", CSDLNamespace)))
{
end.Attribute("Type").Value = Transform.Standard(end.Attribute("Type").Value);
}
foreach (var propref in association.Descendants(XName.Get("PropertyRef", CSDLNamespace)))
{
//propertyrefs are contained in constraints
propref.Attribute("Name").Value = Transform.Standard(propref.Attribute("Name").Value);
}
}
#endregion
#region MSL2
Console.WriteLine("Modifying MSL...");
Console.WriteLine(" - modifying entity set mappings...");
foreach (var entitySetMapping in msl.Element(XName.Get("EntityContainerMapping", MSLNamespace)).Elements(XName.Get("EntitySetMapping", MSLNamespace)))
{
entitySetMapping.Attribute("Name").Value = Transform.Standard(entitySetMapping.Attribute("Name").Value);
foreach (var entityTypeMapping in entitySetMapping.Elements(XName.Get("EntityTypeMapping", MSLNamespace)))
{
entityTypeMapping.Attribute("TypeName").Value = Transform.Standard(entityTypeMapping.Attribute("TypeName").Value);
foreach (var scalarProperty in entityTypeMapping.Element(XName.Get("MappingFragment", MSLNamespace)).Elements(XName.Get("ScalarProperty", MSLNamespace)))
{
scalarProperty.Attribute("Name").Value = Transform.Standard(scalarProperty.Attribute("Name").Value);
}
}
}
Console.WriteLine(" - modifying association set mappings...");
foreach (var associationSetMapping in msl.Element(XName.Get("EntityContainerMapping", MSLNamespace)).Elements(XName.Get("AssociationSetMapping", MSLNamespace)))
{
foreach (var endProperty in associationSetMapping.Elements(XName.Get("EndProperty", MSLNamespace)))
{
foreach (var scalarProperty in endProperty.Elements(XName.Get("ScalarProperty", MSLNamespace)))
{
scalarProperty.Attribute("Name").Value = Transform.Standard(scalarProperty.Attribute("Name").Value);
}
}
}
#endregion
#region Designer
Console.WriteLine("Modifying designer content...");
foreach (var item in designerDiagram.Elements(XName.Get("EntityTypeShape", DiagramNamespace)))
{
item.Attribute("EntityType").Value = Transform.Standard(item.Attribute("EntityType").Value);
}
#endregion
Console.WriteLine("Writing result...");
using (XmlTextWriter writer = new XmlTextWriter(file, Encoding.Default))
{
writer.Formatting = Formatting.Indented;
xdoc.WriteTo(writer);
}
Edit: here is the Transform class used by the code above. Please note that this works for Entity Framework 4.0; later versions might have a slightly different EDMX structure (I'm not sure), so the code might need to be modified to account for that.
public class Transform
{
public static string Standard(string initial, IEnumerable<string> eliminations = null)
{
eliminations = eliminations ?? Enumerable.Empty<string>(); //guard: Except() below would throw on null
Regex re = new Regex(@"(\w+)(\W*?)$", RegexOptions.Compiled);
Regex camelSplit = new Regex(@"(?<!^)(?=[A-Z])", RegexOptions.Compiled);
return re.Replace(initial, new MatchEvaluator((Match m) =>
{
string name = m.Groups[1].Value;
var parts = name.Split('_').AsEnumerable();
if (parts.Count() == 1 && IsMixedCase(name))
{
string result = string.Concat(camelSplit.Split(name).Except(eliminations, StringComparer.CurrentCultureIgnoreCase));
return result + m.Groups[2];
}
else
{
parts = parts.Select(s => CultureInfo.CurrentCulture.TextInfo.ToTitleCase(s.ToLower()));
parts = parts.Except(eliminations, StringComparer.CurrentCultureIgnoreCase);
return string.Concat(parts) + m.Groups[2];
}
}));
}
public static bool IsMixedCase(string name)
{
int lower = 0, total = 0;
for (int i = 0; i < name.Length; i++)
{
if (char.IsLower(name, i)) lower++;
if (char.IsLetter(name, i)) total++;
}
return lower != 0 && lower != total;
}
}
Now you could write a little more code in the Main method to take the arguments (except the first one, which is the name of the file) and pass them on to the eliminations parameter; these would be the strings that need to be erased from the names - in my case, the prefixes. You can then specify these strings from the Visual Studio external tools interface when invoking the tool, or just hardcode them if you don't care that much about reusing the tool :)
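For illustration, a minimal sketch of what that Main wrapper could look like (the Transform.Standard calls and the _args field are the ones used above; everything else here is just an assumption about how you might wire it up):

static List<string> _args;

static void Main(string[] args)
{
    _args = args.ToList();
    // first argument: path to the .edmx file; the rest: strings (prefixes) to eliminate
    var eliminations = _args.Skip(1).ToList();
    // ...then run the processing code shown above, passing 'eliminations' into each call, e.g.:
    // entitySet.Attribute("Name").Value = Transform.Standard(entitySet.Attribute("Name").Value, eliminations);
}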

I do not know about any built-in feature for stripping table prefixes, but you can open the .edmx file as XML (use Open With in VS) and use a simple find-and-replace to replace the prefix with nothing.
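If you want to script that find-and-replace instead of doing it by hand, a minimal sketch (the path and prefix are placeholders; note this blindly replaces the prefix everywhere it occurs in the file):

string path = @"C:\Project\Model.edmx"; // placeholder path
string prefix = "tbl_";                 // placeholder prefix to strip
File.WriteAllText(path, File.ReadAllText(path).Replace(prefix, string.Empty));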

MVC Full Calendar Error

I am trying to do a simple JSON return, but I am having issues. I have the following below:
public JsonResult GetEventData()
{
var data = Event.Find(x => x.ID != 0);
return Json(data);
}
I get an HTTP 500 with the exception shown in the title of this question. I also tried:
var data = Event.All().ToList()
That gave the same problem.
Is this a bug or my implementation?
It seems that there are circular references in your object hierarchy, which is not supported by the JSON serializer. Do you need all the columns? You could pick only the properties you need in the view:
return Json(new
{
PropertyINeed1 = data.PropertyINeed1,
PropertyINeed2 = data.PropertyINeed2
});
This will make your JSON object lighter and easier to understand. If you have many properties, AutoMapper could be used to automatically map between DTO objects and View objects.
I had the same problem and solved it by using Newtonsoft.Json:
var list = JsonConvert.SerializeObject(model,
Formatting.None,
new JsonSerializerSettings() {
ReferenceLoopHandling = Newtonsoft.Json.ReferenceLoopHandling.Ignore
});
return Content(list, "application/json");
This actually happens because complex objects are what make the resulting JSON object fail.
And it fails because when the object is mapped, it maps its children, which map their parents, causing a circular reference. JSON would take infinite time to serialize it, so the serializer prevents the problem with the exception.
Entity Framework mapping also produces the same behavior, and the solution is to discard all unwanted properties.
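As an illustration only (these classes are made up, not taken from the question), a typical EF pair that produces such a loop looks like this:

public class Category
{
    public int Id { get; set; }
    public string Name { get; set; }
    // serializing a Category walks into its Products...
    public virtual ICollection<Product> Products { get; set; }
}

public class Product
{
    public int Id { get; set; }
    public int CategoryId { get; set; }
    // ...and each Product walks back into its Category, and so on forever
    public virtual Category Category { get; set; }
}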
To make the final answer explicit, the whole code would be:
public JsonResult getJson()
{
DataContext db = new DataContext ();
return this.Json(
new {
Result = (from obj in db.Things select new {Id = obj.Id, Name = obj.Name})
}
, JsonRequestBehavior.AllowGet
);
}
It could also be the following in case you don't want the objects inside a Result property:
public JsonResult getJson()
{
DataContext db = new DataContext ();
return this.Json(
(from obj in db.Things select new {Id = obj.Id, Name = obj.Name})
, JsonRequestBehavior.AllowGet
);
}
To sum things up, there are 4 solutions to this:
Solution 1: turn off proxy creation for the DbContext and restore it at the end.
private DBEntities db = new DBEntities();//dbcontext
public ActionResult Index()
{
bool proxyCreation = db.Configuration.ProxyCreationEnabled;
try
{
//set ProxyCreation to false
db.Configuration.ProxyCreationEnabled = false;
var data = db.Products.ToList();
return Json(data, JsonRequestBehavior.AllowGet);
}
catch (Exception ex)
{
Response.StatusCode = (int)HttpStatusCode.BadRequest;
return Json(ex.Message);
}
finally
{
//restore ProxyCreation to its original state
db.Configuration.ProxyCreationEnabled = proxyCreation;
}
}
Solution 2: use JsonConvert, setting ReferenceLoopHandling to Ignore in the serializer settings.
//using Newtonsoft.Json;
private DBEntities db = new DBEntities();//dbcontext
public ActionResult Index()
{
try
{
var data = db.Products.ToList();
JsonSerializerSettings jss = new JsonSerializerSettings { ReferenceLoopHandling = ReferenceLoopHandling.Ignore };
var result = JsonConvert.SerializeObject(data, Formatting.Indented, jss);
return Json(result, JsonRequestBehavior.AllowGet);
}
catch (Exception ex)
{
Response.StatusCode = (int)HttpStatusCode.BadRequest;
return Json(ex.Message);
}
}
The following two solutions are the same, but using a model is better because it's strongly typed.
Solution 3: return a Model which includes the needed properties only.
private DBEntities db = new DBEntities();//dbcontext
public class ProductModel
{
public int Product_ID { get; set;}
public string Product_Name { get; set;}
public double Product_Price { get; set;}
}
public ActionResult Index()
{
try
{
var data = db.Products.Select(p => new ProductModel
{
Product_ID = p.Product_ID,
Product_Name = p.Product_Name,
Product_Price = p.Product_Price
}).ToList();
return Json(data, JsonRequestBehavior.AllowGet);
}
catch (Exception ex)
{
Response.StatusCode = (int)HttpStatusCode.BadRequest;
return Json(ex.Message);
}
}
Solution 4: return a new dynamic object which includes the needed properties only.
private DBEntities db = new DBEntities();//dbcontext
public ActionResult Index()
{
try
{
var data = db.Products.Select(p => new
{
Product_ID = p.Product_ID,
Product_Name = p.Product_Name,
Product_Price = p.Product_Price
}).ToList();
return Json(data, JsonRequestBehavior.AllowGet);
}
catch (Exception ex)
{
Response.StatusCode = (int)HttpStatusCode.BadRequest;
return Json(ex.Message);
}
}
JSON, like XML and various other formats, is a tree-based serialization format. It won't love you if you have circular references in your objects, as the "tree" would be:
root B => child A => parent B => child A => parent B => ...
There are often ways of disabling navigation along a certain path; for example, with XmlSerializer you might mark the parent property with XmlIgnore. I don't know if this is possible with the JSON serializer in question, nor whether DatabaseColumn has suitable markers (very unlikely, as it would need to reference every serialization API).
Add [JsonIgnore] to the virtual properties in your model.
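For example, a sketch with a hypothetical model (the [JsonIgnore] attribute comes from Newtonsoft.Json; if you serialize with the built-in JavaScriptSerializer, [ScriptIgnore] plays the same role):

public class Product
{
    public int Id { get; set; }
    public string Name { get; set; }

    [JsonIgnore] // excluded from serialization, so the back-reference can no longer cause a loop
    public virtual Category Category { get; set; }
}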
Using Newtonsoft.Json: In your Global.asax Application_Start method add this line:
GlobalConfiguration.Configuration.Formatters.JsonFormatter.SerializerSettings.ReferenceLoopHandling = Newtonsoft.Json.ReferenceLoopHandling.Ignore;
It's because of the new DbContext T4 template that is used for generating the Entity Framework entities. In order to perform change tracking, this template uses the proxy pattern, wrapping your nice POCOs with proxies. This then causes the issues when serializing with the JavaScriptSerializer.
So the 2 solutions are:
Either serialize and return only the properties you need on the client
Or switch off the automatic generation of proxies by setting it on the context's configuration:
context.Configuration.ProxyCreationEnabled = false;
This is very well explained in the article below:
http://juristr.com/blog/2011/08/javascriptserializer-circular-reference/
The provided answers are good, but I think they can be improved by adding an "architectural" perspective.
Investigation
MVC's Controller.Json function does the job, but it is very poor at providing a relevant error in this case. By using Newtonsoft.Json.JsonConvert.SerializeObject, the error specifies exactly which property is triggering the circular reference. This is particularly useful when serializing more complex object hierarchies.
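For instance, a quick diagnostic sketch (data stands for whatever object graph you are trying to return):

try
{
    var json = Newtonsoft.Json.JsonConvert.SerializeObject(data);
}
catch (Newtonsoft.Json.JsonSerializationException ex)
{
    // Json.NET reports something like:
    // "Self referencing loop detected for property 'X' with type '...'"
    System.Diagnostics.Debug.WriteLine(ex.Message);
}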
Proper architecture
One should never try to serialize data models (e.g. EF models), as ORM navigation properties are the road to perdition when it comes to serialization. Data flow should be the following:
Database -> data models -> service models -> JSON string
Service models can be obtained from data models using auto-mappers (e.g. AutoMapper). While this does not guarantee the absence of circular references, proper design should take care of it: service models should contain exactly what the service consumer requires (i.e. only the needed properties). A minimal sketch follows after the list below.
In those rare cases, when the client requests a hierarchy involving the same object type on different levels, the service can create a linear structure with parent->child relationship (using just identifiers, not references).
Modern applications tend to avoid loading complex data structures at once and service models should be slim. E.g.:
access an event - only header data (identifier, name, date etc.) is loaded -> service model (JSON) containing only header data
manage the attendees list - open a popup and lazy load the list -> service model (JSON) containing only the list of attendees
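A minimal sketch of that flow, using hypothetical Event and EventHeaderModel types (manual mapping shown; AutoMapper could produce the same result):

public class EventHeaderModel
{
    public int Id { get; set; }
    public string Name { get; set; }
    public DateTime Date { get; set; }
}

public JsonResult GetEventHeader(int id)
{
    var e = db.Events.Find(id); // data model, navigation properties and all
    // the service model carries only what the client needs - no navigation properties, no loops
    var model = new EventHeaderModel { Id = e.Id, Name = e.Name, Date = e.Date };
    return Json(model, JsonRequestBehavior.AllowGet);
}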
Avoid converting the table object directly. If relations are set up with other tables, it might throw this error.
Instead, create a model class, assign values to an instance of that class and then serialize it.
I'm using this fix because I'm using Knockout in MVC5 views.
On the action:
return Json(ModelHelper.GetJsonModel<Core_User>(viewModel));
The function:
public static TEntity GetJsonModel<TEntity>(TEntity Entity) where TEntity : class
{
TEntity Entity_ = Activator.CreateInstance(typeof(TEntity)) as TEntity;
foreach (var item in Entity.GetType().GetProperties())
{
if (item.PropertyType.ToString().IndexOf("Generic.ICollection") == -1 && item.PropertyType.ToString().IndexOf("SaymenCore.DAL.") == -1)
item.SetValue(Entity_, Entity.GetPropValue(item.Name)); //GetPropValue: the author's own reflection helper (not shown)
}
return Entity_;
}
You can identify the properties that cause the circular reference, and then do something like:
private object DeCircular(object entity)
{
// set the properties that cause the circular reference to null
return entity;
}
//first: Create a class as your view model
public class EventViewModel
{
public int Id { get; set; }
public string Property1 { get; set; }
public string Property2 { get; set; }
}
//then from your method
[HttpGet]
public async Task<ActionResult> GetEvent()
{
var events = await db.Event.Where(x => x.ID != 0).ToListAsync(); //assumes db.Event is a DbSet<Event>
List<EventViewModel> model = events.Select(e => new EventViewModel(){
Id = e.Id,
Property1 = e.Property1,
Property2 = e.Property2
}).ToList();
return Json(new{ data = model }, JsonRequestBehavior.AllowGet);
}
An easier alternative to solve this problem is to return a string and format that string as JSON with JavaScriptSerializer.
public string GetEntityInJson()
{
JavaScriptSerializer j = new JavaScriptSerializer();
var entityList = dataContext.Entitites.Select(x => new { ID = x.ID, AnotherAttribute = x.AnotherAttribute });
return j.Serialize(entityList );
}
The "Select" part is important: it chooses the properties you want in your view. Some objects have a reference to their parent. If you do not choose the attributes and just take the tables as a whole, the circular reference may appear.
Do not do this:
public string GetEntityInJson()
{
JavaScriptSerializer j = new JavaScriptSerializer();
var entityList = dataContext.Entitites.toList();
return j.Serialize(entityList );
}
Do this instead if you don't want the whole table:
public string GetEntityInJson()
{
JavaScriptSerializer j = new JavaScriptSerializer();
var entityList = dataContext.Entitites.Select(x => new { ID = x.ID, AnotherAttribute = x.AnotherAttribute });
return j.Serialize(entityList );
}
This helps render a view with less data, just the attributes you need, and makes your web application run faster.

Bulk Inserting Excel Data to Database Using EF Code First

I use ASP.NET MVC with EF Code First. I upload the file to the server; all I need is to insert that Excel data into the Code First database. What is the best way to bulk insert Excel data into the database? I would appreciate your advice.
Thanks in advance.
You can use LinqToExcel to get data from the Excel file and map it to your entity class.
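A minimal sketch of the LinqToExcel route (the file path, worksheet name and Product entity are assumptions):

// using LinqToExcel;
var excel = new ExcelQueryFactory(@"C:\uploads\products.xlsx");
var rows = from p in excel.Worksheet<Product>("Sheet1")
           select p;
foreach (var p in rows)
{
    db.Products.Add(p); // db is your Code First DbContext
}
db.SaveChanges();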
If you are looking for alternative methods, these are some:
OLEDB
OPEN XML 2.0
Some third party DLL like ExcelDataReader or EPPlus
Using an ORM like Entity Framework is not efficient for bulk operations. To bulk insert efficiently, the SqlBulkCopy class should be used.
To insert a generic list it must first be converted to a DataTable:
public static DataTable ConvertToDataTable<T>(IList<T> list)
{
PropertyDescriptorCollection propertyDescriptorCollection = TypeDescriptor.GetProperties(typeof(T));
DataTable table = new DataTable();
for (int i = 0; i < propertyDescriptorCollection.Count; i++)
{
PropertyDescriptor propertyDescriptor = propertyDescriptorCollection[i];
Type propType = propertyDescriptor.PropertyType;
if (propType.IsGenericType && propType.GetGenericTypeDefinition() == typeof(Nullable<>))
{
table.Columns.Add(propertyDescriptor.Name, Nullable.GetUnderlyingType(propType));
}
else
{
table.Columns.Add(propertyDescriptor.Name, propType);
}
}
object[] values = new object[propertyDescriptorCollection.Count];
foreach (T listItem in list)
{
for (int i = 0; i < values.Length; i++)
{
values[i] = propertyDescriptorCollection[i].GetValue(listItem);
}
table.Rows.Add(values);
}
return table;
}
Then SqlBulkCopy can be used. In this example the User table is bulk inserted:
DataTable dt = ConvertToDataTable(users); // 'users' here stands for the IList<User> you want to insert
using (SqlConnection connection = new SqlConnection(connectionString))
{
connection.Open();
using (SqlBulkCopy sqlBulkCopy = new SqlBulkCopy(connection))
{
sqlBulkCopy.ColumnMappings.Add("UserID", "UserID");
sqlBulkCopy.ColumnMappings.Add("UserName", "UserName");
sqlBulkCopy.ColumnMappings.Add("Password", "Password");
sqlBulkCopy.DestinationTableName = "User";
sqlBulkCopy.WriteToServer(dt);
}
}

Best Way to Update only modified fields with Entity Framework

Currently I am doing it like this:
For Example:
public void Update(Person model)
{
// Here model is the model returned from the form on post
var oldobj = db.Person.Where(x => x.ID == model.ID).SingleOrDefault();
db.Entry(oldobj).CurrentValues.SetValues(model);
}
It works, but, for example,
I have 50 columns in my table and I displayed only 25 fields in my form (I need to partially update my table, with the remaining 25 columns retaining their old values).
I know it can be achieved by "mapping columns one by one" or by creating "hidden fields for those remaining 25 columns".
Just wondering, is there any elegant way to do this with less effort and optimal performance?
This is a very good question. I have found that as long as change tracking is enabled (it is by default unless you turn it off), Entity Framework will do a good job of applying to the database only what you ask it to change.
So if you only change 1 field against the object and then call SaveChanges(), EF will only update that 1 field when you call SaveChanges().
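For example, a quick sketch (Person and db are illustrative):

var person = db.People.Find(id); // tracked entity
person.Surname = "Smith";        // change a single property
db.SaveChanges();                // EF issues UPDATE ... SET [Surname] = @0 WHERE [Id] = @1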
The problem here is that when you map a view model into an entity object, all of the values get overwritten. Here is my way of handling this:
In this example, you have a single entity called Person:
Person
======
Id - int
FirstName - varchar
Surname - varchar
Dob - smalldatetime
Now let's say we want to create a view model which will only update Dob and leave all other fields exactly how they are. Here is how I do that.
First, create a view model:
public class PersonDobVm
{
public int Id { get; set; }
public DateTime Dob { get; set; }
public void MapToModel(Person p)
{
p.Dob = Dob;
}
}
Now write the code roughly as follows (you'll have to alter it to match your context name etc):
DataContext db = new DataContext();
Person p = db.People.FirstOrDefault();
// you would have this posted in, but we are creating it here just for illustration
var vm = new PersonDobVm
{
Id = p.Id, // the Id you want to update
Dob = new DateTime(2015, 1, 1) // the new DOB for that row
};
vm.MapToModel(p);
db.SaveChanges();
The MapToModel method could be even more complicated and do all kinds of additional checks before assigning the view model fields to the entity object.
Anyway, the result when SaveChanges is called is the following SQL:
exec sp_executesql N'UPDATE [dbo].[Person]
SET [Dob] = @0
WHERE ([Id] = @1)
',N'@0 datetime2(7),@1 int',@0='2015-01-01 00:00:00',@1=1
So you can clearly see, Entity Framework has not attempted to update any other fields - just the Dob field.
I know in your example you want to avoid coding each assignment by hand, but I think this is the best way. You tuck it all away in your VM so it does not litter your main code, and this way you can cater for specific needs (i.e. composite types in there, data validation, etc). The other option is to use an AutoMapper, but I do not think they are safe. If you use an AutoMapper and spelt "Dob" as "Doob" in your VM, it would not map "Doob" to "Dob", nor would it tell you about it! It would fail silently, the user would think everything was ok, but the change would not be saved.
Whereas if you spelt "Dob" as "Doob" in your VM, the compiler will alert you that the MapToModel() is referencing "Dob" but you only have a property in your VM called "Doob".
I hope this helps you.
I swear by EntityFramework.Extended (available on NuGet).
It lets you write:
db.Person
.Where(x => x.ID == model.ID)
.Update(p => new Person()
{
Name = newName,
EditCount = p.EditCount+1
});
Which is very clearly translated into SQL.
Please try this way:
public void Update(Person model)
{
// Here model is the model returned from the form on post
var oldobj = db.Person.Where(x => x.ID == model.ID).SingleOrDefault();
// Newly Inserted Code
var UpdatedObj = (Person) Entity.CheckUpdateObject(oldobj, model);
db.Entry(oldobj).CurrentValues.SetValues(UpdatedObj);
}
public static object CheckUpdateObject(object originalObj, object updateObj)
{
foreach (var property in updateObj.GetType().GetProperties())
{
if (property.GetValue(updateObj, null) == null)
{
property.SetValue(updateObj,originalObj.GetType().GetProperty(property.Name)
.GetValue(originalObj, null));
}
}
return updateObj;
}
I have solved my issue by using FormCollection to list the elements used in the form, and only changing those columns in the database.
I have provided my code sample below; great if it can help someone else.
// Here
// collection = FormCollection from Post
// model = View Model for Person
var result = db.Person.Where(x => x.ID == model.ID).SingleOrDefault();
if (result != null)
{
List<string> formcollist = new List<string>();
foreach (var key in collection.AllKeys)
{
// Here apply your filter code to remove system properties if any
formcollist.Add(key);
}
foreach (var prop in result.GetType().GetProperties())
{
if( formcollist.Contains(prop.Name))
{
prop.SetValue(result, model.GetType().GetProperty(prop.Name).GetValue(model, null));
}
}
db.SaveChanges();
}
I still didn't find a nice solution for my problem, so I created a workaround. When loading the entity, I immediately make a copy of it and name it entityInit. When saving the entity, I compare the two to see what really was changed. All the unchanged properties I set to unchanged and fill them with the database values. This was necessary for my entities without tracking:
// load entity without tracking
var entityWithoutTracking = Context.Person.AsNoTracking().FirstOrDefault(x => x.ID == _entity.ID);
var entityInit = CopyEntity(entityWithoutTracking);
// do business logic and change entity
entityWithoutTracking.surname = newValue;
// for saving, find entity in context
var entity = Context.Person.FirstOrDefault(x => x.ID == _entity.ID);
var entry = Context.Entry(entity);
entry.CurrentValues.SetValues(entityWithoutTracking);
entry.State = EntityState.Modified;
// get List of all changed properties (in my case these are all existing properties, including those which shouldn't have changed)
var changedPropertiesList = entry.CurrentValues.PropertyNames.Where(x => entry.Property(x).IsModified).ToList();
foreach (var checkProperty in changedPropertiesList)
{
try
{
var p1 = entityWithoutTracking.GetType().GetProperty(checkProperty).GetValue(entityWithoutTracking);
var p2 = entityInit.GetType().GetProperty(checkProperty).GetValue(entityInit);
if ((p1 == null && p2 == null) || (p1 != null && p1.Equals(p2)))
{
entry.Property(checkProperty).CurrentValue = entry.Property(checkProperty).OriginalValue; // restore DB-Value
entry.Property(checkProperty).IsModified = false; // throws Exception for Primary Keys
}
} catch(Exception) { }
}
Context.SaveChanges(); // only surname will be updated
This is the way I did it, assuming the new object has more columns to update than the ones we want to keep.
if (theClass.ClassId == 0)
{
theClass.CreatedOn = DateTime.Now;
context.theClasses.Add(theClass);
}
else {
var currentClass = context.theClasses.Where(c => c.ClassId == theClass.ClassId)
.Select(c => new TheClasses {
CreatedOn = c.CreatedOn
// Add here others fields you want to keep as the original record
}).FirstOrDefault();
theClass.CreatedOn = currentClass.CreatedOn;
// The new class will replace the current, all fields
context.theClasses.Add(theClass);
context.Entry(theClass).State = EntityState.Modified;
}
context.SaveChanges();
In EF you can do it like this:
var result = db.Person.Where(x => x.ID == model.ID).FirstOrDefault();
if(result != null){
result.Name = newName;
result.DOB = newDOB;
db.Person.Update(result);
}
Or you can use
using (var db= new MyDbContext())
{
var result= db.Person.Where(x => x.ID == model.ID).FirstOrDefault();
result.Name= newName;
result.DOB = newDOB;
db.Update(result);
db.SaveChanges();
}
For more detail please see EntityFramework Core - Update Only One Field.
No worries guys, just write a raw SQL query (parameterized, to avoid SQL injection):
db.Database.ExecuteSqlCommand("UPDATE Person SET Name = {0} WHERE Id = {1}", _entity.Name, _entity.ID);

EF Code First, how to reflect on model

In EF code first, one specifies field properties and relationships using the fluent interface. This builds up a model. Is it possible to get a reference to this model, and reflect on it?
I want to be able to retrieve for a given field, if it is required, what its datatype is, what length, etc...
You need to access the MetadataWorkspace. The API is pretty cryptic. You may want to replace DataSpace.CSpace with DataSpace.SSpace to get the database metadata.
public class MyContext : DbContext
{
public void Test()
{
var objectContext = ((IObjectContextAdapter)this).ObjectContext;
var mdw = objectContext.MetadataWorkspace;
var items = mdw.GetItems<EntityType>(DataSpace.CSpace);
foreach (var i in items)
{
foreach (var member in i.Members)
{
var prop = member as EdmProperty;
if (prop != null)
{
//inspect the conceptual-model metadata here, e.g.:
//prop.Nullable - whether the field is required
//prop.TypeUsage.EdmType.Name - its data type
//prop.TypeUsage.Facets - facets such as MaxLength
}
}
}
}
}

LINQ to Entities StringConvert(double)' cannot be translated to convert int to string

Problem
Need to convert int to string using EF4 + SQL CE4. The recommended option of using SqlFunctions.StringConvert(double) still gives me errors.
Option 1 (original). This gives me an error:
public static IEnumerable<SelectListItem> xxGetCustomerList()
{
using (DatabaseEntities db = new DatabaseEntities())
{
var list = from l in db.Customers
orderby l.CompanyName
select new SelectListItem { Value = l.CustomerID.ToString(), Text = l.CompanyName };
return list.ToList();
}
}
Option 2 (most suggested). Then, as many posts suggest, I used the SqlFunctions.StringConvert() function from the System.Data.Objects.SqlClient namespace:
public static IEnumerable<SelectListItem> GetCustomerList()
{
using (DatabaseEntities db = new DatabaseEntities())
{
var list = from l in db.Customers
orderby l.CompanyName
select new SelectListItem { Value = SqlFunctions.StringConvert((double)l.CustomerID), Text = l.CompanyName };
return list.ToList();
}
}
Which now shows the error below:
The specified method 'System.String StringConvert(System.Nullable`1[System.Double])' on the type 'System.Data.Objects.SqlClient.SqlFunctions' cannot be translated into a LINQ to Entities store expression.
Option 3 (for a very specific case). Another post shows a smart solution using a Dictionary, which finally works:
public static IEnumerable<SelectListItem> xGetCustomerList()
{
using (DatabaseEntities db = new DatabaseEntities())
{
var customers = db.Customers.ToDictionary(k => k.CustomerID, k => k.CompanyName);
var list = from l in customers
orderby l.Value
select new SelectListItem { Value = l.Key.ToString(), Text = l.Value };
return list.ToList();
}
}
But that only works for simple (key, value) pairs. Can someone help me with another solution, or tell me what I'm doing wrong with option 2?
And I hope Microsoft will soon get EF right before pushing us to move from L2S, which is already stable and much more mature. I'm actually using EF4 just because I want to use SQL CE; otherwise I'd stay with L2S.
EF is database independent at the upper layers, but the part that converts a LINQ query to SQL is always database dependent. SqlFunctions is tied to the SQL Server provider, while you are using the SQL Server CE provider, which cannot translate functions from the SqlFunctions class.
By the way, the third option is also not a solution, because it will load the whole customer table into memory and use LINQ to Objects after that. You should use this:
public static IEnumerable<SelectListItem> xxGetCustomerList()
{
using (DatabaseEntities db = new DatabaseEntities())
{
// Linq to entities query
var query = from l in db.Customers
orderby l.CompanyName
select new { l.CustomerID, l.CompanyName };
// Result of linq to entities transformed by linq to objects
return query.AsEnumerable()
.Select(x => new SelectListItem
{
Value = x.CustomerID.ToString(),
Text = x.CompanyName
}).ToList();
}
}
Here is a simplified version I'm using now (specific for SQL CE):
public static IEnumerable<SelectListItem> GetBlogCategoryList()
{
using (SiteDataContext db = new SiteDataContext())
{
var list = from l in db.BlogCategories.AsEnumerable()
orderby l.CategoryName
select new SelectListItem { Value = l.CategoryID.ToString(), Text = l.CategoryName };
return list.ToList();
}
}
Note the db.BlogCategories.AsEnumerable() part, which switches the query to LINQ to Objects so that ToString() runs in memory.
