What's the best way to serialize multiple classes with XmlSerializer in one file? Ideally, what I want is one root node, with the XmlSerializer writing one child node under it per class.
Otherwise, my other idea is to just make a wrapper that contains these classes and serialize that.
Add the objects you want to serialize into a collection and then serialize the collection using the XmlSerializer.
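For example, here is a minimal sketch of that approach (Foo and Bar are placeholder classes): passing the concrete types as extraTypes lets a single XmlSerializer handle a mixed List<object>, producing one root element with one child element per object.

using System.Collections.Generic;
using System.IO;
using System.Xml.Serialization;

public class Foo { public string Name { get; set; } }
public class Bar { public int Count { get; set; } }

public static class Program
{
    public static void Main()
    {
        var items = new List<object> { new Foo { Name = "a" }, new Bar { Count = 1 } };

        // extraTypes tells the serializer about the concrete classes it may encounter.
        var serializer = new XmlSerializer(typeof(List<object>),
                                           new[] { typeof(Foo), typeof(Bar) });

        using (var writer = new StreamWriter("items.xml"))
        {
            // Writes a single root element with one child element per object.
            serializer.Serialize(writer, items);
        }
    }
}

The wrapper idea from the question works just as well, and has the advantage that [XmlRoot] and [XmlElement] attributes give you control over the root and child element names.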
I have a command object that I want to convert into a domain object.
However, the object I want to convert the command object into may be one of two domain classes (they're both derived classes), and I need to do it in a service (which is where, based on other data, I decide which type of object it should be bound to). Is this possible and what's the best way to do this? bindData() only exists in a controller.
Do I just have to manually map command object parameters to the appropriate domain object properties? Or is there a faster/better way?
If the parameters have the same names, then you can use the approach from this question to copy the values over. A quick summary follows.
Using the Grails API
You can cycle through the properties of an object by accessing its properties field, which gives you a map of property names to values.
object.properties.each { key, value ->
    // Do something with each property
}
You can then check whether the property exists on the other object, skipping the synthetic class and metaClass entries.
if (otherObject.hasProperty(key) && !(key in ['class', 'metaClass']))
Then you can copy it from one object to the other, e.g. otherObject[key] = value.
Using Spring
Spring has a really good utility class called BeanUtils that provides a generic copy method, so you can do it with a simple one-liner.
BeanUtils.copyProperties(object, otherObject);
That will copy values over where the name is the same. You can check out the docs here.
Otherwise...
If there is no name mapping between them, then you're kind of stuck: the framework has no way of knowing which properties correspond, so you'll need to do it manually.
As a complete novice programmer I am trying to populate my neo4j DB with data from heterogeneous sources. For this I am trying to use the Neo4jClient C# API. The heterogeneity of my data comes from a custom, continuously evolving DSL/DSML/metamodel that defines the possible types of elements, i.e. models, thus creating classes for each type would not be ideal.
As I understand, my options are the following:
1. Have a predefined class for each type of element: this way I can easily serialize my objects, that is, if all properties are primitive types or arrays/lists.
2. Have a base class (with a Dictionary to hold properties) that I use as an interface between the models I'm trying to serialize and neo4j. I've seen an example of this at Can Neo4j store a dictionary in a node?, but I don't understand how to use the converter (defined in the answer) to add a node. Also, I don't see how an int-keyed dictionary would allow me to store key-value pairs where the keys (which are strings) would translate to property names in neo4j.
3. Generate a custom query dynamically, as seen at https://github.com/Readify/Neo4jClient/wiki/cypher#manual-queries-highly-discouraged. This is not recommended and possibly not performant.
Ultimately, what I would like to achieve is to avoid the need to define a separate class for every type of element that I have, but still be able to add properties that are defined by types in my metamodel.
I would also be interested in somehow influencing the serializer to ignore non-compatible properties (similarly to XmlIgnore), so that I would not need to create a separate class for each class that has more than just primitive types.
Thanks,
J
There are 2 problems you're trying to solve - the first is how to program the C# part of this, the second is how to store the solution to the first problem.
At some point you'll need to access this data in your C# code; unless you're going fully dynamic, you'll need some sort of class structure.
Taking your 3 options:
Please have a look at this question: neo4jclient heterogenous data return, which I think covers this scenario.
In that answer, the converter does the work for you; you create, delete, etc. as before, and the converter just handles the IDictionary instance in that case. The IDictionary<int, string> in the answer is only an example: you can use whatever you want, including IDictionary<string, string>. In fact, in that example all you'd need to do is change the IntString property to an IDictionary<string, string> and it should just work.
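For illustration, here's a minimal sketch of adding a node whose properties come straight from a string-keyed dictionary, assuming an already connected GraphClient named client (the Model label and the values are made up); the dictionary keys become the property names on the node:

using System.Collections.Generic;
using Neo4jClient;

var props = new Dictionary<string, string>
{
    { "name", "example" },   // the key "name" becomes the node property n.name
    { "kind", "Widget" }
};

client.Cypher
    .Create("(n:Model {props})")   // {props} is a Cypher parameter holding the map
    .WithParam("props", props)
    .ExecuteWithoutResults();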
Even if you went down the route of using custom queries (which you really shouldn't need to), you would still need to bring back objects as classes. Nothing changes; it just makes your life a lot harder.
In terms of XmlIgnore - have you tried JsonIgnore?
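Since Neo4jClient serializes objects with Json.NET, marking a property with [JsonIgnore] should keep it out of the stored node, much like XmlIgnore does for XML serialization. A minimal sketch (the class and property names are made up):

using Newtonsoft.Json;

public class Element
{
    public string Name { get; set; }

    // Json.NET skips this property entirely, so it never reaches the database.
    [JsonIgnore]
    public object RuntimeOnlyHandle { get; set; }
}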
Alternatively - look at the custom converter and get the non-compatible properties into your DB.
Here's my scenario:
I get a JSON list from an API call, and use a RestKit mapping to generate the associated objects, with Core Data backing.
I then use a property on one of these objects (= parent) to generate a new (and separate) API call to get a different JSON list from which I map different (= child) objects.
Each of these children needs a reference (e.g. ID) to the parent object, but this reference is not included in the JSON payload.
So my question is: how do I access these child objects post-mapping, to make sure the field referencing their parent is specified? I'm sure the answer is staring me in the face somewhere in the RestKit documentation, but that's an enormous haystack and I'm not sure where to start looking. Thanks!
If the identity of the parent is part of the URL used to obtain the children, then you can use RestKit's routing mechanism, together with path patterns, to obtain all of the required data and connect the objects automatically during mapping. This uses the RKRouter and RKRelationshipMapping classes.
Check the section on 'Routing' here for a good description and several examples.
Alternatively, every method that loads a URL and performs a mapping has a completion block which gives access to the mapped objects. This is usually an instance of RKMappingResult, from which you can get a dictionary, or array, of the mapped objects.
Once you get the objects you can modify them and then simply save the managed object context.
Our users need to be able to export data in CSV format, edit some of the records, and upload the data again. The data does not map to entities; you could say that the object graph is flattened to fit the Excel-based workflow.
Right now this happens in the controllers, because I thought these DTO classes were view models. It smells, but I don't have a clear idea of how to fix it. Is there a pattern I could/should follow?
Thanks!
Start by abstracting this logic into an interface containing the necessary methods. Implement this interface for the CSV format. Pass the interface into the constructor of the controller and use DI to inject the proper implementation; in the controller action, call the method on the interface.
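A minimal sketch of that shape, with made-up names (RecordDto, ICsvExporter) and the DI container wiring assumed to be configured elsewhere:

using System.Collections.Generic;
using System.Linq;
using System.Text;
using System.Web.Mvc;

// Illustrative DTO for the flattened rows.
public class RecordDto
{
    public int Id { get; set; }
    public string Name { get; set; }
}

// The abstraction the controller depends on.
public interface ICsvExporter
{
    string Export(IEnumerable<RecordDto> records);
}

// CSV-specific implementation; a real one would handle quoting and import too.
public class CsvExporter : ICsvExporter
{
    public string Export(IEnumerable<RecordDto> records)
    {
        var lines = new List<string> { "Id,Name" };
        lines.AddRange(records.Select(r => r.Id + "," + r.Name));
        return string.Join("\r\n", lines);
    }
}

public class ExportController : Controller
{
    private readonly ICsvExporter _exporter;

    // The DI container injects the CSV implementation here.
    public ExportController(ICsvExporter exporter)
    {
        _exporter = exporter;
    }

    public ActionResult Download()
    {
        var records = new List<RecordDto>(); // fetched from wherever the data lives
        var csv = _exporter.Export(records);
        return File(Encoding.UTF8.GetBytes(csv), "text/csv", "export.csv");
    }
}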
If you want to return CSV directly from your controller action, you could write a custom ActionResult like CsvActionResult which takes the model and serializes it into CSV, so that in your controller action you can return new CsvActionResult(someModel).
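A rough sketch of such a result, reusing the hypothetical RecordDto from the previous example; the quoting is only a minimal nod to CSV escaping:

using System.Collections.Generic;
using System.Web.Mvc;

public class CsvActionResult : ActionResult
{
    private readonly IEnumerable<RecordDto> _records;

    public CsvActionResult(IEnumerable<RecordDto> records)
    {
        _records = records;
    }

    public override void ExecuteResult(ControllerContext context)
    {
        var response = context.HttpContext.Response;
        response.ContentType = "text/csv";
        response.AddHeader("Content-Disposition", "attachment; filename=export.csv");

        response.Write("Id,Name\r\n");
        foreach (var r in _records)
        {
            response.Write(r.Id + "," + Escape(r.Name) + "\r\n");
        }
    }

    // Quote fields that contain commas or quotes.
    private static string Escape(string value)
    {
        if (value == null) return string.Empty;
        if (value.Contains(",") || value.Contains("\""))
            return "\"" + value.Replace("\"", "\"\"") + "\"";
        return value;
    }
}

The action then becomes a one-liner: return new CsvActionResult(someModel);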
I like having separate classes: one class that represents the entity, and a separate DAO (data access object).
Is this possible with Rails and ActiveRecord?
Most of what you would put into a DAO is already hidden inside ActiveRecord anyway, so there's not much need to split these up. But if you insist, you can split out whatever methods you want into a separate module and then include it in your model.