Serializing JSON with configured serializer - asp.net-mvc

I am using ASP.NET Web API with Json.NET for serialization. I had to configure the serializer to handle ISO dates properly, like this:
var iso = new IsoDateTimeConverter {
    DateTimeFormat = "yyyy'-'MM'-'dd'T'HH':'mm':'ss.fffK"
};
GlobalConfiguration.Configuration.Formatters.JsonFormatter
    .SerializerSettings.Converters.Add(iso);
This works fine when I am passing my objects down via the WebAPI. My problem, however, is that I have another place where I want to explicitly call the serialization:
@Html.Raw(JsonConvert.SerializeObject(Model))
In this case, it doesn't use the configuration I set up. I am aware that I can pass the iso converter into the SerializeObject call, but I'd prefer to avoid that and get hold of an already configured serializer, for obvious reasons.
Any ideas?

If you're going to do JSON serialization yourself, you have to pass the settings you want explicitly. There's no way around it. The best I can think of, if you want to reuse the same serializer settings, is to do something like this:
JsonConvert.SerializeObject(Model,
    GlobalConfiguration.Configuration.Formatters.JsonFormatter.SerializerSettings)
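If you end up doing this in several views, one option is to wrap the lookup in a small helper, roughly like the sketch below (JsonHelper and SerializeWithApiSettings are made-up names, not part of Web API):

using System.Web.Http;
using Newtonsoft.Json;

public static class JsonHelper
{
    // Serializes with the same settings (including the IsoDateTimeConverter
    // registered at application start) that the Web API JsonFormatter uses.
    public static string SerializeWithApiSettings(object value)
    {
        var settings = GlobalConfiguration.Configuration
            .Formatters.JsonFormatter.SerializerSettings;

        return JsonConvert.SerializeObject(value, settings);
    }
}

In the view you would then write @Html.Raw(JsonHelper.SerializeWithApiSettings(Model)) instead of calling JsonConvert directly.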

Related

How to manually get/set object model from AngularDart component?

Angular manages our objects somewhere in the Scope, but I need to manually set and get the object that controls a component so I can automatically set all the component's info once I download its model from the web through JSON.
Example
@Component(selector: 'my-component')
class MyComponent {
  // component fields
}
As the user interacts with <my-component>, it changes its fields. I want to be able to get the whole MyComponent object with its fields so I can save them, like this:
MyComponent comp = magicComponentGetterFunc('my-component');
String json = encodeJson(comp); // I do this with redstone_mapper
I think ngProbe can do that.
ElementProbe ep = ngProbe('my-component');
You can find examples by visiting the links at https://pub.dartlang.org/packages/angular where ngProbe or ElementProbe is mentioned.
The CHANGELOG.md mentions that ngProbe is necessary for animation, but the doc comment on the ElementProbe class still states that it is intended for testing and debugging (see https://github.com/angular/angular.dart/blob/96c0bcc7b6b0c3501b2c4799f425de8cd2e4dc0c/lib/core_dom/view_factory.dart#L259).
To get the JSON, you either have to implement a getter or method (for example toJson) that returns the JSON you want, or use one of the available serialization solutions (see "State of Serialization and Deserialization of JSON in Dart").
Wanting to do this usually indicates that your approach works against the way Angular is meant to be used.

How to encode javascript string for display and post back?

I have an MVC application that is rendering the following JavaScript on the client:
var rawData = [{"ID":5317,"Code":"12345","Description":"sometext \u003c/= 100"}];
The JSON data is a result of serializing an object using the JavaScriptSerializer and then running the result through the Html.Raw() helper.
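For context, the view code producing this looks roughly like the following sketch (the Items property name is an assumption):

@{
    var serializer = new System.Web.Script.Serialization.JavaScriptSerializer();
}
<script type="text/javascript">
    // JavaScriptSerializer emits "<" as \u003c, which is what produces the
    // escaped Description shown above; Html.Raw keeps Razor from HTML-encoding
    // the JSON itself.
    var rawData = @Html.Raw(serializer.Serialize(Model.Items));
</script>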
This data is then used to load a knockout view model and display a popup on hover. In the popup, only the "sometext" portion of the "Description" property is shown, because the string is converted to its unencoded form when the rawData variable is set (i.e. \u003c becomes <).
Also, this data ends up being sent back to the server when the data is saved, and ASP.NET request validation kicks in and fails the request because it detects the unencoded "<" as potentially dangerous content.
I've worked around this, temporarily, by adding a computed property to my Knockout View Model like so:
self.DescriptionEncoded = ko.observable('');
self.Description = ko.computed({
    read: function () {
        return self.DescriptionEncoded();
    },
    write: function (value) {
        self.DescriptionEncoded($('<div/>').text(value).html());
    }
});
In this way I can access the escaped property from my popup and the unescaped value is not sent back to the server when I serialize my viewmodel (using .toJSON()).
Is there a more global way to handle this, rather than creating computed properties for every object that may contain text that looks like a bad request, without compromising security? I've considered an overload/helper for the serialization routine that would accept a list of properties to apply a find/replace to, but I'm starting to think this will have to be handled case by case, in a manner similar to what I've already done. As for sending the data back to the server, I could override the toJSON() method on my view model and delete the properties that don't need to be sent back, but that won't help me with my popup.
Thoughts?
You can encode using Ajax.JavaScriptStringEncode. You might also get the AntiXSS library and use it for the encoding.
I hope I understood your question well.
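For illustration, a minimal sketch of using it in a Razor view; the Description property is taken from the question's rawData, the rest is assumed:

<script type="text/javascript">
    // JavaScriptStringEncode escapes quotes, backslashes and other characters
    // that would otherwise break out of a JavaScript string literal.
    var description = '@Ajax.JavaScriptStringEncode(Model.Description)';
</script>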

Jira issues in JSON format (need to know the classes and functions called by the REST API "api/2.0.alpha1/issue/{issueKey}")

I am trying to use the REST API api/2.0.alpha1/issue/{issueKey} .
Reference: http://docs.atlassian.com/jira/REST/4.4.3/#id2413591
I would get all issue IDs from rest/api/2.0.alpha1/search.
Using these issue IDs, I would get all issues in JSON format.
But since I am working on localhost (a local machine), I do not want to make network calls and increase network traffic. Hence I want to know which Java classes these URIs call, so that I can invoke those classes directly to get the issues in JSON format.
Basically I want all the issues in JSON format without network calls.
OR
Alternatively, I already have all the issues retrieved in an issues object, but not in JSON format. How can I convert that into JSON?
I have found the following code from JIRA:
@GET
@Path("/{issueKey}")
public Response getIssue(@PathParam("issueKey") final String issueKey)
{
    final Issue issue = getIssueObject(issueKey);
    final IssueBean bean = createIssue(issue);
    return Response.ok(bean).cacheControl(never()).build();
}
You could search the source code for the @GET references or use the REST API Browser (https://developer.atlassian.com/display/RAB/Overview+of+the+Atlassian+REST+API+Browser), but accessing the classes from Java probably means that you need to be running in the same class loader as JIRA, or writing a plugin.
Have you measured the overhead of the calls to make sure that you are not optimizing prematurely?

Grails JSON Converters and transient properties

Using Grails 1.3.7, I found that the JSON converter ignores transient properties of domain objects.
Question: is there an elegant way to work around this obstacle?
Bonus question: what's the reasoning behind excluding calculated fields (transient properties) from the response?
What works for me is this one line:
def jsonobj = domobj.properties as JSON
One way would be to manually create your JSON response, e.g.
["prop1" : obj.prop1, "prop2" : obj.prop2, ...] as JSON
Transient is meant for exactly that: variables may be marked transient to indicate that they are not part of the persistent state of an object.
And JSON is a serialized (= persistent) state of an object.
So, if you need it to be serialized, you have to create a new class, just for JSON serialization, that has all the fields you need to serialize.
If you need fine-grained control over which fields are included in or excluded from the JSON, I find using the JSONBuilder a better option than the converter. Here's an example of how to do this.
You could use the "marshallers" plug-in and define you transient property as virtual like this:
static marshalling = {
virtual {
yourPropery { value, json -> json.value(value.yourPropery) }
}
}

How to move from untyped DataSets to POCO/LINQ2SQL in a legacy application

Good day!
I have a legacy application whose data access layer consists of classes in which queries are executed using SqlConnection/SqlCommand and results are passed to upper layers wrapped in untyped DataSets/DataTables.
Now I'm working on integrating this application into a newer one, written in ASP.NET MVC 2, where LINQ2SQL is used for data access. I don't want to rewrite the fancy logic that generates the complex queries passed to SqlConnection/SqlCommand in LINQ2SQL (and don't have permission to do so), but I'd like the results of these queries to be strongly typed object collections instead of untyped DataSets/DataTables.
The basic idea is to wrap the old data access code in something that looks like a proper "Model" from the ASP.NET MVC point of view.
What is the fast/easy way of doing this?
In addition to the answer below, here is a nice solution based on AutoMapper: http://elegantcode.com/2009/10/16/mapping-from-idatareaderidatarecord-with-automapper/
An approach you could take is to use a DataReader and data transfer objects. For every object you want to work with, define the class in a data transfer object folder (or however your project is structured), then in your data access layer have something along the lines of the code below.
We used something very similar to this in a project with a highly normalized database; the code did not need that normalization, so we used stored procedures to put the data into more usable objects. If you need to be able to save these objects as well, you will need to handle translating the objects back into database commands.
As for "What is the fast/easy way of doing this?": depending on the number of classes this may not be the fastest approach, but it lets you use the objects very much like LINQ objects, and depending on the collection types used (IList, IEnumerable, etc.) you will be able to use the extension methods on those collections.
public IList<NewClass> LoadNewClasses(string abc)
{
    List<NewClass> newClasses = new List<NewClass>();
    using (DbCommand command = /* Get the command */)
    {
        // Add parameters
        command.Parameters["@Abc"].Value = abc;

        // Could also put the DataReader in a using block
        IDataReader reader = /* Get Data Reader */;
        while (reader.Read())
        {
            NewClass newClass = new NewClass();
            newClass.Id = (byte)reader["Id"];
            newClass.Name = (string)reader["Name"];
            newClasses.Add(newClass);
        }
        reader.Close();
    }
    return newClasses;
}
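A hypothetical call site (the dataAccess variable is made up) just to show that the returned list behaves like any other strongly typed collection:

// The DTOs work with LINQ-to-Objects much like LINQ2SQL results would.
IList<NewClass> newClasses = dataAccess.LoadNewClasses("abc");
var names = newClasses
    .Where(c => c.Id > 0)
    .Select(c => c.Name)
    .ToList();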
