Transform data when parsing a JSON string using Dart

I'm using the parse() function provided in dart:json. Is there a way to transform the parsed data using parse()? I'm thinking of something similar to the reviver argument when parsing JSON using JavaScript:
JSON.parse(text[, reviver])

The parse() function in dart:json takes an optional callback that you can use to transform the parsed data. For example, you may prefer to express a date field as a DateTime object rather than as a list of numbers for the year, month, and day. Specify a 'reviver' function as the second argument to parse.
This function is called once for each object or list property parsed, and the return value of the reviver function is used instead of the parsed value:
import 'dart:json' as json;

void main() {
  var jsonPerson = '{"name" : "joe", "date" : [2013, 10, 3]}';
  var person = json.parse(jsonPerson, (key, value) {
    if (key == "date") {
      return new DateTime(value[0], value[1], value[2]);
    }
    return value;
  });
  person['name'];              // 'joe'
  person['date'] is DateTime;  // true
}
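In current Dart, the dart:json library has been removed, but the same pattern is available through dart:convert, whose jsonDecode function takes a named reviver parameter. A minimal sketch of the equivalent:

import 'dart:convert';

void main() {
  var jsonPerson = '{"name" : "joe", "date" : [2013, 10, 3]}';
  // The reviver runs once per parsed property; returning a DateTime here
  // replaces the [year, month, day] list in the resulting map.
  var person = jsonDecode(jsonPerson, reviver: (key, value) {
    if (key == "date") {
      var parts = value as List;
      return DateTime(parts[0], parts[1], parts[2]);
    }
    return value;
  });
  print(person['name']);             // joe
  print(person['date'] is DateTime); // true
}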

Related

Convert JSON to map with string keys and List<String> values

I'm attempting to convert JSON that has strings for keys and string arrays for values.
From my understanding, this should work:
import 'dart:convert';

void main() {
  var jsonString = '{"key": ["1", "2", "3"]}';
  var data = json.decode(jsonString) as Map;
  var result = data.cast<String, List<String>>();
  print(result);
}
However, I get the error: type 'List<dynamic>' is not a subtype of type 'List<String>' in type cast.
What's interesting, however, is that the following does work correctly:
import 'dart:convert';

void main() {
  var jsonString = '{"key": "value"}';
  var data = json.decode(jsonString) as Map;
  var result = data.cast<String, String>();
  print(result);
}
So, I assume that the .cast<> method introduced with Dart 2 doesn't know how to convert nested types that aren't simple types like String, int, or bool.
How would I convert this object to a Map<String, List<String>> without resorting to external libraries?
So, I assume that the .cast<> method introduced with Dart 2 doesn't know how to convert nested types that aren't simple types like String, int, or bool.
That's correct. It just does a one-level-deep shallow conversion. You can do the nested conversion yourself like:
import 'dart:convert';

void main() {
  var jsonString = '{"key": ["1", "2", "3"]}';
  var data = json.decode(jsonString) as Map;
  var result = data.map((key, value) =>
      MapEntry<String, List<String>>(key, List<String>.from(value)));
  print(result.runtimeType);
}
This is calling Map.map(). Sometimes Map.fromIterable() or Map.fromIterables() is a better fit. The collection types have a handful of methods like this to convert between different types.
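For comparison, here is a minimal sketch of the same conversion done with Map.fromIterables, assuming the same one-entry JSON as above:

import 'dart:convert';

void main() {
  var data = json.decode('{"key": ["1", "2", "3"]}') as Map;
  // Pair up the (string) keys with the values converted to List<String>.
  var result = Map<String, List<String>>.fromIterables(
      data.keys.cast<String>(),
      data.values.map((v) => List<String>.from(v as List)));
  print(result); // {key: [1, 2, 3]}
}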

Dart: How to json decode '0' as double

I have a service that sends me a list of values. After parsing with json.decode from dart:convert, I get a List<dynamic> like this:
(values) [ 0.23, 0.2, 0, 0.43 ]
When I try to assign one of the values to a double, I get an error:
double value = values[2]; // error
That's because if we check the .runtimeType of each element, we get the following:
[double, double, int, double]
So, how can I avoid '0' being interpreted as int?
I would like to avoid using a function like this:
double jsonToDouble(dynamic value) {
  if (value.runtimeType == int) {
    return (value as int).toDouble();
  }
  return value;
}
You can add some special logic to the JSON decoding by constructing your own codec with a reviver. This callback will be invoked for every property in a map or item in a List.
import 'dart:convert';

final myJsonCodec = new JsonCodec.withReviver((dynamic key, dynamic value) {
  if (value is int) return value.toDouble();
  return value;
});
However, you will always get a List<dynamic> from dart:convert, whether your number values are doubles or ints. Depending on which version of Dart you are on, you may be able to use List.cast to convert it to a List<double>.
List<double> values = oldValues.cast<double>();
If that isn't available, you can create a new List<double> and add your values to it so that the runtime type is correctly set.
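A minimal sketch of that fallback, assuming the decoded list from the question:

void main() {
  var oldValues = <dynamic>[0.23, 0.2, 0, 0.43]; // stands in for the decoded JSON list
  // Copy into a fresh List<double>, converting any ints along the way.
  var values = <double>[];
  for (var v in oldValues) {
    values.add((v as num).toDouble());
  }
  print(values); // [0.23, 0.2, 0.0, 0.43]
}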
Doubles also have a toDouble() method, so you can call it in both cases (double and int).
double jsonToDouble(dynamic value) {
  return value.toDouble();
}
This works only if the value is a double or an int, though. If the value is a string, an exception occurs.
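If other types might sneak into the list, a guarded variant (just a sketch; the throw is one possible choice) makes the failure mode explicit instead of hitting a missing toDouble() on a string:

double jsonToDouble(dynamic value) {
  // Only numbers (int or double) have toDouble(); reject anything else explicitly.
  if (value is num) return value.toDouble();
  throw FormatException('Expected a number, got: $value');
}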

Attempting to map dates to index in ElasticSearch

I am using ElasticSearch and attempting to create an index with a class that has a dynamic type property in it. This property may have strings or dates in it. When indexing, I have been using the code below:
dynamic instance = MyObject.GetDynamicJson();
var indexResponse = client.Index((object) instance, i => i
    .Index("myIndex")
    .Type("MyObject")
);
Here's the code for GetDynamicJson().
MyObject has only Name and Value as properties. (apologies, I've had issues in the past with Elastic choking on json without all the quotes, which I have escaped with \ characters):
string json = "{ \"Name\":\"" + Name + "\",\"DateValue\":\"";
try
{
    var date = DateTime.Parse(Value);
    json += date.ToString("yyyy/MM/dd HH:mm:ss Z") + "\", \"Value\":\"\"}";
}
catch
{
    // If the DateTime parse fails, DateValue stays empty and the text goes into Value.
    json += "\",\"Value\":\"" + Value + "\"}";
}
return json;
For some reason Elasticsearch doesn't seem to like the string in DateValue, and it appears to drop that property entirely; the document echoed back in the error doesn't contain DateValue at all. This is the document and the error I get:
{"name":"CreatedDate","value":"2017-11-07T13:37:11.4340238-06:00"}
[indices:data/write/bulk[s][p]]"}],"type":"class_cast_exception","reason":"org.elasticsearch.index.mapper.TextFieldMapper cannot be cast to org.elasticsearch.index.mapper.DateFieldMapper"},"status":500
Later note: I have changed the index creator method to update the mapping. I added a third field to the Object, so now it has properties: Name, Value, DateValue:
public static void CreateRecordsIndex(ElasticClient client)
{
    client.CreateIndex("myIndex", i => i
        .Settings(s => s
            .NumberOfShards(2)
            .NumberOfReplicas(0)
        )
        .Mappings(x => x
            .Map<MyObject>(m => m.AutoMap())));
}
Now, it is successfully mapping and creating a property each time, but it still seems to drop the property I am sending it in the json. It just sets them all to the default datetime: "dateValue": "0001-01-01T00:00:00". This is strange, because when making the dynamic instance I send to Elastic, I use only the MyObject.GetDynamicJson() method to build it. I no longer get the mapping error, but Elastic still seems oblivious to "dateValue":"some date here" in the object when it is set.
OK, I got rid of the dynamic object type (ultimately I wasn't actually getting data from the json method, I had a typo and Elastic was getting the original object directly - it's a wonder it was still handling it). So I let Elastic do the parse using its mapping. In order to do that, I first updated MyObject to include multiple properties, one for each type the incoming property could be (I am only handling text and dates in this case). For the DateValue property of MyObject, I have this:
public DateTime DateValue
{
    get
    {
        try
        {
            // If Value parses as a date, expose it as a DateTime.
            return DateTime.Parse(Value);
        }
        catch
        {
            // Otherwise fall back to the default date.
            return new DateTime();
        }
    }
}
Now, if Value is a date, my DateValue field will be set. Otherwise it'll have the default date (a very early date, "0001-01-01T00:00:00"). This way, I can later search for text against that dynamic field, or, if a date is set, run date and date-range queries against it (technically they end up in two different fields, but they come from the same ingested data).
Key to this is having the index mapping setup, as you can see in this method from the question:
public static void CreateRecordsIndex(ElasticClient client)
{
    client.CreateIndex("myIndex", i => i
        .Settings(s => s
            .NumberOfShards(2)
            .NumberOfReplicas(0)
        )
        .Mappings(x => x
            .Map<MyObject>(m => m.AutoMap())));
}
In order to recreate the index with the updated MyObject, I did this:
if (client.IndexExists("myIndex").Exists)
{
    client.DeleteIndex("myIndex");
    CreateRecordsIndex(client); // Goes to the method above
}

How to pass multiple data objects to HTML templates in Golang

I am returning all rows of a table as JSON to the variable pdata and unmarshalling it into an interface object.
I have an instance of the User struct which I would like to pass, along with the unmarshalled JSON data, to the render function and access it using field arguments like {{.fieldname}} in the HTML template.
if uuid != "" {
pdata, err := getProduct()
if err != nil {
fmt.Println(err)
}
type Prdata struct {
Puid string `json:"puid"`
Pname string `json:"pname"`
Quantity string `json:"quantity"`
Price string `json:"price"`
Image string `json:"image"`
}
// Obj:= make(map[Prdata]string)
var Pr Prdata
err = json.Unmarshal(pdata , &Pr)
if err != nil {
fmt.Println(err)
}
fmt.Println(string(pdata))
fmt.Println(Pr)
fmt.Println(u)
render(w, "internal", Pr)
}
fmt.Println(string(pdata)) gives this output
[{"image":"1Appleiphone7.jpeg","pname":"iphone7","price":"70000","puid":"d6742e4e-2ad6-43c5-97f4-e8a7b00684e2","quantity":"100"}]
I have only been successful in unmarshalling the data into an interface{} object. I tried making a map with keys of type interface{} and values of type string, but it throws the error:
"json: cannot unmarshal array into Go value of type map[interface {}]string"
The render function takes an argument of the type interface{}
func render(w http.ResponseWriter, name string, data interface{})
fmt.Println(Pr) gives this output:
[map[quantity:100 image:1Appleiphone7.jpeg pname:iphone7 price:70000 puid:d6742e4e-2ad6-43c5-97f4-e8a7b00684e2]]
u is an instance of struct User
var u = &User{}

type User struct {
	Uuid     string
	Username string
	Password string
	Fname    string
	Email    string
}
I can see the output on the HTML page using the pipeline {{.}}. However, I am not able to access any data using the field names. There must be a way of doing this, but I am not sure how.
When I use JSON of the form below, I am able to unmarshal it into the struct type and reference it by its keys using pipelines in my template.
str := `{
	"image": "1Appleiphone7.jpeg",
	"pname": "iphone7",
	"price": "70000",
	"puid": "d6742e4e-2ad6-43c5-97f4-e8a7b00684e2",
	"quantity": "100"
}`
The unmarshal call:
err = json.Unmarshal([]byte(str), &Pr)
The difference between the JSON in the DB record pdata and str above is in the use of backticks ("`"). It also seems that, although the JSON data shows key-value pairs, pdata is actually a JSON array and not a JSON object. Is there a way to get around this?
You don't need a map[interface{}]string to unmarshal the JSON object. Your JSON is equivalent to a slice of maps:
[
  {
    "image": "1Appleiphone7.jpeg",
    "pname": "iphone7",
    "price": "70000",
    "puid": "d6742e4e-2ad6-43c5-97f4-e8a7b00684e2",
    "quantity": "100"
  }
]
The object to unmarshal should be a slice of maps with string keys and values:
var Pr []map[string]string
Playground
BTW, I believe a misunderstanding is hidden here:
The render function takes an argument of the type interface{}
That does not mean you must pass a variable of interface{} type; it means you CAN pass a variable of any type to the render function.
I am posting a working example of unmarshalling JSON bytes into a struct type, which can then be referenced using {{.}} in the template.
package main

import (
	"encoding/json"
	"fmt"
)

type Usrdata struct {
	Uuid  string
	Fname string
}

type Prdata struct {
	Puid     string `json:"puid"`
	Pname    string `json:"pname"`
	Quantity string `json:"quantity"`
	Price    string `json:"price"`
	Image    string `json:"image"`
}

type Data struct {
	U Usrdata
	P []Prdata
}

func main() {
	Ur := Usrdata{Uuid: "xyz", Fname: "Somename"}
	Pr := make([]Prdata, 0)
	var Dt Data
	Dt.U = Ur
	pdata := `[{"image":"1Appleiphone7.jpeg","pname":"iphone7","price":"70000","puid":"d6742e4e-2ad6-43c5-97f4-e8a7b00684e2","quantity":"100"}]`
	err := json.Unmarshal([]byte(pdata), &Pr)
	if err != nil {
		fmt.Println(err)
	}
	Dt.P = Pr
	fmt.Println(Pr[0].Pname)
	fmt.Println(Pr)
	fmt.Println(Dt)
}

Access Rhino's native JSON.Stringify from Java

Is there a cleaner way to get the JSON representation of a Javascript object than with the following kludge?
System.out.println(((ScriptableObject) scope).callMethod(
    cx, (Scriptable) scope.get("JSON", scope),
    "stringify", new Object[]{jsObject}));
Where jsObject is the ScriptableObject I want to stringify.
Note that Hannes has now addressed this in Rhino. So the usage simplifies to this:
import org.mozilla.javascript.NativeJSON;
// ...
Object json = NativeJSON.stringify(cx, scope, jsObject, null, null);
The org.mozilla.javascript.NativeJSON class should be public in the Rhino 1.7R4 release.
I was able to get this working within an Apache Ant target using the NativeJSON class.
importPackage(org.mozilla.javascript);

var context = Context.enter();
var json = '{}';

// The call to parse requires a reviver function that should return the
// state of a key/value pair.
var reviver = function(key, value) { return value; };
var scope = context.initStandardObjects();
var object = NativeJSON.parse(context, scope, json, reviver);

// The call to stringify does not require the replacer or space parameters.
// The replacer is a function that takes a key/value pair and returns the
// new value, or an array of keys from the input JSON to stringify. The space
// parameter is the indentation characters or length.
json = NativeJSON.stringify(context, scope, object, null, 4);
http://mozilla.github.io/rhino/javadoc/org/mozilla/javascript/NativeJSON.html
https://github.com/mozilla/rhino/blob/master/src/org/mozilla/javascript/NativeJSON.java
