Need a Dart mirror instantiate() function

To add some sanity to my life, I'm looking for an instantiate() function as syntactic sugar over Dart's mirrors library: instantiate( class|type|instance, argArray )
class Klass {
  int i1;
  Klass( int i1 ) {
    this.i1 = (i1 is int) ? i1 : 0;
  }
}
Type ktype = Klass;
Klass kinstance = new Klass( 5 );
Klass test1 = instantiate( Klass, [5] );
Klass test2 = instantiate( ktype, [5] );
Klass test3 = instantiate( kinstance, [5] );
Currently 90% of my interaction with mirrors would be covered by this one function. Right now I'm blindly cutting and pasting out of sheer stupidity; surely someone smarter than me has done this already!
Here is instantiate( type, [constructor, positional, named] ) for all occasions:
The constructor, positional and named arguments are all optional.
type can be a Type, an instance of the type, or a string representation of the type.
constructor: e.g. for new Map.from(...), 'from' is the constructor name, given either as 'from' or #from.
positional: positional arguments in a List.
named: named arguments in a Map; keys may be 'key' or #key.
dynamic instantiate( dynamic v_type, [dynamic v_constructorName, List v_positional, Map v_named] ) {
  // Resolve the Type from a Type, a type-name String, or an instance.
  Type type =
      (v_type is Type) ? v_type
      : (v_type is String) ? str2Type( v_type ) // str2Type: helper (not shown) that resolves a type name to a Type
      : reflect(v_type).type.reflectedType;
  // Whichever optional argument is a Map is treated as the named arguments.
  Map v_named2 =
      (v_named is Map) ? v_named
      : (v_positional is Map) ? v_positional
      : (v_constructorName is Map) ? v_constructorName
      : {};
  Map named = {};
  v_named2.keys.forEach( (k) => named[(k is Symbol) ? k : new Symbol(k)] = v_named2[k] );
  // Whichever optional argument is a List is treated as the positional arguments.
  List positional =
      (v_positional is List) ? v_positional
      : (v_constructorName is List) ? v_constructorName : [];
  // The constructor name may be a Symbol or a String; default is the unnamed constructor.
  Symbol constructorName =
      (v_constructorName is Symbol) ? v_constructorName
      : (v_constructorName is String) ? new Symbol(v_constructorName)
      : const Symbol('');
  return reflectClass(type).newInstance(constructorName, positional, named).reflectee;
}
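As a usage sketch (my own, not part of the original answer), these calls show the flexible argument handling; the string form assumes the str2Type helper is available, and Duration is used only because its unnamed constructor takes named arguments:
// Equivalent ways to build new Klass(5), reusing the definitions above:
Klass a = instantiate( Klass, [5] );        // from a Type
Klass b = instantiate( kinstance, [5] );    // from an existing instance
// Klass c = instantiate( 'Klass', [5] );   // from a String, if str2Type can resolve the name
// Named arguments only: the Map is detected and used as the named-argument map.
Duration d = instantiate( Duration, {#seconds: 5} );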

import 'dart:mirrors';

void main() {
  Type ktype = Klass;
  Klass kinstance = new Klass( 5 );
  // Constructor name
  var ctor = const Symbol("");
  Klass test1 = instantiate(Klass, ctor, [1]);
  Klass test2 = instantiate(ktype, ctor, [2]);
  Klass test3 = instantiate(reflect(kinstance).type.reflectedType, ctor, [3]);
  Klass test4 = instantiate(Klass, #fromString, ["4"]);
  print(test1.i1);
  print(test2.i1);
  print(test3.i1);
  print(test4.i1);
}

dynamic instantiate(Type type, Symbol constructorName, List positional, [Map named]) {
  return reflectClass(type).newInstance(constructorName, positional, named).reflectee;
}

class Klass {
  int i1;
  Klass( int i1 ) {
    this.i1 = (i1 is int) ? i1 : 0;
  }
  Klass.fromString(String i) {
    i1 = int.parse(i, onError : (s) => i1 = 0);
  }
}
Output:
1
2
3
4

Is there a way to multiply a nullable with a compact operator, something like "?*"

Is there a way to get a multiplication with a nullable using a compact syntax such as:
int? i;
final j = i ?* 2 ?? null;
Rather than:
final j = i == null ? null : i! * 2;
No.
There is no null-aware syntax which extends to operators (other than [] and []=).
You can introduce an extension method doing multiplication, like:
extension IntOps on int {
  int imul(int other) => this * other;
  int iadd(int other) => this + other;
  int isub(int other) => this - other;
  // etc.
}
and then you can do:
int? i;
final j = i?.imul(2);
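For completeness, a minimal runnable sketch of that extension approach (the main function and the int.tryParse values are mine, added only for illustration):
extension IntOps on int {
  int imul(int other) => this * other;
}

void main() {
  int? a = int.tryParse('not a number'); // null
  int? b = int.tryParse('21');           // 21
  print(a?.imul(2)); // null: ?. short-circuits before imul is called
  print(b?.imul(2)); // 42
}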

Why does my tree creation fail without the use of inline?

I'm trying to create a trie structure in Zig using Zig's StringHashMap.
I am able to get it to work a bit, but only by using an "inline" for loop, which is not really usable since it requires the paths to be known at compile time :-(
Any help/explanation would be much appreciated :-)
The code:
const std = @import("std");
const Allocator = std.mem.Allocator;
const print = std.debug.print;
const expect = std.testing.expect;

const HashMap = struct {
    value: u8,
    children: std.StringHashMap(*HashMap),
};

fn newHashMap(allocator: Allocator, value: u8) HashMap {
    return HashMap{
        .value = value,
        .children = std.StringHashMap(*HashMap).init(allocator),
    };
}

fn showTree(root: *std.StringHashMap(*HashMap), keys: [3][]const u8) void {
    var hashMap = root;
    for (keys) |key| {
        print("get key {s}\n", .{key});
        var value = hashMap.get(key);
        if (value) |node| {
            print("we got a value for {s}:{}\n", .{ key, node.value });
            hashMap = &node.children;
        } else {
            print("no value for {s}\n", .{key});
            break;
        }
    }
}

test "HashMap" {
    var gpa = std.heap.GeneralPurposeAllocator(.{}){};
    const gpaAllocator = gpa.allocator();
    var arena = std.heap.ArenaAllocator.init(gpaAllocator);
    defer {
        arena.deinit();
        const leaked = gpa.deinit();
        if (leaked) expect(false) catch @panic("TEST FAIL"); // fail test; can't try in defer as defer is executed after we return
    }
    const allocator = arena.allocator();
    var root = &std.StringHashMap(*HashMap).init(allocator);
    var hashMap = root;
    const keys = [_][]const u8{ "a", "b", "c" };
    const values: [3]u8 = .{ 1, 2, 3 };
    // create tree
    inline for (keys) |key, i| {
        print("put key {s}:{}\n", .{ key, values[i] });
        var newNode = newHashMap(allocator, values[i]);
        try hashMap.put(key, &newNode);
        showTree(root, keys);
        hashMap = &newNode.children;
    }
    showTree(root, keys);
}
This prints:
Test [1/1] test "HashMap"...
put key a:1
put key b:2
put key c:3
get key a
we got a value for a:1
get key b
we got a value for b:2
get key c
we got a value for c:3
All 1 tests passed.
as expected.
Removing the 'inline' results in:
Test [1/1] test "HashMap"...
put key a:1
put key b:2
put key c:3
get key a
we got a value for a:3
get key b
no value for b
All 1 tests passed.
The answer turned out to be quite obvious (with hindsight ;-)), as mentioned in [1]:
"var declarations inside functions are stored in the function's stack frame. Once a function returns, any pointers to variables in the function's stack frame become invalid references, and dereferencing them becomes unchecked Undefined Behavior."
This explains the strange behaviour of the loop without inline: the same stack slot is reused on every iteration, so the stored pointers get overwritten, resulting in Undefined Behaviour.
Adding 'inline' unrolls the loop, so each iteration gets its own variable and there is no pointer reuse, hence the correct output.
The correct way of dealing with this is to allocate the struct explicitly on the heap and pass around the pointer to it, as shown in [2].
Once that is sorted it all makes sense.
[1] https://ziglang.org/documentation/master/#Where-are-the-bytes
[2] https://www.reddit.com/r/Zig/comments/s6v8t3/idiomatic_zig_for_initializing_an_allocated/
For reference, the working code without 'inline' is below:
const std = @import("std");
const Allocator = std.mem.Allocator;
const print = std.debug.print;
const expect = std.testing.expect;

const HashMap = struct {
    value: u8,
    children: std.StringHashMap(*HashMap),
};

fn newHashMap(allocator: Allocator, value: u8) !*HashMap {
    // Allocate the node on the heap so its address stays valid after this function returns.
    const node = try allocator.create(HashMap);
    node.* = .{
        .value = value,
        .children = std.StringHashMap(*HashMap).init(allocator),
    };
    return node;
}

fn showTree(root: *std.StringHashMap(*HashMap), keys: [3][]const u8) void {
    var hashMap = root;
    for (keys) |key| {
        print("get key {s}\n", .{key});
        var value = hashMap.get(key);
        if (value) |node| {
            print("we got a value for {s}:{}\n", .{ key, node.value });
            hashMap = &node.children;
        } else {
            print("no value for {s}\n", .{key});
            break;
        }
    }
}

test "HashMap" {
    var gpa = std.heap.GeneralPurposeAllocator(.{}){};
    const gpaAllocator = gpa.allocator();
    var arena = std.heap.ArenaAllocator.init(gpaAllocator);
    defer {
        arena.deinit();
        const leaked = gpa.deinit();
        if (leaked) expect(false) catch @panic("TEST FAIL"); // fail test; can't try in defer as defer is executed after we return
    }
    const allocator = arena.allocator();
    var root = &std.StringHashMap(*HashMap).init(allocator);
    var hashMap = root;
    const keys = [_][]const u8{ "a", "b", "c" };
    const values: [3]u8 = .{ 1, 2, 3 };
    // create tree
    for (keys) |key, i| {
        print("put key {s}:{}\n", .{ key, values[i] });
        var newNode = try newHashMap(allocator, values[i]);
        try hashMap.put(key, newNode);
        hashMap = &newNode.children;
    }
    showTree(root, keys);
}

The return type 'int?' isn't a 'int', as required by the closure's context

The following:
var sortedByValue = SplayTreeMap<int, String>.from(
    fruit, (key1, key2) => fruit[key1].compareTo(fruit[key2]));
complains about null safety, so I add "?" to make it fruit[key1]?, ok, then I.... ahhh?
import 'dart:collection';

void splayTreeMapExample() {
  var fruit = SplayTreeMap<int, String>();
  fruit[0] = 'Banana';
  fruit[5] = 'Plum';
  fruit[6] = 'Strawberry';
  fruit[2] = 'Orange';
  fruit[3] = 'Mango';
  fruit[4] = 'Blueberry';
  fruit[1] = 'Apple';
  print(fruit);
  fruit.forEach((key, val) {
    print('{ key: $key, value: $val}');
  });
  var sortedByValue = SplayTreeMap<int, String>.from(
      fruit, (key1, key2) => fruit[key1]?.compareTo(fruit[key2]));
  print(sortedByValue);
}
You've got to love null safety (null safety principles: non-nullable by default, hmm, right; well, it is the map lookup fruit[...] that returns a nullable String?, so assert it with !):
fruit, (key1, key2) => fruit[key1]!.compareTo(fruit[key2]!));
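If you would rather avoid the bang operator, a hedged alternative sketch (mine, not the answer's) compares with empty-string fallbacks; it behaves the same as long as every key the comparator receives is present in fruit:
var sortedByValue = SplayTreeMap<int, String>.from(
    fruit, (key1, key2) => (fruit[key1] ?? '').compareTo(fruit[key2] ?? ''));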

How can I remove all elements from a DXL skip list

I want to clear all elements within a Skip list, like this:
Module mod = current()
Skip skip = create()
put(skip, 1, "test")
put(skip, 2, mod)
clearSkip(skip) // Removes all elements
An example script for deleting Skips of custom types (here, the type OutLinkInfo):
struct OutLinkInfo {}
OutLinkInfo createOutLinkInfo_() { DxlObject d = new(); OutLinkInfo x = (addr_ d) OutLinkInfo; return(x) }
DxlObject DxlObjectOf(OutLinkInfo x) { return((addr_ x) DxlObject) }
void deleteOutLinkInfo(OutLinkInfo &x) { DxlObject d = DxlObjectOf(x); delete(d); x = null; return() }
Skip deleteOutLinkInfo(Skip sk)
{
    OutLinkInfo x = null OutLinkInfo
    for x in sk do { deleteOutLinkInfo(x) }
    delete(sk); sk = null
    return(sk)
}
You can use the setempty(Skip) function, although this specific overload is undocumented as far as I know.

How to call a library function by its name and set its parameters

I have library functions defined like this in my C code:
static const struct luaL_reg SelSurfaceLib [] = {
    {"CapabilityConst", CapabilityConst},
    {"create", createsurface},
    {NULL, NULL}
};

static const struct luaL_reg SelSurfaceM [] = {
    {"Release", SurfaceRelease},
    {"GetPosition", SurfaceGetPosition},
    {"clone", SurfaceClone},
    {"restore", SurfaceRestore},
    {NULL, NULL}
};

void _include_SelSurface( lua_State *L ){
    luaL_newmetatable(L, "SelSurface");
    lua_pushstring(L, "__index");
    lua_pushvalue(L, -2);
    lua_settable(L, -3); /* metatable.__index = metatable */
    luaL_register(L, NULL, SelSurfaceM);
    luaL_register(L, "SelSurface", SelSurfaceLib);
}
And I can use them from Lua like this:
local sub = SelSurface.create()
local x,y = sub:GetPosition()
...
Now, my difficult issue: I'm using the following code
function HLSubSurface(parent_surface, x,y,sx,sy )
    local self = {}
    -- fields
    local srf = parent_surface:SubSurface( x,y, sx,sy )
    -- methods
    local meta = {
        __index = function (t,k)
            local tbl = getmetatable(srf)
            return tbl[k]
        end
    }
    setmetatable( self, meta )
    return self
end
and my main code is:
sub = HLSubSurface( parent, 0,0, 160,320 )
x,y = sub.GetPosition()
but it fails with:
./HDB/80_LeftBar.lua:19: bad argument #1 to 'SetFont' (SelSurface expected, got userdata)
It's because I need to provide srf as the first argument to the GetPosition() function ... but I honestly don't know how to do that :(
I don't want to pass it explicitly when calling GetPosition(); I want to keep writing
x,y = sub.GetPosition()
and have it supplied transparently inside meta's __index function.
In other words, I would like the HLSubSurface object to inherit methods from SubSurface.
Any idea?
Thanks.
Laurent
function HLSubSurface(parent_surface, x, y, sx, sy)
    local srf = parent_surface:SubSurface(x, y, sx, sy)
    local self = {
        -- fields
        ....
    }
    setmetatable(self, {__index =
        function (obj, key)
            local parent_field
            local parent_fields = getmetatable(srf).__index
            if type(parent_fields) == "function" then
                -- an __index function receives (object, key)
                parent_field = parent_fields(srf, key)
            elseif parent_fields then
                parent_field = parent_fields[key]
            end
            if type(parent_field) == "function" then
                -- wrap methods so srf is passed as 'self' when called through this object
                return function(o, ...)
                    if o == obj then
                        return parent_field(srf, ...)
                    else
                        return parent_field(o, ...)
                    end
                end
            else
                return parent_field
            end
        end
    })
    return self
end
And your main code would be:
sub = HLSubSurface( parent, 0,0, 160,320 )
x,y = sub:GetPosition()
