I am developing a server plugin.
In the documentation I read this:
ensure that it can produce an (Iterable of) Node, Relationship or Path, any Java primitive or String or an instance of a org.neo4j.server.rest.repr.Representation
What does it mean?
Does that mean I always have to return something like public Iterable<SOMETHING_HERE> function() {}, or can I return anything, like public int function() {}?
The valid return types of server plugins are:
* Node, Relationship, Path, Java primitive, String, or org.neo4j.server.rest.repr.Representation
* or an iterable of the above
Since int is a Java primitive, your plugin method can safely return an int. See https://github.com/jimwebber/neo4j-tutorial/blob/master/src/koan/java/org/neo4j/tutorial/koan13/AwesomenessServerPlugin.java for an example.
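For illustration, here is a minimal sketch of a plugin method that returns a plain int, in the style of the linked koan. The class and method names are hypothetical, and the sketch assumes the org.neo4j.server.plugins API used by managed server plugins:

import org.neo4j.graphdb.GraphDatabaseService;
import org.neo4j.server.plugins.Description;
import org.neo4j.server.plugins.Name;
import org.neo4j.server.plugins.PluginTarget;
import org.neo4j.server.plugins.ServerPlugin;
import org.neo4j.server.plugins.Source;

// Sketch only: the method returns a Java primitive (int), which is a valid
// return type - no Iterable wrapper is required.
public class ExamplePlugin extends ServerPlugin
{
    @Name("the_answer")
    @Description("Shows that a plain int is a valid plugin return type")
    @PluginTarget(GraphDatabaseService.class)
    public int theAnswer(@Source GraphDatabaseService db)
    {
        return 42;
    }
}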
I would like to understand how I can analyze a method/function body to find the types that are explicitly referenced from it. I have had success analyzing the method declaration (return type, parameter types, etc.), but I have no idea how to do it for the body.
Assuming the following function:
String someFunction(int param) {
  final list = <String>['a', 'b', 'c']; // -> DartTypes: String, List<String>
  final myClass = MyClass<Arg>(); // -> DartTypes: Arg, MyClass<Arg>
  final functionCall = anotherFunction<FunctionArg<Arg>>(); // -> DartTypes: Arg, FunctionArg<Arg>
  return 'result';
}
// At this point I would like to know that my function depends on
// String, List<String>, Arg, MyClass<Arg>, FunctionArg<Arg>
// in terms of DartType instances with proper typeArguments.
I tried getting the AstNode for the method element as described here: https://stackoverflow.com/a/57043177/2033394
However, I could not get elements from the nodes to figure out their types. Their declaredElement values are always null, so I cannot get back to the Element API from the AST API.
If you've used the exact snippet from the answer you've referenced, the problem is likely in getParsedLibraryByElement(). This method only parses the referenced library - meaning that you'll get an AST that doesn't necessarily have semantic references (like the declaredElement of AST nodes) set.
Instead, you'll want to use getResolvedLibraryByElement. The AST returned by that method will have its types and references fully resolved.
With the resolved AST, you could visit the body of the method with a custom visitor to find type references. Your definition of "referenced types" isn't really exact - but perhaps you can collect types in visitNamedType for type references and visitVariableDeclaration to collect the types of variables.
I have a pipeline that successfully outputs an Avro file as follows:
@DefaultCoder(AvroCoder.class)
class MyOutput_T_S {
    T foo;
    S bar;
    Boolean baz;
    public MyOutput_T_S() {}
}

@DefaultCoder(AvroCoder.class)
class T {
    String id;
    public T() {}
}

@DefaultCoder(AvroCoder.class)
class S {
    String id;
    public S() {}
}
...
PCollection<MyOutput_T_S> output = input.apply(myTransform);
output.apply(AvroIO.Write.to("/out").withSchema(MyOutput_T_S.class));
How can I reproduce this exact behavior, except with a parameterized output MyOutput<T, S> (where T and S are both Avro-codable using reflection)?
The main issue is that Avro reflection doesn't work for parameterized types. So based on these responses:
Setting Custom Coders & Handling Parameterized types
Using Avrocoder for Custom Types with Generics
1) I think I need to write a custom CoderFactory, but I am having difficulty figuring out exactly how this works (I'm having trouble finding examples). Oddly enough, a completely naive coder factory appears to let me run the pipeline and inspect proper output using DataflowAssert:
cr.registerCoder(MyOutput.class, new CoderFactory() {
    @Override
    public Coder<?> create(List<? extends Coder<?>> componentCoders) {
        Schema schema = new Schema.Parser().parse("{\"type\":\"record\","
                + "\"name\":\"MyOutput\","
                + "\"namespace\":\"mypackage\","
                + "\"fields\":[]}");
        return AvroCoder.of(MyOutput.class, schema);
    }

    @Override
    public List<Object> getInstanceComponents(Object value) {
        MyOutput<Object, Object> myOutput = (MyOutput<Object, Object>) value;
        List<Object> components = new ArrayList<>();
        return components;
    }
});
While I can successfully assert against the output now, I expect this will not cut it for writing to a file. I haven't figured out how I'm supposed to use the provided componentCoders to generate the correct schema, and if I try to just shove the schema of T or S into fields I get:
java.lang.IllegalArgumentException: Unable to get field id from class null
2) Assuming I figure out how to encode MyOutput, what do I pass to AvroIO.Write.withSchema? If I pass either MyOutput.class or the schema, I get type mismatch errors.
I think there are two questions (correct me if I am wrong):
How do I enable the coder registry to provide coders for various parameterizations of MyOutput<T, S>?
How do I write values of MyOutput<T, S> to a file using AvroIO.Write?
The first question is to be solved by registering a CoderFactory as in the linked question you found.
Your naive coder is probably allowing you to run the pipeline without issues because serialization is being optimized away. Certainly an Avro schema with no fields will result in those fields being dropped in a serialization+deserialization round trip.
But assuming you fill in the schema with the fields, your approach to CoderFactory#create looks right. I don't know the exact cause of the message java.lang.IllegalArgumentException: Unable to get field id from class null, but the call to AvroCoder.of(MyOutput.class, schema) should work, for an appropriately assembled schema. If there is an issue with this, more details (such as the rest of the stack trace) would be helpful.
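As an illustration only (an untested sketch, and assuming T and S are themselves encoded with AvroCoder so that each component coder exposes its Avro schema), CoderFactory#create could assemble the record schema from the component coders instead of hard-coding an empty field list:

@Override
public Coder<?> create(List<? extends Coder<?>> componentCoders) {
    // Assumes the component coders are AvroCoders for T and S respectively.
    Schema tSchema = ((AvroCoder<?>) componentCoders.get(0)).getSchema();
    Schema sSchema = ((AvroCoder<?>) componentCoders.get(1)).getSchema();

    // Record and namespace names mirror the question; field names match MyOutput's fields.
    Schema schema = SchemaBuilder.record("MyOutput").namespace("mypackage")
            .fields()
            .name("foo").type(tSchema).noDefault()
            .name("bar").type(sSchema).noDefault()
            .name("baz").type().booleanType().noDefault()
            .endRecord();

    return AvroCoder.of(MyOutput.class, schema);
}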
However, your override of CoderFactory#getInstanceComponents should return a list of values, one per type parameter of MyOutput. Like so:
@Override
public List<Object> getInstanceComponents(Object value) {
    MyOutput<Object, Object> myOutput = (MyOutput<Object, Object>) value;
    return ImmutableList.of(myOutput.foo, myOutput.bar);
}
The second question can be answered using some of the same support code as the first, but otherwise is independent. AvroIO.Write.withSchema always explicitly uses the provided schema. It does use AvroCoder under the hood, but this is actually an implementation detail. Providing a compatible schema is all that is necessary - such a schema will have to be composed for each value of T and S for which you want to output MyOutput<T, S>.
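For example (a sketch only, using hypothetical concrete classes Foo and Bar standing in for T and S, and Avro's ReflectData/SchemaBuilder to compose the schema):

// Compose an explicit schema for MyOutput<Foo, Bar> for one concrete parameterization.
Schema fooSchema = ReflectData.get().getSchema(Foo.class);
Schema barSchema = ReflectData.get().getSchema(Bar.class);

Schema outputSchema = SchemaBuilder.record("MyOutput").namespace("mypackage")
        .fields()
        .name("foo").type(fooSchema).noDefault()
        .name("bar").type(barSchema).noDefault()
        .name("baz").type().booleanType().noDefault()
        .endRecord();

// Pass the composed schema rather than MyOutput.class.
PCollection<MyOutput<Foo, Bar>> output = input.apply(myTransform);
output.apply(AvroIO.Write.to("/out").withSchema(outputSchema));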
I am new to Dropwizard. I am going through different tutorials, but I couldn't find a good example/tutorial on how to call a stored procedure in Dropwizard using JDBI.
With JDBI it is possible to specify the SQL query as a method annotation. This string is treated by the DB as normal SQL. An example DAO to execute a stored procedure might look like:
public interface SomeQueries
{
    @SqlQuery("call find_name_procedure(:id)")
    String findName(@Bind("id") int id);
}
Where find_name_procedure has been previously defined.
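As a rough sketch of how this DAO is typically wired up in a Dropwizard application (assuming the dropwizard-jdbi module and a configuration class exposing a getDataSourceFactory() method - those names are assumptions here, adjust to your own config):

import io.dropwizard.jdbi.DBIFactory;
import org.skife.jdbi.v2.DBI;

// Inside your Application#run(MyConfiguration configuration, Environment environment):
final DBIFactory factory = new DBIFactory();
final DBI jdbi = factory.build(environment, configuration.getDataSourceFactory(), "database");

// JDBI generates an implementation of the annotated interface on demand.
final SomeQueries queries = jdbi.onDemand(SomeQueries.class);
final String name = queries.findName(42); // executes "call find_name_procedure(42)"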
For more information see http://jdbi.org/sql_object_api_queries/
jOOQ is an elegant way to call stored procedures from Java (see https://dzone.com/articles/using-stored-procedures-with-jpa-jdbc-meh-just-use). There is also a third-party module integrating jOOQ into Dropwizard
(see http://modules.dropwizard.io/thirdparty/).
So if you're familiar with Maven, you can set up the combination quickly and start validating if it works in your case.
Dropwizard JDBI provides two options for making a stored procedure call, depending on what you want from the stored proc.
If you want a resultset from the stored proc, Andy has answered that already:
public interface DAO
{
    @SqlQuery("call sp_name(:id)")
    String query(@Bind("id") int id);
}
The other scenario is that your stored proc doesn't return a resultset but gives out some output parameters. Here is what you can do:
@SqlCall("{ CALL sp_name(:id, :id2) }")
@OutParameter(name = "out", sqlType = Types.INTEGER)
OutParameters spCal(@Bind("id") long id, @Bind("id2") long id2);
The @OutParameter annotation may not work in Dropwizard, as the annotation was introduced in v3; you can use the following for accessing OutParameters instead:
try (Handle handle = dbi.open()) {
    parameters = handle.createCall("CALL sp_name(:IN_Field1,:OUT_field2)")
            .bind("IN_Field1", field1)
            .registerOutParameter("OUT_field2", Types.INTEGER)
            .invoke();
}
And for some really freaky situations where you need both out parameters and a resultset, Dropwizard currently doesn't provide any direct way (because the JDBI version bundled with Dropwizard is old). So you can get the connection by doing the following:
Handle handle = jdbi.open();
Connection connection = handle.getConnection();
And then do what you want to do with the connection.
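For instance, a plain-JDBC sketch (untested, with a hypothetical procedure signature and parameter positions) of handling a procedure that returns both a result set and an OUT parameter through the raw connection:

// Uses java.sql.CallableStatement / ResultSet / Types on the handle's connection.
try (Handle handle = jdbi.open()) {
    Connection connection = handle.getConnection();
    try (CallableStatement call = connection.prepareCall("{call sp_name(?, ?)}")) {
        call.setLong(1, id);                          // IN parameter
        call.registerOutParameter(2, Types.INTEGER);  // OUT parameter
        boolean hasResultSet = call.execute();
        if (hasResultSet) {
            try (ResultSet rs = call.getResultSet()) {
                while (rs.next()) {
                    // read the result set columns here
                }
            }
        }
        int outValue = call.getInt(2);                // read the OUT parameter
    }
}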
Also, one more thing - I would not try to use Hibernate for stored proc calls. It's just more pain.
I used Zalando's SProcWrapper. I like it, and it was widely used within Zalando to call PostgreSQL functions.
https://github.com/zalando-incubator/java-sproc-wrapper
I created a Java user-defined node in the Integration Toolkit (9.0.0.1) and assigned it several properties. Two of the node properties are simple (of String type) and one property is complex (a table property with a predefined type of User-defined) that consists of another two simple properties.
By following the documentation I was able to read the two simple properties in my Java extension class (which extends MbNode and implements MbNodeInterface) by creating getters and setters that match the names of the two simple properties. The documentation also states that getters and setters should return and set String values, whatever the real simple type of a property may be. Obviously, this would not work for my complex node property.
I was also able to read User Defined Properties that are defined at the message flow level by using CMP (Integration Bus API) classes, which was another thing impossible to do from a user-defined node without CMP. At one point I began to think that my complex property would be among the User Defined Properties (although UDPs are defined at the flow level and my property is defined at the custom node level), based on some other documentation and forum discussions.
I finally deduced that the complex property should map to the MbTable type (as stated in that type's description), but I was not able to use it.
Does anyone know how to access user defined node's complex(table) property value from Java?
I recently started working with WebSphere Message Broker v8.0.0.5 for one of my projects, and I was going to ask the same question until SO suggested your question, which answered mine. It might be a little late for this question, but it may help others with similar questions.
After many frustrating hours consulting IBM documentation this is what I found following your thread:
You're correct about the properties being available as user-defined properties (UDP) but only at the node level.
According to the JavaDoc for MbTable class (emphasis added to call out the relevant parts):
MbTable is a complex data type which contains one or more rows of simple data types. Its structure is very similar to a standard java record set. It can not be constructed in a node but instead is returned by the getUserDefinedAttribute() on the MbNode class. Its primary use is in allowing complex attributes to be defined on nodes instead of the normal static simple types. It can only be used in the runtime if a version of the toolkit that supports complex properties is being used.
You have to call com.ibm.broker.plugin.MbNode.getUserDefinedAttribute which will return an instance of com.ibm.broker.plugin.MbTable. However, the broker runtime doesn't call any setter methods for the complex attributes during the node initialization process like it does for simple properties. Also, you cannot access the complex attributes in either the constructor or the setter methods of other simple properties in the node class. These are available only in the run or evaluate method.
The following is the decompiled method definition of com.ibm.broker.plugin.MbNode.getUserDefinedAttribute.
public Object getUserDefinedAttribute(String string) {
    Object object;
    String string2 = "addDynamicTerminals";
    if (Trace.isOn) {
        Trace.logNamedEntry((Object)this, (String)string2);
    }
    if ((object = this.getUDA(string)) != null && object.getClass() == MbElement.class) {
        try {
            MbTable mbTable;
            MbElement mbElement = (MbElement)object;
            object = mbTable = new MbTable(mbElement);
        }
        catch (MbException var4_5) {
            if (Trace.isOn) {
                Trace.logStackTrace((Object)this, (String)string2, (Throwable)var4_5);
            }
            object = null;
        }
    }
    if (Trace.isOn) {
        Trace.logNamedExit((Object)this, (String)string2);
    }
    return object;
}
As you can see it always returns an instance of MbTable if the attribute is found.
I was able to access the complex attributes with the following code in my node definition:
@Override
public void evaluate(MbMessageAssembly inAssembly, MbInputTerminal inTerminal) throws MbException {
    checkUserDefinedProperties();
}

/**
 * @throws MbException
 */
private void checkUserDefinedProperties() throws MbException {
    Object obj = getUserDefinedAttribute("geoLocations");
    if (obj instanceof MbTable) {
        MbTable table = (MbTable) obj;
        int size = table.size();
        int i = 0;
        table.moveToRow(i);
        for (; i < size; i++, table.next()) {
            String latitude = (String) table.getValue("latitude");
            String longitude = (String) table.getValue("longitude");
        }
    }
}
The documentation for declaring attributes for user-defined extensions in Java is surprisingly silent on this little bit of detail.
Please note that all the references and code are for WebSphere Message Broker v 8.0.0 and should be relevant for IBM Integration Bus 9.0.0.1 too.
I'm working on a BizTalk project and use a map to create the new message.
Now I want to map a date field to a string.
I thought I could do it this way with a scripting functoid using inline C#:
public string convertDateTime(DateTime param)
{
    return System.Xml.XmlConvert.ToString(param, "yyyyMMdd");
}
But this doesn't work and I receive an error. How can I do the conversion in the map?
It's a BizTalk 2006 project.
Without the details of the error you are seeing it is hard to be sure, but I'm quite sure that your map is failing because all the parameters within the BizTalk XSLT engine are passed as strings.[1]
When I try to run something like the function you provided as inline C# I get the following error:
Object of type 'System.String' cannot be converted to type 'System.DateTime'
Replace your inline C# with something like the following:
public string ConvertDateTime(string param1)
{
    DateTime inputDate = DateTime.Parse(param1);
    return inputDate.ToString("yyyyMMdd");
}
Note that the parameter type is now string, and you can then convert that to a DateTime and perform your string format.
As other answers have suggested, it may be better to put this helper method into an external class - that way you can get your code under test to deal with edge cases, and you also get some reuse.
[1] The fact that all parameters in the BizTalk XSLT are strings can be the source of a lot of gotchas - one other common one is with math calculations. If you return numeric values from your scripting functoids, BizTalk will helpfully convert them to strings to map them to the outbound schema but will not so helpfully perform some very random rounding on the resulting values. Converting the return values to strings yourself within the C# will remove this risk and give you the expected results.
If you're using the mapper, you just need a Scripting Functoid (yes, using inline C#) and you should be able to do:
public string convertDateTime(DateTime param)
{
    return param.ToString("yyyyMMdd");
}
As far as I know, you don't need to use the System.Xml namespace anyway.
I'd suggest
public static string DateToString(DateTime dateValue)
{
    return String.Format("{0:yyyyMMdd}", dateValue);
}
You could also create an external lib, which would provide more flexibility and reusability:
public static string DateToString(DateTime dateValue, string formatPicture)
{
    string format = formatPicture;
    if (String.IsNullOrEmpty(formatPicture))
    {
        format = "{0:yyyyMMdd}";
    }
    return String.Format(format, dateValue);
}

public static string DateToString(DateTime dateValue)
{
    return DateToString(dateValue, null);
}
I tend to move every function I use twice inside an inline script into an external lib. It will give you well-tested code for all the edge cases your data may provide, because it's easy to create tests for these external lib functions, whereas it's hard to do good testing on inline scripts in maps.
This blog will solve your problem.
http://biztalkorchestration.blogspot.in/2014/07/convert-datetime-format-to-string-in.html?view=sidebar
Given that maps in BizTalk are implemented as XSL stylesheets, when passing data into an msxsl scripting function, note that the data will be one of the types in the Equivalent .NET Framework Class (Types) column of this table. You'll note that System.DateTime isn't on the list.
For parsing xs:dateTime values, I've generally obtained the /text() node and then parsed the parameter from a System.String:
<CreateDate>
  <xsl:value-of select="userCSharp:GetDateyyyyMMdd(string(s0:StatusIdChangeDate/text()))" />
</CreateDate>
And then the C# script
<msxsl:script language="C#" implements-prefix="userCSharp">
  <![CDATA[
  public System.String GetDateyyyyMMdd(System.String p_DateTime)
  {
      return System.DateTime.Parse(p_DateTime).ToString("yyyyMMdd");
  }
  ]]>
</msxsl:script>