How to list all binding variables with GroovyShell - binding

I'm very new to Groovy. How can I list all the variables I passed to the Binding constructor?
Consider the following:
@Test
public void test() {
List<String> outputNames = Arrays.asList("returnValue", "ce");
String script = getScript();
Script compiledScript = compileScript(script);
CustomError ce = new CustomError("shit", Arrays.asList(new Long(1)));
Map<String, Object> inputObjects = new HashMap<String, Object>();
inputObjects.put("input", "Hovada");
inputObjects.put("error", ce);
Binding binding = new Binding(inputObjects);
compiledScript.setBinding(binding);
compiledScript.run();
for (String outputName : outputNames) {
System.out.format("outputName : %s = %s", outputName, binding.getVariable(outputName));
}
}
private Script compileScript(String script) {
GroovyShell groovyShell = new GroovyShell();
Script compiledScript = groovyShell.parse(script);
return compiledScript;
}
How can I iterate over all the variables (over the HashMap) in the Groovy script?

Script compiledScript represents the script. If you look at its source code, you'll see that it has a binding property with a getter and setter, and that Binding has a variables field. So you can do:
binding.variables.each{
println it.key
println it.value
}
This works because binding.variables is just a Map, so each entry gives you the key and the value.
You can also set properties like this:
Binding binding = new Binding(inputObjects);
compiledScript.setBinding(binding);
compiledScript.setProperty("prop", "value");
compiledScript.run();
and it is stored in the Binding's variables as well.
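For completeness, here is a minimal, self-contained Java sketch of the same idea; the one-line script and the variable names are made up for illustration:
import groovy.lang.Binding;
import groovy.lang.GroovyShell;
import groovy.lang.Script;
import java.util.HashMap;
import java.util.Map;
public class BindingDump {
    public static void main(String[] args) {
        Map<String, Object> inputs = new HashMap<String, Object>();
        inputs.put("input", "Hovada");
        GroovyShell shell = new GroovyShell();
        // Hypothetical one-line script that writes a new variable into the binding.
        Script script = shell.parse("returnValue = input.toUpperCase()");
        script.setBinding(new Binding(inputs));
        script.run();
        // Binding.getVariables() exposes the underlying map, so it can be iterated directly.
        Map<?, ?> variables = script.getBinding().getVariables();
        for (Map.Entry<?, ?> entry : variables.entrySet()) {
            System.out.format("outputName : %s = %s%n", entry.getKey(), entry.getValue());
        }
    }
}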

Related

spring integration dsl :: stored procedure passing input parameter

I am trying to convert an existing stored-proc outbound gateway from XML into the DSL.
<int-jdbc:stored-proc-outbound-gateway id="my-proc"
request-channel="myChannel"
data-source="datasource"
stored-procedure-name="SAMPLE_SP"
expect-single-result="false"
ignore-column-meta-data="true">
<!-- Parameter Definitions -->
<int-jdbc:sql-parameter-definition name="V_TEST_ID" direction="IN"/>
<int-jdbc:sql-parameter-definition name="O_MSG" direction="OUT"/>
<!-- Parameter Mappings Before Passing & Receiving -->
<int-jdbc:parameter name="V_TEST_ID" expression="payload.testId"/>
</int-jdbc:stored-proc-outbound-gateway>
Can you please shed some light on how to pass input parameters in the DSL?
@Bean
public StoredProcOutboundGateway spGateway(){
StoredProcOutboundGateway storedProcOutboundGateway = new StoredProcOutboundGateway(storedProcExecutor());
storedProcOutboundGateway.setExpectSingleResult(true);
storedProcOutboundGateway.setRequiresReply(true);
return storedProcOutboundGateway;
}
@Bean
public StoredProcExecutor storedProcExecutor() {
StoredProcExecutor storedProcExecutor = new StoredProcExecutor(this.datasource);
storedProcExecutor.setStoredProcedureName("SAMPLE_SP2");
storedProcExecutor.setIsFunction(false);
storedProcExecutor.setReturningResultSetRowMappers(..);
return storedProcExecutor;
}
You need to create the procedure parameters and the SQL parameters:
@Bean
public StoredProcExecutor storedProcExecutor() {
StoredProcExecutor storedProcExecutor = new StoredProcExecutor(this.datasource);
storedProcExecutor.setStoredProcedureName("SAMPLE_SP2");
storedProcExecutor.setIsFunction(false);
storedProcExecutor.setReturningResultSetRowMappers(..);
List<ProcedureParameter> procedureParameters = new ArrayList<>();
procedureParameters.add(new ProcedureParameter("cdc_group_name", groupName, null));
// TODO set output_limit from property file
procedureParameters.add(new ProcedureParameter("output_limit", 500, null));
storedProcExecutor.setProcedureParameters(procedureParameters);
List<SqlParameter> sqlParameters = new ArrayList<>();
sqlParameters.add(new SqlParameter("cdc_group_name", Types.CHAR));
sqlParameters.add(new SqlParameter("output_limit", Types.BIGINT));
storedProcExecutor.setSqlParameters(sqlParameters);
return storedProcExecutor;
}
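If, as in the original XML, the IN parameter value should come from the message rather than a constant, the third argument of the ProcedureParameter constructor takes a SpEL expression. Here is a sketch under that assumption; the SQL types are guesses and must match your actual procedure:
import java.sql.Types;
import java.util.ArrayList;
import java.util.List;
import org.springframework.integration.jdbc.StoredProcExecutor;
import org.springframework.integration.jdbc.storedproc.ProcedureParameter;
import org.springframework.jdbc.core.SqlOutParameter;
import org.springframework.jdbc.core.SqlParameter;
// Mirrors <int-jdbc:parameter name="V_TEST_ID" expression="payload.testId"/>
// and the IN/OUT <int-jdbc:sql-parameter-definition> elements from the XML.
private void configureParameters(StoredProcExecutor storedProcExecutor) {
    List<ProcedureParameter> procedureParameters = new ArrayList<>();
    // name, static value (none here), SpEL expression evaluated against the message
    procedureParameters.add(new ProcedureParameter("V_TEST_ID", null, "payload.testId"));
    storedProcExecutor.setProcedureParameters(procedureParameters);
    List<SqlParameter> sqlParameters = new ArrayList<>();
    sqlParameters.add(new SqlParameter("V_TEST_ID", Types.BIGINT)); // assumed SQL type
    sqlParameters.add(new SqlOutParameter("O_MSG", Types.VARCHAR)); // OUT parameter
    storedProcExecutor.setSqlParameters(sqlParameters);
}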

DynamicDestinations in Apache Beam

I have a PCollection<String>, say "X", that I need to dump into a BigQuery table.
The table destination and its schema are in a PCollection<TableRow>, say "Y".
How can I accomplish this in the simplest manner?
I tried extracting the table and schema from "Y" and saving them in static global variables (tableName and schema respectively). But oddly, BigQueryIO.writeTableRows() always sees the tableName variable as null, while it does get the schema. I tried logging the values of those variables and I can see the values are there for both.
Here is my pipeline code:
static String tableName;
static TableSchema schema;
PCollection<String> read = p.apply("Read from input file",
TextIO.read().from(options.getInputFile()));
PCollection<TableRow> tableRows = p.apply(
BigQueryIO.read().fromQuery(NestedValueProvider.of(
options.getfilename(),
new SerializableFunction<String, String>() {
@Override
public String apply(String filename) {
return "SELECT table,schema FROM `BigqueryTest.configuration` WHERE file='" + filename +"'";
}
})).usingStandardSql().withoutValidation());
final PCollectionView<List<String>> dataView = read.apply(View.asList());
tableRows.apply("Convert data read from file to TableRow",
ParDo.of(new DoFn<TableRow,TableRow>(){
@ProcessElement
public void processElement(ProcessContext c) {
tableName = c.element().get("table").toString();
String[] schemas = c.element().get("schema").toString().split(",");
List<TableFieldSchema> fields = new ArrayList<>();
for(int i=0;i<schemas.length;i++) {
fields.add(new TableFieldSchema()
.setName(schemas[i].split(":")[0]).setType(schemas[i].split(":")[1]));
}
schema = new TableSchema().setFields(fields);
//My code to convert data to TableRow format.
}}).withSideInputs(dataView));
tableRows.apply("write to BigQuery",
BigQueryIO.writeTableRows()
.withSchema(schema)
.to("ProjectID:DatasetID."+tableName)
.withWriteDisposition(BigQueryIO.Write.WriteDisposition.WRITE_TRUNCATE)
.withCreateDisposition(BigQueryIO.Write.CreateDisposition.CREATE_IF_NEEDED));
Everything works fine except the BigQueryIO.write operation, which fails with the error that the TableId is null.
I also tried using a SerializableFunction and returning the value from there, but I still get null.
Here is the code that I tried for it:
tableRows.apply("write to BigQuery",
BigQueryIO.writeTableRows()
.withSchema(schema)
.to(new GetTable(tableName))
.withWriteDisposition(BigQueryIO.Write.WriteDisposition.WRITE_TRUNCATE)
.withCreateDisposition(BigQueryIO.Write.CreateDisposition.CREATE_IF_NEEDED));
public static class GetTable implements SerializableFunction<String,String> {
String table;
public GetTable() {
this.table = tableName;
}
@Override
public String apply(String arg0) {
return "ProjectId:DatasetId."+table;
}
}
I also tried using DynamicDestinations but I get an error saying schema is not provided. Honestly I'm new to the concept of DynamicDestinations and I'm not sure that I'm doing it correctly.
Here is the code that I tried for it:
tableRows2.apply(BigQueryIO.writeTableRows()
.to(new DynamicDestinations<TableRow, TableRow>() {
private static final long serialVersionUID = 1L;
@Override
public TableDestination getTable(TableRow dest) {
List<TableRow> list = sideInput(bqDataView); //bqDataView contains table and schema
String table = list.get(0).get("table").toString();
String tableSpec = "ProjectId:DatasetId."+table;
String tableDescription = "";
return new TableDestination(tableSpec, tableDescription);
}
public String getSideInputs(PCollectionView<List<TableRow>> bqDataView) {
return null;
}
@Override
public TableSchema getSchema(TableRow destination) {
return schema; //schema is getting added from the global variable
}
@Override
public TableRow getDestination(ValueInSingleWindow<TableRow> element) {
return null;
}
}.getSideInputs(bqDataView)));
Please let me know what I'm doing wrong and which path I should take.
Thank You.
Part of the reason you're having trouble is the two stages of pipeline execution. First, the pipeline is constructed on your machine; this is when all of the applications of PTransforms occur. In your first example, this is when the following lines are executed:
BigQueryIO.writeTableRows()
.withSchema(schema)
.to("ProjectID:DatasetID."+tableName)
The code within a ParDo, however, runs when your pipeline executes, and it does so on many machines. So the following code runs much later than pipeline construction:
@ProcessElement
public void processElement(ProcessContext c) {
tableName = c.element().get("table").toString();
...
schema = new TableSchema().setFields(fields);
...
}
This means that neither the tableName nor the schema fields will be set when the BigQueryIO sink is created.
Your idea to use DynamicDestinations is correct, but you need to move the code that actually generates the schema and the destination into that class, rather than relying on global variables that aren't available on all of the machines.
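To make that concrete, here is a rough sketch of a DynamicDestinations implementation that derives both the table name and the schema from a side input at execution time. It reuses the bqDataView view and the "name:type,name:type" schema string from the question; the String destination key and the "ProjectId:DatasetId." spec are illustrative assumptions, not a drop-in solution:
tableRows.apply("write to BigQuery",
    BigQueryIO.writeTableRows()
        .to(new DynamicDestinations<TableRow, String>() {
          @Override
          public List<PCollectionView<?>> getSideInputs() {
            // Makes the view available to the callbacks below via sideInput().
            return Collections.singletonList(bqDataView);
          }
          @Override
          public String getDestination(ValueInSingleWindow<TableRow> element) {
            // Read the configuration row from the side input at execution time.
            TableRow config = sideInput(bqDataView).get(0);
            return config.get("table") + "|" + config.get("schema");
          }
          @Override
          public TableDestination getTable(String destination) {
            String table = destination.split("\\|")[0];
            return new TableDestination("ProjectId:DatasetId." + table, "");
          }
          @Override
          public TableSchema getSchema(String destination) {
            // Rebuild the schema from the "name:type,name:type" string.
            List<TableFieldSchema> fields = new ArrayList<>();
            for (String col : destination.split("\\|")[1].split(",")) {
              String[] parts = col.split(":");
              fields.add(new TableFieldSchema().setName(parts[0]).setType(parts[1]));
            }
            return new TableSchema().setFields(fields);
          }
        })
        .withWriteDisposition(BigQueryIO.Write.WriteDisposition.WRITE_TRUNCATE)
        .withCreateDisposition(BigQueryIO.Write.CreateDisposition.CREATE_IF_NEEDED));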

Execute read operations in sequence - Apache Beam

I need to execute the operations below in sequence, as given:
PCollection<String> read = p.apply("Read Lines",TextIO.read().from(options.getInputFile()))
.apply("Get fileName",ParDo.of(new DoFn<String,String>(){
ValueProvider<String> fileReceived = options.getfilename();
@ProcessElement
public void processElement(ProcessContext c)
{
fileName = fileReceived.get().toString();
LOG.info("File: "+fileName);
}
}));
PCollection<TableRow> rows = p.apply("Read from BigQuery",
BigQueryIO.read()
.fromQuery("SELECT table,schema FROM `DatasetID.TableID` WHERE file='" + fileName +"'")
.usingStandardSql());
How to accomplish this in Apache Beam/Dataflow?
It seems that you want to apply BigQueryIO.read().fromQuery() to a query that depends on a value available via a property of type ValueProvider<String> in your PipelineOptions, and that value is not accessible at pipeline construction time - i.e. you are invoking your job via a template.
In that case, the proper solution is to use NestedValueProvider:
PCollection<TableRow> tableRows = p.apply(BigQueryIO.read().fromQuery(
NestedValueProvider.of(
options.getfilename(),
new SerializableFunction<String, String>() {
@Override
public String apply(String filename) {
return "SELECT table,schema FROM `DatasetID.TableID` WHERE file='" + filename + "'";
}
})));

Given a TFS build definition how to get the value and type of an arbitrary process parameter?

Given a build definition, I extract the following pieces from it:
m_template = (DynamicActivity)WorkflowHelpers.DeserializeWorkflow(buildDefinition.Process.Parameters);
Properties = m_template.Properties.ToDictionary(p => p.Name, StringComparer.OrdinalIgnoreCase);
Metadata = WorkflowHelpers.GetCombinedMetadata(m_template).ToDictionary(m => m.ParameterName);
m_parameters = WorkflowHelpers.DeserializeProcessParameters(buildDefinition.ProcessParameters)
Now I wish to know the value of an arbitrary process parameter.
My current code is:
public ParameterValue GetParameterValue(string name)
{
object propValue;
var valueType = GetParameterType(name, out propValue);
object value;
if (!m_parameters.TryGetValue(name, out value))
{
value = propValue;
}
return new ParameterValue(valueType, value);
}
private Type GetParameterType(string name, out object value)
{
value = null;
if (Properties != null)
{
DynamicActivityProperty property;
if (Properties.TryGetValue(name, out property))
{
var inArgument = property.Value as InArgument;
if (inArgument != null)
{
if (inArgument.Expression != null)
{
var exprString = inArgument.Expression.ToString();
if (!exprString.StartsWith(": VisualBasicValue<"))
{
value = exprString;
}
}
return inArgument.ArgumentType;
}
if (property.Value != null)
{
value = property.Value;
return property.Value.GetType();
}
var typeName = property.Type.ToString();
if (typeName.StartsWith(IN_ARGUMENT_TYPE_NAME_PREFIX))
{
typeName = typeName.Substring(IN_ARGUMENT_TYPE_NAME_PREFIX.Length, typeName.Length - IN_ARGUMENT_TYPE_NAME_PREFIX.Length - 1);
return Type.GetType(typeName, true);
}
return property.Type;
}
}
return typeof(string);
}
Unfortunately, this code stumbles for parameters satisfying all of the following conditions:
The parameter value is wrapped as InArgument<T>.
T is a non-primitive type, for example string[].
The build definition does not override the value inherited from the process template.
What happens is that:
Because the value is non-primitive, exprString.StartsWith(": VisualBasicValue<") is true and I do not know how to handle that case, so propValue stays null.
Because the value is not overridden by the build definition, m_parameters.TryGetValue(name, out value) fails, and hence I just return propValue.
As a result my logic returns null. But that is wrong! For example, I have a string[] parameter which holds a list of strings in the process template, but my logic returns null for the reasons explained.
So, what is the proper way to compute it?
You can use the following code (included in another, linked answer) to get the value and type of one process parameter:
TfsTeamProjectCollection tfctc = new TfsTeamProjectCollection(new Uri("http://tfsservername:8080/tfs/DefaultCollection"));
IBuildServer bs = tfctc.GetService<IBuildServer>();
IBuildDetail[] builds = bs.QueryBuilds("teamprojectname", "builddefinitionname");
foreach (var build in builds)
{
var buildefinition = build.BuildDefinition;
IDictionary<String, Object> paramValues = WorkflowHelpers.DeserializeProcessParameters(buildefinition.ProcessParameters);
string processParametersValue = paramValues["argument1"].ToString();
Console.WriteLine(processParametersValue);
}
Also take a look at this case: TFS 2010: Why is it not possible to deserialize a Dictionary<string, object> with XamlWriter.Save when I can use XamlReader for deserializing

Rhino, e4x and generating URLs in XHTML

I'm using Rhino to generate XHTML, but my URLs are being encoded, as in:
http://www.example.com/test.html?a=b&c=d
becomes
http://www.example.com/test.html?a=b&amp;c=d
A failing test case is as follows:
public class E4XUrlTest extends TestCase {
public void testJavascript() throws Exception {
final Context context = new ContextFactory().enterContext();
context.setLanguageVersion(Context.VERSION_1_7);
try {
final ScriptableObject scope = new Global(context);
final Script compiledScript = context.compileReader(
new StringReader("<html><body><a href={'blah.html?id=2345&name=345'}></a></body></html>"), "test", 1, null);
HashMap<String, Object> variables = new HashMap<String, Object>();
Set<Entry<String, Object>> entrySet = variables.entrySet();
for (Entry<String, Object> entry : entrySet) {
ScriptableObject.putProperty(scope, entry.getKey(), Context.javaToJS(entry.getValue(), scope));
}
Object exec = compiledScript.exec(context, scope);
String html = exec.toString();
System.out.println(html);
assertTrue(html.indexOf("id=2345&name") > 0);
} finally {
Context.exit();
}
}
}
Any ideas?
Actually the encoding "&amp;name" is correct in XHTML, since "&name;" is NOT a valid XHTML entity. All browsers understand the URL correctly, so you need to fix your test rather than trying to break your correct XHTML.
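So the fix belongs in the test; a minimal sketch of the adjusted assertion, assuming the rest of the test stays as posted:
// The serializer escapes the ampersand in the attribute value,
// so assert on the escaped form instead of the raw one.
assertTrue(html.indexOf("id=2345&amp;name") > 0);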
