Rhino, E4X and generating URLs in XHTML

I'm using Rhino to generate XHTML, but the ampersands in my URLs are being entity-encoded, so that

http://www.example.com/test.html?a=b&c=d

becomes

http://www.example.com/test.html?a=b&amp;c=d

A failing test case follows:
public class E4XUrlTest extends TestCase {
    public void testJavascript() throws Exception {
        final Context context = new ContextFactory().enterContext();
        context.setLanguageVersion(Context.VERSION_1_7);
        try {
            final ScriptableObject scope = new Global(context);
            final Script compiledScript = context.compileReader(
                    new StringReader("<html><body><a href={'blah.html?id=2345&name=345'}></a></body></html>"),
                    "test", 1, null);
            HashMap<String, Object> variables = new HashMap<String, Object>();
            Set<Entry<String, Object>> entrySet = variables.entrySet();
            for (Entry<String, Object> entry : entrySet) {
                ScriptableObject.putProperty(scope, entry.getKey(),
                        Context.javaToJS(entry.getValue(), scope));
            }
            Object exec = compiledScript.exec(context, scope);
            String html = exec.toString();
            System.out.println(html);
            assertTrue(html.indexOf("id=2345&name") > 0);
        } finally {
            Context.exit();
        }
    }
}
Any ideas?

Actually, the encoding "&amp;name" is correct in XHTML, since &name; is NOT a valid XHTML entity. All browsers understand the URL correctly, so you need to fix your test rather than trying to break your correct XHTML.
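For example, the test's assertion could be updated to expect the entity-encoded form (a minimal sketch of the fix, assuming the serializer encodes the ampersand as &amp;):

assertTrue(html.indexOf("id=2345&amp;name") > 0);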
:-) stw


Neo4j return nested object from extension procedure

I am trying to create an extension procedure for Neo4j that returns a complex object (that is, an object that contains another object).
public static class A {
    public final String a;
    public A(String a) {
        this.a = a;
    }
}

public static class Output {
    public Object out;
    public Output(Object out) {
        this.out = out;
    }
}

@Procedure(value = "my_proc", mode = Mode.READ)
public Stream<Output> myProc() {
    return Stream.of(new Output(new A("a")));
}
When I execute call my_proc(); using the Neo4j Browser, it just shows the progress circle and never returns.
When I execute the same call using the Java driver, I get the following exception:
SEVERE: [0xedd70cbd] Fatal error occurred in the pipeline
org.neo4j.driver.internal.shaded.io.netty.handler.codec.DecoderException: Failed to read inbound message:
at org.neo4j.driver.internal.async.inbound.InboundMessageHandler.channelRead0(InboundMessageHandler.java:87)
at org.neo4j.driver.internal.async.inbound.InboundMessageHandler.channelRead0(InboundMessageHandler.java:35)
....
at org.neo4j.driver.internal.shaded.io.netty.channel.nio.NioEventLoop.processSelectedKey(NioEventLoop.java:645)
at org.neo4j.driver.internal.shaded.io.netty.channel.nio.NioEventLoop.processSelectedKeysOptimized(NioEventLoop.java:580)
at org.neo4j.driver.internal.shaded.io.netty.channel.nio.NioEventLoop.processSelectedKeys(NioEventLoop.java:497)
at org.neo4j.driver.internal.shaded.io.netty.channel.nio.NioEventLoop.run(NioEventLoop.java:459)
at org.neo4j.driver.internal.shaded.io.netty.util.concurrent.SingleThreadEventExecutor$5.run(SingleThreadEventExecutor.java:858)
at org.neo4j.driver.internal.shaded.io.netty.util.concurrent.DefaultThreadFactory$DefaultRunnableDecorator.run(DefaultThreadFactory.java:138)
at java.lang.Thread.run(Thread.java:748)
Caused by: java.lang.IndexOutOfBoundsException: readerIndex(49) + length(1) exceeds writerIndex(49): PooledDuplicatedByteBuf(ridx: 49, widx: 49, cap: 133, unwrapped: PooledUnsafeDirectByteBuf(ridx: 51, widx: 89, cap: 133))
at org.neo4j.driver.internal.shaded.io.netty.buffer.AbstractByteBuf.checkReadableBytes0(AbstractByteBuf.java:1401)
at org.neo4j.driver.internal.shaded.io.netty.buffer.AbstractByteBuf.readByte(AbstractByteBuf.java:707)
at org.neo4j.driver.internal.async.inbound.ByteBufInput.readByte(ByteBufInput.java:45)
at org.neo4j.driver.internal.packstream.PackStream$Unpacker.unpackLong(PackStream.java:479)
at org.neo4j.driver.internal.messaging.PackStreamMessageFormatV1$Reader.unpackValue(PackStreamMessageFormatV1.java:479)
at org.neo4j.driver.internal.messaging.PackStreamMessageFormatV1$Reader.unpackRecordMessage(PackStreamMessageFormatV1.java:464)
at org.neo4j.driver.internal.messaging.PackStreamMessageFormatV1$Reader.read(PackStreamMessageFormatV1.java:390)
at org.neo4j.driver.internal.async.inbound.InboundMessageHandler.channelRead0(InboundMessageHandler.java:83)
... 39 more
Is there any way to return a nested object without serializing it to JSON first?
#Procedure(value = "ebc.neo4j.justamap", mode = Mode.READ)
public Stream<MapResult> justamap() {
HashMap<String, Object> v1Map = new HashMap<String, Object>(1);
HashMap<String, Object> v2Map = new HashMap<String, Object>(1);
v2Map.put("a", "a string");
v1Map.put("map inside", v2Map);
return Stream.of(new MapResult(v1Map));
}
public static class MapResult {
public Map internalmap;
public MapResult(Map aInternalId) {
this.internalmap = aInternalId;
}
}
Something like the above is possible (nested as deeply as you want), but returning custom objects to be used in Cypher (which is the idea when you're writing a procedure) is not going to work, because Cypher has no idea what those custom objects are. So you will have to do some sanitizing.
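Applied to the myProc example from the question, the sanitizing could look like the sketch below, which copies the custom object's fields into a plain Map before returning it (the field name "a" comes from the question's A class):

@Procedure(value = "my_proc", mode = Mode.READ)
public Stream<Output> myProc() {
    A a = new A("a");
    // Copy the custom object's fields into a Map, which Cypher does understand.
    Map<String, Object> sanitized = new HashMap<>();
    sanitized.put("a", a.a);
    return Stream.of(new Output(sanitized));
}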
Hope this helps.
Regards,
Tom

Verify Backing Bean values using Arquillian Warp

My goal is to test the following using Arquillian Warp:
1. A previous test has already navigated to a JSF page.
2. Another test sets a text field to a value; using Warp, I need to inject the ViewScoped bean and verify the value in the backing bean.
Sample code:
@RunWith(Arquillian.class)
@WarpTest
@RunAsClient
public class TestIT {

    private static final String WEBAPP_SRC = "src/main/webapp";
    private static final String WEB_INF_SRC = "src/main/webapp/WEB-INF";
    private static final String WEB_RESOURCES = "src/main/webapp/resources";

    @Deployment(testable = true)
    public static WebArchive createDeployment() {
        File[] files = Maven.resolver().loadPomFromFile("pom.xml")
                .importRuntimeDependencies().resolve().withTransitivity().asFile();
        WebArchive war = ShrinkWrap.create(WebArchive.class, "test.war")
                .addPackages(true, "com.mobitill")
                .addAsWebInfResource(EmptyAsset.INSTANCE, "beans.xml")
                .addAsWebInfResource(new File(WEB_INF_SRC, "template.xhtml"))
                .addAsWebInfResource(new File(WEB_INF_SRC, "jboss-web.xml"))
                .addAsWebInfResource(new File(WEB_INF_SRC, "web.xml"))
                .addAsWebResource(new File(WEBAPP_SRC, "index.xhtml"))
                .addAsWebResource(new File("src/main/webapp/demo", "home.xhtml"), "demo/home.xhtml")
                .addAsResource("test-persistence.xml", "META-INF/persistence.xml")
                .merge(ShrinkWrap.create(GenericArchive.class).as(ExplodedImporter.class)
                        .importDirectory(WEB_RESOURCES).as(GenericArchive.class), "resources")
                .addAsLibraries(files);
        System.out.println(war.toString(true));
        return war;
    }

    @Drone
    private WebDriver browser;

    @ArquillianResource
    private URL deploymentUrl;

    @Test
    @InSequence(1)
    public final void browserTest() throws Exception {
        browser.get(deploymentUrl.toExternalForm() + "index");
        guardHttp(loginImage).click();
        Assert.assertEquals("navigate to home page ", "https://127.0.0.1:8080/citi/demo/home", browser.getCurrentUrl());
    }

    @Test
    @InSequence(2)
    public final void homeManagedBean() throws Exception {
        Warp
            .initiate(new Activity() {
                @Override
                public void perform() {
                    WebElement txtMerchantEmailAddress = browser.findElement(By.id("txtMerchantEmailAddress"));
                    txtMerchantEmailAddress.sendKeys("demouser@yahoo.com");
                    guardAjax(btnMerchantSave).click();
                }
            })
            .observe(request().header().containsHeader("faces-request"))
            .inspect(new Inspection() {
                private static final long serialVersionUID = 1L;

                @Inject
                HomeManagedBean hmb;

                @ArquillianResource
                FacesContext facesContext;

                @BeforePhase(UPDATE_MODEL_VALUES)
                public void initial_state_havent_changed_yet() {
                    Assert.assertEquals("email value ", "demouser@yahoo.com", hmb.getMerchantEmail());
                }

                @AfterPhase(UPDATE_MODEL_VALUES)
                public void changed_input_value_has_been_applied() {
                    Assert.assertEquals(" email value ", "demouser@yahoo.com", hmb.getMerchantEmail());
                }
            });
    }
}
The error I keep getting is:
org.jboss.arquillian.warp.impl.client.execution.WarpSynchronizationException: The Warp failed to observe requests or match them with response.
There were no requests matched by observer [containsHeader('faces-request')]
If Warp enriched a wrong request, use observe(...) method to select appropriate request which should be enriched instead.
Otherwise check the server-side log and enable Arquillian debugging mode on both, test and server VM by passing -Darquillian.debug=true.
at org.jboss.arquillian.warp.impl.client.execution.SynchronizationPoint.awaitResponses(SynchronizationPoint.java:155)
at org.jboss.arquillian.warp.impl.client.execution.DefaultExecutionSynchronizer.waitForResponse(DefaultExecutionSynchronizer.java:60)
at org.jboss.arquillian.warp.impl.client.execution.WarpExecutionObserver.awaitResponse(WarpExecutionObserver.java:64)
Any help will be welcome, as would an alternative way of validating a JSF ViewScoped bean during integration testing.
I was able to work out why it was not working and created a sample project for future reference, in case anyone runs into the same problem:
Testing using arquillian warp example

DynamicDestinations in Apache Beam

I have a PCollection<String>, say "X", that I need to write to a BigQuery table.
The table destination and its schema are in a PCollection<TableRow>, say "Y".
How can I accomplish this in the simplest manner?
I tried extracting the table and schema from "Y" and saving them in static global variables (tableName and schema respectively). But oddly, BigQueryIO.writeTableRows() always sees the value of tableName as null, while it does get the schema. I logged the values of those variables, and the values are there for both.
Here is my pipeline code:
static String tableName;
static TableSchema schema;

PCollection<String> read = p.apply("Read from input file",
        TextIO.read().from(options.getInputFile()));

PCollection<TableRow> tableRows = p.apply(
        BigQueryIO.read().fromQuery(NestedValueProvider.of(
                options.getfilename(),
                new SerializableFunction<String, String>() {
                    @Override
                    public String apply(String filename) {
                        return "SELECT table,schema FROM `BigqueryTest.configuration` WHERE file='" + filename + "'";
                    }
                })).usingStandardSql().withoutValidation());

final PCollectionView<List<String>> dataView = read.apply(View.asList());

tableRows.apply("Convert data read from file to TableRow",
        ParDo.of(new DoFn<TableRow, TableRow>() {
            @ProcessElement
            public void processElement(ProcessContext c) {
                tableName = c.element().get("table").toString();
                String[] schemas = c.element().get("schema").toString().split(",");
                List<TableFieldSchema> fields = new ArrayList<>();
                for (int i = 0; i < schemas.length; i++) {
                    fields.add(new TableFieldSchema()
                            .setName(schemas[i].split(":")[0]).setType(schemas[i].split(":")[1]));
                }
                schema = new TableSchema().setFields(fields);
                // My code to convert data to TableRow format.
            }
        }).withSideInputs(dataView));

tableRows.apply("write to BigQuery",
        BigQueryIO.writeTableRows()
                .withSchema(schema)
                .to("ProjectID:DatasetID." + tableName)
                .withWriteDisposition(BigQueryIO.Write.WriteDisposition.WRITE_TRUNCATE)
                .withCreateDisposition(BigQueryIO.Write.CreateDisposition.CREATE_IF_NEEDED));
Everything works fine; only the BigQueryIO.write operation fails, and I get the error TableId is null.
I also tried using a SerializableFunction and returning the value from there, but I still get null.
Here is the code I tried for it:
tableRows.apply("write to BigQuery",
BigQueryIO.writeTableRows()
.withSchema(schema)
.to(new GetTable(tableName))
.withWriteDisposition(BigQueryIO.Write.WriteDisposition.WRITE_TRUNCATE)
.withCreateDisposition(BigQueryIO.Write.CreateDisposition.CREATE_IF_NEEDED));
public static class GetTable implements SerializableFunction<String,String> {
String table;
public GetTable() {
this.table = tableName;
}
#Override
public String apply(String arg0) {
return "ProjectId:DatasetId."+table;
}
}
I also tried using DynamicDestinations, but I get an error saying the schema is not provided. Honestly, I'm new to the concept of DynamicDestinations and I'm not sure I'm doing it correctly.
Here is the code I tried for it:
tableRows2.apply(BigQueryIO.writeTableRows()
        .to(new DynamicDestinations<TableRow, TableRow>() {
            private static final long serialVersionUID = 1L;

            @Override
            public TableDestination getTable(TableRow dest) {
                List<TableRow> list = sideInput(bqDataView); // bqDataView contains table and schema
                String table = list.get(0).get("table").toString();
                String tableSpec = "ProjectId:DatasetId." + table;
                String tableDescription = "";
                return new TableDestination(tableSpec, tableDescription);
            }

            public String getSideInputs(PCollectionView<List<TableRow>> bqDataView) {
                return null;
            }

            @Override
            public TableSchema getSchema(TableRow destination) {
                return schema; // schema is getting added from the global variable
            }

            @Override
            public TableRow getDestination(ValueInSingleWindow<TableRow> element) {
                return null;
            }
        }.getSideInputs(bqDataView)));
Please let me know what I'm doing wrong and which path I should take.
Thank You.
Part of the reason you're having trouble is the two stages of pipeline execution. First, the pipeline is constructed on your machine; this is when all of the applications of PTransforms occur. In your first example, this is when the following lines are executed:
BigQueryIO.writeTableRows()
    .withSchema(schema)
    .to("ProjectID:DatasetID." + tableName)
The code within a ParDo, however, runs when your pipeline executes, and it does so on many machines. So the following code runs much later than the pipeline construction:
@ProcessElement
public void processElement(ProcessContext c) {
    tableName = c.element().get("table").toString();
    ...
    schema = new TableSchema().setFields(fields);
    ...
}
This means that neither the tableName nor the schema fields will be set when the BigQueryIO sink is created.
Your idea to use DynamicDestinations is correct, but you need to move the code that actually generates the schema and the destination into that class, rather than relying on global variables that aren't available on all of the machines.
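A minimal sketch of what that could look like, reusing the bqDataView side input from the question (this is written against the Beam 2.x DynamicDestinations API and assumes the view holds a single configuration row with "table" and "schema" columns, as above):

tableRows.apply(BigQueryIO.writeTableRows()
        .to(new DynamicDestinations<TableRow, String>() {
            @Override
            public List<PCollectionView<?>> getSideInputs() {
                // Register the view so sideInput() can be used below.
                return Collections.<PCollectionView<?>>singletonList(bqDataView);
            }

            @Override
            public String getDestination(ValueInSingleWindow<TableRow> element) {
                // Runs at execution time, so the side input is available here.
                return sideInput(bqDataView).get(0).get("table").toString();
            }

            @Override
            public TableDestination getTable(String table) {
                return new TableDestination("ProjectId:DatasetId." + table, "");
            }

            @Override
            public TableSchema getSchema(String table) {
                // Build the schema from the "schema" column ("name:type,name:type,...").
                String[] schemas = sideInput(bqDataView).get(0).get("schema").toString().split(",");
                List<TableFieldSchema> fields = new ArrayList<>();
                for (String s : schemas) {
                    String[] parts = s.split(":");
                    fields.add(new TableFieldSchema().setName(parts[0]).setType(parts[1]));
                }
                return new TableSchema().setFields(fields);
            }
        })
        .withWriteDisposition(BigQueryIO.Write.WriteDisposition.WRITE_TRUNCATE)
        .withCreateDisposition(BigQueryIO.Write.CreateDisposition.CREATE_IF_NEEDED));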

Continuously output from StandardOutput to text box in Visual C# [duplicate]

I have an external DLL written in C#, and I learned from the assembly's documentation that it writes its debug messages to the console using Console.WriteLine.
The DLL writes to the console during my interaction with the UI of the application, so I don't make DLL calls directly. I want to capture all console output, so I think I have to initialize the capture in the form load event and then read the captured text later.
I would like to redirect all the output to a string variable.
I tried Console.SetOut, but using it to redirect to a string is not easy.
Since it seems like you want to catch the console output in real time, you might create your own TextWriter implementation that fires an event whenever a Write or WriteLine happens on the Console.
The writer looks like this:
public class ConsoleWriterEventArgs : EventArgs
{
    public string Value { get; private set; }

    public ConsoleWriterEventArgs(string value)
    {
        Value = value;
    }
}

public class ConsoleWriter : TextWriter
{
    public override Encoding Encoding { get { return Encoding.UTF8; } }

    public override void Write(string value)
    {
        if (WriteEvent != null) WriteEvent(this, new ConsoleWriterEventArgs(value));
        base.Write(value);
    }

    public override void WriteLine(string value)
    {
        if (WriteLineEvent != null) WriteLineEvent(this, new ConsoleWriterEventArgs(value));
        base.WriteLine(value);
    }

    public event EventHandler<ConsoleWriterEventArgs> WriteEvent;
    public event EventHandler<ConsoleWriterEventArgs> WriteLineEvent;
}
If it's a WinForms app, you can set up the writer and consume its events in Program.cs like this:
/// <summary>
/// The main entry point for the application.
/// </summary>
[STAThread]
static void Main()
{
    using (var consoleWriter = new ConsoleWriter())
    {
        consoleWriter.WriteEvent += consoleWriter_WriteEvent;
        consoleWriter.WriteLineEvent += consoleWriter_WriteLineEvent;
        Console.SetOut(consoleWriter);

        Application.EnableVisualStyles();
        Application.SetCompatibleTextRenderingDefault(false);
        Application.Run(new Form1());
    }
}

static void consoleWriter_WriteLineEvent(object sender, Program.ConsoleWriterEventArgs e)
{
    MessageBox.Show(e.Value, "WriteLine");
}

static void consoleWriter_WriteEvent(object sender, Program.ConsoleWriterEventArgs e)
{
    MessageBox.Show(e.Value, "Write");
}
It basically amounts to the following:
var originalConsoleOut = Console.Out; // preserve the original stream
using (var writer = new StringWriter())
{
    Console.SetOut(writer);
    Console.WriteLine("some stuff"); // or make your DLL calls :)
    writer.Flush(); // when you're done, make sure everything is written out
    var myString = writer.GetStringBuilder().ToString();
}
Console.SetOut(originalConsoleOut); // restore Console.Out
So in your case you'd set this up before making calls to your third-party DLL.
You can also call SetOut with Console.OpenStandardOutput, which will restore the original output stream:

Console.SetOut(new StreamWriter(Console.OpenStandardOutput()));
Or you can wrap it up in a helper method that takes some code as an argument, runs it, and returns the string that was printed. Notice how we gracefully handle exceptions:
public string RunCodeReturnConsoleOut(Action code)
{
    string result;
    var originalConsoleOut = Console.Out;
    try
    {
        using (var writer = new StringWriter())
        {
            Console.SetOut(writer);
            code();
            writer.Flush();
            result = writer.GetStringBuilder().ToString();
        }
        return result;
    }
    finally
    {
        Console.SetOut(originalConsoleOut);
    }
}
Using the solutions proposed by @Adam Lear and @Carlo V. Dango, I created a helper class:
public sealed class RedirectConsole : IDisposable
{
    private readonly Action<string> logFunction;
    private readonly TextWriter oldOut = Console.Out;
    private readonly StringWriter sw = new StringWriter();

    public RedirectConsole(Action<string> logFunction)
    {
        this.logFunction = logFunction;
        Console.SetOut(sw);
    }

    public void Dispose()
    {
        Console.SetOut(oldOut);
        sw.Flush();
        logFunction(sw.ToString());
        sw.Dispose();
    }
}
which can be used in the following way:
public static void MyWrite(string str)
{
    // print console output to Log/Socket/File
}

public static void Main()
{
    using (var r = new RedirectConsole(MyWrite))
    {
        Console.WriteLine("Message 1");
        Console.WriteLine("Message 2");
    }
    // After the using block finishes, MyWrite is called once with a single
    // string containing all messages written during the using block,
    // separated by newline characters.
}

How to list all binding variables with GroovyShell

I'm very new to Groovy. How can I list all the variables I passed to the Binding constructor? Consider the following:
@Test
public void test() {
    List<String> outputNames = Arrays.asList("returnValue", "ce");
    String script = getScript();
    Script compiledScript = compileScript(script);
    CustomError ce = new CustomError("shit", Arrays.asList(new Long(1)));
    Map<String, Object> inputObjects = new HashMap<String, Object>();
    inputObjects.put("input", "Hovada");
    inputObjects.put("error", ce);
    Binding binding = new Binding(inputObjects);
    compiledScript.setBinding(binding);
    compiledScript.run();
    for (String outputName : outputNames) {
        System.out.format("outputName : %s = %s", outputName, binding.getVariable(outputName));
    }
}

private Script compileScript(String script) {
    GroovyShell groovyShell = new GroovyShell();
    Script compiledScript = groovyShell.parse(script);
    return compiledScript;
}
How can I iterate over all the variables (over the hash map) in the Groovy script?
Script compiledScript represents the script. If you look at its source code, you'll see that it has a binding property (with getter and setter), and Binding has a variables field. So you can do:
binding.variables.each {
    println it.key
    println it.value
}
(The variables object is just a Map, so the same iteration works for any Map.)
You can also set properties like this:
Binding binding = new Binding(inputObjects);
compiledScript.setBinding(binding);
compiledScript.setProperty("prop", "value");
compiledScript.run();
and the property is stored in the binding's variables.
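To illustrate (a small sketch, continuing the test from the question): after setProperty, the new entry shows up in the binding's variables map:

compiledScript.setProperty("prop", "value");
compiledScript.run();
// The property set above was stored as a binding variable:
System.out.println(binding.getVariables().containsKey("prop")); // prints true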
