Dataflow launcher Payload Java - spring-cloud-dataflow

I built a stream with an http source and a dataflow-launcher sink to execute a Spring Batch task named batchPy546Task.
To launch this task I set the localFilePath=path-of-the-file parameter.
According to the documentation, the http source makes it possible to pass information through the payload:
https://github.com/spring-cloud-stream-app-starters/tasklauncher-dataflow/blob/master/spring-cloud-starter-stream-sink-task-launcher-dataflow/README.adoc
{
  "name": "foo",
  "deploymentProps": { "key1": "val1", "key2": "val2" },
  "args": ["--debug", "--foo", "bar"]
}
I tried many syntaxes:
curl http://localhost:57110 -H"Content-Type:application/json" -d '{"name":"batchPy546Task", "args":{"localFilePath=/tmp/remote-files1/BLM-54.00.01_Multicontrat_Creation_IDCRT011-b.xml"}}'
and all of them fail:
Caused by: com.fasterxml.jackson.databind.exc.MismatchedInputException: Cannot deserialize instance of java.util.ArrayList<java.lang.Object> out of START_OBJECT token
at [Source: (byte[])"{"name":"batchPy546Task", "args":{"localFilePath=/tmp/remote-files1/BLM-54.00.01_Multicontrat_Creation_IDCRT011-b.xml"}}"; line: 1, column: 34] (through reference chain: org.springframework.cloud.stream.app.task.launcher.dataflow.sink.LaunchRequest["args"])
at com.fasterxml.jackson.databind.exc.MismatchedInputException.from(MismatchedInputException.java:59) ~[jackson-databind-2.10.2.jar!/:2.10.2]
How can I pass the localFilePath parameter to my task?
Versions:
dataflow server 2.4.2
skipper server 2.3.2
dataflow launcher is up to date and compatible with dataflow server 2.4.2.
Regards

Problem solved with this syntax:
curl http://localhost:57110 -H"Content-Type:application/json" -d '{"name":"batchPy546Task", "args":["localFilePath=/tmp/remote-files1/BLM-54.00.01_Multicontrat_Creation_IDCRT011-b.xml"]}'

args must be a JSON list: "args":["localFilePath=/tmp/..."]
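For reference, here is a minimal Java sketch of a POJO matching the shape the sink deserializes into. The field names follow the README example and the error's reference chain; the real class is org.springframework.cloud.stream.app.task.launcher.dataflow.sink.LaunchRequest, so this stand-in is for illustration only:

import com.fasterxml.jackson.databind.ObjectMapper;
import java.util.List;
import java.util.Map;

// Hypothetical mirror of the sink's LaunchRequest, for illustration only.
public class LaunchRequestSketch {
    public String name;
    public Map<String, String> deploymentProps; // JSON object -> Map
    public List<String> args;                   // JSON array -> List

    public static void main(String[] argv) throws Exception {
        String json = "{\"name\":\"batchPy546Task\","
            + "\"args\":[\"localFilePath=/tmp/remote-files1/BLM-54.00.01_Multicontrat_Creation_IDCRT011-b.xml\"]}";
        LaunchRequestSketch req = new ObjectMapper().readValue(json, LaunchRequestSketch.class);
        System.out.println(req.args.get(0)); // prints the localFilePath=... argument
    }
}

The failing curl sent args as a JSON object ({...}), so Jackson raised MismatchedInputException at the opening brace: a field bound to a List only accepts a JSON array ([...]).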

Related

Flink consumer job is unable to connect to Kafka from Docker

I have a simple streaming Flink Scala job which connects to a Kafka topic and maps its
org.apache.avro.generic.GenericRecord messages into JSON strings.
When it runs in IntelliJ it ingests the topic fine and prints out the JSON.
When I run it in docker-compose I get the following exception:
com.esotericsoftware.kryo.KryoException: Error constructing instance of class: org.apache.avro.Schema$LockableArrayList
Serialization trace:
types (org.apache.avro.Schema$UnionSchema)
schema (org.apache.avro.Schema$Field)
fieldMap (org.apache.avro.Schema$RecordSchema)
schema (org.apache.avro.generic.GenericData$Record)
at com.twitter.chill.Instantiators$$anon$1.newInstance(KryoBase.scala:136)
at com.esotericsoftware.kryo.Kryo.newInstance(Kryo.java:1061)
at com.esotericsoftware.kryo.serializers.CollectionSerializer.create(CollectionSerializer.java:89)
at com.esotericsoftware.kryo.serializers.CollectionSerializer.read(CollectionSerializer.java:93)
at com.esotericsoftware.kryo.serializers.CollectionSerializer.read(CollectionSerializer.java:22)
at com.esotericsoftware.kryo.Kryo.readObject(Kryo.java:679)
at com.esotericsoftware.kryo.serializers.ObjectField.read(ObjectField.java:106)
at com.esotericsoftware.kryo.serializers.FieldSerializer.read(FieldSerializer.java:528)
at com.esotericsoftware.kryo.Kryo.readObject(Kryo.java:679)
at com.esotericsoftware.kryo.serializers.ObjectField.read(ObjectField.java:106)
at com.esotericsoftware.kryo.serializers.FieldSerializer.read(FieldSerializer.java:528)
at com.esotericsoftware.kryo.Kryo.readClassAndObject(Kryo.java:761)
at com.esotericsoftware.kryo.serializers.MapSerializer.read(MapSerializer.java:143)
at com.esotericsoftware.kryo.serializers.MapSerializer.read(MapSerializer.java:21)
at com.esotericsoftware.kryo.Kryo.readObject(Kryo.java:679)
at com.esotericsoftware.kryo.serializers.ObjectField.read(ObjectField.java:106)
at com.esotericsoftware.kryo.serializers.FieldSerializer.read(FieldSerializer.java:528)
at com.esotericsoftware.kryo.Kryo.readObject(Kryo.java:679)
at com.esotericsoftware.kryo.serializers.ObjectField.read(ObjectField.java:106)
at com.esotericsoftware.kryo.serializers.FieldSerializer.read(FieldSerializer.java:528)
at com.esotericsoftware.kryo.Kryo.readObject(Kryo.java:657)
at org.apache.flink.api.java.typeutils.runtime.kryo.KryoSerializer.copy(KryoSerializer.java:262)
at org.apache.flink.api.java.typeutils.runtime.TupleSerializer.copy(TupleSerializer.java:111)
at org.apache.flink.api.java.typeutils.runtime.TupleSerializer.copy(TupleSerializer.java:37)
at org.apache.flink.streaming.runtime.tasks.OperatorChain$CopyingChainingOutput.pushToOperator(OperatorChain.java:635)
at org.apache.flink.streaming.runtime.tasks.OperatorChain$CopyingChainingOutput.collect(OperatorChain.java:612)
at org.apache.flink.streaming.runtime.tasks.OperatorChain$CopyingChainingOutput.collect(OperatorChain.java:592)
at org.apache.flink.streaming.api.operators.AbstractStreamOperator$CountingOutput.collect(AbstractStreamOperator.java:727)
at org.apache.flink.streaming.api.operators.AbstractStreamOperator$CountingOutput.collect(AbstractStreamOperator.java:705)
at org.apache.flink.streaming.api.operators.StreamSourceContexts$NonTimestampContext.collect(StreamSourceContexts.java:104)
at org.apache.flink.streaming.api.operators.StreamSourceContexts$NonTimestampContext.collectWithTimestamp(StreamSourceContexts.java:111)
at org.apache.flink.streaming.connectors.kafka.internals.AbstractFetcher.emitRecordWithTimestamp(AbstractFetcher.java:398)
at org.apache.flink.streaming.connectors.kafka.internal.KafkaFetcher.emitRecord(KafkaFetcher.java:185)
at org.apache.flink.streaming.connectors.kafka.internal.KafkaFetcher.runFetchLoop(KafkaFetcher.java:150)
at org.apache.flink.streaming.connectors.kafka.FlinkKafkaConsumerBase.run(FlinkKafkaConsumerBase.java:715)
at org.apache.flink.streaming.api.operators.StreamSource.run(StreamSource.java:100)
at org.apache.flink.streaming.api.operators.StreamSource.run(StreamSource.java:63)
at org.apache.flink.streaming.runtime.tasks.SourceStreamTask$LegacySourceFunctionThread.run(SourceStreamTask.java:208)
Caused by: java.lang.IllegalAccessException: Class com.twitter.chill.Instantiators$ can not access a member of class org.apache.avro.Schema$LockableArrayList with modifiers "public"
at sun.reflect.Reflection.ensureMemberAccess(Reflection.java:102)
at java.lang.reflect.AccessibleObject.slowCheckMemberAccess(AccessibleObject.java:296)
at java.lang.reflect.AccessibleObject.checkAccess(AccessibleObject.java:288)
at java.lang.reflect.Constructor.newInstance(Constructor.java:413)
at com.twitter.chill.Instantiators$.$anonfun$normalJava$1(KryoBase.scala:170)
at com.twitter.chill.Instantiators$$anon$1.newInstance(KryoBase.scala:133)
... 37 more
I tried forcing Avro serialization with:
val env: StreamExecutionEnvironment = StreamExecutionEnvironment.getExecutionEnvironment
env.getConfig.disableForceKryo()
env.getConfig.enableForceAvro()
but got the same error.
Based on [this][1] I marked all Flink-related dependencies as "provided", with no good result.
What can be the difference between running the job in the IDE and in Docker?
How can I fix the job so it can read the Kafka topic from Docker?
How should I set up Docker for this?
How can I handle the Kryo/Avro serialization issue?
SS
[1]: http://www.alternatestack.com/development/com-esotericsoftware-kryo-kryoexception-unusual-solution-upgrading-flink/
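One avenue worth exploring, sketched here under assumptions rather than as a confirmed fix: the trace shows Flink falling back to Kryo for GenericRecord, and Kryo then failing to reflectively construct Avro's non-public Schema$LockableArrayList. Attaching explicit Avro type information keeps GenericRecord on Avro's own serializer and out of Kryo entirely. Assuming flink-avro is on the classpath (Flink 1.8+ ships GenericRecordAvroTypeInfo), in Java (the same call works from Scala):

import org.apache.avro.Schema;
import org.apache.avro.generic.GenericRecord;
import org.apache.flink.formats.avro.typeutils.GenericRecordAvroTypeInfo;
import org.apache.flink.streaming.api.datastream.SingleOutputStreamOperator;

public class AvroTypeInfoSketch {
    // Attaches explicit Avro type information to a GenericRecord stream so that
    // Flink serializes records with Avro instead of the generic Kryo fallback.
    public static SingleOutputStreamOperator<GenericRecord> withAvroTypeInfo(
            SingleOutputStreamOperator<GenericRecord> records, Schema readerSchema) {
        return records.returns(new GenericRecordAvroTypeInfo(readerSchema));
    }
}

Since the job behaves differently in the IDE and in the image, it is also worth comparing which jars actually end up on the classpath in each environment; a "provided"-scoped dependency missing from the fat jar can change which serializer Flink picks.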

Standalone MockServer: Where do I implement expectations?

I am trying to mock an external (REST) server used by my system under test.
I am choosing MockServer (http://www.mock-server.com/) for mocking the external REST server.
I am running MockServer standalone as in:
$ java -jar ./mockserver-netty-5.3.0-jar-with-dependencies.jar -serverPort 1080 -proxyPort 1090 -proxyRemotePort 80 -proxyRemoteHost www.mock-server.com
2018-05-23 14:05:57,703 INFO o.m.m.MockServer MockServer started on port: 1080
2018-05-23 14:05:57,747 INFO o.m.p.d.DirectProxy MockServer started on port: 1090
I am not sure, having read the documentation, where I should define the expectations (viz., the responses the mock should yield based on incoming requests).
Can anyone explain?
Thanx,
R
It can be done with a PUT request, i.e.:
curl -v -X PUT "http://localhost:1080/expectation" -d '{
  "httpRequest" : {
    "path" : "/some/path"
  },
  "httpResponse" : {
    "body" : "some_response_body"
  }
}'
More info at https://www.mock-server.com/mock_server/creating_expectations.html; look for the REST API type of expectation.
I used Postman to create the expectation. To create an expectation, send a PUT request to http://localhost:portnumber/mockserver/expectation.
You can check the expectations and logs at http://localhost:portnumber/mockserver/dashboard in the browser.
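If you would rather register expectations from code than via raw PUTs, the MockServer Java client can target the standalone instance started above. A minimal sketch, assuming the server is listening on 1080; note the client's package has shifted across 5.x releases, so match the import to your version:

import org.mockserver.client.MockServerClient;
import static org.mockserver.model.HttpRequest.request;
import static org.mockserver.model.HttpResponse.response;

public class ExpectationSketch {
    public static void main(String[] args) {
        // Connects to the already-running standalone MockServer on port 1080
        MockServerClient client = new MockServerClient("localhost", 1080);
        // Registers the same expectation as the curl PUT example above
        client.when(request().withPath("/some/path"))
              .respond(response().withBody("some_response_body"));
    }
}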

Elasticsearch river plugin for Neo4j: NoClassSettingsException[Failed to load class with value [neo4j]]

I am trying to make ES work with Neo4j and I followed the steps described in the documentation. The thing is that I am having difficulties getting the connection working: ES doesn't have the values from Neo4j.
When I check the indexes on ES, the "_river" index is created fine, but an error appears when I do:
curl "http://localhost:9200/_river/_search?pretty=true&q=*:*"
Here is the error:
{"error":"NoClassSettingsException[Failed to load class with value
[neo4j]]; nested: ClassNotFoundException[neo4j];
","node":{"id":"2rDeNA46SJ63jYFlpoGfKQ","name":"Zip-Zap","transport_address":"inet[/10.140.28.166:9300]"}}
When I get all the indexes from ES
(curl http://localhost:9200/_aliases?pretty=1)
I get:
{
  "_river" : {
    "aliases" : { }
  }
}
Do you have any idea what the problem is?
PS: the version used was 1.1.1.
The error is:
NoClassSettingsException[Failed to load class with value [neo4j]]; nested: ClassNotFoundException[neo4j]
Complete response: http://p.shrib.com/lsfnzF3H?v=nc&s=m
Thanks in advance
I got the same error, and I think the installation instructions are missing one step:
download "elasticsearch-river-neo4j-1.2.1.1.jar" and copy it to the "HOME_DIRECTORY_OF_ES/lib" folder.
Doing this and an elasticsearch restart solved it for me.
Found at: Examples about Integration of Elasticsearch with neo4j

Nutch: Job Failed

I have a problem while running the Nutch inject step.
The following is the command I am running:
bin/nutch inject bin/crawl/crawldb bin/urls
After running the above command, I get the following error:
Injector: starting at 2014-04-02 13:02:29
Injector: crawlDb: bin/crawl/crawldb
Injector: urlDir: bin/urls/seed.txt
Injector: Converting injected urls to crawl db entries.
Injector: total number of urls rejected by filters: 2
Injector: total number of urls injected after normalization and filtering: 0
Injector: Merging injected urls into crawl db.
Injector: overwrite: false
Injector: update: false
Injector: java.io.IOException: Job failed!
at org.apache.hadoop.mapred.JobClient.runJob(JobClient.java:1357)
at org.apache.nutch.crawl.Injector.inject(Injector.java:294)
at org.apache.nutch.crawl.Injector.run(Injector.java:316)
at org.apache.hadoop.util.ToolRunner.run(ToolRunner.java:65)
at org.apache.nutch.crawl.Injector.main(Injector.java:306)
I am running Nutch for the first time.
I have checked that Solr and Nutch are installed properly.
The details below are from the log file:
java.io.IOException: The temporary job-output directory file:/usr/share/apache-nutch-1.8/bin/crawl/crawldb/1639805438/_temporary doesn't exist!
at org.apache.hadoop.mapred.FileOutputCommitter.getWorkPath(FileOutputCommitter.java:250)
at org.apache.hadoop.mapred.FileOutputFormat.getTaskOutputPath(FileOutputFormat.java:244)
at org.apache.hadoop.mapred.MapFileOutputFormat.getRecordWriter(MapFileOutputFormat.java:46)
at org.apache.hadoop.mapred.ReduceTask$OldTrackingRecordWriter.<init>(ReduceTask.java:449)
at org.apache.hadoop.mapred.ReduceTask.runOldReducer(ReduceTask.java:491)
at org.apache.hadoop.mapred.ReduceTask.run(ReduceTask.java:421)
at org.apache.hadoop.mapred.LocalJobRunner$Job.run(LocalJobRunner.java:398)
2014-04-02 12:54:46,251 ERROR crawl.Injector - Injector: java.io.IOException: Job failed!
at org.apache.hadoop.mapred.JobClient.runJob(JobClient.java:1357)
at org.apache.nutch.crawl.Injector.inject(Injector.java:294)
at org.apache.nutch.crawl.Injector.run(Injector.java:316)
at org.apache.hadoop.util.ToolRunner.run(ToolRunner.java:65)
at org.apache.nutch.crawl.Injector.main(Injector.java:306)
I was using the bin/nutch inject bin/crawl/crawldb bin/urls command to inject
instead of bin/nutch inject crawl/crawldb bin/urls,
which solves the error.
And for fetching URLs I made changes to the regex-urlfilter.txt file; now I am able to fetch the URLs.
Make sure you don't have any syntax errors in any of your Nutch config files.

Jenkins-Testlink integration - HTTP server returned unexpected status: Found

I’m trying to connect Jenkins (1.482) with TestLink (1.9.4) through the Jenkins configuration in order to retrieve tests, but while running the job in Jenkins I get the below error in the console log.
Please note that Jenkins is hosted on Tomcat (Linux) on network “gnb” and TestLink is hosted on PHP (Linux) on another network “<company network name>”. It works well when both are on my localhost (on Windows),
but this integration does not work when Jenkins and TestLink are on separate networks/hosts.
I get the below error on the console while running the job:
Preparing TestLink client API.
Using TestLink URL: http://<hostname>/mr61_php5/testlink/lib/api/xmlrpc.php
FATAL: Error verifying developer key: HTTP server returned unexpected status: Found
br.eti.kinoshita.testlinkjavaapi.util.TestLinkAPIException: Error verifying developer key: HTTP server returned unexpected status: Found
at br.eti.kinoshita.testlinkjavaapi.MiscService.checkDevKey(MiscService.java:66)
at br.eti.kinoshita.testlinkjavaapi.TestLinkAPI.<init>(TestLinkAPI.java:162)
at hudson.plugins.testlink.TestLinkBuilder.getTestLinkSite(TestLinkBuilder.java:244)
at hudson.plugins.testlink.TestLinkBuilder.perform(TestLinkBuilder.java:134)
at hudson.tasks.BuildStepMonitor$1.perform(BuildStepMonitor.java:19)
at hudson.model.AbstractBuild$AbstractBuildExecution.perform(AbstractBuild.java:717)
at hudson.model.Build$BuildExecution.build(Build.java:199)
at hudson.model.Build$BuildExecution.doRun(Build.java:160)
at hudson.model.AbstractBuild$AbstractBuildExecution.run(AbstractBuild.java:499)
at hudson.model.Run.execute(Run.java:1502)
at hudson.model.FreeStyleBuild.run(FreeStyleBuild.java:46)
at hudson.model.ResourceController.execute(ResourceController.java:88)
at hudson.model.Executor.run(Executor.java:236)
Caused by: org.apache.xmlrpc.client.XmlRpcHttpTransportException: HTTP server returned unexpected status: Found
at org.apache.xmlrpc.client.XmlRpcSunHttpTransport.getInputStream(XmlRpcSunHttpTransport.java:94)
at org.apache.xmlrpc.client.XmlRpcStreamTransport.sendRequest(XmlRpcStreamTransport.java:152)
at org.apache.xmlrpc.client.XmlRpcHttpTransport.sendRequest(XmlRpcHttpTransport.java:143)
at org.apache.xmlrpc.client.XmlRpcSunHttpTransport.sendRequest(XmlRpcSunHttpTransport.java:69)
at org.apache.xmlrpc.client.XmlRpcClientWorker.execute(XmlRpcClientWorker.java:56)
at org.apache.xmlrpc.client.XmlRpcClient.execute(XmlRpcClient.java:167)
at org.apache.xmlrpc.client.XmlRpcClient.execute(XmlRpcClient.java:158)
at org.apache.xmlrpc.client.XmlRpcClient.execute(XmlRpcClient.java:147)
at br.eti.kinoshita.testlinkjavaapi.BaseService.executeXmlRpcCall(BaseService.java:90)
at br.eti.kinoshita.testlinkjavaapi.MiscService.checkDevKey(MiscService.java:62)
... 12 more
ERROR: Error communicating with TestLink. Check your TestLink configuration.
I have the below settings in my Jenkins global configuration for the TestLink installation:
Name: testlink
URL: http://<host name>/mr61_php5/testlink/lib/api/xmlrpc.php
Developer key: generated from Testlink (Settings->Generate a new key)
Can you please point out if I am missing something?
Usually in the TestLink folder structure, the path that you have mentioned does not contain the xmlrpc.php file.
Probably wrong URL: http://<host name>/mr61_php5/testlink/lib/api/
The correct URL is usually of this format:
.../testlink/lib/api/xmlrpc/xmlrpc.php
Kindly check the correct URL, or try opening the xmlrpc.php page in a browser, so that you can get the correct path of the file (the "Found" status is an HTTP 302 redirect, so the server is redirecting instead of answering the XML-RPC call). As per my assumption it should be somewhat like this:
http://<host name>/mr61_php5/testlink/lib/api/xmlrpc/xmlrpc.php
Good answer. In my case it is as below:
http://IP:PORT/testlink/lib/api/xmlrpc/v1/xmlrpc.php in 1.9.11 version of testlink
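To take Jenkins out of the loop while hunting for the right endpoint, the URL and developer key can be verified directly with the same testlink-java-api library the plugin uses. A sketch; substitute your real host, path, and key (the constructor runs checkDevKey, which is exactly the call failing in the trace above):

import java.net.URL;
import br.eti.kinoshita.testlinkjavaapi.TestLinkAPI;

public class DevKeyCheckSketch {
    public static void main(String[] args) throws Exception {
        // The TestLinkAPI constructor calls checkDevKey internally, so a 302
        // redirect ("Found") or a bad key surfaces here as a TestLinkAPIException.
        TestLinkAPI api = new TestLinkAPI(
                new URL("http://<host name>/mr61_php5/testlink/lib/api/xmlrpc/v1/xmlrpc.php"),
                "your-dev-key"); // hypothetical placeholder key
        System.out.println("Developer key accepted: " + api);
    }
}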
