In my app I use Spock in version 2.0-M1-groovy-2.5. My problem is that even though I mocked a class, I get an NPE from the internals of the mocked method.
I have a simple class called ReactiveRestHighLevelClient which looks like this:
open class ReactiveRestHighLevelClient(val restHighLevelClient: RestHighLevelClient, private val objectMapper: ObjectMapper) {
...
fun index(indexRequest: IndexRequest): Mono<IndexResponse> {
return Mono.create<IndexResponse> { sink ->
restHighLevelClient.indexAsync( // <-- HERE I GET THE NPE EVEN THOUGH THE METHOD IS MOCKED
indexRequest,
object : ActionListener<IndexResponse> {
override fun onFailure(e: Exception) {
e.printStackTrace()
sink.error(e)
}
override fun onResponse(response: IndexResponse) {
sink.success(response)
}
}
)
}
}
...
}
I have a class ThreadModelElasticsearchService which uses the ReactiveRestHighLevelClient and looks like this:
@Component
class ThreadModelElasticsearchService(
private val objectMapper: ObjectMapper,
private val reactiveElasticsearchClient: ReactiveRestHighLevelClient,
private val extraFactsService: ExtraFactsService,
private val customerDataService: CustomerDataService,
rateLimiterRegistry: RateLimiterRegistry
) {
...
fun save(operationId: String, threadModel: ThreadModel): Mono<ThreadModel> {
val id = threadModel.threadId
?: ThreadModel.createKey(threadModel.partnerId, threadModel.customerId)
return reactiveElasticsearchClient
.index(
IndexRequest(ThreadModel.INDEX, ThreadModel.TYPE, id)
.source(objectMapper.writeValueAsString(threadModel), XContentType.JSON)
)
.thenReturn(threadModel)
.doOnNext { logger.info("[operation_id=$operationId] -- Saved : $it") }
.onErrorMap { e ->
logger.error("[operation_id=$operationId] -- Error calling elasticSearch", e)
ElasticRepositoryError(e)
}
.rateLimit(elasticRateLimiter)
}
...
}
Finally I have my test class called ThreadModelElasticsearchServiceTest which looks like this:
class ThreadModelElasticsearchServiceTest extends Specification {
ReactiveRestHighLevelClient reactiveElasticsearchClient = Mock()
... other mocks
ThreadModelElasticsearchService threadModelReactiveElasticsearchClient = new
ThreadModelElasticsearchService(objectMapper, reactiveElasticsearchClient, extraFactsService, customerDataService, RateLimiterRegistry.of(RateLimiterConfig.ofDefaults()))
def "should work"() {
given:
ThreadModel threadModel = new ThreadModel(...)
reactiveElasticsearchClient.index(_ as IndexRequest) >> Mono.just(indexResponse)
expect:
threadModelReactiveElasticsearchClient.save("should-work", threadModel).block()
}
When I run the tests I get an exception like:
java.lang.NullPointerException: null
at com.cb.elastic.search.api.elasticsearch.ReactiveRestHighLevelClient$index$1.accept(ReactiveRestHighLevelClient.kt:76)
which points to the body of the index method of the ReactiveRestHighLevelClient mock, which is strange.
How can I solve this?
I want to use JMockit to mock a static method in a Spock test, combined with the where block, so that each iteration mocks a different return value and exercises different business logic. I tried a lot of approaches, but they all failed. I hope I can get help or suggestions here. Thank you very much.
Here is an example of my business code:
public class MyUtils {
public static int staticMethod(int origin) {
return 0;
}
}
public class MyClass {
public void verify(int origin) {
if (MyUtils.staticMethod(origin) == 1) {
System.out.println("1");
}
if (MyUtils.staticMethod(origin) == 2) {
System.out.println("2");
}
...
}
}
This is my Spock test code:
def "verify"() {
when:
myClass.verify(0)
then:
true
where:
mock | _
mockStatic(1) | _
mockStatic(2) | _
}
def mockStatic(val){
new MockUp<MyUtils>() {
@Mock
public int staticMethod(int origin) {
return val
}
}
}
I know that PowerMock can do this, but because our team has been using JMockit, we want to know whether JMockit can supply multiple different mock values like this in Spock.
Put your method call into a closure and evaluate the closure during each iteration:
package de.scrum_master.stackoverflow.q67882559
import mockit.Mock
import mockit.MockUp
import mockit.internal.state.SavePoint
import spock.lang.Requires
import spock.lang.Specification
import spock.lang.Unroll
class StaticMethodJMockitTest extends Specification {
def jMockitSavePoint = new SavePoint()
def cleanup() {
jMockitSavePoint.rollback()
}
@Unroll
def "verify"() {
given:
mockClosure()
MyClass myClass = new MyClass()
when:
myClass.verify(0)
then:
true
where:
mockClosure << [
{ /* no mock */ },
{ mockStatic(1) },
{ mockStatic(2) }
]
}
def mockStatic(val) {
new MockUp<MyUtils>() {
@Mock
int staticMethod(int origin) {
return val
}
}
}
public static class MyUtils {
public static int staticMethod(int origin) {
return 0;
}
}
public static class MyClass {
public void verify(int origin) {
if (MyUtils.staticMethod(origin) == 1) {
System.out.println("1");
}
if (MyUtils.staticMethod(origin) == 2) {
System.out.println("2");
}
}
}
}
If you wish to use data tables, you need to help the parser a bit by explicitly adding it -> inside the closure if the closure is in the first column of the data table. You can also give your unrolled iterations nice names:
@Unroll
def "verify #description"() {
given:
mockClosure()
MyClass myClass = new MyClass()
when:
myClass.verify(0)
then:
true
where:
description | mockClosure
"no mock" | { /* no mock */ }
"mock result 1" | { mockStatic(1) }
"mock result 2" | { mockStatic(2) }
}
The reason for creating and rolling back the save point is that JMockit does not play nicely with Spock's mock lifecycle, and the maintainer has no intention of even thinking about helping. See JMockit issue #668 for more info.
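The idea behind the closures also translates to plain Java: defer each iteration's mock setup into a block, run it at the start of the iteration, and reset afterwards. A minimal JMockit-free sketch (all names hypothetical, with a swappable function standing in for the static method):

```java
import java.util.List;
import java.util.function.IntUnaryOperator;

public class DeferredSetupSketch {
    // stand-in for the static method whose result each iteration overrides
    static IntUnaryOperator staticMethod = origin -> 0;

    // stand-in for MyClass.verify(): branches on the (mocked) result
    static String verify(int origin) {
        int result = staticMethod.applyAsInt(origin);
        if (result == 1) return "1";
        if (result == 2) return "2";
        return "none";
    }

    public static void main(String[] args) {
        // each Runnable plays the role of one iteration's mockClosure
        List<Runnable> setups = List.of(
            () -> { },                        // no mock
            () -> staticMethod = o -> 1,      // "mock result 1"
            () -> staticMethod = o -> 2       // "mock result 2"
        );
        for (Runnable setup : setups) {
            staticMethod = o -> 0;            // reset, like the SavePoint rollback
            setup.run();
            System.out.println(verify(0));
        }
    }
}
```

The reset before each iteration mirrors what the SavePoint rollback does in the Spock cleanup() method.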
I have a DSL that includes blocks that need to be wrapped as methods returned inside an anonymous class created by the generated code. For example:
model {
task {
val x = 2*5;
Math.pow(2, x)
}
}
should compile to (note task becoming an instance of Runnable, with the body of the task becoming the body of the Runnable.run() method):
import java.util.Collection;
#SuppressWarnings("all")
public class MyFile {
public Collection<Runnable> tasks() {
ArrayList<Runnable> tasks = new ArrayList<>();
tasks.add(getTask0());
return tasks;
}
public static Runnable getTask0() {
Runnable _runnable = new Runnable() {
public void run() {
final int x = (2 * 5);
Math.pow(2, x);
}
};
return _runnable;
}
}
Following the discussion in this question, I was able to get this particular example to work (the GitHub repo includes unit tests). But I had to do it by representing the Task element in the grammar as a sequence of XExpressions (source), which my XbaseCompiler subclass had to iterate over (source).
Instead, it would have been nice to just have Task contain an XBlockExpression in an action property, and then in the compiler simply call doInternalToJavaStatement(expr.action, it, isReferenced). My sense is that this is really the "right" solution in my case, but when I tried it, the generated run method had an empty body, as if the block was not processed at all. What's going on, and am I missing some required setup/wiring/bindings necessary for this to work?
You usually try to avoid that by using a better inference strategy, e.g.:
Grammar
Model:
{Model}"model" "{"
vars+=Variable*
tasks+=Task*
"}"
;
Variable:
"var" name=ID ":" type=JvmParameterizedTypeReference
;
Task:
{Task} "task" content=XBlockExpression
;
Inferrer
class MyDslJvmModelInferrer extends AbstractModelInferrer {
@Inject extension JvmTypesBuilder
def dispatch void infer(Model element, IJvmDeclaredTypeAcceptor acceptor, boolean isPreIndexingPhase) {
acceptor.accept(element.toClass("test.Model2")) [
for (v : element.vars) {
members+=v.toField(v.name, v.type.cloneWithProxies) [
]
}
var i = 0;
for (t : element.tasks) {
val doRunName = "doRun"+i
members += t.toMethod("task"+i, Runnable.typeRef()) [
body = '''
return new «Runnable» () {
public void run() {
«doRunName»();
}
};
'''
]
members += t.toMethod(doRunName, Void.TYPE.typeRef()) [
body = t.content
]
i = i + 1
}
]
}
}
And that basically is it.
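For reference, the Java that this inference strategy produces for a one-task model has roughly the following shape (a hand-written sketch, not actual generator output; a println is added so the task body does something observable):

```java
// sketch of the shape of the code the inferrer above generates for one task
public class Model2Sketch {
    public Runnable task0() {
        return new Runnable() {
            public void run() {
                doRun0();  // delegate to the method holding the task body
            }
        };
    }

    // holds the compiled XBlockExpression body of the task
    void doRun0() {
        final int x = 2 * 5;
        System.out.println(Math.pow(2, x));
    }

    public static void main(String[] args) {
        new Model2Sketch().task0().run();
    }
}
```

Splitting the task into a factory method and a doRunN method is what lets the inferrer assign the XBlockExpression directly as a method body.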
You may follow https://bugs.eclipse.org/bugs/show_bug.cgi?id=481992.
If you really want to adapt the Xbase type system, that is a lot more work, e.g. (just covering a minimal case):
Grammar
Model:
{Model}"model" "{"
vars+=Variable*
tasks+=Task*
"}"
;
Variable:
"var" name=ID ":" type=JvmParameterizedTypeReference
;
Task:
{Task} "task" content=XTaskContent
;
XTaskContent returns xbase::XExpression:
{XTaskContent} block=XBlockExpression
;
Inferrer
class MyDslJvmModelInferrer extends AbstractModelInferrer {
@Inject extension JvmTypesBuilder
def dispatch void infer(Model element, IJvmDeclaredTypeAcceptor acceptor, boolean isPreIndexingPhase) {
acceptor.accept(element.toClass("test.Model")) [
for (v : element.vars) {
members+=v.toField(v.name, v.type.cloneWithProxies) [
]
}
var i = 0;
for (t : element.tasks) {
members += t.toMethod("task"+i, Runnable.typeRef()) [
body = t.content
]
i = i + 1
}
]
}
}
Type Computer
class MyDslTypeComputer extends XbaseTypeComputer {
override computeTypes(XExpression expression, ITypeComputationState state) {
if (expression instanceof XTaskContent) {
_computeTypes(expression as XTaskContent, state);
} else {
super.computeTypes(expression, state)
}
}
protected def void _computeTypes(XTaskContent object, ITypeComputationState state) {
state.withExpectation(getPrimitiveVoid(state)).computeTypes(object.block)
state.acceptActualType(getTypeForName(Runnable, state), ConformanceFlags.CHECKED_SUCCESS )
}
}
Compiler
class MyDslCompiler extends XbaseCompiler {
override protected internalToConvertedExpression(XExpression obj, ITreeAppendable appendable) {
if (obj instanceof XTaskContent) {
appendable.append("new ").append(Runnable).append("() {").newLine
appendable.increaseIndentation
appendable.append("public void run() {").newLine
reassignThisInClosure(appendable, null)
internalToJavaStatement(obj.block, appendable, false)
appendable.newLine.append("}")
appendable.decreaseIndentation
appendable.newLine.append("}")
} else {
super.internalToConvertedExpression(obj, appendable)
}
}
}
Bindings
class MyDslRuntimeModule extends AbstractMyDslRuntimeModule {
def Class<? extends ITypeComputer> bindITypeComputer() {
return MyDslTypeComputer
}
def Class<? extends XbaseCompiler> bindXbaseCompiler() {
return MyDslCompiler
}
}
I'm trying to use a custom Coder so that I can do some transforms, but I'm having trouble getting the PCollection to use my custom coder, and I suspect it's because it's wrapped in a KV. Specifically:
Pipeline p = Pipeline.create ...
p.getCoderRegistry().registerCoder(MyClass.class, MyClassCoder.class);
...
PCollection<String> input = ...
PCollection<KV<String, MyClass>> t = input.apply(new ToKVTransform());
When I try to run something like this, I get a java.lang.ClassCastException and a stack trace that includes a SerializableCoder instead of the MyClassCoder I would expect.
[error] at com.google.cloud.dataflow.sdk.coders.SerializableCoder.decode(SerializableCoder.java:133)
[error] at com.google.cloud.dataflow.sdk.coders.SerializableCoder.decode(SerializableCoder.java:50)
[error] at com.google.cloud.dataflow.sdk.coders.KvCoder.decode(KvCoder.java:95)
[error] at com.google.cloud.dataflow.sdk.coders.KvCoder.decode(KvCoder.java:42)
I see that the answer to another, somewhat related question (Using TextIO.Write with a complicated PCollection type in Google Cloud Dataflow) says to map everything to strings and use that to pass things between PCollections. Is that really the recommended way?
(Note: the actual code is in Scala, but I'm pretty sure it's not a Scala <=> Java issue so I've translated it into Java here.)
Update to include Scala code and more background:
So this is the actual exception itself (I should have included this at the beginning):
java.lang.ClassCastException: cannot assign instance of scala.collection.immutable.HashMap$SerializationProxy to field com.example.schema.Schema.keyTypes of type scala.collection.immutable.Map in instance of com.example.schema.Schema
Where com.example.schema.Schema is:
case class Schema(id: String, keyTypes: Map[String, Type])
And lastly, the SchemaCoder is:
class SchemaCoder extends com.google.cloud.dataflow.sdk.coders.CustomCoder[Schema] {
def decode(inputStream: InputStream, context: Context): Schema = {
val ois = new ObjectInputStream(inputStream)
val id: String = ois.readObject().asInstanceOf[String]
val javaMap: java.util.Map[String, Type] = ois.readObject().asInstanceOf[java.util.Map[String, Type]]
ois.close()
Schema(id, javaMap.asScala.toMap)
}
def encode(schema: Schema, outputStream: OutputStream, context: Context): Unit = {
val baos = new ByteArrayOutputStream()
val oos = new ObjectOutputStream(baos)
oos.writeObject(schema.id)
val javaMap: java.util.Map[String, Type] = schema.keyTypes.asJava
oos.writeObject(javaMap)
oos.close()
val encoded = new String(Base64.encodeBase64(baos.toByteArray()))
outputStream.write(encoded.getBytes())
}
}
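Stripped of the Dataflow Context/stream plumbing, the encode/decode pair above is plain Java serialization of the two fields. A minimal stand-alone sketch of that round trip (with a hypothetical Schema stand-in using String value types instead of Type; note the original encode additionally Base64-encodes the bytes, a step decode never reverses, so the sketch omits it):

```java
import java.io.*;
import java.util.*;

public class SchemaRoundTrip {
    // hypothetical stand-in for com.example.schema.Schema
    static class Schema implements Serializable {
        final String id;
        final Map<String, String> keyTypes;
        Schema(String id, Map<String, String> keyTypes) {
            this.id = id;
            this.keyTypes = keyTypes;
        }
    }

    static byte[] encode(Schema s) throws IOException {
        ByteArrayOutputStream baos = new ByteArrayOutputStream();
        try (ObjectOutputStream oos = new ObjectOutputStream(baos)) {
            oos.writeObject(s.id);
            oos.writeObject(new HashMap<>(s.keyTypes)); // plain map, like keyTypes.asJava
        }
        return baos.toByteArray();
    }

    static Schema decode(byte[] bytes) throws IOException, ClassNotFoundException {
        try (ObjectInputStream ois = new ObjectInputStream(new ByteArrayInputStream(bytes))) {
            String id = (String) ois.readObject();
            @SuppressWarnings("unchecked")
            Map<String, String> keyTypes = (Map<String, String>) ois.readObject();
            return new Schema(id, keyTypes);
        }
    }

    public static void main(String[] args) throws Exception {
        Schema s = new Schema("s1", Map.of("k", "string"));
        Schema back = decode(encode(s));
        System.out.println(back.id + " " + back.keyTypes);
    }
}
```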
====
Edit2: And here's what ToKVTransform actually looks like:
class SchemaExtractorTransform extends PTransform[PCollection[String], PCollection[Schema]] {
class InferSchemaFromStringWithKeyFn extends DoFn[String, KV[String, Schema]] {
override def processElement(c: DoFn[String, KV[String, Schema]]#ProcessContext): Unit = {
val line = c.element()
inferSchemaFromString(line)
}
}
class GetFirstFn extends DoFn[KV[String, java.lang.Iterable[Schema]], Schema] {
override def processElement(c: DoFn[KV[String, java.lang.Iterable[Schema]], Schema]#ProcessContext): Unit = {
val idAndSchemas: KV[String, java.lang.Iterable[Schema]] = c.element()
val it: java.util.Iterator[Schema] = idAndSchemas.getValue().iterator()
c.output(it.next())
}
}
override def apply(inputLines: PCollection[String]): PCollection[Schema] = {
val schemasWithKey: PCollection[KV[String, Schema]] = inputLines.apply(
ParDo.named("InferSchemas").of(new InferSchemaFromStringWithKeyFn())
)
val keyed: PCollection[KV[String, java.lang.Iterable[Schema]]] = schemasWithKey.apply(
GroupByKey.create()
)
val schemasOnly: PCollection[Schema] = keyed.apply(
ParDo.named("GetFirst").of(new GetFirstFn())
)
schemasOnly
}
}
This problem doesn't reproduce in Java; Scala is doing something different with types that breaks Dataflow coder inference. To work around this, you can call setCoder on a PCollection to set its Coder explicitly, such as
schemasWithKey.setCoder(KvCoder.of(StringUtf8Coder.of(), SchemaCoder.of()));
Here's the Java version of your code, just to make sure that it's doing approximately the same thing:
public static class SchemaExtractorTransform
extends PTransform<PCollection<String>, PCollection<Schema>> {
class InferSchemaFromStringWithKeyFn extends DoFn<String, KV<String, Schema>> {
public void processElement(ProcessContext c) {
c.output(KV.of(c.element(), new Schema()));
}
}
class GetFirstFn extends DoFn<KV<String, java.lang.Iterable<Schema>>, Schema> {
private static final long serialVersionUID = 0;
public void processElement(ProcessContext c) {
c.output(c.element().getValue().iterator().next());
}
}
public PCollection<Schema> apply(PCollection<String> inputLines) {
PCollection<KV<String, Schema>> schemasWithKey = inputLines.apply(
ParDo.named("InferSchemas").of(new InferSchemaFromStringWithKeyFn()));
PCollection<KV<String, java.lang.Iterable<Schema>>> keyed =
schemasWithKey.apply(GroupByKey.<String, Schema>create());
PCollection<Schema> schemasOnly =
keyed.apply(ParDo.named("GetFirst").of(new GetFirstFn()));
return schemasOnly;
}
}
I have sample code as below
import org.codehaus.groovy.control.CompilerConfiguration
abstract class MyClass extends Script {
void testMethod(Integer x) {
println "x = $x"
}
}
public static void main(String[] args) {
def compilerConfiguration = new CompilerConfiguration();
compilerConfiguration.setScriptBaseClass("MyClass");
GroovyShell shell = new GroovyShell(new Binding(), compilerConfiguration);
shell.evaluate("testMethod 1")
}
When I run this class it prints x = 1.
Now if I change "testMethod 1" to "testMethod -1", it fails with:
Caught: groovy.lang.MissingPropertyException: No such property: testMethod for class: Script1
groovy.lang.MissingPropertyException: No such property: testMethod for class: Script1
at Script1.run(Script1.groovy:1)
at Test.run(Test.groovy:15)
Now I change "testMethod -1" to "testMethod (-1)". It works again and prints x = -1.
What I need to understand is why Groovy requires the parentheses for negative numbers.
Because without the parentheses, Groovy assumes you are trying to subtract 1 from a property called testMethod (i.e. testMethod - 1).
You need the parentheses to tell the parser that this is a method call rather than a subtraction.
Edit
I came up with a horrible way to get this to work:
import java.lang.reflect.Method
import org.codehaus.groovy.control.CompilerConfiguration
abstract class MyClass extends Script {
private methods = [:]
class MinusableMethod {
Script declarer
Method method
MinusableMethod( Script d, Method m ) {
this.declarer = d
this.method = m
}
def minus( amount ) {
method.invoke( declarer, -amount )
}
}
public MyClass() {
super()
methods = MyClass.getDeclaredMethods().grep {
it.name != 'propertyMissing' && !it.synthetic
}.collectEntries {
[ (it.name): new MinusableMethod( this, it ) ]
}
}
def propertyMissing( String name ) {
methods[ name ]
}
void testMethod(Integer x) {
println "x = $x"
}
}
static main( args ) {
def compilerConfiguration = new CompilerConfiguration();
compilerConfiguration.setScriptBaseClass( 'MyClass' );
GroovyShell shell = new GroovyShell(new Binding(), compilerConfiguration);
shell.evaluate("testMethod - 1")
}
But this will probably break under other conditions
In the long run, getting people to write valid scripts is probably the better route to take...
I need to perform some initialization when new instances of my domain class are created.
class ActivationToken {
String foo
String bar
}
When I do this I want bar to be initialized by code inside ActivationToken:
def tok = new ActivationToken(foo:'a')
I cannot see how to 'override' the 'constructor' to make this happen. I know that in this case I could just add a normal constructor, but this is a simplified example.
The map constructor comes from Groovy, not Grails, in this case. I did some experimentation, and this is what I came up with:
class Foo {
String name = "bob"
int num = 0
public Foo() {
this([:])
}
public Foo(Map map) {
map?.each { k, v -> this[k] = v }
name = name.toUpperCase()
}
public String toString() {
"$name=$num"
}
}
assert 'BOB=0' == new Foo().toString()
assert 'JOE=32' == new Foo(name:"joe", num: 32).toString()
Basically, it appears that you'll have to override the constructors manually if you need to process a property after construction.
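In Java terms, the map-constructor pattern above amounts to: copy the recognized properties out of the map, then post-process the fields. A hedged sketch (not what Groovy actually generates):

```java
import java.util.Map;

public class Foo {
    String name = "bob";
    int num = 0;

    public Foo() {
        this(Map.of()); // no-arg constructor delegates with an empty map
    }

    public Foo(Map<String, Object> map) {
        // copy recognized properties, then post-process
        if (map.containsKey("name")) name = (String) map.get("name");
        if (map.containsKey("num"))  num  = (Integer) map.get("num");
        name = name.toUpperCase();
    }

    @Override
    public String toString() {
        return name + "=" + num;
    }

    public static void main(String[] args) {
        System.out.println(new Foo());
        System.out.println(new Foo(Map.of("name", "joe", "num", 32)));
    }
}
```

Because the post-processing runs in the constructor, even the default "bob" is upper-cased here, unlike the setter-based variant below.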
Alternately, you can override individual setters, which is cleaner and safer in general:
class Foo {
String name = "bob"
int num = 0
public void setName(n) {
name = n.toUpperCase()
}
public String toString() {
"$name=$num"
}
}
assert 'bob=0' == new Foo().toString()
assert 'JOE=32' == new Foo(name:"joe", num: 32).toString()
Note that the default value isn't run through the setter (hence 'bob=0' above), but that should be OK in most cases.
The map-constructor approach is also useful when initializing an object from web-request parameters, for example, where you wish to ignore extraneous values by catching the resulting MissingPropertyException:
public Foo(Map map) {
    try {
        map?.each { k, v -> this[k] = v }
    }
    catch (MissingPropertyException e) {
        // ignore parameters that don't correspond to a property
    }
}