How do I use Spock Spy?

I tried to use a Spy in a test, but it did not work. The following class is the SUT (system under test).
public class FileManager {
    public int removeFiles(String directory) {
        int count = 0;
        if (isDirectory(directory)) {
            String[] files = findFiles(directory);
            for (String file : files) {
                deleteFile(file);
                count++;
            }
        }
        return count;
    }

    private boolean isDirectory(String directory) {
        return directory.endsWith("/");
    }

    private String[] findFiles(String directory) {
        // read files from disk.
        return null;
    }

    private void deleteFile(String file) {
        // delete a file.
        return;
    }
}
Then I created a test like the one below:
class SpyTest extends Specification {
    def "Should return the number of files deleted"() {
        given:
        def fileManager = Spy(FileManager)
        1 * fileManager.findFiles("directory/") >> { return ["file1", "file2", "file3", "file4"] }
        fileManager.deleteFile(_) >> { println "deleted file." }

        when:
        def count = fileManager.removeFiles("directory/")

        then:
        count == 4
    }
}
But I got a NullPointerException:
java.lang.NullPointerException
at example.spock.mock.FileManager.removeFiles(FileManager.java:8)
at net.sf.cglib.proxy.MethodProxy.invokeSuper(MethodProxy.java:228)
at org.spockframework.mock.runtime.CglibRealMethodInvoker.respond(CglibRealMethodInvoker.java:32)
at org.spockframework.mock.runtime.MockInvocation.callRealMethod(MockInvocation.java:60)
at org.spockframework.mock.CallRealMethodResponse.respond(CallRealMethodResponse.java:29)
at org.spockframework.mock.runtime.MockController.handle(MockController.java:49)
at org.spockframework.mock.runtime.JavaMockInterceptor.intercept(JavaMockInterceptor.java:72)
at org.spockframework.mock.runtime.CglibMockInterceptorAdapter.intercept(CglibMockInterceptorAdapter.java:30)
at example.spock.mock.SpyTest.Should return the number of files deleted(SpyTest.groovy:13)
This means the real method is being called. Is there a reason why the stubbing does not work?

In Spock, you can't mock private methods of a Java class. Looking at your FileManager, it's not clear why only removeFiles is public while the others are private, even though all of the methods relate to file management. Possible solutions:
- Make the rest of the FileManager methods public. Spock will then be able to stub them, and FileManager will actually become a file manager, not just a file remover.
- Decompose FileManager into separate components, so you can mock those components individually and inject them into the "file remover". You have essentially already decomposed the code at the method level, but private Java methods are not mockable in Spock. Full class decomposition might be overhead here, though, because FileManager looks fairly cohesive with all of its operations. A sketch of this option follows the list.
- Use another test/mocking framework that can mock private methods, e.g. Mockito with PowerMock. However, mocking private methods is the worst option, because it can harm the maintainability of the entire codebase in the long term if it gets out of control.
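For what it's worth, here is a minimal sketch of the decomposition option. The FileSystem collaborator, the FileRemover class, and all of their method names are hypothetical illustrations, not taken from the question:

import spock.lang.Specification

// Hypothetical collaborator extracted from FileManager.
interface FileSystem {
    boolean isDirectory(String directory)
    String[] findFiles(String directory)
    void deleteFile(String file)
}

// The "file remover" with the file-system details injected.
class FileRemover {
    private final FileSystem fileSystem

    FileRemover(FileSystem fileSystem) {
        this.fileSystem = fileSystem
    }

    int removeFiles(String directory) {
        int count = 0
        if (fileSystem.isDirectory(directory)) {
            for (String file in fileSystem.findFiles(directory)) {
                fileSystem.deleteFile(file)
                count++
            }
        }
        return count
    }
}

class FileRemoverSpec extends Specification {
    def "should return the number of files deleted"() {
        given: "a mocked file system holding four files"
        def fileSystem = Mock(FileSystem)
        def fileRemover = new FileRemover(fileSystem)

        when:
        def count = fileRemover.removeFiles("directory/")

        then:
        1 * fileSystem.isDirectory("directory/") >> true
        1 * fileSystem.findFiles("directory/") >> (["file1", "file2", "file3", "file4"] as String[])
        4 * fileSystem.deleteFile(_)
        count == 4
    }
}

Because FileSystem arrives from outside, a plain Mock is enough; no Spy and no stubbing of the class under test is needed.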

Related

How can I inject with Guice my api into dataflow jobs without needing to be serializable?

This question is a follow-on to the great answer to Is there a way to upload jars for a dataflow job so we don't have to serialize everything?
That answer made me realize: ok, what I want is injection with no serialization so that I can mock and test.
Our current method requires our apis/mocks to be serializable, BUT THEN I have to put static fields in the mock, because it gets serialized and deserialized, creating a new instance that dataflow uses.
My colleague pointed out that perhaps this needs to be a sink, and that is treated differently? We may try that later and update, but we are not sure right now.
My desire is to replace the apis with mocks from the top during testing. Does someone have an example of this?
Here is our bootstrap code, which does not know whether it is running in production or inside a feature test. We test end-to-end results with no apache beam imports in our tests, meaning we can swap to any tech if we want to pivot and keep all our tests. Not only that, we catch way more integration bugs and can refactor without rewriting tests, since the contracts we test are customer ones we can't easily change.
public class App {
    private Pipeline pipeline;
    private RosterFileTransform transform;

    @Inject
    public App(Pipeline pipeline, RosterFileTransform transform) {
        this.pipeline = pipeline;
        this.transform = transform;
    }

    public void start() {
        pipeline.apply(transform);
        pipeline.run();
    }
}
Notice that everything we do is Guice-injection based, so the Pipeline may be a direct runner or not. I may need to modify this class to pass things through :( but anything that works for now would be great.
The function I am trying to get our api into (both mock and impl) with no serialization is this:
private class ValidRecordPublisher extends DoFn<Validated<PractitionerDataRecord>, String> {
    @ProcessElement
    public void processElement(@Element Validated<PractitionerDataRecord> element) {
        microServiceApi.writeRecord(element.getValue());
    }
}
I am not sure how to pass in microServiceApi in a way that avoids serialization. I would also be OK with delayed creation after deserialization, using a Guice Provider and provider.get(), if there is a solution there.
I solved it in such a way that mocks no longer need statics or serialization anymore, via one single class bridging the world of dataflow (in prod and in test), like so.
NOTE: There is additional magic-ness we have in our company that passes headers through from service to service and through dataflow, and some of that is in there, which you can ignore (i.e. the RouterRequest request = Current.request();). So anyone else will have to pass projectId into getInstance each time.
public abstract class DataflowClientFactory implements Serializable {
    private static final Logger log = LoggerFactory.getLogger(DataflowClientFactory.class);
    public static final String PROJECT_KEY = "projectKey";

    private transient static Injector injector;
    private transient static Module overrides;
    private static int counter = 0;

    public DataflowClientFactory() {
        counter++;
        log.info("creating again (usually due to deserialization). counter=" + counter);
    }

    public static void injectOverrides(Module dfOverrides) {
        overrides = dfOverrides;
    }

    private synchronized void initialize(String project) {
        if (injector != null)
            return;

        /********************************************
         * The hardest part is this piece since this is specific to each Dataflow
         * so each project subclasses DataflowClientFactory
         * This solution is the best ONLY in the fact of time crunch and it works
         * decently for end to end testing without developers needing fancy
         * wrappers around mocks anymore.
         ***/
        Module module = loadProjectModule();
        Module modules = Modules.combine(module, new OrderlyDataflowModule(project));

        if (overrides != null) {
            modules = Modules.override(modules).with(overrides);
        }

        injector = Guice.createInjector(modules);
    }

    protected abstract Module loadProjectModule();

    public <T> T getInstance(Class<T> clazz) {
        if (!Current.isContextSet()) {
            throw new IllegalStateException("Someone on the stack is extending DoFn instead of OrderlyDoFn so you need to fix that first");
        }
        RouterRequest request = Current.request();
        String project = (String) request.requestState.get(PROJECT_KEY);
        initialize(project);
        return injector.getInstance(clazz);
    }
}
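To illustrate the claim that mocks no longer need statics or serialization, here is a sketch of how a feature test might install overrides before the pipeline runs. MicroServiceApi is the interface name borrowed from the question's DoFn, and the fake shown is purely illustrative:

import com.google.inject.AbstractModule

def recorded = []
// Groovy map coercion gives a throwaway fake without any mocking library.
def fakeApi = [writeRecord: { record -> recorded << record }] as MicroServiceApi

// Installed once in the test JVM; the factory folds it into its injector,
// so DoFns that resolve MicroServiceApi get the fake with no serialization.
DataflowClientFactory.injectOverrides(new AbstractModule() {
    @Override
    protected void configure() {
        bind(MicroServiceApi).toInstance(fakeApi)
    }
})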
I suppose this may not be what you're looking for, but your use case makes me think of using factory objects. They may depend on the pipeline options that you pass (i.e. your PipelineOptions object), or on some other configuration object.
Perhaps something like this:
class MicroserviceApiClientFactory implements Serializable {
    private final PipelineOptions options;

    MicroserviceApiClientFactory(PipelineOptions options) {
        this.options = options;
    }

    public MicroserviceApiClient getClient() {
        MySpecialOptions specialOpts = options.as(MySpecialOptions.class);
        if (specialOpts.getMockMicroserviceApi()) {
            return new MockedMicroserviceApiClient(...); // Or whatever
        } else {
            return new MicroserviceApiClient(specialOpts.getMicroserviceEndpoint()); // Or whatever parameters it needs
        }
    }
}
And for your DoFns and any other execution-time objects that need it, you would pass the factory:
private class ValidRecordPublisher extends DoFn<Validated<PractitionerDataRecord>, String> {
    private final MicroserviceApiClientFactory msFactory;
    private transient MicroserviceApiClient microServiceApi;

    ValidRecordPublisher(MicroserviceApiClientFactory msFactory) {
        this.msFactory = msFactory;
    }

    @ProcessElement
    public void processElement(@Element Validated<PractitionerDataRecord> element) {
        if (microServiceApi == null) microServiceApi = msFactory.getClient();
        microServiceApi.writeRecord(element.getValue());
    }
}
This should allow you to encapsulate the mocking functionality into a single class that lazily creates your mock or your client at pipeline execution time.
Let me know if this matches what you want somewhat, or if we should try to iterate further.
I have no experience with Guice, so I don't know if Guice configurations can easily pass the boundary between pipeline construction and pipeline execution (serialization / submitting JARs / etc.).
Should this be a sink? Maybe, if you have an external service, and you're writing to it, you can write a PTransform that takes care of it - but the question of how you inject various dependencies will remain.
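As a side note, the MySpecialOptions interface assumed in the sketch above would need to be declared somewhere. Here is a rough guess at its shape, with hypothetical option names matching the getters used earlier (Beam generates the implementation and parses the values from command-line flags):

import org.apache.beam.sdk.options.Default
import org.apache.beam.sdk.options.Description
import org.apache.beam.sdk.options.PipelineOptions

// Hypothetical custom options; obtained via options.as(MySpecialOptions.class).
interface MySpecialOptions extends PipelineOptions {
    @Description("Whether to hand out the mocked microservice API client")
    @Default.Boolean(false)
    boolean getMockMicroserviceApi()
    void setMockMicroserviceApi(boolean value)

    @Description("Endpoint of the real microservice API")
    String getMicroserviceEndpoint()
    void setMicroserviceEndpoint(String value)
}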

How to get the instance of an injected dependency, by its type using Umbraco.Core.Composing (Umbraco 8)

I need to find a way to get an instance of DataProcessingEngine without calling its constructor.
I am trying to find a way to do so using the DataProcessingEngine registered in the composition object (please see the following code), but I could not find one.
Does anyone have a suggestion? Thanks in advance.
public class Composer : IUserComposer
{
    public void Compose(Composition composition)
    {
        composition.Register<IDataProcessingEngine, DataProcessingEngine>(Lifetime.Singleton);
        //DataProcessingEngine dataProcessing = composition.Resolve<IDataProcessingEngine>(); // no Resolve function exists in Umbraco.Core.Composing
        SaveImagesThread(dataProcessingEngine);
    }

    public Task SaveImagesThread(IDataProcessingEngine dataProcessingEngine) // TODO - decide async
    {
        string dataTimerTime = WebConfig.SaveProductsDataTimer;
        double time = GetTimeForTimer(dataTimerTime);
        if (time > 0)
        {
            var aTimer = new System.Timers.Timer(time);
            aTimer.Elapsed += new ElapsedEventHandler(dataProcessingEngine.SaveImages);
            aTimer.Start();
        }
        return default;
    }
}
For all of you who are looking for a way to call a function (defined in another class of your code, an engine or similar) from the composer (where the app starts), while avoiding a direct call to that function's class constructor, I've found another way to do it:
public class QueuePollingHandler
{
    [RuntimeLevel(MinLevel = RuntimeLevel.Run)]
    public class SubscribeToQueuePollingHandlerComponentComposer :
        ComponentComposer<SubscribeToQueuePollingHandler>
    { }

    public class SubscribeToQueuePollingHandler : IComponent
    {
        private readonly IDataProcessingEngine _dataProcessingEngine;

        public SubscribeToQueuePollingHandler(IDataProcessingEngine dataProcessingEngine)
        {
            _dataProcessingEngine = dataProcessingEngine;
            SaveImagesThread(_dataProcessingEngine);
        }

        public void SaveImagesThread(IDataProcessingEngine dataProcessingEngine)
        {
            ....
        }
    }
}
And the logic explanation: you create a class (SubscribeToQueuePollingHandlerComponentComposer in the example) and define its base class to be ComponentComposer<Class_that_inherits_IComponent>.
When you start the application, you can see that it reaches the registered class's constructor (the SubscribeToQueuePollingHandler constructor).
That's the way I found to call a function right when the application starts, without needing to call its class constructor myself, while actually using dependency injection.

Jenkins pipeline job hangs on simple call to print simple object from shared library

I have a Jenkins pipeline job that has been working fine. I have a small handful of similar pipelines, and I've been duplicating a small set of reusable utility methods into each one, so I've started to construct a shared library to reduce that duplication.
I'm using the following page for guidance: https://jenkins.io/doc/book/pipeline/shared-libraries/ .
For each method that I move into the shared library, I create a "vars/methodname.groovy" file in the shared library, and change the method name to "call".
I've been doing these one at a time and verifying the pipeline job still works, and this is all working fine.
The original set of methods referenced several "global" variables, like "env.JOB_NAME" and "params.". In order for the methods to work in the shared library, I added those env vars and params as parameters to the methods. This also works fine.
However, I don't like the fact that I have to pass these "global" variables, that are essentially static from the start of the job, sometimes through a couple of levels of these methods that I've put into the shared library.
So, I've now created something like the "vars/acme.groovy" example from that doc page. I'm going to define instance variables to store all of those "global" variables, and move each of the single methods defined in the "vars/methodname.groovy" files into this new class as instance methods.
I also defined a "with" method in the class for each of the instance variables (a setter that returns "this" for chaining).
I initially would configure it inside my "node" block with something like the following (the file in the library is called "vars/uslutils.groovy"):
uslutils.withCurrentBuild(currentBuild).with...
And then when I need to call any of the reused methods, I would just do "uslutils.methodname(optionalparameters)".
I also added a "toString()" method to the class, just for debugging (since debugging Jenkinsfiles is so easy :) ).
What's odd is that I'm finding that if I call this toString() method from the pipeline script, the job hangs forever, and I have to manually kill it. I imagine I'm hitting some sort of non-obvious recursion in some Groovy AST, but I don't see what I'm doing wrong.
Here is my "vars/uslutils.groovy" file in the shared library:
import hudson.model.Cause
import hudson.triggers.TimerTrigger
import hudson.triggers.SCMTrigger
import hudson.plugins.git.GitStatus
class uslutils implements Serializable {
    def currentBuild
    String mechIdCredentials
    String baseStashURL
    String jobName
    String codeBranch
    String buildURL
    String pullRequestURL
    String qBotUserID
    String qBotPassword

    def getCurrentBuild() { return currentBuild }
    String getMechIdCredentials() { return mechIdCredentials }
    String getBaseStashURL() { return baseStashURL }
    String getJobName() { return jobName }
    String getCodeBranch() { return codeBranch }
    String getBuildURL() { return buildURL }
    String getPullRequestURL() { return pullRequestURL }
    String getQBotUserID() { return qBotUserID }
    String getQBotPassword() { return qBotPassword }

    def withCurrentBuild(currentBuild) { this.currentBuild = currentBuild; return this }
    def withMechIdCredentials(String mechIdCredentials) { this.mechIdCredentials = mechIdCredentials; return this }
    def withBaseStashURL(String baseStashURL) { this.baseStashURL = baseStashURL; return this }
    def withJobName(String jobName) { this.jobName = jobName; return this }
    def withCodeBranch(String codeBranch) { this.codeBranch = codeBranch; return this }
    def withBuildURL(String buildURL) { this.buildURL = buildURL; return this }
    def withPullRequestURL(String pullRequestURL) { this.pullRequestURL = pullRequestURL; return this }
    def withQBotUserID(String qBotUserID) { this.qBotUserID = qBotUserID; return this }
    def withQBotPassword(String qBotPassword) { this.qBotPassword = qBotPassword; return this }

    public String toString() {
        // return "[currentBuild[${this.currentBuild}] mechIdCredentials[${this.mechIdCredentials}] " +
        //     "baseStashURL[${this.baseStashURL}] jobName[${this.jobName}] codeBranch[${this.codeBranch}] " +
        //     "buildURL[${this.buildURL}] pullRequestURL[${this.pullRequestURL}] qBotUserID[${this.qBotUserID}] " +
        //     "qBotPassword[${this.qBotPassword}]]"
        return this.mechIdCredentials
    }
}
Note that I've simplified the toString() method temporarily until I figure out what I'm doing wrong here.
This is what I added at the top of my "node" block:
uslutils.currentBuild = currentBuild
println "uslutils[${uslutils}]"
When I run the job, it prints information from lines that come before this, and then it just shows the rotating thing forever, until I kill the job. If I comment out the "println", it works fine.
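One thing worth ruling out (an assumption on my part, based on the Jenkins shared-library documentation rather than on anything shown above): methods in library classes are CPS-transformed by default, and the documentation advises marking overrides that get invoked implicitly, such as toString() during GString interpolation, with @NonCPS. A minimal sketch of that change:

import com.cloudbees.groovy.cps.NonCPS

class uslutils implements Serializable {
    String mechIdCredentials

    // @NonCPS keeps this method out of the CPS transform, so implicit calls
    // (such as from GString interpolation in the pipeline) run it natively.
    @NonCPS
    @Override
    public String toString() {
        return this.mechIdCredentials
    }
}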

JaCoCo/EclEmma's Source highlighting function doesn't work when using PowerMock to Mock Constructor

I used PowerMock to mock a constructor. After running the test with coverage, I thought all lines should be green; however, all lines are actually red.
I think mocking the constructor causes this, because mocking other things, like final classes, works fine. How do I fix this problem?
//code:
public class People {
    public String sayHello() {
        return "hello";
    }
}

public class Family {
    public String doEvent() {
        People p = new People();
        String str = p.sayHello();
        System.out.println(str);
        return str;
    }
}

@RunWith(PowerMockRunner.class)
@PrepareForTest(Family.class)
public class FamilyTest {
    @Test
    public void test() throws Exception {
        Family f = new Family();
        String str = "hello mock";
        People p = PowerMock.createMock(People.class);
        PowerMock.expectNew(People.class).andReturn(p);
        EasyMock.expect(p.sayHello()).andReturn(str);
        PowerMock.replay(p, People.class);
        String strActual = f.doEvent();
        Assert.assertEquals(str, strActual);
        PowerMock.verify(p, People.class);
    }
}
You shouldn't have to use @PrepareForTest unless you are mocking static methods inside that class.
I believe your issue is that when you prepare a class for test using PowerMock's runner, it does something funky with the byte code, which EclEmma relies on for line coverage. Since you are not mocking any static methods in your Family class, try removing it from your @PrepareForTest.

What are the alternatives to overriding asType() when writing conversion code?

It appears the convention for converting objects in Groovy is to use the as operator and override asType(). For example:
class Id {
    def value

    @Override
    public Object asType(Class type) {
        if (type == FormattedId) {
            return new FormattedId(value: value.toUpperCase())
        }
    }
}
def formattedId = new Id(value: "test") as FormattedId
However, Grails overwrites the implementation of asType() for all objects at runtime so that it can support idioms like render as JSON. An alternative is to rewrite asType() in the Grails BootStrap class as follows:
def init = { servletContext ->
    Id.metaClass.asType = { Class type ->
        if (type == FormattedId) {
            return new FormattedId(value: value.toUpperCase())
        }
    }
}
However, this leads to code duplication, as you now need to repeat the above in both BootStrap and the Id class; otherwise as FormattedId will not work outside the Grails container.
What alternatives exist for writing conversion code in Groovy/Grails that do not break good code/OO design principles like the Single Responsibility Principle or DRY? Are mixins a good use here?
You can use the Grails support for codecs to automatically add encodeAs* methods to your objects:
class FormattedIdCodec {
    static encode = { target ->
        new FormattedId((target as String).toUpperCase())
    }
}
Then you can use the following in your code:
def formattedId = new Id(value: "test").encodeAsFormattedId()
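If the reverse conversion is ever needed, Grails codec classes also support a decode closure alongside encode. A sketch, where the lower-casing logic is purely illustrative:

class FormattedIdCodec {
    static encode = { target ->
        new FormattedId((target as String).toUpperCase())
    }

    // Hypothetical reverse conversion; adds a decodeFormattedId() method.
    static decode = { target ->
        new Id(value: (target as String).toLowerCase())
    }
}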
My inelegant solution is to rename the original asType(), make a new asType() that calls it, and also have your BootStrap overwrite asType with a call to that method.
So, your class:
class Id {
    def value

    @Override
    public Object asType(Class type) {
        return oldAsType(type);
    }

    public Object oldAsType(Class type) {
        if (type == FormattedId) {
            return new FormattedId(value: value.toUpperCase())
        }
    }
}
In my app, I had asType defined in a number of classes, so I ended up using a common closure in BootStrap.groovy:
def useOldAsType = { Class clazz ->
    delegate.oldAsType(clazz)
}

Id.metaClass.asType = useOldAsType;
Value.metaClass.asType = useOldAsType;
OtherClass.metaClass.asType = useOldAsType;
SubclassOfValue.metaClass.asType = useOldAsType;
Note that if you have a subclass that does not override asType, but you want it to use the superclass's, you must also set it in BootStrap.
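As for the mixin question: a Groovy category is another way to keep the conversion logic in one class without touching BootStrap or asType(), at the cost of an explicit use block. A sketch with a hypothetical IdConversions class:

// Hypothetical category: static methods whose first parameter is the receiver.
class IdConversions {
    static FormattedId toFormattedId(Id self) {
        new FormattedId(value: self.value.toUpperCase())
    }
}

// The conversion method is only available inside the use block.
use(IdConversions) {
    def formattedId = new Id(value: "test").toFormattedId()
}

Because the category never touches the metaclass's asType(), Grails's runtime changes to asType() cannot collide with it.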
