"Job class must implement the Job interface."
I created a simple job using Quartz.NET 1.0.3:
public class SimpleTestJob : IJob
{
public virtual void Execute(JobExecutionContext context)
{
System.Diagnostics.EventLog.WriteEntry("QuartzTest", "This is a test run");
}
}
Then I tried to dynamically add the job above to the Quartz server.
First I obtained the Type via reflection:
string jobType = "Scheduler.Quartz.Jobs.SimpleTestJob,Scheduler.Quartz,Version=1.0.0.0,Culture=neutral,PublicKeyToken=null";
var schedType= Type.GetType(jobType, false, true);
That works. Then I try to create the JobDetail object:
JobDetail job = new JobDetail(jobName, groupName, schedType.GetType());
But I receive an error from the Quartz.NET framework:
"Job class must implement the Job interface."
Please help.
Try removing the virtual keyword. You might also want to try using the typeof operator where you have schedType.GetType(); I'm not sure what type schedType ends up being, given that it is declared as var.
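As an illustration of that suggestion (a sketch only: the jobName/groupName values below are made-up placeholders, and typeof requires a compile-time reference to the job class, unlike the dynamic Type.GetType call in the question):
// Sketch only: placeholder name/group values; SimpleTestJob referenced at compile time.
string jobName = "simpleTestJob";
string groupName = "testGroup";
JobDetail job = new JobDetail(jobName, groupName, typeof(SimpleTestJob));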
I am using Quartz 1.0.3, which is compiled against .NET 3.5, but schedType.GetType() returned a type whose runtime version is 4.
Actually I do not need to call GetType() at all, because I already have the type I obtained earlier:
var schedType= Type.GetType(jobType, false, true);
So my fix was
JobDetail job = new JobDetail(jobName, groupName, schedType);
I would like a Dataflow template with a default value for one of the PipelineOptions parameters.
Inspired by examples online I use a ValueProvider placeholder for deferred parameter setting in my PipelineOptions "sub"-interface:
@Default.String("MyDefaultValue")
ValueProvider<String> getMyValue();
void setMyValue(ValueProvider<String> value);
If I specify the parameter at runtime, the template works for launching a real GCP Dataflow job. However, if I try to test omitting the parameter before doing this for real:
@Rule
public TestPipeline pipeline = TestPipeline.create();
...
@Test
public void test() {
PipelineOptions options = PipelineOptionsFactory.fromArgs(new String[] {...}).withValidation();
...
pipeline.run(options);
}
Then when my TestPipeline executes a DoFn processElement method where the parameter is needed, I get:
IllegalStateException: Value only available at runtime, but accessed from a non-runtime context:
RuntimeValueProvider{propertyName=myValue, default=MyDefaultValue}
...
More specifically it fails here in org.apache.beam.sdk.options.ValueProvider:
@Override
public T get() {
PipelineOptions options = optionsMap.get(optionsId);
if (options == null) {
throw new IllegalStateException(...
One could presumably be forgiven for thinking runtime is when the pipeline is running.
Anyway, does anybody know how I would unit test the parameter defaulting, assuming the top code snippet is how it should be set up and it is supported? Thank you.
I had the same problem when I was generating a Dataflow template from Eclipse; my Dataflow template receives a parameter from a Cloud Composer DAG.
I got the solution from the Google Cloud documentation:
https://cloud.google.com/dataflow/docs/guides/templates/creating-templates#using-valueprovider-in-your-functions
You can also use Flex Templates and avoid all the hassles with ValueProviders.
I have properly created and scheduled a Job (I don't show the Job and Trigger creation here, for brevity). The Scheduler is created and started as follows:
_scheduler = New StdSchedulerFactory().GetScheduler
_scheduler.Start()
Jobs are executed at the scheduled time.
Then I created a very simple (and, at the moment, empty) JobListener:
Imports Quartz
Public Class JobListener
Implements IJobListener
#Region "Public properties"
Public ReadOnly Property Name As String Implements Quartz.IJobListener.Name
Get
Return "JOB_LISTENER"
End Get
End Property
#End Region
#Region "Methods"
Public Sub JobExecutionVetoed(context As Quartz.IJobExecutionContext) Implements Quartz.IJobListener.JobExecutionVetoed
Throw New NotImplementedException
End Sub
Public Sub JobToBeExecuted(context As Quartz.IJobExecutionContext) Implements Quartz.IJobListener.JobToBeExecuted
Throw New NotImplementedException
End Sub
Public Sub JobWasExecuted(context As Quartz.IJobExecutionContext, jobException As Quartz.JobExecutionException) Implements Quartz.IJobListener.JobWasExecuted
End Sub
#End Region
End Class
and added it to the scheduler:
_scheduler = New StdSchedulerFactory().GetScheduler
_scheduler.Start()
_jobListener = New JobListener()
_scheduler.ListenerManager.AddJobListener(_jobListener, GroupMatcher(Of JobKey).AnyGroup())
and now the Jobs are not executed anymore.
Any hint as to why this happens?
Same result if I add the JobListener before starting the Scheduler:
_jobListener = New JobListener()
_scheduler = New StdSchedulerFactory().GetScheduler
_scheduler.ListenerManager.AddJobListener(_jobListener, GroupMatcher(Of JobKey).AnyGroup())
_scheduler.Start()
I figured out what the problem was.
First of all, a piece of advice: always configure logging before you start debugging Quartz.NET.
When the Job is ready to be executed, the JobListener is notified and the method JobToBeExecuted is called. As you can see in my JobListener implementation, I throw an exception in JobToBeExecuted, and that exception prevents the Job from being executed.
I didn't investigate why an error in the JobListener prevents the Job from being executed; I guess there's a chain of calls that is broken by the exception.
Anyway, this is the answer to my question.
I have a Maven plugin that I am attempting to test using a subclass of the AbstractMojoTestCase. The plugin Mojo defines an outputFolder parameter with a defaultValue. This parameter is not generally expected to be provided by the user in the POM.
@Parameter(defaultValue = "${project.build.directory}/someOutputFolder")
private File outputFolder;
And if I use the plugin in a real scenario then the outputFolder gets defaulted as expected.
But if I test the Mojo using the AbstractMojoTestCase then while parameters defined in the test POM are populated, parameters with a defaultValue that are not defined in the POM are not populated.
public class MyPluginTestCase extends AbstractMojoTestCase {
public void testAssembly() throws Exception {
final File pom = getTestFile( "src/test/resources/test-pom.xml");
assertNotNull(pom);
assertTrue(pom.exists());
final MyMojo myMojo = (MyMojo) lookupMojo("assemble", pom);
assertNotNull(myMojo);
myMojo.execute(); // Dies due to NullPointerException on outputFolder.
}
}
Further: if I define the outputFolder parameter in the POM like so:
<outputFolder>${project.build.directory}/someOutputFolder</outputFolder>
then ${project.build.directory} is NOT resolved within the AbstractMojoTestCase.
So what do I need to do to get the defaultValue populated when testing?
Or is this a fault in the AbstractMojoTestCase?
This is Maven-3.2.3, maven-plugin-plugin-3.2, JDK 8
You need to use lookupConfiguredMojo.
Here's what I ended up using:
public class MyPluginTest
{
@Rule
public MojoRule mojoRule = new MojoRule();
@Test
public void noSource() throws Exception
{
// Just give the location, where the pom.xml is located
MyPlugin plugin = (MyPlugin) mojoRule.lookupConfiguredMojo(getResourcesFile("basic-test"), "myGoal");
plugin.execute();
assertThat(plugin.getSomeInformation()).isEmpty();
}
public File getResourcesFile(String filename)
{
return new File("src/test/resources", filename);
}
}
Of course you need to replace myGoal with your plugin's goal. You also need to figure out how to assert that your plugin executed successfully.
For a more complete example, check out the tests I wrote for fmt-maven-plugin
Grails 1.3.7
Trouble with data binding for Command objects that have List content. Example Command:
class Tracker {
String name
String description
List<Unit> units = new ArrayList()
}
class Unit {
String name
Long unitMax
Long unitMin
}
The create GSP for Tracker has the Unit fields. One example:
<g:textField name="units[0].unitMax" value=""/>
TrackerController save method:
def save = { Tracker trackerInstance ->
trackerInstance = trackingService.saveOrUpdateTracker(trackerInstance)
}
But I always get a java.lang.IndexOutOfBoundsException.
Alternatively, if I update the controller to:
def save = {
Tracker trackerInstance = new Tracker()
trackerInstance.properties = params
....
Then I get groovy.lang.ReadOnlyPropertyException: Cannot set readonly property: properties for class: com.redbrickhealth.dto.Tracker
Any ideas?
There seems to be a difference between binding in GORM vs Command objects.
Maybe I need to extend and register a PropertyEditorSupport for Unit?
-Todd
Since Groovy 1.8.7 the List interface has a method called withLazyDefault that can be used instead of Apache Commons ListUtils:
List<Unit> units = [].withLazyDefault { new Unit() }
This creates a new Unit instance every time units is accessed with a non-existent index.
See the documentation of withLazyDefault for more details. I also wrote a small blog post about this a few days ago.
Grails requires a command object with an existing list that will be filled with data from the request.
If you know the exact number of units, say 3, you can:
class Tracker {
String name
String description
List<Unit> units = [new Unit(), new Unit(), new Unit()]
}
or use LazyList from Apache Commons Collections:
import org.apache.commons.collections.ListUtils
import org.apache.commons.collections.Factory
class Tracker {
String name
String description
List<Unit> units = ListUtils.lazyList([], {new Unit()} as Factory)
}
Is it possible to run Quartz.NET jobs in a separate AppDomain? If so, how can this be achieved?
Disclaimer: I've not tried this, it's just an idea. And none of this code has been compiled, even.
Create a custom job factory that creates a wrapper for your real jobs. Have this wrapper implement the Execute method by creating a new app domain and running the original job in that app domain.
In more detail: Create a new type of job, say IsolatedJob : IJob. Have this job take as a constructor parameter the type of a job that it should encapsulate:
internal class IsolatedJob: IJob
{
private readonly Type _jobType;
public IsolatedJob(Type jobType)
{
_jobType = jobType ?? throw new ArgumentNullException(nameof(jobType));
}
public void Execute(IJobExecutionContext context)
{
// Create the job in a new app domain, run it there, then unload the domain
System.AppDomain domain = System.AppDomain.CreateDomain("Isolation");
try
{
// Resolve the type by its assembly name and full type name; note that for the call
// to actually run in the other domain, the job class must derive from MarshalByRefObject
var job = (IJob)domain.CreateInstanceAndUnwrap(_jobType.Assembly.FullName, _jobType.FullName);
job.Execute(context);
}
finally
{
System.AppDomain.Unload(domain);
}
}
}
You may need to create an implementation of IJobExecutionContext that inherits from MarshalByRefObject and proxies calls onto the original context object. Given the number of other objects that IJobExecutionContext provides access to, I'd be tempted to implement many members with a NotImplementedException as most won't be needed during job execution.
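For illustration, here is a rough sketch (my own, not compiled against Quartz) of the forwarding shape such a wrapper could take. A real version would implement IJobExecutionContext itself; this fragment only shows the MarshalByRefObject-plus-delegation idea, forwarding two members the isolated job is likely to need:
using System;
using Quartz;

// Sketch only: the real proxy would implement IJobExecutionContext and
// throw NotImplementedException for members the isolated job never touches.
internal class JobExecutionContextProxy : MarshalByRefObject
{
    private readonly IJobExecutionContext _inner;

    public JobExecutionContextProxy(IJobExecutionContext inner)
    {
        _inner = inner;
    }

    // Forward only what the isolated job actually needs.
    public JobDataMap MergedJobDataMap => _inner.MergedJobDataMap;
    public IJobDetail JobDetail => _inner.JobDetail;
}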
Next you need the custom job factory. This bit is easier:
internal class IsolatedJobFactory : IJobFactory
{
public IJob NewJob(TriggerFiredBundle bundle, IScheduler scheduler)
{
return NewJob(bundle.JobDetail.JobType);
}
private IJob NewJob(Type jobType)
{
return new IsolatedJob(jobType);
}
}
Finally, you will need to instruct Quartz to use this job factory rather than the out of the box one. Use the IScheduler.JobFactory property setter and provide a new instance of IsolatedJobFactory.
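Wiring it up might look roughly like this (a sketch assuming the synchronous, pre-3.x Quartz.NET API used in the snippets above):
using Quartz;
using Quartz.Impl;

// Sketch: set the custom factory before starting the scheduler.
IScheduler scheduler = new StdSchedulerFactory().GetScheduler();
scheduler.JobFactory = new IsolatedJobFactory();
scheduler.Start();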