Grails and Quartz: Bad value for type long

I'm trying to save Quartz jobs into the database. I've set up the tables and created the quartz.properties file, but when I try to run the app, this exception shows up:
2012-02-01 17:36:23,708 [main] ERROR context.GrailsContextLoader - Error executing bootstraps: org.quartz.JobPersistenceException: Couldn't store trigger 'expirationTrigger' for 'com.pldtglobal.svngateway.ExpirationCheckerJob' job:Bad value for type long : \254\355\000\005sr\000\025org.quartz.JobDataMap\237\260\203\350\277\251\260\313\002\000\000xr\000&org.quartz.utils.StringKeyDirtyFlagMap\202\010\350\303\373\305](\002\000\001Z\000\023allowsTransientDataxr\000\035org.quartz.utils.DirtyFlagMap\023\346.\255(v\012\316\002\000\002Z\000\005dirtyL\000\003mapt\000\017Ljava/util/Map;xp\001sr\000\021java.util.HashMap\005\007\332\301\303\026`\321\003\000\002F\000\012loadFactorI\000\011thresholdxp?#\000\000\000\000\000\014w\010\000\000\000\020\000\000\000\001t\000'org.grails.plugins.quartz.grailsJobNamet\000.com.pldtglobal.svngateway.ExpirationCheckerJobx\000 [See nested exception: org.postgresql.util.PSQLException: Bad value for type long : \254\355\000\005sr\000\025org.quartz.JobDataMap\237\260\203\350\277\251\260\313\002\000\000xr\000&org.quartz.utils.StringKeyDirtyFlagMap\202\010\350\303\373\305](\002\000\001Z\000\023allowsTransientDataxr\000\035org.quartz.utils.DirtyFlagMap\023\346.\255(v\012\316\002\000\002Z\000\005dirtyL\000\003mapt\000\017Ljava/util/Map;xp\001sr\000\021java.util.HashMap\005\007\332\301\303\026`\321\003\000\002F\000\012loadFactorI\000\011thresholdxp?#\000\000\000\000\000\014w\010\000\000\000\020\000\000\000\001t\000'org.grails.plugins.quartz.grailsJobNamet\000.com.pldtglobal.svngateway.ExpirationCheckerJobx\000]
org.codehaus.groovy.runtime.InvokerInvocationException: org.quartz.JobPersistenceException: Couldn't store trigger 'expirationTrigger' for 'com.pldtglobal.svngateway.ExpirationCheckerJob' job:Bad value for type long : \254\355\000\005sr\000\025org.quartz.JobDataMap\237\260\203\350\277\251\260\313\002\000\000xr\000&org.quartz.utils.StringKeyDirtyFlagMap\202\010\350\303\373\305](\002\000\001Z\000\023allowsTransientDataxr\000\035org.quartz.utils.DirtyFlagMap\023\346.\255(v\012\316\002\000\002Z\000\005dirtyL\000\003mapt\000\017Ljava/util/Map;xp\001sr\000\021java.util.HashMap\005\007\332\301\303\026`\321\003\000\002F\000\012loadFactorI\000\011thresholdxp?#\000\000\000\000\000\014w\010\000\000\000\020\000\000\000\001t\000'org.grails.plugins.quartz.grailsJobNamet\000.com.pldtglobal.svngateway.ExpirationCheckerJobx\000 [See nested exception: org.postgresql.util.PSQLException: Bad value for type long : \254\355\000\005sr\000\025org.quartz.JobDataMap\237\260\203\350\277\251\260\313\002\000\000xr\000&org.quartz.utils.StringKeyDirtyFlagMap\202\010\350\303\373\305](\002\000\001Z\000\023allowsTransientDataxr\000\035org.quartz.utils.DirtyFlagMap\023\346.\255(v\012\316\002\000\002Z\000\005dirtyL\000\003mapt\000\017Ljava/util/Map;xp\001sr\000\021java.util.HashMap\005\007\332\301\303\026`\321\003\000\002F\000\012loadFactorI\000\011thresholdxp?#\000\000\000\000\000\014w\010\000\000\000\020\000\000\000\001t\000'org.grails.plugins.quartz.grailsJobNamet\000.com.pldtglobal.svngateway.ExpirationCheckerJobx\000]
at org.grails.tomcat.TomcatServer.start(TomcatServer.groovy:212)
at grails.web.container.EmbeddableServer$start.call(Unknown Source)
at _GrailsRun_groovy$_run_closure5_closure12.doCall(_GrailsRun_groovy:158)
at _GrailsRun_groovy$_run_closure5_closure12.doCall(_GrailsRun_groovy)
at _GrailsSettings_groovy$_run_closure10.doCall(_GrailsSettings_groovy:280)
at _GrailsSettings_groovy$_run_closure10.call(_GrailsSettings_groovy)
at _GrailsRun_groovy$_run_closure5.doCall(_GrailsRun_groovy:149)
at _GrailsRun_groovy$_run_closure5.call(_GrailsRun_groovy)
at _GrailsRun_groovy.runInline(_GrailsRun_groovy:116)
at _GrailsRun_groovy.this$4$runInline(_GrailsRun_groovy)
at _GrailsRun_groovy$_run_closure1.doCall(_GrailsRun_groovy:59)
at RunApp$_run_closure1.doCall(RunApp:33)
at gant.Gant$_dispatch_closure5.doCall(Gant.groovy:381)
at gant.Gant$_dispatch_closure7.doCall(Gant.groovy:415)
at gant.Gant$_dispatch_closure7.doCall(Gant.groovy)
at gant.Gant.withBuildListeners(Gant.groovy:427)
at gant.Gant.this$2$withBuildListeners(Gant.groovy)
at gant.Gant$this$2$withBuildListeners.callCurrent(Unknown Source)
at gant.Gant.dispatch(Gant.groovy:415)
at gant.Gant.this$2$dispatch(Gant.groovy)
at gant.Gant.invokeMethod(Gant.groovy)
at gant.Gant.executeTargets(Gant.groovy:590)
at gant.Gant.executeTargets(Gant.groovy:589)
Caused by: org.quartz.JobPersistenceException: Couldn't store trigger 'expirationTrigger' for 'com.pldtglobal.svngateway.ExpirationCheckerJob' job:Bad value for type long : \254\355\000\005sr\000\025org.quartz.JobDataMap\237\260\203\350\277\251\260\313\002\000\000xr\000&org.quartz.utils.StringKeyDirtyFlagMap\202\010\350\303\373\305](\002\000\001Z\000\023allowsTransientDataxr\000\035org.quartz.utils.DirtyFlagMap\023\346.\255(v\012\316\002\000\002Z\000\005dirtyL\000\003mapt\000\017Ljava/util/Map;xp\001sr\000\021java.util.HashMap\005\007\332\301\303\026`\321\003\000\002F\000\012loadFactorI\000\011thresholdxp?#\000\000\000\000\000\014w\010\000\000\000\020\000\000\000\001t\000'org.grails.plugins.quartz.grailsJobNamet\000.com.pldtglobal.svngateway.ExpirationCheckerJobx\000 [See nested exception: org.postgresql.util.PSQLException: Bad value for type long : \254\355\000\005sr\000\025org.quartz.JobDataMap\237\260\203\350\277\251\260\313\002\000\000xr\000&org.quartz.utils.StringKeyDirtyFlagMap\202\010\350\303\373\305](\002\000\001Z\000\023allowsTransientDataxr\000\035org.quartz.utils.DirtyFlagMap\023\346.\255(v\012\316\002\000\002Z\000\005dirtyL\000\003mapt\000\017Ljava/util/Map;xp\001sr\000\021java.util.HashMap\005\007\332\301\303\026`\321\003\000\002F\000\012loadFactorI\000\011thresholdxp?#\000\000\000\000\000\014w\010\000\000\000\020\000\000\000\001t\000'org.grails.plugins.quartz.grailsJobNamet\000.com.pldtglobal.svngateway.ExpirationCheckerJobx\000]
at org.quartz.impl.jdbcjobstore.JobStoreSupport.storeTrigger(JobStoreSupport.java:1241)
at org.quartz.impl.jdbcjobstore.JobStoreSupport$5.execute(JobStoreSupport.java:1147)
at org.quartz.impl.jdbcjobstore.JobStoreSupport$40.execute(JobStoreSupport.java:3670)
at org.quartz.impl.jdbcjobstore.JobStoreCMT.executeInLock(JobStoreCMT.java:242)
at org.quartz.impl.jdbcjobstore.JobStoreSupport.executeInLock(JobStoreSupport.java:3666)
at org.quartz.impl.jdbcjobstore.JobStoreSupport.storeTrigger(JobStoreSupport.java:1143)
at org.quartz.core.QuartzScheduler.scheduleJob(QuartzScheduler.java:790)
at org.quartz.impl.StdScheduler.scheduleJob(StdScheduler.java:254)
at org.quartz.Scheduler$scheduleJob.call(Unknown Source)
at QuartzGrailsPlugin$_closure5_closure24.doCall(QuartzGrailsPlugin.groovy:223)
at QuartzGrailsPlugin$_closure5.doCall(QuartzGrailsPlugin.groovy:218)
at QuartzGrailsPlugin.invokeMethod(QuartzGrailsPlugin.groovy)
at QuartzGrailsPlugin$_closure3_closure21.doCall(QuartzGrailsPlugin.groovy:169)
at QuartzGrailsPlugin$_closure3.doCall(QuartzGrailsPlugin.groovy:167)
... 23 more
Caused by: org.postgresql.util.PSQLException: Bad value for type long : \254\355\000\005sr\000\025org.quartz.JobDataMap\237\260\203\350\277\251\260\313\002\000\000xr\000&org.quartz.utils.StringKeyDirtyFlagMap\202\010\350\303\373\305](\002\000\001Z\000\023allowsTransientDataxr\000\035org.quartz.utils.DirtyFlagMap\023\346.\255(v\012\316\002\000\002Z\000\005dirtyL\000\003mapt\000\017Ljava/util/Map;xp\001sr\000\021java.util.HashMap\005\007\332\301\303\026`\321\003\000\002F\000\012loadFactorI\000\011thresholdxp?#\000\000\000\000\000\014w\010\000\000\000\020\000\000\000\001t\000'org.grails.plugins.quartz.grailsJobNamet\000.com.pldtglobal.svngateway.ExpirationCheckerJobx\000
at org.postgresql.jdbc2.AbstractJdbc2ResultSet.toLong(AbstractJdbc2ResultSet.java:2796)
at org.postgresql.jdbc2.AbstractJdbc2ResultSet.getLong(AbstractJdbc2ResultSet.java:2019)
at org.postgresql.jdbc4.Jdbc4ResultSet.getBlob(Jdbc4ResultSet.java:52)
at org.postgresql.jdbc2.AbstractJdbc2ResultSet.getBlob(AbstractJdbc2ResultSet.java:335)
at org.quartz.impl.jdbcjobstore.StdJDBCDelegate.getObjectFromBlob(StdJDBCDelegate.java:3462)
at org.quartz.impl.jdbcjobstore.StdJDBCDelegate.selectJobDetail(StdJDBCDelegate.java:904)
at org.quartz.impl.jdbcjobstore.JobStoreSupport.storeTrigger(JobStoreSupport.java:1197)
... 36 more
Application context shutting down...
Application context shutdown.
I have no idea where the actual problem is. The code was fine and running before the jobs were saved in the database.

In your grails-app/conf/quartz.properties, replace
org.quartz.jobStore.driverDelegateClass = org.quartz.impl.jdbcjobstore.StdJDBCDelegate
with
org.quartz.jobStore.driverDelegateClass = org.quartz.impl.jdbcjobstore.PostgreSQLDelegate
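The StdJDBCDelegate reads the serialized JobDataMap through ResultSet.getBlob(), which the PostgreSQL driver interprets as a large-object OID (a long), hence "Bad value for type long"; PostgreSQLDelegate reads the bytea column as plain bytes instead. For reference, the surrounding jobStore section would look something like this (the table prefix and data source name are assumptions, adjust to your setup):
org.quartz.jobStore.class = org.quartz.impl.jdbcjobstore.JobStoreTX
org.quartz.jobStore.driverDelegateClass = org.quartz.impl.jdbcjobstore.PostgreSQLDelegate
org.quartz.jobStore.tablePrefix = QRTZ_
org.quartz.jobStore.dataSource = myDS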
I'm getting the same error even using the correct delegate, so no promises.

For Spring Boot, you can also specify the PostgreSQL delegate with the following property in application.properties:
spring.quartz.properties.org.quartz.jobStore.driverDelegateClass=org.quartz.impl.jdbcjobstore.PostgreSQLDelegate
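If you rely on Boot's Quartz auto-configuration, the job store type also has to be JDBC for this to take effect; a minimal application.properties sketch:
spring.quartz.job-store-type=jdbc
spring.quartz.properties.org.quartz.jobStore.driverDelegateClass=org.quartz.impl.jdbcjobstore.PostgreSQLDelegate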

For anyone using Quartz and Spring Boot: I had the same problem after migrating from Quartz running in Tomcat to Spring Boot. In Tomcat, we were using a Quartz properties file and loading it manually when creating the Scheduler. One of those properties was:
org.quartz.jobStore.driverDelegateClass = org.quartz.impl.jdbcjobstore.PostgreSQLDelegate
In Spring Boot, the scheduler is created automatically through auto-configuration, and therefore our properties weren't being applied.
Our solution was to use a SchedulerFactoryBeanCustomizer and set the Quartz properties there. The customizer is applied before the scheduler is created, so it's a good place to configure Quartz.
@Bean
public SchedulerFactoryBeanCustomizer schedulerFactoryBeanCustomizer()
{
    return new SchedulerFactoryBeanCustomizer()
    {
        @Override
        public void customize(SchedulerFactoryBean bean)
        {
            bean.setQuartzProperties(createQuartzProperties());
        }
    };
}

private Properties createQuartzProperties()
{
    // Could also load from a file
    Properties props = new Properties();
    props.put("org.quartz.jobStore.driverDelegateClass", "org.quartz.impl.jdbcjobstore.PostgreSQLDelegate");
    return props;
}
And for reference here is the full quartz.properties we migrated from:
org.quartz.scheduler.instanceName=ProcessAutomation
org.quartz.scheduler.instanceId=AUTO
org.quartz.scheduler.jmx.export=true
org.quartz.threadPool.class=org.quartz.simpl.SimpleThreadPool
org.quartz.threadPool.threadCount=10
org.quartz.threadPool.threadPriority=5
org.quartz.jobStore.class=org.quartz.impl.jdbcjobstore.JobStoreCMT
org.quartz.jobStore.dataSource=QuartzDS
org.quartz.jobStore.nonManagedTXDataSource=springNonTxDataSource.ProcessAutomation
org.quartz.jobStore.driverDelegateClass=org.quartz.impl.jdbcjobstore.PostgreSQLDelegate
org.quartz.jobStore.misfireThreshold=60000
org.quartz.jobStore.isClustered=true
org.quartz.jobStore.clusterCheckinInterval=20000

Another option is to expose the Quartz Properties as a bean, forcing the PostgreSQL delegate on top of properties loaded from a file:
@Bean
public Properties quartzProperties() throws IOException {
    PropertiesFactoryBean propertiesFactoryBean = new PropertiesFactoryBean();
    propertiesFactoryBean.setLocation(new ClassPathResource("application.properties"));
    Properties props = new Properties();
    props.put("org.quartz.jobStore.driverDelegateClass", "org.quartz.impl.jdbcjobstore.PostgreSQLDelegate");
    propertiesFactoryBean.setProperties(props);
    propertiesFactoryBean.afterPropertiesSet();
    return propertiesFactoryBean.getObject();
}
Alternatively, if you want to set all Quartz properties (clustering, thread pool, etc.) without typing them in this method, create a quartz.properties file and use the configuration below:
@Autowired
private QuartzProperties quartzProperties;

@Autowired
private DataSource dataSource;

@Autowired
private ApplicationContext applicationContext; // needed by the job factory below

@Bean
public SchedulerFactoryBean schedulerFactoryBean() throws IOException {
    SchedulerFactoryBean factory = new SchedulerFactoryBean();
    factory.setOverwriteExistingJobs(true);
    factory.setDataSource(dataSource);
    factory.setQuartzProperties(quartzProperties());
    // AutowiringSpringBeanJobFactory is a custom JobFactory that autowires job beans
    AutowiringSpringBeanJobFactory jobFactory = new AutowiringSpringBeanJobFactory();
    jobFactory.setApplicationContext(applicationContext);
    factory.setJobFactory(jobFactory);
    return factory;
}

@Bean
public Properties quartzProperties() throws IOException {
    PropertiesFactoryBean propertiesFactoryBean = new PropertiesFactoryBean();
    propertiesFactoryBean.setLocation(new ClassPathResource("/application.properties"));
    Properties props = new Properties();
    props.putAll(quartzProperties.getProperties());
    propertiesFactoryBean.setProperties(props);
    propertiesFactoryBean.afterPropertiesSet(); // important
    return propertiesFactoryBean.getObject();
}
An example quartz.properties file:
org.quartz.scheduler.instanceName=springBootQuartzApp
org.quartz.scheduler.instanceId=AUTO
org.quartz.threadPool.threadCount=50
org.quartz.jobStore.class=org.quartz.impl.jdbcjobstore.JobStoreTX
org.quartz.jobStore.driverDelegateClass=org.quartz.impl.jdbcjobstore.PostgreSQLDelegate
org.quartz.jobStore.useProperties=true
#org.quartz.jobStore.misfireThreshold=60000
org.quartz.jobStore.isClustered=true
org.quartz.plugin.shutdownHook.class=org.quartz.plugins.management.ShutdownHookPlugin
org.quartz.plugin.shutdownHook.cleanShutdown=TRUE

I also faced this issue, and I just added:
properties.put("org.quartz.jobStore.driverDelegateClass", "org.quartz.impl.jdbcjobstore.PostgreSQLDelegate");
The full bean configuration is below.
@Bean
public SchedulerFactoryBean scheduler(Trigger... triggers) {
    SchedulerFactoryBean schedulerFactory = new SchedulerFactoryBean();
    Properties properties = new Properties();
    properties.setProperty("org.quartz.scheduler.instanceName", "MY_INSTANCE_NAME");
    properties.setProperty("org.quartz.scheduler.instanceId", "INSTANCE_ID_01");
    properties.put("org.quartz.jobStore.driverDelegateClass", "org.quartz.impl.jdbcjobstore.PostgreSQLDelegate");
    schedulerFactory.setOverwriteExistingJobs(true);
    schedulerFactory.setAutoStartup(true);
    schedulerFactory.setQuartzProperties(properties);
    schedulerFactory.setDataSource(dataSource);
    schedulerFactory.setJobFactory(springBeanJobFactory());
    schedulerFactory.setWaitForJobsToCompleteOnShutdown(true);
    if (ArrayUtils.isNotEmpty(triggers)) {
        schedulerFactory.setTriggers(triggers);
    }
    return schedulerFactory;
}

Related

KStream-GlobalKTable-Join using Spring-Cloud-Stream - How to check the content of the GlobalKTable?

I'm implementing a KStream-GlobalKTable join using Spring Cloud Stream, and I'm facing the problem that the join operation doesn't get any matches, though it definitely should. The code looks as follows:
@Component
@EnableBinding(CustomProcessor.class)
public class MyProcessor {
    private static final Log LOGGER = LogFactory.getLog(MyProcessor.class);

    @Autowired
    private InteractiveQueryService interactiveQueryService;

    ReadOnlyKeyValueStore<Object, Object> keyValueStore;

    @StreamListener
    @SendTo(CustomProcessor.OUTPUT)
    public KStream<EventKey, EventEnriched> process(
            @Input(CustomProcessor.INPUT) KStream<EventKey, EventEnriched> inputStream,
            @Input(CustomProcessor.LOOKUP) GlobalKTable<LookupKey, LookupData> lookupStore
    ) {
        keyValueStore = interactiveQueryService.getQueryableStore("lookupStore", QueryableStoreTypes.keyValueStore());
        LOGGER.info("Lookup: " + keyValueStore.get(new LookupKey("google.de")));
        return inputStream.leftJoin(
                lookupStore,
                (inputKey, inputValue) -> {
                    return new LookupKey(inputValue.getDomain().replace("www.", ""));
                },
                this::enrichData
        );
    }

    public EventEnriched enrichData(EventEnriched input, LookupData lookupRecord) {
        ...
    }
}
Here is the CustomProcessor:
public interface CustomProcessor extends KafkaStreamsProcessor {
    String INPUT = "input";
    String OUTPUT = "output";
    String LOOKUP = "lookupTable";

    @Input(CustomProcessor.LOOKUP)
    GlobalKTable<LookupKey, ?> lookupTable();
}
Without the call to
keyValueStore.get(...)
in MyProcessor, the code runs fine, but the GlobalKTable seems to be null. But if I call
LOGGER.info("Lookup: " + keyValueStore.get(new LookupKey("google.de")));
in order to inspect the GlobalKTable, running the application fails with:
Error starting ApplicationContext. To display the conditions report re-run your application with 'debug' enabled.
2019-06-26T09:04:00.000 [ERROR] [main-858] [org.springframework.boot.SpringApplication] [reportFailure:858] Application run failed
org.springframework.beans.factory.BeanInitializationException: Cannot setup StreamListener for public org.apache.kafka.streams.kstream.KStream MyProcessor.process(org.apache.kafka.streams.kstream.KStream,org.apache.kafka.streams.kstream.GlobalKTable); nested exception is java.lang.reflect.InvocationTargetException
at org.springframework.cloud.stream.binder.kafka.streams.KafkaStreamsStreamListenerSetupMethodOrchestrator.orchestrateStreamListenerSetupMethod(KafkaStreamsStreamListenerSetupMethodOrchestrator.java:214)
at org.springframework.cloud.stream.binding.StreamListenerAnnotationBeanPostProcessor.doPostProcess(StreamListenerAnnotationBeanPostProcessor.java:226)
at org.springframework.cloud.stream.binding.StreamListenerAnnotationBeanPostProcessor.lambda$postProcessAfterInitialization$0(StreamListenerAnnotationBeanPostProcessor.java:196)
at java.base/java.lang.Iterable.forEach(Iterable.java:75)
at org.springframework.cloud.stream.binding.StreamListenerAnnotationBeanPostProcessor.injectAndPostProcessDependencies(StreamListenerAnnotationBeanPostProcessor.java:330)
at org.springframework.cloud.stream.binding.StreamListenerAnnotationBeanPostProcessor.afterSingletonsInstantiated(StreamListenerAnnotationBeanPostProcessor.java:113)
at org.springframework.beans.factory.support.DefaultListableBeanFactory.preInstantiateSingletons(DefaultListableBeanFactory.java:866)
at org.springframework.context.support.AbstractApplicationContext.finishBeanFactoryInitialization(AbstractApplicationContext.java:877)
at org.springframework.context.support.AbstractApplicationContext.refresh(AbstractApplicationContext.java:549)
at org.springframework.boot.web.servlet.context.ServletWebServerApplicationContext.refresh(ServletWebServerApplicationContext.java:142)
at org.springframework.boot.SpringApplication.refresh(SpringApplication.java:775)
at org.springframework.boot.SpringApplication.refreshContext(SpringApplication.java:397)
at org.springframework.boot.SpringApplication.run(SpringApplication.java:316)
at org.springframework.boot.SpringApplication.run(SpringApplication.java:1260)
at org.springframework.boot.SpringApplication.run(SpringApplication.java:1248)
at Transformer.main(Transformer.java:31)
Caused by: java.lang.reflect.InvocationTargetException: null
at java.base/jdk.internal.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
at java.base/jdk.internal.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:62)
at java.base/jdk.internal.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
at java.base/java.lang.reflect.Method.invoke(Method.java:566)
at org.springframework.cloud.stream.binder.kafka.streams.KafkaStreamsStreamListenerSetupMethodOrchestrator.orchestrateStreamListenerSetupMethod(KafkaStreamsStreamListenerSetupMethodOrchestrator.java:179)
... 15 common frames omitted
Caused by: java.lang.NullPointerException: null
at MyProcessor.process(MyProcessor.java:62)
... 20 common frames omitted
Process finished with exit code 1
Does anybody see a problem in the code? How can I inspect the content of the GlobalKTable?
Best regards
Martin
Now I'm getting closer to the problem. I have tried to query the lookupStore. If I use
final ReadOnlyKeyValueStore<LookupKey, LookupData> lookupStore =
        interactiveQueryService.getQueryableStore("myStore", QueryableStoreTypes.<LookupKey, LookupData>keyValueStore());
Then
lookupStore.get(key)
never returns a value. But if I create a HashMap like this:
final KeyValueIterator<LookupKey, LookupData> lookups = lookupStore.all();
Map<LookupKey, LookupData> lookupMap = new HashMap<>();
while (lookups.hasNext()) {
    KeyValue<LookupKey, LookupData> nextLookup = lookups.next();
    lookupMap.put(nextLookup.key, nextLookup.value);
}
lookups.close();
the HashMap contains the correct data and returns the correct value for each key. But the GlobalKTable itself cannot be joined for some reason; it never gets any matches.

Apply Side input to BigQueryIO.read operation in Apache Beam

Is there a way to apply a side input to a BigQueryIO.read() operation in Apache Beam?
Say for example I have a value in a PCollection that I want to use in a query to fetch data from a BigQuery table. Is this possible using side input? Or should something else be used in such a case?
I used NestedValueProvider in a similar case, but I guess we can use that only when a certain value depends on a runtime value. Or can I use the same thing here? Please correct me if I'm wrong.
The code that I tried:
Bigquery bigQueryClient = start_pipeline.newBigQueryClient(options.as(BigQueryOptions.class)).build();
Tabledata tableRequest = bigQueryClient.tabledata();

PCollection<TableRow> existingData = readData.apply("Read existing data", ParDo.of(new DoFn<String, TableRow>() {
    @ProcessElement
    public void processElement(ProcessContext c) throws IOException
    {
        List<TableRow> list = c.sideInput(bqDataView);
        String tableName = list.get(0).get("table").toString();
        TableDataList table = tableRequest.list("projectID", "DatasetID", tableName).execute();
        for (TableRow row : table.getRows())
        {
            c.output(row);
        }
    }
}).withSideInputs(bqDataView));
The error that I get is:
Exception in thread "main" java.lang.IllegalArgumentException: unable to serialize BeamTest.StarterPipeline$1#86b455
at org.apache.beam.sdk.util.SerializableUtils.serializeToByteArray(SerializableUtils.java:53)
at org.apache.beam.sdk.util.SerializableUtils.clone(SerializableUtils.java:90)
at org.apache.beam.sdk.transforms.ParDo$SingleOutput.<init>(ParDo.java:569)
at org.apache.beam.sdk.transforms.ParDo.of(ParDo.java:434)
at BeamTest.StarterPipeline.main(StarterPipeline.java:158)
Caused by: java.io.NotSerializableException: com.google.api.services.bigquery.Bigquery$Tabledata
at java.io.ObjectOutputStream.writeObject0(Unknown Source)
at java.io.ObjectOutputStream.defaultWriteFields(Unknown Source)
at java.io.ObjectOutputStream.writeSerialData(Unknown Source)
at java.io.ObjectOutputStream.writeOrdinaryObject(Unknown Source)
at java.io.ObjectOutputStream.writeObject0(Unknown Source)
at java.io.ObjectOutputStream.writeObject(Unknown Source)
at org.apache.beam.sdk.util.SerializableUtils.serializeToByteArray(SerializableUtils.java:49)
... 4 more
The Beam model does not currently support this kind of data-dependent operation very well.
A way of doing it is to code your own DoFn that receives the side input and connects directly to BQ. Unfortunately, this would not give you any parallelism, as the DoFn would run completely on the same thread.
Once Splittable DoFns are supported in Beam, this will be a different story.
In the current state of the world, you would need to use the BQ client library to add code that would query BQ as if you were not in a Beam pipeline.
Given the code in your question, a rough idea of how to implement this is the following:
class ReadDataDoFn extends DoFn<String, TableRow> {
    private final PCollectionView<List<TableRow>> bqDataView; // side input view, passed in so it is in scope
    private Tabledata tableRequest;
    private Bigquery bigQueryClient;

    ReadDataDoFn(PCollectionView<List<TableRow>> bqDataView) {
        this.bqDataView = bqDataView;
    }

    private Bigquery createBigQueryClientWithinDoFn() {
        // I'm not sure how you'd implement this, but you had the right idea
        return null; // placeholder
    }

    @Setup
    public void setup() {
        bigQueryClient = createBigQueryClientWithinDoFn();
        tableRequest = bigQueryClient.tabledata();
    }

    @ProcessElement
    public void processElement(ProcessContext c) throws IOException {
        List<TableRow> list = c.sideInput(bqDataView);
        String tableName = list.get(0).get("table").toString();
        TableDataList table = tableRequest.list("projectID", "DatasetID", tableName).execute();
        for (TableRow row : table.getRows()) {
            c.output(row);
        }
    }
}

PCollection<TableRow> existingData = readData.apply("Read existing data",
        ParDo.of(new ReadDataDoFn(bqDataView)).withSideInputs(bqDataView)); // side inputs must be registered on the ParDo
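For completeness, the bqDataView used above would be a PCollectionView materialized from the PCollection that carries the table name, for example (a sketch; tableNameRows is an assumed PCollection<TableRow> from the question's context):
PCollectionView<List<TableRow>> bqDataView = tableNameRows.apply("Materialize side input", View.<TableRow>asList());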

Spring Boot Application with Hazelcast Backed Spring Session Serialization Exception on Active Directory Login Failure

We have a Spring Boot application using a Hazelcast-backed Spring Session. The application authenticates with Active Directory using Spring Security. If a user attempts to log in with invalid credentials, a serialization error is thrown:
com.hazelcast.nio.serialization.HazelcastSerializationException: java.io.NotSerializableException: com.sun.jndi.ldap.LdapCtx
at com.hazelcast.nio.serialization.SerializationServiceImpl.handleException(SerializationServiceImpl.java:380)
at com.hazelcast.nio.serialization.SerializationServiceImpl.toData(SerializationServiceImpl.java:235)
at com.hazelcast.nio.serialization.SerializationServiceImpl.toData(SerializationServiceImpl.java:207)
at com.hazelcast.map.impl.MapServiceContextImpl.toData(MapServiceContextImpl.java:338)
at com.hazelcast.map.impl.proxy.MapProxySupport.toData(MapProxySupport.java:1160)
at com.hazelcast.map.impl.proxy.MapProxyImpl.put(MapProxyImpl.java:96)
at org.springframework.session.hazelcast.config.annotation.web.http.HazelcastHttpSessionConfiguration$ExpiringSessionMap.put(HazelcastHttpSessionConfiguration.java:112)
at org.springframework.session.hazelcast.config.annotation.web.http.HazelcastHttpSessionConfiguration$ExpiringSessionMap.put(HazelcastHttpSessionConfiguration.java:102)
at org.springframework.session.MapSessionRepository.save(MapSessionRepository.java:72)
at org.springframework.session.MapSessionRepository.save(MapSessionRepository.java:36)
at org.springframework.session.web.http.SessionRepositoryFilter$SessionRepositoryRequestWrapper.commitSession(SessionRepositoryFilter.java:194)
at org.springframework.session.web.http.SessionRepositoryFilter$SessionRepositoryRequestWrapper.access$100(SessionRepositoryFilter.java:170)
at org.springframework.session.web.http.SessionRepositoryFilter.doFilterInternal(SessionRepositoryFilter.java:128)
at org.springframework.session.web.http.OncePerRequestFilter.doFilter(OncePerRequestFilter.java:65)
at org.apache.catalina.core.ApplicationFilterChain.internalDoFilter(ApplicationFilterChain.java:240)
at org.apache.catalina.core.ApplicationFilterChain.doFilter(ApplicationFilterChain.java:207)
at org.springframework.web.filter.CharacterEncodingFilter.doFilterInternal(CharacterEncodingFilter.java:121)
at org.springframework.web.filter.OncePerRequestFilter.doFilter(OncePerRequestFilter.java:107)
at org.apache.catalina.core.ApplicationFilterChain.internalDoFilter(ApplicationFilterChain.java:240)
at org.apache.catalina.core.ApplicationFilterChain.doFilter(ApplicationFilterChain.java:207)
at org.springframework.boot.actuate.autoconfigure.MetricsFilter.doFilterInternal(MetricsFilter.java:103)
at org.springframework.web.filter.OncePerRequestFilter.doFilter(OncePerRequestFilter.java:107)
at org.apache.catalina.core.ApplicationFilterChain.internalDoFilter(ApplicationFilterChain.java:240)
at org.apache.catalina.core.ApplicationFilterChain.doFilter(ApplicationFilterChain.java:207)
at org.apache.catalina.core.StandardWrapperValve.invoke(StandardWrapperValve.java:212)
at org.apache.catalina.core.StandardContextValve.invoke(StandardContextValve.java:106)
at org.apache.catalina.authenticator.AuthenticatorBase.invoke(AuthenticatorBase.java:502)
at org.apache.catalina.core.StandardHostValve.invoke(StandardHostValve.java:141)
at org.apache.catalina.valves.ErrorReportValve.invoke(ErrorReportValve.java:79)
at org.apache.catalina.core.StandardEngineValve.invoke(StandardEngineValve.java:88)
at org.apache.catalina.connector.CoyoteAdapter.service(CoyoteAdapter.java:522)
at org.apache.coyote.ajp.AbstractAjpProcessor.process(AbstractAjpProcessor.java:868)
at org.apache.coyote.AbstractProtocol$AbstractConnectionHandler.process(AbstractProtocol.java:672)
at org.apache.tomcat.util.net.NioEndpoint$SocketProcessor.doRun(NioEndpoint.java:1502)
at org.apache.tomcat.util.net.NioEndpoint$SocketProcessor.run(NioEndpoint.java:1458)
at java.util.concurrent.ThreadPoolExecutor.runWorker(ThreadPoolExecutor.java:1142)
at java.util.concurrent.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:617)
at org.apache.tomcat.util.threads.TaskThread$WrappingRunnable.run(TaskThread.java:61)
at java.lang.Thread.run(Thread.java:745)
This appears to be identical to another issue (Spring Boot with Session/Redis Serialization Error with Bad Active Directory Ldap Credentials) with Redis; however, there doesn't appear to be a similar mechanism to control serialization in the Hazelcast session mapping as there is for Redis in Spring Session.
We've come up with a workaround (below), but it seems less than ideal: we extend HazelcastHttpSessionConfiguration to get at the ExpiringSessionMap and remove the LdapCtx before serialization is attempted. HazelcastHttpSessionConfiguration doesn't really lend itself to extension, so this requires duplicating code, and it seems like there should be a cleaner way that we aren't seeing.
Is there a better solution that we're missing?
@Configuration
public class CustomHazelcastHttpSessionMapConfiguration extends HazelcastHttpSessionConfiguration {
    private String sessionMapName = "spring:session:sessions";
    private int maxInactiveIntervalInSeconds = 1800;

    @Bean
    public SessionRepository<ExpiringSession> sessionRepository(
            HazelcastInstance hazelcastInstance, SessionEntryListener sessionListener) {
        super.sessionRepository(hazelcastInstance, sessionListener);
        MapSessionRepository sessionRepository = new MapSessionRepository(
                new CustomExpiringSessionMap(hazelcastInstance.getMap(this.sessionMapName)));
        sessionRepository.setDefaultMaxInactiveInterval(this.maxInactiveIntervalInSeconds);
        return sessionRepository;
    }

    @Override
    public void setSessionMapName(String sessionMapName) {
        this.sessionMapName = sessionMapName;
        super.setSessionMapName(sessionMapName);
    }

    @Override
    public void setMaxInactiveIntervalInSeconds(int maxInactiveIntervalInSeconds) {
        this.maxInactiveIntervalInSeconds = maxInactiveIntervalInSeconds;
        super.setMaxInactiveIntervalInSeconds(maxInactiveIntervalInSeconds);
    }

    static class CustomExpiringSessionMap implements Map<String, ExpiringSession> {
        private IMap<String, ExpiringSession> delegate;

        CustomExpiringSessionMap(IMap<String, ExpiringSession> delegate) {
            this.delegate = delegate;
        }

        public ExpiringSession put(String key, ExpiringSession value) {
            if (value == null) {
                return this.delegate.put(key, value);
            }
            for (String attrName : value.getAttributeNames()) {
                Object attrVal = value.getAttribute(attrName);
                // Don't serialize LdapCtx in a BadCredentialsException
                if (attrVal instanceof BadCredentialsException &&
                        ((BadCredentialsException) attrVal).getCause() != null &&
                        ((BadCredentialsException) attrVal).getCause() instanceof ActiveDirectoryAuthenticationException &&
                        ((BadCredentialsException) attrVal).getCause().getCause() != null &&
                        ((BadCredentialsException) attrVal).getCause().getCause() instanceof javax.naming.AuthenticationException) {
                    ((javax.naming.AuthenticationException) ((BadCredentialsException) attrVal).getCause().getCause()).setResolvedObj(null);
                }
            }
            return this.delegate.put(key, value, value.getMaxInactiveIntervalInSeconds(),
                    TimeUnit.SECONDS);
        }

        /* ... copy and paste of the rest of ExpiringSessionMap */
    }
}
You should configure custom serialization for the object(s) you're having issues with.
That way you address the problem in the Hazelcast configuration without extending or duplicating Spring Session's Hazelcast configuration.
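A minimal sketch of that approach, assuming Hazelcast 3.x as in the stack trace (the type id and the drop-on-write behaviour are illustrative choices, not a prescribed recipe): register a no-op serializer for the offending class so Hazelcast never attempts Java serialization on it.
// Hypothetical no-op serializer: drops the unserializable LdapCtx instead of failing.
public class LdapCtxSerializer implements ByteArraySerializer<Object> {
    @Override
    public byte[] write(Object object) { return new byte[0]; } // discard the context
    @Override
    public Object read(byte[] buffer) { return null; }         // nothing to restore
    @Override
    public int getTypeId() { return 12345; }                   // any unused positive id
    @Override
    public void destroy() { }
}
It would then be registered on the Hazelcast Config; using the class name avoids compiling against the internal com.sun.jndi.ldap.LdapCtx class:
Config config = new Config();
config.getSerializationConfig().addSerializerConfig(
        new SerializerConfig()
                .setTypeClassName("com.sun.jndi.ldap.LdapCtx")
                .setImplementation(new LdapCtxSerializer()));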
A cleaner solution would be transient attributes.
If you have a web filter, you can pass it a list of properties to control its behaviour; this one is a comma-separated list of attribute names to exclude from serialization.
DM me if you need more info.

NullPointerException while resolving navigation case on a ViewExpiredException

I've created a custom exception handler which should navigate to a specific view on a ViewExpiredException.
ExceptionQueuedEventContext context = (ExceptionQueuedEventContext) event.getSource();
Throwable t = context.getException();
if (t instanceof ViewExpiredException) {
    ViewExpiredException v = (ViewExpiredException) t;
    FacesContext fc = FacesContext.getCurrentInstance();
    Map<String, Object> requestMap = fc.getExternalContext().getRequestMap();
    NavigationHandler nav = fc.getApplication().getNavigationHandler();
    try {
        requestMap.put("currentViewId", v.getViewId());
        nav.handleNavigation(fc, "*", "viewExpired" + "?faces-redirect=true");
        fc.renderResponse();
However, it throws the following exception on the line nav.handleNavigation():
java.lang.NullPointerException
at org.apache.myfaces.application.NavigationHandlerImpl.getNavigationCase(NavigationHandlerImpl.java:203)
at org.apache.myfaces.application.NavigationHandlerImpl.handleNavigation(NavigationHandlerImpl.java:77)
at com.daimler.esr.ui.exception.DefaultExceptionHandler.handle(DefaultExceptionHandler.java:55)
at org.apache.myfaces.lifecycle.LifecycleImpl.executePhase(LifecycleImpl.java:191)
at org.apache.myfaces.lifecycle.LifecycleImpl.execute(LifecycleImpl.java:118)
at javax.faces.webapp.FacesServlet.service(FacesServlet.java:189)
at com.ibm.ws.webcontainer.servlet.ServletWrapper.service(ServletWrapper.java:1188)
at com.ibm.ws.webcontainer.servlet.ServletWrapper.handleRequest(ServletWrapper.java:763)
at com.ibm.ws.webcontainer.servlet.ServletWrapper.handleRequest(ServletWrapper.java:454)
at com.ibm.ws.webcontainer.servlet.ServletWrapperImpl.handleRequest(ServletWrapperImpl.java:178)
at com.ibm.ws.webcontainer.filter.WebAppFilterChain.invokeTarget(WebAppFilterChain.java:125)
at com.ibm.ws.webcontainer.filter.WebAppFilterChain.doFilter(WebAppFilterChain.java:92)
at org.primefaces.webapp.filter.FileUploadFilter.doFilter(FileUploadFilter.java:79)
at com.ibm.ws.webcontainer.filter.FilterInstanceWrapper.doFilter(FilterInstanceWrapper.java:192)
at com.ibm.ws.webcontainer.filter.WebAppFilterChain.doFilter(WebAppFilterChain.java:89)
at com.ibm.ws.webcontainer.filter.WebAppFilterManager.doFilter(WebAppFilterManager.java:919)
at com.ibm.ws.webcontainer.filter.WebAppFilterManager.invokeFilters(WebAppFilterManager.java:1016)
at com.ibm.ws.webcontainer.webapp.WebApp.handleRequest(WebApp.java:3703)
at com.ibm.ws.webcontainer.webapp.WebGroup.handleRequest(WebGroup.java:304)
at com.ibm.ws.webcontainer.WebContainer.handleRequest(WebContainer.java:962)
at com.ibm.ws.webcontainer.WSWebContainer.handleRequest(WSWebContainer.java:1662)
at com.ibm.ws.webcontainer.channel.WCChannelLink.ready(WCChannelLink.java:195)
at com.ibm.ws.http.channel.inbound.impl.HttpInboundLink.handleDiscrimination(HttpInboundLink.java:452)
at com.ibm.ws.http.channel.inbound.impl.HttpInboundLink.handleNewRequest(HttpInboundLink.java:511)
at com.ibm.ws.http.channel.inbound.impl.HttpInboundLink.processRequest(HttpInboundLink.java:305)
at com.ibm.ws.http.channel.inbound.impl.HttpInboundLink.ready(HttpInboundLink.java:276)
I'm using MyFaces and PrimeFaces 3.4.
Don't specify a "from" of "*"; instead, make it null.
nav.handleNavigation(fc, null, "viewExpired?faces-redirect=true");
When you specify a non-null "from", the current view ID needs to be determined in order to find the associated navigation case, even when the "from" is a wildcard like "*". As the current view is expired (you got a ViewExpiredException, right?), it's not available anymore, hence the NullPointerException in the internal code on the line context.getViewRoot().getViewId().

UriFragmentUtility causes servlet exception

As I was following this Vaadin tutorial on how to properly use the UriFragmentUtility (https://vaadin.com/book/-/page/advanced.urifu.html), I ended up creating the object, but after trying to add the component to my main window, it fails with the following exception:
SEVERE: Servlet.service() for servlet [Dugsi_Manager Vaadin Application Servlet] in context with path [/Dugsi_Manager] threw exception [java.lang.UnsupportedOperationException] with root cause
java.lang.UnsupportedOperationException
com.vaadin.ui.CustomComponent.addComponent(CustomComponent.java:218)
com.vaadin.ui.Panel.addComponent(Panel.java:301)
com.vaadin.ui.Window.addComponent(Window.java:281)
org.bixin.dugsi.web.DugsiManagerApplication.init(DugsiManagerApplication.java:44)
com.vaadin.Application.start(Application.java:554)
com.vaadin.terminal.gwt.server.AbstractApplicationServlet.startApplication(AbstractApplicationServlet.java:1213)
com.vaadin.terminal.gwt.server.AbstractApplicationServlet.service(AbstractApplicationServlet.java:484)
javax.servlet.http.HttpServlet.service(HttpServlet.java:722)
org.apache.shiro.web.servlet.AbstractShiroFilter.executeChain(AbstractShiroFilter.java:359)
org.apache.shiro.web.servlet.AbstractShiroFilter$1.call(AbstractShiroFilter.java:275)
org.apache.shiro.subject.support.SubjectCallable.doCall(SubjectCallable.java:90)
org.apache.shiro.subject.support.SubjectCallable.call(SubjectCallable.java:83)
org.apache.shiro.subject.support.DelegatingSubject.execute(DelegatingSubject.java:344)
org.apache.shiro.web.servlet.AbstractShiroFilter.doFilterInternal(AbstractShiroFilter.java:272)
org.apache.shiro.web.servlet.OncePerRequestFilter.doFilter(OncePerRequestFilter.java:81)
org.springframework.web.filter.DelegatingFilterProxy.invokeDelegate(DelegatingFilterProxy.java:237)
org.springframework.web.filter.DelegatingFilterProxy.doFilter(DelegatingFilterProxy.java:167)
Here is how I added the object in my main application class:
// ThreadLocal to create instances of our application
private static ThreadLocal<DugsiManagerApplication> threadLocal = new ThreadLocal<DugsiManagerApplication>();

@Override
public void init() {
    setInstance(this); // immediate access to the app
    //Window homeWindow = createNewWindow();
    Subject currentUser = SecurityUtils.getSubject();
    // Create the URI fragment utility
    Window window = createLoginWindow();
    setMainWindow(window);
    final UriFragmentUtility urifu = new UriFragmentUtility();
    window.addComponent(urifu);
}
The tutorial talks about the primary part of the URI (address + path + optional query parameters); my path is set as /Dugsi_Manager (web.xml). Shouldn't the URL then, after adding the urifu object, start as https://localhost:8080/Dugsi_Manager#login?
Edit: Added the declaration of the LoginWindow:
public Window createLoginWindow() {
    final Window loginWindow = new LoginWindow();
    // remove the window if closed to avoid memory leaks
    loginWindow.addListener(new CloseListener() {
        @Override
        public void windowClose(CloseEvent e) {
            if (getMainWindow() != loginWindow) {
                DugsiManagerApplication.this.removeWindow(loginWindow);
            }
        }
    });
    return loginWindow;
}
It seems the UriFragmentUtility object can be added to a standard Vaadin Window but does not work on a window created with my createLoginWindow function. I cannot figure out why.
The exception is thrown by CustomComponent's addComponent method, so I guess a CustomComponent is the content of the window. To fix the problem, add your UriFragmentUtility directly to the layout that is the CustomComponent's composition root instead of
window.addComponent(urifu);
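For example, if the LoginWindow's content is a CustomComponent built around a layout, the utility could be added inside that component (a sketch; the layout and its contents are assumptions):
// Inside the CustomComponent subclass used as the login window's content:
VerticalLayout root = new VerticalLayout();
root.addComponent(new Label("login form here")); // hypothetical existing content
root.addComponent(new UriFragmentUtility());     // goes on the composition root, not the window
setCompositionRoot(root);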
