Google Cloud Dataflow: INTERNAL: GOAWAY received

We are using Google Cloud Dataflow to build a cache of data that keeps our web requests fast. The dataset, and how it gets grouped together, is somewhat out of our control, so we are doing some fairly unorthodox things. In any case, we have been getting the error below every once in a while, but the job will sometimes continue to run and succeed.
Job ID: 2017-12-19_22_30_10-4314752451342817881
java.lang.RuntimeException: org.apache.beam.sdk.util.UserCodeException: java.lang.RuntimeException: unexpected
at com.google.cloud.dataflow.worker.GroupAlsoByWindowsParDoFn$1.output(GroupAlsoByWindowsParDoFn.java:182)
at com.google.cloud.dataflow.worker.GroupAlsoByWindowFnRunner$1.outputWindowedValue(GroupAlsoByWindowFnRunner.java:104)
at com.google.cloud.dataflow.worker.util.BatchGroupAlsoByWindowViaIteratorsFn.processElement(BatchGroupAlsoByWindowViaIteratorsFn.java:121)
at com.google.cloud.dataflow.worker.util.BatchGroupAlsoByWindowViaIteratorsFn.processElement(BatchGroupAlsoByWindowViaIteratorsFn.java:53)
at com.google.cloud.dataflow.worker.GroupAlsoByWindowFnRunner.invokeProcessElement(GroupAlsoByWindowFnRunner.java:117)
at com.google.cloud.dataflow.worker.GroupAlsoByWindowFnRunner.processElement(GroupAlsoByWindowFnRunner.java:74)
at com.google.cloud.dataflow.worker.GroupAlsoByWindowsParDoFn.processElement(GroupAlsoByWindowsParDoFn.java:113)
at com.google.cloud.dataflow.worker.util.common.worker.ParDoOperation.process(ParDoOperation.java:48)
at com.google.cloud.dataflow.worker.util.common.worker.OutputReceiver.process(OutputReceiver.java:52)
at com.google.cloud.dataflow.worker.util.common.worker.ReadOperation.runReadLoop(ReadOperation.java:187)
at com.google.cloud.dataflow.worker.util.common.worker.ReadOperation.start(ReadOperation.java:148)
at com.google.cloud.dataflow.worker.util.common.worker.MapTaskExecutor.execute(MapTaskExecutor.java:68)
at com.google.cloud.dataflow.worker.DataflowWorker.executeWork(DataflowWorker.java:330)
at com.google.cloud.dataflow.worker.DataflowWorker.doWork(DataflowWorker.java:302)
at com.google.cloud.dataflow.worker.DataflowWorker.getAndPerformWork(DataflowWorker.java:251)
at com.google.cloud.dataflow.worker.DataflowBatchWorkerHarness$WorkerThread.doWork(DataflowBatchWorkerHarness.java:135)
at com.google.cloud.dataflow.worker.DataflowBatchWorkerHarness$WorkerThread.call(DataflowBatchWorkerHarness.java:115)
at com.google.cloud.dataflow.worker.DataflowBatchWorkerHarness$WorkerThread.call(DataflowBatchWorkerHarness.java:102)
at java.util.concurrent.FutureTask.run(FutureTask.java:266)
at java.util.concurrent.ThreadPoolExecutor.runWorker(ThreadPoolExecutor.java:1142)
at java.util.concurrent.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:617)
at java.lang.Thread.run(Thread.java:745)
Caused by: org.apache.beam.sdk.util.UserCodeException: java.lang.RuntimeException: unexpected
at org.apache.beam.sdk.util.UserCodeException.wrap(UserCodeException.java:36)
at com.monsanto.product360.beam.dataflow.materialviews.groupers.CabnAuspGrouper$DoFnInvoker.invokeProcessElement(Unknown Source)
at org.apache.beam.runners.core.SimpleDoFnRunner.invokeProcessElement(SimpleDoFnRunner.java:177)
at org.apache.beam.runners.core.SimpleDoFnRunner.processElement(SimpleDoFnRunner.java:141)
at com.google.cloud.dataflow.worker.SimpleParDoFn.processElement(SimpleParDoFn.java:324)
at com.google.cloud.dataflow.worker.util.common.worker.ParDoOperation.process(ParDoOperation.java:48)
at com.google.cloud.dataflow.worker.util.common.worker.OutputReceiver.process(OutputReceiver.java:52)
at com.google.cloud.dataflow.worker.SimpleParDoFn$1.output(SimpleParDoFn.java:272)
at org.apache.beam.runners.core.SimpleDoFnRunner.outputWindowedValue(SimpleDoFnRunner.java:211)
at org.apache.beam.runners.core.SimpleDoFnRunner.access$700(SimpleDoFnRunner.java:66)
at org.apache.beam.runners.core.SimpleDoFnRunner$DoFnProcessContext.output(SimpleDoFnRunner.java:436)
at org.apache.beam.runners.core.SimpleDoFnRunner$DoFnProcessContext.output(SimpleDoFnRunner.java:424)
at org.apache.beam.sdk.transforms.join.CoGroupByKey$ConstructCoGbkResultFn.processElement(CoGroupByKey.java:206)
at org.apache.beam.sdk.transforms.join.CoGroupByKey$ConstructCoGbkResultFn$DoFnInvoker.invokeProcessElement(Unknown Source)
at org.apache.beam.runners.core.SimpleDoFnRunner.invokeProcessElement(SimpleDoFnRunner.java:177)
at org.apache.beam.runners.core.SimpleDoFnRunner.processElement(SimpleDoFnRunner.java:141)
at com.google.cloud.dataflow.worker.SimpleParDoFn.processElement(SimpleParDoFn.java:324)
at com.google.cloud.dataflow.worker.util.common.worker.ParDoOperation.process(ParDoOperation.java:48)
at com.google.cloud.dataflow.worker.util.common.worker.OutputReceiver.process(OutputReceiver.java:52)
at com.google.cloud.dataflow.worker.GroupAlsoByWindowsParDoFn$1.output(GroupAlsoByWindowsParDoFn.java:180)
... 21 more
Caused by: java.lang.RuntimeException: unexpected
at com.google.cloud.dataflow.worker.util.common.worker.CachingShuffleBatchReader.read(CachingShuffleBatchReader.java:79)
at com.google.cloud.dataflow.worker.util.common.worker.BatchingShuffleEntryReader$ShuffleReadIterator.fillEntries(BatchingShuffleEntryReader.java:133)
at com.google.cloud.dataflow.worker.util.common.worker.BatchingShuffleEntryReader$ShuffleReadIterator.fillEntriesIfNeeded(BatchingShuffleEntryReader.java:126)
at com.google.cloud.dataflow.worker.util.common.worker.BatchingShuffleEntryReader$ShuffleReadIterator.hasNext(BatchingShuffleEntryReader.java:90)
at com.google.cloud.dataflow.worker.util.common.ForwardingReiterator.hasNext(ForwardingReiterator.java:62)
at com.google.cloud.dataflow.worker.util.common.worker.GroupingShuffleEntryIterator$ValuesIterator.advance(GroupingShuffleEntryIterator.java:283)
at com.google.cloud.dataflow.worker.util.common.worker.GroupingShuffleEntryIterator$ValuesIterator.hasNext(GroupingShuffleEntryIterator.java:278)
at com.google.cloud.dataflow.worker.GroupingShuffleReader$GroupingShuffleReaderIterator$ValuesIterator.hasNext(GroupingShuffleReader.java:357)
at org.apache.beam.runners.core.PeekingReiterator.computeNext(PeekingReiterator.java:94)
at org.apache.beam.runners.core.PeekingReiterator.hasNext(PeekingReiterator.java:48)
at com.google.cloud.dataflow.worker.util.BatchGroupAlsoByWindowViaIteratorsFn$WindowReiterator.skipToValidElement(BatchGroupAlsoByWindowViaIteratorsFn.java:226)
at com.google.cloud.dataflow.worker.util.BatchGroupAlsoByWindowViaIteratorsFn$WindowReiterator.hasNext(BatchGroupAlsoByWindowViaIteratorsFn.java:202)
at org.apache.beam.sdk.repackaged.com.google.common.collect.Iterators$PeekingImpl.hasNext(Iterators.java:1105)
at org.apache.beam.sdk.transforms.join.CoGbkResult$UnionValueIterator.advance(CoGbkResult.java:430)
at org.apache.beam.sdk.transforms.join.CoGbkResult$UnionValueIterator.hasNext(CoGbkResult.java:407)
at org.apache.beam.sdk.repackaged.com.google.common.collect.MultitransformedIterator.hasNext(MultitransformedIterator.java:47)
at com.monsanto.product360.beam.dataflow.materialviews.groupers.CabnAuspGrouper.processElement(CabnAuspGrouper.java:35)
Caused by: java.util.concurrent.ExecutionException: java.io.IOException: INTERNAL: GOAWAY received
at com.google.cloud.dataflow.worker.repackaged.com.google.common.util.concurrent.AbstractFuture.getDoneValue(AbstractFuture.java:500)
at com.google.cloud.dataflow.worker.repackaged.com.google.common.util.concurrent.AbstractFuture.get(AbstractFuture.java:459)
at com.google.cloud.dataflow.worker.repackaged.com.google.common.util.concurrent.AbstractFuture$TrustedFuture.get(AbstractFuture.java:76)
at com.google.cloud.dataflow.worker.repackaged.com.google.common.util.concurrent.Uninterruptibles.getUninterruptibly(Uninterruptibles.java:142)
at com.google.cloud.dataflow.worker.repackaged.com.google.common.cache.LocalCache$Segment.getAndRecordStats(LocalCache.java:2373)
at com.google.cloud.dataflow.worker.repackaged.com.google.common.cache.LocalCache$Segment.loadSync(LocalCache.java:2337)
at com.google.cloud.dataflow.worker.repackaged.com.google.common.cache.LocalCache$Segment.lockedGetOrLoad(LocalCache.java:2295)
at com.google.cloud.dataflow.worker.repackaged.com.google.common.cache.LocalCache$Segment.get(LocalCache.java:2208)
at com.google.cloud.dataflow.worker.repackaged.com.google.common.cache.LocalCache.get(LocalCache.java:4053)
at com.google.cloud.dataflow.worker.repackaged.com.google.common.cache.LocalCache.getOrLoad(LocalCache.java:4057)
at com.google.cloud.dataflow.worker.repackaged.com.google.common.cache.LocalCache$LocalLoadingCache.get(LocalCache.java:4986)
at com.google.cloud.dataflow.worker.util.common.worker.CachingShuffleBatchReader.read(CachingShuffleBatchReader.java:76)
at com.google.cloud.dataflow.worker.util.common.worker.BatchingShuffleEntryReader$ShuffleReadIterator.fillEntries(BatchingShuffleEntryReader.java:133)
at com.google.cloud.dataflow.worker.util.common.worker.BatchingShuffleEntryReader$ShuffleReadIterator.fillEntriesIfNeeded(BatchingShuffleEntryReader.java:126)
at com.google.cloud.dataflow.worker.util.common.worker.BatchingShuffleEntryReader$ShuffleReadIterator.hasNext(BatchingShuffleEntryReader.java:90)
at com.google.cloud.dataflow.worker.util.common.ForwardingReiterator.hasNext(ForwardingReiterator.java:62)
at com.google.cloud.dataflow.worker.util.common.worker.GroupingShuffleEntryIterator$ValuesIterator.advance(GroupingShuffleEntryIterator.java:283)
at com.google.cloud.dataflow.worker.util.common.worker.GroupingShuffleEntryIterator$ValuesIterator.hasNext(GroupingShuffleEntryIterator.java:278)
at com.google.cloud.dataflow.worker.GroupingShuffleReader$GroupingShuffleReaderIterator$ValuesIterator.hasNext(GroupingShuffleReader.java:357)
at org.apache.beam.runners.core.PeekingReiterator.computeNext(PeekingReiterator.java:94)
at org.apache.beam.runners.core.PeekingReiterator.hasNext(PeekingReiterator.java:48)
at com.google.cloud.dataflow.worker.util.BatchGroupAlsoByWindowViaIteratorsFn$WindowReiterator.skipToValidElement(BatchGroupAlsoByWindowViaIteratorsFn.java:226)
at com.google.cloud.dataflow.worker.util.BatchGroupAlsoByWindowViaIteratorsFn$WindowReiterator.hasNext(BatchGroupAlsoByWindowViaIteratorsFn.java:202)
at org.apache.beam.sdk.repackaged.com.google.common.collect.Iterators$PeekingImpl.hasNext(Iterators.java:1105)
at org.apache.beam.sdk.transforms.join.CoGbkResult$UnionValueIterator.advance(CoGbkResult.java:430)
at org.apache.beam.sdk.transforms.join.CoGbkResult$UnionValueIterator.hasNext(CoGbkResult.java:407)
at org.apache.beam.sdk.repackaged.com.google.common.collect.MultitransformedIterator.hasNext(MultitransformedIterator.java:47)
at com.monsanto.product360.beam.dataflow.materialviews.groupers.CabnAuspGrouper.processElement(CabnAuspGrouper.java:35)
at com.monsanto.product360.beam.dataflow.materialviews.groupers.CabnAuspGrouper$DoFnInvoker.invokeProcessElement(Unknown Source)
at org.apache.beam.runners.core.SimpleDoFnRunner.invokeProcessElement(SimpleDoFnRunner.java:177)
at org.apache.beam.runners.core.SimpleDoFnRunner.processElement(SimpleDoFnRunner.java:141)
at com.google.cloud.dataflow.worker.SimpleParDoFn.processElement(SimpleParDoFn.java:324)
at com.google.cloud.dataflow.worker.util.common.worker.ParDoOperation.process(ParDoOperation.java:48)
at com.google.cloud.dataflow.worker.util.common.worker.OutputReceiver.process(OutputReceiver.java:52)
at com.google.cloud.dataflow.worker.SimpleParDoFn$1.output(SimpleParDoFn.java:272)
at org.apache.beam.runners.core.SimpleDoFnRunner.outputWindowedValue(SimpleDoFnRunner.java:211)
at org.apache.beam.runners.core.SimpleDoFnRunner.access$700(SimpleDoFnRunner.java:66)
at org.apache.beam.runners.core.SimpleDoFnRunner$DoFnProcessContext.output(SimpleDoFnRunner.java:436)
at org.apache.beam.runners.core.SimpleDoFnRunner$DoFnProcessContext.output(SimpleDoFnRunner.java:424)
at org.apache.beam.sdk.transforms.join.CoGroupByKey$ConstructCoGbkResultFn.processElement(CoGroupByKey.java:206)
at org.apache.beam.sdk.transforms.join.CoGroupByKey$ConstructCoGbkResultFn$DoFnInvoker.invokeProcessElement(Unknown Source)
at org.apache.beam.runners.core.SimpleDoFnRunner.invokeProcessElement(SimpleDoFnRunner.java:177)
at org.apache.beam.runners.core.SimpleDoFnRunner.processElement(SimpleDoFnRunner.java:141)
at com.google.cloud.dataflow.worker.SimpleParDoFn.processElement(SimpleParDoFn.java:324)
at com.google.cloud.dataflow.worker.util.common.worker.ParDoOperation.process(ParDoOperation.java:48)
at com.google.cloud.dataflow.worker.util.common.worker.OutputReceiver.process(OutputReceiver.java:52)
at com.google.cloud.dataflow.worker.GroupAlsoByWindowsParDoFn$1.output(GroupAlsoByWindowsParDoFn.java:180)
at com.google.cloud.dataflow.worker.GroupAlsoByWindowFnRunner$1.outputWindowedValue(GroupAlsoByWindowFnRunner.java:104)
at com.google.cloud.dataflow.worker.util.BatchGroupAlsoByWindowViaIteratorsFn.processElement(BatchGroupAlsoByWindowViaIteratorsFn.java:121)
at com.google.cloud.dataflow.worker.util.BatchGroupAlsoByWindowViaIteratorsFn.processElement(BatchGroupAlsoByWindowViaIteratorsFn.java:53)
at com.google.cloud.dataflow.worker.GroupAlsoByWindowFnRunner.invokeProcessElement(GroupAlsoByWindowFnRunner.java:117)
at com.google.cloud.dataflow.worker.GroupAlsoByWindowFnRunner.processElement(GroupAlsoByWindowFnRunner.java:74)
at com.google.cloud.dataflow.worker.GroupAlsoByWindowsParDoFn.processElement(GroupAlsoByWindowsParDoFn.java:113)
at com.google.cloud.dataflow.worker.util.common.worker.ParDoOperation.process(ParDoOperation.java:48)
at com.google.cloud.dataflow.worker.util.common.worker.OutputReceiver.process(OutputReceiver.java:52)
at com.google.cloud.dataflow.worker.util.common.worker.ReadOperation.runReadLoop(ReadOperation.java:187)
at com.google.cloud.dataflow.worker.util.common.worker.ReadOperation.start(ReadOperation.java:148)
at com.google.cloud.dataflow.worker.util.common.worker.MapTaskExecutor.execute(MapTaskExecutor.java:68)
at com.google.cloud.dataflow.worker.DataflowWorker.executeWork(DataflowWorker.java:330)
at com.google.cloud.dataflow.worker.DataflowWorker.doWork(DataflowWorker.java:302)
at com.google.cloud.dataflow.worker.DataflowWorker.getAndPerformWork(DataflowWorker.java:251)
at com.google.cloud.dataflow.worker.DataflowBatchWorkerHarness$WorkerThread.doWork(DataflowBatchWorkerHarness.java:135)
at com.google.cloud.dataflow.worker.DataflowBatchWorkerHarness$WorkerThread.call(DataflowBatchWorkerHarness.java:115)
at com.google.cloud.dataflow.worker.DataflowBatchWorkerHarness$WorkerThread.call(DataflowBatchWorkerHarness.java:102)
at java.util.concurrent.FutureTask.run(FutureTask.java:266)
at java.util.concurrent.ThreadPoolExecutor.runWorker(ThreadPoolExecutor.java:1142)
at java.util.concurrent.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:617)
at java.lang.Thread.run(Thread.java:745)
Caused by: java.io.IOException: INTERNAL: GOAWAY received
at com.google.cloud.dataflow.worker.ApplianceShuffleReader.readIncludingPosition(Native Method)
at com.google.cloud.dataflow.worker.ChunkingShuffleBatchReader.read(ChunkingShuffleBatchReader.java:62)
at com.google.cloud.dataflow.worker.util.common.worker.CachingShuffleBatchReader$1.load(CachingShuffleBatchReader.java:57)
at com.google.cloud.dataflow.worker.util.common.worker.CachingShuffleBatchReader$1.load(CachingShuffleBatchReader.java:53)
at com.google.cloud.dataflow.worker.repackaged.com.google.common.cache.LocalCache$LoadingValueReference.loadFuture(LocalCache.java:3628)
at com.google.cloud.dataflow.worker.repackaged.com.google.common.cache.LocalCache$Segment.loadSync(LocalCache.java:2336)
... 62 more

This looks like a transient network error. In a distributed system, transient errors are bound to happen, so Dataflow retries failures a few times (but logs them, because knowing about them can sometimes be useful). If your job is succeeding, there's nothing to worry about.
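If you want to verify from the launcher that a batch job ultimately succeeded despite these logged retries, a minimal sketch using the Beam Java SDK (assuming pipeline is your already-constructed Pipeline) is:

import org.apache.beam.sdk.Pipeline;
import org.apache.beam.sdk.PipelineResult;

// Sketch: block until the job reaches a terminal state and check it.
// Individual retried work items (like the GOAWAY read above) don't matter here;
// only the final job state does.
PipelineResult result = pipeline.run();
PipelineResult.State state = result.waitUntilFinish();
if (state != PipelineResult.State.DONE) {
    throw new RuntimeException("Dataflow job finished in state " + state);
}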

Related

UserCodeException: java.lang.OutOfMemoryError: Java heap space when streaming autoscaling

I have two streaming pipelines running in production that have had no trouble so far (both with n1-standard-4). However, when I decided to try autoscaling, I get the error below. I've tried both regular autoscaling and Streaming Engine (with n2-highmem1 and n1-highmem1), but nothing works.
Here's the structure of my pipeline. It's Pub/Sub to BigQuery.
java.lang.RuntimeException: org.apache.beam.sdk.util.UserCodeException: java.lang.OutOfMemoryError: Java heap space
org.apache.beam.runners.dataflow.worker.IntrinsicMapTaskExecutorFactory$1.typedApply(IntrinsicMapTaskExecutorFactory.java:194)
org.apache.beam.runners.dataflow.worker.IntrinsicMapTaskExecutorFactory$1.typedApply(IntrinsicMapTaskExecutorFactory.java:165)
org.apache.beam.runners.dataflow.worker.graph.Networks$TypeSafeNodeFunction.apply(Networks.java:63)
org.apache.beam.runners.dataflow.worker.graph.Networks$TypeSafeNodeFunction.apply(Networks.java:50)
org.apache.beam.runners.dataflow.worker.graph.Networks.replaceDirectedNetworkNodes(Networks.java:87)
org.apache.beam.runners.dataflow.worker.IntrinsicMapTaskExecutorFactory.create(IntrinsicMapTaskExecutorFactory.java:125)
org.apache.beam.runners.dataflow.worker.StreamingDataflowWorker.process(StreamingDataflowWorker.java:1203)
org.apache.beam.runners.dataflow.worker.StreamingDataflowWorker.access$1000(StreamingDataflowWorker.java:149)
org.apache.beam.runners.dataflow.worker.StreamingDataflowWorker$6.run(StreamingDataflowWorker.java:1024)
java.util.concurrent.ThreadPoolExecutor.runWorker(ThreadPoolExecutor.java:1142)
java.util.concurrent.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:617)
java.lang.Thread.run(Thread.java:745)
Caused by: org.apache.beam.sdk.util.UserCodeException: java.lang.OutOfMemoryError: Java heap space
org.apache.beam.sdk.util.UserCodeException.wrap(UserCodeException.java:34)
drivemode.com.dataflow.reader.PubsubToAnalyticsTableRowFN$DoFnInvoker.invokeSetup(Unknown Source)
org.apache.beam.runners.dataflow.worker.DoFnInstanceManagers$ConcurrentQueueInstanceManager.deserializeCopy(DoFnInstanceManagers.java:80)
org.apache.beam.runners.dataflow.worker.DoFnInstanceManagers$ConcurrentQueueInstanceManager.peek(DoFnInstanceManagers.java:62)
org.apache.beam.runners.dataflow.worker.UserParDoFnFactory.create(UserParDoFnFactory.java:95)
org.apache.beam.runners.dataflow.worker.DefaultParDoFnFactory.create(DefaultParDoFnFactory.java:75)
org.apache.beam.runners.dataflow.worker.IntrinsicMapTaskExecutorFactory.createParDoOperation(IntrinsicMapTaskExecutorFactory.java:264)
org.apache.beam.runners.dataflow.worker.IntrinsicMapTaskExecutorFactory.access$000(IntrinsicMapTaskExecutorFactory.java:86)
org.apache.beam.runners.dataflow.worker.IntrinsicMapTaskExecutorFactory$1.typedApply(IntrinsicMapTaskExecutorFactory.java:183)
... 11 more
Caused by: java.lang.OutOfMemoryError: Java heap space
org.tukaani.xz.lz.LZDecoder.<init>(Unknown Source)
org.tukaani.xz.LZMA2InputStream.<init>(Unknown Source)
org.tukaani.xz.LZMA2InputStream.<init>(Unknown Source)
org.apache.commons.compress.archivers.sevenz.LZMA2Decoder.decode(LZMA2Decoder.java:39)
org.apache.commons.compress.archivers.sevenz.Coders.addDecoder(Coders.java:76)
org.apache.commons.compress.archivers.sevenz.SevenZFile.buildDecoderStack(SevenZFile.java:933)
org.apache.commons.compress.archivers.sevenz.SevenZFile.buildDecodingStream(SevenZFile.java:909)
org.apache.commons.compress.archivers.sevenz.SevenZFile.getNextEntry(SevenZFile.java:222)
net.iakovlev.timeshape.TimeZoneEngine.lambda$initialize$0(TimeZoneEngine.java:111)
net.iakovlev.timeshape.TimeZoneEngine$$Lambda$76/1740730188.apply(Unknown Source)
java.util.stream.ReferencePipeline$3$1.accept(ReferencePipeline.java:193)
java.util.Spliterators$ArraySpliterator.forEachRemaining(Spliterators.java:948)
java.util.stream.AbstractPipeline.copyInto(AbstractPipeline.java:481)
java.util.stream.AbstractPipeline.wrapAndCopyInto(AbstractPipeline.java:471)
java.util.stream.ForEachOps$ForEachOp.evaluateSequential(ForEachOps.java:151)
java.util.stream.ForEachOps$ForEachOp$OfRef.evaluateSequential(ForEachOps.java:174)
java.util.stream.AbstractPipeline.evaluate(AbstractPipeline.java:234)
java.util.stream.ReferencePipeline.forEach(ReferencePipeline.java:418)
net.iakovlev.timeshape.Index.build(Index.java:73)
net.iakovlev.timeshape.TimeZoneEngine.initialize(TimeZoneEngine.java:126)
net.iakovlev.timeshape.TimeZoneEngine.initialize(TimeZoneEngine.java:94)
drivemode.com.dataflow.reader.AnalyticsTableRowFN.setUp(AnalyticsTableRowFN.java:60)
drivemode.com.dataflow.reader.PubsubToAnalyticsTableRowFN$DoFnInvoker.invokeSetup(Unknown Source)
org.apache.beam.runners.dataflow.worker.DoFnInstanceManagers$ConcurrentQueueInstanceManager.deserializeCopy(DoFnInstanceManagers.java:80)
org.apache.beam.runners.dataflow.worker.DoFnInstanceManagers$ConcurrentQueueInstanceManager.peek(DoFnInstanceManagers.java:62)
org.apache.beam.runners.dataflow.worker.UserParDoFnFactory.create(UserParDoFnFactory.java:95)
org.apache.beam.runners.dataflow.worker.DefaultParDoFnFactory.create(DefaultParDoFnFactory.java:75)
org.apache.beam.runners.dataflow.worker.IntrinsicMapTaskExecutorFactory.createParDoOperation(IntrinsicMapTaskExecutorFactory.java:264)
org.apache.beam.runners.dataflow.worker.IntrinsicMapTaskExecutorFactory.access$000(IntrinsicMapTaskExecutorFactory.java:86)
org.apache.beam.runners.dataflow.worker.IntrinsicMapTaskExecutorFactory$1.typedApply(IntrinsicMapTaskExecutorFactory.java:183)
org.apache.beam.runners.dataflow.worker.IntrinsicMapTaskExecutorFactory$1.typedApply(IntrinsicMapTaskExecutorFactory.java:165)
org.apache.beam.runners.dataflow.worker.graph.Networks$TypeSafeNodeFunction.apply(Networks.java:63)
Or sometimes I get:
org.apache.beam.vendor.guava.v20_0.com.google.common.util.concurrent.ExecutionError: java.lang.NoClassDefFoundError: Could not initialize class drivemode.com.dataflow.reader.AnalyticsTableRowFN
org.apache.beam.vendor.guava.v20_0.com.google.common.cache.LocalCache$Segment.get(LocalCache.java:2212)
org.apache.beam.vendor.guava.v20_0.com.google.common.cache.LocalCache.get(LocalCache.java:4053)
org.apache.beam.vendor.guava.v20_0.com.google.common.cache.LocalCache$LocalManualCache.get(LocalCache.java:4899)
org.apache.beam.runners.dataflow.worker.UserParDoFnFactory.create(UserParDoFnFactory.java:91)
org.apache.beam.runners.dataflow.worker.DefaultParDoFnFactory.create(DefaultParDoFnFactory.java:75)
org.apache.beam.runners.dataflow.worker.IntrinsicMapTaskExecutorFactory.createParDoOperation(IntrinsicMapTaskExecutorFactory.java:264)
org.apache.beam.runners.dataflow.worker.IntrinsicMapTaskExecutorFactory.access$000(IntrinsicMapTaskExecutorFactory.java:86)
org.apache.beam.runners.dataflow.worker.IntrinsicMapTaskExecutorFactory$1.typedApply(IntrinsicMapTaskExecutorFactory.java:183)
org.apache.beam.runners.dataflow.worker.IntrinsicMapTaskExecutorFactory$1.typedApply(IntrinsicMapTaskExecutorFactory.java:165)
org.apache.beam.runners.dataflow.worker.graph.Networks$TypeSafeNodeFunction.apply(Networks.java:63)
org.apache.beam.runners.dataflow.worker.graph.Networks$TypeSafeNodeFunction.apply(Networks.java:50)
org.apache.beam.runners.dataflow.worker.graph.Networks.replaceDirectedNetworkNodes(Networks.java:87)
org.apache.beam.runners.dataflow.worker.IntrinsicMapTaskExecutorFactory.create(IntrinsicMapTaskExecutorFactory.java:125)
org.apache.beam.runners.dataflow.worker.StreamingDataflowWorker.process(StreamingDataflowWorker.java:1203)
org.apache.beam.runners.dataflow.worker.StreamingDataflowWorker.access$1000(StreamingDataflowWorker.java:149)
org.apache.beam.runners.dataflow.worker.StreamingDataflowWorker$6.run(StreamingDataflowWorker.java:1024)
java.util.concurrent.ThreadPoolExecutor.runWorker(ThreadPoolExecutor.java:1142)
java.util.concurrent.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:617)
java.lang.Thread.run(Thread.java:745)
Caused by: java.lang.NoClassDefFoundError: Could not initialize class drivemode.com.dataflow.reader.AnalyticsTableRowFN
java.io.ObjectStreamClass.hasStaticInitializer(Native Method)
java.io.ObjectStreamClass.computeDefaultSUID(ObjectStreamClass.java:1787)
java.io.ObjectStreamClass.access$100(ObjectStreamClass.java:72)
java.io.ObjectStreamClass$1.run(ObjectStreamClass.java:253)
java.io.ObjectStreamClass$1.run(ObjectStreamClass.java:251)
java.security.AccessController.doPrivileged(Native Method)
java.io.ObjectStreamClass.getSerialVersionUID(ObjectStreamClass.java:250)
java.io.ObjectStreamClass.initNonProxy(ObjectStreamClass.java:611)
java.io.ObjectInputStream.readNonProxyDesc(ObjectInputStream.java:1630)
java.io.ObjectInputStream.readClassDesc(ObjectInputStream.java:1521)
java.io.ObjectInputStream.readNonProxyDesc(ObjectInputStream.java:1630)
java.io.ObjectInputStream.readClassDesc(ObjectInputStream.java:1521)
java.io.ObjectInputStream.readOrdinaryObject(ObjectInputStream.java:1781)
java.io.ObjectInputStream.readObject0(ObjectInputStream.java:1353)
java.io.ObjectInputStream.defaultReadFields(ObjectInputStream.java:2018)
java.io.ObjectInputStream.readSerialData(ObjectInputStream.java:1942)
java.io.ObjectInputStream.readOrdinaryObject(ObjectInputStream.java:1808)
java.io.ObjectInputStream.readObject0(ObjectInputStream.java:1353)
java.io.ObjectInputStream.readObject(ObjectInputStream.java:373)
org.apache.beam.sdk.util.SerializableUtils.deserializeFromByteArray(SerializableUtils.java:71)
org.apache.beam.runners.dataflow.worker.UserParDoFnFactory$UserDoFnExtractor.getDoFnInfo(UserParDoFnFactory.java:62)
org.apache.beam.runners.dataflow.worker.UserParDoFnFactory.lambda$create$0(UserParDoFnFactory.java:93)
org.apache.beam.vendor.guava.v20_0.com.google.common.cache.LocalCache$LocalManualCache$1.load(LocalCache.java:4904)
org.apache.beam.vendor.guava.v20_0.com.google.common.cache.LocalCache$LoadingValueReference.loadFuture(LocalCache.java:3628)
org.apache.beam.vendor.guava.v20_0.com.google.common.cache.LocalCache$Segment.loadSync(LocalCache.java:2336)
org.apache.beam.vendor.guava.v20_0.com.google.common.cache.LocalCache$Segment.lockedGetOrLoad(LocalCache.java:2295)
org.apache.beam.vendor.guava.v20_0.com.google.common.cache.LocalCache$Segment.get(LocalCache.java:2208)
... 18 more
Some of your ParDo functions or MapElements may be consuming more memory than they should. Based on the logs you pasted, review PubsubToAnalyticsTableRowFN.
The Beam documentation on pipeline tuning may help here.
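If the DoFn genuinely needs that much working memory, another knob is the worker machine type in the Dataflow pipeline options. A minimal sketch (the machine type here is only an example, not a recommendation for your workload):

import org.apache.beam.runners.dataflow.options.DataflowPipelineOptions;
import org.apache.beam.sdk.options.PipelineOptionsFactory;

// Sketch: request higher-memory workers instead of the default machine type.
DataflowPipelineOptions options = PipelineOptionsFactory.fromArgs(args)
    .withValidation()
    .as(DataflowPipelineOptions.class);
options.setWorkerMachineType("n1-highmem-4"); // example value only
// Equivalent command-line flag: --workerMachineType=n1-highmem-4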
It turns out the issue was in setup, where I initialized a time-zone lookup library that returns a tz string from (lat, lng). Its memory consumption was too high, and I solved the issue by loading that class at construction time instead.
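For illustration, a common way to keep a single copy of such a heavy lookup structure per worker JVM is to hold it in a static field. The sketch below uses the timeshape library seen in the trace; the DoFn class name and its element type are made up for the example:

import java.time.ZoneId;
import java.util.Optional;
import net.iakovlev.timeshape.TimeZoneEngine;
import org.apache.beam.sdk.transforms.DoFn;
import org.apache.beam.sdk.values.KV;

// Sketch: build the time-zone index once per worker JVM (static), not once per
// deserialized DoFn copy, so scaled-out workers don't multiply the large engine.
public class TimeZoneLookupFn extends DoFn<KV<Double, Double>, String> {
  // Built once when the class is first loaded on a worker; shared by all instances.
  private static final TimeZoneEngine ENGINE = TimeZoneEngine.initialize();

  @ProcessElement
  public void processElement(@Element KV<Double, Double> latLng, OutputReceiver<String> out) {
    Optional<ZoneId> zone = ENGINE.query(latLng.getKey(), latLng.getValue());
    zone.ifPresent(z -> out.output(z.getId()));
  }
}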

Out of memory exception in dataflow with windowing on bounded input

I have created a Dataflow pipeline that reads from Datastore and transforms each entity into a BigQuery TableRow. A transform attaches a timestamp to each element, then a one-day window is applied to the PCollection. The windowed output is written to a partition of a BigQuery table using Apache Beam's BigQueryIO.
The job fails with an OOM error.
Dataflow Job ID: 2018-03-26_20_45_39-10536011060742036262
workerMachineType used: n1-standard-8
The dataflow pipeline at a high level is:
// Read from Datastore
PCollection<Entity> entities =
    pipeline.apply("ReadFromDatastore",
        DatastoreIO.v1().read()
            .withProjectId(options.getProject())
            .withQuery(query)
            .withNamespace(options.getNamespace()));

// Apply processing to convert it to BigQuery TableRow
PCollection<TableRow> tableRow =
    entities.apply("ConvertToTableRow", ParDo.of(new ProcessEntityFn()));

// Apply timestamp to each TableRow element, then apply windowing of one day
PCollection<TableRow> tableRowWindow =
    tableRow.apply("tableAddTimestamp", ParDo.of(new ApplyTimestampFn()))
        .apply("tableApplyWindow",
            Window.<TableRow>into(CalendarWindows.days(1)
                .withTimeZone(DateTimeZone.forID(options.getTimeZone()))));

// Write windowed output to BigQuery partitions
tableRowWindow.apply("WriteTableToBQ",
    BigQueryIO.writeTableRows()
        .withSchema(BigqueryHelper.getSchema())
        .to(TableRefPartition.perDay(options.getProject(),
            options.getBigQueryDataset(), options.getTableName()))
        .withWriteDisposition(BigQueryIO.Write.WriteDisposition.WRITE_TRUNCATE));
I am getting the following error logs several times:
An OutOfMemoryException occurred. Consider specifying higher memory instances in PipelineOptions.
java.lang.RuntimeException: org.apache.beam.sdk.util.UserCodeException: java.lang.OutOfMemoryError: Java heap space
at com.google.cloud.dataflow.worker.GroupAlsoByWindowsParDoFn$1.output(GroupAlsoByWindowsParDoFn.java:182)
at com.google.cloud.dataflow.worker.GroupAlsoByWindowFnRunner$1.outputWindowedValue(GroupAlsoByWindowFnRunner.java:104)
at com.google.cloud.dataflow.worker.util.BatchGroupAlsoByWindowReshuffleFn.processElement(BatchGroupAlsoByWindowReshuffleFn.java:54)
at com.google.cloud.dataflow.worker.util.BatchGroupAlsoByWindowReshuffleFn.processElement(BatchGroupAlsoByWindowReshuffleFn.java:37)
at com.google.cloud.dataflow.worker.GroupAlsoByWindowFnRunner.invokeProcessElement(GroupAlsoByWindowFnRunner.java:117)
at com.google.cloud.dataflow.worker.GroupAlsoByWindowFnRunner.processElement(GroupAlsoByWindowFnRunner.java:74)
at com.google.cloud.dataflow.worker.GroupAlsoByWindowsParDoFn.processElement(GroupAlsoByWindowsParDoFn.java:113)
at com.google.cloud.dataflow.worker.util.common.worker.ParDoOperation.process(ParDoOperation.java:48)
at com.google.cloud.dataflow.worker.util.common.worker.OutputReceiver.process(OutputReceiver.java:52)
at com.google.cloud.dataflow.worker.util.common.worker.ReadOperation.runReadLoop(ReadOperation.java:187)
at com.google.cloud.dataflow.worker.util.common.worker.ReadOperation.start(ReadOperation.java:148)
at com.google.cloud.dataflow.worker.util.common.worker.MapTaskExecutor.execute(MapTaskExecutor.java:68)
at com.google.cloud.dataflow.worker.DataflowWorker.executeWork(DataflowWorker.java:330)
at com.google.cloud.dataflow.worker.DataflowWorker.doWork(DataflowWorker.java:302)
at com.google.cloud.dataflow.worker.DataflowWorker.getAndPerformWork(DataflowWorker.java:251)
at com.google.cloud.dataflow.worker.DataflowBatchWorkerHarness$WorkerThread.doWork(DataflowBatchWorkerHarness.java:135)
at com.google.cloud.dataflow.worker.DataflowBatchWorkerHarness$WorkerThread.call(DataflowBatchWorkerHarness.java:115)
at com.google.cloud.dataflow.worker.DataflowBatchWorkerHarness$WorkerThread.call(DataflowBatchWorkerHarness.java:102)
at java.util.concurrent.FutureTask.run(FutureTask.java:266)
at java.util.concurrent.ThreadPoolExecutor.runWorker(ThreadPoolExecutor.java:1142)
at java.util.concurrent.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:617)
at java.lang.Thread.run(Thread.java:745)
Caused by: org.apache.beam.sdk.util.UserCodeException: java.lang.OutOfMemoryError: Java heap space
at org.apache.beam.sdk.util.UserCodeException.wrap(UserCodeException.java:36)
at org.apache.beam.sdk.io.gcp.bigquery.WriteBundlesToFiles$DoFnInvoker.invokeProcessElement(Unknown Source)
at org.apache.beam.runners.core.SimpleDoFnRunner.invokeProcessElement(SimpleDoFnRunner.java:177)
at org.apache.beam.runners.core.SimpleDoFnRunner.processElement(SimpleDoFnRunner.java:138)
at com.google.cloud.dataflow.worker.SimpleParDoFn.processElement(SimpleParDoFn.java:324)
at com.google.cloud.dataflow.worker.util.common.worker.ParDoOperation.process(ParDoOperation.java:48)
at com.google.cloud.dataflow.worker.util.common.worker.OutputReceiver.process(OutputReceiver.java:52)
at com.google.cloud.dataflow.worker.AssignWindowsParDoFnFactory$AssignWindowsParDoFn.processElement(AssignWindowsParDoFnFactory.java:116)
at com.google.cloud.dataflow.worker.util.common.worker.ParDoOperation.process(ParDoOperation.java:48)
at com.google.cloud.dataflow.worker.util.common.worker.OutputReceiver.process(OutputReceiver.java:52)
at com.google.cloud.dataflow.worker.SimpleParDoFn$1.output(SimpleParDoFn.java:272)
at org.apache.beam.runners.core.SimpleDoFnRunner.outputWindowedValue(SimpleDoFnRunner.java:211)
at org.apache.beam.runners.core.SimpleDoFnRunner.access$700(SimpleDoFnRunner.java:66)
at org.apache.beam.runners.core.SimpleDoFnRunner$DoFnProcessContext.output(SimpleDoFnRunner.java:436)
at org.apache.beam.runners.core.SimpleDoFnRunner$DoFnProcessContext.output(SimpleDoFnRunner.java:424)
at org.apache.beam.sdk.io.gcp.bigquery.PrepareWrite$1.processElement(PrepareWrite.java:62)
at org.apache.beam.sdk.io.gcp.bigquery.PrepareWrite$1$DoFnInvoker.invokeProcessElement(Unknown Source)
at org.apache.beam.runners.core.SimpleDoFnRunner.invokeProcessElement(SimpleDoFnRunner.java:177)
at org.apache.beam.runners.core.SimpleDoFnRunner.processElement(SimpleDoFnRunner.java:138)
at com.google.cloud.dataflow.worker.SimpleParDoFn.processElement(SimpleParDoFn.java:324)
at com.google.cloud.dataflow.worker.util.common.worker.ParDoOperation.process(ParDoOperation.java:48)
at com.google.cloud.dataflow.worker.util.common.worker.OutputReceiver.process(OutputReceiver.java:52)
at com.google.cloud.dataflow.worker.AssignWindowsParDoFnFactory$AssignWindowsParDoFn.processElement(AssignWindowsParDoFnFactory.java:116)
at com.google.cloud.dataflow.worker.util.common.worker.ParDoOperation.process(ParDoOperation.java:48)
at com.google.cloud.dataflow.worker.util.common.worker.OutputReceiver.process(OutputReceiver.java:52)
at com.google.cloud.dataflow.worker.SimpleParDoFn$1.output(SimpleParDoFn.java:272)
at org.apache.beam.runners.core.SimpleDoFnRunner.outputWindowedValue(SimpleDoFnRunner.java:211)
at org.apache.beam.runners.core.SimpleDoFnRunner.access$700(SimpleDoFnRunner.java:66)
at org.apache.beam.runners.core.SimpleDoFnRunner$DoFnProcessContext.outputWithTimestamp(SimpleDoFnRunner.java:443)
at org.apache.beam.runners.core.SimpleDoFnRunner$DoFnProcessContext.outputWithTimestamp(SimpleDoFnRunner.java:430)
at com.ittiam.cvml.dataflow.service.LoadIsmDataflow$ApplyTimestampFn.processElement(LoadIsmDataflow.java:526)
at com.ittiam.cvml.dataflow.service.LoadIsmDataflow$ApplyTimestampFn$DoFnInvoker.invokeProcessElement(Unknown Source)
at org.apache.beam.runners.core.SimpleDoFnRunner.invokeProcessElement(SimpleDoFnRunner.java:177)
at org.apache.beam.runners.core.SimpleDoFnRunner.processElement(SimpleDoFnRunner.java:141)
at com.google.cloud.dataflow.worker.SimpleParDoFn.processElement(SimpleParDoFn.java:324)
at com.google.cloud.dataflow.worker.util.common.worker.ParDoOperation.process(ParDoOperation.java:48)
at com.google.cloud.dataflow.worker.util.common.worker.OutputReceiver.process(OutputReceiver.java:52)
at com.google.cloud.dataflow.worker.SimpleParDoFn$1.output(SimpleParDoFn.java:272)
at org.apache.beam.runners.core.SimpleDoFnRunner.outputWindowedValue(SimpleDoFnRunner.java:211)
at org.apache.beam.runners.core.SimpleDoFnRunner.access$700(SimpleDoFnRunner.java:66)
at org.apache.beam.runners.core.SimpleDoFnRunner$DoFnProcessContext.output(SimpleDoFnRunner.java:436)
at org.apache.beam.runners.core.SimpleDoFnRunner$DoFnProcessContext.output(SimpleDoFnRunner.java:424)
at com.ittiam.cvml.dataflow.service.LoadIsmDataflow$ProcessEntityFn.processElement(LoadIsmDataflow.java:478)
at com.ittiam.cvml.dataflow.service.LoadIsmDataflow$ProcessEntityFn$DoFnInvoker.invokeProcessElement(Unknown Source)
at org.apache.beam.runners.core.SimpleDoFnRunner.invokeProcessElement(SimpleDoFnRunner.java:177)
at org.apache.beam.runners.core.SimpleDoFnRunner.processElement(SimpleDoFnRunner.java:138)
at com.google.cloud.dataflow.worker.SimpleParDoFn.processElement(SimpleParDoFn.java:324)
at com.google.cloud.dataflow.worker.util.common.worker.ParDoOperation.process(ParDoOperation.java:48)
at com.google.cloud.dataflow.worker.util.common.worker.OutputReceiver.process(OutputReceiver.java:52)
at com.google.cloud.dataflow.worker.SimpleParDoFn$1.output(SimpleParDoFn.java:272)
at org.apache.beam.runners.core.SimpleDoFnRunner.outputWindowedValue(SimpleDoFnRunner.java:211)
at org.apache.beam.runners.core.SimpleDoFnRunner.access$700(SimpleDoFnRunner.java:66)
at org.apache.beam.runners.core.SimpleDoFnRunner$DoFnProcessContext.output(SimpleDoFnRunner.java:436)
at org.apache.beam.runners.core.SimpleDoFnRunner$DoFnProcessContext.output(SimpleDoFnRunner.java:424)
at org.apache.beam.sdk.io.gcp.datastore.DatastoreV1$Read$ReadFn.processElement(DatastoreV1.java:919)
at org.apache.beam.sdk.io.gcp.datastore.DatastoreV1$Read$ReadFn$DoFnInvoker.invokeProcessElement(Unknown Source)
at org.apache.beam.runners.core.SimpleDoFnRunner.invokeProcessElement(SimpleDoFnRunner.java:177)
at org.apache.beam.runners.core.SimpleDoFnRunner.processElement(SimpleDoFnRunner.java:141)
at com.google.cloud.dataflow.worker.SimpleParDoFn.processElement(SimpleParDoFn.java:324)
at com.google.cloud.dataflow.worker.util.common.worker.ParDoOperation.process(ParDoOperation.java:48)
at com.google.cloud.dataflow.worker.util.common.worker.OutputReceiver.process(OutputReceiver.java:52)
at com.google.cloud.dataflow.worker.SimpleParDoFn$1.output(SimpleParDoFn.java:272)
at org.apache.beam.runners.core.SimpleDoFnRunner.outputWindowedValue(SimpleDoFnRunner.java:211)
at org.apache.beam.runners.core.SimpleDoFnRunner.access$700(SimpleDoFnRunner.java:66)
at org.apache.beam.runners.core.SimpleDoFnRunner$DoFnProcessContext.output(SimpleDoFnRunner.java:436)
at org.apache.beam.runners.core.SimpleDoFnRunner$DoFnProcessContext.output(SimpleDoFnRunner.java:424)
at org.apache.beam.sdk.transforms.MapElements$1.processElement(MapElements.java:122)
at org.apache.beam.sdk.transforms.MapElements$1$DoFnInvoker.invokeProcessElement(Unknown Source)
at org.apache.beam.runners.core.SimpleDoFnRunner.invokeProcessElement(SimpleDoFnRunner.java:177)
at org.apache.beam.runners.core.SimpleDoFnRunner.processElement(SimpleDoFnRunner.java:141)
at com.google.cloud.dataflow.worker.SimpleParDoFn.processElement(SimpleParDoFn.java:324)
at com.google.cloud.dataflow.worker.util.common.worker.ParDoOperation.process(ParDoOperation.java:48)
at com.google.cloud.dataflow.worker.util.common.worker.OutputReceiver.process(OutputReceiver.java:52)
at com.google.cloud.dataflow.worker.SimpleParDoFn$1.output(SimpleParDoFn.java:272)
at org.apache.beam.runners.core.SimpleDoFnRunner.outputWindowedValue(SimpleDoFnRunner.java:211)
at org.apache.beam.runners.core.SimpleDoFnRunner.access$700(SimpleDoFnRunner.java:66)
at org.apache.beam.runners.core.SimpleDoFnRunner$DoFnProcessContext.output(SimpleDoFnRunner.java:436)
at org.apache.beam.runners.core.SimpleDoFnRunner$DoFnProcessContext.output(SimpleDoFnRunner.java:424)
at org.apache.beam.runners.dataflow.ReshuffleOverrideFactory$ReshuffleWithOnlyTrigger$1.processElement(ReshuffleOverrideFactory.java:84)
at org.apache.beam.runners.dataflow.ReshuffleOverrideFactory$ReshuffleWithOnlyTrigger$1$DoFnInvoker.invokeProcessElement(Unknown Source)
at org.apache.beam.runners.core.SimpleDoFnRunner.invokeProcessElement(SimpleDoFnRunner.java:177)
at org.apache.beam.runners.core.SimpleDoFnRunner.processElement(SimpleDoFnRunner.java:141)
at com.google.cloud.dataflow.worker.SimpleParDoFn.processElement(SimpleParDoFn.java:324)
at com.google.cloud.dataflow.worker.util.common.worker.ParDoOperation.process(ParDoOperation.java:48)
at com.google.cloud.dataflow.worker.util.common.worker.OutputReceiver.process(OutputReceiver.java:52)
at com.google.cloud.dataflow.worker.GroupAlsoByWindowsParDoFn$1.output(GroupAlsoByWindowsParDoFn.java:180)
... 21 more
Caused by: java.lang.OutOfMemoryError: Java heap space
at com.google.api.client.googleapis.media.MediaHttpUploader.setContentAndHeadersOnCurrentRequest(MediaHttpUploader.java:603)
at com.google.api.client.googleapis.media.MediaHttpUploader.resumableUpload(MediaHttpUploader.java:409)
at com.google.api.client.googleapis.media.MediaHttpUploader.upload(MediaHttpUploader.java:336)
at com.google.api.client.googleapis.services.AbstractGoogleClientRequest.executeUnparsed(AbstractGoogleClientRequest.java:427)
at com.google.api.client.googleapis.services.AbstractGoogleClientRequest.executeUnparsed(AbstractGoogleClientRequest.java:352)
at com.google.api.client.googleapis.services.AbstractGoogleClientRequest.execute(AbstractGoogleClientRequest.java:469)
at com.google.cloud.hadoop.util.AbstractGoogleAsyncWriteChannel$UploadOperation.call(AbstractGoogleAsyncWriteChannel.java:357)
The Dataflow pipeline then fails with the following error log:
Workflow failed. Causes: S53:ReadIsmFromDatastore/Reshuffle/Reshuffle/GroupByKey/Read+ReadIsmFromDatastore/Reshuffle/Reshuffle/GroupByKey/GroupByWindow+ReadIsmFromDatastore/Reshuffle/Reshuffle/ExpandIterable+ReadIsmFromDatastore/Reshuffle/Values/Values/Map+ReadIsmFromDatastore/Read+ConvertToTableRow+IsmTableAddTimestamp+TrackerTableAddTimestamp+IsmTableApplyWindow/Window.Assign+TrackerTableApplyWindow/Window.Assign+WriteTrackerTableToBQ/PrepareWrite/ParDo(Anonymous)+WriteTrackerTableToBQ/BatchLoads/rewindowIntoGlobal/Window.Assign+WriteTrackerTableToBQ/BatchLoads/WriteBundlesToFiles+WriteTrackerTableToBQ/BatchLoads/ReifyResults/View.AsIterable/View.CreatePCollectionView/ParDo(ToIsmRecordForGlobalWindow)+WriteTrackerTableToBQ/BatchLoads/GroupByDestination/Reify+WriteTrackerTableToBQ/BatchLoads/GroupByDestination/Write+WriteIsmTableToBQ/PrepareWrite/ParDo(Anonymous)+WriteIsmTableToBQ/BatchLoads/rewindowIntoGlobal/Window.Assign+WriteIsmTableToBQ/BatchLoads/WriteBundlesToFiles+WriteIsmTableToBQ/BatchLoads/ReifyResults/View.AsIterable/View.CreatePCollectionView/ParDo(ToIsmRecordForGlobalWindow)+WriteIsmTableToBQ/BatchLoads/GroupByDestination/Reify+WriteIsmTableToBQ/BatchLoads/GroupByDestination/Write failed., A work item was attempted 4 times without success. Each time the worker eventually lost contact with the service.

Possible issue with ehcache plugin on grails 3

The configured dependency was:
compile 'org.grails.plugins:cache-ehcache:2.0.0.BUILD-SNAPSHOT'
The problem was not apparent while testing the app with grails run-app or grails prod run-app.
But when I built a WAR and deployed it to Tomcat, the app hit this error on startup and would not start. I removed a load of references to anything cache-related and still had the same issue; it only started up after I disabled the plugin above.
This is the stack trace (maybe it relates to multiple DSNs). I tried to follow the issue link, but the JIRA link was down:
ERROR org.springframework.boot.SpringApplication - Application startup failed
java.lang.IllegalStateException: Attempt to reinitialise the CacheManager
at net.sf.ehcache.CacheManager.reinitialisationCheck(CacheManager.java:852)
at net.sf.ehcache.CacheManager.parseConfiguration(CacheManager.java:747)
at net.sf.ehcache.CacheManager.init(CacheManager.java:386)
at grails.plugin.cache.ehcache.GrailsEhCacheManagerFactoryBean$ReloadableCacheManager.rebuild(GrailsEhCacheManagerFactoryBean.java:184)
at grails.plugin.cache.ehcache.GrailsEhCacheManagerFactoryBean$ReloadableCacheManager$rebuild.call(Unknown Source)
at org.codehaus.groovy.runtime.callsite.CallSiteArray.defaultCall(CallSiteArray.java:48)
at org.codehaus.groovy.runtime.callsite.AbstractCallSite.call(AbstractCallSite.java:113)
at org.codehaus.groovy.runtime.callsite.AbstractCallSite.call(AbstractCallSite.java:125)
at grails.plugin.cache.ehcache.EhcacheConfigLoader.reload(EhcacheConfigLoader.groovy:63)
at grails.plugin.cache.ehcache.EhcacheConfigLoader$reload.callCurrent(Unknown Source)
at org.codehaus.groovy.runtime.callsite.CallSiteArray.defaultCallCurrent(CallSiteArray.java:52)
at org.codehaus.groovy.runtime.callsite.AbstractCallSite.callCurrent(AbstractCallSite.java:154)
at org.codehaus.groovy.runtime.callsite.AbstractCallSite.callCurrent(AbstractCallSite.java:174)
at grails.plugin.cache.ConfigLoader.reload(ConfigLoader.groovy:42)
at grails.plugin.cache.ConfigLoader$reload.call(Unknown Source)
at org.codehaus.groovy.runtime.callsite.CallSiteArray.defaultCall(CallSiteArray.java:48)
at org.codehaus.groovy.runtime.callsite.AbstractCallSite.call(AbstractCallSite.java:113)
at org.codehaus.groovy.runtime.callsite.AbstractCallSite.call(AbstractCallSite.java:125)
at grails.plugin.cache.CacheGrailsPlugin.reloadCaches(CacheGrailsPlugin.groovy:181)
at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:62)
at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
at java.lang.reflect.Method.invoke(Method.java:497)
at org.codehaus.groovy.runtime.callsite.PogoMetaMethodSite$PogoCachedMethodSiteNoUnwrapNoCoerce.invoke(PogoMetaMethodSite.java:210)
at org.codehaus.groovy.runtime.callsite.PogoMetaMethodSite.callCurrent(PogoMetaMethodSite.java:59)
at org.codehaus.groovy.runtime.callsite.CallSiteArray.defaultCallCurrent(CallSiteArray.java:52)
at org.codehaus.groovy.runtime.callsite.AbstractCallSite.callCurrent(AbstractCallSite.java:154)
at org.codehaus.groovy.runtime.callsite.AbstractCallSite.callCurrent(AbstractCallSite.java:166)
at grails.plugin.cache.CacheGrailsPlugin.doWithApplicationContext(CacheGrailsPlugin.groovy:137)
at org.grails.plugins.DefaultGrailsPlugin.doWithApplicationContext(DefaultGrailsPlugin.java:524)
at org.grails.plugins.AbstractGrailsPluginManager.doPostProcessing(AbstractGrailsPluginManager.java:229)
at grails.boot.config.GrailsApplicationPostProcessor.onApplicationEvent(GrailsApplicationPostProcessor.groovy:231)
at grails.boot.config.GrailsApplicationPostProcessor.onApplicationEvent(GrailsApplicationPostProcessor.groovy)
at org.springframework.context.event.SimpleApplicationEventMulticaster.invokeListener(SimpleApplicationEventMulticaster.java:163)
at org.springframework.context.event.SimpleApplicationEventMulticaster.multicastEvent(SimpleApplicationEventMulticaster.java:136)
at org.springframework.context.support.AbstractApplicationContext.publishEvent(AbstractApplicationContext.java:381)
at org.springframework.context.support.AbstractApplicationContext.publishEvent(AbstractApplicationContext.java:335)
at org.springframework.context.support.AbstractApplicationContext.finishRefresh(AbstractApplicationContext.java:855)
at org.springframework.boot.context.embedded.EmbeddedWebApplicationContext.finishRefresh(EmbeddedWebApplicationContext.java:140)
at org.springframework.context.support.AbstractApplicationContext.refresh(AbstractApplicationContext.java:541)
at org.springframework.boot.context.embedded.EmbeddedWebApplicationContext.refresh(EmbeddedWebApplicationContext.java:118)
at org.springframework.boot.SpringApplication.refresh(SpringApplication.java:766)
at org.springframework.boot.SpringApplication.createAndRefreshContext(SpringApplication.java:361)
at org.springframework.boot.SpringApplication.run(SpringApplication.java:307)
at org.springframework.boot.context.web.SpringBootServletInitializer.run(SpringBootServletInitializer.java:149)
at org.springframework.boot.context.web.SpringBootServletInitializer.createRootApplicationContext(SpringBootServletInitializer.java:129)
at org.springframework.boot.context.web.SpringBootServletInitializer.onStartup(SpringBootServletInitializer.java:85)
at org.springframework.web.SpringServletContainerInitializer.onStartup(SpringServletContainerInitializer.java:175)
at org.apache.catalina.core.StandardContext.startInternal(StandardContext.java:5225)
at org.apache.catalina.util.LifecycleBase.start(LifecycleBase.java:150)
at org.apache.catalina.core.ContainerBase.addChildInternal(ContainerBase.java:725)
at org.apache.catalina.core.ContainerBase.addChild(ContainerBase.java:701)
at org.apache.catalina.core.StandardHost.addChild(StandardHost.java:717)
at org.apache.catalina.startup.HostConfig.deployWAR(HostConfig.java:945)
at org.apache.catalina.startup.HostConfig$DeployWar.run(HostConfig.java:1795)
at java.util.concurrent.Executors$RunnableAdapter.call(Executors.java:511)
at java.util.concurrent.FutureTask.run(FutureTask.java:266)
at java.util.concurrent.ThreadPoolExecutor.runWorker(ThreadPoolExecutor.java:1142)
at java.util.concurrent.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:617)
at java.lang.Thread.run(Thread.java:745)
17-Apr-2016 14:12:32.742 SEVERE [localhost-startStop-1] org.apache.catalina.core.ContainerBase.addChildInternal ContainerBase.addChild: start:
org.apache.catalina.LifecycleException: Failed to start component [StandardEngine[Catalina].StandardHost[localhost].StandardContext[]]
at org.apache.catalina.util.LifecycleBase.start(LifecycleBase.java:154)
at org.apache.catalina.core.ContainerBase.addChildInternal(ContainerBase.java:725)
at org.apache.catalina.core.ContainerBase.addChild(ContainerBase.java:701)
at org.apache.catalina.core.StandardHost.addChild(StandardHost.java:717)
at org.apache.catalina.startup.HostConfig.deployWAR(HostConfig.java:945)
at org.apache.catalina.startup.HostConfig$DeployWar.run(HostConfig.java:1795)
at java.util.concurrent.Executors$RunnableAdapter.call(Executors.java:511)
at java.util.concurrent.FutureTask.run(FutureTask.java:266)
at java.util.concurrent.ThreadPoolExecutor.runWorker(ThreadPoolExecutor.java:1142)
at java.util.concurrent.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:617)
at java.lang.Thread.run(Thread.java:745)
Caused by: java.lang.IllegalStateException: Attempt to reinitialise the CacheManager
at net.sf.ehcache.CacheManager.reinitialisationCheck(CacheManager.java:852)
at net.sf.ehcache.CacheManager.parseConfiguration(CacheManager.java:747)
at net.sf.ehcache.CacheManager.init(CacheManager.java:386)
at grails.plugin.cache.ehcache.GrailsEhCacheManagerFactoryBean$ReloadableCacheManager.rebuild(GrailsEhCacheManagerFactoryBean.java:184)
at grails.plugin.cache.ehcache.GrailsEhCacheManagerFactoryBean$ReloadableCacheManager$rebuild.call(Unknown Source)
at org.codehaus.groovy.runtime.callsite.CallSiteArray.defaultCall(CallSiteArray.java:48)
at org.codehaus.groovy.runtime.callsite.AbstractCallSite.call(AbstractCallSite.java:113)
at org.codehaus.groovy.runtime.callsite.AbstractCallSite.call(AbstractCallSite.java:125)
at grails.plugin.cache.ehcache.EhcacheConfigLoader.reload(EhcacheConfigLoader.groovy:63)
at grails.plugin.cache.ehcache.EhcacheConfigLoader$reload.callCurrent(Unknown Source)
at org.codehaus.groovy.runtime.callsite.CallSiteArray.defaultCallCurrent(CallSiteArray.java:52)
at org.codehaus.groovy.runtime.callsite.AbstractCallSite.callCurrent(AbstractCallSite.java:154)
at org.codehaus.groovy.runtime.callsite.AbstractCallSite.callCurrent(AbstractCallSite.java:174)
at grails.plugin.cache.ConfigLoader.reload(ConfigLoader.groovy:42)
at grails.plugin.cache.ConfigLoader$reload.call(Unknown Source)
at org.codehaus.groovy.runtime.callsite.CallSiteArray.defaultCall(CallSiteArray.java:48)
at org.codehaus.groovy.runtime.callsite.AbstractCallSite.call(AbstractCallSite.java:113)
at org.codehaus.groovy.runtime.callsite.AbstractCallSite.call(AbstractCallSite.java:125)
at grails.plugin.cache.CacheGrailsPlugin.reloadCaches(CacheGrailsPlugin.groovy:181)
at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:62)
at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
at java.lang.reflect.Method.invoke(Method.java:497)
at org.codehaus.groovy.runtime.callsite.PogoMetaMethodSite$PogoCachedMethodSiteNoUnwrapNoCoerce.invoke(PogoMetaMethodSite.java:210)
at org.codehaus.groovy.runtime.callsite.PogoMetaMethodSite.callCurrent(PogoMetaMethodSite.java:59)
at org.codehaus.groovy.runtime.callsite.CallSiteArray.defaultCallCurrent(CallSiteArray.java:52)
at org.codehaus.groovy.runtime.callsite.AbstractCallSite.callCurrent(AbstractCallSite.java:154)
at org.codehaus.groovy.runtime.callsite.AbstractCallSite.callCurrent(AbstractCallSite.java:166)
at grails.plugin.cache.CacheGrailsPlugin.doWithApplicationContext(CacheGrailsPlugin.groovy:137)
at org.grails.plugins.DefaultGrailsPlugin.doWithApplicationContext(DefaultGrailsPlugin.java:524)
at org.grails.plugins.AbstractGrailsPluginManager.doPostProcessing(AbstractGrailsPluginManager.java:229)
at grails.boot.config.GrailsApplicationPostProcessor.onApplicationEvent(GrailsApplicationPostProcessor.groovy:231)
at grails.boot.config.GrailsApplicationPostProcessor.onApplicationEvent(GrailsApplicationPostProcessor.groovy)
at org.springframework.context.event.SimpleApplicationEventMulticaster.invokeListener(SimpleApplicationEventMulticaster.java:163)
at org.springframework.context.event.SimpleApplicationEventMulticaster.multicastEvent(SimpleApplicationEventMulticaster.java:136)
at org.springframework.context.support.AbstractApplicationContext.publishEvent(AbstractApplicationContext.java:381)
at org.springframework.context.support.AbstractApplicationContext.publishEvent(AbstractApplicationContext.java:335)
at org.springframework.context.support.AbstractApplicationContext.finishRefresh(AbstractApplicationContext.java:855)
at org.springframework.boot.context.embedded.EmbeddedWebApplicationContext.finishRefresh(EmbeddedWebApplicationContext.java:140)
at org.springframework.context.support.AbstractApplicationContext.refresh(AbstractApplicationContext.java:541)
at org.springframework.boot.context.embedded.EmbeddedWebApplicationContext.refresh(EmbeddedWebApplicationContext.java:118)
at org.springframework.boot.SpringApplication.refresh(SpringApplication.java:766)
at org.springframework.boot.SpringApplication.createAndRefreshContext(SpringApplication.java:361)
at org.springframework.boot.SpringApplication.run(SpringApplication.java:307)
at org.springframework.boot.context.web.SpringBootServletInitializer.run(SpringBootServletInitializer.java:149)
at org.springframework.boot.context.web.SpringBootServletInitializer.createRootApplicationContext(SpringBootServletInitializer.java:129)
at org.springframework.boot.context.web.SpringBootServletInitializer.onStartup(SpringBootServletInitializer.java:85)
at org.springframework.web.SpringServletContainerInitializer.onStartup(SpringServletContainerInitializer.java:175)
at org.apache.catalina.core.StandardContext.startInternal(StandardContext.java:5225)
at org.apache.catalina.util.LifecycleBase.start(LifecycleBase.java:150)
... 10 more
17-Apr-2016 14:12:32.743 SEVERE [localhost-startStop-1] org.apache.catalina.startup.HostConfig.deployWAR Error deploying web application archive /opt/apache-tomcat-8.0.29/webapps/ROOT.war
java.lang.IllegalStateException: ContainerBase.addChild: start: org.apache.catalina.LifecycleException: Failed to start component [StandardEngine[Catalina].StandardHost[localhost].StandardContext[]]
at org.apache.catalina.core.ContainerBase.addChildInternal(ContainerBase.java:729)
at org.apache.catalina.core.ContainerBase.addChild(ContainerBase.java:701)
at org.apache.catalina.core.StandardHost.addChild(StandardHost.java:717)
at org.apache.catalina.startup.HostConfig.deployWAR(HostConfig.java:945)
at org.apache.catalina.startup.HostConfig$DeployWar.run(HostConfig.java:1795)
at java.util.concurrent.Executors$RunnableAdapter.call(Executors.java:511)
at java.util.concurrent.FutureTask.run(FutureTask.java:266)
at java.util.concurrent.ThreadPoolExecutor.runWorker(ThreadPoolExecutor.java:1142)
at java.util.concurrent.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:617)
at java.lang.Thread.run(Thread.java:745)
I had the same problem and managed to get around it by making the following changes in build.gradle:
compile ("org.grails.plugins:cache-ehcache:3.0.0.BUILD-SNAPSHOT") {
exclude group:'net.sf.ehcache'
}
compile "net.sf.ehcache:ehcache:2.4.3"
It appears that cache-ehcache:3.0.0.BUILD-SNAPSHOT pulls in a newer Ehcache (2.10.2) than the one that ships with Grails for Hibernate caching (2.4.3 core). When both end up in the WAR, Ehcache startup fails in Tomcat.
Tested with Grails 3.0.17.

CloudBees Jenkins build IOException2

I am building an iOS project using CloudBees Jenkins continuous integration.
This project is already up and running on my local Jenkins just fine, so I imported the job, provisioning profiles, and keychain, and set everything up as on my localhost.
The only difference is that I selected "osx" as the Label Expression for the build environment (since CloudBees uses multiple build platforms), and I cannot even start the build process.
Output:
Building remotely on 51e861e6 in workspace /scratch/jenkins/workspace/TestFlight
Checkout:TestFlight / /scratch/jenkins/workspace/TestFlight - hudson.remoting.Channel#1b27f122:51e861e6
Using strategy: Default
hudson.util.IOException2: remote file operation failed: /scratch/jenkins/workspace/TestFlight at hudson.remoting.Channel#1b27f122:51e861e6
at hudson.FilePath.act(FilePath.java:907)
at hudson.FilePath.act(FilePath.java:884)
at hudson.plugins.git.GitSCM.determineRevisionToBuild(GitSCM.java:942)
at hudson.plugins.git.GitSCM.checkout(GitSCM.java:1108)
at hudson.model.AbstractProject.checkout(AbstractProject.java:1411)
at hudson.model.AbstractBuild$AbstractBuildExecution.defaultCheckout(AbstractBuild.java:662)
at jenkins.scm.SCMCheckoutStrategy.checkout(SCMCheckoutStrategy.java:88)
at hudson.model.AbstractBuild$AbstractBuildExecution.run(AbstractBuild.java:571)
at hudson.model.Run.execute(Run.java:1665)
at hudson.model.FreeStyleBuild.run(FreeStyleBuild.java:46)
at hudson.model.ResourceController.execute(ResourceController.java:88)
at hudson.model.Executor.run(Executor.java:246)
Caused by: java.io.IOException: Remote call on 51e861e6 failed
at hudson.remoting.Channel.call(Channel.java:748)
at hudson.FilePath.act(FilePath.java:900)
... 11 more
Caused by: java.lang.LinkageError: Failed to load hudson.plugins.git.GitSCM
at hudson.remoting.RemoteClassLoader.loadClassFile(RemoteClassLoader.java:326)
at hudson.remoting.RemoteClassLoader.findClass(RemoteClassLoader.java:236)
at java.lang.ClassLoader.loadClass(ClassLoader.java:306)
at java.lang.ClassLoader.loadClass(ClassLoader.java:247)
at java.lang.Class.getDeclaredFields0(Native Method)
at java.lang.Class.privateGetDeclaredFields(Class.java:2348)
at java.lang.Class.getDeclaredField(Class.java:1916)
at java.io.ObjectStreamClass.getDeclaredSUID(ObjectStreamClass.java:1622)
at java.io.ObjectStreamClass.access$700(ObjectStreamClass.java:51)
at java.io.ObjectStreamClass$2.run(ObjectStreamClass.java:433)
at java.security.AccessController.doPrivileged(Native Method)
at java.io.ObjectStreamClass.<init>(ObjectStreamClass.java:421)
at java.io.ObjectStreamClass.lookup(ObjectStreamClass.java:318)
at java.io.ObjectStreamClass.initNonProxy(ObjectStreamClass.java:555)
at java.io.ObjectInputStream.readNonProxyDesc(ObjectInputStream.java:1601)
at java.io.ObjectInputStream.readClassDesc(ObjectInputStream.java:1496)
at java.io.ObjectInputStream.readOrdinaryObject(ObjectInputStream.java:1750)
at java.io.ObjectInputStream.readObject0(ObjectInputStream.java:1329)
at java.io.ObjectInputStream.defaultReadFields(ObjectInputStream.java:1970)
at java.io.ObjectInputStream.readSerialData(ObjectInputStream.java:1895)
at java.io.ObjectInputStream.readOrdinaryObject(ObjectInputStream.java:1777)
at java.io.ObjectInputStream.readObject0(ObjectInputStream.java:1329)
at java.io.ObjectInputStream.readObject(ObjectInputStream.java:349)
at hudson.remoting.UserRequest.deserialize(UserRequest.java:182)
at hudson.remoting.UserRequest.perform(UserRequest.java:98)
at hudson.remoting.UserRequest.perform(UserRequest.java:48)
at hudson.remoting.Request$2.run(Request.java:328)
at hudson.remoting.InterceptingExecutorService$1.call(InterceptingExecutorService.java:72)
at java.util.concurrent.FutureTask$Sync.innerRun(FutureTask.java:303)
at java.util.concurrent.FutureTask.run(FutureTask.java:138)
at java.util.concurrent.ThreadPoolExecutor$Worker.runTask(ThreadPoolExecutor.java:895)
at java.util.concurrent.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:918)
at java.lang.Thread.run(Thread.java:695)
Caused by: java.lang.NoClassDefFoundError: hudson/plugins/git/GitSCMBackwardCompatibility
at java.lang.ClassLoader.defineClass1(Native Method)
at java.lang.ClassLoader.defineClassCond(ClassLoader.java:637)
at java.lang.ClassLoader.defineClass(ClassLoader.java:621)
at java.lang.ClassLoader.defineClass(ClassLoader.java:471)
at hudson.remoting.RemoteClassLoader.loadClassFile(RemoteClassLoader.java:322)
... 32 more
Caused by: java.lang.ClassNotFoundException: hudson.plugins.git.GitSCMBackwardCompatibility
at jenkins.util.AntClassLoader.findClassInComponents(AntClassLoader.java:1375)
at jenkins.util.AntClassLoader.findClass(AntClassLoader.java:1325)
at jenkins.util.AntClassLoader.loadClass(AntClassLoader.java:1078)
at java.lang.ClassLoader.loadClass(Unknown Source)
at hudson.remoting.RemoteClassLoader$ClassLoaderProxy.fetch4(RemoteClassLoader.java:742)
at hudson.remoting.RemoteClassLoader$ClassLoaderProxy.fetch3(RemoteClassLoader.java:784)
at sun.reflect.GeneratedMethodAccessor1170.invoke(Unknown Source)
at sun.reflect.DelegatingMethodAccessorImpl.invoke(Unknown Source)
at java.lang.reflect.Method.invoke(Unknown Source)
at hudson.remoting.RemoteInvocationHandler$RPCRequest.perform(RemoteInvocationHandler.java:300)
at hudson.remoting.Request$2.run(Request.java:328)
at hudson.remoting.InterceptingExecutorService$1.call(InterceptingExecutorService.java:72)
at jenkins.util.ContextResettingExecutorService$2.call(ContextResettingExecutorService.java:46)
at java.util.concurrent.FutureTask$Sync.innerRun(Unknown Source)
at java.util.concurrent.FutureTask.run(Unknown Source)
at java.util.concurrent.ThreadPoolExecutor.runWorker(Unknown Source)
at java.util.concurrent.ThreadPoolExecutor$Worker.run(Unknown Source)
at java.lang.Thread.run(Unknown Source)
Build step 'Upload to Testflight' marked build as failure
As discussed with the support team, Jenkins needed to be restarted for the plugin updates to take effect. This fixed the issue.

Struts2 upgrade error

I just upgraded from Struts 2.0.6 with Spring 2.0 to Struts 2.2.3 and am getting the following error on application startup. I have added the jars listed below; please let me know if I missed something. On the Windows machine my own validation methods are ignored, XWork validation takes over and throws a validation error; this does not happen on the Linux machine.
struts2-core-2.2.3
struts2-json-plugin-2.2.3
struts2-spring-plugin-2.2.3
xwork-core-2.2.3
commons-fileupload-1.2.2
commons-collections-3.2
org.xml.sax.SAXParseException: Document is invalid: no grammar found.
at com.sun.org.apache.xerces.internal.util.ErrorHandlerWrapper.createSAXParseException(ErrorHandlerWrapper.java:195)
at com.sun.org.apache.xerces.internal.util.ErrorHandlerWrapper.error(ErrorHandlerWrapper.java:131)
at com.sun.org.apache.xerces.internal.impl.XMLErrorReporter.reportError(XMLErrorReporter.java:384)
at com.sun.org.apache.xerces.internal.impl.XMLErrorReporter.reportError(XMLErrorReporter.java:318)
at com.sun.org.apache.xerces.internal.impl.XMLNSDocumentScannerImpl.scanStartElement(XMLNSDocumentScannerImpl.java:250)
at com.sun.org.apache.xerces.internal.impl.XMLNSDocumentScannerImpl$NSContentDriver.scanRootElementHook(XMLNSDocumentScannerImpl.java:626)
at com.sun.org.apache.xerces.internal.impl.XMLDocumentFragmentScannerImpl$FragmentContentDriver.next(XMLDocumentFragmentScannerImpl.java:3103)
at com.sun.org.apache.xerces.internal.impl.XMLDocumentScannerImpl$PrologDriver.next(XMLDocumentScannerImpl.java:922)
at com.sun.org.apache.xerces.internal.impl.XMLDocumentScannerImpl.next(XMLDocumentScannerImpl.java:648)
at com.sun.org.apache.xerces.internal.impl.XMLNSDocumentScannerImpl.next(XMLNSDocumentScannerImpl.java:140)
at com.sun.org.apache.xerces.internal.impl.XMLDocumentFragmentScannerImpl.scanDocument(XMLDocumentFragmentScannerImpl.java:511)
at com.sun.org.apache.xerces.internal.parsers.XML11Configuration.parse(XML11Configuration.java:808)
at com.sun.org.apache.xerces.internal.parsers.XML11Configuration.parse(XML11Configuration.java:737)
at com.sun.org.apache.xerces.internal.parsers.XMLParser.parse(XMLParser.java:119)
at com.sun.org.apache.xerces.internal.parsers.AbstractSAXParser.parse(AbstractSAXParser.java:1205)
at com.sun.org.apache.xerces.internal.jaxp.SAXParserImpl$JAXPSAXParser.parse(SAXParserImpl.java:522)
at javax.xml.parsers.SAXParser.parse(SAXParser.java:395)
at com.opensymphony.xwork2.util.DomHelper.parse(DomHelper.java:113)
at com.opensymphony.xwork2.validator.DefaultValidatorFileParser.parseValidatorDefinitions(DefaultValidatorFileParser.java:105)
at com.opensymphony.xwork2.validator.DefaultValidatorFactory.retrieveValidatorConfiguration(DefaultValidatorFactory.java:184)
at com.opensymphony.xwork2.validator.DefaultValidatorFactory.parseValidators(DefaultValidatorFactory.java:173)
at com.opensymphony.xwork2.validator.DefaultValidatorFactory.<init>(DefaultValidatorFactory.java:44)
at sun.reflect.NativeConstructorAccessorImpl.newInstance0(Native Method)
at sun.reflect.NativeConstructorAccessorImpl.newInstance(NativeConstructorAccessorImpl.java:39)
at sun.reflect.DelegatingConstructorAccessorImpl.newInstance(DelegatingConstructorAccessorImpl.java:27)
at java.lang.reflect.Constructor.newInstance(Constructor.java:513)
at com.opensymphony.xwork2.inject.ContainerImpl$ConstructorInjector.construct(ContainerImpl.java:419)
at com.opensymphony.xwork2.inject.ContainerImpl.inject(ContainerImpl.java:492)
at com.opensymphony.xwork2.inject.ContainerImpl$7.call(ContainerImpl.java:532)
at com.opensymphony.xwork2.inject.ContainerImpl.callInContext(ContainerImpl.java:581)
at com.opensymphony.xwork2.inject.ContainerImpl.inject(ContainerImpl.java:530)
at com.opensymphony.xwork2.config.impl.LocatableFactory.create(LocatableFactory.java:32)
at com.opensymphony.xwork2.inject.ContainerBuilder$4.create(ContainerBuilder.java:130)
at com.opensymphony.xwork2.inject.Scope$2$1.create(Scope.java:51)
at com.opensymphony.xwork2.inject.ContainerImpl$ParameterInjector.inject(ContainerImpl.java:462)
at com.opensymphony.xwork2.inject.ContainerImpl.getParameters(ContainerImpl.java:477)
at com.opensymphony.xwork2.inject.ContainerImpl.access$000(ContainerImpl.java:34)
at com.opensymphony.xwork2.inject.ContainerImpl$MethodInjector.inject(ContainerImpl.java:293)
at com.opensymphony.xwork2.inject.ContainerImpl$ConstructorInjector.construct(ContainerImpl.java:431)
at com.opensymphony.xwork2.inject.ContainerImpl.inject(ContainerImpl.java:492)
at com.opensymphony.xwork2.inject.ContainerImpl$7.call(ContainerImpl.java:532)
at com.opensymphony.xwork2.inject.ContainerImpl.callInContext(ContainerImpl.java:581)
at com.opensymphony.xwork2.inject.ContainerImpl.inject(ContainerImpl.java:530)
at com.opensymphony.xwork2.config.impl.LocatableFactory.create(LocatableFactory.java:32)
at com.opensymphony.xwork2.inject.ContainerBuilder$4.create(ContainerBuilder.java:130)
at com.opensymphony.xwork2.inject.Scope$2$1.create(Scope.java:51)
at com.opensymphony.xwork2.inject.ContainerImpl$ParameterInjector.inject(ContainerImpl.java:462)
at com.opensymphony.xwork2.inject.ContainerImpl.getParameters(ContainerImpl.java:477)
at com.opensymphony.xwork2.inject.ContainerImpl.access$000(ContainerImpl.java:34)
at com.opensymphony.xwork2.inject.ContainerImpl$MethodInjector.inject(ContainerImpl.java:293)
at com.opensymphony.xwork2.inject.ContainerImpl.inject(ContainerImpl.java:485)
at com.opensymphony.xwork2.inject.ContainerImpl$6.call(ContainerImpl.java:523)
at com.opensymphony.xwork2.inject.ContainerImpl$6.call(ContainerImpl.java:521)
at com.opensymphony.xwork2.inject.ContainerImpl.callInContext(ContainerImpl.java:574)
at com.opensymphony.xwork2.inject.ContainerImpl.inject(ContainerImpl.java:521)
at com.opensymphony.xwork2.ObjectFactory.injectInternalBeans(ObjectFactory.java:127)
at com.opensymphony.xwork2.spring.SpringObjectFactory.autoWireBean(SpringObjectFactory.java:187)
at com.opensymphony.xwork2.spring.SpringObjectFactory.buildBean(SpringObjectFactory.java:162)
at com.opensymphony.xwork2.spring.SpringObjectFactory.buildBean(SpringObjectFactory.java:133)
at com.opensymphony.xwork2.ObjectFactory.buildBean(ObjectFactory.java:139)
at com.opensymphony.xwork2.ObjectFactory.buildInterceptor(ObjectFactory.java:180)
at com.opensymphony.xwork2.config.providers.InterceptorBuilder.constructInterceptorReference(InterceptorBuilder.java:59)
at com.opensymphony.xwork2.config.providers.XmlConfigurationProvider.lookupInterceptorReference(XmlConfigurationProvider.java:987)
at com.opensymphony.xwork2.config.providers.XmlConfigurationProvider.loadInterceptorStack(XmlConfigurationProvider.java:806)
at com.opensymphony.xwork2.config.providers.XmlConfigurationProvider.loadInterceptorStacks(XmlConfigurationProvider.java:819)
at com.opensymphony.xwork2.config.providers.XmlConfigurationProvider.loadInterceptors(XmlConfigurationProvider.java:842)
at com.opensymphony.xwork2.config.providers.XmlConfigurationProvider.addPackage(XmlConfigurationProvider.java:449)
at com.opensymphony.xwork2.config.providers.XmlConfigurationProvider.loadPackages(XmlConfigurationProvider.java:264)
at org.apache.struts2.config.StrutsXmlConfigurationProvider.loadPackages(StrutsXmlConfigurationProvider.java:111)
at com.opensymphony.xwork2.config.impl.DefaultConfiguration.reloadContainer(DefaultConfiguration.java:193)
at com.opensymphony.xwork2.config.ConfigurationManager.getConfiguration(ConfigurationManager.java:55)
at org.apache.struts2.dispatcher.Dispatcher.init_PreloadConfiguration(Dispatcher.java:374)
at org.apache.struts2.dispatcher.Dispatcher.init(Dispatcher.java:418)
at org.apache.struts2.dispatcher.FilterDispatcher.init(FilterDispatcher.java:190)
at org.apache.catalina.core.ApplicationFilterConfig.getFilter(ApplicationFilterConfig.java:295)
at org.apache.catalina.core.ApplicationFilterConfig.setFilterDef(ApplicationFilterConfig.java:422)
at org.apache.catalina.core.ApplicationFilterConfig.<init>(ApplicationFilterConfig.java:115)
at org.apache.catalina.core.StandardContext.filterStart(StandardContext.java:4001)
at org.apache.catalina.core.StandardContext.start(StandardContext.java:4651)
at org.apache.catalina.core.ContainerBase.addChildInternal(ContainerBase.java:791)
at org.apache.catalina.core.ContainerBase.addChild(ContainerBase.java:771)
at org.apache.catalina.core.StandardHost.addChild(StandardHost.java:546)
at org.apache.catalina.startup.HostConfig.deployWAR(HostConfig.java:905)
at org.apache.catalina.startup.HostConfig.deployWARs(HostConfig.java:740)
at org.apache.catalina.startup.HostConfig.deployApps(HostConfig.java:500)
at org.apache.catalina.startup.HostConfig.start(HostConfig.java:1277)
at org.apache.catalina.startup.HostConfig.lifecycleEvent(HostConfig.java:321)
at org.apache.catalina.util.LifecycleSupport.fireLifecycleEvent(LifecycleSupport.java:119)
at org.apache.catalina.core.ContainerBase.start(ContainerBase.java:1053)
at org.apache.catalina.core.StandardHost.start(StandardHost.java:785)
at org.apache.catalina.core.ContainerBase.start(ContainerBase.java:1045)
at org.apache.catalina.core.StandardEngine.start(StandardEngine.java:445)
at org.apache.catalina.core.StandardService.start(StandardService.java:519)
at org.apache.catalina.core.StandardServer.start(StandardServer.java:710)
at org.apache.catalina.startup.Catalina.start(Catalina.java:581)
at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:39)
at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:25)
at java.lang.reflect.Method.invoke(Method.java:597)
at org.apache.catalina.startup.Bootstrap.start(Bootstrap.java:289)
at org.apache.catalina.startup.Bootstrap.main(Bootstrap.java:414)
[2011-05-19 22:37:43,354] com.opensymphony.xwork2.config.providers.InterceptorBuilder ERROR (CommonsLogger.java:error:27):
Actual exception
Caught Exception while registering Interceptor class org.apache.struts2.interceptor.validation.AnnotationValidationInterceptor - interceptor - jar:file:/home/mr/Apps/apache-tomcat-6.0.29/webapps/slmp/WEB-INF/lib/struts2-core-2.2.3.jar!/struts-default.xml:149:127
at com.opensymphony.xwork2.ObjectFactory.buildInterceptor(ObjectFactory.java:202)
at com.opensymphony.xwork2.config.providers.InterceptorBuilder.constructInterceptorReference(InterceptorBuilder.java:59)
at com.opensymphony.xwork2.config.providers.XmlConfigurationProvider.lookupInterceptorReference(XmlConfigurationProvider.java:987)
at com.opensymphony.xwork2.config.providers.XmlConfigurationProvider.loadInterceptorStack(XmlConfigurationProvider.java:806)
at com.opensymphony.xwork2.config.providers.XmlConfigurationProvider.loadInterceptorStacks(XmlConfigurationProvider.java:819)
at com.opensymphony.xwork2.config.providers.XmlConfigurationProvider.loadInterceptors(XmlConfigurationProvider.java:842)
at com.opensymphony.xwork2.config.providers.XmlConfigurationProvider.addPackage(XmlConfigurationProvider.java:449)
at com.opensymphony.xwork2.config.providers.XmlConfigurationProvider.loadPackages(XmlConfigurationProvider.java:264)
at org.apache.struts2.config.StrutsXmlConfigurationProvider.loadPackages(StrutsXmlConfigurationProvider.java:111)
at com.opensymphony.xwork2.config.impl.DefaultConfiguration.reloadContainer(DefaultConfiguration.java:193)
at com.opensymphony.xwork2.config.ConfigurationManager.getConfiguration(ConfigurationManager.java:55)
at org.apache.struts2.dispatcher.Dispatcher.init_PreloadConfiguration(Dispatcher.java:374)
at org.apache.struts2.dispatcher.Dispatcher.init(Dispatcher.java:418)
at org.apache.struts2.dispatcher.FilterDispatcher.init(FilterDispatcher.java:190)
at org.apache.catalina.core.ApplicationFilterConfig.getFilter(ApplicationFilterConfig.java:295)
at org.apache.catalina.core.ApplicationFilterConfig.setFilterDef(ApplicationFilterConfig.java:422)
at org.apache.catalina.core.ApplicationFilterConfig.<init>(ApplicationFilterConfig.java:115)
at org.apache.catalina.core.StandardContext.filterStart(StandardContext.java:4001)
at org.apache.catalina.core.StandardContext.start(StandardContext.java:4651)
at org.apache.catalina.core.ContainerBase.addChildInternal(ContainerBase.java:791)
at org.apache.catalina.core.ContainerBase.addChild(ContainerBase.java:771)
at org.apache.catalina.core.StandardHost.addChild(StandardHost.java:546)
at org.apache.catalina.startup.HostConfig.deployWAR(HostConfig.java:905)
at org.apache.catalina.startup.HostConfig.deployWARs(HostConfig.java:740)
at org.apache.catalina.startup.HostConfig.deployApps(HostConfig.java:500)
at org.apache.catalina.startup.HostConfig.start(HostConfig.java:1277)
at org.apache.catalina.startup.HostConfig.lifecycleEvent(HostConfig.java:321)
at org.apache.catalina.util.LifecycleSupport.fireLifecycleEvent(LifecycleSupport.java:119)
at org.apache.catalina.core.ContainerBase.start(ContainerBase.java:1053)
at org.apache.catalina.core.StandardHost.start(StandardHost.java:785)
at org.apache.catalina.core.ContainerBase.start(ContainerBase.java:1045)
at org.apache.catalina.core.StandardEngine.start(StandardEngine.java:445)
at org.apache.catalina.core.StandardService.start(StandardService.java:519)
at org.apache.catalina.core.StandardServer.start(StandardServer.java:710)
at org.apache.catalina.startup.Catalina.start(Catalina.java:581)
at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:39)
at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:25)
at java.lang.reflect.Method.invoke(Method.java:597)
at org.apache.catalina.startup.Bootstrap.start(Bootstrap.java:289)
at org.apache.catalina.startup.Bootstrap.main(Bootstrap.java:414)
Caused by: java.lang.RuntimeException: java.lang.RuntimeException: java.lang.RuntimeException: java.lang.RuntimeException: java.lang.RuntimeException: java.lang.RuntimeException: java.lang.RuntimeException: java.lang.reflect.InvocationTargetException
at com.opensymphony.xwork2.inject.ContainerImpl$MethodInjector.inject(ContainerImpl.java:295)
at com.opensymphony.xwork2.inject.ContainerImpl.inject(ContainerImpl.java:485)
at com.opensymphony.xwork2.inject.ContainerImpl$6.call(ContainerImpl.java:523)
at com.opensymphony.xwork2.inject.ContainerImpl$6.call(ContainerImpl.java:521)
at com.opensymphony.xwork2.inject.ContainerImpl.callInContext(ContainerImpl.java:574)
at com.opensymphony.xwork2.inject.ContainerImpl.inject(ContainerImpl.java:521)
at com.opensymphony.xwork2.ObjectFactory.injectInternalBeans(ObjectFactory.java:127)
at com.opensymphony.xwork2.spring.SpringObjectFactory.autoWireBean(SpringObjectFactory.java:187)
at com.opensymphony.xwork2.spring.SpringObjectFactory.buildBean(SpringObjectFactory.java:162)
at com.opensymphony.xwork2.spring.SpringObjectFactory.buildBean(SpringObjectFactory.java:133)
at com.opensymphony.xwork2.ObjectFactory.buildBean(ObjectFactory.java:139)
at com.opensymphony.xwork2.ObjectFactory.buildInterceptor(ObjectFactory.java:180)
... 40 more
Caused by: java.lang.RuntimeException: java.lang.RuntimeException: java.lang.RuntimeException: java.lang.RuntimeException: java.lang.RuntimeException: java.lang.RuntimeException: java.lang.reflect.InvocationTargetException
at com.opensymphony.xwork2.inject.ContainerBuilder$4.create(ContainerBuilder.java:132)
at com.opensymphony.xwork2.inject.Scope$2$1.create(Scope.java:51)
at com.opensymphony.xwork2.inject.ContainerImpl$ParameterInjector.inject(ContainerImpl.java:462)
at com.opensymphony.xwork2.inject.ContainerImpl.getParameters(ContainerImpl.java:477)
at com.opensymphony.xwork2.inject.ContainerImpl.access$000(ContainerImpl.java:34)
at com.opensymphony.xwork2.inject.ContainerImpl$MethodInjector.inject(ContainerImpl.java:293)
... 51 more
Caused by: java.lang.RuntimeException: java.lang.RuntimeException: java.lang.RuntimeException: java.lang.RuntimeException: java.lang.RuntimeException: java.lang.reflect.InvocationTargetException
at com.opensymphony.xwork2.inject.ContainerImpl.inject(ContainerImpl.java:495)
at com.opensymphony.xwork2.inject.ContainerImpl$7.call(ContainerImpl.java:532)
at com.opensymphony.xwork2.inject.ContainerImpl.callInContext(ContainerImpl.java:581)
at com.opensymphony.xwork2.inject.ContainerImpl.inject(ContainerImpl.java:530)
at com.opensymphony.xwork2.config.impl.LocatableFactory.create(LocatableFactory.java:32)
at com.opensymphony.xwork2.inject.ContainerBuilder$4.create(ContainerBuilder.java:130)
... 56 more
Caused by: java.lang.RuntimeException: java.lang.RuntimeException: java.lang.RuntimeException: java.lang.RuntimeException: java.lang.reflect.InvocationTargetException
at com.opensymphony.xwork2.inject.ContainerImpl$MethodInjector.inject(ContainerImpl.java:295)
at com.opensymphony.xwork2.inject.ContainerImpl$ConstructorInjector.construct(ContainerImpl.java:431)
at com.opensymphony.xwork2.inject.ContainerImpl.inject(ContainerImpl.java:492)
... 61 more
Caused by: java.lang.RuntimeException: java.lang.RuntimeException: java.lang.RuntimeException: java.lang.reflect.InvocationTargetException
at com.opensymphony.xwork2.inject.ContainerBuilder$4.create(ContainerBuilder.java:132)
at com.opensymphony.xwork2.inject.Scope$2$1.create(Scope.java:51)
at com.opensymphony.xwork2.inject.ContainerImpl$ParameterInjector.inject(ContainerImpl.java:462)
at com.opensymphony.xwork2.inject.ContainerImpl.getParameters(ContainerImpl.java:477)
at com.opensymphony.xwork2.inject.ContainerImpl.access$000(ContainerImpl.java:34)
at com.opensymphony.xwork2.inject.ContainerImpl$MethodInjector.inject(ContainerImpl.java:293)
... 63 more
Caused by: java.lang.RuntimeException: java.lang.RuntimeException: java.lang.reflect.InvocationTargetException
at com.opensymphony.xwork2.inject.ContainerImpl.inject(ContainerImpl.java:495)
at com.opensymphony.xwork2.inject.ContainerImpl$7.call(ContainerImpl.java:532)
at com.opensymphony.xwork2.inject.ContainerImpl.callInContext(ContainerImpl.java:581)
at com.opensymphony.xwork2.inject.ContainerImpl.inject(ContainerImpl.java:530)
at com.opensymphony.xwork2.config.impl.LocatableFactory.create(LocatableFactory.java:32)
at com.opensymphony.xwork2.inject.ContainerBuilder$4.create(ContainerBuilder.java:130)
... 68 more
Caused by: java.lang.RuntimeException: java.lang.reflect.InvocationTargetException
at com.opensymphony.xwork2.inject.ContainerImpl$ConstructorInjector.construct(ContainerImpl.java:440)
at com.opensymphony.xwork2.inject.ContainerImpl.inject(ContainerImpl.java:492)
... 73 more
Caused by: java.lang.reflect.InvocationTargetException
at sun.reflect.NativeConstructorAccessorImpl.newInstance0(Native Method)
at sun.reflect.NativeConstructorAccessorImpl.newInstance(NativeConstructorAccessorImpl.java:39)
at sun.reflect.DelegatingConstructorAccessorImpl.newInstance(DelegatingConstructorAccessorImpl.java:27)
at java.lang.reflect.Constructor.newInstance(Constructor.java:513)
at com.opensymphony.xwork2.inject.ContainerImpl$ConstructorInjector.construct(ContainerImpl.java:419)
... 74 more
Caused by: Document is invalid: no grammar found. - file:///home/mr/Apps/apache-tomcat-6.0.29/bin/validators.xml:1:12
at com.opensymphony.xwork2.util.DomHelper.parse(DomHelper.java:115)
at com.opensymphony.xwork2.validator.DefaultValidatorFileParser.parseValidatorDefinitions(DefaultValidatorFileParser.java:105)
at com.opensymphony.xwork2.validator.DefaultValidatorFactory.retrieveValidatorConfiguration(DefaultValidatorFactory.java:184)
at com.opensymphony.xwork2.validator.DefaultValidatorFactory.parseValidators(DefaultValidatorFactory.java:173)
at com.opensymphony.xwork2.validator.DefaultValidatorFactory.<init>(DefaultValidatorFactory.java:44)
... 79 more
Caused by: org.xml.sax.SAXParseException: Document is invalid: no grammar found.
at com.sun.org.apache.xerces.internal.util.ErrorHandlerWrapper.createSAXParseException(ErrorHandlerWrapper.java:195)
at com.sun.org.apache.xerces.internal.util.ErrorHandlerWrapper.error(ErrorHandlerWrapper.java:131)
at com.sun.org.apache.xerces.internal.impl.XMLErrorReporter.reportError(XMLErrorReporter.java:384)
at com.sun.org.apache.xerces.internal.impl.XMLErrorReporter.reportError(XMLErrorReporter.java:318)
at com.sun.org.apache.xerces.internal.impl.XMLNSDocumentScannerImpl.scanStartElement(XMLNSDocumentScannerImpl.java:250)
at com.sun.org.apache.xerces.internal.impl.XMLNSDocumentScannerImpl$NSContentDriver.scanRootElementHook(XMLNSDocumentScannerImpl.java:626)
at com.sun.org.apache.xerces.internal.impl.XMLDocumentFragmentScannerImpl$FragmentContentDriver.next(XMLDocumentFragmentScannerImpl.java:3103)
at com.sun.org.apache.xerces.internal.impl.XMLDocumentScannerImpl$PrologDriver.next(XMLDocumentScannerImpl.java:922)
at com.sun.org.apache.xerces.internal.impl.XMLDocumentScannerImpl.next(XMLDocumentScannerImpl.java:648)
at com.sun.org.apache.xerces.internal.impl.XMLNSDocumentScannerImpl.next(XMLNSDocumentScannerImpl.java:140)
at com.sun.org.apache.xerces.internal.impl.XMLDocumentFragmentScannerImpl.scanDocument(XMLDocumentFragmentScannerImpl.java:511)
at com.sun.org.apache.xerces.internal.parsers.XML11Configuration.parse(XML11Configuration.java:808)
at com.sun.org.apache.xerces.internal.parsers.XML11Configuration.parse(XML11Configuration.java:737)
at com.sun.org.apache.xerces.internal.parsers.XMLParser.parse(XMLParser.java:119)
at com.sun.org.apache.xerces.internal.parsers.AbstractSAXParser.parse(AbstractSAXParser.java:1205)
at com.sun.org.apache.xerces.internal.jaxp.SAXParserImpl$JAXPSAXParser.parse(SAXParserImpl.java:522)
at javax.xml.parsers.SAXParser.parse(SAXParser.java:395)
at com.opensymphony.xwork2.util.DomHelper.parse(DomHelper.java:113)
OpenSymphony has since seen its final days, so all XML files have to point their DTDs at struts.apache.org.
The XWork-related DTDs have been uploaded to the Struts site: http://struts.apache.org/dtds/
For example, to fix your validation XML files:
Old validation DTD:
<!DOCTYPE validators PUBLIC
"-//OpenSymphony Group//XWork Validator 1.0.2//EN"
"http://www.opensymphony.com/xwork/xwork-validator-1.0.2.dtd">
Replace it with the new validation DTD:
<!DOCTYPE validators PUBLIC
"-//Apache Struts//XWork Validator 1.0.2//EN"
"http://struts.apache.org/dtds/xwork-validator-1.0.2.dtd">
Try this and see if it helps.
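For illustration, a complete field-validator file with the new DTD might look like the sketch below (the file, action, and field names are hypothetical, not taken from the question):
<?xml version="1.0" encoding="UTF-8"?>
<!-- MyAction-validation.xml (hypothetical) pointing at the Apache Struts DTD -->
<!DOCTYPE validators PUBLIC
        "-//Apache Struts//XWork Validator 1.0.2//EN"
        "http://struts.apache.org/dtds/xwork-validator-1.0.2.dtd">
<validators>
    <field name="userName">
        <!-- built-in validator: rejects missing or empty strings for this field -->
        <field-validator type="requiredstring">
            <message>User name is required.</message>
        </field-validator>
    </field>
</validators>
The stack trace also points at file:///home/mr/Apps/apache-tomcat-6.0.29/bin/validators.xml, so that file likely needs a valid DOCTYPE as well.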

Resources