TikaApp JAR Classes - apache-tika

I'm using Apache Tika 1.4 to extract content from my documents. But it also comes with org.bouncycastle.* classes, and I use another version of BouncyCastle which conflicts with the Tika packages.
If Tika were pulling in the BouncyCastle (bcprov) jar as a dependency, I could exclude it using Maven's exclusion tag, but tika-app has the org.bouncycastle classes copied into it, so I cannot exclude them.
Is there some way to remove this package without recompiling or branching Apache Tika, and to point it at another JAR for that package, or something like that?
Thanks

Your problem is that you're using completely the wrong packaging of Tika!
The tika-app jar is a standalone, runnable jar, containing all of the Tika code + all dependencies required to let it run. It's intended to be used from the command line, standalone, to allow non-Java users to call Tika, and to allow for easy testing.
If you're writing your own Java application, which it sounds like you are, you will want to depend on the tika-core artifact as a minimum. That contains all the interfaces, the mime detection, service loaders, etc. You'll then almost certainly also want to depend on tika-parsers, which provides all the code to do the actual parsing of the file formats, along with pulling in their required dependencies. This gives you the full control you seem to want.
Finally, there's also an OSGi bundle available, for those who prefer the control and classloading that OSGi offers, that's in the tika-bundle artifact. There's also a CXF powered JAX-RS version, which offers Tika's services over a RESTful interface, that comes in the tika-server artifact.
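For example, once tika-core and tika-parsers are on the classpath instead of tika-app, extracting text looks roughly like this (a minimal sketch against the Tika 1.x API; the file name is just a placeholder):

import java.io.FileInputStream;
import java.io.InputStream;
import org.apache.tika.metadata.Metadata;
import org.apache.tika.parser.AutoDetectParser;
import org.apache.tika.parser.ParseContext;
import org.apache.tika.sax.BodyContentHandler;

public class TikaExtractExample {
    public static void main(String[] args) throws Exception {
        AutoDetectParser parser = new AutoDetectParser();        // comes from tika-parsers
        BodyContentHandler handler = new BodyContentHandler(-1); // -1 = no write limit
        Metadata metadata = new Metadata();
        InputStream stream = new FileInputStream("example.pdf"); // placeholder file name
        try {
            parser.parse(stream, handler, metadata, new ParseContext());
        } finally {
            stream.close();
        }
        System.out.println(handler.toString());
    }
}

With this setup BouncyCastle arrives as an ordinary transitive dependency rather than as classes copied into the jar, so a Maven exclusion on it behaves as usual.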

Related

custom codec not found in plugin

I have a grails application with multiple internally developed plugins. Since upgrading from 4.x to 5.2.3, codecs are not found in one plugin, but are found in others. Specifically, I can place the same file (UsernameListCodec.groovy, package name changed from one plugin to the next but otherwise no changes) in grails-app/utils in one plugin and it works; when placed in grails-app/utils in another plugin it fails with MissingMethodException.
What could cause this? The plugins are fairly different in terms of what they provide, but very similar in terms of how they're built, published, etc. Clearly this is something I'm doing wrong (since the codec works in another plugin) but I don't even know where to begin looking. Does a plugin need to do something in particular to be able to provide custom codecs as of grails 5?

ImageJ: How to use third-party plugins API?

In Eclipse, I'm using the already packed ij.jar instead of the source code. I added the ij.jar file as an external jar in Eclipse. Every plugin shipped in the original ij.jar works fine after importing classes from ij.
Currently, I'm trying to use functions in the third-party plugin StackReg. Does anyone know how I can import the classes inside StackReg? I've tried to add StackReg_.jar as an external jar. However, this does not work.
From a quick look at the source of the StackReg plugin, I see that the classes are in the default package. In Java, importing classes from the default package into a named package is not possible without using reflection.
Possible solutions are:
Put your classes in the default package. Then you can use the classes in the default package without importing them. Note that using the default package is bad practice in Java.
Use reflection: https://stackoverflow.com/a/561183/1903534 (a short sketch follows after this list).
Alter the StackReg plugin to not use the default package. But this might not be compatible with its license and your solution will not be compatible with the original jar.
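For the reflection route, here is a rough sketch, assuming the plugin class is named StackReg_ and follows the usual ImageJ convention of a public no-arg constructor and a run(String) method (check the actual names inside the jar):

import java.lang.reflect.Method;

public class StackRegCaller {
    public static void main(String[] args) throws Exception {
        // Classes in the default package cannot be imported, but they can be loaded by name.
        Class<?> stackRegClass = Class.forName("StackReg_");            // assumed class name
        Object plugin = stackRegClass.getDeclaredConstructor().newInstance();

        // Assumed entry point: ImageJ plugins conventionally expose run(String arg).
        Method run = stackRegClass.getMethod("run", String.class);
        run.invoke(plugin, "");
    }
}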

Codenameone with WSDL NB - ClassDefNotFound

I have used NB to add a "client web service" to a Codename One app through the NB interface. This works fine in the simulator.
The WSDL classes are generated during build automatically and I have them landing in com.myco.myapp.generated package.
Having checked the generated JAR the WSDL classes are there all ok.
But when I push this through "build for Android" to Codename One and run it on the device, I get
An Internal application error occurred : java.lang.NoClassDefFoundError: com.myco.myapp.generated.SimpleStockList_Service
But the class is definitely there in the JAR.
I am sure it's something to do with the JAR and its manifest, but I have never really had to get behind the scenes with Ant, JARs and builds to know what to do.
As the classes are generated during the Ant build, I cannot pack them up into a library (I tried that and it fails due to two instances of the same class).
Codename One doesn't support binary libraries at this time, you will need to integrate the source code into the build process. There are many complexities involved in supporting binary libraries in such a setup.
Thanks to Shai for his help.
The ultimate answer is not to use WSDL, as moving objects relies on serialization, which is not included in the small Java subset Codename One uses.
Because of this I created a custom servlet which Codename One's ConnectionRequest can talk to via a standard HTTP request.
This is how I achieved it:
http://www.jamesarbrown.com/?p=164
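As a rough sketch of that approach, using Codename One's ConnectionRequest and NetworkManager classes (the servlet URL and parameter below are made up):

import java.io.IOException;
import java.io.InputStream;
import com.codename1.io.ConnectionRequest;
import com.codename1.io.NetworkManager;

public class StockListClient {
    public void fetchStockList() {
        ConnectionRequest req = new ConnectionRequest() {
            @Override
            protected void readResponse(InputStream input) throws IOException {
                // Parse the servlet's plain HTTP response here (e.g. JSON or a
                // simple delimited format) instead of relying on WSDL/serialization.
            }
        };
        req.setUrl("http://example.com/myapp/stocklist"); // hypothetical servlet URL
        req.setPost(false);
        req.addArgument("format", "json");                // hypothetical parameter
        NetworkManager.getInstance().addToQueueAndWait(req);
    }
}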

How should I maintain JDK7 projects so that they can automatically be downgraded for JDK6?

I have a few APIs of my own with around 2000 classes overall. Some of them use the new Path API from JDK7. Most other classes, however, do not rely on any new JDK APIs or new language features, so most classes could be used in a JDK6 environment (which I plan to do). Let's assume I've annotated all JDK7-only classes with #Java7Only.
What I need now, is a way to create a JDK6-only subset of all my projects more-or-less automatically, without introducing new version branching or product lines (would be too complicated to maintain).
All projects are created using Netbeans, thus using Ant. Many projects depend on others.
Please help me evaluate which of these ideas is most appropriate for my problem. Which problems could occur with each idea?
Common first step for all ideas
Let an annotation processor search for #Java7Only-annotated classes and store the list to a properties file.
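A minimal sketch of such a processor, assuming the annotation's fully qualified name is com.example.Java7Only (adjust to the real package) and that a plain list of class names in the output file is good enough:

import java.io.IOException;
import java.io.Writer;
import java.util.Set;
import java.util.TreeSet;
import javax.annotation.processing.AbstractProcessor;
import javax.annotation.processing.RoundEnvironment;
import javax.annotation.processing.SupportedAnnotationTypes;
import javax.annotation.processing.SupportedSourceVersion;
import javax.lang.model.SourceVersion;
import javax.lang.model.element.Element;
import javax.lang.model.element.TypeElement;
import javax.tools.Diagnostic;
import javax.tools.StandardLocation;

@SupportedAnnotationTypes("com.example.Java7Only")   // assumed fully qualified annotation name
@SupportedSourceVersion(SourceVersion.RELEASE_7)
public class Java7OnlyListProcessor extends AbstractProcessor {

    private final Set<String> java7OnlyClasses = new TreeSet<String>();

    @Override
    public boolean process(Set<? extends TypeElement> annotations, RoundEnvironment roundEnv) {
        // Collect the fully qualified names of all annotated classes.
        for (TypeElement annotation : annotations) {
            for (Element element : roundEnv.getElementsAnnotatedWith(annotation)) {
                java7OnlyClasses.add(element.toString());
            }
        }
        // On the last round, write the collected names to the properties file.
        if (roundEnv.processingOver()) {
            try {
                Writer w = processingEnv.getFiler()
                        .createResource(StandardLocation.CLASS_OUTPUT, "", "java7only.properties")
                        .openWriter();
                try {
                    for (String name : java7OnlyClasses) {
                        w.write(name + "\n");
                    }
                } finally {
                    w.close();
                }
            } catch (IOException ex) {
                processingEnv.getMessager().printMessage(Diagnostic.Kind.ERROR, ex.getMessage());
            }
        }
        return false; // do not claim the annotation, other processors may still see it
    }
}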
Idea 1 (specific)
Write a tool which would use the properties file to recursively copy the whole project, except JDK7-only files.
Build the copied project using JDK6 by invoking ant, thus getting a JDK6-compliant jar.
Idea 2 (specific)
Write a second annotation processor which would use the properties file to pass everything except JDK7-only files to a JavaCompiler instance.
Either build a jar using Java APIs or use Ant API for that.
(This would be a Java-only idea, but probably too complicated)
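For reference, driving the compiler from Java code would go through the standard javax.tools.JavaCompiler API, roughly like this (the source file and output directory are placeholders):

import java.io.File;
import java.util.Arrays;
import java.util.List;
import javax.tools.JavaCompiler;
import javax.tools.JavaFileObject;
import javax.tools.StandardJavaFileManager;
import javax.tools.ToolProvider;

public class Jdk6SubsetCompiler {
    public static void main(String[] args) throws Exception {
        JavaCompiler compiler = ToolProvider.getSystemJavaCompiler();
        StandardJavaFileManager fileManager = compiler.getStandardFileManager(null, null, null);
        try {
            // In the real tool this would be "all sources minus the classes listed
            // in the properties file"; the single file here is just a placeholder.
            List<File> sources = Arrays.asList(new File("src/com/example/SomeClass.java"));
            Iterable<? extends JavaFileObject> units =
                    fileManager.getJavaFileObjectsFromFiles(sources);
            List<String> options = Arrays.asList(
                    "-source", "1.6", "-target", "1.6", "-d", "build/classes-jdk6");
            boolean ok = compiler.getTask(null, fileManager, null, options, null, units).call();
            System.out.println("Compilation " + (ok ? "succeeded" : "failed"));
        } finally {
            fileManager.close();
        }
    }
}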
Idea X (abstract)
Somehow influence the Ant build process (by overwriting some targets?) and, for each JDK6-compliant class, let Ant compile two versions of it (once with the JDK6 compiler, once with the JDK7 compiler).
(JDK7-only classes would be compiled only once, using the JDK7 compiler, of course)
Package each bunch to a separate jar.
Possible common problems to the ideas
Some projects depend on others, so some actions (such as packaging) should take this into account.
Remember: the JDK7 compiler generates class files that are not downward compatible, which is why every possible idea has to work at the source level (before or during the build process, not afterwards).
My thoughts on Idea 2:
Essentially this is invoking a compiler within a compiler. Annotation processors are run as part of compilation. Can this be done safely? Is there any static state in Sun's javac that would cause problems? (I don't know the answer, but from memory there might be some static state that could cause problems in this scenario.)
Idea 1 seems simpler and better to me.
But taking a step back, is it possible to separate out all the JDK 7 specific stuff into a separate module and compile it separately, into a different JAR?
Have the 'main' project compiled using JDK 6 (which JDK 7 would have no problems reading, because it is backwards compatible).
The JDK 7 specific module(s), with source in a different directory, which includes the 'main' JAR on the compilation classpath, could be built separately, with a different build.xml if necessary.
This only partially applies, but I thought I'd mention it anyway.
The problem with just using the -source 1.6 -target 1.6 options for validation is that you can still use the Java 7 API when compiling with JDK 7.
I've used the Animal Sniffer Maven Plugin for a few projects now and it has proved quite useful. This plugin scans the byte-code of your classes for JDK API usage. That is, you can tell it to fail the build if you attempt to use the JDK 7 API when you are targeting JDK 6. This won't help much for separating out classes as you need, but it could be useful as a final validation step combined with the -source 1.6 -target 1.6 compiler options.
There is also an Animal Sniffer Ant plugin, as mentioned on the Animal Sniffer main page.

Java builder with proper dependency handling

After some recent juggling with our Ant scripts I've started to wonder if something better is possible.
I need a builder that will know to recompile all required .java files for me.
For example, for this structure:
public class A {}
public class B extends A {}
public class C {
    B b;
}
For Compile('C'): it will know to compile A, B and C.
If B changed, Compile('C') will know to recompile just B.
I know of several alternatives: Ivy, which seems like an extension of Ant (our current Java builder); SCons, which we currently use for building C++ code and which is excellent at the behaviour described above for C code; and there are reports of Maven being almost, but not quite, there.
What would you suggest? What tools, Free Software or commercial, are you using for your build system?
Thank you,
Maxim.
Ant, with the 'depend' task and with the 'closure' option turned on.
'make', from the IDEA IDE.
None of Ivy, SCons or Maven will help you with your problem as stated.
What do you mean by "for Compile('C')"? I don't think this is what you have in your ant file.
For this case, Ant should be working as desired: you have described its default behaviour. In the same javac element, Ant will only recompile changed classes. See the Ant manual entry for the javac task, especially the 'includeDestClasses' attribute.
You should probably post an example ant file that you are finding inadequate.
Maven, both for my personal and my commercial products.
In your question you describe inter-class dependencies. Most build systems, in particular Maven, are aimed more at inter-project dependencies. I believe most systems just recompile all the classes in a project, and most of the benefit of these build systems is in building as few projects as possible.
Both Maven and Ivy will allow you to easily specify both external and internal dependencies of your project, including which version of the project you depend on. They will both also automatically download external libraries (such as apache commons) to your local machine as part of the build process if they are not already locally cached, saving a lot of work manually downloading and organizing third party jar files.
Ivy is an extension of Ant, as you mention. I recommend Maven: it is a convention-oriented build system that I've used successfully and feel is quite mature. Maven requires far less up-front effort to start using and is quite extensible.
