Meaning of bean-discovery-mode="annotated" in CDI 1.1 - dependency-injection

I am migrating an application to Java EE 7 and would like to use CDI 1.1. But I don't get the meaning of bean-discovery-mode="annotated". The CDI 1.1 specification is not very helpful, or at least I have not found a useful paragraph. Did I miss it?
This example runs perfectly with bean-discovery-mode="all" and injects an instance of LoggingClass:
public class LoggingClass {
    public Logger logger = Logger.getLogger("ALOGGER");
}

@Test
public class MMLoggerProducerIT extends Arquillian {
    @Inject private LoggingClass lc;
}
But if I change from bean-discovery-mode="all" to bean-discovery-mode="annotated" the container is not able to inject an instance into the field lc.
How do I have to annotate LoggingClass to use bean-discovery-mode="annotated" correctly?

When using bean-discovery-mode="annotated", only classes with a bean defining annotation are discovered; all other classes are ignored. Any scope type is a bean defining annotation: "If a scope type is declared on a bean class, then the bean class is said to have a bean defining annotation" [spec]. The 1.1 spec is not completely clear here. Only classes with a @NormalScope scope or the @Dependent pseudo-scope are discovered; @javax.inject.Singleton and all other @Scope (pseudo) scopes are ignored.
Note that the definition of a "bean defining annotation" changed in CDI 1.2 and is now very well defined:
The set of bean defining annotations contains:
@ApplicationScoped, @SessionScoped, @ConversationScoped and @RequestScoped annotations,
all other normal scope types,
@Interceptor and @Decorator annotations,
all stereotype annotations (i.e. annotations annotated with @Stereotype),
and the @Dependent scope annotation.
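Applied to the question's example, a minimal sketch of a class that is discovered under bean-discovery-mode="annotated" (the choice of @Dependent is just one option; any normal scope such as @SessionScoped works as well):
import java.util.logging.Logger;
import javax.enterprise.context.Dependent;

@Dependent // any scope type counts; @javax.inject.Singleton would NOT count under CDI 1.1
public class LoggingClass {
    public Logger logger = Logger.getLogger("ALOGGER");
}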

As a practical matter, bean-discovery-mode="all" turns on scanning of all classes in an archive. This is called an "explicit bean archive".
Omitting beans.xml, or setting bean-discovery-mode="annotated", makes the archive an implicit bean archive. In this case, the container only scans for beans with annotated scope types.
This explains why LoggingClass isn't injected when you set bean-discovery-mode="annotated". As documented in the Java EE 7 Tutorial:
CDI can only manage and inject beans annotated with a scope type in an implicit archive.
Edit: so just to be absolutely clear, you need to add a scope type to LoggingClass. So something like this:
@SessionScoped
public class LoggingClass {
    public Logger logger = Logger.getLogger("ALOGGER");
}
In Java EE 7 and CDI 1.1, we removed the requirement to include the beans.xml deployment descriptor to turn on CDI for an archive, bringing CDI 1.1 in line with most other Java EE APIs where deployment descriptors are optional. It also removed the binary on/off nature of including beans.xml or not. You can control which files are scanned by the container with the bean-discovery-mode setting.
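For reference, a minimal sketch of a beans.xml selecting the discovery mode (the namespace and schema shown are the standard CDI 1.1 ones; swap "annotated" for "all" to get an explicit archive):
<?xml version="1.0" encoding="UTF-8"?>
<beans xmlns="http://xmlns.jcp.org/xml/ns/javaee"
       xmlns:xsi="http://www.w3.org/2001/XMLSchema-instance"
       xsi:schemaLocation="http://xmlns.jcp.org/xml/ns/javaee
                           http://xmlns.jcp.org/xml/ns/javaee/beans_1_1.xsd"
       version="1.1" bean-discovery-mode="annotated">
</beans>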
See the Java EE Tutorial on packaging CDI applications here:
http://docs.oracle.com/javaee/7/tutorial/cdi-adv001.htm#CACDCFDE

I also agree with the answer from @rmuller. But I want to point out that there is still a difference in behavior between the application servers Payara and Wildfly.
See the following example with a plain, unscoped class that has an @EJB injection point:
public class SomeClass {

    @EJB
    MyService myService;

    ...
}
If you provide a beans.xml file with:
.... version="1.2" bean-discovery-mode="annotated"....
Payara 4.1 will NOT treat the class SomeClass as a CDI bean and will NOT inject the service EJB.
To me it is clear that this behaves as stated in the specification.
But Wildfly 10 treats the class as a CDI bean and injects the service EJB, which is not expected. To get this working on Wildfly, the beans.xml file has to look like this:
.... version="1.2" bean-discovery-mode="all"....
It is remarkable that the two most common application servers differ in behavior here.
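A portable way around the difference is to give the class a bean defining annotation so that it is discovered under bean-discovery-mode="annotated" on both servers; a minimal sketch using the names from the example above:
import javax.ejb.EJB;
import javax.enterprise.context.RequestScoped;

@RequestScoped // bean defining annotation: SomeClass is now discovered on Payara and Wildfly alike
public class SomeClass {

    @EJB
    MyService myService;

    // ...
}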

Related

CDI inject specific interface implementation

Say I have an interface Config and two classes implementing this interface:
class Config1 : Config {
}
class Config2 : Config {
}
I create one of those implementations by deserializing JSON using Jackson and want to provide it as a CDI @ApplicationScoped bean in the common library:
@Produces
@ApplicationScoped
fun config(): Config {
    return mapper.readValue(json, Config::class.java)
}
Now I have two applications: App1 relies on Config1 and App2 on Config2. I'd like to achieve the behaviour that, for example, when starting App1, Config1 is injected, or an exception is thrown because there is no bean satisfying the constructor. Unfortunately, this doesn't work:
class App1(val config: Config1) { }
Is there any way to achieve it? Or do I have to write a service that performs the instance check?
You might need to re-think the approach. With what you have, you cannot even perform an instanceof check, because you will be getting a proxy, not the actual bean instance (since your producer is @ApplicationScoped).
In CDI, beans are defined by their types and their qualifiers. From your description I understand that you need to differentiate between the two configs anyway, so abstracting them into just the interface won't work. You could instead have two producer methods, one per config, each with a different qualifier. Each method would look something like this:
@Produces
@ApplicationScoped
@Config1Qualifier // some custom qualifier that you slap on it
fun config(): Config {
    return someFunctionThatResolvesIt()
}
The injection point for it would then look like this (using Java syntax):
@Inject
@Config1Qualifier
Config config;
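For completeness, a qualifier such as the @Config1Qualifier used above is just an annotation meta-annotated with @Qualifier; a minimal sketch in Java (the name mirrors the placeholder from this answer):
import static java.lang.annotation.ElementType.FIELD;
import static java.lang.annotation.ElementType.METHOD;
import static java.lang.annotation.ElementType.PARAMETER;
import static java.lang.annotation.ElementType.TYPE;
import static java.lang.annotation.RetentionPolicy.RUNTIME;

import java.lang.annotation.Retention;
import java.lang.annotation.Target;
import javax.inject.Qualifier;

@Qualifier
@Retention(RUNTIME)
@Target({TYPE, METHOD, FIELD, PARAMETER})
public @interface Config1Qualifier {
}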
The upside of this approach is that as long as your producer methods have normal scopes on them (such as @ApplicationScoped), an attempt to return null is automatically treated as an error - which looks like something you are trying to achieve anyway.
Approaching it from a completely different angle, you could take your current producer method and change @ApplicationScoped to @Dependent or @Singleton, neither of which uses client proxies, so instanceof checks would work. But mind that @Dependent in particular might behave differently from what you want (assuming you are using the application scope to keep a single instance).
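A sketch of that alternative in Java syntax, with the scope dropped to @Dependent so the injected object is the real instance rather than a client proxy (Config and the resolver name are the placeholders already used in this thread):
import javax.enterprise.context.Dependent;
import javax.enterprise.inject.Produces;

public class ConfigProducer {

    @Produces
    @Dependent // no client proxy: the injected Config is the actual Config1/Config2 instance, so instanceof works
    public Config config() {
        return someFunctionThatResolvesIt();
    }

    // stands in for the Jackson deserialization from the question
    private Config someFunctionThatResolvesIt() {
        throw new UnsupportedOperationException("resolve the concrete Config here");
    }
}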

Is @javax.annotation.ManagedBean a CDI bean defining annotation?

The Question
Given that the archive we deploy is an "implicit bean archive" (see below), using @javax.inject.Inject to inject a @javax.annotation.ManagedBean into another managed bean works in WildFly 8.1.0, but it won't work in GlassFish 4.0.1-b08 or GlassFish 4.1-b13. GlassFish crashes with this message:
WELD-001408: Unsatisfied dependencies for type...
Do I misunderstand the specifications outlined below, or does GlassFish have a bug?
CDI 1.1 Part 1
CDI 1.1 (JSR-346) section 12.1 "Bean Archives" says:
An explicit bean archive is an archive which contains a beans.xml file
[..]. An implicit bean archive is any other archive which contains one
or more bean classes with a bean defining annotation [..].
So if my archive has no beans.xml descriptor file, I will still be able to use beans that have a "bean defining annotation". The question is: what is a bean defining annotation?
The CDI specification section 2.5 "Bean defining annotations" says:
Any scope type is a bean defining annotation.
So that's pretty clear, and that's all there is to it according to this section of the CDI specification. If I deploy an archive with no beans.xml descriptor file in it, then I can @Inject beans as long as they have an explicitly declared scope, @javax.enterprise.context.RequestScoped for example. It works in both WildFly and GlassFish. However...
Managed Beans
The subset specification that all specifications within the Java EE technology stack must adhere to, Managed Beans (JSR-316), has a "base model" in which @javax.annotation.ManagedBean does define a managed bean. The Managed Beans specification doesn't say that @ManagedBean makes the bean a valid target for an injection point (i.e., field or parameter). The specification does say that the beans "can be used anywhere in a Java EE application" (section MB.1.2 "Why Managed Beans?"), which to my ears sounds like they should be injectable too.
Java EE 7 Umbrella specification
The Java EE 7 specification (JSR-342) has this to say in section EE.5.24 "Support for Dependency Injection":
Containers must support injection points annotated with the
javax.inject.Inject annotation only to the extent dictated by CDI. Per
the CDI specification, dependency injection is supported on managed
beans.
There are currently three ways for a class to become a managed bean:
Being an EJB session bean component.
Being annotated with the ManagedBean annotation.
Satisfying the conditions described in the CDI specification.
Classes that satisfy at least one of these conditions will be eligible
for full dependency injection support as described in the CDI
specification.
There you go: @ManagedBean has "full dependency injection support". Not half or just a little bit of support. Yet I'm not sure exactly what "dependency injection support" is. But I think the paragraph that follows describes it well enough:
Component classes listed in Table EE.5-1 that satisfy the third
condition above, but neither the first nor the second condition, can
also be used as CDI managed beans if they are annotated with a CDI
bean-defining annotation or contained in a bean archive for which CDI
is enabled. However, if they are used as CDI managed beans (e.g.,
injected into other managed classes), the instances that are managed
by CDI may not be the instances that are managed by the Java EE
container.
Basically, what this paragraph says is that beans meeting the second condition are CDI managed beans that may be injected into other managed classes (since the exception beans "can also" be used that way).
The umbrella specification and the Managed Beans specification have both made it somewhat clear that the CDI specification has the last word.
CDI 1.1 Part 2
The @ManagedBean annotation is mentioned only twice in the CDI specification, both times in chapter 11, which covers the life cycle CDI events that a CDI extension can observe. Section 11.5.7 is one of the hits and defines a ProcessInjectionPoint event. A managed bean may use dependency injection - no surprise there. However, section 11.5.8 defines a ProcessInjectionTarget event. Here's what the specification has to say about the ProcessInjectionTarget event:
The container must fire an event for every Java EE component class
supporting injection that may be instantiated by the container at
runtime, including every managed bean declared using @ManagedBean, EJB
session or message-driven bean, bean, interceptor or decorator.
This phrase says undoubtedly that a @ManagedBean may be used as the target of an injection point without adding any scope type (@Dependent is always the default).
As stated earlier, injecting a @ManagedBean from an implicit bean archive works in WildFly and, as far as I can tell, this is required by all the Java EE specifications just quoted. So I think it is GlassFish that has a bug. But the CDI spec never says a word about @ManagedBean in section 2.5 "Bean defining annotations", and as always, I'm a nervous wreck when reading through overlapping Java EE specifications, so I thought I should ask before I go and file a "critical" bug with the GlassFish team.
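To make the disputed scenario concrete, this is the kind of pairing the question is about, sketched with made-up class names and deployed without a beans.xml:
import javax.annotation.ManagedBean;
import javax.enterprise.context.RequestScoped;
import javax.inject.Inject;

@ManagedBean // not listed in CDI 1.1 section 2.5, hence the disagreement
public class CustomerService {
}

// (separate file)
@RequestScoped // bean defining annotation, so this client bean is always discovered
public class CustomerController {

    @Inject
    CustomerService service; // resolves in WildFly 8.1.0, fails with WELD-001408 on GlassFish 4.x
}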
EDIT 2014-08-22
Filed a GlassFish bug: https://java.net/jira/browse/GLASSFISH-21169.
This is not a complete answer, as confusion will inevitably arise when we try to put together and make sense of all the specifications. I just want to point out that CDI 1.2 has clarified what exactly a bean defining annotation is (see section "2.5.1. Bean defining annotations"). CDI 1.2 gives a list:
The set of bean defining annotations contains:
@ApplicationScoped, @SessionScoped, @ConversationScoped and @RequestScoped annotations,
all other normal scope types,
@Interceptor and @Decorator annotations,
all stereotype annotations (i.e. annotations annotated with @Stereotype),
and the @Dependent scope annotation.
It should be added that what makes something a "normal scope type" (second bullet point) is being a scope annotation that is itself annotated with @NormalScope.
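A sketch of such a custom normal scope, i.e. a scope annotation that is itself annotated with @NormalScope (the name is made up, and a real custom scope also needs a Context implementation registered by an extension):
import static java.lang.annotation.ElementType.FIELD;
import static java.lang.annotation.ElementType.METHOD;
import static java.lang.annotation.ElementType.TYPE;
import static java.lang.annotation.RetentionPolicy.RUNTIME;

import java.lang.annotation.Inherited;
import java.lang.annotation.Retention;
import java.lang.annotation.Target;
import javax.enterprise.context.NormalScope;

@NormalScope // this makes it a normal scope type and therefore a bean defining annotation
@Inherited
@Retention(RUNTIME)
@Target({TYPE, METHOD, FIELD})
public @interface BatchJobScoped {
}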

Is grailsApplication predefined?

I have some sample code where they used def grailsApplication, like this:
class ViewSourceController {
    def grailsApplication
    def controllerClass = grailsApplication.getArtefactByLogicalPropertyName(
            "Controller", controllerName)
}
Is grailsApplication a predefined bean? Will it search the application's directory for the required files? I want to know more about its usage.
grailsApplication is a Spring bean of type GrailsApplication that is created by the framework. According to the docs, GrailsApplication is:
the main interface representing a running Grails application. This interface's main purpose is to provide a mechanism for analysing the conventions within a Grails application as well as providing metadata and information about the execution environment.
Refer to the docs for more information about the methods provided by GrailsApplication.
GrailsApplication is a Grails interface. Its main purpose is to provide a mechanism for analysing the conventions within a Grails application, as well as providing metadata and information about the execution environment.
The GrailsApplication interface interacts with ArtefactHandler instances, which are capable of analysing different artefact types (controllers, domain classes etc.) and introspecting the artefact conventions.
Implementors of this interface should be aware that a GrailsApplication is only initialised when the initialise() method is called. In other words, GrailsApplication instances are lazily initialised by the Grails runtime.

Using CDI to inject a Data Access Object

Assuming I have a data access object that I've already written, I'd like to be able to use CDI to inject it into, say, a service class. Furthermore, I have two implementations of that DAO.
My understanding of CDI is that I'd have to annotate my DAO implementation class so that CDI would know which implementation to inject.
The problem is, the DAO is in a .jar file. By annotating it with CDI annotations, I'm using Java EE imports in a non-Java EE class.
For example, let's say I have the following class
public class BusinessService {
    @Inject @SomeMybatisQualifier AccountDAO accountDao;
    ...
}
The @Inject annotation comes from javax.inject.Inject. Now this service class is dependent on a Java EE environment.
Can someone please explain what I'm missing? How do I inject a non-annotated class into another non-annotated class? This is fairly simple with Spring.
I agree with LightGuard if there are enough classes. But for a couple of them, why not just produce them with @Produces?
Here's a decent example of implementing your own producer:
Dependency inject request parameter with CDI and JSF2
You should be able to write return new MyObject();, and you can add whatever qualifiers you want.
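A minimal sketch of that idea for the question's AccountDAO (the implementation class name is made up; the qualifier is the one from the question):
import javax.enterprise.context.ApplicationScoped;
import javax.enterprise.inject.Produces;

public class AccountDaoProducer {

    @Produces
    @ApplicationScoped
    @SomeMybatisQualifier // qualifier from the question
    public AccountDAO accountDao() {
        // the DAO lives in a plain jar and needs no CDI annotations of its own
        return new MybatisAccountDAO(); // hypothetical implementation class
    }
}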
Not sure what's unclear, but here's the gist of things: for CDI to scan a jar for beans, it must have a beans.xml. Otherwise it will not be scanned and its classes will not be available for injection. A String is not available either. Say you try to inject a String:
@Inject
String myString;
CDI will have no clue what to give you, just as with your jar. But I know what String I want (a request parameter) and I can let CDI know as well. How? I supply a qualifier, @RequestParam, on my producer (see the example again), and when I want to use it in client code I do it like this:
@Inject
@RequestParam
String myString;
You can do the same thing: have a producer that just creates a new instance of whatever you need and returns it. Now CDI will know how to dependency-inject that particular bean.
Now say you have 40 classes. Then it gets messy to produce them all, and you want to make sure the jar gets scanned instead. In that case you write your own little extension, observe when CDI is about to scan, and instruct it to register the additional types. Such an extension is probably easy to write, but I don't know the details because I have not written one like it.
By far, the easiest thing would be to create a CDI extension to add the classes in the jar (because there's no beans.xml in that jar so it won't be picked up by CDI) and add additional qualifiers to the metadata.
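A rough sketch of such an extension, assuming CDI 1.1+: it observes BeforeBeanDiscovery and registers the un-annotated class from the jar so the container treats it as a bean. The implementation class name is made up, and the extension also has to be listed in META-INF/services/javax.enterprise.inject.spi.Extension:
import javax.enterprise.event.Observes;
import javax.enterprise.inject.spi.BeanManager;
import javax.enterprise.inject.spi.BeforeBeanDiscovery;
import javax.enterprise.inject.spi.Extension;

public class DaoRegistrationExtension implements Extension {

    void registerDaos(@Observes BeforeBeanDiscovery bbd, BeanManager bm) {
        // make the DAO from the plain jar known to CDI; the id just has to be unique
        bbd.addAnnotatedType(bm.createAnnotatedType(MybatisAccountDAO.class),
                MybatisAccountDAO.class.getName());
    }
}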

EJB not serialized in Managed Bean

My application server is WebSphere Application Server V8. I have a session scoped managed bean into which I have injected an EJB (EJB 3.0) using the @EJB annotation. The EJB is stateless.
@ManagedBean
@SessionScoped
public class MyBean extends BaseBackingBean implements Serializable {

    @EJB
    private IDetails custInfo;
While analyzing the session data I noticed a NotSerializableException:
java.io.NotSerializableException: com.ejb.EJSLocal0SLDetailsImpl_081f812d at
java.io.ObjectOutputStream.writeObject0(ObjectOutputStream.java:1184) at
java.io.ObjectOutputStream.defaultWriteFields(ObjectOutputStream.java:1537) at
java.io.ObjectOutputStream.writeSerialData(ObjectOutputStream.java:1502) at
java.io.ObjectOutputStream.writeOrdinaryObject(ObjectOutputStream.java:1420) at
java.io.ObjectOutputStream.writeObject0(ObjectOutputStream.java:1178) at
Now, if I mark the EJB field as transient, it works fine without throwing the NotSerializableException:
@EJB
private transient IDetails custInfo;
Is this a correct implementation, or is there a better alternative?
I have referred to Should EJBs be instance variables and marked as transient in JSF Managed Beans?, where it is mentioned that marking the EJB as transient should not be required; so what can be wrong?
I implemented POC code on WAS V8 with both local and remote interfaces and noted the following:
a. With Local interface (EJB does not implement Serializable)
// Initializing the EJB in the servlet
SerializableTestEJBLocal localSrvlt = new SerializableTestEJB();
// Try to serialize
FileOutputStream objFOS = new FileOutputStream("D:\\MYTEST\\testsrv.txt");
ObjectOutputStream objOpStr = new ObjectOutputStream(objFOS);
objOpStr.writeObject(localSrvlt);
This resulted in java.io.NotSerializableException: com.ibm.test.SerializableTestEJB
at java.io.ObjectOutputStream.writeObject0(ObjectOutputStream.java:..
To prevent this, the EJB had to explicitly implement Serializable.
b. With Remote interface (EJB does not implement Serializable)
// Obtain the remote stub
SerializableTestEJBRemote seremoteSrvlt = (SerializableTestEJBRemote) PortableRemoteObject.narrow(homeObject, SerializableTestEJBRemote.class);
// Try serialization
FileOutputStream objFOS = new FileOutputStream("D:\\MYTEST\\testsrv.txt");
ObjectOutputStream objOpStr = new ObjectOutputStream(objFOS);
objOpStr.writeObject(seremoteSrvlt);
The serialization was successful.
Conclusion:
The inherent mechanism of a remote interface is to obtain a stub or proxy so that client-server communication occurs through this proxy pattern. This involves marshalling and unmarshalling of data, hence the stub is Serializable by default and the EJB does not need to implement the Serializable interface.
A local interface, however, does not involve remote lookups and stub handling. Initializing the EJB is similar to initializing a locally available class, so serializability is not there by default. In this scenario, either the EJB needs to implement the Serializable interface or the field needs to be declared transient to skip serialization.
I am declaring the variable as transient. This might be a WebSphere-specific solution.
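If the reference ever needs to survive serialization of the session in a portable way, one option is to re-acquire the EJB after deserialization; a hedged sketch assuming a local business interface and an illustrative JNDI name:
// inside MyBean (Serializable, custInfo declared transient)
private void readObject(java.io.ObjectInputStream in)
        throws java.io.IOException, ClassNotFoundException {
    in.defaultReadObject();
    try {
        // re-acquire the stateless EJB proxy; the JNDI name here is illustrative only
        custInfo = (IDetails) new javax.naming.InitialContext().lookup("java:module/DetailsImpl");
    } catch (javax.naming.NamingException e) {
        throw new java.io.IOException("Could not re-resolve EJB after deserialization", e);
    }
}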
