I'm currently migrating away from Seam's @Factory annotation. Combined with @Observer, I could do this:
@Factory(value = "optionsList", scope = ScopeType.APPLICATION)
@Observer("entity.modified")
public List<MyBean> produceEntityOptions() {
List<MyBean> l = getEm().createQuery("select e from entity e").getResultList();
Contexts.getApplicationContext().set("optionsList", l);
return l;
}
This would cache a list of possible options for use in e.g. <f:selectItems> (the actual computation can be more complicated).
I've translated this for use with CDI to
@Produces @Named("optionsList") @ApplicationScoped
public List<MyBean> produceEntityOptions() {
return getEm().createQuery("select e from entity e").getResultList();
}
but this loses the ability to recreate the cache (only) when an external event signals the cache has gone stale. How can I get that back?
Here's what you could do:
@ApplicationScoped
public class MyListProducer {
// the current list
private List<MyBean> listOfBeans;
// resets / reloads/ refreshes list
private void loadList() {
this.listOfBeans = getEm().createQuery("select e from entity e").getResultList();
}
// initialize the list
@PostConstruct
protected void postConstruct() {
loadList();
}
// listen for the stale event - you'll have to create a type (maybe even qualifiers) yourself
private void resetList(@Observes MyCustomListIsStaleEvent evt) {
loadList();
}
// the producer - to ensure that the producer is called after you refresh the list, make the list of scope @Dependent instead of @ApplicationScoped
@Produces @Named("optionsList")
protected List<MyBean> getList() {
return this.listOfBeans;
}
}
I think this is, in effect, what you want, but I can't rule out that there might be differences - I don't know Seam very well.
Side note: You should think about synchronizing the observer and the producer methods, either with plain old synchronization or by making the above a stateful session bean and taking advantage of EJB synchronization mechanisms.
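As a sketch of the missing piece: the stale event can be any plain class you define, and whichever code modifies the entities fires it through CDI's built-in javax.enterprise.event.Event. The event type and the EntityWriter bean below are made-up names for illustration:
import javax.enterprise.event.Event;
import javax.inject.Inject;
// Hypothetical marker event (own file); add qualifiers or payload fields if you need them.
public class MyCustomListIsStaleEvent {
}
// Any bean that changes the underlying entities can fire the event.
public class EntityWriter {
    @Inject
    private Event<MyCustomListIsStaleEvent> listIsStale;
    public void update(MyBean bean) {
        // ... persist the change via the EntityManager ...
        listIsStale.fire(new MyCustomListIsStaleEvent());
    }
}
Firing the event triggers the resetList observer above, which reloads the cached list.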
I'm writing a REST controller that exposes CRUD operations based on the type of OAuth2 services beans that are found, something like this:
@Bean
@ConditionalOnBean(ClientDetailsService::class)
fun clientServiceController(
clientDetailsService: ClientDetailsService
): ClientDetailsServiceController {
return ClientDetailsServiceController(clientDetailsService)
}
@Bean
@ConditionalOnBean(ClientRegistrationService::class)
fun clientRegistrationServiceController(
clientRegistrationService: ClientRegistrationService
): ClientRegistrationServiceController {
return ClientRegistrationServiceController(clientRegistrationService)
}
I want to register a controller that exposes only ClientDetailsService if we do not have a ClientRegistrationService; if one does exist, I want to additionally register a controller for the methods in that interface.
One of our modules that registers these controllers also registers a JdbcClientDetailsService bean, which implements both interfaces. Yet the @ConditionalOnBean(ClientRegistrationService::class) fails to match it, so only the first bean is created but not the second.
This is an example of how we declare the JdbcClientDetailsService:
@Bean
fun jdbcClientDetailsService(
passwordEncoder: PasswordEncoder,
dataSource: DataSource): JdbcClientDetailsService {
return JdbcClientDetailsService(dataSource).apply { setPasswordEncoder(passwordEncoder) }
}
The odd thing is that @Autowired ClientRegistrationService does successfully inject JdbcClientDetailsService.
What am I missing? How can I declare a bean that implements both interfaces and match correctly against the conditionals? Is there a workaround?
I managed to get around this with the following:
@Bean
@Lazy
@Scope(proxyMode = ScopedProxyMode.INTERFACES)
public ClientRegistrationService registrationDetailsService(ClientDetailsServiceConfigurer configurer)
throws Exception {
ClientDetailsService built = configurer.and().build();
if (built instanceof ClientRegistrationService) {
return (ClientRegistrationService) built;
} else {
throw new IllegalStateException(built + " is not instanceof " + ClientRegistrationService.class);
}
}
It applies the same pattern as ClientDetailsServiceConfiguration and relies on the same configurer.
You might get rid of @Scope(proxyMode = ScopedProxyMode.INTERFACES) if you want to retrieve an actual JdbcClientDetailsService.
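With a bean whose declared type is ClientRegistrationService in place, the conditional controller registration from the question should be able to match it. A Java sketch mirroring the Kotlin declaration (names as in the question):
@Bean
@ConditionalOnBean(ClientRegistrationService.class)
public ClientRegistrationServiceController clientRegistrationServiceController(
        ClientRegistrationService clientRegistrationService) {
    return new ClientRegistrationServiceController(clientRegistrationService);
}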
Domain objects shouldn't have any dependencies, hence no dependency injection either. However, when dispatching domain events from within domain objects, I'll likely want to use a centralised EventDispatcher. How could I get hold of one?
I do not want to return a list of events to the caller, as I'd like them to remain opaque and guarantee their dispatch. Those events should only be consumed by other domain objects and services that need to enforce an eventual consistent constraint.
See Udi Dahan's domain events
Basically, you register one or more handlers for your domain events, then raise an event like this:
public class Customer
{
public void DoSomething()
{
DomainEvents.Raise(new CustomerBecamePreferred() { Customer = this });
}
}
And all the registered handlers will be executed:
public void DoSomethingShouldMakeCustomerPreferred()
{
var c = new Customer();
Customer preferred = null;
DomainEvents.Register<CustomerBecamePreferred>(p => preferred = p.Customer);
c.DoSomething();
Assert(preferred == c && c.IsPreferred);
}
This basically implements the Hollywood Principle ("Don't call us, we'll call you"): you don't call the event handler directly; instead, the registered handler(s) are executed when the event is raised.
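The snippets above assume a static DomainEvents registry with Register and Raise methods. As a rough illustration of that idea, here is a minimal Java sketch (not any particular library's API, and not thread-safe):
import java.util.ArrayList;
import java.util.List;
import java.util.function.Consumer;
public final class DomainEvents {
    private static final List<Consumer<Object>> handlers = new ArrayList<>();
    // Register a handler for one event type.
    public static <T> void register(Class<T> eventType, Consumer<T> handler) {
        handlers.add(event -> {
            if (eventType.isInstance(event)) {
                handler.accept(eventType.cast(event));
            }
        });
    }
    // Raise an event: every handler registered for its type runs immediately.
    public static void raise(Object event) {
        for (Consumer<Object> handler : handlers) {
            handler.accept(event);
        }
    }
}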
I'll likely want to use a centralised EventDispatcher. How could I get hold of one?
Pass it in as an argument.
It probably won't look like an EventDispatcher, but instead like some Domain Service that describes the required capability in domain specific terms. When composing the application, you choose which implementation of the service to use.
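For illustration, the capability might be expressed as a small domain interface and handed to the aggregate method that needs it (CustomerNotifications is a made-up name for this sketch):
// Hypothetical domain service describing the capability in domain terms (own file).
public interface CustomerNotifications {
    void customerBecamePreferred(Customer customer);
}
public class Customer {
    private boolean preferred;
    public boolean isPreferred() {
        return preferred;
    }
    // The caller (application service, command handler, ...) supplies the implementation;
    // the aggregate itself holds no reference to any dispatcher.
    public void doSomething(CustomerNotifications notifications) {
        preferred = true;
        notifications.customerBecamePreferred(this);
    }
}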
You are asking to have it both ways. You either need to inject the dependency, or invert control and let another object manage the interaction between the Aggregate and the EventDispatcher. I recommend keeping your Aggregates as simple as possible so that they are free of dependencies and remain testable as well.
The following code sample is very simple and would not be what you put into production, but illustrates how to design Aggregates free of dependencies without passing around a list of events outside of a context that needs them.
If your Aggregate has a list of events within it:
class MyAggregate
{
private List<IEvent> events = new List<IEvent>();
// ... Constructor and event sourcing?
public IEnumerable<IEvent> Events => events;
public string Name { get; private set; }
public void ChangeName(string name)
{
if (Name != name)
{
events.Add(new NameChanged(name));
}
}
}
Then you might have a handler that looks like:
public class MyHandler
{
private Repository repository;
// ... Constructor and dependency injection
public void Handle(object id, ChangeName cmd)
{
var agg = repository.Load(id);
agg.ChangeName(cmd.Name);
repository.Save(agg);
}
}
And a repository that looks like:
class Repository
{
private EventDispatcher dispatcher;
// ... Constructor and dependency injection
public void Save(MyAggregate agg)
{
foreach (var e in agg.Events)
{
dispatcher.Dispatch(e);
}
}
}
This line in TopLevelTransaction (neo4j-kernel-2.1.2) throws a NullPointerException every time I call next() on an iterator obtained via GraphRepository#findAll():
protected void markAsRollbackOnly()
{
try
{
transactionManager.getTransaction().setRollbackOnly(); // NPE here
}
catch ( Exception e )
{
throw new TransactionFailureException(
"Failed to mark transaction as rollback only.", e );
}
}
I found some threads about similar crashes with slightly different stack traces. The accepted solution on this question is to use "proxy" transaction management, but that seems like a band-aid solution. This question also mentions "proxy" transaction management and suggests that there might be something wrong with the #Transactional annotation when using AspectJ.
Is this legitimately a bug, or have I just set up my project incorrectly? My code is essentially the same as in my standalone hello world, with a slightly more complex main class:
@Component
public class Test2 {
@Autowired
FooRepository repo;
public static void main(String[] args) {
AbstractApplicationContext context = new AnnotationConfigApplicationContext("test2");
Test2 test2 = context.getBean(Test2.class);
test2.doStuff();
}
public void doStuff() {
createFoo();
printFoos();
}
@Transactional
public Foo createFoo() {
Foo foo = new Foo();
foo.setName("Derp" + System.currentTimeMillis());
repo.save(foo);
System.out.println("saved " + foo.toString());
return foo;
}
@Transactional
public void printFoos() {
Iterable<Foo> foos = repo.findAll();
System.out.println("findAll() returned instance of " + foos.getClass().getName());
Iterator<Foo> iter = foos.iterator();
System.out.println("iterator is instance of " + iter.getClass().getName());
if(iter.hasNext()) {
iter.next(); // CRASHES HERE
}
}
}
I can post my POM if needed.
I didn't find a bug. Two or three things are required to make this work, depending on whether you want to use proxy or AspectJ transaction management.
First, transaction management must be enabled. Since I'm using annotation-based configuration, I did this by annotating my @Configuration class with @EnableTransactionManagement. Contrary to the docs, the default mode now seems to be AdviceMode.ASPECTJ, not AdviceMode.PROXY.
Next, you need to ensure that the Iterator is used within a transaction. In my example, if I use AdviceMode.PROXY the entire bean containing the @Autowired repository has to be annotated @Transactional. If I use AdviceMode.ASPECTJ I can annotate just the method. This is because the call to the method using the iterator is a self-call from within the bean, and proxy transaction management cannot intercept and manage internal calls.
Finally, if you're using AdviceMode.ASPECTJ you must set up weaving as discussed here.
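For reference, a proxy-mode sketch of what that looks like for the Test2 class from the question (assuming the beans live in the test2 package; the AspectJ variant instead keeps the method-level @Transactional and needs weaving set up):
import org.springframework.beans.factory.annotation.Autowired;
import org.springframework.context.annotation.AdviceMode;
import org.springframework.context.annotation.ComponentScan;
import org.springframework.context.annotation.Configuration;
import org.springframework.stereotype.Component;
import org.springframework.transaction.annotation.EnableTransactionManagement;
import org.springframework.transaction.annotation.Transactional;
@Configuration
@EnableTransactionManagement(mode = AdviceMode.PROXY)
@ComponentScan("test2")
public class AppConfig {
}
// Separate file. Class-level @Transactional: doStuff() is entered through the proxy,
// so findAll() and the iteration both run inside one proxy-managed transaction.
@Component
@Transactional
public class Test2 {
    @Autowired
    FooRepository repo;
    public void doStuff() {
        createFoo();
        printFoos();
    }
    // createFoo() and printFoos() unchanged from the question
}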
The code below used to work under the JAXB implementation used by JDK 1.7, but now under JDK 1.8 it's broken. In the code below you will find the key change that seems to make it work in 1.8. The "fix" under 1.8 is not really a fix because it's bad practice to expose internal collections for direct modification by the outside world. I want to control access to the internal list through my class and I don't want to complicate things by making observable collections and listening to them. This is not acceptable.
Is there any way to get my original code to work under the JAXB of JDK 1.8?
@XmlElementWrapper(name = "Wrap")
@XmlElement(name = "Item", required = true)
public synchronized void setList(List<CustomObject> values) {
list.clear();
list.addAll(values);
}
public synchronized List<CustomObject> getList() {
// return new ArrayList(list); // this was the original code that worked under 1.7
return list; //this is the only thing that works under 1.8
}
After more analysis, the problem seems to be coming from JAXB not calling the setter method for collections anymore (it used to under JDK 1.7). Now under JDK 1.8, it calls the getter and modifies the collection directly. This poses several problems:
1. It forces the user to expose an internal collection to the outside world for free modification (bad practice).
2. It doesn't allow the user to run any custom code when the list changes (such as what you could do if the setter was called). It might be possible to make an observable collection and listen to it, but this is a much more complicated workaround than just calling the setter method.
Background
When a collection property is mapped in JAXB, it first checks the getter to see if the collection property has been pre-initialized. In the example below I want to have my property exposed as List<String>, but have the backing implementation be a pre-initialized LinkedList.
private List<String> foos = new LinkedList<String>();
@XmlElement(name="foo")
public List<String> getFoos() {
return foos;
}
Why Your Code Used to Work
If you previously had JAXB call the setter on a property mapped to a collection that returned a non-null response from the getter, then there was a bug in that JAXB implementation. Your code should not have worked in the previous version either.
How to Get the Setter Called
To have the setter called, you just need to have your getter return null on a new instance of the object. Your code could look something like:
import java.util.*;
import javax.xml.bind.annotation.*;
@XmlRootElement(name = "Foo")
public class Foo {
private List<CustomObject> list = null;
@XmlElementWrapper(name = "Wrap")
@XmlElement(name = "Item", required = true)
public synchronized void setList(List<CustomObject> values) {
if (null == list) {
list = new ArrayList<CustomObject>();
} else {
list.clear();
}
list.addAll(values);
}
public synchronized List<CustomObject> getList() {
if (null == list) {
return null;
}
return new ArrayList<CustomObject>(list);
}
}
UPDATE
If you don't need to perform any logic on the List returned from JAXB's unmarshalling then using field access may be an acceptable solution.
@XmlRootElement(name = "Foo")
@XmlAccessorType(XmlAccessType.FIELD)
public class Foo {
@XmlElementWrapper(name = "Wrap")
@XmlElement(name = "Item", required = true)
private List<CustomObject> list = null;
public synchronized void setList(List<CustomObject> values) {
if(null == list) {
list = new ArrayList<CustomObject>();
} else {
list.clear();
}
list.addAll(values);
}
public synchronized List<CustomObject> getList() {
return new ArrayList<CustomObject>(list);
}
}
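To see which accessor JAXB actually invokes, you can unmarshal a small document and watch the setter (or a print statement inside it). A minimal sketch, assuming CustomObject is a mappable class with a no-arg constructor and the Wrap/Item layout matching the annotations above:
import java.io.StringReader;
import javax.xml.bind.JAXBContext;
import javax.xml.bind.Unmarshaller;
public class Demo {
    public static void main(String[] args) throws Exception {
        String xml = "<Foo><Wrap><Item/></Wrap></Foo>";
        JAXBContext context = JAXBContext.newInstance(Foo.class);
        Unmarshaller unmarshaller = context.createUnmarshaller();
        Foo foo = (Foo) unmarshaller.unmarshal(new StringReader(xml));
        // With the property-access version above, the setter is called once after
        // unmarshalling; with the field-access version, the field is populated directly.
        System.out.println("items: " + foo.getList().size());
    }
}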