I am using JSF 2.0 to build an app.
Depending on whether the user ticks a checkbox, I wish to change the scope of a bean from request to session. Is this possible?
Or can I override the default session scope with request scope when the checkbox is ticked?
I researched this a lot, but I am not even sure whether it is possible.
Thanks.
It's not possible to change the scope of a managed bean during runtime. You can however easily change the behavior of a bean during runtime.
E.g.
@ManagedBean
@RequestScoped
public class MultiScopedBean {

    @ManagedProperty("#{requestScopedBean}")
    private RequestScopedBean requestScopedBean;

    @ManagedProperty("#{sessionScopedBean}")
    private SessionScopedBean sessionScopedBean;

    private boolean sessionScoped; // Bind this to the checkbox.

    // ...

    public Object getSomeProperty() {
        if (sessionScoped) {
            return sessionScopedBean.getSomeProperty();
        } else {
            return requestScopedBean.getSomeProperty();
        }
    }

    public void setSomeProperty(Object someProperty) {
        if (sessionScoped) {
            sessionScopedBean.setSomeProperty(someProperty);
        } else {
            requestScopedBean.setSomeProperty(someProperty);
        }
    }

    // ...
}
Yes, it'll end up in quite some boilerplate, but that's what you get for such an odd requirement.
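For reference, the two delegate beans wired in via @ManagedProperty could be as simple as the following sketch (the property type and bean bodies are assumptions; only the wrapper bean is shown above):

// In its own file (javax.faces.bean annotations, as in the bean above):
@ManagedBean
@RequestScoped
public class RequestScopedBean {

    private Object someProperty; // lives only for the current request

    public Object getSomeProperty() { return someProperty; }
    public void setSomeProperty(Object someProperty) { this.someProperty = someProperty; }
}

// In its own file:
@ManagedBean
@SessionScoped
public class SessionScopedBean implements Serializable {

    private Object someProperty; // survives across requests within the same session

    public Object getSomeProperty() { return someProperty; }
    public void setSomeProperty(Object someProperty) { this.someProperty = someProperty; }
}

The checkbox in the view would then simply be bound to #{multiScopedBean.sessionScoped}.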
Since Jackson's hibernate5-module isn't working for me, I'm trying to implement my own lazy-property filter. I implemented the custom annotation introspection successfully.
But when I apply my custom serializer, @JsonIgnoreProperties is ignored.
@Entity
class Call {

    @OneToMany(mappedBy = "call")
    @JsonIgnoreProperties("call")
    List<CallEvent> events;
}

@Entity
class CallEvent {

    @ManyToOne(fetch = FetchType.LAZY)
    Call call;
}
public class LazyValueIntrospector extends JacksonAnnotationIntrospector {

    @Override
    public Object findSerializer(Annotated a) {
        var yes = a.hasAnnotation(ManyToOne.class)
                || a.hasAnnotation(Basic.class)
                || a.hasAnnotation(OneToMany.class)
                || a.hasAnnotation(OneToOne.class);
        if (yes) {
            return LazyValueSerializer.class;
        }
        return super.findSerializer(a);
    }
}
public class LazyValueSerializer extends JsonSerializer<Object> {

    @Override
    public boolean isEmpty(SerializerProvider provider, Object value) {
        return value == null || !Hibernate.isInitialized(value);
    }

    @Override
    public void serialize(Object value, JsonGenerator gen, SerializerProvider serializers) throws IOException {
        gen.writeObject(value);
    }
}
Explanation:
LazyValueIntrospector.findSerializer detects possible lazy properties.
Hibernate.isInitialized tells me whether the value is initialized.
gen.writeObject(value) writes the value if the property is initialized.
The problem is that the gen.writeObject(value) call ignores the @JsonIgnoreProperties("call") annotation.
The question is:
How do I apply the @JsonIgnoreProperties("call") annotation in my custom serializer?
PS: the spring.jackson.defaultPropertyInclusion=non_empty property is applied globally, which enables the isEmpty check.
Since you are ignoring only one property, try putting @JsonIgnore on the call property inside the CallEvent entity.
Example:
@Entity
class CallEvent {

    @ManyToOne(fetch = FetchType.LAZY)
    @JsonIgnore
    Call call;
}
Without an actual example of where @JsonIgnoreProperties is ignored it is hard to say. But I think the custom serializer needs to delegate to the original serializer rather than replace it; quite a bit more is needed to support the various other features.
To do that you cannot register the serializer the way shown here; instead, replace it using one of the methods in BeanSerializerModifier (and register that modifier with the ObjectMapper) -- that way you get the "real" serializer to delegate to.
Implementations usually also need to implement createContextual() (from ContextualSerializer), which needs to be forwarded to the original ("delegate") serializer.
You may want to have a look at how serializers are implemented in Hibernate module itself.
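A minimal sketch of that wiring, assuming a hypothetical LazyValueDelegatingSerializer that wraps the default serializer (the class names here are illustrative, not from the original code):

import com.fasterxml.jackson.databind.BeanDescription;
import com.fasterxml.jackson.databind.JsonSerializer;
import com.fasterxml.jackson.databind.ObjectMapper;
import com.fasterxml.jackson.databind.SerializationConfig;
import com.fasterxml.jackson.databind.module.SimpleModule;
import com.fasterxml.jackson.databind.ser.BeanSerializerModifier;

public class LazyValueSerializerModifier extends BeanSerializerModifier {

    @Override
    public JsonSerializer<?> modifySerializer(SerializationConfig config,
                                              BeanDescription beanDesc,
                                              JsonSerializer<?> serializer) {
        // Wrap, don't replace: the delegate keeps the bean's configuration,
        // including @JsonIgnoreProperties, views and filters.
        return new LazyValueDelegatingSerializer(serializer); // hypothetical delegating serializer
    }

    // Registering the modifier with the ObjectMapper:
    public static ObjectMapper buildMapper() {
        SimpleModule module = new SimpleModule();
        module.setSerializerModifier(new LazyValueSerializerModifier());
        return new ObjectMapper().registerModule(module);
    }
}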
Since the question is how to apply Jackson's own behavior in a custom bean serializer, I found the answer myself:
Extend BeanSerializerBase, not BeanSerializer.
Override the with* methods, such as withByNameInclusion and withProperties.
This way, Jackson calls the appropriate methods when needed.
@Override
public JsonSerializer<Object> unwrappingSerializer(NameTransformer unwrapper) {
    return new LazyBeanUnwrappingSerializer(this, unwrapper);
}

@Override
protected BeanSerializerBase withProperties(BeanPropertyWriter[] properties, BeanPropertyWriter[] filteredProperties) {
    return new LazyBeanSerializer(this, properties, filteredProperties);
}

@Override
protected BeanSerializerBase withByNameInclusion(Set<String> toIgnore, Set<String> toInclude) {
    return new LazyBeanSerializer(this, toIgnore, toInclude);
}

@Override
public BeanSerializerBase withObjectIdWriter(ObjectIdWriter objectIdWriter) {
    return new LazyBeanSerializer(this, objectIdWriter, _propertyFilterId);
}

@Override
public BeanSerializerBase withFilterId(Object filterId) {
    return new LazyBeanSerializer(this, _objectIdWriter, filterId);
}

@Override
protected BeanSerializerBase asArraySerializer() {
    throw new RuntimeException("Array serializer not supported");
}
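For context, a sketch of the class skeleton those overrides would sit in, assuming Jackson 2.12+ (the constructors simply delegate to BeanSerializerBase's protected copy constructors; the actual lazy-value handling is omitted):

import java.io.IOException;
import java.util.Set;
import com.fasterxml.jackson.core.JsonGenerator;
import com.fasterxml.jackson.databind.SerializerProvider;
import com.fasterxml.jackson.databind.ser.BeanPropertyWriter;
import com.fasterxml.jackson.databind.ser.impl.ObjectIdWriter;
import com.fasterxml.jackson.databind.ser.std.BeanSerializerBase;

public class LazyBeanSerializer extends BeanSerializerBase {

    // Wrap the default serializer Jackson built (e.g. handed over by a BeanSerializerModifier).
    public LazyBeanSerializer(BeanSerializerBase src) {
        super(src);
    }

    protected LazyBeanSerializer(BeanSerializerBase src,
                                 BeanPropertyWriter[] properties,
                                 BeanPropertyWriter[] filteredProperties) {
        super(src, properties, filteredProperties);
    }

    protected LazyBeanSerializer(BeanSerializerBase src, Set<String> toIgnore, Set<String> toInclude) {
        super(src, toIgnore, toInclude);
    }

    protected LazyBeanSerializer(BeanSerializerBase src, ObjectIdWriter objectIdWriter, Object filterId) {
        super(src, objectIdWriter, filterId);
    }

    @Override
    public void serialize(Object bean, JsonGenerator gen, SerializerProvider provider) throws IOException {
        // Delegate to the inherited field-by-field serialization, which honours
        // @JsonIgnoreProperties via the filtered property writers.
        gen.writeStartObject(bean);
        serializeFields(bean, gen, provider);
        gen.writeEndObject();
    }

    // ... plus the with* overrides shown above ...
}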
I'm having a problem where the related table id fields return 'null' from my domain objects when using inheritance. Here is an example:
In /src/groovy/
BaseClass1.groovy
class BaseClass1 {
Long id
static mapping = {
tablePerConcreteClass true
}
}
BaseClass2.groovy
class BaseClass2 extends BaseClass1 {
String someOtherProperty
static constraints = {
someOtherProperty(maxSize:200)
}
static mapping = BaseClass1.mapping
}
In /grails-app/domain
ParentClass.groovy
class ParentClass extends BaseClass2 {
ChildClass myChild
static mapping = BaseClass2.mapping << {
version false
}
}
ChildClass.groovy
class ChildClass extends BaseClass1 {
String property
static mapping = BaseClass1.mapping
}
The problem appears here:
SomeotherCode.groovy
print parentClassInstance.myChild.id // returns the value
print parentClassInstance.myChildId // returns null
Any ideas what might be going on to get those dynamic properties to break like this?
After debugging into the get(AssociationName)Id source, I found the following:
The handler for this is:
GrailsDomainConfigurationUtil.getAssociationIdentifier(Object target, String propertyName,
GrailsDomainClass referencedDomainClass) {
String getterName = GrailsClassUtils.getGetterName(propertyName);
try {
Method m = target.getClass().getMethod(getterName, EMPTY_CLASS_ARRAY);
Object value = m.invoke(target);
if (value != null && referencedDomainClass != null) {
String identifierGetter = GrailsClassUtils.getGetterName(referencedDomainClass.getIdentifier().getName());
m = value.getClass().getDeclaredMethod(identifierGetter, EMPTY_CLASS_ARRAY);
return (Serializable)m.invoke(value);
}
}
catch (NoSuchMethodException e) {
// ignore
}
catch (IllegalAccessException e) {
// ignore
}
catch (InvocationTargetException e) {
// ignore
}
return null;
}
It threw an exception on the related class (value.getClass().getDeclaredMethod), saying NoSuchMethod for the method getId(). I was unable to remove the id declaration from the base class without Grails complaining that an identifier column was required. I tried marking id as public and it also complained that it wasn't there. So, I tried this
class BaseClass {
    Long id
    public Long getId() { return this.@id }
}
and things worked on some classes, but not on others.
When I removed the id declaration, I got an error: "Identity property not found, but required in domain class". On a whim, I tried adding @Entity to the concrete classes and voilà! Everything started working.
class BaseClass {
    // Don't declare id!
}

@Entity
class ParentClass {}

@Entity
class ChildClass {}
I still think it is a Grails bug that this needs to be added, but at least it is easy enough to work around.
I'm not sure why you are seeing this behavior, but I'm also not sure why you are doing some of the things you are doing here. Why have a domain class extend a POGO? Domains, Controllers, and Services are heavily managed by the Grails machinery, which probably was not designed for this sort of use. Specifically, I believe Grails builds the dynamic property getters for the GrailsDomainProperty(s) of GrailsDomainClass(es), not POGOs. In this case, you have an explicitly declared id field in BaseClass1 that is not a GrailsDomainProperty. I suspect that this POGO id property is not picked up by the Grails machinery that creates the dynamic property getters for Domains.
You might try putting BaseClass1/2 in /grails-app/domain, perhaps making them abstract if you don't want them instantiated, then extending them as you are and seeing if you observe the behavior you want.
I'm currently migrating away from Seam's @Factory annotation. Combined with @Observer, I could do this:
@Factory(value = "optionsList", scope = ScopeType.APPLICATION)
@Observer("entity.modified")
public List<MyBean> produceEntityOptions() {
    List l = getEm().createQuery("select e from entity e").getResultList();
    Contexts.getApplicationContext().set("optionsList", l);
    return l;
}
This caches a list of possible options for use in e.g. <f:selectItems> (the actual computation can be more complicated).
I've translated this for use with CDI to
@Produces @Named("optionsList") @ApplicationScoped
public List<MyBean> produceEntityOptions() {
    return getEm().createQuery("select e from entity e").getResultList();
}
but this loses the ability to recreate the cache (only) when an external event signals the cache has gone stale. How can I get that back?
Here's what you could do:
@ApplicationScoped
public class MyListProducer {

    // the current list
    private List<MyBean> listOfBeans;

    // resets / reloads / refreshes the list
    private void loadList() {
        this.listOfBeans = getEm().createQuery("select e from entity e").getResultList();
    }

    // initialize the list
    @PostConstruct
    protected void postConstruct() {
        loadList();
    }

    // listen for the stale event - you'll have to create a type (maybe even qualifiers) yourself
    private void resetList(@Observes MyCustomListIsStaleEvent evt) {
        loadList();
    }

    // the producer - to ensure that the producer is called after you refresh the list,
    // make the produced list of scope @Dependent instead of @ApplicationScoped
    @Produces @Named("optionsList")
    protected List<MyBean> getList() {
        return this.listOfBeans;
    }
}
I think that, in effect, this is what you want. But I don't exclude the possibility that there might be differences - I don't know Seam very well.
Side note: You should think about synchronizing the observer and the producer methods, either with plain old synchronization or by making the above a stateful session bean and taking advantage of EJB synchronization mechanisms.
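To make the refresh trigger concrete, here is a minimal sketch of the stale event and the code that fires it (the event type name comes from the observer above; MyBeanService and its method are hypothetical - any bean that changes the entities can inject and fire the event):

import javax.enterprise.event.Event;
import javax.inject.Inject;

// In its own file: the event can be a plain class; add qualifiers if you need several caches.
public class MyCustomListIsStaleEvent {
}

// In its own file: hypothetical service that modifies entities and signals the stale cache.
public class MyBeanService {

    @Inject
    private Event<MyCustomListIsStaleEvent> listStaleEvent;

    public void saveMyBean(MyBean bean) {
        // ... persist the change ...
        listStaleEvent.fire(new MyCustomListIsStaleEvent()); // causes resetList() above to reload the list
    }
}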
I have 2 forms on a page, that I'd like to validate separately.
I have the following:
public function executeNew(sfWebRequest $request)
{
$this->propertyForm = new AdminNewPropertyForm();
$this->propertyCsvForm = new AdminNewPropertyImportForm();
$this->processForm($request, $this->propertyForm, $this->propertyCsvForm);
}
protected function processForm(sfWebRequest $request, sfForm $propertyForm, sfForm $propertyCsvForm)
{
if($request->hasParameter('property'))
{
if($request->isMethod('post'))
{
$propertyForm->bind($request->getParameter($propertyForm->getName()));
if($propertyForm->isValid())
{
$propertyForm->save();
$this->getUser()->setFlash('success', 'The property was successfully updated.');
} else {
$this->getUser()->setFlash('error', 'The property could not be saved.');
}
}
}
else {
if($request->isMethod('post'))
{
$propertyCsvForm->bind($request->getParameter($propertyCsvForm->getName()));
if($propertyCsvForm->isValid())
{
$propertyCsvForm->save();
}
}
}
}
I am then displaying both forms in the view.
The problem is, I'm getting an error when passing the forms in processForm()
Strict standards: Declaration of propertyActions::processForm() should be compatible with that of autoPropertyActions::processForm()
Am I passing the forms correctly?
Thanks
As the error message says, you are obviously not doing it correctly ;)
Since your propertyActions class extends the abstract class autoPropertyActions, there are strict standards for implementing the functions declared in the abstract class. That's why it's complaining that you have made some unexpected alterations.
In fact - do you really have to use the processForm function? After all, you are calling this function yourself, so you can name it whatever you like and the class won't complain (as the original processForm will stay intact).
I'm working on an ASP.NET MVC project that supports external plugins. I'm now moving from Unity to Autofac, and I need to wrap Autofac's lifetime objects so the plugins won't have to reference it. In Unity I could do something like this.
public sealed class UnityScopeFactory : IDependencyScopeFactory
{
private HttpRequestScope _httpRequest;
private SingletonScope _singleton;
private TransientScope _transient;
public IDependencyScope HttpRequest()
{
return _httpRequest ?? (_httpRequest = new HttpRequestScope());
}
public IDependencyScope Singleton()
{
return _singleton ?? (_singleton = new SingletonScope());
}
public IDependencyScope Transient()
{
return _transient ?? (_transient = new TransientScope());
}
private class HttpRequestScope : IDependencyScope
{
public object CreateScope()
{
return new HttpPerRequestLifetimeManager();
}
}
private class SingletonScope : IDependencyScope
{
public object CreateScope()
{
return new ContainerControlledLifetimeManager();
}
}
private class TransientScope : IDependencyScope
{
public object CreateScope()
{
return new TransientLifetimeManager();
}
}
}
I made a similar thing in Autofac, but I'm not sure whether it's the correct way to do it. I looked into Autofac's RegistrationBuilder, which is (unfortunately) internal, and I came up with this.
public class AutofacScopeFactory : IDependencyScopeFactory
{
private HttpRequestScope _httpRequest;
private SingletonScope _singleton;
private TransientScope _transient;
public IDependencyScope HttpRequest()
{
return _httpRequest ?? (_httpRequest = new HttpRequestScope());
}
public IDependencyScope Singleton()
{
return _singleton ?? (_singleton = new SingletonScope());
}
public IDependencyScope Transient()
{
return _transient ?? (_transient = new TransientScope());
}
private class HttpRequestScope : IDependencyScope
{
public object CreateScope()
{
return new CurrentScopeLifetime();
}
}
private class SingletonScope : IDependencyScope
{
public object CreateScope()
{
return new RootScopeLifetime();
}
}
private class TransientScope : IDependencyScope
{
public object CreateScope()
{
return new CurrentScopeLifetime();
}
}
}
Also, after I got this to work, how can I pass it to the ContainerBuilder?
In Unity I could do something like this.
public sealed class UnityDependencyContainer : IDependencyContainer
{
private readonly IUnityContainer _container;
public UnityDependencyContainer()
{
_container = new UnityContainer();
}
public void Register<TContract, TImplementation>(IDependencyScope scope) where TImplementation : TContract
{
LifetimeManager manager = scope.CreateScope() as LifetimeManager;
if (manager != null)
{
_container.RegisterType<TContract, TImplementation>(manager);
}
}
}
How do I pass an instance of IComponentLifetime to the method chain? Is it a dead end?
public class AutofacContainer : IDependencyContainer
{
private static readonly ContainerBuilder Builder;
static AutofacContainer()
{
Builder = new ContainerBuilder();
}
public void RegisterType<TContract, TImplementation>(IDependencyScope scope) where TImplementation : TContract
{
IComponentLifetime manager = scope.CreateScope() as IComponentLifetime;
if (manager != null)
{
Builder.RegisterType<TImplementation>().As<TContract>();
}
}
}
Autofac doesn't separate scopes quite the way you have it outlined, so you might be trying to fit a square peg in a round hole.
Autofac scopes are more hierarchical. Any lifetime scope can spawn a child transient scope. For example, you might see...
Container/root lifetime
HttpRequest scope
Small task-specific transient scope
You can "tag" a scope and register components to a specific named/tagged scope - that's how the HttpRequest scope works. It gets "tagged" with a special identifier.
When you resolve objects is when it determines which lifetime scope owns it. Resolving happens from the most-nested scope. In the above hierarchy, you resolve items from the small task-specific transient scope whether they're singletons, request scoped, or whatever. When the singleton gets resolved, it will search up the lifetime scope stack and automatically assign "ownership" of the object to the root lifetime scope. When a per-request item gets resolved, it searches up the stack for the lifetime scope with the special "HTTP request" identifier and assigns ownership there. Factory-scoped items are resolved in the current lifetime scope.
Note: That discussion is a gross oversimplification of how it works. There is documentation explaining the lifetime scope mechanism on the Autofac site.
Point being, I see some things in the above design that don't really "jive" with the way Autofac does stuff.
The DependencyScopeFactory can't create its own transient or HttpRequest scopes. There are specific lifetime management components that start and end the HttpRequest scope, so you'd need to use those; there is no 'global' transient scope, so you can't really just create one.
HttpRequest scope, assuming you're using MVC, would look more like...
public ILifetimeScope HttpRequestScope
{
get { return AutofacDependencyResolver.Current.RequestLifetime; }
}
There's no analog for a transient scope because usage on that is supposed to be inline:
using(var transientScope = parentScope.BeginLifetimeScope())
{
// Do stuff and resolve dependencies using the transient scope.
// The IDisposable pattern here is important so transient
// dependencies will be properly disposed at the end of the scope.
}
When you register components, you don't register them "into a lifetime scope." You actually register them into a component registry and part of the component registration includes the ownership information about the lifetime of the component once it's resolved.
var builder = new ContainerBuilder();
// This component is factory-scoped and will be "owned" by whatever
// lifetime scope resolves it. You can resolve multiple of these
// in a single scope:
builder.RegisterType<FirstComponent>().As<ISomeInterface>();
// This component is a singleton inside any given lifetime scope,
// but if you have a hierarchy of scopes, you'll get one in each
// level of the hierarchy.
builder.RegisterType<SecondComponent>().InstancePerLifetimeScope();
// This component will be a singleton inside a specifically named
// lifetime scope. If you try to resolve it in a scope without that
// name, it'll search up the scope stack until it finds the scope
// with the right name. If no matching scope is found - exception.
builder.RegisterType<ThirdComponent>().InstancePerMatchingLifetimeScope("scopename");
// This is a per-HTTP-request component. It's just like the
// above InstancePerMatchingLifetimeScope, but it has a special
// tag that the web integration knows about.
builder.RegisterType<FourthComponent>().InstancePerHttpRequest();
If you're trying to make a container/registration agnostic interface, it wouldn't need a "lifetime scope manager" - instead, you'd need to pass some parameters indicating the intended lifetime scope and do the appropriate registration syntax (above) based on the incoming parameters.
Again, I'd recommend you check out that documentation.
Also, if you're using Unity, Autofac does have an Enterprise Library Configurator package that allows you to configure Autofac in a Unity style (since that's how EntLib likes to do things). That might be something to check out.
If you don't need to use Unity syntax at all... I'd recommend just moving to do things the native Autofac way. Trying to make one container look and act like another is a pretty painful endeavor.
Assuming your plugins are in separate assemblies or whatever, you could easily take advantage of some of the nice assembly-scanning syntax along with Autofac modules and hook up your plugins that way.