public class MyApplication extends Application<MyConfiguration> {
final static Logger LOG = Logger.getLogger(MyApplication.class);
public static void main(final String[] args) throws Exception {
new MyApplication().run(args);
}
@Override
public String getName() {
return "PFed";
}
@Override
public void initialize(final Bootstrap<MyConfiguration> bootstrap) {
// TODO: application initialization
bootstrap.addBundle(new DBIExceptionsBundle());
}
@Override
public void run(final MyConfiguration configuration,
final Environment environment) {
// TODO: implement application
final DBIFactory factory = new DBIFactory();
final DBI jdbi = factory.build(environment, configuration.getDataSourceFactory(), "postgresql");
UserDAO userDAO = jdbi.onDemand(UserDAO.class);
userDAO.findNameById(1);
UserResource userResource = new UserResource(new UserService(userDAO));
environment.jersey().register(userResource);
}
}
I get the following error at findNameById:
java.lang.NoSuchMethodError: java.lang.Object.findNameById(I)Ljava/lang/String;
at org.skife.jdbi.v2.sqlobject.CloseInternalDoNotUseThisClass$$EnhancerByCGLIB$$a0e63670.CGLIB$findNameById$5()
public interface UserDAO {
@SqlQuery("select userId from user where id = :email")
User isEmailAndUsernameUnique(@Bind("email") String email);
@SqlQuery("select name from something where id = :id")
String findNameById(@Bind("id") int id);
}
I'm using Spring Boot and RabbitMQ to receive messages.
The first consumer I created works, declared as below:
@Component
public class UserConsumer {
@Autowired
private RabbitTemplate template;
@RabbitListener(queues = MessagingConfig.CONSUME_QUEUE)
public void consumeMessageFromQueue(MassTransitRequest userRequest) {
...
}
}
I then needed a second consumer, so I duplicated the above and gave it another name:
@Component
public class PackConsumer {
@Autowired
private RabbitTemplate template;
@RabbitListener(queues = MessagingConfig.CONSUME_QUEUE_CREATE_PACK)
public void consumeMessageFromQueue(MassTransitRequest fileRequest) {
...
}
}
Everything works locally on my machine; however, when I deploy it, the new queue does not process messages because there is no consumer connected to it. The UserConsumer continues to work.
Is there something else I should be doing in order to connect to the new queue at the same time as the original?
During my learning I did add a "MessagingConfig" class, as below, however I believe it relates to sending messages rather than receiving them, or is an alternative configuration:
@Configuration
public class MessagingConfig {
public static final String CONSUME_QUEUE = "merge-document-request";
public static final String CONSUME_EXCHANGE = "merge-document-request";
public static final String CONSUME_ROUTING_KEY = "";
public static final String PUBLISH_QUEUE = "merge-document-response";
public static final String PUBLISH_EXCHANGE = "merge-document-response";
public static final String PUBLISH_ROUTING_KEY = "";
public static final String CONSUME_QUEUE_CREATE_PACK = "create-pack-request";
public static final String CONSUME_EXCHANGE_CREATE_PACK = "create-pack-request";
public static final String CONSUME_ROUTING_KEY_CREATE_PACK = "";
public static final String PUBLISH_QUEUE_CREATE_PACK = "create-pack-response";
public static final String PUBLISH_EXCHANGE_CREATE_PACK = "create-pack-response";
public static final String PUBLISH_ROUTING_KEY_CREATE_PACK = "";
@Bean
public Queue queue() {
return new Queue(CONSUME_QUEUE);
}
@Bean
public TopicExchange exchange() {
return new TopicExchange(CONSUME_EXCHANGE);
}
@Bean
public Binding binding(Queue queue, TopicExchange exchange) {
return BindingBuilder.bind(queue).to(exchange).with(CONSUME_ROUTING_KEY);
}
@Bean
public MessageConverter converter() {
return new Jackson2JsonMessageConverter();
}
@Bean
public AmqpTemplate template(ConnectionFactory connectionFactory) {
final RabbitTemplate rabbitTemplate = new RabbitTemplate(connectionFactory);
rabbitTemplate.setMessageConverter(converter());
return rabbitTemplate;
}
}
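For reference, a minimal sketch of what declaring the second queue with the same pattern might look like (the CreatePack-suffixed bean names are illustrative, chosen to match the parameter names so Spring can disambiguate the duplicate Queue and TopicExchange bean types):
@Bean
public Queue queueCreatePack() {
// the queue must be declared (or already exist on the broker) for a listener to attach to it
return new Queue(CONSUME_QUEUE_CREATE_PACK);
}
@Bean
public TopicExchange exchangeCreatePack() {
return new TopicExchange(CONSUME_EXCHANGE_CREATE_PACK);
}
@Bean
public Binding bindingCreatePack(Queue queueCreatePack, TopicExchange exchangeCreatePack) {
return BindingBuilder.bind(queueCreatePack).to(exchangeCreatePack).with(CONSUME_ROUTING_KEY_CREATE_PACK);
}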
Thanks in advance
I am looking for some integration test examples for RabbitListenerConfigurer and RabbitListenerEndpointRegistrar: calling a method annotated with @RabbitListener, testing the message conversion, and passing additional parameters such as the Channel and message properties.
Something like this:
@RunWith(SpringJUnit4ClassRunner.class)
public class RabbitListenerConfigureIntegrationTests {
public final String sampleMessage="{\"ORCH_KEY\":{\"inputMap\":{},\"outputMap\":{\"activityId\":\"10001002\",\"activityStatus\":\"SUCCESS\"}}}";
@Test
public void testRabbitListenerConfigurer() throws Exception {
AnnotationConfigApplicationContext ctx = new AnnotationConfigApplicationContext(
EnableRabbitConfigWithCustomConversion.class);
RabbitListenerConfigurer registrar = ctx.getBean(RabbitListenerConfigurer.class);
/* I want to get the Listener instance here */
Message message = MessageBuilder.withBody(sampleMessage.getBytes())
.andProperties(MessagePropertiesBuilder.newInstance()
.setContentType("application/json")
.build())
.build();
/* call listener.onMessage(message); that in turn passes the call to the @RabbitListener method, and by then the registered MessageHandler should kick in and convert the message */
}
@Configuration
@EnableRabbit
public static class EnableRabbitConfigWithCustomConversion implements RabbitListenerConfigurer {
@Override
public void configureRabbitListeners(RabbitListenerEndpointRegistrar registrar) {
registrar.setMessageHandlerMethodFactory(messageHandlerMethodFactory());
}
@Bean
public ConnectionFactory mockConnectionFactory() {
return mock(ConnectionFactory.class);
}
@Bean
public SimpleRabbitListenerContainerFactory rabbitListenerContainerFactory() {
SimpleRabbitListenerContainerFactory factory = new SimpleRabbitListenerContainerFactory();
factory.setConnectionFactory(mockConnectionFactory());
factory.setAutoStartup(false);
return factory;
}
@Bean
MessageHandlerMethodFactory messageHandlerMethodFactory() {
DefaultMessageHandlerMethodFactory messageHandlerMethodFactory = new DefaultMessageHandlerMethodFactory();
messageHandlerMethodFactory.setMessageConverter(consumerJackson2MessageConverter());
return messageHandlerMethodFactory;
}
@Bean
public MappingJackson2MessageConverter consumerJackson2MessageConverter() {
return new MappingJackson2MessageConverter();
}
@Bean
public Listener messageListener1() {
return new Listener();
}
}
public class Listener {
@RabbitListener(queues = "QUEUE")
public void listen(ExchangeDTO dto, Channel channel) {
System.out.println("Result:" + dto.getClass() + ":" + dto.toString());
/*ExchangeDTO dto = (ExchangeDTO)messageConverter.fromMessage(message);
System.out.println("dto:"+dto);*/
}
}
EDIT 2
I am not getting the ExchangeDTO populated with values; instead I get null values.
Here is the log:
15:00:50.994 [main] DEBUG org.springframework.amqp.rabbit.listener.adapter.MessagingMessageListenerAdapter - Processing [GenericMessage [payload=byte[93], headers={contentType=application/json, id=8bf86bf1-7e45-d136-9126-69959f92f100, timestamp=1552680050993}]]
Result:class com.dsicover.dftp.scrubber.subscriber.ExchangeDTO:DTO [inputMap={}, outputMap={}]
public class ExchangeDTO implements Serializable {
/**
*
*/
private static final long serialVersionUID = 1L;
private HashMap<String, Object> inputMap = new HashMap<String, Object>();
private HashMap<String, Object> outputMap = new HashMap<String, Object>();
public HashMap<String, Object> getInputMap() {
return inputMap;
}
public void setInputMap(HashMap<String, Object> inputMap) {
this.inputMap = inputMap;
}
public HashMap<String, Object> getOutputMap() {
return outputMap;
}
public void setOutputMap(HashMap<String, Object> outputMap) {
this.outputMap = outputMap;
}
@Override
public String toString() {
return "DTO [inputMap=" + this.inputMap + ", outputMap=" + this.outputMap + "]";
}
}
Is there anything I am missing in the Jackson2MessageConverter?
Give the @RabbitListener an id, then:
RabbitListenerEndpointRegistry.getListenerContainer(id);
cast the container to AbstractMessageListenerContainer
container.getMessageListener()
cast the listener to ChannelAwareMessageListener
call onMessage()
use a mock Channel and verify the expected call
EDIT
@Autowired
private RabbitListenerEndpointRegistry registry;
@Test
public void test() throws Exception {
AbstractMessageListenerContainer listenerContainer =
(AbstractMessageListenerContainer) this.registry.getListenerContainer("foo");
ChannelAwareMessageListener listener =
(ChannelAwareMessageListener) listenerContainer.getMessageListener();
Channel channel = mock(Channel.class);
listener.onMessage(new Message("foo".getBytes(),
MessagePropertiesBuilder
.newInstance()
.setDeliveryTag(42L)
.build()), channel);
verify(channel).basicAck(42L, false);
}
EDIT2
Your JSON does not look like a DTO; it looks like a Map<String, DTO>.
This works fine for me...
@SpringBootApplication
public class So55188061Application {
public static void main(String[] args) {
SpringApplication.run(So55188061Application.class, args);
}
@RabbitListener(id = "foo", queues = "foo")
public void listen(Map<String, Foo> in, Channel channel, @Header(AmqpHeaders.DELIVERY_TAG) long tag) throws IOException {
System.out.println(in);
channel.basicAck(tag, false);
}
@Bean
public MessageConverter converter() {
return new Jackson2JsonMessageConverter();
}
public static class Foo {
private HashMap<String, Object> inputMap = new HashMap<String, Object>();
private HashMap<String, Object> outputMap = new HashMap<String, Object>();
public HashMap<String, Object> getInputMap() {
return this.inputMap;
}
public void setInputMap(HashMap<String, Object> inputMap) {
this.inputMap = inputMap;
}
public HashMap<String, Object> getOutputMap() {
return this.outputMap;
}
public void setOutputMap(HashMap<String, Object> outputMap) {
this.outputMap = outputMap;
}
@Override
public String toString() {
return "Foo [inputMap=" + this.inputMap + ", outputMap=" + this.outputMap + "]";
}
}
}
and
@RunWith(SpringRunner.class)
@SpringBootTest
public class So55188061ApplicationTests {
public final String sampleMessage =
"{\"ORCH_KEY\":{\"inputMap\":{},"
+ "\"outputMap\":{\"activityId\":\"10001002\",\"activityStatus\":\"SUCCESS\"}}}";
@Autowired
private RabbitListenerEndpointRegistry registry;
@Test
public void test() throws Exception {
AbstractMessageListenerContainer listenerContainer = (AbstractMessageListenerContainer) this.registry
.getListenerContainer("foo");
ChannelAwareMessageListener listener = (ChannelAwareMessageListener) listenerContainer.getMessageListener();
Channel channel = mock(Channel.class);
listener.onMessage(MessageBuilder.withBody(sampleMessage.getBytes())
.andProperties(MessagePropertiesBuilder.newInstance()
.setContentType("application/json")
.setDeliveryTag(42L)
.build())
.build(),
channel);
verify(channel).basicAck(42L, false);
}
}
and
{ORCH_KEY=Foo [inputMap={}, outputMap={activityId=10001002, activityStatus=SUCCESS}]}
Given your complex requirement to have everything on board, I don't see how we can make do with mock(ConnectionFactory.class); we would need to mock much more to have everything working.
Instead, I would suggest taking a look at a real integration test against an existing RabbitMQ broker, or at least an embedded Qpid.
In addition, you may consider using @RabbitListenerTest to spy on your @RabbitListener invocations without interfering with your production code.
More info is in the Reference Manual: https://docs.spring.io/spring-amqp/docs/2.1.4.RELEASE/reference/#test-harness
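A minimal sketch of that test-harness approach, assuming the @RabbitListener above is given an id (the id "testListener" and the spy verification are illustrative):
@Configuration
@RabbitListenerTest
public static class HarnessConfig {
// container factory, message converter, and Listener beans as in the config above
}
@Autowired
private RabbitListenerTestHarness harness;
@Test
public void testViaHarness() throws Exception {
// the harness wraps each @RabbitListener bean with an id in a Mockito spy
Listener listener = this.harness.getSpy("testListener");
// deliver a message as in the onMessage() approach above, then:
verify(listener).listen(any(ExchangeDTO.class), any(Channel.class));
}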
I'd like to not require my clients to provide content_type application/json, but just default to it. I got this working.
I also tried to combine it with another example to implement a custom isFatal(Throwable t) from ConditionalRejectingErrorHandler. I can get my custom error handler to fire, but then it seems to require the content_type property again. I can't figure out how to get them both to work at the same time.
Any ideas?
The below successfully works to avoid requiring content_type.
EDIT: The below code does not work as I thought; an old message in the queue with the content_type application/json property must have been pulled in.
@EnableRabbit
@Configuration
public class ExampleRabbitConfigurer implements
RabbitListenerConfigurer {
@Value("${spring.rabbitmq.host:'localhost'}")
private String host;
@Value("${spring.rabbitmq.port:5672}")
private int port;
@Value("${spring.rabbitmq.username}")
private String username;
@Value("${spring.rabbitmq.password}")
private String password;
@Autowired
private MappingJackson2MessageConverter mappingJackson2MessageConverter;
@Autowired
private DefaultMessageHandlerMethodFactory messageHandlerMethodFactory;
@Bean
public MappingJackson2MessageConverter mappingJackson2MessageConverter() {
return new MappingJackson2MessageConverter();
}
@Bean
public DefaultMessageHandlerMethodFactory messageHandlerMethodFactory() {
DefaultMessageHandlerMethodFactory factory = new DefaultMessageHandlerMethodFactory();
factory.setMessageConverter(mappingJackson2MessageConverter);
return factory;
}
@Override
public void configureRabbitListeners(final RabbitListenerEndpointRegistrar registrar) {
registrar.setMessageHandlerMethodFactory(messageHandlerMethodFactory);
}
}
The below works to override isFatal() in ConditionalRejectingErrorHandler. SimpleRabbitListenerContainerFactory.setMessageConverter() seems like it should serve the same purpose as DefaultMessageHandlerMethodFactory.setMessageConverter(), but obviously this is not the case.
@EnableRabbit
@Configuration
public class ExampleRabbitConfigurer {
@Value("${spring.rabbitmq.host:'localhost'}")
private String host;
@Value("${spring.rabbitmq.port:5672}")
private int port;
@Value("${spring.rabbitmq.username}")
private String username;
@Value("${spring.rabbitmq.password}")
private String password;
@Autowired
ConnectionFactory connectionFactory;
@Autowired
Jackson2JsonMessageConverter jackson2JsonConverter;
@Autowired
ErrorHandler amqpErrorHandlingExceptionStrategy;
@Bean
public Jackson2JsonMessageConverter jackson2JsonConverter() {
return new Jackson2JsonMessageConverter();
}
@Bean
public ErrorHandler amqpErrorHandlingExceptionStrategy() {
return new ConditionalRejectingErrorHandler(new AmqpErrorHandlingExceptionStrategy());
}
@Bean
public SimpleRabbitListenerContainerFactory rabbitListenerContainerFactory(ConnectionFactory connectionFactory) {
SimpleRabbitListenerContainerFactory factory = new SimpleRabbitListenerContainerFactory();
factory.setConnectionFactory(connectionFactory);
factory.setMessageConverter(jackson2JsonConverter);
factory.setErrorHandler(amqpErrorHandlingExceptionStrategy);
return factory;
}
public static class AmqpErrorHandlingExceptionStrategy extends ConditionalRejectingErrorHandler.DefaultExceptionStrategy {
private final Logger LOGGER = org.slf4j.LoggerFactory.getLogger(getClass());
@Override
public boolean isFatal(Throwable t) {
if (t instanceof ListenerExecutionFailedException) {
ListenerExecutionFailedException lefe = (ListenerExecutionFailedException) t;
LOGGER.error("Failed to process inbound message from queue "
+ lefe.getFailedMessage().getMessageProperties().getConsumerQueue()
+ "; failed message: " + lefe.getFailedMessage(), t);
}
return super.isFatal(t);
}
}
}
Use an "after receive" MessagePostProcessor to add the contentType header to the inbound message.
Starting with version 2.0, you can add the MPP to the container factory.
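On 2.0+, that might look like the following sketch (the factory bean mirrors the one from the question; the lambda is the "after receive" MessagePostProcessor):
@Bean
public SimpleRabbitListenerContainerFactory rabbitListenerContainerFactory(ConnectionFactory connectionFactory) {
SimpleRabbitListenerContainerFactory factory = new SimpleRabbitListenerContainerFactory();
factory.setConnectionFactory(connectionFactory);
// runs before message conversion, so the converter sees the forced content type
factory.setAfterReceivePostProcessors(m -> {
m.getMessageProperties().setContentType("application/json");
return m;
});
factory.setMessageConverter(new Jackson2JsonMessageConverter());
return factory;
}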
For earlier versions you can reconfigure...
@SpringBootApplication
public class So47424449Application {
public static void main(String[] args) {
SpringApplication.run(So47424449Application.class, args);
}
@Bean
public ApplicationRunner runner(RabbitListenerEndpointRegistry registry, RabbitTemplate template) {
return args -> {
SimpleMessageListenerContainer container =
(SimpleMessageListenerContainer) registry.getListenerContainer("myListener");
container.setAfterReceivePostProcessors(m -> {
m.getMessageProperties().setContentType("application/json");
return m;
});
container.start();
// send a message with no content type
template.setMessageConverter(new SimpleMessageConverter());
template.convertAndSend("foo", "{\"bar\":\"baz\"}", m -> {
m.getMessageProperties().setContentType(null);
return m;
});
template.convertAndSend("foo", "{\"bar\":\"qux\"}", m -> {
m.getMessageProperties().setContentType(null);
return m;
});
};
}
@Bean
public Jackson2JsonMessageConverter converter() {
return new Jackson2JsonMessageConverter();
}
@RabbitListener(id = "myListener", queues = "foo", autoStartup = "false")
public void listen(Foo foo) {
System.out.println(foo);
if (foo.bar.equals("qux")) {
throw new MessageConversionException("test");
}
}
public static class Foo {
public String bar;
public String getBar() {
return this.bar;
}
public void setBar(String bar) {
this.bar = bar;
}
@Override
public String toString() {
return "Foo [bar=" + this.bar + "]";
}
}
}
As you can see, since it modifies the source message, the modified header is available in the error handler...
2017-11-22 09:39:26.615 WARN 97368 --- [cTaskExecutor-1] ingErrorHandler$DefaultExceptionStrategy : Fatal message conversion error; message rejected; it will be dropped or routed to a dead letter exchange, if so configured: (Body:'{"bar":"qux"}' MessageProperties [headers={}, contentType=application/json, contentEncoding=UTF-8, contentLength=0, receivedDeliveryMode=PERSISTENT, priority=0, redelivered=false, receivedExchange=, receivedRoutingKey=foo, deliveryTag=2, consumerTag=amq.ctag-re1kcxKV14L_nl186stM0w, consumerQueue=foo]), contentType=application/json, contentEncoding=UTF-8, contentLength=0, receivedDeliveryMode=PERSISTENT, priority=0, redelivered=false, receivedExchange=, receivedRoutingKey=foo, deliveryTag=2, consumerTag=amq.ctag-re1kcxKV14L_nl186stM0w, consumerQueue=foo])
I am trying to build a custom sink for unzipping files.
I have this simple code:
public static class ZipIO{
public static class Sink extends com.google.cloud.dataflow.sdk.io.Sink<String> {
private static final long serialVersionUID = -7414200726778377175L;
private final String unzipTarget;
public Sink withDestinationPath(String s){
if (s != null && !s.isEmpty()) {
return new Sink(s);
}
else {
throw new IllegalArgumentException("must assign destination path");
}
}
protected Sink(String path){
this.unzipTarget = path;
}
@Override
public void validate(PipelineOptions po){
if(unzipTarget==null){
throw new RuntimeException();
}
}
@Override
public ZipFileWriteOperation createWriteOperation(PipelineOptions po){
return new ZipFileWriteOperation(this);
}
}
private static class ZipFileWriteOperation extends WriteOperation<String, UnzipResult>{
private static final long serialVersionUID = 7976541367499831605L;
private final ZipIO.Sink sink;
public ZipFileWriteOperation(ZipIO.Sink sink){
this.sink = sink;
}
@Override
public void initialize(PipelineOptions po) throws Exception{
}
@Override
public void finalize(Iterable<UnzipResult> writerResults, PipelineOptions po) throws Exception {
long totalFiles = 0;
for(UnzipResult r:writerResults){
totalFiles +=r.filesUnziped;
}
LOG.info("Unzipped {} Files",totalFiles);
}
@Override
public ZipIO.Sink getSink(){
return sink;
}
@Override
public ZipWriter createWriter(PipelineOptions po) throws Exception{
return new ZipWriter(this);
}
}
private static class ZipWriter extends Writer<String, UnzipResult>{
private final ZipFileWriteOperation writeOp;
public long totalUnzipped = 0;
ZipWriter(ZipFileWriteOperation writeOp){
this.writeOp = writeOp;
}
@Override
public void open(String uID) throws Exception{
}
@Override
public void write(String p){
System.out.println(p);
}
@Override
public UnzipResult close() throws Exception{
return new UnzipResult(this.totalUnzipped);
}
@Override
public ZipFileWriteOperation getWriteOperation(){
return writeOp;
}
}
private static class UnzipResult implements Serializable{
private static final long serialVersionUID = -8504626439217544799L;
public long filesUnziped=0;
public UnzipResult(long filesUnziped){
this.filesUnziped=filesUnziped;
}
}
}
The processing fails with error:
Exception in thread "main" java.lang.IllegalArgumentException: Cannot setCoder(null)
at com.google.cloud.dataflow.sdk.values.TypedPValue.setCoder(TypedPValue.java:67)
at com.google.cloud.dataflow.sdk.values.PCollection.setCoder(PCollection.java:150)
at com.google.cloud.dataflow.sdk.io.Write$Bound.createWrite(Write.java:380)
at com.google.cloud.dataflow.sdk.io.Write$Bound.apply(Write.java:112)
at com.google.cloud.dataflow.sdk.runners.DataflowPipelineRunner$BatchWrite.apply(DataflowPipelineRunner.java:2118)
at com.google.cloud.dataflow.sdk.runners.DataflowPipelineRunner$BatchWrite.apply(DataflowPipelineRunner.java:2099)
at com.google.cloud.dataflow.sdk.runners.PipelineRunner.apply(PipelineRunner.java:75)
at com.google.cloud.dataflow.sdk.runners.DataflowPipelineRunner.apply(DataflowPipelineRunner.java:465)
at com.google.cloud.dataflow.sdk.runners.BlockingDataflowPipelineRunner.apply(BlockingDataflowPipelineRunner.java:169)
at com.google.cloud.dataflow.sdk.Pipeline.applyInternal(Pipeline.java:368)
at com.google.cloud.dataflow.sdk.Pipeline.applyTransform(Pipeline.java:275)
at com.google.cloud.dataflow.sdk.runners.DataflowPipelineRunner.apply(DataflowPipelineRunner.java:463)
at com.google.cloud.dataflow.sdk.runners.BlockingDataflowPipelineRunner.apply(BlockingDataflowPipelineRunner.java:169)
at com.google.cloud.dataflow.sdk.Pipeline.applyInternal(Pipeline.java:368)
at com.google.cloud.dataflow.sdk.Pipeline.applyTransform(Pipeline.java:291)
at com.google.cloud.dataflow.sdk.values.PCollection.apply(PCollection.java:174)
at com.mcd.de.tlogdataflow.StarterPipeline.main(StarterPipeline.java:93)
Any help is appreciated.
Thanks & BR
Philipp
This crash is caused by a bug in the Dataflow Java SDK (specifically, this line), which was also present in the Apache Beam (incubating) Java SDK.
The method Sink.WriteOperation#getWriterResultCoder() must always be overridden, but we failed to mark it abstract. It is fixed in Beam, but unchanged in the Dataflow SDK. You should override this method and return an appropriate coder.
You have some options to come up with the coder:
- Write your own small coder class, wrapping VarLongCoder or BigEndianLongCoder.
- Just use a Long instead of the UnzipResult structure, so you can use those coders as-is.
- Less advisable due to the excess size, you could use SerializableCoder.of(UnzipResult.class).
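For instance, a minimal sketch of the last option, added to the ZipFileWriteOperation above (UnzipResult already implements Serializable):
@Override
public Coder<UnzipResult> getWriterResultCoder() {
// works as-is, at the cost of a larger encoding than a dedicated long coder
return SerializableCoder.of(UnzipResult.class);
}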
How do I authenticate the Dropwizard admin portal, so as to restrict normal users from accessing it?
Please help
In your config, you can set adminUsername and adminPassword under http like so:
http:
adminUsername: user1234
adminPassword: pass5678
For DW 0.7 my approach would be:
public class AdminConstraintSecurityHandler extends ConstraintSecurityHandler {
private static final String ADMIN_ROLE = "admin";
public AdminConstraintSecurityHandler(final String userName, final String password) {
final Constraint constraint = new Constraint(Constraint.__BASIC_AUTH, ADMIN_ROLE);
constraint.setAuthenticate(true);
constraint.setRoles(new String[]{ADMIN_ROLE});
final ConstraintMapping cm = new ConstraintMapping();
cm.setConstraint(constraint);
cm.setPathSpec("/*");
setAuthenticator(new BasicAuthenticator());
addConstraintMapping(cm);
setLoginService(new AdminMappedLoginService(userName, password, ADMIN_ROLE));
}
}
public class AdminMappedLoginService extends MappedLoginService {
public AdminMappedLoginService(final String userName, final String password, final String role) {
putUser(userName, new Password(password), new String[]{role});
}
@Override
public String getName() {
return "Hello";
}
@Override
protected UserIdentity loadUser(final String username) {
return null;
}
@Override
protected void loadUsers() throws IOException {
}
}
and using them like this:
environment.admin().setSecurityHandler(new AdminConstraintSecurityHandler(...))
Newer Jetty versions do not have MappedLoginService, so @Kamil's answer no longer works. I have modified their answer to get it working as of Dropwizard 1.2.2:
public class AdminConstraintSecurityHandler extends ConstraintSecurityHandler {
private static final String ADMIN_ROLE = "admin";
public AdminConstraintSecurityHandler(final String userName, final String password) {
final Constraint constraint = new Constraint(Constraint.__BASIC_AUTH, ADMIN_ROLE);
constraint.setAuthenticate(true);
constraint.setRoles(new String[]{ADMIN_ROLE});
final ConstraintMapping cm = new ConstraintMapping();
cm.setConstraint(constraint);
cm.setPathSpec("/*");
setAuthenticator(new BasicAuthenticator());
addConstraintMapping(cm);
setLoginService(new AdminLoginService(userName, password));
}
public class AdminLoginService extends AbstractLoginService {
private final UserPrincipal adminPrincipal;
private final String adminUserName;
public AdminLoginService(final String userName, final String password) {
this.adminUserName = Objects.requireNonNull(userName);
this.adminPrincipal = new UserPrincipal(userName, new Password(Objects.requireNonNull(password)));
}
@Override
protected String[] loadRoleInfo(final UserPrincipal principal) {
if (adminUserName.equals(principal.getName())) {
return new String[]{ADMIN_ROLE};
}
return new String[0];
}
@Override
protected UserPrincipal loadUserInfo(final String userName) {
return adminUserName.equals(userName) ? adminPrincipal : null;
}
}
}
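Wiring is unchanged from the earlier answer; a sketch, with illustrative hard-coded credentials standing in for whatever your configuration provides:
environment.admin().setSecurityHandler(new AdminConstraintSecurityHandler("adminUser", "adminPassword"));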