Using log4j2 ConfigurationBuilder to initialize the logger after startup

I created a custom log4j2 configuration using ConfigurationBuilder and want to initialize this configuration and start using log4j afterwards, in other words, without having a configuration file present when the project initializes...
According to this page, under "Reconfigure Log4j Using ConfigurationBuilder with the Configurator", it says:
An alternative to a custom ConfigurationFactory is to configure with the Configurator. Once a Configuration object has been constructed, it can be passed to one of the Configurator.initialize methods to set up the Log4j configuration. Using the Configurator in this manner allows the application control over when Log4j is initialized. However, should any logging be attempted before Configurator.initialize() is called then the default configuration will be used for those log events.
So this should be possible.
This is my code. It's almost exactly as it appears on that page, with a few adjustments:
ConfigurationBuilder<BuiltConfiguration> builder = ConfigurationBuilderFactory.newConfigurationBuilder();
builder.setStatusLevel(Level.DEBUG);
builder.setConfigurationName("RollingBuilder");

// create a console appender
AppenderComponentBuilder appenderBuilder = builder.newAppender("Stdout", "CONSOLE")
        .addAttribute("target", ConsoleAppender.Target.SYSTEM_OUT);
appenderBuilder.add(builder.newLayout("PatternLayout")
        .addAttribute("pattern", "%d{dd/MMM/yyyy HH:mm:ss,SSS}- %c{1}: %m%n"));
builder.add(appenderBuilder);

// create a rolling file appender
LayoutComponentBuilder layoutBuilder = builder.newLayout("PatternLayout")
        .addAttribute("pattern", "%d [%t] %-5level: %msg%n");
ComponentBuilder triggeringPolicy = builder.newComponent("Policies")
        .addComponent(builder.newComponent("TimeBasedTriggeringPolicy")
                .addAttribute("interval", "1")
                .addAttribute("modulate", "true"));
ComponentBuilder rolloverStrategy = builder.newComponent("DefaultRolloverStrategy")
        .addAttribute("max", "4");
appenderBuilder = builder.newAppender("RollingFile", "RollingFile")
        .addAttribute("fileName", "logs/app-info.log")
        .addAttribute("filePattern", "logs/app-info-%d{yyyy-MM-dd}--%i.log")
        .add(layoutBuilder)
        .addComponent(triggeringPolicy)
        .addComponent(rolloverStrategy);
builder.add(appenderBuilder);

// create a new logger and the root logger
builder.add(builder.newLogger("root", Level.DEBUG)
        .add(builder.newAppenderRef("RollingFile"))
        .addAttribute("additivity", false));
builder.add(builder.newRootLogger(Level.DEBUG)
        .add(builder.newAppenderRef("RollingFile")));

LoggerContext ctx = Configurator.initialize(builder.build());
However, when I run that code and then make log statements right after, I get this error:
ERROR StatusLogger No log4j2 configuration file found. Using default configuration: logging only errors to the console. Set system property 'org.apache.logging.log4j.simplelog.StatusLogger.level' to TRACE to show Log4j2 internal initialization logging.
So obviously it thinks I don't have a configuration file... does anyone know how I can get my Logger to recognize the configuration I created in code?

From the Log4j 2 docs, here is how Log4j finds the available ConfigurationFactories:

1. A system property named "log4j.configurationFactory" can be set with the name of the ConfigurationFactory to be used.
2. ConfigurationFactory.setConfigurationFactory(ConfigurationFactory) can be called with the instance of the ConfigurationFactory to be used. This must be called before any other calls to Log4j.
3. A ConfigurationFactory implementation can be added to the classpath and configured as a plugin in the "ConfigurationFactory" category. The Order annotation can be used to specify the relative priority when multiple applicable ConfigurationFactories are found.
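For completeness, the second option looks like this in code (a minimal sketch; it assumes the CustomConfigurationFactory shown further down):

// Must run before any other interaction with Log4j,
// e.g. as the first statement in main().
ConfigurationFactory.setConfigurationFactory(new CustomConfigurationFactory());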
In a test I've arranged this:
public class Log4j2Example {

    static {
        System.setProperty("log4j.configurationFactory", CustomConfigurationFactory.class.getName());
    }

    private static final Logger LOG = LogManager.getLogger(Log4j2Example.class);

    public static void main(String[] args) {
        LOG.debug("This Will Be Printed On Debug");
        LOG.info("This Will Be Printed On Info");
        LOG.warn("This Will Be Printed On Warn");
        LOG.error("This Will Be Printed On Error");
        LOG.fatal("This Will Be Printed On Fatal");
        LOG.info("Appending string: {}.", "Hello, World");
    }
}
The ConfigurationFactory implemented:
@Plugin(name = "CustomConfigurationFactory", category = ConfigurationFactory.CATEGORY)
@Order(50)
public class CustomConfigurationFactory extends ConfigurationFactory {
    ---8<----
}
If you set the value of the log4j.configurationFactory property before Log4j 2 starts, you're done.
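The property can equally be passed on the JVM command line, so it is set before any Log4j class loads (a usage sketch; the package, jar, and main class names here are hypothetical):

java -Dlog4j.configurationFactory=com.example.CustomConfigurationFactory -cp app.jar com.example.Main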

If you prefer using the Configurator instead of a custom ConfigurationFactory, you have to make sure that your initialization runs before any call to LogManager.getLogger(...). For instance, you could use a static initialization block:
public class MyApplication {

    public static final Logger log;

    static {
        createCustomConfiguration();
        log = LogManager.getLogger(MyApplication.class);
    }

    public static void createCustomConfiguration() {
        // your initialization code
    }

    public static void main(String[] args) {
        log.info("Log and roll!");
        // do stuff
    }
}
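To make this concrete, here is a minimal sketch of what createCustomConfiguration() could contain, using just a console appender for brevity (the question's rolling-file setup would slot in the same way):

public static void createCustomConfiguration() {
    ConfigurationBuilder<BuiltConfiguration> builder = ConfigurationBuilderFactory.newConfigurationBuilder();
    builder.setConfigurationName("ProgrammaticConfig");
    // a single console appender, for brevity
    AppenderComponentBuilder console = builder.newAppender("Stdout", "CONSOLE")
            .addAttribute("target", ConsoleAppender.Target.SYSTEM_OUT);
    console.add(builder.newLayout("PatternLayout")
            .addAttribute("pattern", "%d [%t] %-5level: %msg%n"));
    builder.add(console);
    builder.add(builder.newRootLogger(Level.DEBUG)
            .add(builder.newAppenderRef("Stdout")));
    // because this runs before the first LogManager.getLogger(...) call,
    // the default configuration is never installed
    Configurator.initialize(builder.build());
}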

Related

How to run a Testcontainer with a dynamic port for Spring Data Elasticsearch

My test case uses the @SpringBootTest annotation to bring up the context and has a repository @Autowired. The test container is started in a @BeforeAll method. The problem is that RestClientConfig is initialized/injected before @BeforeAll runs in the test case. When the test container starts, it exposes a dynamic port.
For now I have to set a fixed port (34343) in the test container and use the same port in the properties file for RestClientConfig:
container = new ElasticsearchContainer(ELASTICSEARCH_IMAGE)
        .withEnv("discovery.type", "single-node")
        .withExposedPorts(9200)
        .withCreateContainerCmdModifier(cmd -> cmd.withHostConfig(
                new HostConfig().withPortBindings(
                        new PortBinding(Ports.Binding.bindPort(34343), new ExposedPort(9200)))));
Is there a way to start the container and get its dynamic port, then use it to initialize RestClientConfig?
I didn't use the @Testcontainers annotation, though. Is it needed?
Newer versions of Spring provide @DynamicPropertySource for exactly this use case:
https://docs.spring.io/spring-framework/docs/current/javadoc-api/org/springframework/test/context/DynamicPropertySource.html
Your code should look roughly like this:
@SpringJUnitConfig(...)
@Testcontainers
class ExampleIntegrationTests {

    @Container
    static ElasticsearchContainer elastic = new ElasticsearchContainer(ELASTICSEARCH_IMAGE)
            .withEnv("discovery.type", "single-node");

    // ...

    @DynamicPropertySource
    static void elasticProperties(DynamicPropertyRegistry registry) {
        registry.add("spring.elasticsearch.uris", elastic::getHttpHostAddress);
    }
}
You can use a context configuration initializer to set properties during runtime, which you can later use in your RestClientConfig.
Let me show you on the example of a PostgreSQL container setup:
@SpringBootTest(webEnvironment = SpringBootTest.WebEnvironment.RANDOM_PORT, classes = Application.class)
@ContextConfiguration(initializers = AbstractTestcontainersTest.DockerPostgreDataSourceInitializer.class)
public abstract class AbstractTestcontainersTest {

    protected static final String DB_CONTAINER_NAME = "postgres-auth-test";

    protected static PostgreSQLContainer<?> postgreDBContainer =
            new PostgreSQLContainer<>(DockerImageName.parse("public.ecr.aws/docker/library/postgres:12.10-alpine")
                    .asCompatibleSubstituteFor("postgres"))
                    .withUsername("postgres")
                    .withPassword("change_me")
                    .withInitScript("db.sql")
                    .withCreateContainerCmdModifier(cmd -> cmd.withName(DB_CONTAINER_NAME))
                    .withDatabaseName("zpot_main");

    @BeforeAll
    public static void beforeAll() throws ShellExecutionException {
        postgreDBContainer.start();
    }

    @AfterAll
    public static void afterAll() {
        postgreDBContainer.stop();
    }

    public static class DockerPostgreDataSourceInitializer implements ApplicationContextInitializer<ConfigurableApplicationContext> {

        @Override
        public void initialize(ConfigurableApplicationContext applicationContext) {
            TestPropertySourceUtils.addInlinedPropertiesToEnvironment(
                    applicationContext,
                    "spring.datasource.url=" + postgreDBContainer.getJdbcUrl(),
                    "spring.datasource.username=" + postgreDBContainer.getUsername(),
                    "spring.datasource.password=" + postgreDBContainer.getPassword()
            );
        }
    }
}
All the configuration is done in DockerPostgreDataSourceInitializer, where I set all the properties I need. You also need to annotate your test class with the @ContextConfiguration annotation. You can do something similar with your ElasticsearchContainer. As I just checked, ElasticsearchContainer has a getHttpHostAddress() method which returns the host + dynamic_port combination for your container. You can get that host-port pair and set it in the properties to be used later in your client configuration. If you need just the port, you can call container.getMappedPort(9200) and again set that port in the properties.
Regarding the @Testcontainers annotation: you need it if you want Testcontainers to manage your container lifecycle. In that case you also need to annotate the container with the @Container annotation. Your container will be started either once before all test methods in a class (if the container is a static field) or before each test method (if it's a regular field). You can read more about that here: https://www.testcontainers.org/test_framework_integration/junit_5/#extension.
Or you can start your container manually in a @BeforeAll or @BeforeEach annotated setup method, as in the sketch below. In other words: no, you don't have to use the @Testcontainers annotation.
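Putting the two answers together, a minimal sketch of the manual approach (the class name is illustrative, and ELASTICSEARCH_IMAGE is the constant from the question): start the container yourself before the context is built and register its dynamic address via @DynamicPropertySource:

import org.springframework.boot.test.context.SpringBootTest;
import org.springframework.test.context.DynamicPropertyRegistry;
import org.springframework.test.context.DynamicPropertySource;
import org.testcontainers.elasticsearch.ElasticsearchContainer;

@SpringBootTest
class ElasticManualLifecycleTest {

    // no @Testcontainers/@Container: the lifecycle is managed by hand
    static ElasticsearchContainer elastic =
            new ElasticsearchContainer(ELASTICSEARCH_IMAGE).withEnv("discovery.type", "single-node");

    @DynamicPropertySource
    static void elasticProperties(DynamicPropertyRegistry registry) {
        elastic.start(); // runs before the application context is created, so the port is known in time
        // getHttpHostAddress() returns host + dynamic port; for just the port, use elastic.getMappedPort(9200)
        registry.add("spring.elasticsearch.uris", elastic::getHttpHostAddress);
    }
}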

Deploying a transaction event listener in a Neo4jDesktop installation

I have created a project that contains an ExtensionFactory subclass annotated as @ServiceProvider, which returns a LifecycleAdapter subclass that registers a transaction event listener in its start() method, as shown in this example. The code is below:
@ServiceProvider
public class EventListenerExtensionFactory extends ExtensionFactory<EventListenerExtensionFactory.Dependencies> {

    private final List<TransactionEventListener<?>> listeners;

    public EventListenerExtensionFactory() {
        this(List.of(new MyListener()));
    }

    public EventListenerExtensionFactory(List<TransactionEventListener<?>> listeners) {
        super(ExtensionType.DATABASE, "EVENT_LISTENER_EXT_FACTORY");
        this.listeners = listeners;
    }

    @Override
    public Lifecycle newInstance(ExtensionContext context, Dependencies dependencies) {
        return new EventListenerLifecycleAdapter(dependencies, listeners);
    }

    @RequiredArgsConstructor
    private static class EventListenerLifecycleAdapter extends LifecycleAdapter {

        private final Dependencies dependencies;
        private final List<TransactionEventListener<?>> listeners;

        @Override
        public void start() {
            DatabaseManagementService managementService = dependencies.databaseManagementService();
            listeners.forEach(listener -> managementService.registerTransactionEventListener(
                    DEFAULT_DATABASE_NAME, listener));
            dependencies.log()
                    .getUserLog(EventListenerExtensionFactory.class)
                    .info("Registering transaction event listener for database " + DEFAULT_DATABASE_NAME);
        }
    }

    interface Dependencies {
        DatabaseManagementService databaseManagementService();
        LogService log();
    }
}
It works fine in an integration test:
public AbstractDatabaseTest(TransactionEventListener<?>... listeners) {
    URI uri = Neo4jBuilders.newInProcessBuilder()
            .withExtensionFactories(List.of(new EventListenerExtensionFactory(List.of(listeners))))
            .withDisabledServer()
            .build()
            .boltURI();
    driver = GraphDatabase.driver(uri);
    session = driver.session();
}
Then I copy the jar file into the plugins directory of my desktop database:
$ cp build/libs/<myproject>.jar /mnt/c/Users/albert.gevorgyan/.Neo4jDesktop/relate-data/dbmss/dbms-7fe3cbdb-11b2-4ca2-81eb-474edbbb3dda/plugins/
I restart the database and even the whole desktop Neo4j program but it doesn't seem to identify the plugin or to initialize the factory: no log messages are found in neo4j.log after the start event, and the transaction events that should be captured by my listener are ignored. Interestingly, a custom function that I have defined in the same jar file actually works - I can call it in the browser. So something must be missing in the extension factory as it doesn't get instantiated.
Is it possible at all to deploy an ExtensionFactory in a Desktop installation and if yes, what am I doing wrong?
It works after I added a provider configuration file to META-INF/services, as explained in https://www.baeldung.com/java-spi. Neo4j finds it then.
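For reference, the provider configuration file is a plain text file named after the service interface, containing the fully qualified name of the implementation. For a Neo4j 4.x ExtensionFactory it would look something like this (the com.example package is hypothetical; check the exact interface package for your Neo4j version):

# src/main/resources/META-INF/services/org.neo4j.kernel.extension.ExtensionFactory
com.example.EventListenerExtensionFactory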

Getting closed before endTest call in Selenium using Extent Reports

BaseTest.java:

private static ReportService reportService; // the report service interface

@BeforeSuite:
reportService = new ExtentReportService(getConfig()); // new instance of ExtentReportService

@BeforeMethod:
reportService.startTest(testname); // start the test, passing the test's name and description

@AfterMethod:
reportService.endTest(); // end the test

@AfterSuite:
reportService.close(); // close the report

ExtentReportService.java: // contains the various Extent API methods (designed to be generic)

protected static ExtentReports extent; // static instance of ExtentReports
protected static ExtentTest test; // static instance of ExtentTest

@Override // startTest method
startTest(Method method) {
    testMetaData = getTestMetaData(method);
    test = extent.startTest(testMetaData.getId(), testMetaData.getSummary());
}

@Override // endTest method
endTest() {
    extent.endTest(test);
    extent.flush();
}
The above is my Selenium code.
When I execute my suite file with parallel="methods" and thread-count="3", I get the following error: "com.relevantcodes.extentreports.ExtentTestInterruptedException: Close was called before test could end safely using EndTest."
While debugging, I found that @AfterSuite was being called even before all the endTest() calls in @AfterMethod had executed.
I tried different variations to get the code to work, such as removing static, calling endTest() in the test itself rather than in @AfterMethod, and removing the close() call from @AfterSuite, but I still get the same error.
I tried all the possible solutions given on the internet, but to no avail.
Attaching a hierarchy file for the ExtentReport used in my project
I also tried the following solutions given on Stack Overflow:
- Extent report: com.relevantcodes.extentreports.ExtentTestInterruptedException: Close was called before test could end safely using EndTest
- Unsynchronized output
- XMF file for parallel test
ExtentReports is initialized in an ExtentManager class using a singleton:
public class ExtentManager {

    private static ExtentReports extent;

    public static ExtentReports getInstance() {
        if (extent == null) {
            extent = new ExtentReports(System.getProperty("user.dir")
                    + "\\target\\surefire-reports\\html\\extent.html", true, DisplayOrder.OLDEST_FIRST);
            extent.loadConfig(new File(System.getProperty("user.dir")
                    + "\\src\\test\\resources\\extentconfig\\ReportsConfig.xml"));
        }
        return extent;
    }
}
Declare it in the TestBase class as a global:

public ExtentReports repo = ExtentManager.getInstance();
public static ExtentTest test;

Call startTest in public void onTestStart(ITestResult result):

test = repo.startTest(result.getName().toUpperCase());

Call endTest in the custom listener class, both in (a) public void onTestFailure(ITestResult result) and (b) public void onTestSuccess(ITestResult result):

repo.endTest(test);

Call close() OR flush() in the @AfterSuite method in the TestBase class, but NOT both:

// repo.close();
repo.flush();
Note: I have ExtentReports ver-2.41.2 and TestNG ver-7.1.0.
After the above steps, the error 'Getting closed before endTest call in Selenium using Extent Reports' was resolved.
The Extent report now generates each test successfully.
Try it out!
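For reference, here is a minimal sketch of how the listener pieces above could fit together (the class name is illustrative; a ThreadLocal holds the ExtentTest so that parallel="methods" does not clobber tests across threads):

import com.relevantcodes.extentreports.ExtentReports;
import com.relevantcodes.extentreports.ExtentTest;
import org.testng.ITestListener;
import org.testng.ITestResult;

public class CustomListener implements ITestListener {

    private static final ExtentReports repo = ExtentManager.getInstance();
    // one ExtentTest per thread, so parallel methods don't overwrite each other
    private static final ThreadLocal<ExtentTest> test = new ThreadLocal<>();

    @Override
    public void onTestStart(ITestResult result) {
        test.set(repo.startTest(result.getName().toUpperCase()));
    }

    @Override
    public void onTestSuccess(ITestResult result) {
        repo.endTest(test.get());
    }

    @Override
    public void onTestFailure(ITestResult result) {
        repo.endTest(test.get());
    }
}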

log4j2 evaluation of user-created Lookup plugins is inconsistent and does not work at all when used programmatically

I'm observing some behaviour with log4j2 Plugins that I can only interpret as inconsistent.
I'm trying to write a plugin to be used with a RollingFileAppender that will let me change the header of each rolled file. I've decided to use a Lookup plugin for this (call it dynamicheader). One additional complication is that the RollingFileAppender and its associated PatternLayout are created programmatically. The problems I observe are detailed below, but in summary:
1. The header property of the PatternLayout is never evaluated: calling getHeader() on the PatternLayout always returns "${dynamicheader:key}" instead of the value associated with the key. If I replace "${dynamicheader:key}" with "${sys:key}", the header IS evaluated and getHeader() returns whatever value I set for "key".
2. If I configure a FileAppender with my plugin in a configuration file ("${dynamicheader:key}", see below), calling getHeader() on the associated PatternLayout DOES return the value associated with the key, but it is only evaluated once: if I change the value (header) associated with a key, subsequent calls to getHeader() do not re-evaluate the header and return the original key. However, if I use two $ signs when defining the PatternLayout ("$${dynamicheader:key}"), the header IS evaluated on every call to getHeader(). This isn't surprising, since (I gather) it is the expected behaviour; but if I again replace my plugin with the SystemPropertyLookup, I need only a single $ for the header to be evaluated on every call to getHeader().
My plugin, configuration file and unit test to demonstrate the behaviour follow:
Plugin
package my.custom.plugins;

@Plugin(name = "dynamicheader", category = StrLookup.CATEGORY)
public class DynamicHeader extends AbstractLookup {

    private static Map<String, String> headerByAppender = Collections.synchronizedMap(new HashMap<>());

    @Override
    public String lookup(final LogEvent event, final String key) {
        return get(key);
    }

    public void put(String key, String val) {
        headerByAppender.put(key, val);
    }

    public String get(String key) {
        return headerByAppender.get(key);
    }
}
Note that my plugin is very similar to the SystemPropertiesLookup plugin, so I'd expect them to behave similarly.
Config:

Configuration:
  name: TestConfig
  packages: "my.custom.plugins"
  Appenders:
    File:
      name: FILE
      fileName: target/surefire-reports/unit-test.log
      append: false
      PatternLayout:
        Pattern: "%m%n%ex"
        header: ${dynamicheader:key2}
  Loggers:
    Root:
      AppenderRef:
        ref: FILE
Tests:
public void testMyPlugin1() {
    DynamicHeader dh = new DynamicHeader();
    dh.put("key1", "val1");
    PatternLayout pl = PatternLayout.newBuilder()
            .withPattern("%m%n%ex")
            .withHeader("${dynamicheader:key1}") // use my plugin here...
            .build();
    assertEquals("val1", pl.getHeader()); // <- this fails. pl.getHeader() always returns "${dynamicheader:key1}"
}

public void testSysPlugin1() {
    System.setProperty("key1", "val1");
    PatternLayout pl = PatternLayout.newBuilder()
            .withPattern("%m%n%ex")
            .withHeader("${sys:key1}") // use sys plugin here...
            .build();
    assertEquals("val1", pl.getHeader()); // <- this works.
}

public void testMyPlugin2() {
    DynamicHeader dh = new DynamicHeader();
    Layout l = ((LoggerContext) LogManager.getContext(false)).getConfiguration().getAppender("FILE").getLayout();
    dh.put("key2", "val2");
    assertEquals("val2", l.getHeader()); // <- this works.
    dh.put("key2", "val3");
    assertEquals("val3", l.getHeader()); // <- this fails. getHeader() returns "key2"
}
However, if I change the configuration file from
header: ${dynamicheader:key2}
to
header: $${dynamicheader:key2}
the above test passes. In contrast, if I change
header: ${dynamicheader:key2}
to
header: ${sys:key2}
the equivalent test passes:
public void testSysPlugin2() {
    Layout l = ((LoggerContext) LogManager.getContext(false)).getConfiguration().getAppender("FILE").getLayout();
    System.setProperty("key2", "val2");
    assertEquals("val2", l.getHeader()); // <- this works.
    System.setProperty("key2", "val3");
    assertEquals("val3", l.getHeader()); // <- this also works.
}
My questions then are:
1. Why is my plugin never evaluated when the PatternLayout is created programmatically? Am I doing something wrong?
2. Why does the SystemPropertyLookup behave differently from a virtually identical user-authored lookup?
I'm using log4j2 v2.8.2.
Thanks in advance!

How to get values from a properties file in Grails?

How do I get values from a properties file, and where should I put the file?
Thank you.
EDIT: I'm using Grails 3.1.5, and I'm trying to get the properties from a (Quartz) job class.
Either keep your properties directly in the Config.groovy file, or create a .properties file to hold the properties and register it in Config.groovy:
grails.config.locations = ["classpath:grails-app-config.properties"]
You can then access a property anywhere in the application using
grailsApplication.config."propertyName"
We have a trait like this:
/**
 * Load config from config locations given by property grails.config.locations.
 * Based on http://grails.1312388.n4.nabble.com/Grails-3-External-config-td4658823.html
 */
trait ExternalConfigurationLoader implements EnvironmentAware {

    @Override
    void setEnvironment(Environment environment) {
        loadExternalConfigLocations(environment)
    }

    void loadExternalConfigLocations(Environment environment) {
        if (environment) {
            def configLocations = findConfigLocationsFromApplicationGroovy()
            DefaultResourceLocator resourceLocator = new DefaultResourceLocator()
            for (String configLocation in configLocations) {
                loadConfigLocation(configLocation, grails.util.Environment.current.name, environment, resourceLocator)
            }
        }
    }

    List<String> findConfigLocationsFromApplicationGroovy() {
        def applicationGroovy = this.getClass().classLoader.getResource('application.groovy')
        if (applicationGroovy) {
            def applicationConfiguration = new ConfigSlurper(grails.util.Environment.current.name).parse(applicationGroovy)
            return applicationConfiguration.grails.config.locations
        }
        []
    }

    void loadConfigLocation(String configLocation, String currentEnvironmentName, Environment environment, ResourceLocator resourceLocator) {
        def configurationResource = resourceLocator.findResourceForURI(configLocation)
        if (configurationResource) {
            log.debug "External config '$configLocation' found. Loading."
            def configSlurper = new ConfigSlurper(currentEnvironmentName)
            def config = configSlurper.parse(configurationResource.getURL())
            environment.propertySources.addFirst(new MapPropertySource(configLocation, config))
        } else {
            log.debug "External config '$configLocation' not found."
        }
    }
}
Then we can add this trait to Application.groovy:
class Application extends GrailsAutoConfiguration implements ExternalConfigurationLoader {
and configure external config files in application.groovy:
grails.config.locations = ["classpath:myapp-config.groovy", "file:dev-config.groovy"]
If you are using Tomcat, you can then put myapp-config.groovy in Tomcat's lib folder.
Note: this variant only supports external config files of type .groovy, but you can extend it to support .yml or .properties if you prefer. Also note that this example has some issues with overriding values from the environment block in application.yml, so if you plan to override dataSource you will need to move the default dataSource configuration from application.yml to application.groovy first.
There is also a plugin in the making that adds similar support for grails.config.locations. See https://github.com/sbglasius/external-config
