The Citrus Framework documentation indicates that integration test console output can be logged via the SLF4J logging system. It's not obvious whether this is automatic or whether it needs to be enabled in some way. In my experience it is not enabled by default: no log file containing what appears on the test-run console has been produced.
My application uses Log4J with an associated log4j2-spring.xml file (in src/main/resources) to define log formats and files. When an integration test is run via Citrus, the application console output (and other information) is properly logged to the files specified in the Log4J config file. There is, however, no Citrus console output logged anywhere except on the console.
How do I enable the logging of the Citrus test console output? I created a separate log4j.xml that I placed in src/test/resources, but this seems to have been ignored.
Do I need to specify a separate logger in the config that's specific for Citrus output?
Citrus uses SLF4J, which is a facade over several other logging frameworks. So you need to pick your favorite logging framework (in your case Log4J) and add an SLF4J logger binding for that framework. This is all described in the SLF4J user manual.
I would suggest adding the SLF4J logger binding for Log4J as a test-scoped dependency in your project. Also, depending on your Log4J configuration setup, you may need to add a Log4J logger configuration for com.consol.citrus and set a proper logging level on it in order to see the Citrus output logged by Log4J.
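As a sketch, assuming Maven and Log4j 2 (your log4j2-spring.xml suggests Log4j 2; the version below is an assumption, so match it to your own Log4j version), the test-scoped binding and the logger entry might look like:

```xml
<!-- pom.xml: SLF4J binding for Log4j 2, test scope
     (version is an assumption; match your Log4j 2 version) -->
<dependency>
  <groupId>org.apache.logging.log4j</groupId>
  <artifactId>log4j-slf4j-impl</artifactId>
  <version>2.17.2</version>
  <scope>test</scope>
</dependency>

<!-- log4j2-spring.xml: route Citrus output to an existing appender;
     "FileAppender" here is a placeholder for one you already define -->
<Logger name="com.consol.citrus" level="debug" additivity="false">
  <AppenderRef ref="FileAppender"/>
</Logger>
```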
Related
Not able to enable the debug|trace level logging of the dataflow workers.
The documentation (https://cloud.google.com/dataflow/docs/guides/logging#SettingLevels)
indicates the use of DataflowWorkerLoggingOptions to programmatically override the default log level on the worker and enable debug/trace level logging; however, the interface is deprecated and no longer present in beam-sdk 2.27.0.
Has anyone been able to enable worker-level debugging in Cloud Dataflow, in any way?
The documentation is still up to date and the interface is still present and will work.
The interface is deprecated because the Java-based Dataflow worker is not used when running a pipeline using Beam's portability framework. Quoting the deprecation message:
@deprecated This interface will no longer be the source of truth for worker logging configuration once jobs are executed using a dedicated SDK harness instead of user code being co-located alongside Dataflow worker code. Please set the option below and also the corresponding option within org.apache.beam.sdk.options.SdkHarnessOptions to ensure forward compatibility.
So what you should do is follow the instructions that you linked and also set up logging in SdkHarnessOptions.
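As a sketch, both sets of levels can be supplied as pipeline options on the command line (the package name in the overrides maps below is a placeholder; substitute your own packages or classes):

```
# Dataflow worker logging (as described in the linked documentation)
--defaultWorkerLogLevel=DEBUG
--workerLogLevelOverrides={"org.apache.beam.examples":"TRACE"}

# SDK harness logging (SdkHarnessOptions, for forward compatibility)
--defaultSdkHarnessLogLevel=DEBUG
--sdkHarnessLogLevelOverrides={"org.apache.beam.examples":"TRACE"}
```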
When running Apache Beam Google Dataflow jobs and using the SLF4J logger, we don't get anything beyond the log message itself in Stackdriver.
Examples of additional information would be the function, line number, etc.
Is there any way to configure the logger, like a log4j.xml or a Java logging properties file?
There is no way to customize log messages in Dataflow beyond what is shown in the logging pipeline messages documentation.
Have you looked at Cloud Logging? It has several features such as custom logs / the Ingestion API. In case you haven't, take a look at this guide to set up the SLF4J logging facade through the Logback appender and Cloud Logging. Once you have configured Logback to use Cloud Logging, you can use the SLF4J logging API. Another option is to use the Cloud Logging API with a default Java Logging API handler, which can be added programmatically or by using a configuration file; here is an example using the logger.
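A minimal logback.xml sketch using the Cloud Logging Logback appender (this assumes the google-cloud-logging-logback dependency is on the classpath; the log name is a placeholder):

```xml
<configuration>
  <appender name="CLOUD" class="com.google.cloud.logging.logback.LoggingAppender">
    <!-- log name under which entries appear in Cloud Logging (placeholder) -->
    <log>application.log</log>
  </appender>
  <root level="info">
    <appender-ref ref="CLOUD"/>
  </root>
</configuration>
```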
Isaac Miliani, I tried the same Google Cloud Logging option as described in the Google Cloud docs:
Added logback.xml to src/main/resources (classpath).
Created a LoggingEventEnhancer class to add new labels.
Added markers to logger error, to find the type of error in Stackdriver.
But the logs in Stackdriver don't have the new labels added via the logging appender. I think the logback.xml is not found by the Maven compile command used to deploy the job in Dataflow.
Can you tell what's going wrong here?
I've gone through the ReportPortal docs and it states which appender should be used to send logs to report portal.
I guess what I'm missing here is how ReportPortal itself is configured in the first place (i.e., what host name & credentials should be used, etc.)
Thanks
First of all, you need to set up the test framework integration:
http://reportportal.io/docs/Test-Framework-Integration
Choose the one relevant to you.
The main configuration for those integrations lives in the reportportal.properties file:
http://reportportal.io/docs/JVM-based-clients-configuration
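A minimal reportportal.properties sketch (all values below are placeholders; use the endpoint, project, launch name, and API token from your own ReportPortal instance):

```properties
rp.endpoint = http://your-reportportal-host:8080
rp.uuid = your-api-token
rp.launch = my_launch_name
rp.project = my_project
rp.enable = true
```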
I want to avoid 'info'-level logging for certain classes, while allowing all logs from other classes. How do I configure this scenario in my log4j2 XML file?
You would add loggers to your configuration with the names of the classes you don't want logged and specify a level of error or fatal.
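A sketch of a log4j2.xml that keeps info logging globally but raises the threshold for specific classes (the class names here are hypothetical placeholders):

```xml
<Configuration>
  <Appenders>
    <Console name="Console" target="SYSTEM_OUT">
      <PatternLayout pattern="%d %-5level %logger{36} - %msg%n"/>
    </Console>
  </Appenders>
  <Loggers>
    <!-- these classes only log at error or worse -->
    <Logger name="com.example.ChattyService" level="error"/>
    <Logger name="com.example.NoisyDao" level="error"/>
    <!-- everything else logs at info and above -->
    <Root level="info">
      <AppenderRef ref="Console"/>
    </Root>
  </Loggers>
</Configuration>
```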
I'm developing a Grails plugin which I would like to have send an email after the application's Config.groovy has been processed. (For one reason, the Config.groovy, or other external configuration file, will have all the config I need for emails.)
I attempted a MyPluginBootStrap.groovy; however, this got executed before the application's configuration (and indeed, its BootStrap.groovy). And it seems all the normal plugin onXXX() hooks are executed before application startup.
My next thought is simply to tell people that they need to manually call a method in the plugin to trigger the functionality. So, they'd potentially add a call to their BootStrap.groovy. But I want it more automatic.
Ideally I'd like things to work as follows:
A developer adds the plugin to their BuildConfig.groovy;
A developer specifies some configuration in their Config.groovy - or external config; and then
When your app starts, the email gets sent, without them having to do anything more than items 1 and 2 above.
Is this possible?