log4j2 and custom key value using JSONLayout

I would like to add a String key and an Integer value to my log using Log4j2.
Is there a way to do it? When I added properties to the ThreadContext I was able to add only String:String keys and values, but that does not help; I have numbers that I need to present in Kibana (some graphs).
Thanks,
Kobi

The built-in GelfLayout may be useful.
It's true that the default ThreadContext only supports String:String key-values. The work done in LOG4J2-1648 allows you to use other types in ThreadContext:
Tell Log4j to use a ThreadContext map implementation that implements the ObjectThreadContextMap interface. The simplest way to accomplish this is by setting system property log4j2.garbagefree.threadContextMap to true.
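For example, as a JVM argument (property name taken from the paragraph above; worth verifying against the docs for the Log4j version you run):
-Dlog4j2.garbagefree.threadContextMap=true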
The standard ThreadContext facade only has methods for Strings, so you need to create your own facade. The below should work:
import org.apache.logging.log4j.ThreadContext;
import org.apache.logging.log4j.spi.ObjectThreadContextMap;

public class ObjectThreadContext {

    /** True if the configured context map can store arbitrary Objects. */
    public static boolean isSupported() {
        return ThreadContext.getThreadContextMap() instanceof ObjectThreadContextMap;
    }

    public static Object getValue(String key) {
        return getObjectMap().getValue(key);
    }

    public static void putValue(String key, Object value) {
        getObjectMap().putValue(key, value);
    }

    private static ObjectThreadContextMap getObjectMap() {
        if (!isSupported()) {
            throw new UnsupportedOperationException();
        }
        return (ObjectThreadContextMap) ThreadContext.getThreadContextMap();
    }
}
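For example, once the garbage-free map is enabled, the facade can hold non-String values (a minimal usage sketch; the key name, the value, and the logger variable are illustrative):
// Store an Integer under a custom key so a JSON-emitting layout can
// pick it up from the event's context data as a number.
if (ObjectThreadContext.isSupported()) {
    ObjectThreadContext.putValue("responseTimeMillis", 42);
}
logger.info("request handled");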
It is possible to avoid ThreadContext altogether by injecting key-value pairs from another source into the LogEvent. This is (briefly) mentioned under Custom Context Data Injectors (http://logging.apache.org/log4j/2.x/manual/extending.html#Custom_ContextDataInjector).
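As a rough sketch of that approach (assumptions: the interface shape matches the Log4j 2.7-era ContextDataInjector, the injector is registered through the log4j2.ContextDataInjector system property, the fixed value stands in for a real request-scoped source, and a production implementation should also merge the properties argument):
import java.util.List;
import org.apache.logging.log4j.core.ContextDataInjector;
import org.apache.logging.log4j.core.config.Property;
import org.apache.logging.log4j.util.ReadOnlyStringMap;
import org.apache.logging.log4j.util.SortedArrayStringMap;
import org.apache.logging.log4j.util.StringMap;

public class MetricsDataInjector implements ContextDataInjector {

    // Called on the logging path: the returned map becomes the
    // LogEvent's context data; values may be non-String Objects.
    @Override
    public StringMap injectContextData(List<Property> properties, StringMap reusable) {
        reusable.putValue("responseTimeMillis", 42); // illustrative fixed value
        return reusable;
    }

    // Called for context-data lookups outside the logging path.
    @Override
    public ReadOnlyStringMap rawContextData() {
        StringMap map = new SortedArrayStringMap();
        map.putValue("responseTimeMillis", 42); // illustrative fixed value
        return map;
    }
}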

I found the default log4j2 implementation somewhat problematic for passing custom fields with values. In my opinion, current Java logging frameworks are not well suited to writing structured log events.
If you like hacks, you can check https://github.com/skorhone/gelfj-alt/tree/master/src/main/java/org/graylog2/log4j2 . It's a library written for GELF. One of the provided features is a layout (ExtGelfjLayout) that supports extracting custom fields (see FieldExtractor) from events. But... in order to send such an event, you need to write your own logging facade on top of log4j2.

Related

Utilize Message Template for Message Property Using Serilog

I've adopted Serilog for my logging needs.
I (do my best to) follow the SOLID principles and have thus adopted Steven's adapter, which is an excellent implementation.
For the most part, this is great. I have a class called LogEntryDetail which contains certain properties:
class LogEntryDetail
{
    public string Message { get; set; }
    public string MessageTemplate { get; set; }
    public string Properties { get; set; }
    // etc. etc.
}
I will log the LogEntryDetail like this:
public void Log(LogEntryDetail logEntryDetail)
{
    if (ReferenceEquals(null, logEntryDetail.Layer))
    {
        logEntryDetail.Layer = typeof(T).Name;
    }
    _logger.Write(ToLevel(logEntryDetail.Severity), logEntryDetail.Exception, logEntryDetail.MessageTemplate, logEntryDetail);
}
I am using the MSSqlServer sink (Serilog.Sinks.MSSqlServer). For error logging, all is well.
I have a perf logger, which I plug into my request pipeline. For this logger, I don't want to save every property in the LogEntry object. I only want to save the Message property in the Message column of the table which I have created.
So, normally, when you call Write on the Serilog logger and pass in a complex object, the Message column contains the whole object, serialized as JSON.
I want to know if there is some way that I can specify the MessageTemplate to be something like {Message} or {@Message}, so that the Message column in the database only contains the string stored in the Message property of the LogEntryDetail object. Any other property is redundant and a waste of storage space.
When I specify the MessageTemplate to be {Message}, the Message property contains the full name of the LogEntryDetail type (including namespace).
I feel like I am close and just missing some little thing in my comprehension of Serilog's MessageTemplate feature.
I'll just explain what I did here to try to get the best of both worlds. It seems we have the age-old developer conundrum of sacrificing specific features of a library in order to comply with the SOLID principles. We've seen this before with things like repository abstractions, which make it impossible to leverage the granular features of some of the ORMs they abstract.
So, my SerilogAdapter looks like this:
public class SerilogLogAdapter<T> : ILogger
{
    private readonly Serilog.ILogger _logger;

    public SerilogLogAdapter(Serilog.ILogger logger)
    {
        _logger = logger;
    }

    public void Log(LogEntryDetail logEntryDetail)
    {
        if (ReferenceEquals(null, logEntryDetail.Layer))
        {
            logEntryDetail.Layer = typeof(T).Name;
        }
        if (logEntryDetail.MessageTemplate.Equals(MessageTemplates.LogEntryDetailMessageTemplate, StringComparison.Ordinal))
        {
            _logger.Write(ToLevel(logEntryDetail.Severity), logEntryDetail.Exception, logEntryDetail.MessageTemplate, logEntryDetail);
        }
        else
        {
            _logger.Write(ToLevel(logEntryDetail.Severity), logEntryDetail.MessageTemplate, logEntryDetail.Message, logEntryDetail.AdditionalInfo);
        }
    }

    private static LogEventLevel ToLevel(LoggingEventType severity) =>
        severity == LoggingEventType.Debug ? LogEventLevel.Debug :
        severity == LoggingEventType.Information ? LogEventLevel.Information :
        severity == LoggingEventType.Warning ? LogEventLevel.Warning :
        severity == LoggingEventType.Error ? LogEventLevel.Error :
        LogEventLevel.Fatal;
}
If the MessageTemplate is one which represents the whole object, then that will be logged. Otherwise, a custom MessageTemplate can be used, and the Message property, along with the AdditionalInfo property (a dictionary), can be logged.
We at least squeeze one more thing out of Serilog, and it is one of its strengths: the ability to log using different message templates and to search the log by message template.
By all means let me know if it could be better!

Dataflow output parameterized type to avro file

I have a pipeline that successfully outputs an Avro file as follows:
@DefaultCoder(AvroCoder.class)
class MyOutput_T_S {
    T foo;
    S bar;
    Boolean baz;
    public MyOutput_T_S() {}
}

@DefaultCoder(AvroCoder.class)
class T {
    String id;
    public T() {}
}

@DefaultCoder(AvroCoder.class)
class S {
    String id;
    public S() {}
}
...
PCollection<MyOutput_T_S> output = input.apply(myTransform);
output.apply(AvroIO.Write.to("/out").withSchema(MyOutput_T_S.class));
How can I reproduce this exact behavior, except with a parameterized output MyOutput<T, S> (where T and S are both Avro-codable using reflection)?
The main issue is that Avro reflection doesn't work for parameterized types. So based on these responses:
Setting Custom Coders & Handling Parameterized types
Using Avrocoder for Custom Types with Generics
1) I think I need to write a custom CoderFactory, but I am having difficulty figuring out exactly how this works (I'm having trouble finding examples). Oddly enough, a completely naive coder factory appears to let me run the pipeline and inspect proper output using DataflowAssert:
cr.registerCoder(MyOutput.class, new CoderFactory() {
    @Override
    public Coder<?> create(List<? extends Coder<?>> componentCoders) {
        Schema schema = new Schema.Parser().parse("{\"type\":\"record\","
            + "\"name\":\"MyOutput\","
            + "\"namespace\":\"mypackage\","
            + "\"fields\":[]}");
        return AvroCoder.of(MyOutput.class, schema);
    }

    @Override
    public List<Object> getInstanceComponents(Object value) {
        MyOutput<Object, Object> myOutput = (MyOutput<Object, Object>) value;
        List<Object> components = new ArrayList<>();
        return components;
    }
});
While I can successfully assert against the output now, I expect this will not cut it for writing to a file. I haven't figured out how I'm supposed to use the provided componentCoders to generate the correct schema, and if I try to just shove the schema of T or S into fields I get:
java.lang.IllegalArgumentException: Unable to get field id from class null
2) Assuming I figure out how to encode MyOutput, what do I pass to AvroIO.Write.withSchema? If I pass either MyOutput.class or the schema, I get type mismatch errors.
I think there are two questions (correct me if I am wrong):
How do I enable the coder registry to provide coders for various parameterizations of MyOutput<T, S>?
How do I write values of MyOutput<T, S> to a file using AvroIO.Write?
The first question is to be solved by registering a CoderFactory as in the linked question you found.
Your naive coder is probably allowing you to run the pipeline without issues because serialization is being optimized away. Certainly an Avro schema with no fields will result in those fields being dropped in a serialization+deserialization round trip.
But assuming you fill in the schema with the fields, your approach to CoderFactory#create looks right. I don't know the exact cause of the message java.lang.IllegalArgumentException: Unable to get field id from class null, but the call to AvroCoder.of(MyOutput.class, schema) should work for an appropriately assembled schema. If there is an issue with this, more details (such as the rest of the stack trace) would be helpful.
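For what it's worth, here is one shape the schema assembly inside create could take (a sketch, assuming the component coders for T and S are AvroCoders so their schemas can be extracted, that org.apache.avro.SchemaBuilder is available, and that the field names match the classes in the question):
@Override
public Coder<?> create(List<? extends Coder<?>> componentCoders) {
    // Assumption: the coders for T and S are AvroCoders, so their
    // schemas can be pulled out and spliced into the record schema.
    Schema tSchema = ((AvroCoder<?>) componentCoders.get(0)).getSchema();
    Schema sSchema = ((AvroCoder<?>) componentCoders.get(1)).getSchema();
    Schema schema = SchemaBuilder.record("MyOutput").namespace("mypackage")
        .fields()
        .name("foo").type(tSchema).noDefault()
        .name("bar").type(sSchema).noDefault()
        .name("baz").type().booleanType().noDefault()
        .endRecord();
    return AvroCoder.of(MyOutput.class, schema);
}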
However, your override of CoderFactory#getInstanceComponents should return a list of values, one per type parameter of MyOutput. Like so:
@Override
public List<Object> getInstanceComponents(Object value) {
    MyOutput<Object, Object> myOutput = (MyOutput<Object, Object>) value;
    return ImmutableList.of(myOutput.foo, myOutput.bar);
}
The second question can be answered using some of the same support code as the first, but otherwise is independent. AvroIO.Write.withSchema always explicitly uses the provided schema. It does use AvroCoder under the hood, but this is actually an implementation detail. Providing a compatible schema is all that is necessary - such a schema will have to be composed for each value of T and S for which you want to output MyOutput<T, S>.

PXAttributeExtension in Acumatica

Does anybody know how to use PXAttributeExtension in Acumatica?
Can I use it to modify existing attributes, for example CurrencyInfoAttribute?
PX.Data.PXAttributeExtension has been removed as of version 5.1.
Fortunately, Acumatica provides a variety of ways to both override and modify existing attributes within the system; the most commonly used ones are:
[PXMergeAttributes] - placed on a CacheAttached handler; reuses existing attributes defined in the DAC:
[PXMergeAttributes(Method = MergeMethod.Merge)]
[NPSubaccount(typeof(APTranExtension.usrNPFundID), typeof(APTranExtension.usrNPMasterID), typeof(APTran.accountID), typeof(APTran.branchID), true)]
protected virtual void APTran_SubID_CacheAttached(PXCache sender)
{
}
[PXCustomizeBaseAttribute] - placed on a CacheAttached handler; overrides a single property of an attribute for a particular screen:
[PXMergeAttributes(Method = MergeMethod.Merge)]
[PXCustomizeBaseAttribute(typeof(PXUIFieldAttribute), nameof(PXUIFieldAttribute.DisplayName), "Refined Subaccount")]
protected virtual void APTran_SubID_CacheAttached(PXCache sender)
{
}
A great article on the methods of overriding and customizing attributes can be found here: https://www.codeday.top/2017/10/10/47532.html

IBM Integration Bus: How to read user defined node (Java) complex (table) property in Java extension code

I created a Java user-defined node in IntegrationToolkit (9.0.0.1) and assigned it several properties. Two of the node properties are simple (of String type) and one property is complex (a table property with the predefined type User-defined) that consists of another two simple properties.
By following the documentation I was able to read the two simple properties in my Java extension class (which extends MbNode and implements MbNodeInterface) by making getters and setters that match the names of the two simple properties. The documentation also states that getters and setters should return and set String values, whatever the real type of a simple property may be. Obviously, this would not work for my complex node property.
I was also able to read User Defined Properties that are defined at the message flow level by using CMP (Integration Bus API) classes, which was another thing impossible to do from a user-defined node without CMP. At one point I began to think that my complex property would be among the User Defined Properties (although UDPs are defined at the flow level and my property is defined at the custom node level), based on some other random documentation and some other forum discussion.
I finally deduced that the complex property should map to the MbTable type (as stated in that type's description), but I was not able to use it.
Does anyone know how to access user defined node's complex(table) property value from Java?
I recently started working with WebSphere Message Broker v8.0.0.5 for one of my projects, and I was going to ask the same question until SO suggested your question, which answered mine. It might be a little late, but it may help others with similar questions.
After many frustrating hours consulting IBM documentation, this is what I found following your thread:
You're correct about the properties being available as user-defined properties (UDPs), but only at the node level.
According to the JavaDoc for the MbTable class:
MbTable is a complex data type which contains one or more rows of simple data types. Its structure is very similar to a standard java record set. It can not be constructed in a node but instead is returned by the getUserDefinedAttribute() on the MbNode class. Its primary use is in allowing complex attributes to be defined on nodes instead of the normal static simple types. It can only be used in the runtime if a version of the toolkit that supports complex properties is being used.
You have to call com.ibm.broker.plugin.MbNode.getUserDefinedAttribute, which will return an instance of com.ibm.broker.plugin.MbTable. However, the broker runtime doesn't call any setter methods for complex attributes during node initialization the way it does for simple properties. Also, you cannot access the complex attributes in either the constructor or the setter methods of other simple properties in the node class. They are available only in the run or evaluate method.
The following is the decompiled method definition of com.ibm.broker.plugin.MbNode.getUserDefinedAttribute.
public Object getUserDefinedAttribute(String string) {
    Object object;
    String string2 = "addDynamicTerminals";
    if (Trace.isOn) {
        Trace.logNamedEntry((Object)this, (String)string2);
    }
    if ((object = this.getUDA(string)) != null && object.getClass() == MbElement.class) {
        try {
            MbTable mbTable;
            MbElement mbElement = (MbElement)object;
            object = mbTable = new MbTable(mbElement);
        }
        catch (MbException var4_5) {
            if (Trace.isOn) {
                Trace.logStackTrace((Object)this, (String)string2, (Throwable)var4_5);
            }
            object = null;
        }
    }
    if (Trace.isOn) {
        Trace.logNamedExit((Object)this, (String)string2);
    }
    return object;
}
As you can see, it always returns an instance of MbTable if the attribute is found.
I was able to access the complex attributes with the following code in my node definition:
@Override
public void evaluate(MbMessageAssembly inAssembly, MbInputTerminal inTerminal) throws MbException {
    checkUserDefinedProperties();
}

/**
 * @throws MbException
 */
private void checkUserDefinedProperties() throws MbException {
    Object obj = getUserDefinedAttribute("geoLocations");
    if (obj instanceof MbTable) {
        MbTable table = (MbTable) obj;
        int size = table.size();
        int i = 0;
        table.moveToRow(i);
        for (; i < size; i++, table.next()) {
            String latitude = (String) table.getValue("latitude");
            String longitude = (String) table.getValue("longitude");
        }
    }
}
The documentation for declaring attributes for user-defined extensions in Java is surprisingly silent on this little bit of detail.
Please note that all the references and code are for WebSphere Message Broker v 8.0.0 and should be relevant for IBM Integration Bus 9.0.0.1 too.

How to force log4j2 rolling file appender to roll over?

To the best of my knowledge, the RollingFileAppender in log4j2 will not roll over at the specified time (say, at the end of an hour), but at the first log event that arrives after the time threshold has been exceeded.
Is there a way to trigger an event that, on the one hand, will cause the file to roll over, and on the other, will not append to the log (or will append something trivial, like an empty string)?
No, there isn't any (built-in) way to do this. There are no background threads monitoring rollover time.
You could create a log4j2 plugin that implements org.apache.logging.log4j.core.appender.rolling.TriggeringPolicy (see the built-in TimeBasedTriggeringPolicy and SizeBasedTriggeringPolicy classes for sample code).
If you configure your custom triggering policy, log4j2 will check for every log event whether it should trigger a rollover (so take care when implementing the isTriggeringEvent method to avoid impacting performance). Note that for your custom plugin to be picked up, you need to specify the package of your class in the packages attribute of the Configuration element of your log4j2.xml file.
Finally, if this works well for you and you think your solution may be useful to others too, consider contributing your custom triggering policy back to the log4j2 code base.
Following Remko's idea, I wrote the following code, and it's working.
package com.stony;

import org.apache.logging.log4j.core.LogEvent;
import org.apache.logging.log4j.core.appender.rolling.*;
import org.apache.logging.log4j.core.config.plugins.Plugin;
import org.apache.logging.log4j.core.config.plugins.PluginFactory;

@Plugin(name = "ForceTriggerPolicy", category = "Core")
public class ForceTriggerPolicy implements TriggeringPolicy {
    private static boolean isRolling;

    @Override
    public void initialize(RollingFileManager arg0) {
        setRolling(false);
    }

    @Override
    public boolean isTriggeringEvent(LogEvent arg0) {
        return isRolling();
    }

    public static boolean isRolling() {
        return isRolling;
    }

    public static void setRolling(boolean _isRolling) {
        isRolling = _isRolling;
    }

    @PluginFactory
    public static ForceTriggerPolicy createPolicy() {
        return new ForceTriggerPolicy();
    }
}
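With the policy above configured on the appender, forcing a rollover might then look like this (a usage sketch in the spirit of the question, appending only an empty string; the logger variable is assumed):
// Flip the flag, emit one trivial event so isTriggeringEvent fires,
// then flip it back so later events do not keep rolling over.
ForceTriggerPolicy.setRolling(true);
logger.info("");
ForceTriggerPolicy.setRolling(false);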
If you have access to the RollingFileAppender object, you could do something like:
rollingFileAppender.getManager().rollover();
You can see the manager class here:
https://github.com/apache/logging-log4j2/blob/d368e294d631e79119caa985656d0ec571bd24f5/log4j-core/src/main/java/org/apache/logging/log4j/core/appender/rolling/RollingFileManager.java
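For completeness, one way to get hold of that appender at runtime (a sketch; "RollingFile" stands for whatever name your configuration gives the appender):
import org.apache.logging.log4j.LogManager;
import org.apache.logging.log4j.core.LoggerContext;
import org.apache.logging.log4j.core.appender.RollingFileAppender;

// Look up the appender by its configured name and force a rollover.
LoggerContext context = (LoggerContext) LogManager.getContext(false);
RollingFileAppender appender = context.getConfiguration().getAppender("RollingFile");
appender.getManager().rollover();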
