Utilize Message Template for Message Property Using Serilog

I've adopted Serilog for my logging needs.
I (do my best to) follow the SOLID principles and have thus adopted Steven's adapter, which is an excellent implementation.
For the most part, this is great. I have a class called LogEntryDetail which contains certain properties:
class LogEntryDetail
{
    public string Message { get; set; }
    public string MessageTemplate { get; set; }
    public string Properties { get; set; }
    // etc. etc.
}
I will log the LogEntryDetail like this:
public void Log(LogEntryDetail logEntryDetail)
{
    if (ReferenceEquals(null, logEntryDetail.Layer))
    {
        logEntryDetail.Layer = typeof(T).Name;
    }

    _logger.Write(ToLevel(logEntryDetail.Severity), logEntryDetail.Exception, logEntryDetail.MessageTemplate, logEntryDetail);
}
I am using the MSSqlServer sink (Serilog.Sinks.MSSqlServer). For error logging, all is well.
I have a perf logger, which I plug into my request pipeline. For this logger, I don't want to save every property of the LogEntryDetail object. I only want to save the Message property in the Message column of the table which I have created.
So, normally, when you call Write on the Serilog logger and pass in a complex object, the Message column contains the whole object, serialized as JSON.
I want to know if there is some way that I can specify the MessageTemplate to be something like {Message} or {#Message}, so that the Message column in the database only contains the string stored in the Message property of the LogEntryDetail object. Any other property is redundant and a waste of storage space.
When I specify the MessageTemplate to be {Message}, the Message column ends up containing the full name of the LogEntryDetail type (including namespace).
I feel like I am close and just missing some little thing in my comprehension of Serilog's MessageTemplate feature.
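(For reference, the cause of the symptom above: with the template {Message}, Serilog binds the logEntryDetail argument to the Message placeholder, and because LogEntryDetail is neither a built-in scalar nor marked for destructuring with the @ operator, it is rendered via ToString(), which yields the type's full name. A minimal sketch of the direct fix, binding the string property instead of the whole object:)

// Minimal sketch: pass the string property as the template argument,
// so the rendered Message column contains only the message text.
_logger.Write(ToLevel(logEntryDetail.Severity), logEntryDetail.Exception,
    "{Message}", logEntryDetail.Message);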

I'll just explain what I did here to try and get the best of both worlds. It seems here we have the age-old developer conundrum of sacrificing specific features of a library in order to comply with the SOLID principles. We've seen this before with things like repository abstractions which make it impossible to leverage the granular features of some of the ORMs which they abstract.
So, my SerilogAdapter looks like this:
public class SerilogLogAdapter<T> : ILogger
{
    private readonly Serilog.ILogger _logger;

    public SerilogLogAdapter(Serilog.ILogger logger)
    {
        _logger = logger;
    }

    public void Log(LogEntryDetail logEntryDetail)
    {
        if (ReferenceEquals(null, logEntryDetail.Layer))
        {
            logEntryDetail.Layer = typeof(T).Name;
        }

        if (logEntryDetail.MessageTemplate.Equals(MessageTemplates.LogEntryDetailMessageTemplate, StringComparison.Ordinal))
        {
            _logger.Write(ToLevel(logEntryDetail.Severity), logEntryDetail.Exception, logEntryDetail.MessageTemplate, logEntryDetail);
        }
        else
        {
            _logger.Write(ToLevel(logEntryDetail.Severity), logEntryDetail.MessageTemplate, logEntryDetail.Message, logEntryDetail.AdditionalInfo);
        }
    }

    private static LogEventLevel ToLevel(LoggingEventType severity) =>
        severity == LoggingEventType.Debug ? LogEventLevel.Debug :
        severity == LoggingEventType.Information ? LogEventLevel.Information :
        severity == LoggingEventType.Warning ? LogEventLevel.Warning :
        severity == LoggingEventType.Error ? LogEventLevel.Error :
        LogEventLevel.Fatal;
}
If the MessageTemplate is the one which represents the whole object, then the whole object will be logged. Otherwise, a custom MessageTemplate can be used, and the Message property, along with the AdditionalInfo property (a dictionary), can be logged.
We at least squeeze one more thing out of Serilog, and it is one of its strengths: the ability to log using different message templates and to search the log by message template.
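For illustration, a hypothetical perf-logging call through the adapter might look like the following (the template, message text, and dictionary contents are assumptions, not values from the original code; the @ operator tells Serilog to capture the dictionary structurally):

// Hypothetical usage: a custom template logs only the message text
// plus the AdditionalInfo dictionary, not the whole LogEntryDetail.
logger.Log(new LogEntryDetail
{
    Severity = LoggingEventType.Information,
    MessageTemplate = "{Message} {@AdditionalInfo}",
    Message = "GET /api/orders completed",
    AdditionalInfo = new Dictionary<string, object> { ["ElapsedMs"] = 42 }
});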
By all means let me know if it could be better!

Related

log4j2 and custom key value using JSONLayout

I would like to add a String key and an Integer value to my log using Log4j2.
Is there a way to do it? When I added properties to the ThreadContext, I was only able to add String:String keys and values, but this does not help; I have numbers that I need to present in Kibana (some graphs).
The built-in GelfLayout may be useful.
It's true that the default ThreadContext only supports String:String key-values. The work done in LOG4J2-1648 allows you to use other types in ThreadContext:
Tell Log4j to use a ThreadContext map implementation that implements the ObjectThreadContextMap interface. The simplest way to accomplish this is by setting system property log4j2.garbagefree.threadContextMap to true.
The standard ThreadContext facade only has methods for Strings, so you need to create your own facade. The below should work:
public class ObjectThreadContext {
    public static boolean isSupported() {
        return ThreadContext.getThreadContextMap() instanceof ObjectThreadContextMap;
    }

    public static Object getValue(String key) {
        return getObjectMap().getValue(key);
    }

    public static void putValue(String key, Object value) {
        getObjectMap().putValue(key, value);
    }

    private static ObjectThreadContextMap getObjectMap() {
        if (!isSupported()) {
            throw new UnsupportedOperationException();
        }
        return (ObjectThreadContextMap) ThreadContext.getThreadContextMap();
    }
}
It is possible to avoid ThreadContext altogether by injecting key-value pairs from another source into the LogEvent. This is (briefly) mentioned under Custom Context Data Injectors (http://logging.apache.org/log4j/2.x/manual/extending.html#Custom_ContextDataInjector).
I found the default log4j2 implementation somewhat problematic for passing custom fields with values. In my opinion, current Java logging frameworks are not well suited to writing structured log events.
If you like hacks, you can check https://github.com/skorhone/gelfj-alt/tree/master/src/main/java/org/graylog2/log4j2 . It's a library written for GELF. One of the provided features is a layout (ExtGelfjLayout) that supports extracting custom fields (see FieldExtractor) from events. But in order to send such an event, you need to write your own logging facade on top of log4j2.

Security Context in terms of QueryDslPredicateExecutor and Spring Data Rest

I'm building a REST API on top of Spring Data Rest. Initially all repositories were extending JpaRepository. Lately the decision has been made to take a more flexible approach and use QueryDslPredicateExecutor<T> along with QuerydslBinderCustomizer<Q>.
Pretty much all findAll methods exposed in repositories should address two scenarios:
- the principal has the role ROLE_ADMIN: no filtering should be applied apart from Pageable and Sort;
- the principal does not have the role ROLE_ADMIN: return only those entities which belong to the current user.
Getting that done was as simple as annotating the findAll method as below.
#Query("select e from Entity e where e.field = ?#{principal} or 1=?#{hasRole('ROLE_ADMIN') ? 1 : 0}")
Page<Entity> findAll(Pageable pageable);
Now I want our findAll to be something similar to the below:
Page<Entity> findAll(Predicate predicate, Pageable pageable)
The Predicate is built from request parameters (courtesy of @QuerydslPredicate) and is passed in to RepositoryEntityController, which is all managed by spring-data-rest, which is great.
@ResponseBody
@RequestMapping(value = BASE_MAPPING, method = RequestMethod.GET)
public Resources<?> getCollectionResource(@QuerydslPredicate RootResourceInformation resourceInformation,
        DefaultedPageable pageable, Sort sort, PersistentEntityResourceAssembler assembler)
        throws ResourceNotFoundException, HttpRequestMethodNotSupportedException {
I want to tweak that predicate (to address the 2 scenarios above). It would be something similar to the below.
BooleanBuilder builder = new BooleanBuilder(predicateBuildFromHttpRequest);
builder.and(predicateAddressingOurRequirements);
builder.getValue();
@PostFilter won't be an option as the return type for all repos is Page<Entity>.
The use case that I want to address seems quite common to me. Having said that, I had a look at the spring-data and spring-data-rest documentation and could not find anything related to my question.
The question is: am I missing something obvious here, and is there a quick win for it? Or would I need to implement a custom solution myself? Any comments very much appreciated!
The Querydsl predicates are constructed by QuerydslAwareRootResourceInformationHandlerMethodArgumentResolver, which is sadly package-private and can't be directly extended.
However, you can make a copy of it, add your security predicate logic, and then drop in your implementation in place of the former resolver.
public class MyQueryDslRootResourceArgumentResolver extends RootResourceInformationHandlerMethodArgumentResolver {

    // Most of the code is omitted; the content is identical to
    // QuerydslAwareRootResourceInformationHandlerMethodArgumentResolver.
    // The important part is the postProcess method, where you can modify the predicate.
    @Override
    @SuppressWarnings({"unchecked"})
    protected RepositoryInvoker postProcess(MethodParameter parameter, RepositoryInvoker invoker,
            Class<?> domainType, Map<String, String[]> parameters) {
        Object repository = repositories.getRepositoryFor(domainType);
        if (!QueryDslPredicateExecutor.class.isInstance(repository)
                || !parameter.hasParameterAnnotation(QuerydslPredicate.class)) {
            return invoker;
        }
        ClassTypeInformation<?> type = ClassTypeInformation.from(domainType);
        QuerydslBindings bindings = factory.createBindingsFor(null, type);
        // Modify your predicate here.
        Predicate predicate = predicateBuilder.getPredicate(type, toMultiValueMap(parameters), bindings);
        return new QuerydslRepositoryInvokerAdapter(invoker, (QueryDslPredicateExecutor<Object>) repository, predicate);
    }
}
Then add your own configuration class with the custom resolver implementation.
public class CustomRepositoryRestMvcConfiguration extends RepositoryRestMvcConfiguration {

    @Autowired
    ApplicationContext applicationContext;

    @Override
    public RootResourceInformationHandlerMethodArgumentResolver repoRequestArgumentResolver() {
        QuerydslBindingsFactory factory = applicationContext.getBean(QuerydslBindingsFactory.class);
        QuerydslPredicateBuilder predicateBuilder = new QuerydslPredicateBuilder(defaultConversionService(),
                factory.getEntityPathResolver());
        return new MyQueryDslRootResourceArgumentResolver(repositories(),
                repositoryInvokerFactory(defaultConversionService()), resourceMetadataHandlerMethodArgumentResolver(),
                predicateBuilder, factory);
    }
}
Here is an example project that modifies the Predicate (produced from the URL parameters) before passing it to the repository; it demonstrates what David Siro explained above:
https://github.com/yeldarxman/QueryDslPredicateModifier

AutoFixture constrained string parameter

Is there a simple way to specify a list of possible values for the parameter orderBy? Not one by one, please, otherwise I would not be asking the question. I want to specify that orderBy makes sense only if it is chosen from a predetermined list. Suppose the list is very large... still not random. This cannot be that hard... yet there is no single example of such a simple task.
[Test, AutoData]
public override void IndexReturnsView(int? pageIndex, int? pageSize, string orderBy, bool? desc)
{
    .....
}
EDIT:
All I want is to read the possible values from a list, as I would do with the ValueSource attribute. However, it seems not to work with AutoFixture. If I specify e.g. [ValueSource("GetOrderByColumnNames")], my test does not work anymore. I have no idea what I am doing wrong. Unfortunately AutoFixture lacks useful documentation and the examples are very basic. Is there a working example of this scenario that I can use to guide myself here?
This has to be a very common situation; however, I have been looking for days with no luck :(
Appreciated!
If I understand the question correctly, the problem is that the orderBy value should be randomly selected from a list of predefined values, but that list might be too large to use with [InlineAutoData].
The easiest way to do this that I can think of is to introduce a helper type. This might actually be a valuable addition to the application code itself, as it makes the role of various values more explicit, but if not, you can always add the wrapper type to the test code base.
Something like this is the minimum you'll need:
public class OrderCriterion
{
    public OrderCriterion(string value)
    {
        Value = value;
    }

    public string Value { get; }
}
If we also imagine that this class exposes a list of ValidValues, you can implement an AutoFixture Customization using the ElementsBuilder class:
public class OrderCriterionCustomization : ICustomization
{
    public void Customize(IFixture fixture)
    {
        fixture.Customizations.Add(
            new ElementsBuilder<OrderCriterion>(OrderCriterion.ValidValues));
    }
}
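The ValidValues member referenced above is assumed to be a simple static list on OrderCriterion; the column names below are placeholders, not values from the original question:

// Hypothetical: the predetermined list of permitted orderBy values.
public static readonly OrderCriterion[] ValidValues =
{
    new OrderCriterion("Name"),
    new OrderCriterion("CreatedDate"),
    new OrderCriterion("Price")
};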
Then you create a data source attribute for your test code base:
public class TestConventionsAttribute : AutoDataAttribute
{
    public TestConventionsAttribute() : base(
        () => new Fixture().Customize(new OrderCriterionCustomization()))
    {
    }
}
This enables you to write a test like this, which passes:
[Theory, TestConventions]
public void IndexReturnsView(
    int? pageIndex,
    int? pageSize,
    OrderCriterion orderBy,
    bool? desc)
{
    Assert.Contains(orderBy.Value, OrderCriterion.ValidValues.Select(x => x.Value));
}
Notice that instead of declaring the orderBy parameter as a string, you declare it as an OrderCriterion, which means that AutoFixture will detect its presence, and the Customization then kicks in.
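If the system under test still takes a plain string for orderBy (as in the original test signature), the wrapper is simply unwrapped at the call site. A sketch, where CatalogController and its Index method are hypothetical stand-ins for the real SUT:

[Theory, TestConventions]
public void IndexReturnsView(int? pageIndex, int? pageSize, OrderCriterion orderBy, bool? desc)
{
    var controller = new CatalogController(); // hypothetical controller under test
    var result = controller.Index(pageIndex, pageSize, orderBy.Value, desc);
    Assert.NotNull(result);
}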
See also https://stackoverflow.com/a/48903199/126014

Dataflow output parameterized type to avro file

I have a pipeline that successfully outputs an Avro file as follows:
@DefaultCoder(AvroCoder.class)
class MyOutput_T_S {
    T foo;
    S bar;
    Boolean baz;
    public MyOutput_T_S() {}
}

@DefaultCoder(AvroCoder.class)
class T {
    String id;
    public T() {}
}

@DefaultCoder(AvroCoder.class)
class S {
    String id;
    public S() {}
}
...
PCollection<MyOutput_T_S> output = input.apply(myTransform);
output.apply(AvroIO.Write.to("/out").withSchema(MyOutput_T_S.class));
How can I reproduce this exact behavior, except with a parameterized output MyOutput<T, S> (where T and S are both Avro-codable using reflection)?
The main issue is that Avro reflection doesn't work for parameterized types. So based on these responses:
Setting Custom Coders & Handling Parameterized types
Using Avrocoder for Custom Types with Generics
1) I think I need to write a custom CoderFactory, but I am having difficulty figuring out exactly how this works (I'm having trouble finding examples). Oddly enough, a completely naive coder factory appears to let me run the pipeline and inspect proper output using DataflowAssert:
cr.registerCoder(MyOutput.class, new CoderFactory() {
    @Override
    public Coder<?> create(List<? extends Coder<?>> componentCoders) {
        Schema schema = new Schema.Parser().parse("{\"type\":\"record\","
            + "\"name\":\"MyOutput\","
            + "\"namespace\":\"mypackage\","
            + "\"fields\":[]}");
        return AvroCoder.of(MyOutput.class, schema);
    }

    @Override
    public List<Object> getInstanceComponents(Object value) {
        MyOutput<Object, Object> myOutput = (MyOutput<Object, Object>) value;
        List<Object> components = new ArrayList<>();
        return components;
    }
});
While I can successfully assert against the output now, I expect this will not cut it for writing to a file. I haven't figured out how I'm supposed to use the provided componentCoders to generate the correct schema, and if I try to just shove the schema of T or S into fields I get:
java.lang.IllegalArgumentException: Unable to get field id from class null
2) Assuming I figure out how to encode MyOutput: what do I pass to AvroIO.Write.withSchema? If I pass either MyOutput.class or the schema, I get type mismatch errors.
I think there are two questions here (correct me if I am wrong):
How do I enable the coder registry to provide coders for various parameterizations of MyOutput<T, S>?
How do I write values of MyOutput<T, S> to a file using AvroIO.Write?
The first question is to be solved by registering a CoderFactory as in the linked question you found.
Your naive coder is probably allowing you to run the pipeline without issues because serialization is being optimized away. Certainly an Avro schema with no fields will result in those fields being dropped in a serialization+deserialization round trip.
But assuming you fill in the schema with the fields, your approach to CoderFactory#create looks right. I don't know the exact cause of the message java.lang.IllegalArgumentException: Unable to get field id from class null, but the call to AvroCoder.of(MyOutput.class, schema) should work, for an appropriately assembled schema. If there is an issue with this, more details (such as the rest of the stack trace) would be helpful.
However, your override of CoderFactory#getInstanceComponents should return a list of values, one per type parameter of MyOutput. Like so:
@Override
public List<Object> getInstanceComponents(Object value) {
    MyOutput<Object, Object> myOutput = (MyOutput<Object, Object>) value;
    return ImmutableList.of(myOutput.foo, myOutput.bar);
}
The second question can be answered using some of the same support code as the first, but otherwise is independent. AvroIO.Write.withSchema always explicitly uses the provided schema. It does use AvroCoder under the hood, but this is actually an implementation detail. Providing a compatible schema is all that is necessary - such a schema will have to be composed for each value of T and S for which you want to output MyOutput<T, S>.

Map string to enum with Automapper

My problem is hydrating a ViewModel from a Linq2Sql object that has been returned from the database. We have done this in a few areas and have a nice layered pattern worked up for it, but the latest item calls for some enums to be used, and this has caused headaches all round. Currently we pull back from the database and then use AutoMapper to hydrate (or flatten) into our ViewModels, but having the enums in the model seems to be causing issues with AutoMapper. I've tried to create custom resolvers, which have sufficed for all my other mapping requirements, but it doesn't work in this instance.
A sample of the code looks like:
public class CustomerBillingTabView
{
    public string PaymentMethod { get; set; }
    ...other details
}

public class BillingViewModel
{
    public PaymentMethodType PaymentMethod { get; set; }
    ...other details
}

public enum PaymentMethodType
{
    Invoice, DirectDebit, CreditCard, Other
}
public class PaymentMethodTypeResolver : ValueResolver<CustomerBillingTabView, PaymentMethodType>
{
    protected override PaymentMethodType ResolveCore(CustomerBillingTabView source)
    {
        if (!string.IsNullOrWhiteSpace(source.PaymentMethod))
        {
            source.PaymentMethod = source.PaymentMethod.Replace(" ", "");
            return (PaymentMethodType)Enum.Parse(typeof(PaymentMethodType), source.PaymentMethod, true);
        }
        return PaymentMethodType.Other;
    }
}
CreateMap<CustomerBillingTabView, BillingViewModel>()
    .ForMember(c => c.PaymentMethod, opt => opt.ResolveUsing<PaymentMethodTypeResolver>());
I get the following error
[ArgumentException: Type provided must be an Enum.
Parameter name: enumType]
System.Enum.TryParseEnum(Type enumType, String value, Boolean ignoreCase, EnumResult& parseResult) +9626766
System.Enum.Parse(Type enumType, String value, Boolean ignoreCase) +80
AutoMapper.Mappers.EnumMapper.Map(ResolutionContext context, IMappingEngineRunner mapper) +231
AutoMapper.MappingEngine.AutoMapper.IMappingEngineRunner.Map(ResolutionContext context) +720
I'd like to stick with AutoMapper for all of our mapping actions, but I've seen a lot of people say that it doesn't do this type of mapping, so I'm starting to wonder if I'm using it in the wrong way. Also, I've seen a few mentions of ValueInjecter: is this an alternative to AutoMapper, or would it be useful just to plug the holes in AutoMapper for the hydration of models and keep using AutoMapper for flattening?
Yes, I could just use a string in my ViewModel, but I'm not a fan of magic strings, and this particular item is used by helpers to perform some logic in a number of places.
This is an issue with the AutoMapper documentation. If you download the AutoMapper source, there are examples in there. The code you want will look like this:
public class PaymentMethodTypeResolver : IValueResolver
{
    public ResolutionResult Resolve(ResolutionResult source)
    {
        // Assumes the map is configured so that SourceValue is the
        // payment method string (e.g. via .FromMember(s => s.PaymentMethod)).
        string paymentMethod = source.Context.SourceValue as string;
        if (!string.IsNullOrWhiteSpace(paymentMethod))
        {
            paymentMethod = paymentMethod.Replace(" ", "");
            return source.New((PaymentMethodType)Enum.Parse(typeof(PaymentMethodType), paymentMethod, true));
        }
        return source.New(PaymentMethodType.Other);
    }
}
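As a side note (not part of the original answer): if unrecognized payment method strings can reach the resolver at runtime, Enum.TryParse avoids the exception path entirely. A small sketch with a hypothetical helper name:

private static PaymentMethodType ParsePaymentMethod(string raw)
{
    // Hypothetical helper: enum names not defined on PaymentMethodType,
    // null, or empty strings all fall back to Other.
    // (Numeric strings parse to their underlying value; guard with
    // Enum.IsDefined if that matters.)
    string cleaned = (raw ?? string.Empty).Replace(" ", "");
    PaymentMethodType result;
    return Enum.TryParse(cleaned, true, out result) ? result : PaymentMethodType.Other;
}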
Here's a solution with the ValueInjecter.
Since you already solved the problem, I'm just going to point you to something similar:
AutoMapper strings to enum descriptions
In that question the requirements were a bit more than just going from string to enum, but it includes this conversion as well.
About the ValueInjecter being an alternative: yes, it does things more generically, with no configuration required for every little thing, and you can build whatever convention you can imagine.
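To make the comparison concrete, a minimal sketch of basic ValueInjecter usage (billingTabView is a hypothetical source instance; the string-to-enum step still needs a custom convention, which the linked answer covers, so here it is done manually with the helper sketched earlier):

// Sketch: InjectFrom copies same-named, same-typed properties by convention;
// the enum member is then set explicitly.
var viewModel = new BillingViewModel();
viewModel.InjectFrom(billingTabView);
viewModel.PaymentMethod = ParsePaymentMethod(billingTabView.PaymentMethod);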
