How to print the stack trace as a string once with JsonLayout in log4j2 - log4j2

I'm using log4j2 2.13.3 in my project, with the following layout:
<JsonLayout compact="true" eventEol="true" stacktraceAsString="true"/>
The log output looks like this:
"message": "404 NOT_FOUND \"Unable to find instance for xxx\"",
"thrown": {
"commonElementCount": 0,
"localizedMessage": "404 NOT_FOUND \"Unable to find instance for xxx\"",
"message": "404 NOT_FOUND \"Unable to find instance for xxx\"",
"name": "org.springframework.cloud.gateway.support.NotFoundException",
"suppressed": [
{
"commonElementCount": 0,
"localizedMessage": "xxx\nStack trace:",
"message": "xxx\nStack trace:",
"name": "reactor.core.publisher.FluxOnAssembly$OnAssemblyException",
"extendedStackTrace": "stack trace"
}
],
"extendedStackTrace": "stack trace"
},
The extendedStackTrace field appears twice. Is there any way to make the stack trace appear only once,
like this (copied from http://logging.apache.org/log4j/log4j-2.2/manual/layouts.html#JSONLayout)?
[
  {
    "logger":"com.foo.Bar",
    "timestamp":"1376681196470",
    "level":"INFO",
    "thread":"main",
    "message":"Message flushed with immediate flush=true"
  },
  {
    "logger":"com.foo.Bar",
    "timestamp":"1376681196471",
    "level":"ERROR",
    "thread":"main",
    "message":"Message flushed with immediate flush=true",
    "throwable":"stack trace"
  }
]

If you want to log as JSON, I would suggest you upgrade and use the new JsonTemplateLayout. It gives you much more control over the generated output.
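For example, a minimal sketch of such a configuration (assuming the `log4j-layout-template-json` artifact is on the classpath; the bundled ECS template emits the whole exception once, as a single stringified `error.stack_trace` field):

```xml
<Appenders>
  <Console name="Console" target="SYSTEM_OUT">
    <!-- EcsLayout.json ships with log4j-layout-template-json and renders
         the exception (including cause/suppressed chains) as one
         stringified error.stack_trace field, instead of the nested
         "thrown" structure JsonLayout produces. -->
    <JsonTemplateLayout eventTemplateUri="classpath:EcsLayout.json"/>
  </Console>
</Appenders>
```

You can also write your own event template JSON if you need different field names; see the JSON Template Layout manual for the available resolvers.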
For my purposes of logging to an ELK stack I prefer to use the GELF Layout with the SocketAppender to send the logs to Logstash. See Logging in the Cloud for more information.

Related

422 Unprocessable Entity error when POSTing a FastAPI docker-container but works when it's not dockerized

With the same body and configuration, the Docker container returns a 422 error.
However, if I run the same FastAPI service on my PC (without Docker), I can successfully get my results.
Postman shows the following error when hitting the container:
{
  "detail": [
    {
      "loc": [
        "body"
      ],
      "msg": "value is not a valid dict",
      "type": "type_error.dict"
    }
  ]
}
For reference, this is the function in conflict:
@router.post("/get_NERs")
def get_NERs(self, artrel: ArticleRelevance):
    return artrel.dict()
Where ArticleRelevance is:
class ArticleRelevance(BaseModel):
    title: str
    comments: List[str]
I am successfully able to hit GET endpoints on the same Docker container, so I know the routing isn't an issue.
Indentations are a funny concept.
{
  "headline": "Richest nations agree to end support for coal production overseas",
  "all_comments": ["Great, up next let’s shut down call centers in India", "Hope this hurts us here in Australia."]
}
AND
{
  "headline": "Richest nations agree to end support for coal production overseas",
  "all_comments": ["Great, up next let’s shut down call centers in India", "Hope this hurts us here in Australia."]
}
are apparently different when FastAPI's router parses them?
Fixed the issue with proper indentation.
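For what it's worth, JSON parsers treat whitespace between tokens as insignificant, so indentation alone cannot change how FastAPI parses the body; if re-indenting "fixed" it, the original paste most likely contained an invisible or non-standard character. A quick self-contained check (the sample strings here are made up for illustration):

```python
import json

compact = '{"headline": "Richest nations agree", "all_comments": ["Great", "Hope"]}'
indented = """
{
    "headline": "Richest nations agree",
    "all_comments": ["Great", "Hope"]
}
"""

# Whitespace between JSON tokens is insignificant (RFC 8259),
# so both strings decode to the exact same Python dict.
assert json.loads(compact) == json.loads(indented)
```

If the assertion holds for your real payloads too, the difference must come from something else, e.g. smart quotes or a stray control character introduced by copy-paste.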

Ocelot swagger redirection issue

I am trying to implement Ocelot/Swagger/MMLib with .NET microservices on my Windows 2019 server.
Everything works fine: I can call each of the microservices correctly through the API gateway using Postman, but I would like to display the Swagger documentation, as the API is going to be used by a third party.
If I use the IP address/port number I get the correct page displayed, with my microservice definitions. However, if I reroute this to a physical URL (e.g. https://siteaddress.com/path/swagger.index.html) I get the main Swagger document but a 'Failed to load API definition' error, followed by 'Fetch error undefined /swagger/docs/v1/test'.
The network tab of my browser's inspector shows an 'HTTP Error 404.0 Not Found'. The requested URL is 'https://siteaddress.com:443/swagger/docs/v1/test'.
My ocelot.json is:
{
  "Routes": [
    {
      "DownstreamPathTemplate": "/api/v1/TestSvc/{everything}",
      "DownstreamScheme": "http",
      "DownstreamHostAndPorts": [
        {
          "Host": "test.api",
          "Port": "80"
        }
      ],
      "UpstreamPathTemplate": "/api/v1/TestSvc/{everything}",
      "UpstreamHttpMethod": [ "POST" ],
      "SwaggerKey": "test"
    }
    ...
  ],
  "SwaggerEndPoints": [
    {
      "Key": "test",
      "Config": [
        {
          "Name": "Test API",
          "Version": "v1",
          "Url": "http://test.api:80/swagger/v1/swagger.json"
        }
      ]
    }
    ...
  ]
}
I have tried changing paths in ocelot.json and Startup.cs. I can see nothing in the MMLib documentation about this scenario, which is surely common when deploying these sites.
Suggestions on where to go next are appreciated.
Page with ipaddress and port number
Page with physical address and error message

Publish message to DLX along with error codes with Spring AMQP

I am able to publish messages to a DLX, by binding the DLX to the queue through Spring AMQP, when they cannot be processed due to insufficient information or any other issue with the received message.
For instance, an invoice received with billable hours missing and/or no employee id present in it.
{
  "invoiceId": "INV1234",
  "hourRate": 18.5,
  "hoursWorked": 0,
  "employeeId": "EMP9900"
}
With a request body this small, it's easy to understand what the issue is. But we have a considerably larger request body and 15-20 validation points.
The producer of the message needs to know what the issue was when the message is pushed back to it through the DLX.
I have the following two thoughts to address this requirement.
Option #1: Append the errors information to the original message.
{
  "message": {
    "invoiceId": "INV1234",
    "hourRate": 18.5,
    "hoursWorked": 0,
    "employeeId": "EMP9900"
  },
  "errors": [
    {
      "errorCode": "001",
      "errorMessage": "Invalid no. of hours."
    },
    {
      "errorCode": "002",
      "errorMessage": "Employee not found in the system."
    }
  ]
}
Option #2: Add an additional errors object in the message headers.
Out of these two options:
which is the better way of handling this requirement? And
is there any built-in solution available in either spring-amqp or any other library?
See the documentation. The framework implements your #2 solution.
Starting with version 1.3, a new RepublishMessageRecoverer is provided, to allow publishing of failed messages after retries are exhausted.
...
The RepublishMessageRecoverer publishes the message with additional information in message headers, such as the exception message, stack trace, original exchange, and routing key. Additional headers can be added by creating a subclass and overriding additionalHeaders().

fluent plugin kafka json escaping

I am using fluent-plugin-kafka version 0.12.3.
I have an application that outputs its logs in JSON format, but my console consumer shows that the logs are escaped.
For instance, the application outputs the following log line:
{
  "msg": "ok"
}
When I look at the log using the console consumer, it shows:
{
  "container_id": "7e...",
  "container_name": "/app",
  "source": "stdout",
  "log": "{\"msg\": \"ok\"}"
}
How should fluent-plugin-kafka be configured so that the application log is nested as JSON without being escaped?
Desired output:
{
  "container_id": "7e...",
  "container_name": "/app",
  "source": "stdout",
  "log": {
    "msg": "ok"
  }
}
Using the record_transformer filter, I was able to mutate the incoming data.
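An alternative that produces exactly the desired nested output is fluentd's built-in parser filter. This is only a sketch: the tag pattern `docker.**` is an assumption and must match the tags your pipeline actually uses.

```text
<filter docker.**>
  @type parser
  key_name log          # re-parse the escaped JSON string stored in "log"
  reserve_data true     # keep container_id, container_name, source, ...
  hash_value_field log  # nest the parsed object back under the "log" key
  <parse>
    @type json
  </parse>
</filter>
```

Placed before the Kafka output, this turns `"log": "{\"msg\": \"ok\"}"` into `"log": {"msg": "ok"}` while preserving the other top-level fields.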

Subscription rest-hook in FHIR HTTP 404 Not Found

I created a simple REST endpoint in Java using Jersey, as you can see in the following:
#Path("/study")
public class CreateRestEndpoint {
private static String endpoint;
#PUT
#Consumes("application/fhir+json")
#Produces(MediaType.APPLICATION_JSON)
public Response getPut(IBaseResource list){
System.out.println("UpdatedResource.: "+list);
return Response.status(200).build();
}
#PUT
public Response getTest(String str) {
System.out.printf(str);
return Response.status(200).build();
}
When I use Postman and send a PUT request to the jersey-servlet, everything is OK and the jersey-servlet gets the message immediately.
But I created the jersey-servlet to receive a message sent by the FHIR server (my FHIR server is running in Docker) via a Subscription resource. Actually, I'm trying to use the subscription mechanism to be notified when a List resource is updated:
{
  "resourceType": "Subscription",
  "id": "9",
  "meta": {
    "versionId": "2",
    "lastUpdated": "2019-11-08T09:05:33.366+00:00",
    "tag": [
      {
        "system": "http://hapifhir.io/fhir/StructureDefinition/subscription-matching-strategy",
        "code": "IN_MEMORY",
        "display": "In-memory"
      }
    ]
  },
  "status": "active",
  "reason": "Monitor Screening List",
  "criteria": "List?code=http://miracum.org/fhir/CodeSystem/screening-list|screening-recommendations",
  "channel": {
    "type": "rest-hook",
    "endpoint": "http://localhost:8080/notification/study",
    "payload": "application/fhir+json"
  }
}
When I change List resources in FHIR, I expected a message to arrive at the jersey-servlet, but unfortunately I get the following error (when I set the endpoint to a test rest-hook like the webhook.site samples, I do get the message from the FHIR side):
fhir_1 | 2019-11-08 18:48:40.688 [subscription-delivery-rest-hook-9-13] INFO c.u.f.j.s.m.i.S.SUBS6 [SubscriptionDebugLogInterceptor.java:162] Delivery of resource List/4/_history/17 for subscription Subscription/9 to channel of type RESTHOOK - Failure: ca.uhn.fhir.rest.server.exceptions.ResourceNotFoundException: HTTP 404 Not Found
fhir_1 | Exception in thread "subscription-delivery-rest-hook-9-13" org.springframework.messaging.MessagingException: Failure handling subscription payload for subscription: Subscription/9; nested exception is ca.uhn.fhir.rest.server.exceptions.ResourceNotFoundException: HTTP 404 Not Found, failedMessage=ca.uhn.fhir.jpa.subscription.module.subscriber.ResourceDeliveryJsonMessage#330c0fdb[myPayload=ca.uhn.fhir.jpa.subscription.module.subscriber.ResourceDeliveryMessage#38a1c8a2[mySubscription=ca.uhn.fhir.jpa.subscription.module.CanonicalSubscription#1d55d025[myIdElement=Subscription/9,myStatus=ACTIVE,myCriteriaString=List?..........
..................................................
What is the problem? I tried a lot of different parameters but found no solution.
I changed @Path to @Path("/study/List/{var}") but got the same failure again. Actually, my FHIR server is running in Docker, and the problem was probably inside Docker.
After setting the proxy in Docker, everything worked fine.
Conclusion: I had to change the path to @Path("/study/List/{var}") and set the proxy in Docker.
