Adding loggers to Grails classes

I use the following approach to access a logger instance from classes in a Grails app:
In Grails artefacts (controllers, services, domain classes, etc.) I simply use the logger that is added by Grails, e.g.
class MyController {
    def someAction() {
        log.debug "something"
    }
}
For classes under src/groovy I annotate them with @groovy.util.logging.Slf4j, e.g.
@Slf4j
class Foo {
    Foo() {
        log.debug "log it"
    }
}
The logger seems to behave properly in both cases, but it slightly bothers me that the class of the loggers differs. When I use the annotation, the class of the logger is org.slf4j.impl.GrailsLog4jLoggerAdapter, but when I use the logger that's automatically added to Grails artefacts the class is org.apache.commons.logging.impl.SLF4JLog.
Is there a recommended (or better) approach to adding loggers to Grails classes?

I don't see any problem with what you described. SLF4J isn't a logging framework, it's a logging framework wrapper. But aside from some Grails-specific hooks in the Grails class, they both implement the same interface and delegate eventually to the same loggers/appenders/etc. in the real implementation library, typically Log4j.
What I'm pretty sure is different, though, is the log category/name, and that matters because you configure the underlying library based on what the logger names become. With the annotations, the logger name is the full class name including package. With the one Grails adds, there's an extra prefix based on the artifact type. I always forget the naming convention, but a quick way to know the logger name is to log it; add this in your class where it will be executed at runtime:
println log.name
and it will print the full logger name (using println instead of a log method avoids potential misconfiguration issues that could keep the message from being logged).
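To make that concrete, here is a hedged sketch of how the two differing logger names might be configured in Config.groovy in Grails 2.x; the package and exact artefact prefix are assumptions, so verify the real names with the println trick above:

log4j = {
    // names are examples -- confirm them with `println log.name`
    debug 'com.example.Foo'                                   // @Slf4j class under src/groovy
    debug 'grails.app.controllers.com.example.MyController'   // logger Grails injects into a controller
}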
I like to keep things simple and consistent and know what's being used, so I skip the wrapper libraries and use Log4j directly. Accessing the logger is easy. Import the class
import org.apache.log4j.Logger
and then add this as a class field:
Logger log = Logger.getLogger(getClass().name)
This can be copy/pasted to other classes since there are no hard-coded names. It won't work in static scope, so for that I'd add
static Logger LOG = Logger.getLogger(this.name)
which also avoids hard-coding by using Groovy's support for "this" in static scope to refer to the class.
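Putting the two fragments together, a copy/paste-friendly sketch (the class name is arbitrary):

import org.apache.log4j.Logger

class CatalogHelper {
    // instance logger; getClass().name avoids hard-coding the class name
    Logger log = Logger.getLogger(getClass().name)

    // static logger; in Groovy, "this" in static scope refers to the class itself
    static final Logger LOG = Logger.getLogger(this.name)

    def reload() {
        log.debug "reloading"
    }

    static void warmUp() {
        LOG.info "warming up"
    }
}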

Have you tried the @Log4j annotation (for Log4j) instead?
How can I use log inside a src/groovy/ class?
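To answer the comment: the same family of annotations works for classes under src/groovy. A minimal sketch, assuming Log4j is the underlying implementation and Helper is a made-up class name:

import groovy.util.logging.Log4j

@Log4j
class Helper {
    void doWork() {
        // `log` is generated by the @Log4j AST transformation
        log.debug "doing some work"
    }
}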

Related

How to enable Serilog minimum level overrides without particular convention of calling ForContext?

This article on Serilog minimum level overrides states:
The first argument of Override is a source context prefix, which is normally matched against the namespace-qualified type name of the class associated with the logger.
For this so-called "normal" behavior, wouldn't I need to manually set the .ForContext<>() differently for each class my logger is called from? In other words, how are namespace-specific minimum log levels supposed to work without a specific convention of how .ForContext is set?
If that makes sense, then how can I set ForContext automatically without calling it with a different argument everywhere?
For this so-called "normal" behavior, wouldn't I need to manually set the .ForContext<>() differently for each class my logger is called from?
Yes, you would. A common way of doing it is by calling Log.ForContext<T>() in each class and storing the result in a member variable shared across the different methods of the class (so that all logs get written with the same context), e.g.
public class SomeService
{
    private readonly ILogger _log = Log.ForContext<SomeService>();
    // ...
}

public class SomeRepository
{
    private readonly ILogger _log = Log.ForContext<SomeRepository>();
    // ...
}
If you are using an IoC container such as Autofac, you can have the .ForContext<>() call happen automatically when classes are resolved by the IoC container (by using constructor injection, for example).
If you are using Autofac specifically, you could use AutofacSerilogIntegration that takes care of that. There might be similar implementations for other IoC containers (or you'd have to implement your own).
If you are using Microsoft's Generic Host, then you'll need to configure it to use a custom ServiceProviderFactory which will be responsible for creating the instances and making the call to .ForContext<>()... An easy route is to integrate Autofac with Microsoft's Generic Host and then leverage the AutofacSerilogIntegration I mentioned above.

log is not accessible in normal groovy file of grails3

I migrated from Grails 2 to Grails 3.
In Grails 2 I used lots of
log.info, log.debug
statements inside src/main/groovy files,
but in Grails 3 log is not injected by default.
It gives an error like No such property: log for class.
This is a planned change. You can use the @groovy.util.logging.Commons annotation on your non-Grails classes to have log available. Others like @Log4j and @Slf4j are also available, depending on your logging library.
There is one more important difference: those annotations add log as a private property, so classes that inherit from an annotated class also need to be annotated to use logging. The alternative is to manually define a protected logger on your class, as sketched below.
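A minimal sketch of that alternative, assuming the default Grails 3 SLF4J/Logback setup (Base and Sub are hypothetical classes under src/main/groovy); the protected field is inherited, so subclasses don't need their own annotation:

import org.slf4j.Logger
import org.slf4j.LoggerFactory

class Base {
    // protected field, inherited by subclasses without further annotations
    protected final Logger log = LoggerFactory.getLogger(getClass())

    void init() {
        log.debug("initialising {}", getClass().simpleName)
    }
}

class Sub extends Base {
    void run() {
        log.info "logging from the subclass via the inherited logger"
    }
}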

Custom Log Level in Grails

I want to be informed when uncaught exceptions occur in my Grails 2.2.4 application. Log4j has an SMTPAppender doing something similar, but only based on a specific log level. In my application there are already a lot of log entries in all available log levels, so sending email on ERROR or FATAL is not really an option because it would also contain non-exception entries.
Filtering uncaught exceptions in Grails is quite easy, I just redirect them to a specific controller and handle it there:
static mappings = {
    [...]
    "500"(controller: "errors", action: "serverError")
}
My plan was to introduce my own log level and use it only for uncaught exceptions. Documentation suggests this:
final Level EXCEPTION = Level.forName("EXCEPTION", 50);
logger.log(EXCEPTION, "uncaught exception", e);
But I don't know how to use this in Grails with the injected log object. It only supports the base options like log.error('foo',e). Grails documentation says how to add custom appenders, but nothing about custom levels (or did I miss it?!)
Any suggestions?
Grails uses Slf4j and Commons Logging to abstract the logger implementation and allow changing from Log4j to another framework without having to edit every file with a logger. Instead, the wrapper library gets the correct implementation instance based on the requested logger name and what's available from the native API. If you change implementations, the wrapper loggers work the same way as far as your app code is concerned, but they call different implementation loggers to do the actual logging.
But there's no standard between implementations for configuration, so internal Grails startup code works directly with the API to configure loggers, appenders, levels, etc. You can do the same - use the traditional Log4j logger access code to get an instance by logger name, using the same one as the preconfigured logger Grails wired up. I can never remember the naming convention for loggers in artifacts, so I cheat and add a line of code
println log.name
in a method that I know runs, and call that method indirectly via whatever controller action can get there. So for example, if I want to know the logger of FractalService, put that code in its graphJuliaSet method and call the controller action that graphs Julia Sets using this service.
Log4j loggers are singletons; if you access a logger and change it, that will affect all future calls.
So that logger is available via something like:
import org.apache.log4j.Logger

String name = ... // the name from the println above
Logger logger = Logger.getLogger(name)
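If the end goal is email only for uncaught exceptions, one option is to attach an SMTPAppender to just the logger used by the errors controller, for example from BootStrap. This is only a sketch: the logger name, mail settings, and ErrorsController are placeholders, and it assumes the Log4j 1.x API that Grails 2.2.4 ships with.

import org.apache.log4j.Level
import org.apache.log4j.Logger
import org.apache.log4j.PatternLayout
import org.apache.log4j.net.SMTPAppender

class BootStrap {
    def init = { servletContext ->
        // placeholder logger name: use whatever `println log.name` printed in ErrorsController
        Logger errorLogger = Logger.getLogger('grails.app.controllers.ErrorsController')

        SMTPAppender mail = new SMTPAppender()
        mail.setSMTPHost('smtp.example.com')    // placeholder mail settings
        mail.setFrom('app@example.com')
        mail.setTo('ops@example.com')
        mail.setSubject('Uncaught exception')
        mail.setLayout(new PatternLayout('%d %c %m%n'))
        mail.setThreshold(Level.ERROR)          // only ERROR and above from this one logger
        mail.activateOptions()

        errorLogger.addAppender(mail)
    }
}

Because the appender hangs off that single logger, routine ERROR entries elsewhere in the application never reach it.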

Grails: What is the best way to access domain classes from a src/groovy class?

The Grails FAQ says this:
Q: How can I access domain classes from sources in src/groovy?
Sometimes, you are developing some utility classes that live in src/groovy and which you intend to use from Services and other artifacts. However, as those classes are pre-compiled by Grails, it is not possible to instantiate them and write things like Book.findByTitle("Groovy in Action"). But fortunately, there is a workaround since it's possible to do this:
import org.codehaus.groovy.grails.commons.ApplicationHolder
//…
def book = ApplicationHolder.application.getClassForName("library.Book").findByTitle("Groovy in Action")
The application MUST have finished bootstrapping before the dynamic Gorm methods will function correctly.
However, it appears that I can directly import domain objects and use GORM methods in my src/groovy classes without any problem, e.g.:
Book.findByTitle("Groovy in Action")
Since ApplicationHolder is deprecated, this advice must be out of date, but is there still any reason to avoid using domain classes directly from src/groovy?
You are correct, you are referring to outdated information. You can use domain classes inside classes defined under src/groovy.
The only overhead is that you have to handle transactions manually. By contrast, services inside grails-app/services handle transactions by default: they take care of transactions when the transactional flag is set to true (the default if nothing is specified).
When you access domain classes from src/groovy, on the other hand, you have to use a withTransaction block to handle transactions manually:
Book.withTransaction { status ->
    def book = Book.findByTitle("Groovy in Action")
    book.title = "Grails in Action"
    book.save()
    status.setRollbackOnly() // Rolls back the transaction
}
Refer to withTransaction for details.
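For comparison, a minimal sketch of the same update done from a service (BookService is a hypothetical name); services under grails-app/services are transactional by default, so no explicit withTransaction block is needed:

// grails-app/services/BookService.groovy
class BookService {
    static transactional = true   // the default; shown only for clarity

    void renameBook() {
        def book = Book.findByTitle("Groovy in Action")
        book.title = "Grails in Action"
        book.save()
    }
}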

How can you log from a Neo4j Server Plugin?

I'm trying to debug a problem in the Neo4J Server plugin I'm writing. Is there a log I can output to? It's not obvious where or how to do this.
Good question. I think you could use Java Logging? That should be routed into the normal logging system.
Just inject org.neo4j.logging.Log into the class containing the implementation of your Neo4j stored procedure.
public class YourProcedures {

    @Context
    public Transaction tx;

    @Context
    public Log log;

    @Procedure(value = "yourProcedure", mode = Mode.READ)
    public Stream<YourResult> yourProcedure(@Name("input") String input) {
        log.debug("something");
        return Stream.empty(); // return your actual results here
    }
}
Logs are then written to the standard Neo4j log file.
The level is controlled by the GraphDatabaseSettings.store_internal_log_level configuration.
The level can also be changed at runtime. Just inject the DependencyResolver bean and define this admin procedure. (The framework has a listener hooked to config changes which reconfigures the internal logging framework. This is the simplest solution I could find.)
@Context
public DependencyResolver dependencyResolver;

@Procedure(value = "setLogLevel", mode = Mode.DBMS)
@Description("Runtime change of logging level")
public void setLogLevel(@Name("level") String level) {
    Config config = dependencyResolver.resolveDependency(Config.class);
    config.set(GraphDatabaseSettings.store_internal_log_level, Level.valueOf(level));
}
UPDATE:
This ^ solution works, however it is insufficient when one wants to use logging the way it is usually used in Log4j: different loggers organized in a hierarchy, each logger at its own level. The org.neo4j.logging.Log component is just a wrapper around the Log4j logger for the GlobalProcedures class. This logger is only one of many loggers in the hierarchy, and the wrapper in fact blocks access to the richer features of the underlying framework. (Unfortunately, defining multiple @Context Log fields in YourProcedures distinguished by some annotation qualifying the logger is also impossible, because field injection is driven by a Map<Class, instance>, so there is only one possible instance to inject for any @Context-annotated field of a given type.)
Solution 1:
Use JUL as in the accepted answer. The disadvantage is that JUL redirects log events to the underlying Log4j anyway, so if the logger hierarchy is defined in JUL, Log4j must be set to the lowest possible level in order to make the JUL levels effective.
Solution 2:
Use Log4j directly (i.e. public static final Logger logger = LogManager.getLogger("some.identifier.in.hierarchy") in YourProcedures). Redefining the configuration programmatically is possible, though it has some issues; I dropped this solution only because I had trouble deploying it in a non-Docker environment.
Solution 3: (finally chosen)
I defined a custom component, LogWithHierarchy (it can be built from your own ExtensionFactory loaded via ServiceLoaders; I was inspired by the APOC config implementation). This component provides an API of the form debug(loggerName, message), info(loggerName, message), etc. The component knows the original Log, drills down into its Log4j LoggerContext, and redirects all logging requests to the particular logger in that LoggerContext. Log messages finally end up in debug.log. With this solution the original Log4j logger hierarchy is fully utilized, levels can be changed dynamically at runtime (setLogLevel must be changed to operate on the aforementioned LoggerContext), and everything is still implemented using standard Neo4j plugin support.
