Kube Cloud Pt4 | Observability
Full course: Kube Cloud Pt4 | Observability
- Kube Cloud Pt4 | Logging
- Kube Cloud Pt4 | Tracing
- Kube Cloud Pt4 | Metrics
Add Structured Logging
Structured logging means we log our output in JSON format. This is extremely useful because tools can easily parse the JSON, making each log message attribute available for searching, and it lets you control the presentation (highlight signal, reduce noise). You’re probably more familiar with the unstructured logging that comes by default with Spring Boot: the framework has already parsed the logs and presents them in a mostly useful format, but we’re going to want more control than that because our needs may extend beyond the basic functionality.
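To make the difference concrete, here’s a minimal sketch in plain Java (no logging framework involved; the JSON field names are illustrative, not the exact Logstash schema) comparing an unstructured log line with a structured one:

```java
public class StructuredLogDemo {
    public static void main(String[] args) {
        // Unstructured: one opaque string; a log tool has to regex-parse it
        // to extract the level, logger, or any business attribute.
        String unstructured =
                "2022-02-01T12:00:00 INFO MessageService: generated message for user 42";

        // Structured: each attribute is a named JSON field that tools like
        // Elasticsearch can index and search directly (illustrative fields).
        String structured = String.format(
                "{\"@timestamp\":\"%s\",\"level\":\"%s\",\"logger\":\"%s\",\"message\":\"%s\",\"userId\":%d}",
                "2022-02-01T12:00:00", "INFO", "MessageService",
                "generated message", 42);

        System.out.println(unstructured);
        System.out.println(structured);
    }
}
```

With the structured form, a query like `userId:42` works without any parsing rules on the logging backend.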
Create a new branch in message-generator
$ git checkout -b logging
Switched to a new branch 'logging'
Update your pom.xml to include the Logstash logback encoder and the Logz.io logback appender:
<properties>
    ...
    <logstash-logback-encoder.version>7.0.1</logstash-logback-encoder.version>
    <log4j2.version>2.17.1</log4j2.version>
    <logzio-logback-appender.version>1.0.25</logzio-logback-appender.version>
</properties>

<!-- logging -->
<dependency>
    <groupId>net.logstash.logback</groupId>
    <artifactId>logstash-logback-encoder</artifactId>
    <version>${logstash-logback-encoder.version}</version>
</dependency>
<dependency>
    <groupId>io.logz.logback</groupId>
    <artifactId>logzio-logback-appender</artifactId>
    <version>${logzio-logback-appender.version}</version>
</dependency>
A few things here:
- We’re updating log4j to the most current version because a major security vulnerability (Log4Shell) was discovered. By adding this property, we tell the Spring Boot starter parent to override the version it normally brings in.
- The Logstash logback encoder tells logback how to present the structured logs in Logstash format. Logstash is the “L” in ELK (Elasticsearch, Logstash, and Kibana) and is the “pump” that pushes the logs over the network into Elasticsearch for storage.
- The Logz.io logback appender adds a logback appender with all of the default configuration needed to communicate with logz.io.
We need to gather some information from logz.io so that we can connect to our account. Navigate to the “Send your logs” menu item.
Search for Java
Navigate to logback
Scroll down to the configuration and get the token
Now we just need to add some configuration so that logback understands how we want to log. Add a file called logback-spring.xml to your src/main/resources directory with this content:
<?xml version="1.0" encoding="UTF-8"?>
<configuration>
    <shutdownHook class="ch.qos.logback.core.hook.DelayingShutdownHook"/>
    <include resource="org/springframework/boot/logging/logback/defaults.xml"/>

    <appender name="CONSOLE" class="ch.qos.logback.core.ConsoleAppender">
        <layout class="ch.qos.logback.classic.PatternLayout">
            <Pattern>
                %black(%d{ISO8601}) %highlight(%-5level) [%blue(%t)] [%cyan(%X{trace_id}),%cyan(%X{span_id})] %yellow(%C{1.}): %msg%n%throwable
            </Pattern>
        </layout>
    </appender>

    <appender name="LogzioLogbackAppender" class="io.logz.logback.LogzioLogbackAppender">
        <token>${LOGZIO_LOGGER_TOKEN}</token>
        <logzioUrl>https://listener.logz.io:8071</logzioUrl>
        <logzioType>message-generator</logzioType>
    </appender>

    <springProfile name="dev">
        <root level="info">
            <appender-ref ref="CONSOLE"/>
        </root>
    </springProfile>
    <springProfile name="!dev">
        <root level="info">
            <appender-ref ref="CONSOLE"/>
            <appender-ref ref="LogzioLogbackAppender"/>
        </root>
    </springProfile>
</configuration>
Here we’re setting up two appenders. CONSOLE logs a formatted string pattern, which I think is easier to read, and we’ve included some colors to help things stand out. Next, we set up the Logz.io appender to push logs over to logz.io. Make sure to put your token in, and set logzioType to your service name, since this attribute is what matches the logs to your service.
We’re also setting up some profile configuration so that when we’re developing locally we’re not necessarily pushing logs up to logz.io. You can turn this on and off by setting the active spring profile (-Dspring.profiles.active=dev), but we’re going to leave this off for now while we’re testing.
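For reference, here’s how activating the dev profile might look when running locally (the jar filename is illustrative; use whatever your build actually produces):

```shell
# Run via the Spring Boot Maven plugin with the dev profile active
mvn spring-boot:run -Dspring-boot.run.profiles=dev

# Or run the packaged jar directly (note: -D flags go before -jar)
java -Dspring.profiles.active=dev -jar target/message-generator-0.0.1-SNAPSHOT.jar
```

With the dev profile active, only the CONSOLE appender is used, so nothing is shipped to logz.io.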
We don’t want logging to logz.io during our tests, because this creates an unneeded dependency on an external system. So create this logback-test.xml under src/test/resources:
<?xml version="1.0" encoding="UTF-8"?>
<configuration>
    <shutdownHook class="ch.qos.logback.core.hook.DelayingShutdownHook"/>
    <include resource="org/springframework/boot/logging/logback/defaults.xml"/>

    <appender name="CONSOLE" class="ch.qos.logback.core.ConsoleAppender">
        <layout class="ch.qos.logback.classic.PatternLayout">
            <Pattern>
                %d{ISO8601} %highlight(%-5level) [%cyan(%X{trace_id}),%cyan(%X{span_id})] %yellow(%C{1}): %msg%n%throwable
            </Pattern>
        </layout>
    </appender>

    <root level="INFO">
        <appender-ref ref="CONSOLE"/>
    </root>
</configuration>
You should see a difference when you start up locally:
And you should see your logs in logz.io
Navigate to Kibana
And you should see them in a few minutes (it takes some time to get them over, parsed, stored and indexed)
Add Logs
Our logging stance is pretty sparse at this point, so let’s start beefing it up.
I use an aspect that hooks into every method and logs its input and output. This removes the tedium of adding lots of log statements that need to be maintained. Aspects aren’t used a lot, but they can be handy. Be careful of overusing them, though, because they hide a lot of functionality and can make debugging troublesome.
First, add the Spring AOP starter (which brings in AspectJ) as a dependency in our pom.xml:
<dependency>
    <groupId>org.springframework.boot</groupId>
    <artifactId>spring-boot-starter-aop</artifactId>
</dependency>
Next we’re going to use an annotation to activate the logging, so create a new package called com.bullyrooks.messagegenerator.config and, in it, an annotation called LoggingEnabled:
package com.bullyrooks.messagegenerator.config;
import java.lang.annotation.ElementType;
import java.lang.annotation.Retention;
import java.lang.annotation.RetentionPolicy;
import java.lang.annotation.Target;
@Target(ElementType.TYPE)
@Retention(RetentionPolicy.RUNTIME)
public @interface LoggingEnabled {
}
Now create the aspect in the same package, as a class called LoggingAspect:
package com.bullyrooks.messagegenerator.config;

import com.fasterxml.jackson.core.JsonProcessingException;
import com.fasterxml.jackson.databind.ObjectMapper;
import com.fasterxml.jackson.databind.ObjectWriter;
import lombok.extern.slf4j.Slf4j;
import org.aspectj.lang.ProceedingJoinPoint;
import org.aspectj.lang.annotation.Around;
import org.aspectj.lang.annotation.Aspect;
import org.aspectj.lang.annotation.Pointcut;
import org.aspectj.lang.reflect.MethodSignature;
import org.springframework.context.annotation.Profile;
import org.springframework.stereotype.Component;
import org.springframework.util.StopWatch;

import java.util.HashMap;
import java.util.Map;
@Aspect
@Component
@Slf4j
@Profile("!test")
public class LoggingAspect {

    ObjectMapper om = new ObjectMapper();

    @Pointcut("within(@com.bullyrooks.messagegenerator.config.LoggingEnabled *)")
    public void loggingEnabled() {
    }

    @Pointcut("execution(public * *(..))")
    public void publicMethod() {
    }

    @Pointcut("execution(private * *(..))")
    public void privateMethod() {
    }

    @Around("loggingEnabled() && (publicMethod() || privateMethod())")
    public Object profileAllMethods(ProceedingJoinPoint proceedingJoinPoint) throws Throwable {
        logMethodInvocationAndParameters(proceedingJoinPoint);

        // Measure method execution time
        final StopWatch stopWatch = new StopWatch();
        stopWatch.start();
        Object result = proceedingJoinPoint.proceed();
        stopWatch.stop();

        // Log the result along with the execution time
        logMethodResultAndParameters(proceedingJoinPoint, result, stopWatch.getTotalTimeMillis());
        return result;
    }

    private void logMethodResultAndParameters(ProceedingJoinPoint proceedingJoinPoint,
                                              Object result, long totalTimeMillis) {
        try {
            MethodSignature methodSignature = (MethodSignature) proceedingJoinPoint.getSignature();
            String className = methodSignature.getDeclaringType().getSimpleName();
            String methodName = methodSignature.getName();
            ObjectWriter writer = om.writer();
            if (writer != null) {
                log.debug("<- {}.{} returns:{}. Execution time: {}ms",
                        className,
                        methodName,
                        writer.writeValueAsString(result),
                        totalTimeMillis);
            }
        } catch (JsonProcessingException e) {
            log.error("unable to write log value: {}", e.getMessage(), e);
        }
    }

    private void logMethodInvocationAndParameters(ProceedingJoinPoint jp) {
        try {
            // Pair each parameter name with its runtime value
            String[] argNames = ((MethodSignature) jp.getSignature()).getParameterNames();
            Object[] values = jp.getArgs();
            Map<String, Object> params = new HashMap<>();
            if (argNames != null && argNames.length != 0) {
                for (int i = 0; i < argNames.length; i++) {
                    params.put(argNames[i], values[i]);
                }
            }
            ObjectWriter writer = om.writer();
            if (writer != null) {
                String className = jp.getSignature().getDeclaringType().getSimpleName();
                String methodName = jp.getSignature().getName();
                log.debug("-> {}.{} invocation. params: {}",
                        className,
                        methodName,
                        writer.writeValueAsString(params));
            }
        } catch (JsonProcessingException e) {
            log.error("unable to write log value: {}", e.getMessage(), e);
        }
    }
}
There’s a lot going on here, so I’ll try to explain.
First, we describe our pointcuts: any class that has our annotation, any public method, and any private method. Then we create an @Around advice that combines those pointcuts and says what to execute: before the method is invoked, log the input; then start a timer, call the method, stop the timer when it returns, and log the output.
The logMethodInvocationAndParameters method logs the input. It’s a little complicated, but essentially it grabs the input parameters, the class name, and the method name and writes them to the log along with a direction arrow.
The logMethodResultAndParameters method essentially does the same thing, but instead of collecting the input parameters it has direct access to the returned object, so it just logs that in JSON format.
Finally, the @Profile("!test") annotation tells Spring not to create this aspect when running with the test profile, which means these logs won’t be generated during testing.
Now, everything is logging at debug level, so we’re going to need to adjust our log configuration to display it. Add these lines to your application.yaml:
logging:
  level:
    root: INFO
    com.bullyrooks: DEBUG
This says to log everything at INFO level or above, except our packages, which should log at DEBUG.
Finally, add these annotations to MessageController and MessageService:
@LoggingEnabled
@Slf4j
Restart the application and hit the /message endpoint. You should see logs similar to this:
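Something like the following (illustrative output — your timestamps, thread names, and payloads will differ, and the trace_id/span_id fields render as empty until we add tracing):

```
2022-02-01 12:00:00,000 DEBUG [http-nio-8080-exec-1] [,] c.b.m.c.LoggingAspect: -> MessageController.getMessage invocation. params: {}
2022-02-01 12:00:00,012 DEBUG [http-nio-8080-exec-1] [,] c.b.m.c.LoggingAspect: <- MessageController.getMessage returns:{"message":"hello"}. Execution time: 12ms
```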
Note, this is almost tracing (but isn’t nearly as good as it can be). Also, the logs are written via the aspect, so the aspect’s package and class appear as the source of each log line.
You should also see them in logz.io
Let’s go ahead and push up these changes and deploy to okteto.
git add .
git commit -m "logging enabled"
git push --set-upstream origin logging
Enumerating objects: 33, done.
Counting objects: 100% (33/33), done.
Delta compression using up to 4 threads
Compressing objects: 100% (15/15), done.
...
* [new branch] logging -> logging
Branch 'logging' set up to track remote branch 'logging' from 'origin'.
When that succeeds, merge to main
git checkout main
git merge logging
git push
Once it builds and deploys, we should see the logs in okteto’s console as well as in logz.io.
Note: for some reason my debug logging level wasn’t getting passed through. You can verify the log level through Actuator with the endpoint /actuator/loggers/com.bullyrooks (mine was set to INFO). You can use Actuator to change the configuration in flight by POSTing to that same endpoint with this body:
{"configuredLevel": "DEBUG"}
If you GET again, you should see it updated
{
  "configuredLevel": "DEBUG",
  "effectiveLevel": "DEBUG"
}
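Assuming the service is reachable at localhost:8080 (adjust the host and port for your okteto deployment), the GET and POST above might look like this with curl:

```shell
# Check the current level for our package (hypothetical host/port)
curl http://localhost:8080/actuator/loggers/com.bullyrooks

# Bump it to DEBUG in flight; Actuator expects a JSON body
curl -X POST -H "Content-Type: application/json" \
     -d '{"configuredLevel": "DEBUG"}' \
     http://localhost:8080/actuator/loggers/com.bullyrooks

# Verify the change took effect
curl http://localhost:8080/actuator/loggers/com.bullyrooks
```

This changes the running application only; the level reverts to whatever application.yaml specifies on the next restart.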
Update Cloud Application
Now, make the same updates to the cloud application and push that up to okteto before we move on to tracing.