How to trace distributed logs using Spring Cloud Sleuth?

In the previous post, I explained how to trace logs across distributed applications/microservices using the MDC (Mapped Diagnostic Context) feature provided by major logging frameworks and facades such as SLF4J.

Spring Boot, together with Spring Cloud Sleuth, has made it even easier.

You just need to include the spring-cloud-starter-sleuth dependency and make a few minor changes, and your logs are automatically appended with trace ids. These trace ids can then be used to correlate the logs across the different services.

It achieves this by internally using MDC, doing something similar to what was explained in the previous post.

In addition, Spring Cloud Sleuth also adds a span id.

This is used to track each individual service (each unit of work) within the entire flow.

So if api1 calls api2, they will both have the same trace id but different span ids.
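
Since the ids are kept in MDC, you can also read them from your own code if you ever need to. Here is a minimal sketch of that, using an SLF4J logger like the ones in the controllers further below; the MDC key names "traceId" and "spanId" are an assumption based on Sleuth's default MDC correlation and may differ across versions:

	import org.slf4j.MDC;

	// Minimal sketch: read the ids Sleuth stored in MDC for the current request.
	// The keys "traceId" and "spanId" are assumptions (Sleuth's default MDC correlation).
	String traceId = MDC.get("traceId");
	String spanId = MDC.get("spanId");
	logger.info("Current traceId={}, spanId={}", traceId, spanId);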

Now let’s see a demo of this.

First, create a Spring Boot project through https://start.spring.io/.

Add both the Web and Sleuth dependencies (spring-boot-starter-web and spring-cloud-starter-sleuth), as shown below:
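
If you are using Maven, the relevant part of the generated pom.xml would look roughly like this. This is a sketch of what start.spring.io produces; the spring-cloud.version property it defines depends on the Spring Boot version you pick:

	<dependencies>
		<dependency>
			<groupId>org.springframework.boot</groupId>
			<artifactId>spring-boot-starter-web</artifactId>
		</dependency>
		<dependency>
			<groupId>org.springframework.cloud</groupId>
			<artifactId>spring-cloud-starter-sleuth</artifactId>
		</dependency>
	</dependencies>

	<dependencyManagement>
		<dependencies>
			<dependency>
				<groupId>org.springframework.cloud</groupId>
				<artifactId>spring-cloud-dependencies</artifactId>
				<version>${spring-cloud.version}</version>
				<type>pom</type>
				<scope>import</scope>
			</dependency>
		</dependencies>
	</dependencyManagement>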

Let this be service one.

Similarly, create another project and let this be service two.

As in the previous project, add both the web and sleuth dependencies.

Now create a REST service in the service two project and invoke it from the service one project using Spring's RestTemplate.

That’s it!

Spring Boot will now automatically append trace and span ids to your application logs.

But wait!

There is one thing you need to pay attention to.

You need to make sure that the RestTemplate instance is injected and not created using the new operator.

Use the below code:

	// Register the RestTemplate as a Spring bean so that Sleuth can attach its tracing interceptors...
	@Bean
	public RestTemplate template() {
	    return new RestTemplate();
	}

	// ...and inject that bean wherever you need to make the call.
	@Autowired
	private RestTemplate template;

and not:

RestTemplate template = new RestTemplate();

This is because, to implement tracing, Spring needs to add its interceptors to the RestTemplate, and it can do that only if the RestTemplate is a Spring-managed bean and not an instance you create manually.

As stated in the Spring Cloud Sleuth docs and in this Stack Overflow thread:

"You have to register RestTemplate as a bean so that the interceptors will get injected. If you create a RestTemplate instance with a new keyword then the instrumentation WILL NOT work."

Here is service two controller code:

package com.example.servicetwo;

import java.util.HashMap;
import java.util.Map;

import org.slf4j.Logger;
import org.slf4j.LoggerFactory;
import org.springframework.web.bind.annotation.GetMapping;
import org.springframework.web.bind.annotation.RestController;

@RestController
public class ControllerTwo {

	Logger logger = LoggerFactory.getLogger(ControllerTwo.class.getName());

	@GetMapping("/controller2")
	public Map<String, String> controller2() {

		logger.info("Executing controller2 method");

		Map<String, String> map = new HashMap<>();

		logger.info("Preparing message");
		map.put("message", "Hey this is controller two!");

		logger.info("Returning response from controller2");
		return map;

	}
}
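
If you want to see what the interceptors discussed above actually add to the call coming from service one, you could temporarily drop a handler like the one below into ControllerTwo and point the RestTemplate call at it. This is only a sketch on top of the demo, not part of the original example, and the X-B3-* header names assume Sleuth's default B3 propagation:

	// Optional sketch (not part of the original demo): log the tracing headers that
	// Sleuth's RestTemplate interceptor added on the service one side.
	// Needs: import org.springframework.http.HttpHeaders;
	//        import org.springframework.web.bind.annotation.RequestHeader;
	@GetMapping("/controller2/headers")
	public Map<String, String> traceHeaders(@RequestHeader HttpHeaders headers) {

		Map<String, String> received = new HashMap<>();
		// With default B3 propagation these headers carry the trace and span ids.
		received.put("X-B3-TraceId", headers.getFirst("X-B3-TraceId"));
		received.put("X-B3-SpanId", headers.getFirst("X-B3-SpanId"));

		logger.info("Tracing headers received: {}", received);
		return received;
	}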

Here is the application.properties file of the service two project:

server.port = 8081
spring.application.name = servicetwo

I am running it on a different port because service one runs on the default port 8080.

Also notice spring.application.name. This is the name of the application that will be displayed in the logs. It is optional.

Here is the service one controller code:

package com.example.serviceone;

import java.util.Map;

import org.slf4j.Logger;
import org.slf4j.LoggerFactory;
import org.springframework.beans.factory.annotation.Autowired;
import org.springframework.context.annotation.Bean;
import org.springframework.web.bind.annotation.GetMapping;
import org.springframework.web.bind.annotation.RestController;
import org.springframework.web.client.RestTemplate;

@RestController
public class ControllerOne {

	@Bean
	public RestTemplate template() {
	    return new RestTemplate();
	}
	@Autowired
	private RestTemplate template;
	
	Logger logger = LoggerFactory.getLogger(ControllerOne.class.getName());

	@GetMapping("/controller1")
	public Map<String, String> controller1() {
		
		logger.info("Execution begins in controller 1");


		logger.info("Calling Controller 2 from Controller 1");
		Map<String, String> response = template.getForObject("http://localhost:8081/controller2", Map.class);

		logger.info("Returning response from controller1");
		return response;

	}
}

Notice how the RestTemplate is injected above.

Here is the application.properties file of the service one project:

spring.application.name = serviceone


Now run both the applications and hit the REST endpoint of the service one project at http://localhost:8080/controller1.

We get the response back from the service two project: {"message":"Hey this is controller two!"}

Now let’s look at the logs.

Here is the log printed in the console of service one application:

2020-11-02 20:40:48.402  INFO [serviceone,3f31d74a5af5610e,3f31d74a5af5610e,true] 24860 --- [nio-8080-exec-1] com.example.serviceone.ControllerOne     : Execution begins in controller 1
2020-11-02 20:40:48.402  INFO [serviceone,3f31d74a5af5610e,3f31d74a5af5610e,true] 24860 --- [nio-8080-exec-1] com.example.serviceone.ControllerOne     : Calling Controller 2 from Controller 1
2020-11-02 20:40:48.780  INFO [serviceone,3f31d74a5af5610e,3f31d74a5af5610e,true] 24860 --- [nio-8080-exec-1] com.example.serviceone.ControllerOne     : Returning response from controller1

And here is the log from service two console:

2020-11-02 20:40:48.646  INFO [servicetwo,3f31d74a5af5610e,8320a4add5c3ee2e,true] 19932 --- [nio-8081-exec-1] com.example.servicetwo.ControllerTwo     : Executing controller2 method
2020-11-02 20:40:48.646  INFO [servicetwo,3f31d74a5af5610e,8320a4add5c3ee2e,true] 19932 --- [nio-8081-exec-1] com.example.servicetwo.ControllerTwo     : Preparing message
2020-11-02 20:40:48.646  INFO [servicetwo,3f31d74a5af5610e,8320a4add5c3ee2e,true] 19932 --- [nio-8081-exec-1] com.example.servicetwo.ControllerTwo     : Returning response from controller2

Notice the data within square brackets:

[serviceone,3f31d74a5af5610e,3f31d74a5af5610e,true]

This is the information added by Spring Cloud Sleuth.

The first word “serviceone” is the application name.

The second id is the trace id.

The third id is the span id.

The fourth boolean value indicates whether the span is exported to Zipkin (a server where you can upload your traces and track them centrally). We are not dealing with Zipkin in this post.

Notice the trace id in both logs: it is the same.

Notice the span ids: they are different for the two different services.

That’s it!

Tracing logs got much easier with Spring Cloud Sleuth.



2 responses to “How to trace distributed logs using Spring Cloud Sleuth?”

  1. Tim:

    What’s the point of this? It tells you which logs are from what transactions but you have to look at each of the instances and you may not know which instances to look at or they may be gone by now. There may be dozens of consoles that you have to look at, too.
    I’m not arguing, I just think they should be put somewhere so you can see them all at once. Like Zipkin, but the example I saw didn’t actually load the ones coded for debugging purposes (most were timestamps for determining latency).

    1. Vijay SRJ:

      Yes, definitely, the logs should be aggregated using tools like Zipkin or Splunk. This example is just to show how Spring Cloud Sleuth works. The logs generated should be uploaded to a tool like Splunk so that you can debug easily in a production environment.
