[Question / Feature Suggestion] Distributed tracing using json-logging library
Sureya opened this issue · 2 comments
What am I trying to do?
Let's say we have a microservice A, which talks to two other microservices, A1 and A2, to process its requests. In this case, I would like to implement distributed tracing. The aim is to monitor metrics such as the average time a request spends in step A1 and in step A2, so that we can find overall bottlenecks.
In the context of this library, let's assume all of the services are Flask APIs. When the initial request is made to service A, a correlation-id is generated; I want that same correlation-id to be attached to all of the logs emitted by services A1 and A2.
@thangbn: To do that, I am thinking we would need a method like json_logging.set_correlation_id(request=request) that could be called in services A1 and A2.
If there is a better way to do this, please feel free to suggest alternatives; I just wanted to get your opinion on this.
This paper from Bloomberg is what I am using as a reference for implementing distributed tracing.
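To make the proposal a bit more concrete, here is a rough sketch of how I picture it being used inside service A1. Note that set_correlation_id is the hypothetical method suggested above, not an existing json-logging API:

```python
import logging
import sys

import json_logging
from flask import Flask, request

app = Flask(__name__)
json_logging.init_flask(enable_json=True)
json_logging.init_request_instrument(app)

logger = logging.getLogger("service-a1")
logger.setLevel(logging.DEBUG)
logger.addHandler(logging.StreamHandler(sys.stdout))

@app.route("/step")
def step():
    # Hypothetical: re-use the correlation-id that service A passed along
    # with this request, so A1's logs carry the same id as A's logs.
    json_logging.set_correlation_id(request=request)  # proposed API, does not exist today
    logger.info("processing step in A1")
    return "done"
```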
You probably figured it out already. The correlation id is auto-extracted from the request headers; the only part you need to take care of is to grab the current correlation-id explicitly via json_logging.get_correlation_id() and set it into the headers of any upstream service call.
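For anyone landing here later, a minimal sketch of that propagation in service A. Assumptions: the A1 URL is made up, and X-Correlation-ID is used as the outgoing header name (one of the headers json-logging recognizes by default); adjust it to whatever header your services expect.

```python
import logging
import sys

import json_logging
import requests
from flask import Flask

app = Flask(__name__)
json_logging.init_flask(enable_json=True)
json_logging.init_request_instrument(app)

logger = logging.getLogger("service-a")
logger.setLevel(logging.DEBUG)
logger.addHandler(logging.StreamHandler(sys.stdout))

A1_URL = "http://a1.internal/step"  # hypothetical downstream endpoint

@app.route("/process")
def process():
    # json-logging has already extracted (or generated) a correlation id
    # for the incoming request; grab it explicitly here ...
    correlation_id = json_logging.get_correlation_id()

    # ... and forward it to A1 so that A1's logs carry the same id.
    resp = requests.get(A1_URL, headers={"X-Correlation-ID": correlation_id})

    logger.info("call to A1 returned %s", resp.status_code)
    return resp.text
```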