Hey, I'm Marco and welcome to my newsletter!
As a software engineer, I created this newsletter to share my first-hand knowledge of the development world. Each topic we explore will provide valuable insights, with the goal of inspiring and helping you on your journey.
In this episode I want to show you how I use logs in my applications, how I structure them, and a real debugging example.
Are you interested in learning how to implement a magic link authentication mechanism using JSONWebToken?
Sign up, I'll be publishing a post about it in the coming weeks!
👋 Introduction
Imagine you hit a problem in production and need to figure it out. Where do you start?
Debugging is an essential skill for developers, and knowing how to gather more information makes finding the real issue much easier.
That’s where logging helps: it saves you time by giving you clear data. And it’s not just for fixing bugs; logs can track user activity, usage levels, and service performance, making everything easier to monitor and improve.
Ever since my first experience with them, I have been obsessed with logs. Every morning I check the log tool for errors from the previous day; if I find any, I create a task, prioritize it, and add it to the backlog.
I have worked in environments where logging was either overused or completely ignored. Having experienced both extremes, I can confidently say that when used effectively, logging can make a significant difference.
Dashboards
Logs can be organized and combined to make dashboards. These are helpful because they quickly show if something is wrong, like a drop in performance or a problem with an external service.
We can also set up alerts based on log data. Recently, I've been working on making sure that if there's an error in the application, it triggers an alert on AWS, which then sends a message to a dedicated Slack channel. This setup is great because it means I don't have to check the logs all the time; they notify me about issues automatically (a push model rather than a pull one).
There are different tools you can use to store and show log data. Over time, I've used several popular ones, including:
Kibana
Kibana is a tool that helps you see and understand your data. It works very well with Elasticsearch: you can use it to create charts, maps, and dashboards from the data stored there. It is very useful for getting insights and keeping an eye on trends and key numbers in one place. Both managed and self-hosted options are available, so you can choose the one that is right for you.
AWS CloudWatch
CloudWatch is an AWS service for monitoring and managing your AWS resources. It tracks metrics, monitors log files, sets alarms, and triggers automated actions. It's essential for gaining insights into resource performance, optimizing AWS usage, and keeping your AWS environment healthy. It's a proprietary solution.
Grafana
Grafana is a tool for visualizing data and monitoring systems. It works with different data sources like Elasticsearch and Prometheus. You can create custom dashboards and charts to track real-time data easily. Grafana is popular for monitoring performance metrics, applications, and business data, making it a valuable tool for data analysis and insights.
💡 My solution
In my Node.js applications, I handle logging in this way:
console.log wrapper
There are many libraries available for logging in Node.js. In my case, I chose to create a log.js module that wraps console.log to avoid an unnecessary dependency. This module handles JSON serialization correctly and adds a "transaction-id" prefix, a unique ID for each API call, which makes it easy to filter the logs and view only those for a specific execution.
The transaction ID is generated in the first middleware when an HTTP call comes in; it is stored in async storage for later use and returned in the "x-transaction-id" response header.
Are you interested in how I made this module? You can find it in this file of my Node.js backend template.
Print a log in each method with all input parameters
static getByPath = memoize(async (path) => {
  log.info('Model::Post::getByPath', { path })
  const rows = await query(`
    select *
    from posts
    where path = ?
  `, [path]);
  if (rows.length !== 1) throw new APIError404('Post not found.')
  const post = Post.fromDBRow(rows[0])
  return post
}, {
  promise: true
})
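One side effect worth knowing when logging inside a memoized method like this: the function body, log call included, only runs on cache misses, so repeated calls with the same path produce a single log line. A self-contained illustration, with a hand-rolled memoizer standing in for the library used above:

```javascript
// Hand-rolled single-argument memoizer, standing in for the memoize
// library used above, to show how caching interacts with logging.
function memoizeByArg(fn) {
  const cache = new Map();
  return (arg) => {
    if (!cache.has(arg)) cache.set(arg, fn(arg));
    return cache.get(arg);
  };
}

const logged = [];
const getByPath = memoizeByArg((path) => {
  logged.push(path); // stand-in for log.info('Model::Post::getByPath', { path })
  return { path };
});

getByPath('/first-post');
getByPath('/first-post'); // cache hit: the body (and its log) does not run again

console.log(logged.length); // → 1: only the cache miss was logged
```

So if you expect a log line per call but see fewer, check whether the method is memoized before assuming the calls never happened.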