Hey, I'm Marco and welcome to my newsletter!
As a software engineer, I created this newsletter to share my first-hand knowledge of the development world. Each topic we explore will provide valuable insights, with the goal of inspiring and helping you on your journey.
In this episode I want to share the backend template I use for all my projects. I've refined it over the past year, and I believe it includes all the necessary features and building blocks for developing a web application or a script that runs periodically.
You can download all the code shown directly from my Github repository: https://github.com/marcomoauro/node-backend-template
1) 📁 Project structure
For the project structure I take inspiration from the Model View Controller (MVC) approach, creating one folder for controllers and one for models:
/controllers: contains the functions hooked to the application routes; functions that act on the same concept are grouped in the same file.
/models: for each concept represented in the database I create a class responsible for querying it. Here we find queries, utility functions (instance or static), and a function that transforms the database representation of the concept into the application representation.
I also create an /api folder to store functions that I might need or that don't fit well in models or controllers.
Here is a view of what the structure looks like:
src/
api/
controllers/
models/
There are also files that I always create within src/, which are as follows:
asyncStorage.js
database.js
errors.js
index.js
log.js
middlewares.js
router.js
1) asyncStorage.js
import { AsyncLocalStorage } from 'async_hooks';
export default new AsyncLocalStorage();
The AsyncLocalStorage class provides asynchronous storage associated with an asynchronous context. It is useful in scenarios where you need to maintain state or data tied to a specific asynchronous operation.
In my case, I use it to maintain context information for an HTTP request so that various steps in the application can easily access that information without passing it explicitly through each function call. In particular, I extract fields from this instance exclusively in the controllers, so that the models can also be called from other parts of the application, avoiding unwanted coupling.
Importing this file into the various modules gives direct access to the instance; I will show how I populate it later, in the index.js file.
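To make the idea concrete, here is a small self-contained sketch of the pattern. Note that runWithTransaction and currentTransactionId are illustrative names invented for this example, not part of the template:

```javascript
import { AsyncLocalStorage } from 'async_hooks';

const asyncStorage = new AsyncLocalStorage();

// Illustrative wrapper: runs `handler` inside a store carrying an
// id_transaction, similar to what the template's middleware does per request.
const runWithTransaction = (id_transaction, handler) =>
  asyncStorage.run({ id_transaction }, handler);

// Any function called inside the store can read it without extra arguments.
const currentTransactionId = () => asyncStorage.getStore()?.id_transaction;

runWithTransaction('bF2EfutV8L', () => {
  console.log(currentTransactionId()); // 'bF2EfutV8L'
});
console.log(currentTransactionId()); // undefined: we are outside the store
```

Everything called (directly or transitively) from inside run() sees the same store, which is exactly what lets the logger pick up id_transaction without it being passed around.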
2) database.js
This file exports an interface to access the database. Depending on the database, I have various interfaces; the one I have been using lately is for the Postgres DBMS:
import pgPromise from 'pg-promise';
import monitor from 'pg-monitor';
import log from "./log.js";
const initOptions = {
  query(e) {
    monitor.query(e);
  },
  error(err, e) {
    monitor.error(err, e);
  },
};

const pgp = pgPromise(initOptions);

// https://stackoverflow.com/questions/39168501/pg-promise-returns-integers-as-strings
pgp.pg.types.setTypeParser(20, parseInt);

let db;

monitor.setLog((msg, info) => {
  info.display = false; // to avoid library default log
  const query_log = msg.split('\n').slice(1).join(' ');
  log.query(query_log);
});

const getConnectionPool = () => {
  if (!db) {
    db = pgp({
      connectionString: process.env.DATABASE_URL,
      ssl: {
        rejectUnauthorized: false,
      },
      max: 10,
    });
  }
  return db;
};

export default getConnectionPool();
This file configures and exports a pool of PostgreSQL connections with some customizations for query monitoring and logging.
I customized the default log: the library applies its own formatting, which I don't like, so I found a way to disable it and apply my own. This keeps a single standard across all application logs, which is necessary for debugging and monitoring.
3) errors.js
This file exports the error classes and an error handler used in the core of the application. The handler captures all exceptions, whether thrown explicitly or not, handles them, and sends an appropriate response to the client, preventing server crashes.
import asyncStorage from './asyncStorage.js';
import log from './log.js';
export const apiErrorManager = (ctx, error) => {
  let e = error;
  if (error.apiError) {
    e = error.apiError;
  }

  ctx.state.stack = e.stack;

  if (e instanceof APIError) {
    e.id_transaction = asyncStorage.getStore()?.id_transaction ?? 'id_transaction is not defined. Maybe you are outside http lifecycle.';
    ctx.status = e.http_code;
    ctx.body = e;
  } else if (e.status === 401) {
    ctx.status = 401;
    ctx.set('WWW-Authenticate', 'Basic');
    ctx.body = 'Authentication failed, please retry.';
  } else if (e.constructor.name === 'BadRequest') {
    ctx.status = 400;
    ctx.body = e.errors.map((error) => `${error.path} ${error.message}`).join(', ');
  } else {
    log.error(e);
    ctx.status = 500;
    ctx.body = 'Internal server error.';
  }
};
export class APIError extends Error {
  constructor(message, code, http_code = 500, params = undefined) {
    super(message);
    this.name = 'APIError';
    this.message = message;
    this.code = code;
    this.id_transaction = undefined;
    this.http_code = http_code;
    this.params = params;
  }

  toJSON() {
    return { error: true, code: this.code, message: this.message, id_transaction: this.id_transaction, params: this.params };
  }

  toString() {
    return this.message;
  }
}

export class APIError400 extends APIError {
  constructor(message = 'Bad request.', params = undefined) {
    super(message, 'HTTP_400', 400, params);
    this.name = 'APIError400';
  }
}
...
I map the most common errors: for each one, I create a class that can be instantiated and raised as an exception. This way the middleware automatically figures out what the HTTP response code should be and builds the response payload.
In a controller or model, we can use this statement to halt execution and send a 400 HTTP error to the client along with the message "Example for Substack episode."
throw new APIError400('Example for Substack episode.')
and the response the client will get will be:
{
  "error": true,
  "code": "HTTP_400",
  "message": "Example for Substack episode.",
  "id_transaction": "bF2EfutV8L"
}
If we hadn't provided a message, it would use the default one from the class, which, in this case, is "Bad request."
The id_transaction field of the response has a value used as a prefix for all application logs. Knowing this value helps filter execution traces, facilitating debugging in case of an error. The value changes for each API call execution and is stored in asyncStorage.
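The classes elided above ("...") all follow the same pattern. As a sketch, here is how the 404 variant can look; the APIError copy below is simplified for a self-contained example (the real class also carries id_transaction and toString()), and the default message 'Not found.' is an assumption that may differ from the repository:

```javascript
// Simplified copy of APIError, enough to show the subclass pattern.
class APIError extends Error {
  constructor(message, code, http_code = 500, params = undefined) {
    super(message);
    this.name = 'APIError';
    this.code = code;
    this.http_code = http_code;
    this.params = params;
  }

  toJSON() {
    return { error: true, code: this.code, message: this.message, params: this.params };
  }
}

// Each subclass only fixes the code, the HTTP status, and a default message.
class APIError404 extends APIError {
  constructor(message = 'Not found.', params = undefined) {
    super(message, 'HTTP_404', 404, params);
    this.name = 'APIError404';
  }
}

const e = new APIError404();
console.log(e.http_code);       // 404
console.log(JSON.stringify(e)); // {"error":true,"code":"HTTP_404","message":"Not found."}
```

Since toJSON() is defined on the base class, JSON.stringify produces the client-facing payload for every subclass automatically.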
4) index.js
This file contains the definition of the server and the registered middlewares:
import cors from '@koa/cors';
import http from 'http';
import Koa from 'koa';
import json from 'koa-better-json';
import {koaBody} from 'koa-body';
import {APIError404, APIError405, APIError415, apiErrorManager} from './errors.js';
import log from './log.js';
import {
  initAsyncStorage,
  logIncomingCall,
  routeSummaryLog
} from './middlewares.js';
import router from './router.js';
process.on('uncaughtException', (e) => log.error('uncaughtException', e));
process.on('unhandledRejection', (e) => log.error('unhandledRejection', e));
const app = new Koa();
const body_limits = {
  formLimit: '64mb',
  jsonLimit: '64mb',
  formidable: {maxFileSize: '64mb', multiples: true},
  multipart: true,
};
app.use(cors({exposeHeaders: ['x-version']}));
app.use(koaBody(body_limits));
app.use(json());
app.use(initAsyncStorage);
app.use(routeSummaryLog);
app.use(logIncomingCall);
app.use(async (ctx, next) => {
  try {
    ctx.set('x-version', process.env.npm_package_version);
    await next();
    if (!ctx.body) {
      // if body is not defined at this point, no route matched the request, so a 404 error is fired.
      throw new APIError404();
    }
  } catch (error) {
    apiErrorManager(ctx, error); // error handler
  }
});
app.use(router.routes());
app.use(
  router.allowedMethods({
    throw: true,
    notImplemented: () => new APIError415(),
    methodNotAllowed: () => new APIError405(),
  }),
);
const server = http.createServer(app.callback());
export default server;
server.listen(process.env.PORT, async (error) => {
  if (error) {
    log.error(error);
  } else {
    log.info(`http serving on port ${process.env.PORT}`);
  }
});
We have:
cors: CORS middleware for Koa.
koaBody: a body-parser middleware for Koa.
json: middleware that simplifies handling JSON-encoded responses.
initAsyncStorage: custom middleware that creates the storage instance.
routeSummaryLog: custom middleware that logs useful debugging information right before the response is returned to the client.
logIncomingCall: custom middleware that logs incoming API calls, including input parameters.
Right after, we use another middleware. It adds the app version (read from package.json via the npm_package_version env variable) to the response headers. This middleware also wraps the app logic in a try-catch, delegating errors to the apiErrorManager module when they occur.
Finally we find the route registration, specified in the router.js file with:
app.use(router.routes());
And the execution of the server:
server.listen(process.env.PORT, async (error) => {
  if (error) {
    log.error(error);
  } else {
    log.info(`http serving on port ${process.env.PORT}`);
  }
});
5) log.js
It’s the logger I use inside the application. I decided not to use external libraries to avoid unnecessary dependencies, so I wrote a thin wrapper over Node.js's console module.
import asyncStorage from './asyncStorage.js';
const paramToString = (p) => {
  let str = '';
  let remove_newline = true;
  let remove_spaces = true;

  switch (typeof p) {
    case 'object':
      if (p === null) {
        return 'null';
      } else if (p instanceof Error) {
        str = (p?.response?.data ? JSON.stringify(p.response.data) : p?.stack) ?? 'N/A';
        remove_newline = false;
        remove_spaces = false;
      } else {
        if (!Array.isArray(p) && p.constructor.name && p.constructor.name !== 'Object') {
          str = `${p.constructor.name} `;
        }
        str += JSON.stringify(p);
      }
      break;
    case 'undefined':
      return 'undefined';
    default:
      str = p.toString();
      break;
  }

  if (remove_newline) {
    str = str.replace(/\n/gm, ' ');
  }
  if (remove_spaces) {
    str = str.replace(/\s{2,}/gm, ' ');
  }

  return str.trim();
};
const enrichMessage = ({ level, params = [] }) => {
  level = level.toUpperCase();
  params = params.map(paramToString);

  if (params.length > 1) {
    const params_prefix = 'params =>';
    params.splice(1, 0, params_prefix);
  }

  const values = [level, ...params];

  const id_transaction = asyncStorage?.getStore()?.id_transaction;
  if (id_transaction) {
    values.unshift(`[${id_transaction}]`);
  }

  return values;
};
const info = (...params) => {
  console.log(...enrichMessage({ level: 'info', params }));
};

const query = (...params) => {
  console.log(...enrichMessage({ level: 'query', params }));
};

const http = (...params) => {
  console.log(...enrichMessage({ level: 'http', params }));
};

const curl = (...params) => {
  console.log(...enrichMessage({ level: 'curl', params }));
};

const koa = (...params) => {
  console.log(...enrichMessage({ level: 'koa', params }));
};

const auth = (...params) => {
  console.log(...enrichMessage({ level: 'auth', params }));
};

const error = (...params) => {
  console.error(...enrichMessage({ level: 'error', params }));
};

const warn = (...params) => {
  console.log(...enrichMessage({ level: 'warn', params }));
};
export default {
  info,
  query,
  http,
  curl,
  koa,
  auth,
  error,
  warn,
};
It exports an object where each method represents a different log level. When a method is called, it uses console.log() or console.error(). The level is added at the start of the line and, if an asyncStorage store exists, the id_transaction is included as a prefix.
It also allows objects to be passed as they are, without worrying about their serialization.
An example:
const id = 1
const obj = { value: 3 }
log.info('Controller::newsletters::getNewsletter', {id, obj})
=> [qg1igTGIpO] INFO Controller::newsletters::getNewsletter params => {"id":1,"obj":{"value":3}}
log.warn('Controller::newsletters::getNewsletter', {id, obj})
=> [qg1igTGIpO] WARN Controller::newsletters::getNewsletter params => {"id":1,"obj":{"value":3}}
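The serialization behavior can be exercised in isolation. This is a trimmed-down sketch of paramToString, not the full version above (which also compacts whitespace, prints class names, and unwraps HTTP error responses):

```javascript
// Minimal sketch of the serialization rules used by the logger.
const paramToString = (p) => {
  switch (typeof p) {
    case 'object':
      if (p === null) return 'null';
      if (p instanceof Error) return p.stack ?? 'N/A'; // keep the stack readable
      return JSON.stringify(p);
    case 'undefined':
      return 'undefined';
    default:
      return p.toString();
  }
};

console.log(paramToString({ id: 1, obj: { value: 3 } })); // {"id":1,"obj":{"value":3}}
console.log(paramToString(null));                         // null
console.log(paramToString(42));                           // 42
```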
6) middlewares.js
Contains all the middleware used in the index.js file:
import koa_log from 'koa-better-log';
import nanoid from 'nano-id';
import path from 'path';
import asyncStorage from './asyncStorage.js';
import log from './log.js';
export const initAsyncStorage = async (ctx, next) => {
  const id_transaction = nanoid(10);
  ctx.set('x-transaction-id', id_transaction);
  ctx.state.id_transaction = id_transaction;

  const store = {
    headers: ctx.headers,
    id_transaction,
    request: {
      ip: ctx.ip,
    },
  };

  await asyncStorage.run(store, next);
};
export const routeSummaryLog = koa_log({
  logger: log.koa,
  json: false,
  logWith: (ctx) => {
    const log_with = {
      id_transaction: ctx.state.id_transaction,
      result: ctx.body,
    };

    if (ctx.response.status >= 400) {
      log_with.stack = ctx.state.stack;
      log_with.request_headers = ctx.request.headers;
      log_with.request_body = ctx.request.body;
      log_with.message = ctx.response.message;
    }

    return log_with;
  },
  exclude: (ctx) => process.env.MODE === 'test' || ctx.path.includes('healthcheck') || path.extname(ctx.path),
});
const ROUTES_SKIP_LOG = ['healthcheck'];

export const logIncomingCall = async (ctx, next) => {
  const pathname = ctx.request.originalUrl.split('?')[0];
  if (ROUTES_SKIP_LOG.some((exclude) => pathname.includes(exclude))) return await next();

  const http_method = ctx.request.method;
  const input_params = getContextParams(ctx);
  log.info(`Started ${http_method} for ${pathname}`, ...input_params);

  await next();

  log.info(`End ${http_method} for ${pathname}`);
};
export const routeToFunction = (func) => async (ctx) => {
  const args = getContextParams(ctx);
  ctx.state.args = args;

  const body = await func(...args);

  if (body._http_code) {
    ctx.status = body._http_code;
    delete body._http_code;
  }

  ctx.body = body;
};
const getContextParams = (ctx) => {
  const args = [
    {
      files: ctx.request.files,
      ...ctx.request.query,
      ...ctx.request.body,
      ...ctx.request.params,
    },
  ];
  return args;
};
initAsyncStorage: creates the id_transaction with a random 10-character alphanumeric code and sets up the asyncStorage store for the request.
routeSummaryLog: is in charge of making a log at the end of an API call flow. It includes important details like the response payload, the id_transaction, and, in case of an error, the stack trace along with other request information, helpful for debugging.
logIncomingCall: creates a log before executing the entire flow, capturing the HTTP verb, the called URL, and all passed parameters, whether in the body for a POST or in the querystring.
routeToFunction: simplifies parameter handling. The wrapped function receives a single object merging payload, querystring, and URL parameters. It also prepares the response to be sent back to the client.
getContextParams: helper for retrieving parameters from the HTTP request.
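To see routeToFunction's contract in isolation, here is a stripped-down re-implementation run against a plain object standing in for Koa's ctx. The createNewsletter controller is made up for this example:

```javascript
// Stripped-down routeToFunction: merges the request inputs into one object
// and lets the controller opt into a custom status via _http_code.
const routeToFunction = (func) => async (ctx) => {
  const args = [{ ...ctx.request.query, ...ctx.request.body, ...ctx.request.params }];
  const body = await func(...args);
  if (body._http_code) {
    ctx.status = body._http_code;
    delete body._http_code;
  }
  ctx.body = body;
};

// Hypothetical controller that wants to answer 201 Created.
const createNewsletter = async ({ description }) => ({ _http_code: 201, description });

const ctx = { request: { query: {}, body: { description: 'Implementing' }, params: {} } };
await routeToFunction(createNewsletter)(ctx);
console.log(ctx.status, ctx.body); // 201 { description: 'Implementing' }
```

The spread order means URL params win over body fields, which win over querystring fields, when names collide.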
7) router.js
import Router from '@koa/router';
import {healthcheck} from "./api/healthcheck.js";
import {routeToFunction} from "./middlewares.js";
import {throw422, throw500} from "./controllers/errors.js";
const router = new Router();
router.get('/healthcheck', routeToFunction(healthcheck));
router.get('/errors/422', routeToFunction(throw422));
router.get('/errors/500', routeToFunction(throw500));
export default router;
Creates and exports a router instance, registering all available APIs. Depending on the project's complexity, I create either a single router or one per model. Generally, I prefer simplicity over code fragmentation.
You can download all the code shown directly from my Github repository: https://github.com/marcomoauro/node-backend-template
2) ❓Why Koa.js
When I started working in Node.js, the first application I managed used Express, still one of the most widely used frameworks today. Koa.js is an evolution of it: it was created by the same team behind Express, who designed it to take advantage of more advanced JavaScript features such as async/await.
Koa allows you to write middleware in a cleaner and more readable way than Express, especially when using async/await. Because of its async/await-based approach, Koa can be considered more modern and scalability-oriented.
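The "cleaner middleware" claim comes from Koa's onion model: each middleware awaits next() and can run code after all downstream middleware has finished. This toy compose (not Koa's actual implementation, which lives in the koa-compose package) shows the flow:

```javascript
// Toy version of Koa-style middleware composition: dispatch walks the chain,
// and awaiting next() lets code placed after it run on the way back out.
const compose = (middlewares) => async (ctx) => {
  const dispatch = async (i) => {
    if (i < middlewares.length) await middlewares[i](ctx, () => dispatch(i + 1));
  };
  await dispatch(0);
};

const timing = async (ctx, next) => {
  ctx.trace.push('timing:in');
  await next();                 // everything downstream runs here
  ctx.trace.push('timing:out'); // runs after the handler finished
};

const handler = async (ctx) => {
  ctx.trace.push('handler');
  ctx.body = 'ok';
};

const ctx = { trace: [] };
await compose([timing, handler])(ctx);
console.log(ctx.trace); // [ 'timing:in', 'handler', 'timing:out' ]
```

This is exactly the shape the template relies on: the try-catch middleware in index.js wraps everything registered after it, and routeSummaryLog can log the final response because it runs on the way back out.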
I also studied Fastify, and even bought their book, although I never used it in a big project. On one hand, I found it fascinating because it provides all the features needed for a backend server. On the other hand, its structure and hidden complexity make me less enthusiastic, since I love understanding how things work. I also talk about this aspect in another post.
In general, Koa is a good choice today: benchmark comparisons show Fastify performing better in some tasks, but Koa still gives very good results.
3) 📦 External dependencies
These are the external dependencies of the template, specified in the package.json file:
@koa/cors: Cross-Origin Resource Sharing (CORS) middleware for Koa.
@koa/router: Router middleware for Koa.
axios: Promise-based HTTP client for the browser and Node.js; the best balance of complexity, ease of use, and overhead.
jsonwebtoken: library that allows the creation and validation of JWT tokens.
koa: HTTP middleware framework for node.js.
koa-better-json: Koa middleware that returns a JSON-encoded response with some improvement over koa-json.
koa-better-log: after handling an HTTP request, it logs useful information as we have seen in the middlewares.js file.
koa-body: middleware parser for koa.
lodash: makes JavaScript easier by taking the hassle out of working with arrays, numbers, objects, strings, etc.
luxon: the best library for working with dates in JavaScript, supports timezones.
memoizee: Complete in-memory cache solution for JavaScript.
nano-id: A tiny, secure, URL-friendly, unique string ID generator for JavaScript.
nodemailer: Easy e-mail sending from your Node.js applications.
nodemon: tool that helps develop Node.js based applications by automatically restarting the node application when file changes in the directory are detected.
p-queue: Promise queue with concurrency control, useful for rate-limiting async (or sync) operations.
pg-monitor: Events monitor for pg-promise, I use it to log queries with my own format.
pg-promise: PostgreSQL interface for Node.js.
4) 🐘 Why Postgres
In my personal projects, I've always used MySQL as a database. However, limitations like transaction handling and the inability to create materialized views and partial indexes led me to switch to Postgres.
On the usage side, it differs a bit from MySQL. It's more low-level, especially when dealing with auto-increment values via sequences and with triggers for timestamp fields like "updated_at."
Another reason I prefer it is the cost of hosting a Postgres database on Heroku: for $9 a month you can write up to 10M records, compared to MySQL's one million at the same price.
Stay tuned for an upcoming post where I'll share how I use Heroku to deploy my projects, create databases, and more!
5) 🚀 Let's implement new API
Let's finally come to the practical part!
We will create a new API that retrieves records for a new model, Newsletter.
1) Create .env file
This is the file that will contain all the environment variables of the application in the development environment.
Create the .env file based on the env_template, the only env to configure is DATABASE_URL.
MODE=development
NODE_ENV=production
PORT=80
DATABASE_URL=
* Why NODE_ENV=production? Read this.
2) Table definition
We create a _newsletters_template table with the key field id, a description, and the creation and update timestamps. This is the definition:
create table _newsletters_template
(
    id          bigserial primary key,
    description varchar(255) null,
    created_at  timestamp default CURRENT_TIMESTAMP not null,
    updated_at  timestamp default CURRENT_TIMESTAMP not null
);
3) Model Newsletter.js file
import log from '../log.js'
import db from '../database.js'
import {APIError404} from "../errors.js";

export default class Newsletter {
  id
  description

  constructor(properties) {
    Object.keys(this)
      .filter((k) => typeof this[k] !== 'function')
      .map((k) => (this[k] = properties[k]))
  }

  static fromDBRow = (row) => {
    const newsletter = new Newsletter({
      id: row.id,
      description: row.description,
    })
    return newsletter
  }

  static get = async (id) => {
    log.info('Model::Newsletter::get', {id})

    const row = await db.oneOrNone(`
      select *
      from _newsletters_template
      where id = $1
    `, [id]);

    if (!row) throw new APIError404('Newsletter not found.')

    const newsletter = Newsletter.fromDBRow(row)
    return newsletter
  }
}
This class has a static get method that queries the database to fetch the record with the provided id. If it's not found, a 404 error is raised. If the object exists, we create an instance for application use (a DTO) by passing the query result to the static fromDBRow method.
4) Controller newsletter.js file
import log from "../log.js";
import Newsletter from "../models/Newsletter.js";
import {APIError500} from "../errors.js";

export const getNewsletter = async ({id}) => {
  if (!process.env.DATABASE_URL) throw new APIError500('env variable `DATABASE_URL` not set.')

  id = parseInt(id)

  log.info('Controller::newsletters::getNewsletter', {id})

  const newsletter = await Newsletter.get(id)
  return newsletter
}
We create the controller, which exports the getNewsletter method used directly in the router.js file.
Since the id parameter is defined as part of the URL, Koa hands it over as a string, so a cast to Int is required before passing it to the database.
If the newsletter is found, the controller sends it to the client and Koa handles converting it to JSON. If not found, an APIError404 exception is raised, resulting in the client getting a response with an HTTP 404 code.
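The string-vs-number detail behind that cast is easy to verify in isolation:

```javascript
// Route params such as :id always arrive from the router as strings.
const raw_id = '42'; // what ctx.request.params.id would contain
console.log(typeof raw_id);        // string
console.log(raw_id === 42);        // false: strict equality fails without a cast
const id = parseInt(raw_id, 10);
console.log(typeof id, id === 42); // number true
```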
5) Create new api in router.js file
In router.js, we create a new API by adding this snippet:
import {getNewsletter} from "./controllers/newsletters.js";
router.get('/newsletters/:id', routeToFunction(getNewsletter));
You can download all the code shown directly from my Github repository: https://github.com/marcomoauro/node-backend-template
6) Test new api
We are done!
We can start the server by launching:
yarn serve:development
We go to http://localhost/newsletters/1: if the record exists in the database, you'll find the object in the response; otherwise, a 404 error is returned.
I deployed my version of the template on Heroku; here are the links to try all the APIs:
https://node-backend-template-0a553f134efa.herokuapp.com/newsletters/1
https://node-backend-template-0a553f134efa.herokuapp.com/newsletters/2
https://node-backend-template-0a553f134efa.herokuapp.com/errors/422
https://node-backend-template-0a553f134efa.herokuapp.com/errors/500
Stay tuned for an upcoming post where I'll share how I use Heroku to deploy my projects, create databases, and more!
And that’s it for today! I hope you find this episode useful in your work or personal projects.
If you are finding this newsletter valuable, consider doing any of these:
🍻 Read with your friends — Implementing lives thanks to word of mouth. Share the article with someone who would like it.
📣 Provide your feedback - We welcome your thoughts! Please share your opinions or suggestions for improving the newsletter; your input helps us adapt the content to your tastes.
I wish you a great day! ☀️
Marco