A simple but reliable and well-structured address book backend API. It allows users to manage their personal information and store as many contacts as they want :)
Running the tests is straightforward: everything is already configured, and by default an embedded in-memory MongoDB server and an embedded Firebase server will be started. You should be able to run the tests by executing the following commands (just make sure you have npm installed globally and that you've installed the dependencies; note that some libraries may require Python):
```sh
npm install
npm run test
```
To run the app locally, you must have a MongoDB instance running on your machine, or on a server that you can reach from your local machine. You can set your MongoDB settings by changing the config file corresponding to your `NODE_ENV` environment variable. By default this is `config/development`, unless you have set `NODE_ENV` to something else or are executing the tests, which changes it to `test`. After adjusting the config file for your environment, you should be able to run:
```sh
npm install
npm start
```
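For illustration, a `config/development.json` overriding the MongoDB connection might look like the sketch below (the `mongodb.uri` property name comes from the configuration mapping described further down; the actual file in this repository may differ):

```json
{
  "mongodb": {
    "uri": "mongodb://localhost:27017/addressbook"
  }
}
```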
For more details on how the config library works, please check the node-config documentation.
You can override some of the configuration described in `config/default.json` by using its corresponding environment variable defined in `config/custom-environment-variables.json`. For example, the `mongodb.uri` property corresponds to the `MONGODB_URI` environment variable. Below is an example of this technique:
```sh
npm install
MONGODB_URI=mongodb://localhost:27017/addressbook npm start
```
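For reference, the mapping file that enables this override would contain something along these lines, following node-config's custom-environment-variables format (a sketch consistent with the `mongodb.uri` to `MONGODB_URI` mapping described above):

```json
{
  "mongodb": {
    "uri": "MONGODB_URI"
  }
}
```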
I've prepared a Dockerfile and a docker-compose file that should run without any further configuration, unless you have another container or service running on port `3000`, which is what I defined for the `app` service. If so, I suggest changing the docker-compose port mapping of the `app` service before running it. Assuming you have docker and docker-compose installed on your machine, run the commands below:
```sh
docker-compose build
docker-compose up
```
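If port `3000` is already taken, the mapping can be changed to something like the snippet below (a sketch, assuming the service is named `app` as mentioned above; the left-hand side is the host port):

```yaml
services:
  app:
    ports:
      - "8080:3000" # expose the container's port 3000 on host port 8080
```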
I tried as much as possible to separate responsibilities across modules/layers. For instance, I assumed that only the `models` layer should know how to manage, change, customize, and validate a given schema object, and that it should only send data to the `db` layer once the data is valid. The `db` layer, on the other hand, should be concerned only with knowing how to communicate with the database and with obeying the "implicit" contract of returning an `id` property in every saved object. Last but not least, I assumed that the `middleware` and `routes` layers should avoid importing and using the `db` layer directly, as it doesn't perform any sort of validation; they should rely on the `models` module instead.
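To make the layering concrete, here is a minimal sketch of how a model module could sit between the routes and the `db` layer. The file names and the `db.users.insert` function are hypothetical; only the Joi-based validation and the `id` contract come from this README.

```js
// models/user.js — illustrative sketch, not the project's actual code
const Joi = require('joi');
const db = require('../db'); // hypothetical import path

const userSchema = Joi.object({
  email: Joi.string().email().required(),
  password: Joi.string().required(),
});

async function create(user) {
  // Only the models layer validates; data reaches the db layer once it is valid.
  const value = await userSchema.validateAsync(user);
  // The db layer's "implicit" contract: every saved object comes back with an `id`.
  return db.users.insert(value); // hypothetical db function
}

module.exports = { create };
```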
| Folder / File | Description |
| --- | --- |
| `app.js` | Configures the app and adds all routes. |
| `index.js` | Imports the app and starts an HTTP server. |
| `/routes` | Everything related to HTTP mapping and express endpoints. |
| `/middleware` | All middlewares used in express routes. |
| `/models` | All data schema validation and management. |
| `/db` | All database communication, queries, and configuration. |
| `/logger` | Centralizes a common logger to use throughout the application. |
| `/log` | The logger's default output folder. It's where you should expect logs. |
The following endpoints are listed here just as a brief overview. There is full API documentation, built following the OpenAPI Specification, in the `docs/` folder. The `docs/documentation.html` file is a zero-dependency HTML file that specifies every route along with every possible response sample and status.
Request sample:
```json
{
  "email": "[email protected]",
  "password": "test"
}
```
Response sample:
```json
{
  "id": "5c89b520aaf712001020e57c",
  "email": "[email protected]"
}
```
Request sample:
```json
{
  "email": "[email protected]",
  "password": "test"
}
```
Response sample:
```json
{
  "accessToken": "eyJhbGciOiJIUzI1NiIsInR5cCI6IkpXVCJ9.eyJ1c2VyIjp7ImlkIjoiNWM4OWI3NDRjZGM2YjMwMDExNzhmMjBlIiwiZW1haWwiOiJlbWFpbEBnbWFpbC5jb20ifSwiaWF0IjoxNTUyNTI5MjIyLCJleHAiOjE1NTI1MzI4MjIsImlzcyI6ImFkZHJlc3Nib29rLWFwaSJ9.P3Tsw4LId8Tutm0lhWRv3mIPKhiRCj5_Hd9Wq-xo3IY",
  "tokenType": "bearer"
}
```
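For instance, assuming the API is running locally on the default port `3000` (as in the docker-compose setup), a token could be obtained with a request like:

```sh
curl -X POST http://localhost:3000/login \
  -H "Content-Type: application/json" \
  -d '{"email": "[email protected]", "password": "test"}'
```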
The following endpoint requires an `Authorization` header with the `Bearer <token>` obtained from `/login`.
Response sample:
```json
{
  "id": "5c89b520aaf712001020e57c",
  "email": "[email protected]"
}
```
These endpoints require an `Authorization` header with the `Bearer <token>` obtained from `/login`.
Response sample:
```json
{
  "id": "5c89b744cdc6b3001178f20e",
  "name": "Custom Name",
  "email": "[email protected]"
}
```
Request sample:
```json
{
  "name": "Custom Name",
  "email": "[email protected]",
  "password": "newPassword"
}
```
Response sample:
```json
{
  "id": "5c89b744cdc6b3001178f20e",
  "name": "Custom Name",
  "email": "[email protected]"
}
```
Request sample:
```json
{
  "email": "[email protected]"
}
```
Response sample:
```json
{
  "id": "5c89b744cdc6b3001178f20e",
  "name": "Custom Name",
  "email": "[email protected]"
}
```
Response sample:
```json
{
  "id": "5c89b744cdc6b3001178f20e",
  "name": "Custom Name",
  "email": "[email protected]"
}
```
The following endpoint requires an `Authorization` header with the `Bearer <token>` obtained from `/login`.
Request sample:
```json
{
  "name": "A contact name",
  "email": "[email protected]",
  "address": {
    "street": "St Louis 1",
    "zipCode": "350120"
  }
}
```
Response sample:
```json
{
  "id": "-L_tz2-qfEMaHUeM9NsG",
  "name": "A contact name",
  "email": "[email protected]",
  "address": {
    "street": "St Louis 1",
    "zipCode": "350120"
  }
}
```
During the development process I faced some challenges and decisions, e.g. whether or not to use a given library, how to handle configuration, how to handle schema validations, etc. I'll try to summarize some of them in this section.
I had to decide whether to use an ORM/ODM or not, and also whether to use a relational or a non-relational database. I took into account that an address book API wouldn't need much concern about data integrity or atomicity, and eventual consistency wouldn't be a problem; that was one of the reasons I opted for MongoDB.
After deciding on MongoDB, I considered using an ODM such as mongoose; after all, they have features that facilitate development, for instance schema validation, pre-save hooks, etc. But I concluded it wouldn't be good to tie schema validation to a specific database library, because this project also uses other databases; for instance, the `contact` model is stored in Firebase rather than in MongoDB. So I opted to use the native mongo driver to connect to the database and to find a separate library for schema validation.
As described in the previous section, Database and ORM/ODM, I decided to use a dedicated library for schema validation rather than an ODM's built-in validation. I chose Joi; in this case, my decision was driven more by its popularity and documentation than by any specific feature.
The PATCH Method for HTTP specification doesn't describe the exact payload format that should be sent to apply partial changes; it only says the payload should be a description of changes. So I had to decide between the JSON Patch notation and JSON Merge Patch. For the sake of simplicity, I chose the latter, which, in simple terms, lets you use the same resource schema by omitting the properties that won't change and declaring only the properties you intend to modify.
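For example, to change only a user's name while leaving the email and password untouched, the merge-patch body would simply be:

```json
{
  "name": "New Name"
}
```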
A question that arose while I was designing the API routes was whether it is a good approach to use an endpoint whose results change according to the authenticated user. For the contacts endpoints, I had to decide whether to use a `/contacts` endpoint (differentiating by token) or a `/users/{userId}/contacts` endpoint. I found a Stack Overflow topic discussing something similar to my question: Is an API endpoint that differentiates what resources to return based on user credentials RESTful and good URI design? I liked the approach addressed in this response, so I decided to follow a similar pattern, using the resource id explicitly in the URL and validating whether the user has permission to access it.
It isn't great to manage all configuration settings throughout the code using environment variables; as described in eslint-no-process-env, it can lead to maintenance issues since it's another kind of global dependency. I could have solved that by creating a module to centralize all env variables, but I decided to do some googling for a good library instead. I found dotenv and node-config, and I chose the latter since it allows defining hierarchical configuration and default values.
Initially, I used the debug library, but it is intended for use only in development; it's not really a logging library, and it lacks features such as saving to a file, logging levels, etc. So, after a while, I switched to winston for its popularity in the community and for features such as multiple logging transports.
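A minimal winston setup in the spirit of this project's `/logger` module might look like the sketch below (the file name and log path are assumptions based on the `/log` output folder mentioned earlier):

```js
// logger/index.js — illustrative sketch, not the project's actual code
const winston = require('winston');

// Logs go both to the console and to the log/ folder described earlier.
const logger = winston.createLogger({
  level: 'info',
  transports: [
    new winston.transports.Console(),
    new winston.transports.File({ filename: 'log/app.log' }),
  ],
});

module.exports = logger;
```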
I felt the need to standardize error messages and error handling, so I looked for a good library and found Boom (HTTP-friendly error objects), which helps a lot in keeping the same pattern across all thrown errors. Alongside that, I decided to create a "global uncaught error handler": the errorHandler middleware.
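Such a handler can be quite small when Boom is involved. Here is a sketch (not the project's actual implementation) of an express error-handling middleware built around Boom:

```js
// middleware/errorHandler.js — illustrative sketch
const Boom = require('@hapi/boom'); // or require('boom') in older versions

// Express treats a four-argument middleware as an error handler.
function errorHandler(err, req, res, next) {
  // Wrap non-Boom errors so every response follows the same shape.
  const boomErr = Boom.isBoom(err) ? err : Boom.boomify(err);
  res.status(boomErr.output.statusCode).json(boomErr.output.payload);
}

module.exports = errorHandler;
```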
For this case I needed a stateless strategy, so I opted for JWT. I found it unnecessary to use a third-party library to authenticate users, so I wrote a middleware myself that verifies the Authorization token header.
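A sketch of such a middleware is shown below, assuming the jsonwebtoken package and an HS256 secret read via node-config (the `jwt.secret` config key is hypothetical). The `user` claim matches the token payload shown in the login response sample above.

```js
// middleware/authenticate.js — illustrative sketch
const jwt = require('jsonwebtoken');
const Boom = require('@hapi/boom');
const config = require('config');

function authenticate(req, res, next) {
  const [scheme, token] = (req.headers.authorization || '').split(' ');
  if (scheme !== 'Bearer' || !token) {
    return next(Boom.unauthorized('Missing or malformed Authorization header'));
  }
  try {
    // `jwt.secret` is a hypothetical config key for this sketch.
    const payload = jwt.verify(token, config.get('jwt.secret'));
    req.user = payload.user; // the token carries a `user` claim, as in the sample above
    return next();
  } catch (err) {
    return next(Boom.unauthorized('Invalid or expired token'));
  }
}

module.exports = authenticate;
```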
I had to choose between koa and express. Even though I wanted to use koa, I opted for express in this project due to my previous experience with it.
I opted for jest rather than the well-known mocha because jest has everything built in, from matchers to mocks.
- Increase test coverage.
- Move the unique user email constraint into MongoDB to avoid race conditions that may occur in parallel requests.
- Separate jest environments for unit testing and integration testing.
- Configure the setup/teardown of the embedded mongo and firebase servers to run only in the integration test environment. Currently they run on every test execution.
- Improve Firebase-side security by adding some sort of validation/authentication to the data sent to Firebase and restricting read/write access to each user's contacts collection.
- Upgrade to ES6 features (import and export). This may require installing the babel transpiler and configuring deployment to use the output dist folder.
- Move the node files into a src/ folder.
- Change the JWT algorithm to RS256 and use private/public keys.
- Implement the remaining contacts management endpoints.
- Define a versioning style and the project license type.
- Refactor tests to reuse code.