Build your own REST API with Node, Express, Knex and PostgreSQL — part 3
23 Sep 2019 - John
If you’re new to the series, click here for part 1 and part 2. So far we’ve taken care of our database schema, but we haven’t done much elsewhere, so our API doesn’t really do anything yet. Let’s change that. Today we’re going to start adding the routes that will become the endpoints of our API.
Adding the development dependencies
We are about to make a number of changes to our API, and restarting it by hand after every change would be a hassle, so we’ll use nodemon to take care of that. We’re also adding cors to be able to handle requests coming from anywhere, as well as chai, mocha and supertest to write our tests. Note that cors is needed by the app at runtime, not just during development, so it belongs with the regular dependencies. Fire up a terminal and enter:
npm i cors
npm i -D supertest mocha nodemon chai
The -D switch is shorthand for --save-dev, which adds the listed packages to the devDependencies group in your package.json file. Now open your package.json file and add the following to the scripts section, just after start:
"dev": "nodemon",
"test": "(dropdb --if-exists api-server-test && createdb api-server-test) && NODE_ENV=test mocha --exit"
Don’t forget to place a comma after start!
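With those scripts in place, you can start the auto-reloading server or run the test suite straight from the terminal:

```shell
# Start the API with nodemon; it restarts on every file change
npm run dev

# Recreate the test database, then run the mocha suite
npm test
```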
Now create three new folders in your project root, called db, api and test:
db folder:
The db folder will contain additional configuration for our database as well as the queries needed to retrieve data from it. Inside, create a file called knex.js with the following content:
const environment = process.env.NODE_ENV || 'development';
const config = require('../knexfile');
const environmentConfig = config[environment];
const knex = require('knex');
const connection = knex(environmentConfig);
module.exports = connection;
Then, still in the db folder, create another file called dbqueries.js and put the following into it:
const knex = require('./knex');

module.exports = {
  getAll(table) {
    return knex(table);
  }
};
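Later parts of the series will add more queries here. As a sketch of how this file can grow, a single-row lookup could look like this (getOne is a hypothetical name of my own, not something defined in the series yet):

```javascript
const knex = require('./knex');

module.exports = {
  // Equivalent of SELECT * FROM <table>
  getAll(table) {
    return knex(table);
  },
  // Equivalent of SELECT * FROM <table> WHERE id = <id> LIMIT 1
  // (hypothetical helper, shown only as an illustration)
  getOne(table, id) {
    return knex(table).where('id', id).first();
  }
};
```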
api folder:
The api folder will contain the routes we’ll be using. Inside, create three files called users.js, posts.js and comments.js. Paste the following into each one of them:
const express = require('express');
const router = express.Router();
const queries = require('../db/dbqueries');

router.get('/', (req, res) => {
  queries.getAll('users').then(users => {
    res.json(users);
  });
});

module.exports = router;
Of course, change ‘users’ to ‘posts’ and ‘comments’ in their respective files, minding the quotation marks.
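For example, after that substitution posts.js reads:

```javascript
const express = require('express');
const router = express.Router();
const queries = require('../db/dbqueries');

// GET /posts: returns every row in the posts table as JSON
router.get('/', (req, res) => {
  queries.getAll('posts').then(posts => {
    res.json(posts);
  });
});

module.exports = router;
```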
test folder:
The test folder will contain our, you guessed it, tests. Create three files inside called users.test.js, posts.test.js and comments.test.js and leave them blank for now.
Final touches on app.js
Go back to your app.js file and add const cors = require('cors'); on a new line after const app = express();. It should be near the top of the file. You might also want to change those old-school var declarations to const to keep things tidy, but it’s not strictly necessary. Now delete these two lines:
var indexRouter = require('./routes/index');
var usersRouter = require('./routes/users');
and replace them with:
const users = require('./api/users');
const posts = require('./api/posts');
const comments = require('./api/comments');
Also, add cors to express:
app.use(cors());
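The three routers imported above also need to be mounted, or the endpoints won’t respond. In the generated app.js, this means replacing the old app.use('/', indexRouter); and app.use('/users', usersRouter); registrations further down with:

```javascript
// Mount each router on its own path
app.use('/users', users);
app.use('/posts', posts);
app.use('/comments', comments);
```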
We also don’t need a view engine or to render anything, so delete these lines:
// view engine setup
app.set('views', path.join(__dirname, 'views'));
app.set('view engine', 'jade');
Then change your error handler to output JSON instead of rendering anything on screen:
app.use(function (err, req, res, next) {
  res.status(err.status || 500);
  res.json({
    message: err.message,
    error: req.app.get('env') === 'development' ? err : {}
  });
});
What in the world did we just do?
We made quite a few changes, so let’s go through them:
- The knex.js file is how we let Knex know which environment it’s working in. When we deploy to Heroku, the NODE_ENV variable is automatically set to production and will be detected here. We then import our knexfile and export the resulting connection so it’s available everywhere.
- The dbqueries.js file is where the bulk of the database action is going to happen, and it will work closely with our routes. Anything we need to get from the database will be asked for here; for the moment we just have a simple helper that is the equivalent of a SELECT * FROM query.
- Then we have our api folder. Every file in this folder corresponds to an endpoint in our API, so that means we have /users, /posts and /comments available. We will add more routes to them later, but for the moment, notice how we’re importing the queries file at the top? That’s what gives us access to our queries, and it’s what I meant by the dbqueries file "working closely with our routes". Each route then has a simple get handler at its root level that performs our query from the dbqueries file and returns the result formatted as JSON.
- Finally, we set up our app.js file to use our routes and got rid of the predefined ones. We also got rid of the view engine, because we only need to output JSON, which is why we modified the error handler to do the same. Lastly, we added cors. It is very important for you to know that enabling cors this way is potentially dangerous, since it allows requests coming from absolutely anywhere to work with your API. In the real world, you would configure your API to respond ONLY to requests coming from specific, trusted origins and ignore everything else. Since we’re still in the development environment, we’ll stick with the open cors setup for now.
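When you do lock things down, the cors middleware accepts an options object; restricting by origin can be as simple as this sketch (the URL is a placeholder, not something from this project):

```javascript
const express = require('express');
const cors = require('cors');

const app = express();
// Only answer cross-origin requests from this one origin
// (placeholder URL; substitute your real front-end address)
app.use(cors({ origin: 'https://my-frontend.example.com' }));
```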
Writing the tests
This is the final piece of the puzzle. Every time we add new functionality to our API, we need to write tests for it. Let’s write one!
In your test folder, open the users.test.js file you created earlier and paste this at the top:
const request = require('supertest');
const expect = require('chai').expect;
const knex = require('../db/knex');
const app = require('../app');
We are importing our database files and any dependencies needed for our tests. Now all we need to do is call the describe function and give it a before hook as its first task:
describe('Testing users', () => {
  before((done) => {
    knex.migrate.latest()
      .then(() => {
        return knex.seed.run();
      }).then(() => done());
  });
});
This will run every time we run our tests. If you check the package.json file, you’ll see the test script deletes the test database if it exists and creates a fresh one on every run. So our first action here is to run the migrations on the test database and then seed it.
Now our first test will check that we actually get what we set out to deliver, so we’ll write a quick test for the root path of the /users route:
it('Lists all users', (done) => {
  request(app)
    .get('/users')
    .set('Accept', 'application/json')
    .expect('Content-Type', /json/)
    .expect(200)
    .then((response) => {
      expect(response.body).to.be.an('array');
      done();
    }).catch(done);
});
Keep in mind that this still goes inside our describe call, right after the before hook and before the final closing braces, so the file should look like this:
describe('Testing users', () => {
  before((done) => {
    knex.migrate.latest()
      .then(() => {
        return knex.seed.run();
      }).then(() => done());
  });

  it('Lists all users', (done) => {
    request(app)
      .get('/users')
      .set('Accept', 'application/json')
      .expect('Content-Type', /json/)
      .expect(200)
      .then((response) => {
        expect(response.body).to.be.an('array');
        done();
      }).catch(done);
  });
});
Now all you need to do is run npm test, and if everything went well, you should see something similar to this:
> api-tutorial@1.0.0 test /home/john/Development/post
> (dropdb --if-exists api-server-test && createdb api-server-test) && NODE_ENV=test mocha --exit
Testing users
GET /users 200 10.777 ms - 123
✓ Lists all users (4ms)
1 passing (105ms)
If you have issues at this point, try deleting both the development and testing databases and run the test again.
Finally, to see it in action, go back to the console and type nodemon, then fire up Postman and make GET requests to http://localhost:3000/users, http://localhost:3000/posts and http://localhost:3000/comments. If everything went well, you should see the data we entered in the seed files formatted as JSON.
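If you prefer the terminal over Postman, curl will show the same JSON:

```shell
curl http://localhost:3000/users
curl http://localhost:3000/posts
curl http://localhost:3000/comments
```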
And that’s it for today! Stay tuned for new functionality in the next part.