A scalable, modular structure for your GraphQL Node.js API

Creating a GraphQL API in Node.js is doable, but setting it up right is tricky. In this article we’re going to answer the following questions:

  • What GraphQL library should I use? Apollo vs Express GraphQL vs GraphQL Yoga?
  • How do I set up my code in a modular way, so it doesn’t get messy once the project grows?
  • How do I handle authentication? Do I use directives or wrap resolver functions?
  • How do I write tests?

This is exactly what you are going to learn in this article so let’s get started.

What are we going to build?

This article is pretty advanced and requires you to know at least the basics of:

  • Node.js + ES6 JavaScript
  • MongoDB / Mongoose
  • Basic GraphQL

If you feel lazy and just wanna skip to the code, here is the finished project: https://github.com/mikevercoelen/codersmind-scalable-graphql-node-api

What GraphQL Node.js library to pick?

When you create a GraphQL API in Node.js, the first tough decision you face is which GraphQL library to use.

TL;DR: go for Apollo Server. It has a cooler playground editor, great documentation, is commercially backed, is more popular at the time of writing (442 Stack Overflow questions for Apollo vs 144 for Express GraphQL) and lets you use GraphQL template literals.

The main difference between express-graphql and Apollo Server is how you define your schemas and types: express-graphql uses programmatic definitions, while Apollo Server uses GraphQL template literals.
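
To make that concrete, here’s a toy “hello world” schema in both styles (an illustration only, not part of our project):

// express-graphql style: build the schema programmatically with graphql-js
const { GraphQLSchema, GraphQLObjectType, GraphQLString } = require('graphql')

const schema = new GraphQLSchema({
  query: new GraphQLObjectType({
    name: 'Query',
    fields: {
      hello: {
        type: GraphQLString,
        resolve: () => 'world'
      }
    }
  })
})

// Apollo Server style: describe the schema in SDL with the gql template literal
const { gql } = require('apollo-server-express')

const typeDefs = gql`
  type Query {
    hello: String
  }
`

const resolvers = {
  Query: {
    hello: () => 'world'
  }
}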

If you really want to get serious with GraphQL, I highly recommend you try them both; they each work differently and in the end it’s a matter of preference.

There is another library you can try called GraphQL Yoga. It uses Apollo under the hood and acts a bit like create-react-app, but it’s a little too opinionated for my taste. I want freedom.

There are plenty of comparison resources online that can help you decide.

Setting up the boilerplate

Moving on, let’s get started by creating a directory and setting up the boilerplate files and folders structure:
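
.
├── .editorconfig
├── .env
├── .env.sample
├── .eslintrc.json
├── nodemon.json
├── package.json
└── src
    ├── app.js
    ├── config.js
    ├── index.js
    ├── directives/
    ├── models/
    ├── modules/
    ├── scalars/
    └── utils/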

You can create the above directories using the following command:

mkdir -p src src/directives src/models src/modules src/scalars src/utils

And set up the empty files to start with by using the following command:

npm init -y && touch .env .env.sample .editorconfig .eslintrc.json nodemon.json src/index.js src/config.js src/app.js 

Now let’s install all the dependencies:

npm install apollo-server-express express graphql bcrypt deepmerge dotenv jsonwebtoken mongoose throng --save

And all the development dependencies:

npm install eslint eslint-config-standard eslint-plugin-import eslint-plugin-node eslint-plugin-promise eslint-plugin-standard expect husky lint-staged mocha nodemon supertest --save-dev

Now let’s set up .editorconfig:

root = true

[*]
indent_style = space
indent_size = 2
charset = utf-8
trim_trailing_whitespace = true
insert_final_newline = true

And our .eslintrc.json:

{
  "env": {
    "es6": true,
    "node": true,
    "mocha": true
  },
  "extends": "standard",
  "rules": {
    "no-console": ["error"],
    "object-curly-spacing": ["error", "always"]
  }
}

We’re using eslint-config-standard, so we don’t use any semicolons (which I like; once you get used to it, they feel useless and a waste of key pressing effort). Feel free to make adjustments to the ESLint config. The key is to be consistent and to agree on good coding style guidelines with your team (and don’t forget: coding style discussions are very unproductive).

Now nodemon.json (to make our server auto-restart when we make changes):

{
  "ignore": [
    ".git",
    "node_modules",
    "package.json",
    "test"
  ],
  "ext": "js json",
  "watch": [
    "src",
    ".env"
  ]
}

Now let’s setup the scripts and make some changes to the package.json:

  ...  
  "scripts": {
    "dev": "nodemon src/index.js --exec \"node -r dotenv/config\"",
    "test": "NODE_ENV=mocha mocha -r dotenv/config --recursive './{,!(node_modules)/**}/*.test.js'"
  },
  "husky": {
    "hooks": {
      "pre-commit": "lint-staged",
      "pre-push": "npm test"
    }
  },
  "lint-staged": {
    "*.js": [
      "eslint --fix",
      "git add"
    ]
  },
  ...

At this point we have set up our scripts, a pre-commit hook that runs lint-staged (which runs eslint --fix on staged files) and a pre-push hook that runs the tests.

Now let’s set up our .env files, which help us easily configure our app. We’re using dotenv to load the .env file automatically (you can see it in our scripts: -r dotenv/config preloads dotenv).

The basic idea is simple: dotenv loads the .env file, which contains environment variables. You NEVER commit this file (so don’t forget to add it to your .gitignore file), because it can contain secrets. That means developers who freshly clone your project need to create an .env file with all the variables required for the application to run.
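
For example, a minimal .gitignore covering this setup could look like this (a sketch; yours will likely have more entries):

node_modules
.env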

So we need to set up an .env.sample file, which newcomers can copy by running cp .env.sample .env (bonus points if you document this in your README.md).

So .env.sample:

PORT=5001
MONGODB_URI=SET_ME
JWT_SECRET=SET_ME
JWT_LIFE_TIME=7d
WORKERS=1

Now run cp .env.sample .env and make sure you set MONGODB_URI to a working connection (locally perhaps), here is an example: mongodb://localhost:27017/test

Once you’ve made your changes, the .env file should look something like this:

PORT=5001
MONGODB_URI=mongodb://localhost:27017/my-app
JWT_SECRET=["=R{f6BzbU-W3hm
JWT_LIFE_TIME=7d
WORKERS=1

At this point, run the app (it will probably error out, but that’s OK):

npm run dev

Now let’s start coding everything.

First, src/config.js:

const PORT = process.env.PORT
const MONGODB_URI = process.env.MONGODB_URI
const WORKERS = process.env.WORKERS
const JWT_LIFE_TIME = process.env.JWT_LIFE_TIME
const JWT_SECRET = process.env.JWT_SECRET

module.exports = {
  PORT,
  MONGODB_URI,
  WORKERS,
  JWT_LIFE_TIME,
  JWT_SECRET
}

config.js is just a helper in the root of src that we can use throughout the app to access config variables.

Next the index.js and app.js files.

src/app.js:

const express = require('express')

// We use apollo-server-express (instead of apollo-server) because later on, for testing, we use Supertest, which requires an Express app object
const { ApolloServer } = require('apollo-server-express')

// we don't have these yet, but don't worry we'll get there.
const context = require('./utils/context')
const schema = require('./modules')

const server = new ApolloServer({
  schema,
  context: async ({ req }) => ({
    user: await context.getUser(req)
  })
})

const app = express()

server.applyMiddleware({
  path: '/',
  app
})

module.exports = app

src/index.js:

const throng = require('throng')
const mongoose = require('mongoose')
const url = require('url')
const app = require('./app')
const config = require('./config')

const mongoHost = new url.URL(config.MONGODB_URI).host

const startServer = async function () {
  const mongooseOptions = {
    useNewUrlParser: true,
    promiseLibrary: global.Promise
  }

  try {
    await Promise.all([
      mongoose.connect(config.MONGODB_URI, mongooseOptions),
      app.listen(config.PORT)
    ])

    // eslint-disable-next-line no-console
    console.log(`Server has started on port: ${config.PORT}, connected to mongo at ${mongoHost}`)
  } catch (error) {
    // eslint-disable-next-line no-console
    console.error(`Could not start the app: `, error)
  }
}

// Let's make Node.js clustered for better multi-core performance
throng({
  workers: config.WORKERS,
  lifetime: Infinity
}, startServer)

Next up, we’re going to set up the Mongoose models for our app. Our app will have two models: User and Book.

So go ahead and create the following files:

src/models/user.js:

const mongoose = require('mongoose')

const userSchema = new mongoose.Schema({
  firstName: {
    type: String,
    required: true
  },
  lastName: {
    type: String,
    required: true
  },
  email: {
    type: String,
    required: true
  },
  hashedPassword: {
    type: String,
    required: true
  },
  created: {
    type: Date,
    default: Date.now
  },
  changed: {
    type: Date,
    default: Date.now
  },
  lastActive: {
    type: Date
  }
})

const User = mongoose.model('User', userSchema)

module.exports = User

src/models/book.js:

const mongoose = require('mongoose')

const bookSchema = new mongoose.Schema({
  title: {
    type: String,
    required: true
  },
  createdBy: {
    type: mongoose.Schema.Types.ObjectId,
    ref: 'User',
    required: true
  },
  created: {
    type: Date,
    default: Date.now
  },
  changed: {
    type: Date,
    default: Date.now
  }
})

const Book = mongoose.model('Book', bookSchema)

module.exports = Book

Our Mongoose models are now set up, so let’s move on to the utils.

First we set up our token util, which we’re going to use later for JWT authentication.

src/utils/token.js:

const jwt = require('jsonwebtoken')
const config = require('../config')

const create = userId => new Promise((resolve, reject) => {
  jwt.sign({
    userId
  }, config.JWT_SECRET, {
    expiresIn: config.JWT_LIFE_TIME
  }, (error, token) => {
    if (error) {
      return reject(error)
    }

    resolve(token)
  })
})

const getDecodedToken = token => new Promise((resolve, reject) => {
  jwt.verify(token, config.JWT_SECRET, (error, decodedToken) => {
    if (error) {
      return reject(error)
    }

    if (!decodedToken.exp || !decodedToken.iat) {
      return reject(new Error(`Token had no 'exp' or 'iat' payload`))
    }

    resolve(decodedToken)
  })
})

module.exports = {
  create,
  getDecodedToken
}

Next is the context util. Apollo Server injects the context object into every resolver function; it’s a bit like Express middleware, where you have req and res objects, but here you have the freedom to put whatever you need into a shared context scope.

For more information about context, check out the Apollo Server documentation.

src/utils/context.js:

const tokenUtil = require('./token')
const User = require('../models/user')

const TOKEN_HEADER_NAME = 'x-token'

const getUser = async req => {
  if (!req) {
    return null
  }

  const tokenHeader = req.get(TOKEN_HEADER_NAME)

  if (!tokenHeader) {
    return null
  }

  try {
    const decodedToken = await tokenUtil.getDecodedToken(tokenHeader)
    return await User.findById(decodedToken.userId)
  } catch (error) {
    return null
  }
}

module.exports = {
  getUser
}
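
With this in place, clients authenticate by sending their JWT in the x-token header. For example, a hypothetical curl call against the server (assuming the default port from .env.sample and the me query we’ll define later in the auth module):

curl -X POST http://localhost:5001/ \
  -H 'Content-Type: application/json' \
  -H 'x-token: YOUR_JWT_HERE' \
  -d '{ "query": "{ me { id email } }" }'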

The last util we need is modules.js. In this file we’re going to export the function makeExecutableSchemaFromModules; this is an important function because it will be the core of our modular setup (more on that later).

src/utils/modules.js

const { gql, makeExecutableSchema } = require('apollo-server-express')
const deepmerge = require('deepmerge')

const directives = require('../directives')
const scalars = require('../scalars')

const globalTypeDefs = gql`
  type Query
  type Mutation
`

const makeExecutableSchemaFromModules = ({
  modules
}) => {
  let typeDefs = [
    globalTypeDefs,
    ...scalars.typeDefs,
    ...directives.typeDefs
  ]

  let resolvers = {
    ...scalars.resolvers
  }

  modules.forEach(module => {
    typeDefs = [
      ...typeDefs,
      ...module.typeDefs
    ]

    resolvers = deepmerge(resolvers, module.resolvers)
  })

  return makeExecutableSchema({
    typeDefs,
    resolvers,
    schemaDirectives: {
      ...directives.schemaDirectives
    }
  })
}

module.exports = {
  makeExecutableSchemaFromModules
}

Authentication by using directives

There are multiple ways to set up authentication in your Apollo app. One way is to wrap the resolvers of protected routes, which works but is not as clean as using directives.
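
For contrast, here’s roughly what the resolver-wrapping approach looks like (a sketch with a hypothetical requireAuth helper, not part of our project):

const { AuthenticationError } = require('apollo-server-express')

// Hypothetical helper: wraps a resolver and rejects unauthenticated requests
const requireAuth = resolver => (parent, args, context, info) => {
  if (!context || !context.user) {
    throw new AuthenticationError('Not allowed')
  }

  return resolver(parent, args, context, info)
}

// Usage: every protected resolver has to be wrapped by hand, e.g.
// Query: { book: requireAuth(book), books: requireAuth(books) }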

Why? Take a look at the following example:

In our app, we have 2 book queries: book and books. But we only want those queries to be possible if the user is authenticated, so these are protected routes.

Now by using directives, we can simply turn this:

  extend type Query {
    book(id: ID!): Book
    books: [Book]
  }

Into this:

  extend type Query {
    book(id: ID!): Book @isAuthenticated
    books: [Book] @isAuthenticated
  }

Which is simple and clean, but first we gotta make that isAuthenticated directive. So let’s set it up:

src/directives/is-authenticated.js

const { gql, SchemaDirectiveVisitor, AuthenticationError } = require('apollo-server-express')
const { defaultFieldResolver } = require('graphql')

const typeDef = gql`
  directive @isAuthenticated on FIELD_DEFINITION
`

class IsAuthenticatedDirective extends SchemaDirectiveVisitor {
  visitFieldDefinition (field) {
    const { resolve = defaultFieldResolver } = field

    field.resolve = async function (...args) {
      const context = args[2]

      if (!context || !context.user) {
        throw new AuthenticationError('Not allowed')
      }

      return resolve.apply(this, args)
    }
  }
}

module.exports = {
  typeDef,
  directive: IsAuthenticatedDirective
}

src/directives/index.js

const isAuthenticated = require('./is-authenticated')

module.exports = {
  typeDefs: [
    isAuthenticated.typeDef
  ],
  schemaDirectives: {
    isAuthenticated: isAuthenticated.directive
  }
}

DateTime Scalar

With GraphQL you can specify custom types, called scalars. Since our Mongoose models have date-time fields, which serialize to ISO date strings, we should add a DateTime scalar to make our API even cleaner.

src/scalars/date-time.js

const { GraphQLScalarType, Kind } = require('graphql')
const { gql } = require('apollo-server-express')

const typeDef = gql`
  scalar DateTime
`

const DateTime = new GraphQLScalarType({
  name: 'DateTime',
  description: 'A DateTime representation in ISO format',
  parseValue (value) {
    return value
  },
  serialize (value) {
    return value
  },
  parseLiteral (ast) {
    if (ast.kind === Kind.INT) {
      return new Date(ast.value)
    }

    return null
  }
})

module.exports = {
  typeDef,
  resolvers: {
    DateTime
  }
}

src/scalars/index.js

const DateTime = require('./date-time')

module.exports = {
  typeDefs: [
    DateTime.typeDef
  ],
  resolvers: {
    ...DateTime.resolvers
  }
}

Modules

Now it’s time to talk about modules. It can be tricky to set up your Apollo app in a way that’s modular and keeps all logic related to a subject (e.g. auth) bundled together.

The idea is simple: we have a modules folder that contains all the separate modules, plus a very important index file that glues everything together. The index file calls makeExecutableSchemaFromModules (the function we created earlier in our modules util), which does the module gluing.

makeExecutableSchemaFromModules is also responsible for loading the directives and scalars automatically.

Moving on, the index file of the modules directory:

src/modules/index.js

const { makeExecutableSchemaFromModules } = require('../utils/modules')

const auth = require('./auth')
const books = require('./books')

module.exports = makeExecutableSchemaFromModules({
  modules: [
    auth,
    books
  ]
})

As you can see, we have 2 modules: auth and books.

A module contains a schema and resolvers. The best way to wrap your head around the whole modules concept is to just write the code:

So let’s setup the auth module.

src/modules/auth/index.js

const { gql } = require('apollo-server-express')

// The schema (feel free to split these in a subfolder if you'd like)
const typeDefs = gql`
  extend type Query {
    me: User @isAuthenticated
  }
  
  extend type Mutation {
    login(
      email: String!,
      password: String!
    ): AuthData

    signup(
      email: String!,
      password: String!,
      firstName: String!,
      lastName: String!
    ): User
  }

  type AuthData {
    user: User
    token: String!
    tokenExpiration: String!
  }

  type User {
    id: ID!
    email: String!
    firstName: String!
    lastName: String!
  }
`

const resolvers = require('./resolvers')

module.exports = {
  // typeDefs is an array because it should be possible to split your schema if it grows too big; you can just export multiple here
  typeDefs: [
    typeDefs
  ],
  resolvers
}

Let’s set up the auth resolver index file:

src/modules/auth/resolvers/index.js

const me = require('./me')
const login = require('./login')
const signup = require('./signup')

const resolvers = {
  Query: {
    me
  },
  Mutation: {
    login,
    signup
  }
}

module.exports = resolvers

As you can see, we need to create the following resolver files: me, login and signup.

src/modules/auth/resolvers/me.js

const me = async (_, args, { user }) => ({
  ...user._doc,
  id: user.id
})

module.exports = me

An interesting thing to note: that { user } is coming from the context util we created earlier.

Next up is the login resolver:

src/modules/auth/resolvers/login.js

const { AuthenticationError } = require('apollo-server-express')
const tokenUtil = require('../../../utils/token')
const User = require('../../../models/user')
const bcrypt = require('bcrypt')
const config = require('../../../config')

const login = async (_, { email, password }) => {
  const user = await User.findOne({
    email
  })

  if (!user) {
    throw new AuthenticationError('User not found')
  }

  const isPasswordValid = await bcrypt.compare(password, user.hashedPassword)

  if (!isPasswordValid) {
    throw new AuthenticationError('Incorrect password')
  }

  const token = await tokenUtil.create(user._id)

  return {
    user: {
      ...user._doc,
      id: user._id
    },
    token,
    tokenExpiration: config.JWT_LIFE_TIME
  }
}

module.exports = login

And the signup resolver:

src/modules/auth/resolvers/signup.js

const { UserInputError } = require('apollo-server-express')
const User = require('../../../models/user')
const bcrypt = require('bcrypt')

const SALT_ROUNDS = 12

const signup = async (_, {
  email,
  password,
  firstName,
  lastName
}) => {
  const existingUser = await User.findOne({
    email
  })

  if (existingUser) {
    throw new UserInputError('User already exists')
  }

  const hashedPassword = await bcrypt.hash(password, SALT_ROUNDS)

  const user = await User.create({
    email,
    hashedPassword,
    firstName,
    lastName
  })

  return {
    ...user._doc,
    id: user._id,
    hashedPassword: null
  }
}

module.exports = signup

Now take a moment, pause, take your dog for a walk and when you feel ready, walk through the code. Once you get modules, and how it is all stitched together, it’s time to continue 🙂

As an exercise, you should create the books module by yourself. What we want: a user should be able to get a single book, get multiple books and create a book. Don’t forget to use the isAuthenticated directive to secure all the routes.

Take a look at how the auth module is set up, and try to port the concepts to the books module (there’s a sketch below to check against).
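
If you want to check your schema against a sketch before peeking at the full answer, the books module could look something like this (the field names follow our Book model; the createBook mutation name and the resolvers folder are my assumptions, the definitive version is in the repo):

src/modules/books/index.js

const { gql } = require('apollo-server-express')

const typeDefs = gql`
  extend type Query {
    book(id: ID!): Book @isAuthenticated
    books: [Book] @isAuthenticated
  }

  extend type Mutation {
    createBook(title: String!): Book @isAuthenticated
  }

  type Book {
    id: ID!
    title: String!
    createdBy: User!
    created: DateTime
    changed: DateTime
  }
`

// You'd write ./resolvers/index.js with Query.book, Query.books and
// Mutation.createBook, mirroring the auth module's resolvers folder
const resolvers = require('./resolvers')

module.exports = {
  typeDefs: [
    typeDefs
  ],
  resolvers
}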

Here is the answer: the finished books module in the example repo (https://github.com/mikevercoelen/codersmind-scalable-graphql-node-api).

Integration tests

The last crucial piece of the puzzle: a proper way to handle integration tests to make sure everything works correctly.

For this application, we’re using Mocha and Supertest. To use Supertest with GraphQL comfortably, we’re going to set up a simple wrapper that makes our lives easier.

But first, start by creating a file “utils.test.js” in your root directory. This file is responsible for setting up global Mocha before and after hooks, and handles everything Mongoose-related.

We don’t want our integration tests to run against the same database as development, so we’re going to create a fresh database every time the integration tests run.

utils.test.js

const mongoose = require('mongoose')
const pkg = require('./package')

const getMongo = ({
  mongoUrl,
  dropDatabase = true,
  connectionWhitelist
}) => {
  if (mongoose.connection.host) {
    throw new Error(`There was already a mongoose connection, this is dangerous. Was connected to: ${mongoose.connection.host}`)
  }

  let hasConnected = false

  const connect = async () => {
    await mongoose.connect(mongoUrl, {
      useNewUrlParser: true,
      promiseLibrary: global.Promise
    })

    hasConnected = true

    // eslint-disable-next-line no-console
    console.log(`Connected to test database at ${mongoUrl}`)
  }

  const drop = async () => {
    if (!hasConnected) {
      throw new Error('Was trying to drop the database, but was not connected to the test database.')
    }

    if (!connectionWhitelist.includes(mongoose.connection.client.s.url)) {
      throw new Error('Was trying to drop a non-whitelisted database, cancelled.')
    }

    await mongoose.connection.db.dropDatabase()

    // eslint-disable-next-line no-console
    console.log(`Dropped the test database`)
  }

  const close = async () => {
    // eslint-disable-next-line no-console
    console.log(`Closing mongoose connection`)

    if (!mongoose.connection) {
      throw new Error('Could not close the connection, there was none.')
    }

    if (!hasConnected) {
      throw new Error(`Wanted to close connection to: ${mongoose.connection.host}, but was not connected to url: ${mongoUrl}`)
    }

    await Promise.all(mongoose.modelNames().map(model => {
      return mongoose.model(model).ensureIndexes()
    }))

    await mongoose.disconnect()

    hasConnected = false

    // eslint-disable-next-line no-console
    console.log(`Connection to mongoose closed`)
  }

  return {
    connect,
    drop,
    close
  }
}

const mongo = getMongo({
  mongoUrl: `mongodb://127.0.0.1:27017/${pkg.name}-test`,
  connectionWhitelist: [
    `mongodb://127.0.0.1:27017/${pkg.name}-test`
  ]
})

global.before(async () => {
  if (process.env.NODE_ENV !== 'mocha') {
    throw new Error(`NODE_ENV should be set to "mocha".`)
  }

  await mongo.connect()
  await mongo.drop()
})

global.after(async () => {
  await mongo.close()
})

Next we’re going to create that Supertest wrapper util function:

src/utils/test.js

const request = require('supertest')
const app = require('../app')

const graphQLRequest = ({ query, variables = null }) => {
  return request(app)
    .post('/')
    .send({
      variables,
      query
    })
}

module.exports = {
  request: graphQLRequest
}

Now we’ve set up the boilerplate for our integration tests.

Let’s continue with some “real” tests for our auth module:

src/modules/auth/auth.test.js

const expect = require('expect')
const { request } = require('../../utils/test')

const testUser = {
  email: 'test-user@gmail.com',
  password: 'test1234',
  firstName: 'test',
  lastName: 'user'
}

const signup = ({ email, password, firstName, lastName }, returnValues = `{
  id
  email
}`) => {
  return request({
    query: `
      mutation {
        signup(
          email: "${email}",
          password: "${password}",
          firstName: "${firstName}",
          lastName: "${lastName}"
        ) ${returnValues}
      }
    `
  })
}

describe('auth', () => {
  describe('sign up', () => {
    it('should create a new user', () => {
      return signup(testUser)
        .expect(res => {
          expect(res.body).toHaveProperty('data.signup.id')
          expect(res.body).toHaveProperty('data.signup.email', testUser.email)
        })
        .expect(200)
    })

    it('should not create a new user when a password is missing', () => {
      return signup({
        ...testUser,
        password: null
      })
        .expect(res => {
          expect(res.body).toHaveProperty('errors')
          expect(Array.isArray(res.body.errors)).toBe(true)
        })
    })

    it('should not create a new user with the same email', () => {
      return signup(testUser)
        .expect(res => {
          expect(res.body).toHaveProperty('errors')
          expect(Array.isArray(res.body.errors)).toBe(true)
        })
    })
  })

  describe('login', () => {
    it('should successfully log in and return a token', () => {
      return request({
        query: `
          mutation {
            login(email:"${testUser.email}", password:"${testUser.password}") {
              user {
                id
              }
              token
              tokenExpiration
            }
          }
        `
      })
        .expect(res => {
          expect(res.body).toHaveProperty('data.login.user.id')
          expect(res.body).toHaveProperty('data.login.token')
          expect(res.body).toHaveProperty('data.login.tokenExpiration')
        })
        .expect(200)
    })
  })

  describe('me', () => {
    let loginResponse = null

    before(async () => {
      await request({
        query: `
          mutation {
            login(email:"${testUser.email}", password:"${testUser.password}") {
              user {
                id
              }
              token
              tokenExpiration
            }
          }
        `
      })
        .expect(res => {
          expect(res.body).toHaveProperty('data.login.user.id')
          expect(res.body).toHaveProperty('data.login.token')
          expect(res.body).toHaveProperty('data.login.tokenExpiration')

          loginResponse = res.body
        })
        .expect(200)
    })

    it('should not return a profile when not logged in', () => {
      return request({
        query: `
          query me {
            me {
              id
              email
              firstName
              lastName
            }
          }
        `
      })
        .expect(res => {
          expect(res.body).toHaveProperty('errors')
          expect(res.body.data.me).toEqual(null)
          expect(Array.isArray(res.body.errors)).toBe(true)
        })
    })

    it('should successfully return the profile from me', () => {
      const token = loginResponse.data.login.token

      return request({
        query: `
          query me {
            me {
              id
              email
              firstName
              lastName
            }
          }
        `
      })
        .set('x-token', token)
        .expect(res => {
          expect(res.body).toHaveProperty('data.me.id')
          expect(res.body).toHaveProperty('data.me.email', testUser.email)
          expect(res.body).toHaveProperty('data.me.firstName', testUser.firstName)
          expect(res.body).toHaveProperty('data.me.lastName', testUser.lastName)
        })
        .expect(200)
    })
  })
})

Now run the tests using:

npm test

Conclusion

The trick to a successful GraphQL Apollo Node.js server is to set up a good foundation for you and your team to build the project on. Meaning: a modular, consistent base.

Feel free to ask questions or leave comments.

Keep it simple.

Mike

