When I decided to learn GraphQL, I knew that the best way to do it was by implementing its concepts, so I figured that developing a chat application would be a good way to achieve my goal, since it would let me put all of GraphQL's features into practice. That's what this post is about: learning some GraphQL concepts by building a chat application.
Our application will be split into two parts, back-end and front-end, and so will these posts. In this first post we'll develop the server side; to do so we'll use NodeJS, Apollo Server and, of course, GraphQL. We'll also need a database and a query builder module; I used MySQL and Knex.
Before we continue, all the code is in this repository.
Initial setup
Ok, first things first, let's start by creating the project and installing its dependencies.
Inside the project folder:
npm init
And:
npm i apollo-server bcrypt dotenv graphql jsonwebtoken knex lodash mysql
npm i --save-dev @babel/cli @babel/core @babel/node @babel/plugin-transform-runtime @babel/preset-env babel-jest jest nodemon standard
In the scripts section of package.json, put the following commands:
"start": "nodemon --exec babel-node ./src/index.js",
"test": "jest",
"test:watch": "jest --watch",
"migrate": "knex migrate:latest",
"unmigrate": "knex migrate:rollback",
"seed": "knex seed:run",
"lint": "standard",
"lint:fix": "standard --fix"
In the root folder, create a .babelrc file:
{
"presets": [
"@babel/preset-env"
],
"env": {
"test": {
"plugins": [
"@babel/plugin-transform-runtime"
]
}
}
}
Also in the root folder, create a .env file; it contains the project's environment variables:
NODE_ENV=development
DB_HOST=localhost
DB_USER=root
DB_PASSWORD=toor
DB_NAME=chat
SECRET=secret
The first variable is the environment; let's leave it as development for now. The next four variables are the database host, user, password and name; you can set these values according to your database configuration. The last one is the secret value that we'll use later in authentication.
Feel free to use any relational database. I used MySQL; if you want to use another, like PostgreSQL, you'll just have to do a different setup in knexfile.js.
Database and models
In this section we'll configure our database and implement our models. In the root folder, create a knexfile.js file; it contains the database configuration for the development, test and production environments:
require('dotenv').config()
module.exports = {
development: {
client: 'mysql',
connection: {
host: process.env.DB_HOST,
user: process.env.DB_USER,
password: process.env.DB_PASSWORD,
database: process.env.DB_NAME
},
migrations: {
directory: './src/data/migrations'
},
seeds: { directory: './src/data/seeds' }
},
test: {
client: 'mysql',
connection: {
host: process.env.DB_HOST,
user: process.env.DB_USER,
password: process.env.DB_PASSWORD,
database: process.env.DB_NAME
},
migrations: {
directory: './src/data/migrations'
},
seeds: { directory: './src/data/seeds' }
},
production: {
client: 'mysql',
connection: {
host: process.env.DB_HOST,
user: process.env.DB_USER,
password: process.env.DB_PASSWORD,
database: process.env.DB_NAME
},
migrations: {
directory: './src/data/migrations'
},
seeds: { directory: './src/data/seeds' }
}
}
In src/data/ we can store our database migrations, seeds, and a file that exports a database object with the configuration from knexfile.js:
// src/data/db.js
import knex from 'knex'
import knexfile from '../../knexfile'
const env = process.env.NODE_ENV || 'development'
const configs = knexfile[env]
const database = knex(configs)
export default database
Now let's create our migrations. Run:
knex migrate:make user
knex migrate:make message
The generated files are in the directory configured in knexfile.js; they should have the following contents:
// src/data/migrations/20200107121031_user.js
exports.up = (knex) =>
knex.schema.createTable('user', table => {
table.bigIncrements('id').unsigned()
table.string('name').notNullable()
table.string('email').notNullable()
table.string('password').notNullable()
})
exports.down = (knex) => knex.schema.dropTableIfExists('user')
// src/data/migrations/20200107121034_message.js
exports.up = (knex) =>
knex.schema.createTable('message', table => {
table.bigIncrements('id').unsigned()
table.string('message').notNullable()
table.bigInteger('senderId').unsigned().references('id').inTable('user')
table.bigInteger('receiverId').unsigned().references('id').inTable('user')
})
exports.down = (knex) => knex.schema.dropTableIfExists('message')
Now we can run our migrations; the following command will create the user and message tables in the database:
npm run migrate
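The seed script we configured earlier runs any files placed in src/data/seeds, which is handy for creating some initial users to chat with. A minimal sketch of such a seed, with a hypothetical file name and made-up sample data, could look like this (intended for a fresh database):

// src/data/seeds/01_users.js (hypothetical example)
const bcrypt = require('bcrypt')

exports.seed = async (knex) => {
  // Clear the table first so the seed can be re-run
  await knex('user').del()
  await knex('user').insert([
    { name: 'Alice', email: 'alice@example.com', password: await bcrypt.hash('123456', 10) },
    { name: 'Bob', email: 'bob@example.com', password: await bcrypt.hash('123456', 10) }
  ])
}

Running npm run seed would then populate the user table.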
Next we create our models. Let's start with the Model class; it contains common methods used by the other models that will extend it:
// src/model/Model.js
export default class Model {
constructor (database, table) {
this.database = database
this.table = table
}
all () {
return this.database(this.table).select()
}
find (conditions) {
return this.database(this.table).where(conditions).select()
}
findOne (conditions) {
return this.database(this.table).where(conditions).first()
}
findById (id) {
return this.database(this.table).where({ id }).select().first()
}
insert (values) {
return this.database(this.table).insert(values)
}
}
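Just to illustrate how the class is meant to be used (this snippet is not part of the project, and the sample e-mail is made up):

import database from '../data/db'
import Model from './Model'

const userModel = new Model(database, 'user')

async function example () {
  // Knex query builders are thenables, so they can be awaited directly
  const allUsers = await userModel.all()
  const alice = await userModel.findOne({ email: 'alice@example.com' })
  console.log(allUsers.length, alice)
}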
Then we create the User and Message models. Notice that in the User model there is a method to generate a token using the SECRET environment variable that we defined before; there are also methods to find a user by a token and to retrieve a user's messages.
// src/model/User.js
import Model from './Model'
import bcrypt from 'bcrypt'
import jwt from 'jsonwebtoken'
export class User extends Model {
constructor (database) {
super(database, 'user')
}
async hash (password) {
return bcrypt.hash(password, 10)
}
async compare (hash, password) {
return bcrypt.compare(password, hash)
}
generateToken (user) {
    /* knex returns a RowDataPacket object and jwt.sign expects a plain
    object; stringifying and parsing it back does the trick */
return jwt.sign(
JSON.parse(JSON.stringify(user)),
process.env.SECRET,
{
expiresIn: 86400
}
)
}
async getUserByToken (token) {
try {
const decoded = jwt.verify(token, process.env.SECRET)
return decoded
} catch (error) {
console.log(error)
return null
}
}
  async getMessages (senderId, lastId) {
    return this.database('message')
      .where('id', '>', lastId)
      .andWhere(q => q.where({ senderId })
        .orWhere({ receiverId: senderId }))
      .limit(10)
  }
}
// src/model/Message.js
import Model from './Model'
export class Message extends Model {
constructor (database) {
super(database, 'message')
}
  async getConversation (senderId, receiverId, lastId) {
    // Fetch the messages exchanged in both directions between the two users
    return this.database('message')
      .where('id', '>', lastId)
      .andWhere(q => q.where({ senderId, receiverId })
        .orWhere({ senderId: receiverId, receiverId: senderId }))
      .limit(10)
  }
}
Now we have to export all those models. For the sake of organization, I've created an index.js file in src/model that exports a models object containing all our models.
// src/model/index.js
import database from '../data/db'
import { User } from '../model/User'
import { Message } from '../model/Message'
const user = new User(database)
const message = new Message(database)
const models = {
user,
message
}
export default models
Schema
Finally we'll deal with GraphQL. Let's start with the schema, but what is the schema? The schema uses the GraphQL schema language to define a set of types that our application will provide; a type can be, among others, a query, a mutation, a subscription, an object type or a scalar type.
The query type defines the possible queries that our application will provide, for example, fetching all messages.
The mutation type is like the query type, but it allows data to be modified, for example, sending a message.
The subscription type allows the server to push data to a client when an event happens; it is usually implemented with WebSockets. For example, in our chat application, when a client sends a message, the receiving client must get that message without requesting it from the server.
An object type defines an object that our application allows to be fetched, like user or message.
And scalar types: object types have fields, and these fields must have a value of some type, like string or int; those are the scalar types. The possible scalar types are Int, String, Float, Boolean and ID. In some GraphQL implementations it is possible to specify custom scalar types. A ! means that the field is non-nullable and our service promises to return a non-nullable value. If we want to specify that our service will return an array, we use [], for example, [String]!. The short example below illustrates this syntax.
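Here is a quick illustration, using a made-up Post type that is not part of our application:

type Post {
  id: ID!          # non-nullable ID
  title: String!   # non-nullable String
  views: Int       # nullable Int
  tags: [String]!  # non-nullable list of nullable Strings
}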
Our GraphQL schema could be defined entirely in a single file, but as our application grows that file would become a mess, so I decided to split the schema into files that represent entities: we'll have one file defining the user schema and another defining the message schema, plus a file that brings the whole schema together. Let's start with this last file:
// src/schema/index.js
import { merge } from 'lodash'
import { gql, makeExecutableSchema } from 'apollo-server'
import {
typeDef as User,
resolvers as userResolvers
} from './user'
import {
typeDef as Message,
resolvers as messageResolvers
} from './message'
const Query = gql`
type Query {
_empty: String
}
type Mutation {
_empty: String
}
type Subscription {
_empty: String
}
`
export const schema = makeExecutableSchema({
typeDefs: [Query, User, Message],
resolvers: merge(userResolvers, messageResolvers)
})
Next we create the user and message schemas. You will notice that each file has an object called resolvers; we will talk about it in a bit. Also notice that when we define the schema in the typeDef constant, we are extending the Query, Mutation and Subscription types; we have to do it this way because a GraphQL schema must have only one of each of these types.
// src/schema/message.js
import { gql } from 'apollo-server'
export const subscriptionEnum = Object.freeze({
MESSAGE_SENT: 'MESSAGE_SENT'
})
export const typeDef = gql`
extend type Query {
messages(cursor: String!): [Message!]!
conversation(cursor: String!, receiverId: ID!): [Message!]!
}
extend type Subscription {
messageSent: Message
}
extend type Mutation {
sendMessage(sendMessageInput: SendMessageInput!): Message!
}
type Message {
id: ID!
message: String!
sender: User!
receiver: User!
}
input SendMessageInput {
message: String!
receiverId: ID!
}
`
export const resolvers = {
Query: {
messages: async (parent, args, { models, user }, info) => {
if (!user) { throw new Error('You must be logged in') }
const { cursor } = args
const users = await models.user.all()
const messages = await models.user.getMessages(user.id, cursor)
const filteredMessages = messages.map(message => {
const sender = users.find(user => user.id === message.senderId)
const receiver = users.find(user => user.id === message.receiverId)
return { ...message, sender, receiver }
})
return filteredMessages
},
conversation: async (parent, args, { models, user }, info) => {
if (!user) { throw new Error('You must be logged in') }
const { cursor, receiverId } = args
const users = await models.user.all()
const messages = await models.message.getConversation(user.id, receiverId, cursor)
const filteredMessages = messages.map(message => {
const sender = users.find(user => user.id === message.senderId)
const receiver = users.find(user => user.id === message.receiverId)
return { ...message, sender, receiver }
})
return filteredMessages
}
},
Subscription: {
messageSent: {
subscribe: (parent, args, { pubsub, user }, info) => {
if (!user) { throw new Error('You must be logged in') }
return pubsub.asyncIterator([subscriptionEnum.MESSAGE_SENT])
}
}
},
Mutation: {
sendMessage: async (parent, args, { models, user, pubsub }, info) => {
if (!user) { throw new Error('You must be logged in') }
const { message, receiverId } = args.sendMessageInput
const receiver = await models.user.findById(receiverId)
if (!receiver) { throw new Error('receiver not found') }
const result = await models.message.insert([{
message,
senderId: user.id,
receiverId
}])
const newMessage = {
id: result[0],
message,
receiver,
sender: user
}
pubsub.publish(subscriptionEnum.MESSAGE_SENT, { messageSent: newMessage })
return newMessage
}
}
}
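To make the message schema concrete, here are example operations a client could run against it once logged in (the receiverId value is illustrative):

mutation {
  sendMessage(sendMessageInput: { message: "Hello!", receiverId: 2 }) {
    id
    message
    receiver { name }
  }
}

subscription {
  messageSent {
    id
    message
    sender { name }
  }
}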
// src/schema/user.js
import { gql } from 'apollo-server'
export const typeDef = gql`
extend type Query {
users: [User!]!
}
extend type Mutation {
createUser(createUserInput: CreateUserInput!): User!
login(email: String!, password: String!): String!
}
type User {
id: ID!
name: String!
email: String!
password: String!
}
input CreateUserInput {
name: String!
email: String!
password: String!
}
`
export const resolvers = {
Query: {
users: async (parent, args, { models, user }, info) => {
if (!user) { throw new Error('You must be logged in') }
const users = await models.user.all()
return users
}
},
Mutation: {
createUser: async (parent, args, { models }, info) => {
const { name, email, password } = args.createUserInput
const user = await models.user.findOne({ email })
if (user) { throw new Error('Email already taken') }
const hash = await models.user.hash(password)
const result = await models.user.insert([{
name,
email,
password: hash
}])
return {
id: result[0],
password: hash,
name,
email
}
},
login: async (parent, args, { models }, info) => {
const { email, password } = args
const user = await models.user.findOne({ email })
if (!user) { throw new Error('Invalid credentials') }
if (!await models.user.compare(user.password, password)) { throw new Error('Invalid credentials') }
return models.user.generateToken(user)
}
}
}
Each file has the schema defined in the typeDef constant, and the resolvers for that schema are in the resolvers object.
So what are those resolvers objects? Resolvers contain the logic that will be executed when a query, mutation or subscription defined in our application schema is called. They are functions that accept the following arguments:
parent: the object that contains the result returned from the resolver on the parent field.
args: the arguments passed to the query; for example, the login mutation receives the email and password arguments.
context: an object shared by all resolvers; in our application it contains the models object that we defined before and the logged-in user.
info: contains information about the execution state of the query.
So if you want to define resolvers for the Query type, put them in the Query object; for the Mutation type, put them inside the Mutation object, and so on.
About pagination: I chose cursor-based pagination. You can see in the messages query in the message schema that the query accepts a cursor as an argument (yes, we can pass arguments to GraphQL queries); the cursor value is the ID of the last message returned, as shown in the example below.
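For instance, to fetch the next batch of messages after the message with ID 10, a client would run (the cursor value is just an example):

query {
  messages(cursor: "10") {
    id
    message
    sender { name }
  }
}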
Now we have one last thing to do: define the application entry point (src/index.js):
//src/index.js
import { ApolloServer, PubSub } from 'apollo-server'
import { schema } from './schema'
import models from './model/index'
const pubsub = new PubSub()
const getUser = async (req, connection) => {
let user = null
if (req && req.headers.authorization) {
const token = req.headers.authorization.replace('Bearer ', '')
user = await models.user.getUserByToken(token)
} else if (connection && connection.context.Authorization) {
const token = connection.context.Authorization.replace('Bearer ', '')
user = await models.user.getUserByToken(token)
}
return user
}
const server = new ApolloServer({
schema,
context: async ({ req, res, connection }) => {
return {
models,
pubsub,
user: await getUser(req, connection)
}
}
})
server.listen().then(({ url }) => {
console.log(`🚀 Server ready at ${url}`)
})
Here we create an instance of ApolloServer with the schema we defined before. In the context option we set which resources will be available to the resolvers in the context argument; before returning these resources, we check whether there is a logged-in user, using the token received from the request. If you use Express, you can put the logic of fetching a user by a token in a middleware, like in this example.
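As a rough sketch of that idea, assuming an Express app and reusing the getUser logic above (this middleware is hypothetical and not part of our project):

// Hypothetical Express middleware that attaches the logged-in user to the request
import models from './model/index'

const authMiddleware = async (req, res, next) => {
  if (req.headers.authorization) {
    const token = req.headers.authorization.replace('Bearer ', '')
    req.user = await models.user.getUserByToken(token)
  }
  next()
}

// app.use(authMiddleware)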
The server will run at the default URL http://localhost:4000/, where you can test the application by making some queries in the GraphQL Playground; you can learn more about it here.
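For example, you could create a user, log in, and then, with the returned token set in the HTTP headers as { "Authorization": "Bearer <token>" }, list the users (the sample data is made up):

mutation {
  createUser(createUserInput: {
    name: "Alice"
    email: "alice@example.com"
    password: "123456"
  }) {
    id
    name
  }
}

mutation {
  login(email: "alice@example.com", password: "123456")
}

# With the Authorization header set:
query {
  users {
    id
    name
    email
  }
}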
In part two we will develop the front-end using Apollo Client and ReactJS.