Most GraphQL schemas have types representing things that need to be queried, such as Books, Items, Authors, Customers, etc. To constrain the items returned by those queries, I've seen multiple approaches in the wild for specifying filters:
# Option 1 - 🤮 filter yourself, damn it!
query Books
# Option 2 - 🤕 What about Id and Name???
query BooksById {}
query BooksByName {}
query BooksByAuthor {}
# Option 3 - 🥺 can I haz more?
query Books(filter: {id: String, name: String}) {}
# Option 4 - ✨ awesome!
query Books(filter: {id: String, id_contains: String, id_in: [String]}) {}
Option 2 is often too verbose, non-composable, and can be downright ugly. Option 3 starts to provide the right level of composability, but anyone wanting anything other than an equality comparison is left wanting. I like Option 4 a lot because, depending on the effort the API owners are willing to put in, it provides the most compact, composable, and efficient API. For the rest of this post, I'll be focused on Option 4.
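To make Option 4 concrete, here's the kind of query a client could send (the field names are illustrative and match the filter schema built later in this post):

query {
  Books(filter: { name_contains: "GraphQL", quantity_gte: 10 }) {
    id
    name
  }
}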
Writing out the definitions for Option 4 by hand is cumbersome, produces verbose schemas, is prone to typos and mistakes that can be hard to catch, and is subject to variation across team members and teams. With 10 entity fields and an average of 4 operators per field, that's 40 filter fields per entity, multiplied by the number of entities in the schema 😱.
query Books(filter: BookFilter): [Book!]
query Authors(filter: AuthorFilter): [Author!]

type Book {
  id: ID!
}

type Author {
  id: ID!
}
input BookFilter {
  AND: [BookFilter!]
  OR: [BookFilter!]
  NOT: [BookFilter!]
  id: ID
  name: String
  name_contains: String
  name_in: [String]
  name_startsWith: String
  name_endsWith: String
  quantity: Int
  quantity_lt: Int
  quantity_lte: Int
  quantity_gt: Int
  quantity_gte: Int
  publishDate: Date
  publishDate_lt: Date
  publishDate_lte: Date
  publishDate_gt: Date
  publishDate_gte: Date
  authors_some: AuthorFilter
  authors_every: AuthorFilter
  authors_none: AuthorFilter
  # ..other fields
}
input AuthorFilter {
  AND: [AuthorFilter!]
  OR: [AuthorFilter!]
  NOT: [AuthorFilter!]
  id: ID
  name: String
  name_contains: String
  name_in: [String]
  name_startsWith: String
  name_endsWith: String
  # ..other fields
}
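With the AND, OR, and NOT fields in place, clients can compose arbitrarily nested conditions. A hypothetical example (the values are made up):

query {
  Books(
    filter: {
      OR: [
        { name_contains: "GraphQL" }
        { AND: [{ quantity_gte: 10 }, { publishDate_gt: "2020-01-01" }] }
      ]
    }
  ) {
    id
    name
  }
}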
Using GraphQL directives, I propose a simpler approach that arrives at the same output while keeping the schema DRY-er, smaller, and hopefully more manageable across team members and teams.
Directives
A directive decorates part of a GraphQL schema or operation with additional configuration. Tools like Apollo Server (and Apollo Client) can read a GraphQL document’s directives and perform custom logic as appropriate.
Directives are preceded by the @ character.
https://www.apollographql.com/docs/apollo-server/schema/directives/
The core GraphQL specification includes exactly two directives, which must be supported by any spec-compliant GraphQL server implementation:
@include(if: Boolean) Only include this field in the result if the argument is true.
@skip(if: Boolean) Skip this field if the argument is true.
Directives can be useful to get out of situations where you otherwise would need to do string manipulation to add and remove fields in your query. Server implementations may also add experimental features by defining completely new directives.
https://graphql.org/learn/queries/#directives
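For example, a client can use a variable with @include to conditionally request a field (the field names here are illustrative):

query GetBooks($withAuthors: Boolean!) {
  Books {
    id
    name
    authors @include(if: $withAuthors) {
      name
    }
  }
}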
Solution
Using directives, the schema definition shrinks significantly to
query Books(filter: BookFilter): [Book!]
query Authors(filter: AuthorFilter): [Author!]

type Book {
  id: ID!
}

type Author {
  id: ID!
}

input BookFilter {
  id: ID @filter
  name: String @filter
  quantity: Int @filter
  authors: AuthorFilter @filter(relation: Many)
  # other fields
}

input AuthorFilter {
  id: ID @filter
  name: String @filter
  # other fields
}
To expand the schema into the complete output from earlier, we use a directive transformer. First, a directive called @filter is defined in the schema file. It can only be applied to GraphQL input fields, hence the INPUT_FIELD_DEFINITION constraint.
directive @filter(relation: RelationTypes) on INPUT_FIELD_DEFINITION

enum RelationTypes {
  One
  Many
}
For the actual transformer (filterDirectiveTransformer in the code below), we map over the schema using the mapSchema function from @graphql-tools/utils. mapSchema is a powerful tool, in that it creates a new copy of the original schema, transforms GraphQL objects as specified, and then rewires the entire schema so that all GraphQL objects that refer to other GraphQL objects correctly point to the new set.

We register callbacks for the things we are interested in. The first two, MapperKind.DIRECTIVE and MapperKind.ENUM_TYPE, remove the @filter directive and the RelationTypes enum we added to the schema definition above from the final schema, so consumers of our API have no access to them. MapperKind.INPUT_OBJECT_TYPE starts by checking every input type for fields with the @filter directive. There is also a MapperKind.INPUT_OBJECT_FIELD callback that fires for each input field, but in my case I needed to mutate the entire input object definition to add the AND, OR, and NOT fields, and it made more sense to collect the logic in one handler rather than track state in closures and spread the logic around.
import { getDirective, MapperKind, mapSchema } from "@graphql-tools/utils";
import {
  buildSchema,
  GraphQLInputObjectType,
  GraphQLSchema,
  NamedTypeNode,
} from "graphql";

export const SupportedScalarTypes = [
  "String",
  "Int",
  "Float",
  "Boolean",
  "Date",
  "DateTime",
] as const;

export type Relation = "One" | "Many";

// Shape of the arguments accepted by the @filter directive.
type FilterDirectiveArgs = {
  relation?: Relation;
};
// buildScalarFilterFields and buildRelationFilterFields are shown further below.
const filterDirectiveTransformer = (
  schema: GraphQLSchema,
  directiveName: string,
) => {
  return mapSchema(schema, {
    // strip the @filter directive itself from the public schema
    [MapperKind.DIRECTIVE]: (directiveConfig) =>
      directiveConfig.name === directiveName ? null : directiveConfig,
    // strip the RelationTypes enum from the public schema
    [MapperKind.ENUM_TYPE]: (type) =>
      type.name === "RelationTypes" ? null : type,
    [MapperKind.INPUT_OBJECT_TYPE]: (type) => {
      const fields = type.getFields();
      // get all fields with the filter directive
      const fieldsWithDirectives = Object.values(fields).filter((field) =>
        getDirective(schema, field, directiveName),
      );
      // if there are no fields with the filter directive, return early
      if (fieldsWithDirectives.length === 0) {
        return type;
      }
      const filterFields: string[] = [];
      const relationInputDummyTypes: GraphQLInputObjectType[] = [];
      for (const field of fieldsWithDirectives) {
        const filterDirective = getDirective(schema, field, directiveName);
        if (filterDirective) {
          const args = filterDirective[0] as FilterDirectiveArgs;
          // @ts-ignore
          const astType = field.astNode.type as NamedTypeNode;
          const isScalar = SupportedScalarTypes.includes(
            astType.name.value as any,
          );
          const isRelation = !!args.relation;
          if (isScalar) {
            filterFields.push(
              buildScalarFilterFields(astType.name.value as any, field.name),
            );
          }
          if (isRelation) {
            const relatedType = field.type as GraphQLInputObjectType;
            filterFields.push(
              buildRelationFilterFields(
                // @ts-ignore
                args.relation,
                field.name,
                relatedType.name,
              ),
            );
            relationInputDummyTypes.push(relatedType);
          }
        }
      }
      const source = `
        scalar Date
        scalar DateTime
        ${relationInputDummyTypes
          .filter((relatedType) => relatedType.name !== type.name)
          .map((type) => `input ${type.name} { id: ID! }`)
          .join("\n")}
        input ${type.name} {
          AND: [${type.name}!]
          OR: [${type.name}!]
          NOT: [${type.name}!]
          ${filterFields.join("")}
        }
      `;
      const newInputSchema = buildSchema(source);
      return new GraphQLInputObjectType({
        ...type,
        fields: {
          ...(
            newInputSchema.getType(type.name) as GraphQLInputObjectType
          ).getFields(),
        },
      });
    },
  });
};
For each field found on the input type passed to the visitor callback, we grab its args and figure out whether the field type is a supported scalar type. If it is, we call the buildScalarFilterFields builder to return a string based on a template. The buildRelationFilterFields function is called when the directive specifies a relation argument.
export const buildScalarFilterFields = (
  fieldType: (typeof SupportedScalarTypes)[number],
  fieldName: string,
) => {
  switch (fieldType) {
    case "String":
      return `
        ${fieldName}: String # equals
        ${fieldName}_not: String # not equals
        ${fieldName}_in: [String] # in
        ${fieldName}_not_in: [String] # not in
        ${fieldName}_contains: String # contains
        ${fieldName}_not_contains: String # not contains
        ${fieldName}_startsWith: String # starts with
        ${fieldName}_not_startsWith: String # not starts with
        ${fieldName}_endsWith: String # ends with
        ${fieldName}_not_endsWith: String # not ends with
      `;
    case "Int":
    case "Float":
    case "Date":
    case "DateTime":
      return `
        ${fieldName}: ${fieldType} # equals
        ${fieldName}_not: ${fieldType} # not equals
        ${fieldName}_in: [${fieldType}] # in
        ${fieldName}_not_in: [${fieldType}] # not in
        ${fieldName}_lt: ${fieldType} # less than
        ${fieldName}_lte: ${fieldType} # less than or equal
        ${fieldName}_gt: ${fieldType} # greater than
        ${fieldName}_gte: ${fieldType} # greater than or equal
      `;
    case "Boolean":
      return `
        ${fieldName}: Boolean # equals
        ${fieldName}_not: Boolean # not equals
      `;
  }
};
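As a quick sanity check, this is roughly what the builder returns for an Int field named quantity (whitespace trimmed):

buildScalarFilterFields("Int", "quantity");
// quantity: Int # equals
// quantity_not: Int # not equals
// quantity_in: [Int] # in
// quantity_not_in: [Int] # not in
// quantity_lt: Int # less than
// quantity_lte: Int # less than or equal
// quantity_gt: Int # greater than
// quantity_gte: Int # greater than or equal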
export const buildRelationFilterFields = (
  relationType: Relation,
  fieldName: string,
  inputTypeName: string,
) => {
  switch (relationType) {
    case "One":
      return `
        ${fieldName}_is: ${inputTypeName} # related record must match
        ${fieldName}_is_not: ${inputTypeName} # related record must not match
      `;
    case "Many":
      return `
        ${fieldName}_every: [${inputTypeName}] # condition must be true for all
        ${fieldName}_some: [${inputTypeName}] # condition must be true for at least one
        ${fieldName}_none: [${inputTypeName}] # condition must be false for all
      `;
  }
};
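And on the query side, a BookFilter whose authors field is annotated with @filter(relation: Many) lets clients filter through the relation (values here are illustrative):

query {
  Books(filter: { authors_some: [{ name_contains: "Fowler" }] }) {
    id
    name
  }
}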
In the transformer, we combine the filters, along with the "dummy" types we collected for relations, into a new type definition string that is parsed by buildSchema from graphql/utilities. Dummy types here are other input types that are referenced by the input type being transformed. For example, the authors property in BookFilter might be of type AuthorFilter in the schema; calling buildSchema without a dummy AuthorFilter when transforming BookFilter throws an error due to the missing type. Finally, we collect all the fields from the newly built schema and return a new GraphQLInputObjectType instance.
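To make that concrete, the source string built for BookFilter from the directive-annotated schema above looks roughly like this, abbreviated (only fields whose types appear in SupportedScalarTypes get operator expansions):

scalar Date
scalar DateTime

# dummy type so buildSchema can parse references to AuthorFilter
input AuthorFilter { id: ID! }

input BookFilter {
  AND: [BookFilter!]
  OR: [BookFilter!]
  NOT: [BookFilter!]
  name: String # equals
  name_not: String # not equals
  # ...the remaining name_*, quantity_*, and authors_* fields
}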
The transformer can be applied to a schema and tested against Apollo Server with the following:
import { readFileSync } from "fs";
import { ApolloServer } from "@apollo/server";
import { startStandaloneServer } from "@apollo/server/standalone";
import { makeExecutableSchema } from "@graphql-tools/schema";
// resolvers and filterDirectiveTransformer come from the modules shown in this post

const schema = makeExecutableSchema({
  typeDefs: readFileSync("./src/schema/schema.graphql", "utf8"),
  resolvers,
});

const server = new ApolloServer({
  schema: filterDirectiveTransformer(schema, "filter"),
});

async function run() {
  const { url } = await startStandaloneServer(server, {
    listen: { port: 4000 },
  });
  console.log(`🚀 Server ready at: ${url}`);
}

run();
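If you'd rather inspect the generated SDL directly than click through a GUI, printing the transformed schema works too (a quick sketch, reusing the schema built above):

import { printSchema } from "graphql";

// prints the expanded BookFilter/AuthorFilter inputs, with @filter and RelationTypes stripped
console.log(printSchema(filterDirectiveTransformer(schema, "filter")));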
Here’s a screenshot of the output schema in Apollo Studio after the transformer has been applied to a schema similar to the one defined above:
GraphQL Code Generator and friends
Because this is a schema-level transformation done at server start-up time, these fields won't be available to tools that read your source schema as part of a build step. One popular tool that reads your schema this way is the hugely awesome GraphQL Code Generator, when the schema is specified as a *.ts or *.graphql file instead of a URL. In that case, GraphQL codegen provides a way to register plugins that can transform the schema using very similar code, save for changes to match the typings. Here's the graphql.config.yml file to configure GraphQL Code Generator:
schema: src/schema/schema.graphql
extensions:
  codegen:
    generates:
      src/generated/graphql.ts:
        plugins:
          - src/graphql/codegen/plugins/filter-transformer.ts
          - typescript
          - typescript-resolvers
Notice the transformer under the plugins array. It needs to be the first plugin registered so that any changes it makes to the schema are visible to the other plugins downstream. Below is the transformer code, which follows very similar logic to the one above; I don't remove the directive or the relation type enum here.
import {
  getCachedDocumentNodeFromSchema,
  PluginFunction,
} from "@graphql-codegen/plugin-helpers";
import {
  BooleanValueNode,
  buildSchema,
  EnumValueNode,
  GraphQLInputObjectType,
  InputObjectTypeDefinitionNode,
  NamedTypeNode,
  visit,
} from "graphql";
// SupportedScalarTypes, Relation, buildScalarFilterFields, and
// buildRelationFilterFields are shared with the server-side transformer above.

type FilterDirectiveArgs = {
  relation?: Relation;
};

const plugin =
  (directiveName: string): PluginFunction =>
  (schema, documents, config) => {
    const rootNode = getCachedDocumentNodeFromSchema(schema);
    visit(rootNode, {
      InputObjectTypeDefinition(node: InputObjectTypeDefinitionNode) {
        const fields = node.fields ?? [];
        // only process inputs that have at least one field with the @filter directive
        const fieldsWithDirectives = Object.values(fields).filter(
          (field) =>
            field.directives?.some(
              (directive) => directive.name.value === directiveName,
            ),
        );
        if (fieldsWithDirectives.length === 0) {
          return node;
        }
        const filterFields: string[] = [];
        const relationInputDummyTypes: NamedTypeNode[] = [];
        for (const field of fieldsWithDirectives) {
          const filterDirective = field.directives?.find(
            (directive) => directive.name.value === directiveName,
          );
          if (filterDirective) {
            const args = (
              filterDirective.arguments?.[0]?.value
                ? Object.fromEntries(
                    filterDirective.arguments.map((field) => [
                      field.name.value,
                      (field.value as BooleanValueNode | EnumValueNode).value,
                    ]),
                  )
                : {}
            ) as FilterDirectiveArgs;
            const astType = field.type as NamedTypeNode;
            const isScalar = SupportedScalarTypes.includes(
              astType.name.value as any,
            );
            const isRelation = !!args.relation;
            if (isScalar) {
              filterFields.push(
                buildScalarFilterFields(
                  astType.name.value as any,
                  field.name.value,
                ),
              );
            }
            if (isRelation) {
              const relatedType = field.type as NamedTypeNode;
              filterFields.push(
                buildRelationFilterFields(
                  args.relation,
                  field.name.value,
                  relatedType.name.value,
                ),
              );
              relationInputDummyTypes.push(relatedType);
            }
          }
        }
        const source = `
          scalar Date
          scalar DateTime
          ${relationInputDummyTypes
            .filter((relatedType) => relatedType.name.value !== node.name.value)
            .map((type) => `input ${type.name.value} { id: ID! }`)
            .join("\n")}
          input ${node.name.value} {
            AND: [${node.name.value}!]
            OR: [${node.name.value}!]
            NOT: [${node.name.value}!]
            ${filterFields.join("")}
          }
        `;
        const newInputSchema = buildSchema(source);
        const returnType = newInputSchema.getType(
          node.name.value,
        ) as GraphQLInputObjectType;
        // mutate the cached AST node in place so downstream plugins see the new fields
        (node.fields as any) = returnType.astNode.fields;
      },
    });
    return "";
  };
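The practical effect of the plugin ordering is that the typescript and typescript-resolvers plugins see the expanded inputs, so the generated BookFilter type ends up with the filter fields, roughly along these lines (the exact shape depends on your codegen version and config):

export type BookFilter = {
  AND?: BookFilter[] | null;
  OR?: BookFilter[] | null;
  NOT?: BookFilter[] | null;
  name?: string | null;
  name_contains?: string | null;
  name_in?: string[] | null;
  quantity_gte?: number | null;
  // ...and the rest of the generated filter fields
};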
🔥 with Prisma
Prisma is an open source next-generation ORM. It consists of the following parts:
- Prisma Client: Auto-generated and type-safe query builder for Node.js & TypeScript
- Prisma Migrate: Migration system
- Prisma Studio: GUI to view and edit data in your database.
I like Prisma's query syntax a lot. You introspect your database with a tool that generates a schema file, from which you generate a 💪🏾-ly typed API for CRUD-ing entities to and from your database. I have my issues with Prisma, though, which I'll save for another post. Its query API shape is a lot like another API I'm fond of, OpenCRUD: a rich query interface packed into a small API surface that allows highly composable specifications of results, kinda like what we built above for our API.
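For a flavor of that syntax, here's a minimal Prisma query (the model and field names are illustrative):

import { PrismaClient } from "@prisma/client";

const prisma = new PrismaClient();

async function demo() {
  // books whose title contains "GraphQL" and with at least 10 in stock
  const books = await prisma.book.findMany({
    where: {
      title: { contains: "GraphQL" },
      quantity: { gte: 10 },
    },
  });
  console.log(books);
}

demo();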
When I've found two independent things I like in the past, I try to create a thingamajig out of the two of them, and so here's a simple way to take queries from your GraphQL endpoint and send them straight to Prisma through your GraphQL resolvers.
To keep things simple, I implement this only for scalar fields (no relations), using the flat library. The unflatten function in flat allows keys to be "unflattened" into a nested object. For example,
console.log(
  unflatten({ id_not_in: ["1", "2", "3"] }, { delimiter: "_" }),
);
// { id: { not: { in: ["1", "2", "3"] } } }
Using everything from above, the resolvers look like this:
import { unflatten } from "flat";
import { Prisma, PrismaClient } from "@prisma/client";
import { Resolvers } from "../generated/graphql";

const prismaClient = new PrismaClient({
  log: ["query"],
});

const resolvers: Resolvers = {
  Query: {
    books: async (_, { filter }) => {
      // turn the flat GraphQL filter (e.g. title_not_contains) into Prisma's nested where shape
      const where = unflatten<typeof filter, Prisma.BookWhereInput>(filter, {
        delimiter: "_",
      });
      console.log(where);
      const books = await prismaClient.book.findMany({ where });
      console.log(books);
      return books || [];
    },
  },
};

export default resolvers;
Here are some sample filters (as logged by the resolver) and the corresponding queries Prisma sends to Postgres:
{ title: 'Test' }
prisma:query SELECT "public"."Book"."id", "public"."Book"."title", "public"."Book"."author" FROM "public"."Book" WHERE "public"."Book"."title" = $1 OFFSET $2
{ title: { not: { contains: 'Intro' } } }
prisma:query SELECT "public"."Book"."id", "public"."Book"."title", "public"."Book"."author" FROM "public"."Book" WHERE "public"."Book"."title"::text NOT LIKE $1 OFFSET $2
Note: Using the same pattern, you can add a sort directive for generating sort GraphQL enums for your API using a transformer too.
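Here's a rough sketch of what that could look like (the @sort directive and the names below are illustrative, not part of the code above):

directive @sort on INPUT_FIELD_DEFINITION

enum SortOrder {
  ASC
  DESC
}

input BookSort {
  name: String @sort
  publishDate: Date @sort
}

# after a sort transformer runs, BookSort could become:
# input BookSort {
#   name: SortOrder
#   publishDate: SortOrder
# }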
Conclusion
The implementation of GraphQL directives, particularly the custom @filter
directive, offers a compelling solution for enhancing the filtering capabilities of GraphQL APIs. The article has explored various filtering options, emphasizing the advantages of a concise, composable, and efficient API through the use of directives.
By introducing a directive-based approach, the article aims to streamline the schema, making it more manageable across teams and reducing verbosity. The proposed solution not only addresses the challenge of complex filtering requirements but also contributes to a more maintainable and readable codebase.
The integration with Prisma showcases a practical application, demonstrating how these directives can seamlessly translate GraphQL queries into database operations. While this approach may not be one-size-fits-all, it provides a valuable tool for developers facing scenarios where compact and composable filtering is a priority.
As with any technical solution, it’s essential to consider the specific needs of your project and weigh the benefits against potential trade-offs. Nevertheless, the exploration of GraphQL directives for supercharging API filtering offers an insightful perspective on optimizing GraphQL schemas for improved efficiency and developer experience.
Hire me: I'm a full-stack engineer with over 10 years of experience in React, JavaScript/TypeScript, C#, PHP, Python, REST/GraphQL, XState, SQL (Postgres/MySQL), AWS, and more.
LinkedIn: https://www.linkedin.com/in/abdulkadirna/
Twitter: https://twitter.com/adulkadirna