How GraphQL can improve your development workflow

Adrien HARNAY · Published in Brigad Engineering
5 min read · Oct 24, 2019


Photo by Steven Lelham on Unsplash

There has been some discussion recently about whether GraphQL still makes sense in an HTTP/2 (and HTTP/3) world, followed by an excellent article written by Marc-André Giroux.

GraphQL isn’t inherently special, and yes alternatives will exist! Typed schema? Just use OpenAPI! Server-side abstraction to handle multiple client use cases? There are many ways to do that. Introspection? Hypermedia can allow clients to discover actions and can start from the root too. The amazing GraphiQL? I’m sure there’s something for OpenAPI. It’s always possible to recreate parts of what GraphQL has. However, it’s everything together, found under one specification that made GraphQL appealing and great to so many of us.

These words resonated the most with me and inspired me to share how GraphQL helped us improve our development workflow at Brigad, from the conception of the feature to the delivery.

Pain points with our previous workflow

Let’s start from the beginning: what was our initial situation, and why did we need to improve our workflow? As you read, see whether you can spot similarities with your own development workflow.

Our previous workflow would look like this:

Product -> back-end -> front-end -> QA

The product team would assess the problem, work on a solution, and deliver it once finished. The back-end team would then implement all the routes needed to display and edit the data, the front-end team would implement the UI and wire it to the back-end, and finally QA would test the back-end and the front-end together. When everyone is clear on the solution and communication occurs naturally, this workflow isn’t so bad.

But what happens when someone at the end of the chain discovers a flaw in the solution? Everyone must start over, or at a minimum adapt their work. We needed to find a way to work in parallel.

The pre-implementation meeting allows for parallelized work between three of the four teams

So we established a detailed pre-implementation meeting (we had one in our previous workflow, but it was not detailed enough to deserve a place on the diagram). What’s a pre-implementation meeting? Once the product research is complete and all key features have been identified, all teams gather and define a contract that will be implemented by the back-end and consumed by the front-end. The product team is free to build the UI as they like, as long as the features they use are in the contract.

Parallelized work means faster delivery, right? Yes and no. No one had actually signed anything, and since we were using JavaScript and REST at the time, any change to the back-end implementation required human communication and matching changes on the front-end, or things could break or behave incorrectly.

Anyway, wasn’t this article about GraphQL?

GraphQL, the signature to our contract

GraphQL is a query language for APIs. You define a schema describing all the operations your API supports and the data those operations return, with a type associated with every field. These types are our missing building blocks, and they allow for even more parallelization.
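To make this concrete, here is a minimal sketch of what such a schema contract might look like. The feature, type, and field names (a hypothetical "missions" feature) are purely illustrative; a real schema would be handed to a GraphQL server such as Apollo Server.

```typescript
// A minimal sketch of a schema contract. The "Mission" type and its
// fields are hypothetical examples, not Brigad's actual schema.
export const typeDefs = /* GraphQL */ `
  type Mission {
    id: ID!
    title: String!
    hourlyRate: Float!
  }

  type Query {
    # Every operation the API supports is declared here, fully typed.
    missions(limit: Int): [Mission!]!
    mission(id: ID!): Mission
  }
`;
```

Because every operation and field is typed, this document is the contract both teams agreed on in the pre-implementation meeting.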

Our new pre-implementation meeting allows testing the back-end instantly. But how?

What’s our new pre-implementation meeting like? The back-end and front-end teams collaborate to write the schema based on the product research, just like before, but this time the server validates what gets asked and what comes out against the schema. This means that if any field of the schema is not used or implemented correctly, the back-end and front-end teams will both get an error. When both teams are on the same page, communication improves.

Now that you have types and server-side validation, it is easy to write (or generate) mocks for your operations. This is the next step: write mocks covering every operation of your schema. This catches flaws in the design of your schema early, and leaves you with a fully operational server to interact with.
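A sketch of what hand-written mocks for such a contract could look like, assuming the same hypothetical "missions" schema (all names and values are illustrative). A GraphQL server, or a tool such as Apollo Server's built-in mocking, would wire these to the schema:

```typescript
// Illustrative mock data and resolvers -- one per operation of the
// hypothetical schema, so every operation can be queried before the
// real back-end exists.
interface Mission {
  id: string;
  title: string;
  hourlyRate: number;
}

const mockMissions: Mission[] = [
  { id: "1", title: "Bartender, Friday night", hourlyRate: 14.5 },
  { id: "2", title: "Waiter, Sunday brunch", hourlyRate: 13.0 },
];

export const mockResolvers = {
  Query: {
    // missions(limit): returns at most `limit` mock missions.
    missions: (limit?: number): Mission[] =>
      limit === undefined ? mockMissions : mockMissions.slice(0, limit),
    // mission(id): looks a single mock mission up by id.
    mission: (id: string): Mission | undefined =>
      mockMissions.find((m) => m.id === id),
  },
};
```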

Once the schema and mocks are written, all the teams can work in parallel, even the QA! They will write integration tests with the mocks, and once the back-end is done the mocks will be replaced by real data and the tests should still pass. And when everyone is done, the QA can finish by testing the feature as a whole.
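The key property that lets QA start early is that their tests depend only on the shape of the contract, not on who fulfils it. A small sketch of that idea, with illustrative names:

```typescript
// A QA-style check written against the contract: it asserts on the
// shape of the data, so the same test runs first against a mock
// resolver and later against the real one, unchanged.
type MissionsResolver = () => { id: string; title: string }[];

const mockMissionsResolver: MissionsResolver = () => [
  { id: "1", title: "Bartender, Friday night" },
];

// Returns true when every mission matches the agreed contract.
export function testMissionsOperation(resolver: MissionsResolver): boolean {
  const result = resolver();
  return result.every(
    (m) => typeof m.id === "string" && m.title.length > 0,
  );
}

// Runs against the mock today; swap in the real resolver tomorrow.
const passed = testMissionsOperation(mockMissionsResolver);
```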

Well, what if you don’t want to rewrite your server using GraphQL? You don’t need to! Write your schema, and proxy each operation to your existing REST API. You will end up with a gateway between your front-end and your existing REST API, powered by GraphQL and offering all of the advantages listed in this article.
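A resolver in such a gateway can be as simple as forwarding the operation to the legacy REST route. In this sketch the endpoint, field names, and the injected fetch-like function are all hypothetical, kept abstract so the example stays self-contained:

```typescript
// A GraphQL resolver that proxies to an existing REST API. The
// fetch implementation and base URL are injected; both are
// illustrative, not a specific library's API.
type FetchLike = (url: string) => Promise<{ json(): Promise<unknown> }>;

interface Mission {
  id: string;
  title: string;
}

export function makeMissionResolver(fetchImpl: FetchLike, baseUrl: string) {
  // The resolver simply forwards the lookup to the legacy REST route
  // and returns its JSON body as the GraphQL result.
  return async (id: string): Promise<Mission> => {
    const response = await fetchImpl(`${baseUrl}/missions/${id}`);
    return (await response.json()) as Mission;
  };
}
```

Clients get the typed, validated GraphQL contract, while the REST API keeps serving the data underneath.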

But wait, there’s even more

Thanks to Apollo (a library that works with GraphQL), we can generate type definitions from our schema with one command. This means that if you’re using TypeScript, Flow, Swift, Java, or any typed language, you won’t ever have to worry about the API silently breaking. One command is all it takes to regenerate the types, type check every line of code that consumes data from your API, and see a complete list of errors (if any). All of your back-end and front-end types are derived from your schema, automatically.
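To illustrate, here is roughly what a schema-derived type and a consumer look like. Apollo's code generation would emit interfaces like this automatically; this one is written by hand, with hypothetical field names, purely for illustration:

```typescript
// Sketch of a codegen-style interface derived from a query against a
// hypothetical "missions" schema. Field names are illustrative.
interface GetMissionsQuery {
  missions: { id: string; title: string; hourlyRate: number }[];
}

// Consumers are type checked against the schema-derived types: rename
// or remove a field in the schema, regenerate, and the compiler flags
// every call site that breaks -- no silent API breakage.
export function totalHourlyCost(data: GetMissionsQuery): number {
  return data.missions.reduce((sum, m) => sum + m.hourlyRate, 0);
}
```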

If you’re using React, react-apollo gives you components and hooks that are wonderfully typed and will help you manage your data layer. All this tooling makes for a great developer experience and more confidence thanks to the types.

For most companies, startups especially, iteration speed is a matter of life and death. Optimizing your feedback loop can improve your iteration speed. GraphQL is known for its benefits on network optimizations, reduced round trips and so on.

But GraphQL is about more than just network optimization: with its ecosystem, it is a great candidate for tightening that feedback loop, and will most likely reduce the back-and-forth between your teams and improve communication. At least it has for us at Brigad, and we’re pushing even further in this direction to improve the way we work together, automate more and more things, and gain even more confidence in the solutions we build.
