GraphQL Updates - October 2021

This article was written over 18 months ago and may contain information that is out of date. Some content may still be relevant, but please refer to the official documentation or other available resources for the latest information.

State of GraphQL

State of GraphQL brings together core contributors, members of the GraphQL Foundation, and community leaders to talk about the future of GraphQL, and answer audience questions.

Sponsors

Shout-out to the sponsors of the event, StepZen and Moon Highway!

  • StepZen

    One GraphQL API. All Your Data. Zero Infrastructure.

StepZen is a GraphQL-as-a-Service provider that even offers a free-forever pricing tier, which includes 300k calls/month.

  • Moon Highway

    The cutting edge JavaScript training for engineers of all skill levels

Moon Highway is a training company offering many courses centered on the JavaScript ecosystem.

Hosts

  • Tracy Lee, CEO, This Dot Labs
  • Eve Porcello, Co-founder & Instructor, Moon Highway

GraphQL Updates

The first portion of the State of GraphQL covered updates from each panelist.

The GraphQL Foundation

One of the first topics covered by State of GraphQL was The GraphQL Foundation.

The GraphQL Foundation is a neutral foundation founded by global technology and application development companies. The GraphQL Foundation encourages contributions, stewardship, and a shared investment from a broad group in vendor-neutral events, documentation, tools, and support for GraphQL.

GraphQL.org

Uri opened the conversation by emphasizing how open the foundation is. The GraphQL Foundation has always been open, but it's now easier than ever to become part of the community. The real "meat" of the GraphQL action, though, is in the GraphQL Working Group, where topics such as changes to the GraphQL spec and ideas for new features are discussed. All of the meetings are recorded and even summarized in text form!

Apollo Odyssey

Odyssey is Apollo's free interactive learning platform.

Janessa from the Apollo team introduced us to Apollo Odyssey. Odyssey is Apollo's official learning platform, created to improve the developer experience of learning GraphQL. While some people may learn effectively from docs, others may find the different learning styles offered by Odyssey more effective.

Courses range from the basics, like what queries and resolvers are, to more advanced subjects, such as Apollo Federation.
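For anyone new to those basics: a query asks for specific fields, and a resolver is just a function that returns the data for one of those fields. Here is a minimal, framework-free sketch of the idea; the schema, field names, and data are all hypothetical.

```javascript
// Hypothetical data source
const books = [
  { id: '1', title: 'The GraphQL Guide' },
  { id: '2', title: 'Learning GraphQL' },
];

// A resolver map: one function per field that needs custom logic
const resolvers = {
  Query: {
    // Resolvers receive the field arguments and return the matching data
    book: (args) => books.find((b) => b.id === args.id),
    books: () => books,
  },
};

// A real GraphQL server would parse a query like:
//   query { book(id: "1") { title } }
// and call the matching resolver for each requested field:
const result = resolvers.Query.book({ id: '1' });
console.log(result.title); // "The GraphQL Guide"
```

In a real server, a library such as graphql-js wires the parsed query to these functions automatically, but the resolver contract is the same.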

Hasura

Tanmai from Hasura spoke next about recent updates to Hasura. One of the biggest new features is support for cross-database joins over GraphQL. This update allows developers to retrieve data from any number of data stores with efficient queries. The next thing being worked on is end-to-end streaming.
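To make the cross-database join idea concrete, here is a hypothetical query sketch (the table and field names are made up): one table lives in one database, its related table in another, yet the client writes a single nested query and the GraphQL engine performs the join.

```graphql
# Hypothetical: "orders" lives in Postgres, "products" in SQL Server.
# The GraphQL layer joins them, so the client sees one graph.
query OrdersWithProducts {
  orders {
    id
    placed_at
    # Resolved from a different database by the engine
    product {
      name
      price
    }
  }
}
```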

PayPal

Joey Nenni of PayPal spoke about PayPal's experience with adopting GraphQL. He mentioned that once PayPal supported developers with tooling and sample apps, GraphQL spread like wildfire. Seeing the potential to do more with GraphQL, especially with sharing components, PayPal looked into Apollo Federation about 9 months ago, and as of about 2 weeks ago, they are now live with millions of requests already going through their gateway.

Joey also spoke about how GraphQL adoption is really about persistence. It's an iterative process: by making small steps, collecting small wins, and repeating that process, it becomes a lot easier to sell GraphQL. Joey described the search for successful GraphQL implementations as "Guerrilla Warfare".

GraphCMS

GraphCMS gives you instant GraphQL Content APIs to create, enrich, unify, and deliver your content across platforms.

GraphCMS recently hosted GraphQL Conf and raised $10 million in a Series A funding round. With all of this GraphQL-focused momentum, GraphCMS is positioning itself as a powerful headless CMS solution.

The ultimate content platform that enables you to deliver applications at scale.

Carlos Rufo

Defer and Stream

One important piece of information that Uri brought to the group's attention was that defer, stream, live queries, and other features that are in spec proposals, but not yet in the official spec, CAN BE USED SAFELY. Uri noticed a pattern in production settings: once people are given the option to use these new features, like defer, they see how valuable those features can be.
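As a sketch of what the proposed @defer directive looks like in practice (the schema and field names here are hypothetical): the server delivers the cheap fields immediately, then streams the deferred fragment in a follow-up payload.

```graphql
query ProfilePage {
  user(id: "1") {
    name            # delivered in the initial response
    ... @defer {
      # Potentially slow data arrives later, without blocking the page
      purchaseHistory {
        total
      }
    }
  }
}
```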

If you want more information about these GraphQL directives, you can check out our blog post covering them, The Defer Directive.

Apollo Federation

Implement a single graph across multiple services

Apollo Documentation

One very popular subject that is commonly brought up in Apollo conversations is Apollo Federation, and the idea of having a unified Graph connecting many services.

The PayPal team spoke about the experiences they encountered while transitioning some of their shared logic to a federated graph, all while maintaining developer experience.

Apollo itself had a very positive experience using Apollo Federation. They needed to unite two services, one written in Kotlin and one in TypeScript, under a single gateway. This was a big win that spared developers from having to learn additional languages to support a larger graph.
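A brief sketch of how federation makes that language mix possible: each subgraph declares the entities it contributes in plain SDL, so the gateway composes them without caring what language serves each one. The type and field names below are hypothetical, using the Federation 1 syntax current at the time.

```graphql
# Subgraph A (could be the Kotlin service): owns the Product entity
type Product @key(fields: "id") {
  id: ID!
  name: String!
}

# Subgraph B (could be the TypeScript service): extends Product with reviews
extend type Product @key(fields: "id") {
  id: ID! @external
  reviews: [Review!]!
}

type Review {
  id: ID!
  body: String!
}
```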

Tanmai had a balanced perspective on Apollo Federation:

Are we federating how the API comes together and presents itself? Or are we federating the execution itself?

He shared his experiences with analyzing where exactly a federated graph could be a good fit.

For Uri, the most important question is, "Are we actually making product development faster?" Adopting Apollo Federation may yield a different answer depending on the team and the work being done.

Conclusion

GraphQL is still growing rapidly, and as this State of GraphQL showed, a thriving community is growing around it. Continue the conversation, no matter your experience level, and check out the GraphQL Working Group. While this was a quick summary of what was discussed during the State of GraphQL, you can catch the whole video on YouTube.

We hope you enjoyed the State of GraphQL!

This Dot is a consultancy dedicated to guiding companies through their modernization and digital transformation journeys. Specializing in replatforming, modernizing, and launching new initiatives, we stand out by taking true ownership of your engineering projects.

We love helping teams with projects that have missed their deadlines or helping keep your strategic digital initiatives on course. Check out our case studies and our clients that trust us with their engineering.

