February 4, 2026
Dynamic GraphQL Queries at Runtime
When users choose which columns to display, your GraphQL queries can't be generated at build time. A production-grade architecture for building GraphQL queries and mutations at runtime: from schema introspection through typed field registries, query/mutation builders, and runtime validation, to clean React integration.
Sascha Becker
In an ideal world, every GraphQL query is static. You write it once, a codegen tool like graphql-codegen or gqlgen turns it into a typed hook, and you never think about the query string again.
But some applications don't live in that world.
Think of a table where users select which columns to display via checkboxes. Or an admin dashboard where each role sees different fields. Or a report builder where filters and groupings are chosen at runtime. The query doesn't exist until someone clicks.
This article walks through a clean, production-grade architecture for exactly that scenario.
Available as a Package
The architecture described in this article is now available as an npm package: @saschb2b/gql-drift. It implements the full pipeline — introspection, field registry, query/mutation builders, flatten/unflatten, Zod validation, and React integration — so you don't have to build it from scratch. The article below explains the concepts; the package gives you a ready-to-use implementation.
The Tempting Shortcut
The first instinct is string concatenation:
```ts
function buildQuery(fields: string[]) {
  return `query {
  orders {
    ${fields.join("\n    ")}
  }
}`;
}

const data = await fetchGraphQL(buildQuery(selectedFields));
// data is `any`
```
This works for a prototype. For anything beyond that, it creates problems:
- No type safety. `data` is `any`. Every access is unvalidated.
- No validation. Pass `"nonExistentField"` and you get a runtime GraphQL error.
- Injection surface. If field names come from user input without validation, you're trusting the client with query structure.
- Impossible to refactor. Rename a field in the schema and nothing in your codebase will warn you.
- No nesting. Real schemas have nested objects (`address { city state }`). String joining can't express that.
Architecture Overview
The clean approach separates concerns into layers:
Introspection → Field Registry → Query Builder / Mutation Builder → Runtime Validation → Data Layer → UI Layer
Each layer has one job:
| Layer | Responsibility |
|---|---|
| Introspection | Reads the schema to discover types, fields, nesting, scalars, and available mutations |
| Field Registry | Structured output that maps introspected fields to UI labels, GraphQL paths, and formatting types |
| Query Builder | Pure function: selected fields in, valid GraphQL query string out |
| Mutation Builder | Pure function: changed fields in, valid GraphQL mutation string out |
| Runtime Validation | Zod schema that validates API responses and user input before mutation |
| Data Layer | Transport layer: fetch wrapper or TanStack Query with proper cache keys |
| UI Layer | Checkboxes drive selection, table renders result, edits trigger mutations |
Starting from the Schema: Introspection
With static queries, codegen reads your .graphql files and generates typed hooks. With dynamic queries, there are no .graphql files. But the schema itself is still the source of truth. The first step is reading it.
The Introspection Query
Every GraphQL API supports introspection (unless explicitly disabled). You can ask it: "What types do you have? What fields does each type have?"
```ts
const INTROSPECTION_QUERY = `
  query IntrospectType($typeName: String!) {
    __type(name: $typeName) {
      name
      fields {
        name
        type {
          name
          kind
          ofType {
            name
            kind
            ofType {
              name
              kind
            }
          }
        }
      }
    }
  }
`;

async function introspectType(typeName: string) {
  const res = await fetch("/graphql", {
    method: "POST",
    headers: { "Content-Type": "application/json" },
    body: JSON.stringify({
      query: INTROSPECTION_QUERY,
      variables: { typeName },
    }),
  });
  const json = await res.json();
  return json.data.__type;
}
```
Calling introspectType("Order") returns the full field structure: field names, scalar types, nested objects, everything the schema defines.
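To make that shape concrete, here is a trimmed, hypothetical response for an Order type, along with the unwrapping step the parser needs. The nested `ofType` wrappers are how GraphQL encodes NON_NULL and LIST; the field data here is invented for illustration:

```ts
interface IntrospectionType {
  name: string | null;
  kind: string;
  ofType?: IntrospectionType;
}

// Trimmed, hypothetical fields from `__type(name: "Order")`
const orderFields: { name: string; type: IntrospectionType }[] = [
  // orderNumber: String! → NON_NULL wrapping a SCALAR
  {
    name: "orderNumber",
    type: { name: null, kind: "NON_NULL", ofType: { name: "String", kind: "SCALAR" } },
  },
  // shippingAddress: Address → a plain OBJECT, introspected recursively later
  {
    name: "shippingAddress",
    type: { name: "Address", kind: "OBJECT" },
  },
];

// Peel NON_NULL/LIST wrappers to reach the underlying named type
function unwrap(t: IntrospectionType): IntrospectionType {
  while (t.kind === "NON_NULL" || t.kind === "LIST") t = t.ofType!;
  return t;
}

const underlying = unwrap(orderFields[0].type);
// underlying.name === "String", underlying.kind === "SCALAR"
```

Everything downstream works on the unwrapped type: nullability matters for validation, but not for deciding whether a field is a scalar, enum, or nested object.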
Parsing Introspection into a Field Registry
The raw introspection response is verbose. The next step is transforming it into the flat FieldDefinition[] structure that the rest of the pipeline works with:
```ts
// --- Types for introspection response ---
interface IntrospectionType {
  name: string | null;
  kind: string;
  ofType?: IntrospectionType;
}

interface IntrospectionField {
  name: string;
  type: IntrospectionType;
}

interface IntrospectionResult {
  name: string;
  fields: IntrospectionField[];
}

// --- Field definition used throughout the pipeline ---
interface FieldDefinition {
  key: string;
  label: string;
  graphqlPath: string;
  type: "string" | "number" | "date" | "boolean" | "enum";
  enumValues?: string[]; // ["PENDING", "SHIPPED", "DELIVERED"]
}

// --- Helpers ---
function capitalize(s: string): string {
  return s.charAt(0).toUpperCase() + s.slice(1);
}

function formatLabel(fieldName: string): string {
  // camelCase → "Camel Case"
  return fieldName
    .replace(/([a-z])([A-Z])/g, "$1 $2")
    .replace(/^./, (s) => s.toUpperCase());
}

// Unwrap NON_NULL and LIST wrappers to get the underlying type
function unwrapType(t: IntrospectionType): IntrospectionType {
  while (t.kind === "NON_NULL" || t.kind === "LIST") {
    t = t.ofType!;
  }
  return t;
}

// Map GraphQL scalar names to our simplified type system
const SCALAR_MAP: Record<string, FieldDefinition["type"]> = {
  String: "string",
  Int: "number",
  Float: "number",
  Boolean: "boolean",
  DateTime: "date",
  ID: "string",
};

// --- Registry builder ---
function buildFieldRegistry(
  introspection: IntrospectionResult,
  prefix = "",
  pathPrefix = "",
): FieldDefinition[] {
  const fields: FieldDefinition[] = [];

  for (const field of introspection.fields) {
    if (field.name === "id") continue; // id is always included automatically

    const unwrapped = unwrapType(field.type);
    const graphqlPath = pathPrefix ? `${pathPrefix}.${field.name}` : field.name;
    const key = prefix ? `${prefix}${capitalize(field.name)}` : field.name;

    if (unwrapped.kind === "SCALAR") {
      const mappedType = SCALAR_MAP[unwrapped.name!];
      if (mappedType) {
        fields.push({
          key,
          label: formatLabel(field.name),
          graphqlPath,
          type: mappedType,
        });
      }
    } else if (unwrapped.kind === "ENUM") {
      fields.push({
        key,
        label: formatLabel(field.name),
        graphqlPath,
        type: "enum",
        enumValues: [], // populated via enum introspection
      });
    } else if (unwrapped.kind === "OBJECT") {
      // Nested object: handled via recursive introspection (see below)
    }
  }

  return fields;
}
```
Build-Time vs Runtime Introspection
You have two options for when to introspect:
- Build-time script: run introspection during CI/build, generate a static `fieldRegistry.ts` file. This is the closest to the codegen workflow and gives you compile-time guarantees. Best for schemas that change infrequently.
- Runtime introspection: call the introspection query when the app loads (or when a specific page loads). Best for multi-tenant apps where each tenant might have a different schema, or when the schema changes frequently without redeployments.
Build-Time Generation Script
For most projects, a build-time script is the better choice. It runs once, outputs a typed file, and you get IDE autocompletion:
```ts
// scripts/generate-field-registry.ts
import { writeFileSync } from "fs";

async function main() {
  const type = await introspectType("Order");
  const fields = buildFieldRegistry(type);

  const output = `// AUTO-GENERATED - do not edit manually
// Run: npx tsx scripts/generate-field-registry.ts
import type { FieldDefinition } from "../types";

export const ORDER_FIELDS: FieldDefinition[] = ${JSON.stringify(fields, null, 2)};
`;

  writeFileSync("src/generated/orderFields.ts", output);
  console.log(`Generated ${fields.length} field definitions for Order`);
}

main();
```
Add it to your package.json:
```json
{
  "scripts": {
    "generate:fields": "tsx scripts/generate-field-registry.ts"
  }
}
```
Now pnpm generate:fields produces a typed field registry from your live schema, similar to how gqlgen generate works for static queries. The difference is that this registry feeds a runtime query builder instead of static typed hooks.
Handling Nested Types
Real schemas have nesting. The introspection for Order might reveal that shippingAddress is an OBJECT type with its own fields. The registry generator handles this by recursing one level deep:
```ts
// During introspection parsing, when we hit an OBJECT field:
if (nestedTypeName) {
  const nestedType = await introspectType(nestedTypeName);
  const nestedFields = buildFieldRegistry(
    nestedType,
    field.name, // prefix: "shippingAddress" → keys like "shippingAddressCity"
    field.name, // pathPrefix: "shippingAddress" → paths like "shippingAddress.city"
  );
  fields.push(...nestedFields);
}
```
This flattens shippingAddress.city into a single FieldDefinition with key: "shippingAddressCity" and graphqlPath: "shippingAddress.city". The UI sees flat checkboxes; the query builder reconstructs the nesting.
Limit Recursion Depth
Don't recurse indefinitely into nested objects. One level deep covers most use cases. Deeper nesting usually means the user needs a different UI (a tree picker, a sub-table) rather than flat checkboxes.
The Field Registry
After introspection and generation, you have a FieldDefinition[], the structured output that drives everything downstream.
Here's what the generated registry looks like for an Order type:
```ts
const ORDER_FIELDS: FieldDefinition[] = [
  { key: "orderNumber", label: "Order Number", graphqlPath: "orderNumber", type: "string" },
  { key: "customerName", label: "Customer Name", graphqlPath: "customerName", type: "string" },
  { key: "status", label: "Status", graphqlPath: "status", type: "string" },
  { key: "total", label: "Total", graphqlPath: "total", type: "number" },
  { key: "currency", label: "Currency", graphqlPath: "currency", type: "string" },
  { key: "createdAt", label: "Created At", graphqlPath: "createdAt", type: "date" },
  { key: "shippingAddressCity", label: "City", graphqlPath: "shippingAddress.city", type: "string" },
  { key: "shippingAddressCountry", label: "Country", graphqlPath: "shippingAddress.country", type: "string" },
];
```
Enriching the Generated Registry
The auto-generated labels from formatLabel are decent but not always ideal ("Customer Name" is fine, "Created At" might need to be "Created"). You can add an override layer:
```ts
const LABEL_OVERRIDES: Partial<Record<string, string>> = {
  orderNumber: "Order #",
  createdAt: "Created",
  shippingAddressCity: "Ship. City",
  shippingAddressCountry: "Ship. Country",
};

const ORDER_FIELDS_ENRICHED = ORDER_FIELDS.map((f) => ({
  ...f,
  label: LABEL_OVERRIDES[f.key] ?? f.label,
}));
```
This keeps the generated file untouched (re-runnable) while giving you control over the UI labels.
Why Not Just keyof Order?
Because the registry does more than list field names. It maps flat UI keys ("shippingAddressCity") to nested GraphQL paths ("shippingAddress.city"), defines display labels, and declares types for formatting. A plain keyof can't express that, and with introspection-based generation, you don't need to maintain it by hand.
The Query Builder
The query builder is a pure function. It takes a list of field definitions and returns a valid GraphQL query string. The key challenge is handling nested fields.
```ts
function buildOrderQuery(fields: FieldDefinition[]): string {
  // Always include id
  const paths = ["id", ...fields.map((f) => f.graphqlPath)];

  // Group nested paths: "shippingAddress.city" → { shippingAddress: ["city"] }
  const roots: string[] = [];
  const nested = new Map<string, string[]>();

  for (const path of paths) {
    const dot = path.indexOf(".");
    if (dot === -1) {
      roots.push(path);
    } else {
      const parent = path.slice(0, dot);
      const child = path.slice(dot + 1);
      if (!nested.has(parent)) nested.set(parent, []);
      nested.get(parent)!.push(child);
    }
  }

  // Build the selection set
  const selections = [
    ...roots,
    ...[...nested.entries()].map(
      ([parent, children]) => `${parent} { ${children.join(" ")} }`,
    ),
  ];

  return `query GetOrders($filter: OrderFilter) {
  orders(filter: $filter) {
    ${selections.join("\n    ")}
  }
}`;
}
```
For the selection ["orderNumber", "customerName", "shippingAddressCity", "shippingAddressCountry"], this produces:
```graphql
query GetOrders($filter: OrderFilter) {
  orders(filter: $filter) {
    id
    orderNumber
    customerName
    shippingAddress { city country }
  }
}
```
Keep It Pure
The query builder has no side effects, no state, no dependencies. This makes it trivially testable: pass fields in, assert the string out. Snapshot tests work well here.
Deeper Nesting
If your schema has deeper nesting (e.g. shippingAddress.coordinates.lat), you can make the builder recursive. For most applications, one level of nesting is enough. Resist the urge to build a general-purpose query AST unless you actually need it.
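If you do need it, the grouping step generalizes without a full AST: treat the dot paths as a tree and render the tree recursively. A minimal sketch under that assumption (the helper names are mine, not part of the article's builder):

```ts
// Build a selection-set string from dot paths of arbitrary depth,
// e.g. ["id", "shippingAddress.coordinates.lat"] →
// "id shippingAddress { coordinates { lat } }"
type PathTree = Map<string, PathTree>;

function toTree(paths: string[]): PathTree {
  const root: PathTree = new Map();
  for (const path of paths) {
    let node = root;
    for (const part of path.split(".")) {
      if (!node.has(part)) node.set(part, new Map());
      node = node.get(part)!;
    }
  }
  return root;
}

function render(tree: PathTree): string {
  return [...tree.entries()]
    .map(([name, children]) =>
      children.size === 0 ? name : `${name} { ${render(children)} }`,
    )
    .join(" ");
}

const selection = render(
  toTree(["id", "shippingAddress.coordinates.lat", "shippingAddress.city"]),
);
// "id shippingAddress { coordinates { lat } city }"
```

The one-level grouping in `buildOrderQuery` is just this tree capped at depth two, which is why upgrading later is cheap.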
Typing the Result
This is where TypeScript's limits become visible. The query shape depends on runtime selection, so you can't have full compile-time narrowing. But you have options along a spectrum.
Option 1: Partial<Order>, Simple and Honest
```ts
type OrderQueryResult = Pick<Order, "id"> & Partial<Omit<Order, "id">>;
```
Every field except id might be undefined. This forces you to handle the absence, which is correct. You genuinely don't know at compile time which fields were selected.
Option 2: Generic Selection Type
When the selection is known at the call site:
```ts
type DynamicResult<T, K extends keyof T> = Pick<T, "id" & keyof T> & Pick<T, K>;

// If you know the selection at the call site:
const fields = ["orderNumber", "status"] as const;
type Result = DynamicResult<Order, (typeof fields)[number]>;
// = { id: string; orderNumber: string; status: "pending" | "shipped" | ... }
```
This works in controlled scenarios (e.g. presets, saved views). It doesn't help when the selection comes from user interaction at runtime.
Option 3: Runtime Validation, The Real Safety Net
Types Disappear at Runtime
TypeScript types are erased at compile time. For dynamic queries where the shape is user-determined, runtime validation is your actual safety net. Types are developer convenience on top.
Build a Zod schema dynamically from the same field registry. Since the registry uses flat keys (shippingAddressCity) while the API response is nested (shippingAddress: { city: "..." }), validate after flattening the response so the schema matches the shape your UI actually consumes:
```ts
import { z } from "zod";

const FIELD_VALIDATORS: Record<FieldDefinition["type"], z.ZodTypeAny> = {
  string: z.string(),
  number: z.number(),
  date: z.string(),
  boolean: z.boolean(),
  enum: z.string(), // overridden with z.enum() when enumValues are available
};

function buildResultSchema(fields: FieldDefinition[]) {
  const shape: Record<string, z.ZodTypeAny> = { id: z.string() };
  for (const field of fields) {
    shape[field.key] = FIELD_VALIDATORS[field.type];
  }
  return z.object(shape);
}

// Usage: flatten first, then validate
const flatRows = rawData.orders.map((order) => flattenOrder(order, selectedFields));
const schema = z.array(buildResultSchema(selectedFields));
const validated = schema.parse(flatRows); // throws if shape doesn't match
```
This gives you real runtime guarantees. If the API returns something unexpected (missing field, wrong type), you catch it immediately after flattening instead of crashing somewhere in the table rendering.
The Data Layer
Simple Fetch Wrapper
For a straightforward setup with use() + Suspense (see the use() hook article):
```ts
const queryCache = new Map<string, Promise<unknown>>();

function fetchOrders(fields: FieldDefinition[], filter?: OrderFilter) {
  const cacheKey =
    fields.map((f) => f.key).toSorted().join(",") + "|" + JSON.stringify(filter);

  if (!queryCache.has(cacheKey)) {
    queryCache.set(
      cacheKey,
      fetch("/graphql", {
        method: "POST",
        headers: { "Content-Type": "application/json" },
        body: JSON.stringify({
          query: buildOrderQuery(fields),
          variables: { filter },
        }),
      })
        .then((r) => r.json())
        .then((json) => {
          const rows = json.data.orders.map((order: Record<string, unknown>) =>
            flattenOrder(order, fields),
          );
          return z.array(buildResultSchema(fields)).parse(rows);
        }),
    );
  }

  return queryCache.get(cacheKey)!;
}
```
TanStack Query Integration
When you need cache invalidation, background refetching, or pagination:
```ts
function useOrders(fields: FieldDefinition[], filter?: OrderFilter) {
  const fieldKeys = fields.map((f) => f.key).toSorted();

  return useQuery({
    queryKey: ["orders", fieldKeys, filter],
    queryFn: async () => {
      const res = await fetch("/graphql", {
        method: "POST",
        headers: { "Content-Type": "application/json" },
        body: JSON.stringify({
          query: buildOrderQuery(fields),
          variables: { filter },
        }),
      });
      const json = await res.json();
      const rows = json.data.orders.map((order: Record<string, unknown>) =>
        flattenOrder(order, fields),
      );
      return z.array(buildResultSchema(fields)).parse(rows);
    },
    enabled: fields.length > 0,
  });
}
```
Cache Key Must Include the Selection
If the queryKey doesn't include the selected fields, switching from 3 columns to 5 columns will show stale 3-column data until the new query finishes. The sorted field list in the key ensures each unique selection gets its own cache entry.
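A quick sanity check of why the sort matters: the same selection toggled in a different click order would otherwise produce a different key and a spurious cache miss. Plain `JSON.stringify` stands in for TanStack Query's key hashing here, which is likewise order-sensitive for arrays:

```ts
// Two selections with the same fields, toggled in different order
const a = ["status", "orderNumber", "total"];
const b = ["total", "status", "orderNumber"];

const keyOf = (fields: string[]) => JSON.stringify(["orders", [...fields].sort()]);

// Sorted: identical keys → one cache entry, as intended
console.log(keyOf(a) === keyOf(b)); // true

// Unsorted: different keys for the same logical selection
console.log(JSON.stringify(a) === JSON.stringify(b)); // false
```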
Putting It All Together
Here's the complete flow, from checkboxes to table:
```tsx
"use client";

import { useState, useMemo } from "react";
import { useQuery } from "@tanstack/react-query";

// --- Types ---
interface FieldDefinition {
  key: string;
  label: string;
  graphqlPath: string;
  type: "string" | "number" | "date" | "boolean" | "enum";
}

// --- Field Registry ---
const ORDER_FIELDS: FieldDefinition[] = [
  { key: "orderNumber", label: "Order #", graphqlPath: "orderNumber", type: "string" },
  { key: "customerName", label: "Customer", graphqlPath: "customerName", type: "string" },
  { key: "status", label: "Status", graphqlPath: "status", type: "string" },
  { key: "total", label: "Total", graphqlPath: "total", type: "number" },
  { key: "currency", label: "Currency", graphqlPath: "currency", type: "string" },
  { key: "createdAt", label: "Created", graphqlPath: "createdAt", type: "date" },
  { key: "shippingAddressCity", label: "City", graphqlPath: "shippingAddress.city", type: "string" },
  { key: "shippingAddressCountry", label: "Country", graphqlPath: "shippingAddress.country", type: "string" },
];

// --- Query Builder ---
function buildOrderQuery(fields: FieldDefinition[]): string {
  const paths = ["id", ...fields.map((f) => f.graphqlPath)];
  const roots: string[] = [];
  const nested = new Map<string, string[]>();

  for (const path of paths) {
    const dot = path.indexOf(".");
    if (dot === -1) {
      roots.push(path);
    } else {
      const parent = path.slice(0, dot);
      const child = path.slice(dot + 1);
      if (!nested.has(parent)) nested.set(parent, []);
      nested.get(parent)!.push(child);
    }
  }

  const selections = [
    ...roots,
    ...[...nested.entries()].map(
      ([parent, children]) => `${parent} { ${children.join(" ")} }`,
    ),
  ];

  return `query GetOrders($filter: OrderFilter) {
  orders(filter: $filter) {
    ${selections.join("\n    ")}
  }
}`;
}

// --- Flattener (nested response → flat row) ---
function flattenOrder(order: Record<string, unknown>, fields: FieldDefinition[]) {
  const row: Record<string, unknown> = { id: order.id };
  for (const field of fields) {
    const parts = field.graphqlPath.split(".");
    let value: unknown = order;
    for (const part of parts) {
      value = (value as Record<string, unknown>)?.[part];
    }
    row[field.key] = value;
  }
  return row;
}

// --- Component ---
export function OrderTable() {
  const [selectedKeys, setSelectedKeys] = useState<Set<string>>(
    new Set(["orderNumber", "customerName", "status"]),
  );

  const selectedFields = useMemo(
    () => ORDER_FIELDS.filter((f) => selectedKeys.has(f.key)),
    [selectedKeys],
  );

  const { data, isLoading, error } = useQuery({
    queryKey: ["orders", [...selectedKeys].sort()],
    queryFn: async () => {
      const res = await fetch("/graphql", {
        method: "POST",
        headers: { "Content-Type": "application/json" },
        body: JSON.stringify({
          query: buildOrderQuery(selectedFields),
        }),
      });
      return res.json();
    },
    enabled: selectedKeys.size > 0,
  });

  const rows = useMemo(() => {
    if (!data?.data?.orders) return [];
    return data.data.orders.map((order: Record<string, unknown>) =>
      flattenOrder(order, selectedFields),
    );
  }, [data, selectedFields]);

  const toggleField = (key: string) => {
    setSelectedKeys((prev) => {
      const next = new Set(prev);
      if (next.has(key)) next.delete(key);
      else next.add(key);
      return next;
    });
  };

  return (
    <div>
      {/* Column Selector */}
      <fieldset>
        <legend>Columns</legend>
        {ORDER_FIELDS.map((field) => (
          <label key={field.key} style={{ marginRight: 16 }}>
            <input
              type="checkbox"
              checked={selectedKeys.has(field.key)}
              onChange={() => toggleField(field.key)}
            />
            {field.label}
          </label>
        ))}
      </fieldset>

      {/* Data Table */}
      {error && <div>Error: {error.message}</div>}
      {isLoading && <div>Loading...</div>}
      {!isLoading && !error && (
        <table>
          <thead>
            <tr>
              {selectedFields.map((f) => (
                <th key={f.key}>{f.label}</th>
              ))}
            </tr>
          </thead>
          <tbody>
            {rows.map((row: Record<string, unknown>) => (
              <tr key={row.id as string}>
                {selectedFields.map((f) => (
                  <td key={f.key}>{String(row[f.key] ?? "")}</td>
                ))}
              </tr>
            ))}
          </tbody>
        </table>
      )}
    </div>
  );
}
```
The data flow:
- User toggles checkboxes → `selectedKeys` state updates
- `selectedFields` is derived via `useMemo` filtering the registry
- `useQuery` fires with the sorted field keys in the cache key
- `buildOrderQuery` produces the GraphQL string from the selected fields
- The response is flattened (nested paths → flat row keys) for the table
- The table renders only the selected columns
Dynamic Mutations
So far the pipeline only reads data. But if the user can see an order's status in a dynamic table, the next question is: can they edit it?
The same architecture that builds queries can build mutations. The additions are: discovering which mutation to call, knowing which fields accept writes, and building the mutation string + variables from the user's changes.
Discovering Mutations via Naming Convention
Most GraphQL APIs follow a predictable naming convention:
| Query type | Mutation | Input type |
|---|---|---|
| Order | updateOrder | UpdateOrderInput |
| Customer | updateCustomer | UpdateCustomerInput |
| Product | updateProduct | UpdateProductInput |
You can formalize this with a helper and then validate it against the schema:
```ts
type MutationOperation = "update" | "create" | "delete";

function getMutationName(typeName: string, operation: MutationOperation): string {
  return `${operation}${typeName}`;
}

function getInputTypeName(typeName: string, operation: MutationOperation): string {
  return `${operation.charAt(0).toUpperCase() + operation.slice(1)}${typeName}Input`;
}

// getMutationName("Order", "update") → "updateOrder"
// getInputTypeName("Order", "update") → "UpdateOrderInput"
```
To confirm the mutation actually exists, introspect the Mutation root type:
```ts
async function discoverMutations(typeName: string) {
  const mutationRoot = await introspectType("Mutation");
  const available = new Map<MutationOperation, string>();

  for (const op of ["update", "create", "delete"] as const) {
    const name = getMutationName(typeName, op);
    if (mutationRoot.fields.some((f: IntrospectionField) => f.name === name)) {
      available.set(op, name);
    }
  }

  return available;
}

// discoverMutations("Order")
// → Map { "update" → "updateOrder", "delete" → "deleteOrder" }
// (no "createOrder" if the API doesn't expose it)
```
Validate at Build Time
Add mutation discovery to your generate:fields script. If the convention breaks (someone names it modifyOrder instead of updateOrder), you catch it during generation, not when a user clicks save.
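A sketch of what that generation-time guard could look like. `assertMutationExists` is a hypothetical helper, not part of the article's script; it just checks the Mutation root's field names against the convention and fails loudly:

```ts
// Hypothetical generation-time guard: fail the build if the naming
// convention doesn't hold for a type we expect to be mutable.
function assertMutationExists(
  mutationFieldNames: string[],
  typeName: string,
  operation: "update" | "create" | "delete",
): void {
  const expected = `${operation}${typeName}`; // convention: updateOrder, etc.
  if (!mutationFieldNames.includes(expected)) {
    throw new Error(
      `Schema has no "${expected}" mutation for ${typeName}. ` +
        `Did someone break the naming convention (e.g. "modifyOrder")?`,
    );
  }
}

// Example with a hypothetical Mutation root field list:
const mutationFields = ["updateOrder", "deleteOrder", "modifyCustomer"];
assertMutationExists(mutationFields, "Order", "update"); // passes
// assertMutationExists(mutationFields, "Customer", "update"); // would throw
```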
Introspecting the Input Type
The update mutation typically takes an input type that mirrors the query fields, minus computed and read-only fields:
```ts
async function buildInputRegistry(typeName: string): Promise<FieldDefinition[]> {
  const inputTypeName = getInputTypeName(typeName, "update");
  const inputType = await introspectType(inputTypeName);

  if (!inputType) {
    throw new Error(`Input type ${inputTypeName} not found in schema`);
  }

  return buildFieldRegistry(inputType);
}
```
The result tells you exactly which fields are writable. If Order has 12 query fields but UpdateOrderInput only has 6, the other 6 are read-only (computed totals, timestamps, etc.). Your edit UI only renders inputs for fields that appear in both the query registry and the input registry:
```ts
const queryFields = ORDER_FIELDS;
const inputFields = await buildInputRegistry("Order");
const editableKeys = new Set(inputFields.map((f) => f.key));

const editableQueryFields = queryFields.filter((f) => editableKeys.has(f.key));
// Only these fields get an "edit" affordance in the UI
```
The Mutation Builder
Mirrors buildOrderQuery, but wraps the field selection in a mutation with an $input variable. The return selection reuses the query fields so the cache can update with fresh data:
```ts
function buildUpdateMutation(
  typeName: string,
  inputTypeName: string,
  returnFields: FieldDefinition[],
): string {
  const mutationName = getMutationName(typeName, "update");

  // Reuse the same grouping logic as the query builder for return fields
  const paths = ["id", ...returnFields.map((f) => f.graphqlPath)];
  const roots: string[] = [];
  const nested = new Map<string, string[]>();

  for (const path of paths) {
    const dot = path.indexOf(".");
    if (dot === -1) {
      roots.push(path);
    } else {
      const parent = path.slice(0, dot);
      const child = path.slice(dot + 1);
      if (!nested.has(parent)) nested.set(parent, []);
      nested.get(parent)!.push(child);
    }
  }

  const selections = [
    ...roots,
    ...[...nested.entries()].map(
      ([parent, children]) => `${parent} { ${children.join(" ")} }`,
    ),
  ];

  return `mutation ${mutationName.charAt(0).toUpperCase() + mutationName.slice(1)}($id: ID!, $input: ${inputTypeName}!) {
  ${mutationName}(id: $id, input: $input) {
    ${selections.join("\n    ")}
  }
}`;
}
```
For an order update with ["status", "shippingAddressCity"] selected, this produces:
```graphql
mutation UpdateOrder($id: ID!, $input: UpdateOrderInput!) {
  updateOrder(id: $id, input: $input) {
    id
    status
    shippingAddress { city }
  }
}
```
Building Input Variables
The query pipeline flattens nested responses into flat keys (shippingAddress.city → shippingAddressCity). The mutation pipeline needs the reverse: unflatten edited values back into the nested structure the API expects:
```ts
function unflattenInput(
  flatData: Record<string, unknown>,
  fields: FieldDefinition[],
): Record<string, unknown> {
  const result: Record<string, unknown> = {};

  for (const field of fields) {
    if (!(field.key in flatData)) continue;

    const dot = field.graphqlPath.indexOf(".");
    if (dot === -1) {
      // Top-level field
      result[field.graphqlPath] = flatData[field.key];
    } else {
      // Nested field: reconstruct the object
      const parent = field.graphqlPath.slice(0, dot);
      const child = field.graphqlPath.slice(dot + 1);
      if (!result[parent]) result[parent] = {};
      (result[parent] as Record<string, unknown>)[child] = flatData[field.key];
    }
  }

  return result;
}

// unflattenInput(
//   { status: "shipped", shippingAddressCity: "Berlin" },
//   selectedFields,
// )
// → { status: "shipped", shippingAddress: { city: "Berlin" } }
```
Validating Input Before Sending
Reuse the same Zod approach from the query side, but this time validate the input before sending rather than the response after receiving:
```ts
function buildInputSchema(fields: FieldDefinition[]) {
  const shape: Record<string, z.ZodTypeAny> = {};
  for (const field of fields) {
    shape[field.key] = FIELD_VALIDATORS[field.type];
  }
  return z.object(shape);
}

// Validate what the user entered before building the mutation variables
const inputSchema = buildInputSchema(editableFields);
const parsed = inputSchema.parse(dirtyValues); // throws on invalid input
const variables = unflattenInput(parsed, editableFields);
```
This catches type mismatches (user typed "abc" into a number field) before the request ever leaves the client.
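To show what the schema is doing under the hood, the same guard can be sketched without Zod: a hand-rolled check of each dirty value against the registry's type tags. This is a deliberately simplified stand-in (the real pipeline uses `z.object(...).parse`); `checkInput` and `fieldTypes` are illustrative names:

```ts
type FieldType = "string" | "number" | "boolean";

// Minimal stand-in for the Zod input schema: check each dirty value
// against the registry's declared type before building variables.
function checkInput(
  values: Record<string, unknown>,
  types: Record<string, FieldType>,
): string[] {
  const errors: string[] = [];
  for (const [key, value] of Object.entries(values)) {
    const expected = types[key];
    if (!expected) {
      errors.push(`${key}: not an editable field`);
    } else if (typeof value !== expected) {
      errors.push(`${key}: expected ${expected}, got ${typeof value}`);
    }
  }
  return errors;
}

const fieldTypes: Record<string, FieldType> = { status: "string", total: "number" };

console.log(checkInput({ status: "shipped", total: 42 }, fieldTypes)); // []
console.log(checkInput({ total: "abc" }, fieldTypes)); // ["total: expected number, got string"]
```

Zod adds coercion, enum narrowing, and structured error reporting on top, which is why the article reaches for it instead of a check like this.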
Wiring It Up
With TanStack Query's useMutation, the integration looks like this:
```ts
function useUpdateOrder(fields: FieldDefinition[]) {
  const queryClient = useQueryClient();
  const inputTypeName = getInputTypeName("Order", "update");
  const editableFields = fields.filter((f) => editableKeys.has(f.key));

  return useMutation({
    mutationFn: async ({ id, values }: { id: string; values: Record<string, unknown> }) => {
      // 1. Validate
      const parsed = buildInputSchema(editableFields).parse(values);
      // 2. Unflatten
      const input = unflattenInput(parsed, editableFields);
      // 3. Build & send
      const res = await fetch("/graphql", {
        method: "POST",
        headers: { "Content-Type": "application/json" },
        body: JSON.stringify({
          query: buildUpdateMutation("Order", inputTypeName, fields),
          variables: { id, input },
        }),
      });
      return res.json();
    },
    onSuccess: () => {
      // Invalidate the query cache so the table refetches with updated data
      queryClient.invalidateQueries({ queryKey: ["orders"] });
    },
  });
}
```
The component tracks which fields the user changed (dirty state), then calls the mutation with only those values:
```tsx
const updateOrder = useUpdateOrder(selectedFields);

// On save:
updateOrder.mutate({
  id: row.id,
  values: dirtyValues, // only the fields the user actually edited
});
```
The Full Pipeline: Read and Write
The read and write paths are symmetric:
Read: Registry → Query Builder → fetch → flatten → validate → table
Write: dirty fields → validate → unflatten → Mutation Builder → fetch → invalidate cache
Both paths share the same FieldDefinition[], the same naming conventions, and the same validation approach. The registry is the single source of truth for both directions.
Create and Delete Follow the Same Pattern
This section focuses on update as the representative case. create works identically: introspect CreateOrderInput, build with buildCreateMutation, use unflattenInput. Delete is even simpler: it only needs the id, no input introspection required. The naming convention (createOrder, deleteOrder) and discovery via discoverMutations handle all three.
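For illustration, a delete builder under the same convention is nearly a one-liner. `buildDeleteMutation` is my name for it, following the article's pattern rather than quoting its implementation:

```ts
// Delete needs no input type and no return selection beyond the id.
// Follows the same naming convention as the update builder.
function buildDeleteMutation(typeName: string): string {
  const mutationName = `delete${typeName}`; // e.g. deleteOrder
  return `mutation Delete${typeName}($id: ID!) {
  ${mutationName}(id: $id) {
    id
  }
}`;
}

console.log(buildDeleteMutation("Order"));
```

Returning the `id` lets the cache layer remove the deleted row without a full refetch, though invalidating the `["orders"]` key works too.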
Consider Fetching Everything Instead
Before committing to dynamic queries, ask whether you actually need them.
If your object is a flat or shallow type with under ~30 scalar fields and no heavy text/binary blobs, a static query that fetches everything is simpler:
```ts
// Static, fully generated, fully typed
const { data } = useGetOrdersQuery();

// Checkboxes only control which columns render, not what's fetched
const visibleColumns = ORDER_FIELDS.filter((f) => selectedKeys.has(f.key));
```
This gives you full codegen types, zero query-building code, and simpler caching. The checkboxes become a pure UI concern.
When "Everything" Is Undefined
This approach breaks down the moment your schema has recursive or circular types:
```graphql
type User {
  id: ID!
  name: String!
  friends: [User!]!   # User → User
  manager: User       # User → User again
  department: Department!
}

type Department {
  id: ID!
  name: String!
  members: [User!]!   # Department → User → Department → ...
}
```
User.friends returns [User]. Each friend has their own friends. There is no "all fields". The graph is infinite. A naive "fetch everything" query would recurse until the server hits its depth limit and errors out.
This is a fundamental property of GraphQL: the G stands for Graph, and graphs have cycles. Any type that references itself (directly or through other types) makes the "fetch everything" shortcut impossible.
Recursive Types Require Explicit Depth
You can't introspect your way out of this. The schema tells you that User.friends returns [User], but it doesn't tell you how deep to go. That's a product decision, not a schema decision. Dynamic queries with explicit depth limits (the approach in this article) are the only clean solution for recursive types.
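A sketch of encoding that product decision: an explicit depth parameter that terminates the recursion. `buildFriendsSelection` is a hypothetical helper for the `User.friends` example above, not part of the article's pipeline:

```ts
// Build a selection for a self-referential field with an explicit cutoff.
// At depth 0 the recursion stops and only scalar fields are selected.
function buildFriendsSelection(depth: number): string {
  const scalars = "id name";
  if (depth <= 0) return scalars;
  return `${scalars} friends { ${buildFriendsSelection(depth - 1)} }`;
}

console.log(buildFriendsSelection(2));
// "id name friends { id name friends { id name } }"
```

The depth value itself ("show friends of friends, no further") comes from the product requirement, and should stay under whatever depth limit the server enforces.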
Dynamic queries earn their complexity when:
- The type graph has cycles: recursive or mutually recursive types where "all fields" is undefined
- Objects have 50+ fields or include large text/binary fields
- Each field triggers expensive resolver work (separate DB joins, external API calls)
- Bandwidth is constrained (mobile, metered connections)
- Field-level authorization means different users see different schemas
- The API bills per field or per resolver hit
If your types are flat, none of these apply, and you can define every query at build time: fetch everything and filter the display. For everything else, you need the dynamic pipeline.
Hardening for Production
Query Complexity Limits
Don't let users select 200 fields:
```ts
const MAX_FIELDS = 20;

const selectedFields = fields.slice(0, MAX_FIELDS);
if (fields.length > MAX_FIELDS) {
  console.warn(`Field selection capped at ${MAX_FIELDS}`);
}
```
Field-Level Authorization
Some fields might not be available to all users. Filter the registry before rendering checkboxes:
```ts
const visibleFields = ORDER_FIELDS.filter((f) =>
  userPermissions.allowedFields.includes(f.key),
);
```
This way unauthorized fields never appear in the UI and never end up in the query.
Error Handling
Wrap the data table in an error boundary. GraphQL partial errors (some fields resolve, some don't) are common with dynamic queries. Your UI should handle them gracefully rather than crashing.
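A sketch of handling the partial-error case at the fetch layer. Per the GraphQL spec's response format, a response can carry both `data` and `errors` on an HTTP 200, so checking `res.ok` isn't enough; `handleResponse` is an illustrative helper:

```ts
interface GraphQLResponse<T> {
  data?: T | null;
  errors?: { message: string; path?: (string | number)[] }[];
}

// Decide what to do with a response that may carry partial data.
function handleResponse<T>(json: GraphQLResponse<T>): { data: T; warnings: string[] } {
  // No data at all → hard failure, surface to the error boundary
  if (json.data == null) {
    throw new Error(json.errors?.map((e) => e.message).join("; ") ?? "Empty response");
  }
  // Data plus errors → partial success: render what resolved, log the rest
  const warnings = (json.errors ?? []).map(
    (e) => `${e.path?.join(".") ?? "?"}: ${e.message}`,
  );
  return { data: json.data, warnings };
}

const partial = handleResponse({
  data: { orders: [{ id: "1", total: null }] },
  errors: [{ message: "Not authorized", path: ["orders", 0, "total"] }],
});
// partial.warnings → ["orders.0.total: Not authorized"]
```

With dynamic field selection, per-path warnings are especially useful: they tell you which selected column failed, so the UI can gray out that column instead of discarding the whole result.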
Testing
Three layers of testing make this architecture reliable:
- Query builder snapshots: assert that a given field selection produces the expected GraphQL string. These catch regressions when you refactor the builder.
- Schema validation tests: if you have access to the GraphQL schema (via introspection or a local schema file), validate that every field in your registry exists in the schema. This catches drift between your registry and the API.
- Integration tests: send actual queries to a test API (or mock server) and validate the response through the Zod schema. This tests the full pipeline from selection to parsed result.
```ts
// Example: query builder snapshot test
test("builds query with nested fields", () => {
  const fields = ORDER_FIELDS.filter((f) =>
    ["orderNumber", "shippingAddressCity", "shippingAddressCountry"].includes(f.key),
  );

  expect(buildOrderQuery(fields)).toMatchInlineSnapshot(`
    "query GetOrders($filter: OrderFilter) {
      orders(filter: $filter) {
        id
        orderNumber
        shippingAddress { city country }
      }
    }"
  `);
});
```
Use It: @saschb2b/gql-drift
Everything described in this article — the introspection layer, field registries, query/mutation builders, flatten/unflatten, Zod validation, and React integration — is available as an open-source npm package.
```bash
pnpm add @saschb2b/gql-drift
```
The CLI generates typed field registries from any GraphQL endpoint or local schema file:
```bash
npx gql-drift generate --endpoint http://localhost:4000/graphql --types Order,Customer
npx gql-drift generate --schema ./schema.graphql --types '*' --exclude '*Connection,*Edge'
```
Wildcard type discovery (types: "*") auto-discovers all object types from the schema, with glob-style --exclude patterns to filter out relay types or other noise.
The generated code follows the TanStack Query v5 queryOptions pattern, producing options factories you spread into standard hooks:
```tsx
import { useQuery, useMutation } from "@tanstack/react-query";
import { orderQueryOptions, updateOrderMutation } from "./generated/order";

const { data } = useQuery({ ...orderQueryOptions({ config }) });
const { mutate } = useMutation({ ...updateOrderMutation({ config }) });
```
The package also supports custom GraphQL clients (urql, Apollo, graphql-request) via a fetcher option, and works without React as a vanilla TypeScript library.
For the full API, see the package documentation.
Sources & Further Links
- GraphQL Spec: Field Selection
The official GraphQL specification on field selections and how selection sets work at the protocol level.
- TanStack Query
The most popular client-side data fetching library for React. Handles caching, background refetching, pagination, and query key management.
- Zod
TypeScript-first schema validation library, ideal for runtime validation of dynamic API responses.
- TypeScript Utility Types: Pick, Partial, Omit
TypeScript's built-in utility types for constructing partial and picked types from existing interfaces.
- GraphQL Code Generator
The standard codegen tool for static GraphQL queries. Use it for everything that CAN be generated, and dynamic queries for the rest.
- @saschb2b/gql-drift
Open-source npm package implementing the full dynamic GraphQL pipeline described in this article: introspection, field registries, query/mutation builders, Zod validation, and React integration.
