
February 4, 2026

Dynamic GraphQL Queries at Runtime

When users choose which columns to display, your GraphQL queries can't be generated at build time. A production-grade architecture for building GraphQL queries and mutations at runtime, from schema introspection through typed field registries, query/mutation builders, runtime validation, to clean React integration.

Sascha Becker

26 min read


In an ideal world, every GraphQL query is static. You write it once, a codegen tool like graphql-codegen or gqlgen turns it into a typed hook, and you never think about the query string again.

But some applications don't live in that world.

Think of a table where users select which columns to display via checkboxes. Or an admin dashboard where each role sees different fields. Or a report builder where filters and groupings are chosen at runtime. The query doesn't exist until someone clicks.

This article walks through a clean, production-grade architecture for exactly that scenario.

The Tempting Shortcut

The first instinct is string concatenation:

ts
function buildQuery(fields: string[]) {
return `
query {
orders {
${fields.join("\n ")}
}
}
`;
}
const data = await fetchGraphQL(buildQuery(selectedFields));
// data is `any`

This works for a prototype. For anything beyond that, it creates problems:

  • No type safety. data is any. Every access is unvalidated.
  • No validation. Pass "nonExistentField" and you get a runtime GraphQL error.
  • Injection surface. If field names come from user input without validation, you're trusting the client with query structure.
  • Impossible to refactor. Rename a field in the schema and nothing in your codebase will warn you.
  • No nesting. Real schemas have nested objects (address { city state }). String joining can't express that.

Architecture Overview

The clean approach separates concerns into layers:

[Architecture diagram: Introspection → Field Registry → Query/Mutation Builders → Runtime Validation → Data Layer → UI Layer]

Each layer has one job:

  • Introspection: reads the schema to discover types, fields, nesting, scalars, and available mutations
  • Field Registry: structured output that maps introspected fields to UI labels, GraphQL paths, and formatting types
  • Query Builder: a pure function; selected fields in, valid GraphQL query string out
  • Mutation Builder: a pure function; changed fields in, valid GraphQL mutation string out
  • Runtime Validation: Zod schemas that validate API responses and user input before mutations
  • Data Layer: transport; a fetch wrapper or TanStack Query with proper cache keys
  • UI Layer: checkboxes drive selection, the table renders the result, edits trigger mutations

Starting from the Schema: Introspection

With static queries, codegen reads your .graphql files and generates typed hooks. With dynamic queries, there are no .graphql files. But the schema itself is still the source of truth. The first step is reading it.

The Introspection Query

Every GraphQL API supports introspection (unless explicitly disabled). You can ask it: "What types do you have? What fields does each type have?"

ts
const INTROSPECTION_QUERY = `
query IntrospectType($typeName: String!) {
__type(name: $typeName) {
name
fields {
name
type {
name
kind
ofType {
name
kind
ofType {
name
kind
}
}
}
}
}
}
`;
async function introspectType(typeName: string) {
const res = await fetch("/graphql", {
method: "POST",
headers: { "Content-Type": "application/json" },
body: JSON.stringify({
query: INTROSPECTION_QUERY,
variables: { typeName },
}),
});
const json = await res.json();
return json.data.__type;
}

Calling introspectType("Order") returns the full field structure: field names, scalar types, nested objects, everything the schema defines.
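
To make that shape concrete, here is roughly what the call returns for a hypothetical Order type (field names are illustrative, not from a real schema). Non-null and list types arrive as wrappers, with the underlying type nested in ofType:

ts
// Illustrative result of introspectType("Order"):
const exampleOrderType = {
  name: "Order",
  fields: [
    {
      name: "id",
      // NON_NULL wrapper around the ID scalar
      type: { name: null, kind: "NON_NULL", ofType: { name: "ID", kind: "SCALAR" } },
    },
    { name: "total", type: { name: "Float", kind: "SCALAR" } },
    { name: "status", type: { name: "OrderStatus", kind: "ENUM" } },
    { name: "shippingAddress", type: { name: "Address", kind: "OBJECT" } },
  ],
};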

Parsing Introspection into a Field Registry

The raw introspection response is verbose. The next step is transforming it into the flat FieldDefinition[] structure that the rest of the pipeline works with:

ts
// --- Types for introspection response ---
interface IntrospectionType {
name: string | null;
kind: string;
ofType?: IntrospectionType;
}
interface IntrospectionField {
name: string;
type: IntrospectionType;
}
interface IntrospectionResult {
name: string;
fields: IntrospectionField[];
}
// --- Field definition used throughout the pipeline ---
interface FieldDefinition {
key: string;
label: string;
graphqlPath: string;
type: "string" | "number" | "date" | "boolean" | "enum";
enumValues?: string[]; // ["PENDING", "SHIPPED", "DELIVERED"]
}
// --- Helpers ---
function capitalize(s: string): string {
return s.charAt(0).toUpperCase() + s.slice(1);
}
function formatLabel(fieldName: string): string {
// camelCase → "Camel Case"
return fieldName
.replace(/([a-z])([A-Z])/g, "$1 $2")
.replace(/^./, (s) => s.toUpperCase());
}
// Unwrap NON_NULL and LIST wrappers to get the underlying type
function unwrapType(t: IntrospectionType): IntrospectionType {
while (t.kind === "NON_NULL" || t.kind === "LIST") {
t = t.ofType!;
}
return t;
}
// Map GraphQL scalar names to our simplified type system
const SCALAR_MAP: Record<string, FieldDefinition["type"]> = {
String: "string",
Int: "number",
Float: "number",
Boolean: "boolean",
DateTime: "date",
ID: "string",
};
// --- Registry builder ---
function buildFieldRegistry(
introspection: IntrospectionResult,
prefix = "",
pathPrefix = "",
): FieldDefinition[] {
const fields: FieldDefinition[] = [];
for (const field of introspection.fields) {
if (field.name === "id") continue; // id is always included automatically
const unwrapped = unwrapType(field.type);
const graphqlPath = pathPrefix ? `${pathPrefix}.${field.name}` : field.name;
const key = prefix ? `${prefix}${capitalize(field.name)}` : field.name;
if (unwrapped.kind === "SCALAR") {
const mappedType = SCALAR_MAP[unwrapped.name!];
if (mappedType) {
fields.push({
key,
label: formatLabel(field.name),
graphqlPath,
type: mappedType,
});
}
} else if (unwrapped.kind === "ENUM") {
fields.push({
key,
label: formatLabel(field.name),
graphqlPath,
type: "enum",
enumValues: [], // populated via enum introspection
});
} else if (unwrapped.kind === "OBJECT") {
// Nested object: handled via recursive introspection (see below)
}
}
return fields;
}
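
One loose end from the registry builder: the enumValues placeholder. Enum members are available through the standard introspection field enumValues, so a second query can fill them in. A sketch, following the same fetch conventions as introspectType:

ts
const ENUM_INTROSPECTION_QUERY = `
  query IntrospectEnum($typeName: String!) {
    __type(name: $typeName) {
      enumValues { name }
    }
  }
`;

async function introspectEnumValues(typeName: string): Promise<string[]> {
  const res = await fetch("/graphql", {
    method: "POST",
    headers: { "Content-Type": "application/json" },
    body: JSON.stringify({
      query: ENUM_INTROSPECTION_QUERY,
      variables: { typeName },
    }),
  });
  const json = await res.json();
  return json.data.__type.enumValues.map((v: { name: string }) => v.name);
}

// introspectEnumValues("OrderStatus") → ["PENDING", "SHIPPED", "DELIVERED"]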

Build-Time Generation Script

For most projects, a build-time script is the better choice. It runs once, outputs a typed file, and you get IDE autocompletion:

ts
// scripts/generate-field-registry.ts
import { writeFileSync } from "fs";
async function main() {
const type = await introspectType("Order");
const fields = buildFieldRegistry(type);
const output = `// AUTO-GENERATED - do not edit manually
// Run: npx tsx scripts/generate-field-registry.ts
import type { FieldDefinition } from "../types";
export const ORDER_FIELDS: FieldDefinition[] = ${JSON.stringify(fields, null, 2)};
`;
writeFileSync("src/generated/orderFields.ts", output);
console.log(`Generated ${fields.length} field definitions for Order`);
}
main();

Add it to your package.json:

json
{
"scripts": {
"generate:fields": "tsx scripts/generate-field-registry.ts"
}
}

Now pnpm generate:fields produces a typed field registry from your live schema, similar to how gqlgen generate works for static queries. The difference is that this registry feeds a runtime query builder instead of static typed hooks.

Handling Nested Types

Real schemas have nesting. The introspection for Order might reveal that shippingAddress is an OBJECT type with its own fields. The registry generator handles this by recursing one level deep:

ts
// During introspection parsing, when we hit an OBJECT field (this runs inside
// the async generation script, since it needs another introspection round-trip):
const nestedTypeName = unwrapped.name; // e.g. "Address" for shippingAddress
if (nestedTypeName) {
const nestedType = await introspectType(nestedTypeName);
const nestedFields = buildFieldRegistry(
nestedType,
field.name, // prefix: "shippingAddress" → keys like "shippingAddressCity"
field.name, // pathPrefix: "shippingAddress" → paths like "shippingAddress.city"
);
fields.push(...nestedFields);
}

This flattens shippingAddress.city into a single FieldDefinition with key: "shippingAddressCity" and graphqlPath: "shippingAddress.city". The UI sees flat checkboxes; the query builder reconstructs the nesting.

The Field Registry

After introspection and generation, you have a FieldDefinition[], the structured output that drives everything downstream.

Here's what the generated registry looks like for an Order type:

ts
const ORDER_FIELDS: FieldDefinition[] = [
{ key: "orderNumber", label: "Order Number", graphqlPath: "orderNumber", type: "string" },
{ key: "customerName", label: "Customer Name", graphqlPath: "customerName", type: "string" },
{ key: "status", label: "Status", graphqlPath: "status", type: "string" },
{ key: "total", label: "Total", graphqlPath: "total", type: "number" },
{ key: "currency", label: "Currency", graphqlPath: "currency", type: "string" },
{ key: "createdAt", label: "Created At", graphqlPath: "createdAt", type: "date" },
{ key: "shippingAddressCity", label: "City", graphqlPath: "shippingAddress.city", type: "string" },
{ key: "shippingAddressCountry", label: "Country", graphqlPath: "shippingAddress.country", type: "string" },
];

Enriching the Generated Registry

The auto-generated labels from formatLabel are decent but not always ideal ("Customer Name" is fine, "Created At" might need to be "Created"). You can add an override layer:

ts
const LABEL_OVERRIDES: Partial<Record<string, string>> = {
orderNumber: "Order #",
createdAt: "Created",
shippingAddressCity: "Ship. City",
shippingAddressCountry: "Ship. Country",
};
const ORDER_FIELDS_ENRICHED = ORDER_FIELDS.map((f) => ({
...f,
label: LABEL_OVERRIDES[f.key] ?? f.label,
}));

This keeps the generated file untouched (re-runnable) while giving you control over the UI labels.

The Query Builder

The query builder is a pure function. It takes a list of field definitions and returns a valid GraphQL query string. The key challenge is handling nested fields.

ts
function buildOrderQuery(fields: FieldDefinition[]): string {
// Always include id
const paths = ["id", ...fields.map((f) => f.graphqlPath)];
// Group nested paths: "shippingAddress.city" → { shippingAddress: ["city"] }
const roots: string[] = [];
const nested = new Map<string, string[]>();
for (const path of paths) {
const dot = path.indexOf(".");
if (dot === -1) {
roots.push(path);
} else {
const parent = path.slice(0, dot);
const child = path.slice(dot + 1);
if (!nested.has(parent)) nested.set(parent, []);
nested.get(parent)!.push(child);
}
}
// Build the selection set
const selections = [
...roots,
...[...nested.entries()].map(
([parent, children]) => `${parent} { ${children.join(" ")} }`
),
];
return `query GetOrders($filter: OrderFilter) {
orders(filter: $filter) {
${selections.join("\n ")}
}
}`;
}

For the selection ["orderNumber", "customerName", "shippingAddressCity", "shippingAddressCountry"], this produces:

graphql
query GetOrders($filter: OrderFilter) {
orders(filter: $filter) {
id
orderNumber
customerName
shippingAddress { city country }
}
}

Deeper Nesting

If your schema has deeper nesting (e.g. shippingAddress.coordinates.lat), you can make the builder recursive. For most applications, one level of nesting is enough. Resist the urge to build a general-purpose query AST unless you actually need it.
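
If you do reach that point, the grouping logic generalizes by recursing on the tail of each dotted path. A minimal sketch, reusing the conventions from buildOrderQuery:

ts
// Turns dotted paths of arbitrary depth into a GraphQL selection set.
function buildSelectionSet(paths: string[]): string {
  const roots: string[] = [];
  const nested = new Map<string, string[]>();
  for (const path of paths) {
    const dot = path.indexOf(".");
    if (dot === -1) {
      roots.push(path);
    } else {
      const parent = path.slice(0, dot);
      if (!nested.has(parent)) nested.set(parent, []);
      nested.get(parent)!.push(path.slice(dot + 1));
    }
  }
  return [
    ...roots,
    // Recurse on the remaining segments of each nested group
    ...[...nested.entries()].map(
      ([parent, children]) => `${parent} { ${buildSelectionSet(children)} }`,
    ),
  ].join(" ");
}

// buildSelectionSet(["id", "shippingAddress.coordinates.lat", "shippingAddress.coordinates.lng"])
// → "id shippingAddress { coordinates { lat lng } }"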

Typing the Result

This is where TypeScript's limits become visible. The query shape depends on runtime selection, so you can't have full compile-time narrowing. But you have options along a spectrum.

Option 1: Partial<Order>, Simple and Honest

ts
type OrderQueryResult = Pick<Order, "id"> & Partial<Omit<Order, "id">>;

Every field except id might be undefined. This forces you to handle the absence, which is correct. You genuinely don't know at compile time which fields were selected.
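
A small usage sketch, assuming Order has a numeric total field:

ts
// Absence must be handled explicitly; the type won't let you forget.
function renderTotal(row: OrderQueryResult): string {
  return row.total !== undefined ? row.total.toFixed(2) : "-";
}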

Option 2: Generic Selection Type

When the selection is known at the call site:

ts
// `"id" & keyof T` narrows to "id" when T has an id key (and never otherwise)
type DynamicResult<T, K extends keyof T> = Pick<T, "id" & keyof T> & Pick<T, K>;
// If you know the selection at the call site:
const fields = ["orderNumber", "status"] as const;
type Result = DynamicResult<Order, (typeof fields)[number]>;
// = { id: string; orderNumber: string; status: "pending" | "shipped" | ... }

This works in controlled scenarios (e.g. presets, saved views). It doesn't help when the selection comes from user interaction at runtime.

Option 3: Runtime Validation, The Real Safety Net

Build a Zod schema dynamically from the same field registry. Since the registry uses flat keys (shippingAddressCity) while the API response is nested (shippingAddress: { city: "..." }), validate after flattening the response so the schema matches the shape your UI actually consumes:

ts
import { z } from "zod";
const FIELD_VALIDATORS: Record<FieldDefinition["type"], z.ZodTypeAny> = {
string: z.string(),
number: z.number(),
date: z.string(),
boolean: z.boolean(),
enum: z.string(), // overridden with z.enum() when enumValues are available
};
function buildResultSchema(fields: FieldDefinition[]) {
const shape: Record<string, z.ZodTypeAny> = { id: z.string() };
for (const field of fields) {
shape[field.key] = FIELD_VALIDATORS[field.type];
}
return z.object(shape);
}
// Usage: flatten first, then validate
const flatRows = rawData.orders.map((order) => flattenOrder(order, selectedFields));
const schema = z.array(buildResultSchema(selectedFields));
const validated = schema.parse(flatRows); // throws if shape doesn't match

This gives you real runtime guarantees. If the API returns something unexpected (a missing field, a wrong type), you catch it immediately after flattening instead of crashing somewhere in the table rendering. (The flattenOrder helper that does the flattening is shown in the full example below.)
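
The z.enum() override mentioned in the FIELD_VALIDATORS comment could look like this (a sketch, assuming enumValues was populated during registry generation):

ts
function validatorFor(field: FieldDefinition): z.ZodTypeAny {
  if (field.type === "enum" && field.enumValues && field.enumValues.length > 0) {
    // z.enum requires a non-empty tuple of string literals
    return z.enum(field.enumValues as [string, ...string[]]);
  }
  return FIELD_VALIDATORS[field.type];
}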

The Data Layer

Simple Fetch Wrapper

For a straightforward setup with use() + Suspense (see the use() hook article):

ts
const queryCache = new Map<string, Promise<unknown>>();
function fetchOrders(fields: FieldDefinition[], filter?: OrderFilter) {
const cacheKey = fields.map((f) => f.key).toSorted().join(",")
+ "|" + JSON.stringify(filter);
if (!queryCache.has(cacheKey)) {
queryCache.set(
cacheKey,
fetch("/graphql", {
method: "POST",
headers: { "Content-Type": "application/json" },
body: JSON.stringify({
query: buildOrderQuery(fields),
variables: { filter },
}),
})
.then((r) => r.json())
.then((json) => {
const rows = json.data.orders.map((order: Record<string, unknown>) =>
flattenOrder(order, fields),
);
return z.array(buildResultSchema(fields)).parse(rows);
}),
);
}
return queryCache.get(cacheKey)!;
}
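
A usage sketch (the component name is hypothetical): the module-level cache gives each field selection a stable promise identity, which is what makes the promise safe to pass to use() across re-renders:

tsx
import { use } from "react";

// Render inside a <Suspense> boundary; use() suspends until the promise settles.
function OrdersTableBody({ fields }: { fields: FieldDefinition[] }) {
  const rows = use(fetchOrders(fields)) as Record<string, unknown>[];
  return (
    <tbody>
      {rows.map((row) => (
        <tr key={row.id as string}>
          {fields.map((f) => (
            <td key={f.key}>{String(row[f.key] ?? "")}</td>
          ))}
        </tr>
      ))}
    </tbody>
  );
}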

TanStack Query Integration

When you need cache invalidation, background refetching, or pagination:

ts
function useOrders(fields: FieldDefinition[], filter?: OrderFilter) {
const fieldKeys = fields.map((f) => f.key).toSorted();
return useQuery({
queryKey: ["orders", fieldKeys, filter],
queryFn: async () => {
const res = await fetch("/graphql", {
method: "POST",
headers: { "Content-Type": "application/json" },
body: JSON.stringify({
query: buildOrderQuery(fields),
variables: { filter },
}),
});
const json = await res.json();
const rows = json.data.orders.map((order: Record<string, unknown>) =>
flattenOrder(order, fields),
);
return z.array(buildResultSchema(fields)).parse(rows);
},
enabled: fields.length > 0,
});
}

Putting It All Together

Here's the complete flow, from checkboxes to table:

tsx
"use client";
import { useState, useMemo } from "react";
import { useQuery } from "@tanstack/react-query";
// --- Types ---
interface FieldDefinition {
key: string;
label: string;
graphqlPath: string;
type: "string" | "number" | "date" | "boolean" | "enum";
}
// --- Field Registry ---
const ORDER_FIELDS: FieldDefinition[] = [
{ key: "orderNumber", label: "Order #", graphqlPath: "orderNumber", type: "string" },
{ key: "customerName", label: "Customer", graphqlPath: "customerName", type: "string" },
{ key: "status", label: "Status", graphqlPath: "status", type: "string" },
{ key: "total", label: "Total", graphqlPath: "total", type: "number" },
{ key: "currency", label: "Currency", graphqlPath: "currency", type: "string" },
{ key: "createdAt", label: "Created", graphqlPath: "createdAt", type: "date" },
{ key: "shippingAddressCity", label: "City", graphqlPath: "shippingAddress.city", type: "string" },
{ key: "shippingAddressCountry", label: "Country", graphqlPath: "shippingAddress.country", type: "string" },
];
// --- Query Builder ---
function buildOrderQuery(fields: FieldDefinition[]): string {
const paths = ["id", ...fields.map((f) => f.graphqlPath)];
const roots: string[] = [];
const nested = new Map<string, string[]>();
for (const path of paths) {
const dot = path.indexOf(".");
if (dot === -1) {
roots.push(path);
} else {
const parent = path.slice(0, dot);
const child = path.slice(dot + 1);
if (!nested.has(parent)) nested.set(parent, []);
nested.get(parent)!.push(child);
}
}
const selections = [
...roots,
...[...nested.entries()].map(
([parent, children]) => `${parent} { ${children.join(" ")} }`
),
];
return `query GetOrders($filter: OrderFilter) {
orders(filter: $filter) {
${selections.join("\n ")}
}
}`;
}
// --- Flattener (nested response → flat row) ---
function flattenOrder(order: Record<string, unknown>, fields: FieldDefinition[]) {
const row: Record<string, unknown> = { id: order.id };
for (const field of fields) {
const parts = field.graphqlPath.split(".");
let value: unknown = order;
for (const part of parts) {
value = (value as Record<string, unknown>)?.[part];
}
row[field.key] = value;
}
return row;
}
// --- Component ---
export function OrderTable() {
const [selectedKeys, setSelectedKeys] = useState<Set<string>>(
new Set(["orderNumber", "customerName", "status"])
);
const selectedFields = useMemo(
() => ORDER_FIELDS.filter((f) => selectedKeys.has(f.key)),
[selectedKeys]
);
const { data, isLoading, error } = useQuery({
queryKey: ["orders", [...selectedKeys].sort()],
queryFn: async () => {
const res = await fetch("/graphql", {
method: "POST",
headers: { "Content-Type": "application/json" },
body: JSON.stringify({
query: buildOrderQuery(selectedFields),
}),
});
return res.json();
},
enabled: selectedKeys.size > 0,
});
const rows = useMemo(() => {
if (!data?.data?.orders) return [];
return data.data.orders.map((order: Record<string, unknown>) =>
flattenOrder(order, selectedFields)
);
}, [data, selectedFields]);
const toggleField = (key: string) => {
setSelectedKeys((prev) => {
const next = new Set(prev);
if (next.has(key)) next.delete(key);
else next.add(key);
return next;
});
};
return (
<div>
{/* Column Selector */}
<fieldset>
<legend>Columns</legend>
{ORDER_FIELDS.map((field) => (
<label key={field.key} style={{ marginRight: 16 }}>
<input
type="checkbox"
checked={selectedKeys.has(field.key)}
onChange={() => toggleField(field.key)}
/>
{field.label}
</label>
))}
</fieldset>
{/* Data Table */}
{error && <div>Error: {error.message}</div>}
{isLoading && <div>Loading...</div>}
{!isLoading && !error && (
<table>
<thead>
<tr>
{selectedFields.map((f) => (
<th key={f.key}>{f.label}</th>
))}
</tr>
</thead>
<tbody>
{rows.map((row: Record<string, unknown>) => (
<tr key={row.id as string}>
{selectedFields.map((f) => (
<td key={f.key}>{String(row[f.key] ?? "")}</td>
))}
</tr>
))}
</tbody>
</table>
)}
</div>
);
}

The data flow:

  1. User toggles checkboxes → selectedKeys state updates
  2. selectedFields is derived via useMemo filtering the registry
  3. useQuery fires with the sorted field keys in the cache key
  4. buildOrderQuery produces the GraphQL string from the selected fields
  5. The response is flattened (nested paths → flat row keys) for the table
  6. The table renders only the selected columns

Dynamic Mutations

So far the pipeline only reads data. But if the user can see an order's status in a dynamic table, the next question is: can they edit it?

The same architecture that builds queries can build mutations. The additions are: discovering which mutation to call, knowing which fields accept writes, and building the mutation string + variables from the user's changes.

Discovering Mutations via Naming Convention

Most GraphQL APIs follow a predictable naming convention:

Query type    Mutation          Input type
Order         updateOrder       UpdateOrderInput
Customer      updateCustomer    UpdateCustomerInput
Product       updateProduct     UpdateProductInput

You can formalize this with a helper and then validate it against the schema:

ts
type MutationOperation = "update" | "create" | "delete";
function getMutationName(typeName: string, operation: MutationOperation): string {
return `${operation}${typeName}`;
}
function getInputTypeName(typeName: string, operation: MutationOperation): string {
return `${operation.charAt(0).toUpperCase() + operation.slice(1)}${typeName}Input`;
}
// getMutationName("Order", "update") → "updateOrder"
// getInputTypeName("Order", "update") → "UpdateOrderInput"

To confirm the mutation actually exists, introspect the Mutation root type:

ts
async function discoverMutations(typeName: string) {
const mutationRoot = await introspectType("Mutation");
const available = new Map<MutationOperation, string>();
for (const op of ["update", "create", "delete"] as const) {
const name = getMutationName(typeName, op);
if (mutationRoot.fields.some((f: IntrospectionField) => f.name === name)) {
available.set(op, name);
}
}
return available;
}
// discoverMutations("Order")
// → Map { "update" → "updateOrder", "delete" → "deleteOrder" }
// (no "createOrder" if the API doesn't expose it)

Introspecting the Input Type

The update mutation typically takes an input type that mirrors the query fields, minus computed and read-only fields:

ts
async function buildInputRegistry(typeName: string): Promise<FieldDefinition[]> {
const inputTypeName = getInputTypeName(typeName, "update");
// NOTE: INPUT_OBJECT types expose their members via `inputFields` (a list of
// __InputValue), not `fields`, so the introspection query needs a matching
// `inputFields { name type { ... } }` selection to cover input types.
const inputType = await introspectType(inputTypeName);
if (!inputType) {
throw new Error(`Input type ${inputTypeName} not found in schema`);
}
return buildFieldRegistry(inputType);
}

The result tells you exactly which fields are writable. If Order has 12 query fields but UpdateOrderInput only has 6, the other 6 are read-only (computed totals, timestamps, etc.). Your edit UI only renders inputs for fields that appear in both the query registry and the input registry:

ts
const queryFields = ORDER_FIELDS;
const inputFields = await buildInputRegistry("Order");
const editableKeys = new Set(inputFields.map((f) => f.key));
const editableQueryFields = queryFields.filter((f) => editableKeys.has(f.key));
// Only these fields get an "edit" affordance in the UI

The Mutation Builder

The mutation builder mirrors buildOrderQuery, but wraps the field selection in a mutation with an $input variable. The return selection reuses the query fields so the cache can update with fresh data:

ts
function buildUpdateMutation(
typeName: string,
inputTypeName: string,
returnFields: FieldDefinition[],
): string {
const mutationName = getMutationName(typeName, "update");
// Reuse the same grouping logic as the query builder for return fields
const paths = ["id", ...returnFields.map((f) => f.graphqlPath)];
const roots: string[] = [];
const nested = new Map<string, string[]>();
for (const path of paths) {
const dot = path.indexOf(".");
if (dot === -1) {
roots.push(path);
} else {
const parent = path.slice(0, dot);
const child = path.slice(dot + 1);
if (!nested.has(parent)) nested.set(parent, []);
nested.get(parent)!.push(child);
}
}
const selections = [
...roots,
...[...nested.entries()].map(
([parent, children]) => `${parent} { ${children.join(" ")} }`,
),
];
return `mutation ${mutationName.charAt(0).toUpperCase() + mutationName.slice(1)}($id: ID!, $input: ${inputTypeName}!) {
${mutationName}(id: $id, input: $input) {
${selections.join("\n ")}
}
}`;
}

For an order update with ["status", "shippingAddressCity"] selected, this produces:

graphql
mutation UpdateOrder($id: ID!, $input: UpdateOrderInput!) {
updateOrder(id: $id, input: $input) {
id
status
shippingAddress { city }
}
}

Building Input Variables

The query pipeline flattens nested responses into flat keys (shippingAddress.city → shippingAddressCity). The mutation pipeline needs the reverse: unflatten edited values back into the nested structure the API expects:

ts
function unflattenInput(
flatData: Record<string, unknown>,
fields: FieldDefinition[],
): Record<string, unknown> {
const result: Record<string, unknown> = {};
for (const field of fields) {
if (!(field.key in flatData)) continue;
const dot = field.graphqlPath.indexOf(".");
if (dot === -1) {
// Top-level field
result[field.graphqlPath] = flatData[field.key];
} else {
// Nested field: reconstruct the object
const parent = field.graphqlPath.slice(0, dot);
const child = field.graphqlPath.slice(dot + 1);
if (!result[parent]) result[parent] = {};
(result[parent] as Record<string, unknown>)[child] = flatData[field.key];
}
}
return result;
}
// unflattenInput(
// { status: "shipped", shippingAddressCity: "Berlin" },
// selectedFields,
// )
// → { status: "shipped", shippingAddress: { city: "Berlin" } }

Validating Input Before Sending

Reuse the same Zod approach from the query side, but this time validate the input before sending rather than the response after receiving:

ts
function buildInputSchema(fields: FieldDefinition[]) {
const shape: Record<string, z.ZodTypeAny> = {};
for (const field of fields) {
shape[field.key] = FIELD_VALIDATORS[field.type];
}
// .partial(): dirty-state updates only carry the fields the user actually edited
return z.object(shape).partial();
}
// Validate what the user entered before building the mutation variables
const inputSchema = buildInputSchema(editableFields);
const parsed = inputSchema.parse(dirtyValues); // throws on invalid input
const variables = unflattenInput(parsed, editableFields);

This catches type mismatches (user typed "abc" into a number field) before the request ever leaves the client.
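
For example (values are hypothetical; safeParse reports errors instead of throwing, which suits inline form feedback):

ts
// A user typed "abc" into the Total column (a number field):
const check = buildInputSchema(editableFields).safeParse({ total: "abc" });
if (!check.success) {
  console.warn(check.error.issues); // [{ code: "invalid_type", expected: "number", ... }]
}
// Note: HTML inputs yield strings, so for number fields you may want
// z.coerce.number() rather than z.number() when validating raw form values.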

Wiring It Up

With TanStack Query's useMutation, the integration looks like this:

ts
function useUpdateOrder(fields: FieldDefinition[]) {
const queryClient = useQueryClient();
const inputTypeName = getInputTypeName("Order", "update");
// editableKeys comes from comparing the query and input registries (see above)
const editableFields = fields.filter((f) => editableKeys.has(f.key));
return useMutation({
mutationFn: async ({ id, values }: { id: string; values: Record<string, unknown> }) => {
// 1. Validate
const parsed = buildInputSchema(editableFields).parse(values);
// 2. Unflatten
const input = unflattenInput(parsed, editableFields);
// 3. Build & send
const res = await fetch("/graphql", {
method: "POST",
headers: { "Content-Type": "application/json" },
body: JSON.stringify({
query: buildUpdateMutation("Order", inputTypeName, fields),
variables: { id, input },
}),
});
return res.json();
},
onSuccess: () => {
// Invalidate the query cache so the table refetches with updated data
queryClient.invalidateQueries({ queryKey: ["orders"] });
},
});
}

The component tracks which fields the user changed (dirty state), then calls the mutation with only those values:

tsx
const updateOrder = useUpdateOrder(selectedFields);
// On save:
updateOrder.mutate({
id: row.id,
values: dirtyValues, // only the fields the user actually edited
});
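
How dirtyValues gets computed is up to the form layer. One minimal approach (a sketch) diffs the edited row against the original and keeps only the changed keys:

ts
function diffRow(
  original: Record<string, unknown>,
  edited: Record<string, unknown>,
  fields: FieldDefinition[],
): Record<string, unknown> {
  const dirty: Record<string, unknown> = {};
  for (const field of fields) {
    if (edited[field.key] !== original[field.key]) {
      dirty[field.key] = edited[field.key];
    }
  }
  return dirty;
}

// const dirtyValues = diffRow(row, editedRow, editableQueryFields);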

Consider Fetching Everything Instead

Before committing to dynamic queries, ask whether you actually need them.

If your object is a flat or shallow type with under ~30 scalar fields and no heavy text/binary blobs, a static query that fetches everything is simpler:

ts
// Static, fully generated, fully typed
const { data } = useGetOrdersQuery();
// Checkboxes only control which columns render, not what's fetched
const visibleColumns = ORDER_FIELDS.filter((f) => selectedKeys.has(f.key));

This gives you full codegen types, zero query-building code, and simpler caching. The checkboxes become a pure UI concern.

When "Everything" Is Undefined

This approach breaks down the moment your schema has recursive or circular types:

graphql
type User {
id: ID!
name: String!
friends: [User!]! # User → User
manager: User # User → User again
department: Department!
}
type Department {
id: ID!
name: String!
members: [User!]! # Department → User → Department → ...
}

User.friends returns [User]. Each friend has their own friends. There is no "all fields". The graph is infinite. A naive "fetch everything" query would recurse until the server hits its depth limit and errors out.

This is a fundamental property of GraphQL: the G stands for Graph, and graphs have cycles. Any type that references itself (directly or through other types) makes the "fetch everything" shortcut impossible.

Dynamic queries earn their complexity when:

  • The type graph has cycles: recursive or mutually recursive types where "all fields" is undefined
  • Objects have 50+ fields or include large text/binary fields
  • Each field triggers expensive resolver work (separate DB joins, external API calls)
  • Bandwidth is constrained (mobile, metered connections)
  • Field-level authorization means different users see different schemas
  • The API bills per field or per resolver hit

If your types are flat and none of these conditions apply, define every query at build time, fetch everything, and filter the display. For everything else, you need the dynamic pipeline.

Hardening for Production

Query Complexity Limits

Don't let users select 200 fields:

ts
const MAX_FIELDS = 20;
const selectedFields = fields.slice(0, MAX_FIELDS);
if (fields.length > MAX_FIELDS) {
console.warn(`Field selection capped at ${MAX_FIELDS}`);
}

Field-Level Authorization

Some fields might not be available to all users. Filter the registry before rendering checkboxes:

ts
const visibleFields = ORDER_FIELDS.filter((f) =>
userPermissions.allowedFields.includes(f.key)
);

This way unauthorized fields never appear in the UI and never end up in the query.

Error Handling

Wrap the data table in an error boundary. GraphQL partial errors (some fields resolve, some don't) are common with dynamic queries. Your UI should handle them gracefully rather than crashing.
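
A sketch of a response handler that distinguishes total from partial failure (the errors array shape follows the GraphQL spec):

ts
interface GraphQLResponse<T> {
  data?: T;
  errors?: { message: string; path?: (string | number)[] }[];
}

async function parseGraphQLResponse<T>(res: Response): Promise<T> {
  const json = (await res.json()) as GraphQLResponse<T>;
  if (json.errors?.length && !json.data) {
    // Total failure: nothing usable came back
    throw new Error(json.errors.map((e) => e.message).join("; "));
  }
  if (json.errors?.length) {
    // Partial failure: log which paths failed, render what did resolve
    console.warn("Partial GraphQL errors:", json.errors.map((e) => e.path));
  }
  return json.data as T;
}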

Testing

Three layers of testing make this architecture reliable:

  1. Query builder snapshots: assert that a given field selection produces the expected GraphQL string. These catch regressions when you refactor the builder.

  2. Schema validation tests: if you have access to the GraphQL schema (via introspection or a local schema file), validate that every field in your registry exists in the schema. This catches drift between your registry and the API.

  3. Integration tests: send actual queries to a test API (or mock server) and validate the response through the Zod schema. This tests the full pipeline from selection to parsed result.

ts
// Example: query builder snapshot test
test("builds query with nested fields", () => {
const fields = ORDER_FIELDS.filter((f) =>
["orderNumber", "shippingAddressCity", "shippingAddressCountry"].includes(f.key)
);
expect(buildOrderQuery(fields)).toMatchInlineSnapshot(`
"query GetOrders($filter: OrderFilter) {
orders(filter: $filter) {
id
orderNumber
shippingAddress { city country }
}
}"
`);
});
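
For layer 2, a drift test can introspect the type and assert that every registry path still exists. A sketch, assuming the test environment can reach the API or a locally served schema:

ts
test("every registry field still exists in the schema", async () => {
  const orderType = await introspectType("Order");
  const schemaFieldNames = new Set(
    orderType.fields.map((f: IntrospectionField) => f.name),
  );
  for (const field of ORDER_FIELDS) {
    // Only the root segment of a dotted path lives on the Order type itself
    const root = field.graphqlPath.split(".")[0];
    expect(schemaFieldNames.has(root)).toBe(true);
  }
});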

Use It: @saschb2b/gql-drift

Everything described in this article — the introspection layer, field registries, query/mutation builders, flatten/unflatten, Zod validation, and React integration — is available as an open-source npm package.

bash
pnpm add @saschb2b/gql-drift

The CLI generates typed field registries from any GraphQL endpoint or local schema file:

bash
npx gql-drift generate --endpoint http://localhost:4000/graphql --types Order,Customer
npx gql-drift generate --schema ./schema.graphql --types '*' --exclude '*Connection,*Edge'

Wildcard type discovery (types: "*") auto-discovers all object types from the schema, with glob-style --exclude patterns to filter out relay types or other noise.

The generated code follows the TanStack Query v5 queryOptions pattern, producing options factories you spread into standard hooks:

tsx
import { useQuery, useMutation } from "@tanstack/react-query";
import { orderQueryOptions, updateOrderMutation } from "./generated/order";
const { data } = useQuery({ ...orderQueryOptions({ config }) });
const { mutate } = useMutation({ ...updateOrderMutation({ config }) });

The package also supports custom GraphQL clients (urql, Apollo, graphql-request) via a fetcher option, and works without React as a vanilla TypeScript library.

For the full API, see the package documentation.

