From a52397cbc728fcc8db8eea7c6c678feaef2ba956 Mon Sep 17 00:00:00 2001 From: Alec Aivazis Date: Wed, 28 Jul 2021 23:07:14 -0700 Subject: [PATCH] Pagination (#144) * add paginate directive and transform * inject page info into cursor-based pagination * add validators for @paginate * fixed pagination check * rework internal api for looking up fragment arguments * a required argument must be flagged with a ! * @paginate arg adds arguments to its fragment * hoist first, last, and limit as fragment arguments * hoist the pagination args as default values * attempt to clean up pagination transform logic * unname unused variable * continue cleaning up pagination transform * hoist correctly * fix snapshot tests * embed pagination fragment in a new query * embed existing args as default values to query * point snapshots at generated query * fix tests * dont rely on fragment argument transform for paginate tests * fragment variables transform define fragments in new document * add utils to verify node interface * validate node interface and field * paginating a fragment on node embeds it in a query on node * paginate directive can only appear in a document once * error if cursor pagination goes forwards and backwards * @paginate cannot appear on a fragment with required args * paginate from query * paginated queries don't overlap existing variables * more overlapping tests * coordinate pagination transform with a central state object * embed refetch meta data in artifact * added tests for refetch spec * tests pass * invert generated flag - some generated documents need to be there * add paginate directive and transform * inject page info into cursor-based pagination * add validators for @paginate * fixed pagination check * rework internal api for looking up fragment arguments * a required argument must be flagged with a ! 
* @paginate arg adds arguments to its fragment * hoist first, last, and limit as fragment arguments * hoist the pagination args as default values * attempt to clean up pagination transform logic * unname unused variable * continue cleaning up pagination transform * hoist correctly * fix snapshot tests * embed pagination fragment in a new query * embed existing args as default values to query * point snapshots at generated query * fix tests * dont rely on fragment argument transform for paginate tests * fragment variables transform define fragments in new document * add utils to verify node interface * validate node interface and field * paginating a fragment on node embeds it in a query on node * paginate directive can only appear in a document once * error if cursor pagination goes forwards and backwards * @paginate cannot appear on a fragment with required args * paginate from query * paginated queries don't overlap existing variables * more overlapping tests * coordinate pagination transform with a central state object * embed refetch meta data in artifact * added tests for refetch spec * tests pass * invert generated flag - some generated documents need to be there * move pagination query name convention to config * update example to use connections * define paginatedQuery * embed pagination behavior on appropriate field * cache.write args are now an object * rename selection.paginate to update * failing tests for updating cache with pagination values * cache can optionally merge results with existing values * build artifact as object before serializing * fix duplicate update field * fix keys for embedded data * write data before subscribing so subscribers dont get lost * pass current page info as store * @list tagged connections have the correct fragments generated * lists on connections get flagged in their artifacts * cache can append to connections * @paginate can provide a list name * failing test for removing record from connection * can remove record from connection * implement loadPreviousPage * add list names and mutations back in * add list item subscription back * verify insert and deletes from connections * update list location directives with schema-less api * update cache test * removed record list references * make sure the store always has the fresh version when writing new data * cleanup * split up loadNext and loadPrevious functions * start implementing offset pagination handler * fix types in scalar tests * first pass at offset pagination loader * failing test for correctly loaded node in pagination result * overwrite entries in a connection that come from list operations * remove unreachable code * add embedded flag to query refetch spec for fragments under node * better test for embedded fragment queries * first pass at paginatedFragment * handle paginated componentQueries * dont add __typename everywhere, just unions, interfaces, and connections * only use typename for embedded references if it exists * add preliminary documentation for pagination support * v0.10.0-alpha.0 * fix pagination link in readme * fix forward cursor-based example * pass id to embedded fragment queries * v0.10.0-alpha.1 * document name arg of paginate directive * actually mix in query variables :facepalm: * v0.10.0-alpha.2 * clarify that paginatedQueries do not need node interface/resolver * typo in readme * added missing fence to readme * add paginated fragments and mutation operations to table of contents * fix pageInfo example * merge main * merge conflict in code of conduct * more 
readme tweaks * undo CoC changes * grammar is hard * pass extra variables to offset pagination handler * pass extra variables to loadNextPage * bump * more bumps * v0.10.0-alpha.5 * connection targets might be non-null * v0.10.0-alpha.8 * add loading state to pagination handlers * v0.10.0-alpha.9 * document pagination loading state * dry up cursor page loads * more bumps * v0.10.0-alpha.11 * catch empty page sizes * mix query variables into pagination handlers * v0.10.0-alpha.12 * context variables overwrite default ones for queries * better check for missing page size * typo * v0.10.0-alpha.13 --- CODE_OF_CONDUCT.md | 32 +- README.md | 145 +- example/package.json | 7 +- example/schema/index.cjs | 45 +- example/schema/schema.gql | 20 +- example/src/lib/ItemEntry.svelte | 4 +- example/src/routes/[filter].svelte | 71 +- example/src/routes/__layout.svelte | 7 + jest.setup.js | 46 +- lerna.json | 2 +- packages/houdini-common/package.json | 4 +- packages/houdini-common/src/config.ts | 43 +- packages/houdini-common/src/graphql.ts | 25 + packages/houdini-preprocess/package.json | 8 +- .../src/transforms/fragment.test.ts | 30 + .../src/transforms/fragment.ts | 34 +- .../src/transforms/query.test.ts | 46 + .../src/transforms/query.ts | 15 +- .../src/utils/walkTaggedDocuments.ts | 2 + packages/houdini/cmd/generate.ts | 3 +- .../generators/artifacts/artifacts.test.ts | 1409 +++++---- .../cmd/generators/artifacts/fieldKey.ts | 32 +- .../houdini/cmd/generators/artifacts/index.ts | 76 +- .../cmd/generators/artifacts/indexFile.ts | 8 +- .../cmd/generators/artifacts/inputs.ts | 77 +- .../cmd/generators/artifacts/operations.ts | 227 +- .../generators/artifacts/pagination.test.ts | 439 +++ .../cmd/generators/artifacts/selection.ts | 380 +-- .../cmd/generators/artifacts/utils.test.ts | 116 + .../houdini/cmd/generators/artifacts/utils.ts | 106 + .../generators/runtime/copyRuntime.test.ts | 4 +- .../cmd/generators/runtime/indexFile.test.ts | 2 +- .../cmd/generators/runtime/indexFile.ts | 2 +- .../cmd/generators/typescript/index.ts | 4 +- .../generators/typescript/typescript.test.ts | 1 + packages/houdini/cmd/testUtils.ts | 5 +- packages/houdini/cmd/transforms/addID.test.ts | 12 +- packages/houdini/cmd/transforms/addID.ts | 2 +- .../cmd/transforms/composeQueries.test.ts | 41 +- .../cmd/transforms/fragmentVariables.test.ts | 190 +- .../cmd/transforms/fragmentVariables.ts | 179 +- packages/houdini/cmd/transforms/index.ts | 1 + packages/houdini/cmd/transforms/list.ts | 192 +- packages/houdini/cmd/transforms/lists.test.ts | 214 +- .../houdini/cmd/transforms/paginate.test.ts | 1228 ++++++++ packages/houdini/cmd/transforms/paginate.ts | 673 +++++ .../houdini/cmd/transforms/schema.test.ts | 6 +- packages/houdini/cmd/transforms/schema.ts | 17 +- .../houdini/cmd/transforms/typename.test.ts | 20 +- packages/houdini/cmd/types.ts | 4 +- .../houdini/cmd/validators/typeCheck.test.ts | 266 +- packages/houdini/cmd/validators/typeCheck.ts | 400 ++- packages/houdini/package-lock.json | 2 +- packages/houdini/package.json | 6 +- packages/houdini/runtime/cache/cache.test.ts | 2521 ++++++++++++----- packages/houdini/runtime/cache/cache.ts | 306 +- packages/houdini/runtime/cache/index.ts | 2 +- packages/houdini/runtime/cache/list.ts | 142 +- packages/houdini/runtime/cache/record.ts | 13 +- packages/houdini/runtime/fragment.ts | 10 +- packages/houdini/runtime/index.ts | 9 +- packages/houdini/runtime/mutation.ts | 8 +- packages/houdini/runtime/pagination.test.ts | 27 + packages/houdini/runtime/pagination.ts | 387 +++ 
packages/houdini/runtime/query.ts | 24 +- packages/houdini/runtime/scalars.test.ts | 18 +- packages/houdini/runtime/subscription.ts | 8 +- packages/houdini/runtime/types.ts | 33 +- packages/houdini/runtime/utils.ts | 19 + yarn.lock | 26 +- 70 files changed, 8123 insertions(+), 2360 deletions(-) create mode 100644 packages/houdini/cmd/generators/artifacts/pagination.test.ts create mode 100644 packages/houdini/cmd/generators/artifacts/utils.test.ts create mode 100644 packages/houdini/cmd/generators/artifacts/utils.ts create mode 100644 packages/houdini/cmd/transforms/paginate.test.ts create mode 100644 packages/houdini/cmd/transforms/paginate.ts create mode 100644 packages/houdini/runtime/pagination.test.ts create mode 100644 packages/houdini/runtime/pagination.ts create mode 100644 packages/houdini/runtime/utils.ts diff --git a/CODE_OF_CONDUCT.md b/CODE_OF_CONDUCT.md index 9399d44bb..eee22c9fa 100644 --- a/CODE_OF_CONDUCT.md +++ b/CODE_OF_CONDUCT.md @@ -17,24 +17,24 @@ diverse, inclusive, and healthy community. Examples of behavior that contributes to a positive environment for our community include: -* Demonstrating empathy and kindness toward other people -* Being respectful of differing opinions, viewpoints, and experiences -* Giving and gracefully accepting constructive feedback -* Accepting responsibility and apologizing to those affected by our mistakes, - and learning from the experience -* Focusing on what is best not just for us as individuals, but for the - overall community +- Demonstrating empathy and kindness toward other people +- Being respectful of differing opinions, viewpoints, and experiences +- Giving and gracefully accepting constructive feedback +- Accepting responsibility and apologizing to those affected by our mistakes, + and learning from the experience +- Focusing on what is best not just for us as individuals, but for the + overall community Examples of unacceptable behavior include: -* The use of sexualized language or imagery, and sexual attention or - advances of any kind -* Trolling, insulting or derogatory comments, and personal or political attacks -* Public or private harassment -* Publishing others' private information, such as a physical or email - address, without their explicit permission -* Other conduct which could reasonably be considered inappropriate in a - professional setting +- The use of sexualized language or imagery, and sexual attention or + advances of any kind +- Trolling, insulting or derogatory comments, and personal or political attacks +- Public or private harassment +- Publishing others' private information, such as a physical or email + address, without their explicit permission +- Other conduct which could reasonably be considered inappropriate in a + professional setting ## Enforcement Responsibilities @@ -106,7 +106,7 @@ Violating these terms may lead to a permanent ban. ### 4. Permanent Ban **Community Impact**: Demonstrating a pattern of violation of community -standards, including sustained inappropriate behavior, harassment of an +standards, including sustained inappropriate behavior, harassment of an individual, or aggression toward or disparagement of classes of individuals. **Consequence**: A permanent ban from any sort of public interaction within diff --git a/README.md b/README.md index b631099ec..813c37d95 100644 --- a/README.md +++ b/README.md @@ -59,6 +59,9 @@ for the generation of an incredibly lean GraphQL abstraction for your applicatio 1. [Configuring the WebSocket client](#configuring-the-websocket-client) 1. 
[Using graphql-ws](#using-graphql-ws)
    1. [Using subscriptions-transport-ws](#using-subscriptions-transport-ws)
+1. [Pagination](#%EF%B8%8Fpagination)
+    1. [Paginated Fragments](#paginated-fragments)
+    1. [Mutation Operations](#mutation-operations)
1. [Custom Scalars](#%EF%B8%8Fcustom-scalars)
1. [Authentication](#authentication)
1. [Notes, Constraints, and Conventions](#%EF%B8%8Fnotes-constraints-and-conventions)
@@ -114,7 +117,7 @@ import houdini from 'houdini-preprocess'

### Sapper

-You'll need to add the preprocessor to both your client and your server configuration. With that in place,
+You'll need to add the preprocessor to both your client and your server configuration. With that in place,
the only thing left to do to configure your Sapper application is to connect your client and server to the generated network layer:

```typescript
@@ -444,9 +447,16 @@ fragment UserAvatar on User @arguments(width: {type:"Int", default: 50}) {
}
```

-An argument with no default value is considered required. If no value is provided,
-an error will be thrown when generating your runtime. Providing values for fragments
-is done with the `@with` decorator:
+In order to mark an argument as required, pass the type with a `!` at the end.
+If no value is provided, an error will be thrown when generating your runtime.
+
+```graphql
+fragment UserAvatar on User @arguments(width: {type:"Int!"}) {
+    profilePicture(width: $width)
+}
+```
+
+Providing values for fragments is done with the `@with` decorator:

```graphql
query AllUsers {
@@ -603,7 +613,7 @@ applied. To support this, houdini provides the `@when` and `@when_not` directive

```graphql
mutation NewItem($input: AddItemInput!) {
    addItem(input: $input) {
-        ...All_Items_insert @when_not(argument: "completed", value: "true")
+        ...All_Items_insert @when_not(completed: true)
    }
}
```
@@ -732,6 +742,131 @@ if (browser) {

export default new Environment(fetchQuery, socketClient)
```

+## ♻️ Pagination
+
+It's often the case that you want to avoid querying an entire list from your API in order
+to minimize the amount of data transferred over the network. To support this, GraphQL APIs will
+"paginate" a field, allowing users to query a slice of the list. The strategies used to access
+slices of a list fall into two categories. Offset-based pagination relies on `offset` and `limit`
+arguments and mimics the mechanisms provided by most database engines. Cursor-based pagination
+is a bi-directional strategy that relies on `first`/`after` or `last`/`before` arguments and
+is designed to handle modern pagination features such as infinite scrolling.
+
+Regardless of the strategy used, houdini follows a simple pattern: wrap your document in a
+"paginated" function (i.e., `paginatedQuery` or `paginatedFragment`), mark the field with
+`@paginate`, and provide the "page size" via the `first`, `last`, or `limit` arguments to the field.
+`paginatedQuery` and `paginatedFragment` behave identically: they return a `data` field containing
+a Svelte store with your full dataset, functions you can call to load the next or previous
+page, as well as a readable store with a boolean loading state.
+For example, a field supporting offset-based pagination would look something like:
+
+```javascript
+const { data, loadNextPage, loading } = paginatedQuery(graphql`
+    query UserList {
+        friends(limit: 10) @paginate {
+            id
+        }
+    }
+`)
+```
+
+and a field that supports cursor-based pagination starting at the end of the list would look something like:
+
+```javascript
+const { data, loadPreviousPage } = paginatedQuery(graphql`
+    query UserList {
+        friends(last: 10) @paginate {
+            edges {
+                node {
+                    id
+                }
+            }
+        }
+    }
+`)
+```
+
+If you are paginating a field with a cursor-based strategy (forwards or backwards), the current page
+info can be looked up with the `pageInfo` store returned from the paginated function:
+
+```svelte
+<script>
+    import { paginatedQuery, graphql } from '$houdini'
+
+    const { data, pageInfo, loadNextPage } = paginatedQuery(graphql`
+        query UserList {
+            friends(first: 10) @paginate {
+                edges {
+                    node {
+                        id
+                    }
+                }
+            }
+        }
+    `)
+</script>
+
+{#if $pageInfo.hasNextPage}
+    <button on:click={loadNextPage}>load more</button>
+{/if}
+```
+
+### Paginated Fragments
+
+`paginatedFragment` functions very similarly to `paginatedQuery` with a few caveats.
+Consider the following:
+
+```javascript
+const { loadNextPage, data, pageInfo } = paginatedFragment(graphql`
+    fragment UserWithFriends on User {
+        friends(first: 10) @paginate {
+            edges {
+                node {
+                    id
+                }
+            }
+        }
+    }
+`)
+```
+
+In order to look up the next page of the user's friends, we need a way to query the specific user
+that this fragment has been spread into. To pull this off, houdini relies on the generic `Node`
+interface and corresponding query:
+
+```graphql
+interface Node {
+    id: ID!
+}
+
+type Query {
+    node(id: ID!): Node
+}
+```
+
+In short, this means that any paginated fragment must be of a type that implements the Node interface
+(so it can be looked up in the API). You can read more about the `Node` interface in
+[this section](https://graphql.org/learn/global-object-identification/) of the GraphQL community website.
+This is only a requirement for paginated fragments. If your application only uses paginated queries,
+you do not need to implement the Node interface and resolver.
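If your API doesn't already implement `Node`, the server-side support is fairly small. The following is a minimal sketch (not part of this patch) of one way to satisfy it with an apollo-server style resolver map; the `findRecordById` helper and the `__typename` property on each record are hypothetical stand-ins for whatever your own data layer provides:

```javascript
// a minimal sketch, assuming globally unique ids and a hypothetical
// findRecordById(id) helper that can locate any record in your data layer
module.exports.resolvers = {
    Query: {
        // paginated fragments refetch their data through this entry point
        node: (_, { id }) => findRecordById(id),
    },
    Node: {
        // tell the server which concrete type a given record resolves to
        __resolveType: (record) => record.__typename,
    },
}
```

Any implementation works, as long as `node(id)` returns the same record the fragment was originally read from.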
+
+### Mutation Operations
+
+A paginated field can be marked as a potential target for a mutation operation by passing
+a `name` argument to the `@paginate` directive:
+
+```javascript
+const { loadNextPage, data, pageInfo } = paginatedFragment(graphql`
+    fragment UserWithFriends on User {
+        friends(first: 10) @paginate(name: "User_Friends") {
+            edges {
+                node {
+                    id
+                }
+            }
+        }
+    }
+`)
+```
+
## ⚖️ Custom Scalars

Configuring your runtime to handle custom scalars is done under the `scalars` key in your config:

diff --git a/example/package.json b/example/package.json
index 4f6f9e565..f4721d248 100644
--- a/example/package.json
+++ b/example/package.json
@@ -1,7 +1,7 @@
{
    "name": "example-kit",
    "private": true,
-    "version": "0.9.11",
+    "version": "0.10.0-alpha.13",
    "scripts": {
        "dev": "svelte-kit dev",
        "build": "svelte-kit build",
@@ -12,8 +12,8 @@
    "devDependencies": {
        "@sveltejs/kit": "1.0.0-next.107",
        "graphql": "15.5.0",
-        "houdini": "^0.9.11",
-        "houdini-preprocess": "^0.9.11",
+        "houdini": "^0.10.0-alpha.13",
+        "houdini-preprocess": "^0.10.0-alpha.13",
        "svelte": "^3.38.2",
        "svelte-preprocess": "^4.0.0",
        "tslib": "^2.2.0",
@@ -22,6 +22,7 @@
    "type": "module",
    "dependencies": {
        "apollo-server": "^2.24.0",
+        "graphql-relay": "^0.8.0",
        "subscriptions-transport-ws": "^0.9.18"
    }
}
diff --git a/example/schema/index.cjs b/example/schema/index.cjs
index 522a2fbce..d92dad43c 100644
--- a/example/schema/index.cjs
+++ b/example/schema/index.cjs
@@ -1,6 +1,7 @@
const gql = require('graphql-tag')
const { PubSub, withFilter } = require('apollo-server')
const { GraphQLScalarType, Kind } = require('graphql')
+const { connectionFromArray } = require('graphql-relay')

const pubsub = new PubSub()

@@ -20,7 +21,7 @@ module.exports.typeDefs = gql`
    }

    type Query {
-        items(completed: Boolean): [TodoItem!]!
+        items(first: Int, after: String, completed: Boolean): TodoItemConnection!
    }

    type Mutation {
@@ -57,25 +58,51 @@ module.exports.typeDefs = gql`
    type ItemUpdate {
        item: TodoItem!
    }
-`
-id = 3

+    type PageInfo {
+        startCursor: String
+        endCursor: String
+        hasNextPage: Boolean!
+        hasPreviousPage: Boolean!
+    }
+
+    type TodoItemConnection {
+        totalCount: Int!
+        pageInfo: PageInfo!
+        edges: [TodoItemEdge!]!
+    }
+
+    type TodoItemEdge {
+        cursor: String
+        node: TodoItem
+    }
+`

// example data
let items = [
    { id: '1', text: 'Taste JavaScript', createdAt: new Date() },
    { id: '2', text: 'Buy a unicorn', createdAt: new Date() },
+    { id: '3', text: 'Taste more JavaScript', createdAt: new Date() },
+    { id: '4', text: 'Buy another unicorn', createdAt: new Date() },
+    { id: '5', text: 'Taste even more JavaScript', createdAt: new Date() },
+    { id: '6', text: 'Buy a third unicorn', createdAt: new Date() },
]

+id = items.length
+
module.exports.resolvers = {
    Query: {
-        items: (_, { completed } = {}) => {
-            // if completed is undefined there is no filter
-            if (typeof completed === 'undefined') {
-                return items
-            }
+        items: (_, { completed, ...args } = {}) => {
+            const filtered = items.filter((item) =>
+                typeof completed === 'boolean'
+                    ?
Boolean(item.completed) === Boolean(completed)
+                    : true
+            )
+
+            const connection = connectionFromArray(filtered, args)
+            connection.totalCount = items.length

-            return items.filter((item) => Boolean(item.completed) === Boolean(completed))
+            return connection
        },
    },
    Mutation: {
diff --git a/example/schema/schema.gql b/example/schema/schema.gql
index c76382231..bf0046eb8 100644
--- a/example/schema/schema.gql
+++ b/example/schema/schema.gql
@@ -41,7 +41,7 @@ type Mutation {
}

type Query {
-    items(completed: Boolean): [TodoItem!]!
+    items(first: Int, after: String, completed: Boolean): TodoItemConnection!
}

type Subscription {
@@ -61,6 +61,24 @@ type UpdateItemOutput {
    item: TodoItem
}

+type PageInfo {
+    startCursor: String
+    endCursor: String
+    hasNextPage: Boolean!
+    hasPreviousPage: Boolean!
+}
+
+type TodoItemConnection {
+    totalCount: Int!
+    pageInfo: PageInfo!
+    edges: [TodoItemEdge!]!
+}
+
+type TodoItemEdge {
+    cursor: String
+    node: TodoItem
+}
+
"""
The `Upload` scalar type represents a file upload.
"""
diff --git a/example/src/lib/ItemEntry.svelte b/example/src/lib/ItemEntry.svelte
index 169be9d1c..d44fdc8da 100755
--- a/example/src/lib/ItemEntry.svelte
+++ b/example/src/lib/ItemEntry.svelte
@@ -33,7 +33,7 @@
            item {
                id
                completed
-                ...Filtered_Items_remove @when(argument: "completed", value: "false")
+                ...Filtered_Items_remove @when(completed: false)
            }
        }
    }
@@ -44,7 +44,7 @@
            item {
                id
                completed
-                ...Filtered_Items_remove @when(argument: "completed", value: "true")
+                ...Filtered_Items_remove @when(completed: true)
            }
        }
    }
diff --git a/example/src/routes/[filter].svelte b/example/src/routes/[filter].svelte
index 935ad0cac..112395a3e 100644
--- a/example/src/routes/[filter].svelte
+++ b/example/src/routes/[filter].svelte
@@ -18,26 +18,34 @@
diff --git a/jest.setup.js b/jest.setup.js
index 24f12bc97..9b2bdd1b5 100644
--- a/jest.setup.js
+++ b/jest.setup.js
@@ -3,21 +3,57 @@ const graphql = require('graphql')
const { testConfig } = require('houdini-common')
const mockFs = require('mock-fs')
const path = require('path')
+const { toMatchInlineSnapshot } = require('jest-snapshot')
+const fs = require('fs/promises')
+const typeScriptParser = require('recast/parsers/typescript')

process.env.TEST = 'true'

+// the config to use in tests
+const config = testConfig()
+
expect.addSnapshotSerializer({
-    test: (val) => val && val.type,
-    serialize: (val) => recast.print(val).code,
+    test: (val) => val && Object.keys(recast.types.namedTypes).includes(val.type),
+    serialize: (val) => {
+        return recast.print(val).code
+    },
})

expect.addSnapshotSerializer({
-    test: (val) => val && val.kind,
+    test: (val) => val && Object.values(graphql.Kind).includes(val.kind),
    serialize: (val) => graphql.print(val),
})

-// the config to use in tests
-const config = testConfig()
+expect.addSnapshotSerializer({
+    test: (val) =>
+        val &&
+        !Object.values(graphql.Kind).includes(val.kind) &&
+        !Object.keys(recast.types.namedTypes).includes(val.type),
+    serialize: (val) => {
+        return JSON.stringify(val, null, 4)
+    },
+})
+
+expect.extend({
+    async toMatchArtifactSnapshot(value, ...rest) {
+        // The error (and its stacktrace) must be created before any `await`
+        this.error = new Error()
+
+        // assuming that the value we were given is a collected document, figure
+        // out the path holding the artifact
+        const path = config.artifactPath(value.document)
+
+        const artifactContents = await fs.readFile(path, 'utf-8')
+
+        // parse the contents
+        const parsed = recast.parse(artifactContents, {
+            parser: typeScriptParser,
}).program + + return toMatchInlineSnapshot.call(this, parsed, ...rest) + }, +}) beforeEach(() => { mockFs({ diff --git a/lerna.json b/lerna.json index 86e943b38..80f91727b 100644 --- a/lerna.json +++ b/lerna.json @@ -1,5 +1,5 @@ { "packages": ["packages/*", "example"], - "version": "0.9.11", + "version": "0.10.0-alpha.13", "npmClient": "yarn" } diff --git a/packages/houdini-common/package.json b/packages/houdini-common/package.json index 2156090d8..591759481 100755 --- a/packages/houdini-common/package.json +++ b/packages/houdini-common/package.json @@ -1,6 +1,6 @@ { "name": "houdini-common", - "version": "0.9.0", + "version": "0.10.0-alpha.8", "description": "", "main": "build/cjs/index.js", "module": "build/esm/index.js", @@ -19,7 +19,7 @@ "keywords": [], "author": "", "license": "ISC", - "gitHead": "5c8d7507e445cdfb5db7e31ce724a9be9672452c", + "gitHead": "53c9c521029f9539e63fdaf5a9cd11244ef12cd5", "dependencies": { "@babel/parser": "^7.13.4", "mkdirp": "^1.0.4" diff --git a/packages/houdini-common/src/config.ts b/packages/houdini-common/src/config.ts index 8926bde32..afd50a2e1 100644 --- a/packages/houdini-common/src/config.ts +++ b/packages/houdini-common/src/config.ts @@ -295,6 +295,14 @@ export class Config { return 'with' } + get paginateDirective() { + return 'paginate' + } + + paginationQueryName(documentName: string) { + return documentName + '_Pagination_Query' + } + isDeleteDirective(name: string) { return name.endsWith(this.deleteDirectiveSuffix) } @@ -334,6 +342,7 @@ export class Config { this.whenNotDirective, this.argumentsDirective, this.withDirective, + this.paginateDirective, ].includes(name.value) || this.isDeleteDirective(name.value) ) } @@ -419,10 +428,14 @@ export function testConfig(config: Partial = {}) { filepath: path.join(process.cwd(), 'config.cjs'), sourceGlob: '123', schema: ` - type User { + type User implements Node { id: ID! firstName: String! friends: [User!]! + friendsByCursor(first: Int, after: String, last: Int, before: String, filter: String): UserConnection + friendsByBackwardsCursor(last: Int, before: String, filter: String): UserConnection + friendsByForwardsCursor(first: Int, after: String, filter: String): UserConnection + friendsByOffset(offset: Int, limit: Int, filter: String): [User!]! friendsInterface: [Friend!]! believesIn: [Ghost!]! cats: [Cat!]! @@ -435,7 +448,7 @@ export function testConfig(config: Partial = {}) { friends: [Ghost!]! } - type Cat implements Friend { + type Cat implements Friend & Node { id: ID! name: String! owner: User! @@ -448,6 +461,28 @@ export function testConfig(config: Partial = {}) { friends: [Friend!]! users(boolValue: Boolean, intValue: Int, floatValue: Float, stringValue: String!): [User!]! entities: [Entity!]! + usersByCursor(first: Int, after: String, last: Int, before: String): UserConnection + usersByBackwardsCursor(last: Int, before: String): UserConnection + usersByForwardsCursor(first: Int, after: String): UserConnection + usersByOffset(offset: Int, limit: Int): [User!]! + node(id: ID!): Node + } + + type PageInfo { + hasPreviousPage: Boolean! + hasNextPage: Boolean! + startCursor: String! + endCursor: String! + } + + type UserEdge { + cursor: String! + node: User + } + + type UserConnection { + pageInfo: PageInfo! + edges: [UserEdge] } interface Friend { @@ -492,6 +527,10 @@ export function testConfig(config: Partial = {}) { type CatMutationOutput { cat: Cat } + + interface Node { + id: ID! 
+ } `, framework: 'sapper', quiet: true, diff --git a/packages/houdini-common/src/graphql.ts b/packages/houdini-common/src/graphql.ts index 7bfe3cf86..a423dc916 100644 --- a/packages/houdini-common/src/graphql.ts +++ b/packages/houdini-common/src/graphql.ts @@ -162,3 +162,28 @@ function walkAncestors( return getRootType(field.type) as GraphQLParentType } + +export function definitionFromAncestors(ancestors: readonly any[]) { + // in order to look up field type information we have to start at the parent + // and work our way down + // note: the top-most parent is always gonna be a document so we ignore it + let parents = [...ancestors] as ( + | graphql.FieldNode + | graphql.InlineFragmentNode + | graphql.FragmentDefinitionNode + | graphql.OperationDefinitionNode + | graphql.SelectionSetNode + )[] + parents.shift() + + // the first meaningful parent is a definition of some kind + let definition = parents.shift() as + | graphql.FragmentDefinitionNode + | graphql.OperationDefinitionNode + while (Array.isArray(definition) && definition) { + // @ts-ignore + definition = parents.shift() + } + + return definition +} diff --git a/packages/houdini-preprocess/package.json b/packages/houdini-preprocess/package.json index 260609183..1a1a33c11 100755 --- a/packages/houdini-preprocess/package.json +++ b/packages/houdini-preprocess/package.json @@ -1,6 +1,6 @@ { "name": "houdini-preprocess", - "version": "0.9.11", + "version": "0.10.0-alpha.13", "description": "", "main": "build/cjs/index.js", "module": "build/esm/index.js", @@ -35,13 +35,13 @@ "babylon": "^7.0.0-beta.47", "estree-walker": "^2.0.2", "graphql": "15.5.0", - "houdini": "^0.9.11", - "houdini-common": "^0.9.0", + "houdini": "^0.10.0-alpha.13", + "houdini-common": "^0.10.0-alpha.8", "mkdirp": "^1.0.4", "prettier": "*", "prettier-plugin-svelte": "^2.1.1", "recast": "^0.20.4", "svelte": "^3.17.3" }, - "gitHead": "5c8d7507e445cdfb5db7e31ce724a9be9672452c" + "gitHead": "53c9c521029f9539e63fdaf5a9cd11244ef12cd5" } diff --git a/packages/houdini-preprocess/src/transforms/fragment.test.ts b/packages/houdini-preprocess/src/transforms/fragment.test.ts index 322917f72..fae471162 100644 --- a/packages/houdini-preprocess/src/transforms/fragment.test.ts +++ b/packages/houdini-preprocess/src/transforms/fragment.test.ts @@ -28,4 +28,34 @@ describe('fragment preprocessor', function () { }, reference); `) }) + + test('paginated', async function () { + const doc = await preprocessorTest(` + + `) + + // make sure we added the right stuff + expect(doc.instance?.content).toMatchInlineSnapshot(` + import _TestFragment_Pagination_QueryArtifact from "$houdini/artifacts/TestFragment_Pagination_Query"; + import _TestFragmentArtifact from "$houdini/artifacts/TestFragment"; + let reference; + + const data = fragment({ + "kind": "HoudiniFragment", + "artifact": _TestFragmentArtifact, + "config": houdiniConfig, + "paginationArtifact": TestFragment_Pagination_Query + }, reference); + `) + }) }) diff --git a/packages/houdini-preprocess/src/transforms/fragment.ts b/packages/houdini-preprocess/src/transforms/fragment.ts index 9d15ef28c..c855df42e 100644 --- a/packages/houdini-preprocess/src/transforms/fragment.ts +++ b/packages/houdini-preprocess/src/transforms/fragment.ts @@ -31,21 +31,37 @@ export default async function fragmentProcessor( ) }, // if we found a tag we want to replace it with an object that the runtime can use - async onTag({ artifact, node }) { + async onTag({ artifact, node, tagContent }) { // the local identifier for the artifact const artifactVariable = 
artifactIdentifier(artifact) // replace the node with an object - node.replaceWith( - AST.objectExpression([ - AST.objectProperty(AST.stringLiteral('kind'), AST.stringLiteral(artifact.kind)), - AST.objectProperty(AST.literal('artifact'), AST.identifier(artifactVariable)), - AST.objectProperty(AST.literal('config'), AST.identifier('houdiniConfig')), - ]) - ) + const replacement = AST.objectExpression([ + AST.objectProperty(AST.stringLiteral('kind'), AST.stringLiteral(artifact.kind)), + AST.objectProperty(AST.literal('artifact'), AST.identifier(artifactVariable)), + AST.objectProperty(AST.literal('config'), AST.identifier('houdiniConfig')), + ]) // add an import to the body pointing to the artifact - doc.instance?.content.body.unshift(artifactImport(config, artifact)) + doc.instance!.content.body.unshift(artifactImport(config, artifact)) + + // if the fragment is paginated we need to add a reference to the pagination query + if (tagContent.includes(`@${config.paginateDirective}`)) { + // add the import to the pagination query + doc.instance!.content.body.unshift( + artifactImport(config, { name: config.paginationQueryName(artifact.name) }) + ) + + // and a reference in the tag replacement + replacement.properties.push( + AST.objectProperty( + AST.literal('paginationArtifact'), + AST.identifier(config.paginationQueryName(artifact.name)) + ) + ) + } + + node.replaceWith(replacement) }, }) } diff --git a/packages/houdini-preprocess/src/transforms/query.test.ts b/packages/houdini-preprocess/src/transforms/query.test.ts index 0e011fea6..5828768fc 100644 --- a/packages/houdini-preprocess/src/transforms/query.test.ts +++ b/packages/houdini-preprocess/src/transforms/query.test.ts @@ -400,6 +400,52 @@ describe('query preprocessor', function () { `) }) + test('paginated query gets reference to refetch artifact', async function () { + const doc = await preprocessorTest( + ` + + + + ` + ) + expect(doc.instance?.content).toMatchInlineSnapshot(` + import { routeQuery, componentQuery, query } from "$houdini"; + export let _TestQuery = undefined; + export let _TestQuery_Input = undefined; + + let _TestQuery_handler = paginatedQuery({ + "config": houdiniConfig, + "initialValue": _TestQuery, + "variables": _TestQuery_Input, + "kind": "HoudiniQuery", + "artifact": _TestQueryArtifact + }); + + const { + data + } = routeQuery(_TestQuery_handler); + + $: + { + _TestQuery_handler.writeData(_TestQuery, _TestQuery_Input); + } + `) + }) + test('bare svelte component in route filepath', async function () { const doc = await preprocessorTest( ` diff --git a/packages/houdini-preprocess/src/transforms/query.ts b/packages/houdini-preprocess/src/transforms/query.ts index f3f8bc2bb..6142d5549 100644 --- a/packages/houdini-preprocess/src/transforms/query.ts +++ b/packages/houdini-preprocess/src/transforms/query.ts @@ -49,7 +49,8 @@ export default async function queryProcessor( // note: we'll replace the tags as we discover them with something the runtime library can use const queries: EmbeddedGraphqlDocument[] = [] - // we need to keep track of + // remember the function that the document is passed to + let functionName = '' // go to every graphql document await walkTaggedDocuments(config, doc, doc.instance.content, { @@ -65,7 +66,7 @@ export default async function queryProcessor( // we want to replace it with an object that the runtime can use onTag(tag: EmbeddedGraphqlDocument) { // pull out what we need - const { node, parsedDocument, parent, artifact } = tag + const { node, parsedDocument, parent, artifact, tagContent } 
= tag // add the document to the list queries.push(tag) @@ -109,6 +110,9 @@ export default async function queryProcessor( // handler into a value that's useful for the operation context (route vs component) const callParent = parent as namedTypes.CallExpression if (callParent.type === 'CallExpression' && callParent.callee.type === 'Identifier') { + // need to make sure that we call the same function we were passed to + functionName = callParent.callee.name + // update the function called for the environment callParent.callee.name = isRoute ? 'routeQuery' : 'componentQuery' } }, @@ -142,7 +146,7 @@ export default async function queryProcessor( doc.instance.content.body.unshift(artifactImport(config, document.artifact)) } } - processInstance(config, isRoute, doc.instance, queries) + processInstance(config, isRoute, doc.instance, queries, functionName) } function processModule(config: Config, script: Script, queries: EmbeddedGraphqlDocument[]) { @@ -174,7 +178,8 @@ function processInstance( config: Config, isRoute: boolean, script: Script, - queries: EmbeddedGraphqlDocument[] + queries: EmbeddedGraphqlDocument[], + functionName: string ) { // make sure we have the imports we need ensureImports(config, script.content.body, ['routeQuery', 'componentQuery', 'query']) @@ -222,7 +227,7 @@ function processInstance( AST.variableDeclaration('let', [ AST.variableDeclarator( queryHandlerIdentifier(operation), - AST.callExpression(AST.identifier('query'), [ + AST.callExpression(AST.identifier(functionName), [ AST.objectExpression([ AST.objectProperty( AST.stringLiteral('config'), diff --git a/packages/houdini-preprocess/src/utils/walkTaggedDocuments.ts b/packages/houdini-preprocess/src/utils/walkTaggedDocuments.ts index 3ca27b3f1..69fce815a 100644 --- a/packages/houdini-preprocess/src/utils/walkTaggedDocuments.ts +++ b/packages/houdini-preprocess/src/utils/walkTaggedDocuments.ts @@ -29,6 +29,7 @@ export type EmbeddedGraphqlDocument = { remove: () => void replaceWith: (node: BaseNode) => void } + tagContent: string parent: BaseNode } @@ -127,6 +128,7 @@ export default async function walkTaggedDocuments( kind, }, parent, + tagContent, }) } }, diff --git a/packages/houdini/cmd/generate.ts b/packages/houdini/cmd/generate.ts index 68f9bf549..b30d57eae 100755 --- a/packages/houdini/cmd/generate.ts +++ b/packages/houdini/cmd/generate.ts @@ -39,6 +39,7 @@ export const runPipeline = async (config: Config, docs: CollectedGraphQLDocument transforms.typename, validators.uniqueNames, validators.noIDAlias, + transforms.paginate, // must go before fragment variables transforms.fragmentVariables, transforms.composeQueries, generators.artifacts, @@ -127,7 +128,7 @@ async function collectDocuments(config: Config): Promise directive.name.value === config.paginateDirective + ) + const paginationArgs = ['first', 'after', 'last', 'before', 'limit', 'offset'] + const argObj = (secondParse.arguments || []).reduce<{ [key: string]: string }>((acc, arg) => { // the query already contains a serialized version of the argument so just pull it out of the // document string const start = arg.value.loc?.start const end = arg.value.loc?.end + // if the field is paginated, ignore the pagination args in the key + if (paginated && paginationArgs.includes(arg.name.value)) { + return acc + } + // if the argument is not in the query, life doesn't make sense if (!start || !end) { return acc @@ -32,9 +44,17 @@ export default function fieldKey(field: graphql.FieldNode): string { } }, {}) - return Object.values(argObj).length > 0 - ? 
`${attributeName}(${Object.entries(argObj) - .map((entries) => entries.join(': ')) - .join(', ')})` - : attributeName + let key = + Object.values(argObj).length > 0 + ? `${attributeName}(${Object.entries(argObj) + .map((entries) => entries.join(': ')) + .join(', ')})` + : attributeName + + // if the field is paginated, key it differently so other documents can ask for the non paginated value without conflict + if (paginated) { + key = key + '::paginated' + } + + return key } diff --git a/packages/houdini/cmd/generators/artifacts/index.ts b/packages/houdini/cmd/generators/artifacts/index.ts index 2809074de..3bac72ad2 100644 --- a/packages/houdini/cmd/generators/artifacts/index.ts +++ b/packages/houdini/cmd/generators/artifacts/index.ts @@ -1,5 +1,5 @@ // externals -import { Config, getRootType, hashDocument, parentTypeFromAncestors } from 'houdini-common' +import { Config, getRootType, parentTypeFromAncestors } from 'houdini-common' import * as graphql from 'graphql' import { CompiledQueryKind, @@ -16,13 +16,14 @@ import selection from './selection' import { operationsByPath, FilterMap } from './operations' import writeIndexFile from './indexFile' import { inputObject } from './inputs' +import { serializeValue } from './utils' const AST = recast.types.builders // the artifact generator creates files in the runtime directory for each // document containing meta data that the preprocessor might use export default async function artifactGenerator(config: Config, docs: CollectedGraphQLDocument[]) { - // put together the type information for the filter for everylist + // put together the type information for the filter for every list const filterTypes: FilterMap = {} for (const doc of docs) { @@ -89,10 +90,13 @@ export default async function artifactGenerator(config: Config, docs: CollectedG writeIndexFile(config, docs), ].concat( // and an artifact for every document - docs.map(async ({ document, name, generated }) => { + docs.map(async (doc) => { + // pull out the info we need from the collected doc + const { document, name, generate } = doc + // if the document is generated, don't write it to disk - it's use is to provide definitions // for the other transforms - if (generated) { + if (!generate) { return } @@ -147,20 +151,6 @@ export default async function artifactGenerator(config: Config, docs: CollectedG throw new Error('Could not figure out what kind of document we were given') } - // generate a hash of the document that we can use to detect changes - // start building up the artifact - const artifact = AST.objectExpression([ - AST.objectProperty(AST.identifier('name'), AST.stringLiteral(name)), - AST.objectProperty(AST.identifier('kind'), AST.stringLiteral(docKind)), - AST.objectProperty( - AST.identifier('raw'), - AST.templateLiteral( - [AST.templateElement({ raw: rawString, cooked: rawString }, true)], - [] - ) - ), - ]) - let rootType: string | undefined = '' let selectionSet: graphql.SelectionSetNode @@ -201,35 +191,39 @@ export default async function artifactGenerator(config: Config, docs: CollectedG selectionSet = matchingFragment.selectionSet } - // add the selection information so we can subscribe to the store - artifact.properties.push( - AST.objectProperty(AST.identifier('rootType'), AST.stringLiteral(rootType)), - AST.objectProperty( - AST.identifier('selection'), - selection({ - config, - rootType, - selectionSet: selectionSet, - operations: operationsByPath(config, operations[0], filterTypes), - // do not include used fragments if we are rendering the selection - // for a 
fragment document - includeFragments: docKind !== 'HoudiniFragment', - document, - }) - ) - ) - // if there are inputs to the operation const inputs = operations[0]?.variableDefinitions - // add the input type definition to the artifact + + // generate a hash of the document that we can use to detect changes + // start building up the artifact + const artifact: Record = { + name, + kind: docKind, + refetch: doc.refetch, + raw: rawString, + rootType, + selection: selection({ + config, + rootType, + selectionSet: selectionSet, + operations: operationsByPath(config, operations[0], filterTypes), + // do not include used fragments if we are rendering the selection + // for a fragment document + includeFragments: docKind !== 'HoudiniFragment', + document: doc, + }), + } + + // if the document has inputs describe their types in the artifact so we can + // marshal and unmarshal scalars if (inputs && inputs.length > 0) { - artifact.properties.push( - AST.objectProperty(AST.identifier('input'), inputObject(config, inputs)) - ) + artifact.input = inputObject(config, inputs) } // the artifact should be the default export of the file - const file = AST.program([moduleExport(config, 'default', artifact)]) + const file = AST.program([ + moduleExport(config, 'default', serializeValue(artifact)), + ]) // write the result to the artifact path we're configured to write to await writeFile(config.artifactPath(document), recast.print(file).code) diff --git a/packages/houdini/cmd/generators/artifacts/indexFile.ts b/packages/houdini/cmd/generators/artifacts/indexFile.ts index 7cdd0e1f9..085c90cdf 100644 --- a/packages/houdini/cmd/generators/artifacts/indexFile.ts +++ b/packages/houdini/cmd/generators/artifacts/indexFile.ts @@ -6,20 +6,18 @@ import path from 'path' import { CollectedGraphQLDocument } from '../../types' import { cjsIndexFilePreamble, exportDefaultFrom, writeFile } from '../../utils' -const AST = recast.types.builders - export default async function writeIndexFile(config: Config, docs: CollectedGraphQLDocument[]) { - const nonGeneratedDocs = docs.filter((doc) => !doc.generated) + const docsToGenerate = docs.filter((doc) => doc.generate) // we want to export every artifact from the index file. let body = config.module === 'esm' - ? nonGeneratedDocs.reduce( + ? 
docsToGenerate.reduce( (content, doc) => content + `\n export { default as ${doc.name}} from './${doc.name}'`, '' ) - : nonGeneratedDocs.reduce( + : docsToGenerate.reduce( (content, doc) => content + `\n${exportDefaultFrom(`./${doc.name}`, doc.name)}`, cjsIndexFilePreamble ) diff --git a/packages/houdini/cmd/generators/artifacts/inputs.ts b/packages/houdini/cmd/generators/artifacts/inputs.ts index 525c8c721..ed32ac483 100644 --- a/packages/houdini/cmd/generators/artifacts/inputs.ts +++ b/packages/houdini/cmd/generators/artifacts/inputs.ts @@ -2,7 +2,9 @@ import * as recast from 'recast' import * as graphql from 'graphql' import { Config } from 'houdini-common' +// locals import { unwrapType } from '../../utils' +import type { InputObject } from '../../../runtime/types' const AST = recast.types.builders @@ -12,46 +14,37 @@ type ObjectExpression = recast.types.namedTypes.ObjectExpression export function inputObject( config: Config, inputs: readonly graphql.VariableDefinitionNode[] -): ObjectExpression { - // inputs can be recursive so we can't flatten the input type into a single object - - // there will always be an object that maps the root inputs to their type - const properties: ObjectProperty[] = [ - AST.objectProperty( - AST.literal('fields'), - AST.objectExpression( - inputs.map((input) => { - // find the inner type - const { type } = unwrapType(config, input.type) - - // embed the type in the input - return AST.objectProperty( - AST.literal(input.variable.name.value), - AST.stringLiteral(type.name) - ) - }) - ) - ), - ] - +): InputObject { // make sure we don't define the same input type const visitedTypes = new Set() - const typeObjectProperties: ObjectProperty[] = [] + // inputs can be recursive so we can't flatten the input type into a single object + const inputObj: InputObject = { + fields: inputs.reduce((fields, input) => { + // find the inner type + const { type } = unwrapType(config, input.type) + + // embed the type in the input + return { + ...fields, + [input.variable.name.value]: type.name, + } + }, {}), + types: {}, + } + + // walk through every type referenced and add it to the list for (const input of inputs) { - walkInputs(config, visitedTypes, typeObjectProperties, input.type) + walkInputs(config, visitedTypes, inputObj, input.type) } - properties.push( - AST.objectProperty(AST.literal('types'), AST.objectExpression(typeObjectProperties)) - ) - return AST.objectExpression(properties) + return inputObj } function walkInputs( config: Config, visitedTypes: Set, - properties: ObjectProperty[], + inputObj: InputObject, rootType: graphql.TypeNode | graphql.GraphQLNamedType ) { // find the core type @@ -75,22 +68,18 @@ function walkInputs( visitedTypes.add(type.name) // generate the entry for the type - properties.push( - AST.objectProperty( - AST.literal(type.name), - AST.objectExpression( - Object.values(type.getFields()).map((field: graphql.GraphQLInputField) => { - const { type: fieldType } = unwrapType(config, field.type) + inputObj!.types[type.name] = Object.values(type.getFields()).reduce( + (typeFields, field: graphql.GraphQLInputField) => { + const { type: fieldType } = unwrapType(config, field.type) - // keep walking down - walkInputs(config, visitedTypes, properties, fieldType) + // keep walking down + walkInputs(config, visitedTypes, inputObj, fieldType) - return AST.objectProperty( - AST.literal(field.name), - AST.stringLiteral(fieldType.toString()) - ) - }) - ) - ) + return { + ...typeFields, + [field.name]: fieldType.toString(), + } + }, + {} ) } diff --git 
a/packages/houdini/cmd/generators/artifacts/operations.ts b/packages/houdini/cmd/generators/artifacts/operations.ts index 4365b382a..6fb762070 100644 --- a/packages/houdini/cmd/generators/artifacts/operations.ts +++ b/packages/houdini/cmd/generators/artifacts/operations.ts @@ -4,6 +4,7 @@ import { Config, parentTypeFromAncestors } from 'houdini-common' import { ListWhen, MutationOperation } from '../../../runtime' import * as recast from 'recast' import * as graphql from 'graphql' +import { convertValue } from './utils' const AST = recast.types.builders @@ -12,9 +13,9 @@ export function operationsByPath( config: Config, definition: graphql.OperationDefinitionNode, filterTypes: FilterMap -): { [path: string]: namedTypes.ArrayExpression } { +): { [path: string]: MutationOperation[] } { // map the path in the response to the list of operations that treat it as the source - const pathOperations: { [path: string]: namedTypes.ArrayExpression } = {} + const pathOperations: { [path: string]: MutationOperation[] } = {} // we need to look for three different things in the operation: // - insert fragments @@ -33,18 +34,17 @@ export function operationsByPath( // if this is the first time we've seen this path give us a home const path = ancestorKey(ancestors) if (!pathOperations[path]) { - pathOperations[path] = AST.arrayExpression([]) + pathOperations[path] = [] } // add the operation object to the list - pathOperations[path].elements.push( + pathOperations[path].push( operationObject({ config, listName: config.listNameFromFragment(node.name.value), operationKind: config.listOperationFromFragment(node.name.value), - info: operationInfo(config, node), type: parentTypeFromAncestors(config.schema, ancestors).name, - filterTypes, + selection: node, }) ) }, @@ -57,21 +57,17 @@ export function operationsByPath( // if this is the first time we've seen this path give us a home const path = ancestorKey(ancestors) if (!pathOperations[path]) { - pathOperations[path] = AST.arrayExpression([]) + pathOperations[path] = [] } // add the operation object to the list - pathOperations[path].elements.push( + pathOperations[path].push( operationObject({ config, listName: node.name.value, operationKind: 'delete', - info: operationInfo( - config, - ancestors[ancestors.length - 1] as graphql.FieldNode - ), type: config.listNameFromDirective(node.name.value), - filterTypes, + selection: ancestors[ancestors.length - 1] as graphql.FieldNode, }) ) }, @@ -84,99 +80,15 @@ function operationObject({ config, listName, operationKind, - info, type, - filterTypes, + selection, }: { config: Config listName: string - operationKind: string - info: OperationInfo + operationKind: MutationOperation['action'] type: string - filterTypes: FilterMap -}) { - const operation = AST.objectExpression([ - AST.objectProperty(AST.literal('action'), AST.stringLiteral(operationKind)), - ]) - - // delete doesn't have a target - if (operationKind !== 'delete') { - operation.properties.push( - AST.objectProperty(AST.literal('list'), AST.stringLiteral(listName)) - ) - } - - // add the target type to delete operations - if (operationKind === 'delete' && type) { - operation.properties.push(AST.objectProperty(AST.literal('type'), AST.stringLiteral(type))) - } - - // only add the position argument if we are inserting something - if (operationKind === 'insert') { - operation.properties.push( - AST.objectProperty(AST.literal('position'), AST.stringLiteral(info.position || 'last')) - ) - } - - // if there is a parent id - if (info.parentID) { - // add it to the 
object - operation.properties.push( - AST.objectProperty( - AST.literal('parentID'), - AST.objectExpression([ - AST.objectProperty(AST.literal('kind'), AST.stringLiteral(info.parentID.kind)), - AST.objectProperty( - AST.literal('value'), - AST.stringLiteral(info.parentID.value) - ), - ]) - ) - ) - } - - // if there is a conditional - if (info.when) { - // build up the when object - const when = AST.objectExpression([]) - - // if there is a must - if (info.when.must) { - when.properties.push( - AST.objectProperty( - AST.literal('must'), - filterAST(filterTypes, listName, info.when.must) - ) - ) - } - - // if there is a must_not - if (info.when.must_not) { - when.properties.push( - AST.objectProperty( - AST.literal('must_not'), - filterAST(filterTypes, listName, info.when.must_not) - ) - ) - } - - // add it to the object - operation.properties.push(AST.objectProperty(AST.literal('when'), when)) - } - - return operation -} - -type OperationInfo = { - position: string - parentID?: { - value: string - kind: string - } - when?: ListWhen -} - -function operationInfo(config: Config, selection: graphql.SelectionNode): OperationInfo { + selection: graphql.SelectionNode +}): MutationOperation { // look at the directives applies to the spread for meta data about the mutation let parentID let parentKind: 'Variable' | 'String' = 'String' @@ -247,17 +159,6 @@ function operationInfo(config: Config, selection: graphql.SelectionNode): Operat continue } - // which are we looking at - const which = i ? 'must_not' : 'must' - // build up all of the values into a single object - const key = arg.value.fields.find(({ name, value }) => name.value === 'argument')?.value - const value = arg.value.fields.find(({ name }) => name.value === 'value') - - // make sure we got a string for the key - if (key?.kind !== 'StringValue' || !value || value.value.kind !== 'StringValue') { - throw new Error('Key and Value must be strings') - } - // make sure we have a place to record the when condition if (!operationWhen) { operationWhen = {} @@ -266,9 +167,13 @@ function operationInfo(config: Config, selection: graphql.SelectionNode): Operat // the kind of `value` is always going to be a string because the directive // can only take one type as its argument so we'll worry about parsing when // generating the artifact - operationWhen[which] = { - [key.value]: value.value.value, - } + operationWhen[i ? 'must_not' : 'must'] = arg.value.fields.reduce( + (obj, arg) => ({ + ...obj, + [arg.name.value]: convertValue(arg.value).value, + }), + {} + ) } // look at the when and when_not directives @@ -280,74 +185,56 @@ function operationInfo(config: Config, selection: graphql.SelectionNode): Operat // which are we looking at const which = i ? 
'must_not' : 'must' - // look for the argument field - const key = directive.arguments?.find(({ name }) => name.value === 'argument') - const value = directive.arguments?.find(({ name }) => name.value === 'value') - - // make sure we got a string for the key - if (key?.value.kind !== 'StringValue' || !value || value.value.kind !== 'StringValue') { - throw new Error('Key and Value must be strings') - } - // make sure we have a place to record the when condition if (!operationWhen) { operationWhen = {} } - // the kind of `value` is always going to be a string because the directive - // can only take one type as its argument so we'll worry about parsing when - // generating the patches - operationWhen[which] = { - [key.value.value]: value.value.value, - } + // look for the argument field + operationWhen[which] = directive.arguments?.reduce( + (filters, argument) => ({ + ...filters, + [argument.name.value]: convertValue(argument.value).value, + }), + {} + ) } } - return { - parentID: parentID - ? { - value: parentID, - kind: parentKind, - } - : undefined, - position, - when: operationWhen, + const operation: MutationOperation = { + action: operationKind, + } + + // delete doesn't have a target + if (operationKind !== 'delete') { + operation.list = listName } -} -function filterAST( - filterTypes: FilterMap, - listName: string, - filter: ListWhen['must'] -): namedTypes.ObjectExpression { - if (!filter) { - return AST.objectExpression([]) + // add the target type to delete operations + if (operationKind === 'delete' && type) { + operation.type = type } - // build up the object - return AST.objectExpression( - Object.entries(filter).map(([key, value]) => { - // look up the key in the type map - const type = filterTypes[listName] && filterTypes[listName][key] - if (!type) { - throw new Error(`It looks like "${key}" is an invalid filter for list ${listName}`) - } + // only add the position argument if we are inserting something + if (operationKind === 'insert') { + operation.position = position || 'last' + } - let literal - if (type === 'String') { - literal = AST.stringLiteral(value as string) - } else if (type === 'Boolean') { - literal = AST.booleanLiteral(value === 'true') - } else if (type === 'Float') { - literal = AST.numericLiteral(parseFloat(value as string)) - } else if (type === 'Int') { - literal = AST.numericLiteral(parseInt(value as string, 10)) - } else { - throw new Error('Could not figure out filter value with type: ' + type) - } - return AST.objectProperty(AST.literal(key), literal) - }) - ) + // if there is a parent id + if (parentID) { + // add it to the object + operation.parentID = { + kind: parentKind, + value: parentID, + } + } + + // if there is a conditional + if (operationWhen) { + operation.when = operationWhen + } + + return operation } // TODO: find a way to reference the actual type for ancestors, using any as escape hatch diff --git a/packages/houdini/cmd/generators/artifacts/pagination.test.ts b/packages/houdini/cmd/generators/artifacts/pagination.test.ts new file mode 100644 index 000000000..15755886c --- /dev/null +++ b/packages/houdini/cmd/generators/artifacts/pagination.test.ts @@ -0,0 +1,439 @@ +// external imports +import { testConfig } from 'houdini-common' +// local imports +import '../../../../../jest.setup' +import { runPipeline } from '../../generate' +import { mockCollectedDoc } from '../../testUtils' + +// the config to use in tests +const config = testConfig() + +test('pagination arguments stripped from key', async function () { + const docs = [ + 
mockCollectedDoc( + ` + fragment PaginatedFragment on User { + friendsByCursor(first:10, filter: "hello") @paginate { + edges { + node { + id + } + } + } + } + ` + ), + ] + + await runPipeline(config, docs) + + // look at the artifact for the generated pagination query + await expect(docs[0]).toMatchArtifactSnapshot(` + module.exports = { + name: "PaginatedFragment", + kind: "HoudiniFragment", + + refetch: { + update: "append", + path: ["friendsByCursor"], + method: "cursor", + pageSize: 10, + embedded: true + }, + + raw: \`fragment PaginatedFragment on User { + friendsByCursor(first: $first, filter: "hello", after: $after) { + edges { + node { + id + } + } + edges { + cursor + node { + __typename + } + } + pageInfo { + hasPreviousPage + hasNextPage + startCursor + endCursor + } + } + } + \`, + + rootType: "User", + + selection: { + friendsByCursor: { + type: "UserConnection", + keyRaw: "friendsByCursor(filter: \\"hello\\")::paginated", + + fields: { + edges: { + type: "UserEdge", + keyRaw: "edges", + + fields: { + cursor: { + type: "String", + keyRaw: "cursor" + }, + + node: { + type: "User", + keyRaw: "node", + + fields: { + __typename: { + type: "String", + keyRaw: "__typename" + }, + + id: { + type: "ID", + keyRaw: "id" + } + } + } + }, + + update: "append" + }, + + pageInfo: { + type: "PageInfo", + keyRaw: "pageInfo", + + fields: { + hasPreviousPage: { + type: "Boolean", + keyRaw: "hasPreviousPage" + }, + + hasNextPage: { + type: "Boolean", + keyRaw: "hasNextPage" + }, + + startCursor: { + type: "String", + keyRaw: "startCursor" + }, + + endCursor: { + type: "String", + keyRaw: "endCursor" + } + } + } + } + } + } + }; + `) +}) + +test('offset based pagination marks appropriate field', async function () { + const docs = [ + mockCollectedDoc( + ` + fragment PaginatedFragment on User { + friendsByOffset(limit:10, filter: "hello") @paginate { + id + } + } + ` + ), + ] + + await runPipeline(config, docs) + + // look at the artifact for the generated pagination query + await expect(docs[0]).toMatchArtifactSnapshot(` + module.exports = { + name: "PaginatedFragment", + kind: "HoudiniFragment", + + refetch: { + update: "append", + path: ["friendsByOffset"], + method: "offset", + pageSize: 10, + embedded: true + }, + + raw: \`fragment PaginatedFragment on User { + friendsByOffset(limit: $limit, filter: "hello", offset: $offset) { + id + } + } + \`, + + rootType: "User", + + selection: { + friendsByOffset: { + type: "User", + keyRaw: "friendsByOffset(filter: \\"hello\\")::paginated", + update: "append", + + fields: { + id: { + type: "ID", + keyRaw: "id" + } + } + } + } + }; + `) +}) + +test("sibling aliases don't get marked", async function () { + const docs = [ + mockCollectedDoc( + ` + fragment PaginatedFragment on User { + friendsByCursor(first:10, filter: "hello") @paginate { + edges { + node { + friendsByCursor { + edges { + node { + id + } + } + } + } + } + } + friends: friendsByCursor(first:10, filter: "hello") { + edges { + node { + friendsByCursor { + edges { + node { + id + } + } + } + } + } + } + } + ` + ), + ] + + await runPipeline(config, docs) + + // look at the artifact for the generated pagination query + await expect(docs[0]).toMatchArtifactSnapshot(` + module.exports = { + name: "PaginatedFragment", + kind: "HoudiniFragment", + + refetch: { + update: "append", + path: ["friendsByCursor"], + method: "cursor", + pageSize: 10, + embedded: true + }, + + raw: \`fragment PaginatedFragment on User { + friendsByCursor(first: $first, filter: "hello", after: $after) { + edges { + node { 
+ friendsByCursor { + edges { + node { + id + } + } + } + id + } + } + edges { + cursor + node { + __typename + } + } + pageInfo { + hasPreviousPage + hasNextPage + startCursor + endCursor + } + } + friends: friendsByCursor(first: 10, filter: "hello") { + edges { + node { + friendsByCursor { + edges { + node { + id + } + } + } + id + } + } + } + } + \`, + + rootType: "User", + + selection: { + friendsByCursor: { + type: "UserConnection", + keyRaw: "friendsByCursor(filter: \\"hello\\")::paginated", + + fields: { + edges: { + type: "UserEdge", + keyRaw: "edges", + + fields: { + cursor: { + type: "String", + keyRaw: "cursor" + }, + + node: { + type: "User", + keyRaw: "node", + + fields: { + __typename: { + type: "String", + keyRaw: "__typename" + }, + + friendsByCursor: { + type: "UserConnection", + keyRaw: "friendsByCursor", + + fields: { + edges: { + type: "UserEdge", + keyRaw: "edges", + + fields: { + node: { + type: "User", + keyRaw: "node", + + fields: { + id: { + type: "ID", + keyRaw: "id" + } + } + } + } + } + } + }, + + id: { + type: "ID", + keyRaw: "id" + } + } + } + }, + + update: "append" + }, + + pageInfo: { + type: "PageInfo", + keyRaw: "pageInfo", + + fields: { + hasPreviousPage: { + type: "Boolean", + keyRaw: "hasPreviousPage" + }, + + hasNextPage: { + type: "Boolean", + keyRaw: "hasNextPage" + }, + + startCursor: { + type: "String", + keyRaw: "startCursor" + }, + + endCursor: { + type: "String", + keyRaw: "endCursor" + } + } + } + } + }, + + friends: { + type: "UserConnection", + keyRaw: "friends(first: 10, filter: \\"hello\\")", + + fields: { + edges: { + type: "UserEdge", + keyRaw: "edges", + + fields: { + node: { + type: "User", + keyRaw: "node", + + fields: { + friendsByCursor: { + type: "UserConnection", + keyRaw: "friendsByCursor", + + fields: { + edges: { + type: "UserEdge", + keyRaw: "edges", + + fields: { + node: { + type: "User", + keyRaw: "node", + + fields: { + id: { + type: "ID", + keyRaw: "id" + } + } + } + } + } + } + }, + + id: { + type: "ID", + keyRaw: "id" + } + } + } + } + } + } + } + } + }; + `) +}) diff --git a/packages/houdini/cmd/generators/artifacts/selection.ts b/packages/houdini/cmd/generators/artifacts/selection.ts index 114589b92..11e604501 100644 --- a/packages/houdini/cmd/generators/artifacts/selection.ts +++ b/packages/houdini/cmd/generators/artifacts/selection.ts @@ -2,9 +2,12 @@ import { Config, getRootType } from 'houdini-common' import * as graphql from 'graphql' import * as recast from 'recast' -import { namedTypes } from 'ast-types/gen/namedTypes' // locals import fieldKey from './fieldKey' +import { CollectedGraphQLDocument } from '../../types' +import type { MutationOperation, SubscriptionSelection } from '../../../runtime' +import { convertValue, deepMerge } from './utils' +import { connectionSelection } from '../../transforms/list' const AST = recast.types.builders @@ -16,60 +19,59 @@ export default function selection({ path = [], includeFragments, document, + markEdges, }: { config: Config rootType: string selectionSet: graphql.SelectionSetNode - operations: { [path: string]: namedTypes.ArrayExpression } + operations: { [path: string]: MutationOperation[] } path?: string[] includeFragments: boolean - document: graphql.DocumentNode -}): namedTypes.ObjectExpression { + document: CollectedGraphQLDocument + markEdges?: string +}): SubscriptionSelection { // we need to build up an object that contains every field in the selection - const object = AST.objectExpression([]) + let object: SubscriptionSelection = {} for (const field of 
selectionSet.selections) { // ignore fragment spreads if (field.kind === 'FragmentSpread' && includeFragments) { // look up the fragment definition - const fragmentDefinition = document.definitions.find( + const fragmentDefinition = document.document.definitions.find( (defn) => defn.kind === 'FragmentDefinition' && defn.name.value === field.name.value ) as graphql.FragmentDefinitionNode if (!fragmentDefinition) { throw new Error('Could not find definition for fragment ' + field.name.value) } - const fragmentFields = selection({ - config, - rootType: fragmentDefinition.typeCondition.name.value, - operations, - selectionSet: fragmentDefinition.selectionSet, - path, - includeFragments, - document, - }) - for (const property of fragmentFields.properties) { - object.properties.push( - ...fragmentFields.properties.filter( - (prop) => prop.type === 'ObjectProperty' && prop.key - ) - ) - } + // merge the fragments selection into ours + object = deepMerge( + object, + selection({ + config, + rootType: fragmentDefinition.typeCondition.name.value, + operations, + selectionSet: fragmentDefinition.selectionSet, + path, + includeFragments, + document, + }) + ) } // inline fragments should be merged with the parent else if (field.kind === 'InlineFragment') { - const inlineFragment = selection({ - config, - rootType: field.typeCondition?.name.value || rootType, - operations, - selectionSet: field.selectionSet, - path, - includeFragments, - document, - }) - for (const property of inlineFragment.properties) { - object.properties.push(property) - } + object = deepMerge( + object, + selection({ + config, + rootType: field.typeCondition?.name.value || rootType, + operations, + selectionSet: field.selectionSet, + path, + includeFragments, + document, + }) + ) } // fields need their own entry else if (field.kind === 'Field') { @@ -93,34 +95,67 @@ export default function selection({ const pathSoFar = path.concat(attributeName) // the object holding data for this field - const fieldObj = AST.objectExpression([ - AST.objectProperty(AST.literal('type'), AST.stringLiteral(typeName)), - AST.objectProperty(AST.literal('keyRaw'), AST.stringLiteral(fieldKey(field))), - ]) + const fieldObj: SubscriptionSelection['field'] = { + type: typeName, + keyRaw: fieldKey(config, field), + } // is there an operation for this field const operationKey = pathSoFar.join(',') if (operations[operationKey]) { - fieldObj.properties.push( - AST.objectProperty(AST.literal('operations'), operations[operationKey]) - ) + fieldObj.operations = operations[operationKey] } // get the name of the list directive tagging this field - const nameArg = field.directives - ?.find((directive) => directive.name.value === config.listDirective) - ?.arguments?.find((arg) => arg.name.value === 'name') - let list + const listDirective = field.directives?.find((directive) => + [config.listDirective, config.paginateDirective].includes(directive.name.value) + ) + const nameArg = listDirective?.arguments?.find((arg) => arg.name.value === 'name') if (nameArg && nameArg.value.kind === 'StringValue') { - list = nameArg.value.value - fieldObj.properties.push( - AST.objectProperty(AST.literal('list'), AST.stringLiteral(list)) + const { connection, type: connectionType } = connectionSelection( + config, + type.getFields()[field.name.value] as graphql.GraphQLField, + fieldType as graphql.GraphQLObjectType, + field.selectionSet ) + + fieldObj.list = { + name: nameArg.value.value, + connection, + type: connectionType.name, + } + } + + // if the field is marked for pagination 
we want to leave something behind + // so that cache.write can perform the necessary inserts when appropriate + const paginated = field.directives?.find( + (directive) => directive.name.value === config.paginateDirective + ) + + // if the field is marked for offset pagination we need to mark this field + if (paginated && document.refetch && document.refetch.method === 'offset') { + fieldObj.update = document.refetch.update + } + + // if we are looking at the edges field and we're supposed to mark it for pagination + if (attributeName === 'edges' && markEdges && document.refetch) { + // otherwise mark this field + fieldObj.update = document.refetch.update + + // make sure we don't mark the children + markEdges = '' } // only add the field object if there are properties in it if (field.selectionSet) { - const selectionObj = selection({ + // if this field was marked for cursor based pagination we need to mark + // the edges field that falls underneath it + const edgesMark = + paginated && document.refetch?.method === 'cursor' + ? document.refetch.update + : markEdges + + fieldObj.fields = selection({ config, rootType: typeName, selectionSet: field.selectionSet, @@ -128,251 +163,32 @@ export default function selection({ path: pathSoFar, includeFragments, document, + markEdges: edgesMark, }) - fieldObj.properties.push(AST.objectProperty(AST.literal('fields'), selectionObj)) } // any arguments on the list field can act as a filter - if (field.arguments?.length && list) { - fieldObj.properties.push( - AST.objectProperty( - AST.stringLiteral('filters'), - AST.objectExpression( - (field.arguments || []).flatMap((arg) => { - // figure out the value to use - let value - let kind - - // the value of the arg is always going to be a - - if (arg.value.kind === graphql.Kind.INT) { - value = AST.literal(parseInt(arg.value.value, 10)) - kind = 'Int' - } else if (arg.value.kind === graphql.Kind.FLOAT) { - value = AST.literal(parseFloat(arg.value.value)) - kind = 'Float' - } else if (arg.value.kind === graphql.Kind.BOOLEAN) { - value = AST.booleanLiteral(arg.value.value) - kind = 'Boolean' - } else if (arg.value.kind === graphql.Kind.VARIABLE) { - value = AST.stringLiteral(arg.value.name.value) - kind = 'Variable' - } else if (arg.value.kind === graphql.Kind.STRING) { - value = AST.stringLiteral(arg.value.value) - kind = 'String' - } - - if (!value || !kind) { - return [] - } - - return [ - AST.objectProperty( - AST.stringLiteral(arg.name.value), - AST.objectExpression([ - AST.objectProperty( - AST.literal('kind'), - AST.stringLiteral(kind) - ), - AST.objectProperty(AST.literal('value'), value), - ]) - ), - ] - }) - ) - ) + if (field.arguments?.length && fieldObj.list) { + fieldObj.filters = (field.arguments || []).reduce( + (filters, arg) => ({ + ...filters, + [arg.name.value]: convertValue(arg.value), + }), + {} ) } // if we are looking at an interface if (graphql.isInterfaceType(fieldType) || graphql.isUnionType(fieldType)) { - fieldObj.properties.push( - AST.objectProperty(AST.stringLiteral('abstract'), AST.booleanLiteral(true)) - ) + fieldObj.abstract = true } // add the field data we computed - object.properties.push(AST.objectProperty(AST.stringLiteral(attributeName), fieldObj)) - } - } - - return mergeSelections(config, object.properties as namedTypes.ObjectProperty[]) -} - -// different fragments can provide different selections of the same field -// we need to merge them into one single selection -function mergeSelections( - config: Config, - selections: namedTypes.ObjectProperty[] -): 
namedTypes.ObjectExpression { - // we need to group together every field in the selection by its name - const fields = selections.reduce<{ - [key: string]: namedTypes.ObjectProperty[] - }>((prev, property) => { - // the key - const key = (property.key as namedTypes.StringLiteral).value - return { - ...prev, - [key]: (prev[key] || []).concat(property), - } - }, {}) - - // build up an object - const obj = AST.objectExpression([]) - - // visit every set of properties - for (const [attributeName, properties] of Object.entries(fields)) { - // the type and key name should all be the same - const types = properties.map( - (property) => - (property.value as namedTypes.ObjectExpression).properties.find( - (prop) => - prop.type === 'ObjectProperty' && - prop.key.type === 'Literal' && - prop.key.value === 'type' - // @ts-ignore - )?.value.value - ) - if (new Set(types).size !== 1) { - throw new Error( - 'Encountered multiple types at the same field. Found ' + - JSON.stringify([...new Set(types)]) - ) - } - const keys = properties.map( - (property) => - (property.value as namedTypes.ObjectExpression).properties.find( - (prop) => - prop.type === 'ObjectProperty' && - prop.key.type === 'Literal' && - prop.key.value === 'keyRaw' - // @ts-ignore - )?.value.value - ) - if (new Set(keys).size !== 1) { - throw new Error( - 'Encountered multiple keys at the same field. Found ' + - JSON.stringify([...new Set(keys)]) - ) - } - const lists = properties - .map( - (property) => - (property.value as namedTypes.ObjectExpression).properties.find( - (prop) => - prop.type === 'ObjectProperty' && - prop.key.type === 'Literal' && - prop.key.value === 'list' - // @ts-ignore - )?.value.value - ) - .filter(Boolean) - const operations = properties - .flatMap( - (property) => - (property.value as namedTypes.ObjectExpression).properties.find( - (prop) => - prop.type === 'ObjectProperty' && - prop.key.type === 'Literal' && - prop.key.value === 'operations' - // @ts-ignore - )?.value.elements - ) - .filter(Boolean) - - const filters = properties - .map((property) => - (property.value as namedTypes.ObjectExpression).properties.find( - (prop) => - prop.type === 'ObjectProperty' && - prop.key.type === 'StringLiteral' && - prop.key.value === 'filters' - ) - ) - .filter(Boolean)[0] as namedTypes.ObjectProperty - - const abstractFlags = properties - .map( - (property) => - (property.value as namedTypes.ObjectExpression).properties.find( - (prop) => - prop.type === 'ObjectProperty' && - prop.key.type === 'StringLiteral' && - prop.key.value === 'abstract' - // @ts-ignore - )?.value.value - ) - .filter(Boolean) - - // look at the first one in the list to check type - const typeProperty = types[0] - const key = keys[0] - const list = lists[0] - const abstractFlag = abstractFlags[0] - - // if the type is a scalar just add the first one and move on - if (config.isSelectionScalar(typeProperty)) { - obj.properties.push(properties[0]) - continue - } - - const fields = properties - .map( - (property) => - (property.value as namedTypes.ObjectExpression).properties.find( - (prop) => - prop.type === 'ObjectProperty' && - prop.key.type === 'Literal' && - prop.key.value === 'fields' - // @ts-ignore - )?.value - ) - .flatMap((obj) => obj && obj.properties) - .filter(Boolean) - - if (fields) { - const fieldObj = AST.objectExpression([ - AST.objectProperty(AST.literal('type'), AST.stringLiteral(typeProperty)), - AST.objectProperty(AST.literal('keyRaw'), AST.stringLiteral(key)), - ]) - - // perform the merge - const merged = mergeSelections(config, 
fields as namedTypes.ObjectProperty[]) - if (merged.properties.length > 0) { - fieldObj.properties.push(AST.objectProperty(AST.literal('fields'), merged)) - } - - // add the list field if its present - if (list) { - fieldObj.properties.push( - AST.objectProperty(AST.literal('list'), AST.stringLiteral(list)) - ) - } - - // if its marked as a list - if (abstractFlag) { - fieldObj.properties.push( - AST.objectProperty(AST.literal('abstract'), AST.booleanLiteral(abstractFlag)) - ) - } - - // if there are any operations - if (operations.length > 0) { - fieldObj.properties.push( - AST.objectProperty( - AST.literal('operations'), - AST.arrayExpression(operations.reduce((prev, acc) => prev.concat(acc), [])) - ) - ) - } - - if (filters) { - fieldObj.properties.push(filters) - } - - obj.properties.push(AST.objectProperty(AST.literal(attributeName), fieldObj)) + object[attributeName] = deepMerge( + fieldObj, + object[attributeName] || {} + ) as SubscriptionSelection['field'] } } - // we're done - return obj + return object } diff --git a/packages/houdini/cmd/generators/artifacts/utils.test.ts b/packages/houdini/cmd/generators/artifacts/utils.test.ts new file mode 100644 index 000000000..0ef844224 --- /dev/null +++ b/packages/houdini/cmd/generators/artifacts/utils.test.ts @@ -0,0 +1,116 @@ +import { deepMerge } from './utils' + +describe('deep merge', function () { + test('non-conflicting keys', function () { + const one = { + hello: 'world', + } + const two = { + goodbye: 'moon', + } + + expect(deepMerge(one, two)).toEqual({ + hello: 'world', + goodbye: 'moon', + }) + }) + + test('nested objects', function () { + const one = { + hello: { + message: 'world', + }, + } + const two = { + hello: { + anotherMessage: 'moon', + }, + } + expect(deepMerge(one, two)).toEqual({ + hello: { + message: 'world', + anotherMessage: 'moon', + }, + }) + }) + + test('conflicting keys - same value', function () { + const one = { + hello: 'world', + } + const two = { + hello: 'world', + } + expect(deepMerge(one, two)).toEqual({ + hello: 'world', + }) + }) + + test('conflicting keys - different value', function () { + const one = { + hello: 'world', + } + const two = { + hello: 'moon', + } + expect(() => deepMerge(one, two)).toThrow() + }) + + test('three-way merge', function () { + const one = { + message1: 'hello world', + } + const two = { + message2: 'goodbye moon', + } + const three = { + message3: "i don't know", + } + + expect(deepMerge(one, two, three)).toEqual({ + message1: 'hello world', + message2: 'goodbye moon', + message3: "i don't know", + }) + }) + + test('three way deep nested', function () { + const one = { + message1: 'hello world', + nested: { + nestedMessage1: 'another world', + }, + } + const two = { + message2: 'goodbye moon', + nested: { + inner: { + innerMessage2: 'yet another moon', + }, + }, + } + const three = { + message3: "i don't know", + nested: { + nestedMessage3: 'another uncertainty', + inner: { + innerMessage3: 'yet another uncertainty', + }, + }, + } + + expect(deepMerge(one, two, three)).toEqual({ + message1: 'hello world', + message2: 'goodbye moon', + message3: "i don't know", + nested: { + nestedMessage1: 'another world', + nestedMessage3: 'another uncertainty', + inner: { + innerMessage2: 'yet another moon', + innerMessage3: 'yet another uncertainty', + }, + }, + }) + }) +}) diff --git a/packages/houdini/cmd/generators/artifacts/utils.ts b/packages/houdini/cmd/generators/artifacts/utils.ts new file mode 100644 index 000000000..257d62812 --- /dev/null +++ 
b/packages/houdini/cmd/generators/artifacts/utils.ts @@ -0,0 +1,106 @@ +import * as recast from 'recast' +import { ExpressionKind } from 'ast-types/gen/kinds' +import * as graphql from 'graphql' + +const AST = recast.types.builders + +export function serializeValue(value: any): ExpressionKind { + // if we are serializing a list + if (Array.isArray(value)) { + // return an array expression with every element serialized + return AST.arrayExpression(value.map(serializeValue)) + } + + // if we are serializing an object + if (typeof value === 'object' && value !== null) { + return AST.objectExpression( + Object.entries(value) + .filter(([, value]) => typeof value !== 'undefined') + .map(([key, value]) => + AST.objectProperty(AST.identifier(key), serializeValue(value)) + ) + ) + } + + // if we are serializing a string + if (typeof value === 'string') { + // if there are new lines, use a template. otherwise, just use a string + if (value.indexOf('\n') !== -1) { + return AST.templateLiteral( + [AST.templateElement({ raw: value, cooked: value }, true)], + [] + ) + } + return AST.stringLiteral(value) + } + + // anything else can just use its literal value + return AST.literal(value) +} + +export function deepMerge(...targets: {}[]): {} { + // look at the first target to know what type we're merging + + // if we aren't looking at an object + if (typeof targets[0] !== 'object') { + // make sure all of the values are the same + const matches = targets.filter((val) => val !== targets[0]).length === 0 + if (!matches) { + throw new Error('could not merge: ' + targets) + } + + // return the matching value + return targets[0] + } + + // if we are looking at a list of lists + if (Array.isArray(targets[0])) { + return (targets[0] as {}[]).concat(...targets.slice(1)) + } + + // collect all of the fields that the targets specify and map them to their values + const fields: Record<string, any[]> = {} + + for (const target of targets) { + // add every field of the target to the bag + for (const [key, value] of Object.entries(target)) { + // if we haven't seen the key before + if (!fields[key]) { + // save it as a list + fields[key] = [] + } + + fields[key].push(value) + } + } + + return Object.fromEntries( + Object.entries(fields).map(([key, value]) => [key, deepMerge(...value)]) + ) +} + +export function convertValue(val: graphql.ValueNode) { + // figure out the value to use + let value + let kind + + // the value of the arg is always going to be a scalar or a variable reference + if (val.kind === graphql.Kind.INT) { + value = parseInt(val.value, 10) + kind = 'Int' + } else if (val.kind === graphql.Kind.FLOAT) { + value = parseFloat(val.value) + kind = 'Float' + } else if (val.kind === graphql.Kind.BOOLEAN) { + value = val.value + kind = 'Boolean' + } else if (val.kind === graphql.Kind.VARIABLE) { + value = val.name.value + kind = 'Variable' + } else if (val.kind === graphql.Kind.STRING) { + value = val.value + kind = 'String' + } + + return { kind, value } +}
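Taken together, these two helpers do the heavy lifting for the new object-based artifacts: deepMerge folds the partial selection objects contributed by overlapping fragments into one, and convertValue normalizes GraphQL AST values into the { kind, value } pairs the artifacts store. A minimal sketch of both, assuming only this module's path; the AST nodes are built inline for the example:

import * as graphql from 'graphql'
import { convertValue, deepMerge } from './utils'

// two fragments can describe the same field; merging keeps both contributions
deepMerge(
	{ user: { type: 'User', keyRaw: 'user' } },
	{ user: { fields: { id: { type: 'ID', keyRaw: 'id' } } } }
)
// => { user: { type: 'User', keyRaw: 'user', fields: { id: { type: 'ID', keyRaw: 'id' } } } }

// scalars are parsed; variables keep their name so the runtime can resolve them later
convertValue({ kind: graphql.Kind.INT, value: '10' }) // => { kind: 'Int', value: 10 }
convertValue({ kind: graphql.Kind.VARIABLE, name: { kind: graphql.Kind.NAME, value: 'name' } })
// => { kind: 'Variable', value: 'name' }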
diff --git a/packages/houdini/cmd/generators/runtime/copyRuntime.test.ts index 82d3ed696..97fec9c52 100644 --- a/packages/houdini/cmd/generators/runtime/copyRuntime.test.ts +++ b/packages/houdini/cmd/generators/runtime/copyRuntime.test.ts @@ -30,7 +30,7 @@ test('cache index runtime imports config file - commonjs', async function () { Object.defineProperty(exports, "__esModule", { value: true }); var cache_1 = require("./cache"); // @ts-ignore: config will be defined by the generator - exports.default = new cache_1.Cache(config); + exports.default = new cache_1.Cache(config || {}); `) }) @@ -54,6 +54,6 @@ test('cache index runtime imports config file - kit', async function () { import config from "../../../config.cjs" import { Cache } from './cache'; // @ts-ignore: config will be defined by the generator - export default new Cache(config); + export default new Cache(config || {}); `) }) diff --git a/packages/houdini/cmd/generators/runtime/indexFile.test.ts index 87e7769c8..e71f627b5 100644 --- a/packages/houdini/cmd/generators/runtime/indexFile.test.ts +++ b/packages/houdini/cmd/generators/runtime/indexFile.test.ts @@ -69,7 +69,7 @@ test('runtime index file - kit', async function () { }).program // verify contents expect(parsedQuery).toMatchInlineSnapshot(` - export {default as houdiniConfig } from "../config.cjs" + export { default as houdiniConfig } from "../config.cjs" export * from "./runtime" export * from "./artifacts" `) }) diff --git a/packages/houdini/cmd/generators/runtime/indexFile.ts index f11faceea..3d01f3ff6 100644 --- a/packages/houdini/cmd/generators/runtime/indexFile.ts +++ b/packages/houdini/cmd/generators/runtime/indexFile.ts @@ -29,7 +29,7 @@ ${exportStarFrom(artifactDir)} // otherwise just use esm statements as the final result else { body = ` -export {default as houdiniConfig } from "${configPath}" +export { default as houdiniConfig } from "${configPath}" export * from "${runtimeDir}" export * from "${artifactDir}" ` diff --git a/packages/houdini/cmd/generators/typescript/index.ts index 4699516b9..e44f8a6d7 100644 --- a/packages/houdini/cmd/generators/typescript/index.ts +++ b/packages/houdini/cmd/generators/typescript/index.ts @@ -27,8 +27,8 @@ export default async function typescriptGenerator( // the generated types depend solely on user-provided information // so we need to use the original document that we haven't mutated // as part of the compiler - docs.map(async ({ originalDocument, generated }) => { - if (generated) { + docs.map(async ({ originalDocument, generate }) => { + if (!generate) { return } diff --git a/packages/houdini/cmd/generators/typescript/typescript.test.ts index 39b5ad0f0..15246f825 100644 --- a/packages/houdini/cmd/generators/typescript/typescript.test.ts +++ b/packages/houdini/cmd/generators/typescript/typescript.test.ts @@ -22,6 +22,7 @@ const config = testConfig({ users: [User] nodes: [Node!]!
entities: [Entity] + node(id: ID!): Node } type Mutation { diff --git a/packages/houdini/cmd/testUtils.ts b/packages/houdini/cmd/testUtils.ts index 40ce03b02..d74833e4f 100644 --- a/packages/houdini/cmd/testUtils.ts +++ b/packages/houdini/cmd/testUtils.ts @@ -46,7 +46,7 @@ export function pipelineTest( }) } -export function mockCollectedDoc(query: string) { +export function mockCollectedDoc(query: string): CollectedGraphQLDocument { const parsed = graphql.parse(query) // look at the first definition in the pile for the name @@ -58,7 +58,6 @@ export function mockCollectedDoc(query: string) { document: parsed, originalDocument: parsed, filename: `${name}.ts`, - printed: query, - generated: false, + generate: true, } } diff --git a/packages/houdini/cmd/transforms/addID.test.ts b/packages/houdini/cmd/transforms/addID.test.ts index a04695439..073ddcde2 100644 --- a/packages/houdini/cmd/transforms/addID.test.ts +++ b/packages/houdini/cmd/transforms/addID.test.ts @@ -23,14 +23,14 @@ test('adds ids to selection sets of objects with them', async function () { const config = testConfig() await runPipeline(config, docs) - expect(graphql.print(docs[0].document)).toMatchInlineSnapshot(` - "query Friends { + expect(docs[0].document).toMatchInlineSnapshot(` + query Friends { user { firstName id } } - " + `) }) @@ -51,12 +51,12 @@ test("doesn't add id if there isn't one", async function () { const config = testConfig() await runPipeline(config, docs) - expect(graphql.print(docs[0].document)).toMatchInlineSnapshot(` - "query Friends { + expect(docs[0].document).toMatchInlineSnapshot(` + query Friends { ghost { aka } } - " + `) }) diff --git a/packages/houdini/cmd/transforms/addID.ts b/packages/houdini/cmd/transforms/addID.ts index 42a6f0b85..fcfff4bb2 100644 --- a/packages/houdini/cmd/transforms/addID.ts +++ b/packages/houdini/cmd/transforms/addID.ts @@ -6,7 +6,7 @@ import { CollectedGraphQLDocument } from '../types' import { unwrapType } from '../utils' // typename adds __typename to the selection set of any unions or interfaces -export default async function addTypename( +export default async function addID( config: Config, documents: CollectedGraphQLDocument[] ): Promise { diff --git a/packages/houdini/cmd/transforms/composeQueries.test.ts b/packages/houdini/cmd/transforms/composeQueries.test.ts index 62caec9a0..8653f9b55 100644 --- a/packages/houdini/cmd/transforms/composeQueries.test.ts +++ b/packages/houdini/cmd/transforms/composeQueries.test.ts @@ -25,22 +25,29 @@ const start = [ `, ] -pipelineTest('include fragment definitions', start, true, function (docs) { - // we only care about the Foo document - const fooDoc = docs.find((doc) => doc.name === 'Foo') as CollectedGraphQLDocument +pipelineTest( + 'include fragment definitions', + start, + true, + function (docs: CollectedGraphQLDocument[]) { + // we only care about the Foo document + const fooDoc = docs.find((doc) => doc.name === 'Foo')! 
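The node(id: ID!): Node field added to the test schema above is what embedded pagination leans on: when @paginate is applied to a fragment whose type implements Node, the generated refetch query resolves the fragment back through node instead of the query root, which is why the fragment refetch specs earlier in this diff carry embedded: true. Roughly, and with an illustrative name rather than the generator's actual naming convention, that refetch document looks like:

query PaginatedFragment_Pagination_Query($id: ID!, $first: Int = 10, $after: String) {
	node(id: $id) {
		...PaginatedFragment
	}
}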
- // make sure there are at least three definitions - expect(fooDoc.document.definitions).toHaveLength(3) + // make sure there are at least three definitions + expect(fooDoc.document.definitions).toHaveLength(3) - // make sure that there is one for each fragment - const fragmentADef = fooDoc.document.definitions.find( - (definition) => - definition.kind === graphql.Kind.FRAGMENT_DEFINITION && definition.name.value === 'A' - ) - const fragmentBDef = fooDoc.document.definitions.find( - (definition) => - definition.kind === graphql.Kind.FRAGMENT_DEFINITION && definition.name.value === 'B' - ) - expect(fragmentADef).toBeDefined() - expect(fragmentBDef).toBeDefined() -}) + // make sure that there is one for each fragment + const fragmentADef = fooDoc.document.definitions.find( + (definition) => + definition.kind === graphql.Kind.FRAGMENT_DEFINITION && + definition.name.value === 'A' + ) + const fragmentBDef = fooDoc.document.definitions.find( + (definition) => + definition.kind === graphql.Kind.FRAGMENT_DEFINITION && + definition.name.value === 'B' + ) + expect(fragmentADef).toBeDefined() + expect(fragmentBDef).toBeDefined() + } +) diff --git a/packages/houdini/cmd/transforms/fragmentVariables.test.ts b/packages/houdini/cmd/transforms/fragmentVariables.test.ts index 1e5a018a2..cd8f79af8 100644 --- a/packages/houdini/cmd/transforms/fragmentVariables.test.ts +++ b/packages/houdini/cmd/transforms/fragmentVariables.test.ts @@ -22,7 +22,7 @@ test('pass argument values to generated fragments', async function () { mockCollectedDoc( ` fragment QueryFragment on Query - @arguments(name: {type: "String"} ) { + @arguments(name: {type: "String!"} ) { users(stringValue: $name) { id } @@ -64,14 +64,84 @@ test('pass argument values to generated fragments', async function () { rootType: "Query", selection: { - "users": { - "type": "User", - "keyRaw": "users(stringValue: \\"Hello\\")", - - "fields": { - "id": { - "type": "ID", - "keyRaw": "id" + users: { + type: "User", + keyRaw: "users(stringValue: \\"Hello\\")", + + fields: { + id: { + type: "ID", + keyRaw: "id" + } + } + } + } + }; + `) +}) + +test("nullable arguments with no values don't show up in the query", async function () { + const docs = [ + mockCollectedDoc( + ` + query AllUsers { + ...QueryFragment + } + ` + ), + mockCollectedDoc( + ` + fragment QueryFragment on Query + @arguments(name: {type: "String"} ) { + users(stringValue: $name) { + id + } + } + ` + ), + ] + + // run the pipeline + const config = testConfig() + await runPipeline(config, docs) + + const queryContents = await fs.readFile( + path.join(config.artifactPath(docs[0].document)), + 'utf-8' + ) + expect(queryContents).toBeTruthy() + // parse the contents + const parsedQuery: ProgramKind = recast.parse(queryContents, { + parser: typeScriptParser, + }).program + // verify contents + expect(parsedQuery).toMatchInlineSnapshot(` + module.exports = { + name: "AllUsers", + kind: "HoudiniQuery", + + raw: \`query AllUsers { + ...QueryFragment + } + + fragment QueryFragment on Query { + users { + id + } + } + \`, + + rootType: "Query", + + selection: { + users: { + type: "User", + keyRaw: "users", + + fields: { + id: { + type: "ID", + keyRaw: "id" } } } @@ -134,14 +204,14 @@ test("fragment arguments with default values don't rename the fragment", async f rootType: "Query", selection: { - "users": { - "type": "User", - "keyRaw": "users(stringValue: \\"Hello\\")", - - "fields": { - "id": { - "type": "ID", - "keyRaw": "id" + users: { + type: "User", + keyRaw: "users(stringValue: \\"Hello\\")", + + 
fields: { + id: { + type: "ID", + keyRaw: "id" } } } @@ -216,25 +286,25 @@ test('thread query variables to inner fragments', async function () { rootType: "Query", selection: { - "users": { - "type": "User", - "keyRaw": "users(stringValue: $name)", - - "fields": { - "id": { - "type": "ID", - "keyRaw": "id" + users: { + type: "User", + keyRaw: "users(stringValue: $name)", + + fields: { + id: { + type: "ID", + keyRaw: "id" } } } }, input: { - "fields": { - "name": "String" + fields: { + name: "String" }, - "types": {} + types: {} } }; `) @@ -306,14 +376,14 @@ test('inner fragment with intermediate default value', async function () { rootType: "Query", selection: { - "users": { - "type": "User", - "keyRaw": "users(stringValue: \\"Hello\\", intValue: 2)", - - "fields": { - "id": { - "type": "ID", - "keyRaw": "id" + users: { + type: "User", + keyRaw: "users(stringValue: \\"Hello\\", intValue: 2)", + + fields: { + id: { + type: "ID", + keyRaw: "id" } } } @@ -388,14 +458,14 @@ test("default values don't overwrite unless explicitly passed", async function ( rootType: "Query", selection: { - "users": { - "type": "User", - "keyRaw": "users(stringValue: \\"Goodbye\\")", - - "fields": { - "id": { - "type": "ID", - "keyRaw": "id" + users: { + type: "User", + keyRaw: "users(stringValue: \\"Goodbye\\")", + + fields: { + id: { + type: "ID", + keyRaw: "id" } } } @@ -458,14 +528,14 @@ test('default arguments', async function () { rootType: "Query", selection: { - "users": { - "type": "User", - "keyRaw": "users(boolValue: true, stringValue: \\"Hello\\")", - - "fields": { - "id": { - "type": "ID", - "keyRaw": "id" + users: { + type: "User", + keyRaw: "users(boolValue: true, stringValue: \\"Hello\\")", + + fields: { + id: { + type: "ID", + keyRaw: "id" } } } @@ -528,14 +598,14 @@ test('multiple with directives - no overlap', async function () { rootType: "Query", selection: { - "users": { - "type": "User", - "keyRaw": "users(boolValue: false, stringValue: \\"Goodbye\\")", - - "fields": { - "id": { - "type": "ID", - "keyRaw": "id" + users: { + type: "User", + keyRaw: "users(boolValue: false, stringValue: \\"Goodbye\\")", + + fields: { + id: { + type: "ID", + keyRaw: "id" } } } diff --git a/packages/houdini/cmd/transforms/fragmentVariables.ts b/packages/houdini/cmd/transforms/fragmentVariables.ts index 76abea81e..3c941bfbd 100644 --- a/packages/houdini/cmd/transforms/fragmentVariables.ts +++ b/packages/houdini/cmd/transforms/fragmentVariables.ts @@ -56,48 +56,66 @@ export default async function fragmentVariables( ) || {} // inline any fragment arguments in the document - doc.document = inlineFragmentArgs( + doc.document = inlineFragmentArgs({ config, - fragments, - doc.document, + fragmentDefinitions: fragments, + document: doc.document, generatedFragments, visitedFragments, - rootScope - ) + scope: rootScope, + }) } // once we've handled every fragment in every document we need to add any // new fragment definitions to the list of collected docs so they can be picked up - if (documents.length > 0) { - const doc: graphql.DocumentNode = { - kind: 'Document', - definitions: Object.values(generatedFragments), - } - - documents.push({ - name: 'generated::fragmentVariables', - document: doc, - originalDocument: doc, - generated: true, - filename: '__generated__', - }) + const doc: graphql.DocumentNode = { + kind: 'Document', + definitions: Object.values(generatedFragments), } + + documents.push({ + name: 'generated::fragmentVariables', + document: doc, + originalDocument: doc, + generate: false, + filename: 
'generated::fragmentVariables', }) } type ValueMap = Record<string, graphql.ValueNode> -function inlineFragmentArgs( - config: Config, - fragmentDefinitions: Record<string, { definition: graphql.FragmentDefinitionNode }>, - document: graphql.ASTNode, - generatedFragments: Record<string, graphql.FragmentDefinitionNode> = {}, - visitedFragments: Set<string>, - scope: ValueMap | undefined | null, +function inlineFragmentArgs({ + config, + fragmentDefinitions, + document, + generatedFragments, + visitedFragments, + scope, + newName, +}: { + config: Config + fragmentDefinitions: Record<string, { definition: graphql.FragmentDefinitionNode }> + document: graphql.ASTNode + generatedFragments: Record<string, graphql.FragmentDefinitionNode> + visitedFragments: Set<string> + scope: ValueMap | undefined | null newName?: string -): any { +}): any { + // look up the arguments for the fragment + const definitionArgs = fragmentArguments( + config, + document as graphql.FragmentDefinitionNode + ).reduce<Record<string, FragmentArgument>>((acc, arg) => ({ ...acc, [arg.name]: arg }), {}) + const result = graphql.visit(document, { - Variable(node) { - // if there is no scope + Argument(node) { + // look at the argument's value to see if it's a variable + const value = node.value + if (value.kind !== 'Variable') { + return + } + + // if there's no scope we can't evaluate it if (!scope) { throw new Error( node.name.value + @@ -106,15 +124,22 @@ function inlineFragmentArgs( ) } - // look up the variable in the scope - const newValue = scope[node.name.value] - - // if we don't have a new value, it's a unknown variable - if (!newValue) { - throw new Error(node.name.value + ' has no value in the current scope') + // is the variable in scope + const newValue = scope[value.name.value] + // if it is, just use it + if (newValue) { + return { + ...node, + value: newValue, + } + } + // if the argument is required + if (definitionArgs[value.name.value] && definitionArgs[value.name.value].required) { + throw new Error('Missing value for required arg: ' + value.name.value) } - return newValue + // if we got this far, there's no value for a non-required arg, remove the node + return null }, FragmentSpread(node) { // look at the fragment spread to see if there are any default arguments @@ -143,15 +168,15 @@ function inlineFragmentArgs( } } - generatedFragments[newFragmentName] = inlineFragmentArgs( + generatedFragments[newFragmentName] = inlineFragmentArgs({ config, fragmentDefinitions, - fragmentDefinitions[node.name.value].definition, + document: fragmentDefinitions[node.name.value].definition, generatedFragments, visitedFragments, - args, - newFragmentName - ) + scope: args, + newName: newFragmentName, + }) } // there are no local arguments to the fragment so we need to // walk down the definition and apply any default args as well @@ -171,15 +196,15 @@ function inlineFragmentArgs( const localDefinitions = [...doc.document.definitions] localDefinitions.splice(definitionIndex, 1) localDefinitions.push( - inlineFragmentArgs( + inlineFragmentArgs({ config, fragmentDefinitions, - fragmentDefinitions[node.name.value].definition, + document: fragmentDefinitions[node.name.value].definition, generatedFragments, visitedFragments, - defaultArguments, - '' - ) + scope: defaultArguments, + newName: '', + }) ) doc.document = { @@ -230,10 +255,17 @@ export function withArguments( return withDirectives.flatMap((directive) => directive.arguments || []) } +export type FragmentArgument = { + name: string + type: string + required: boolean + defaultValue: graphql.ValueNode | null +} +
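fragmentArguments below turns each key of the @arguments directive into one of these records. For a fragment declared like the ones in the tests above, the parse comes out roughly as follows; pageSize is a hypothetical second argument added for illustration, and the import path assumes this module:

import * as graphql from 'graphql'
import type { FragmentArgument } from './fragmentVariables'

// fragment QueryFragment on Query
//     @arguments(name: {type: "String!"}, pageSize: {type: "Int", default: 10})
const parsed: FragmentArgument[] = [
	// "String!" drops the ! and marks the argument required; required args never carry a default
	{ name: 'name', type: 'String', required: true, defaultValue: null },
	// optional arguments keep their default as a raw AST node for later inlining
	{ name: 'pageSize', type: 'Int', required: false, defaultValue: { kind: graphql.Kind.INT, value: '10' } },
]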
export function fragmentArguments( config: Config, definition: graphql.FragmentDefinitionNode -): graphql.ArgumentNode[] { +): FragmentArgument[] { const directives = definition.directives?.filter( (directive) => directive.name.value === config.argumentsDirective ) @@ -242,8 +274,46 @@ export function fragmentArguments( return [] } - let result: ValueMap = {} - return directives.flatMap((directive) => directive.arguments || []) + return directives.flatMap( + (directive) => + // every argument to the directive specifies an argument to the fragment + directive.arguments?.flatMap((arg) => { + // arguments must be objects + if (arg.value.kind !== 'ObjectValue') { + throw new Error('values of @argument must be objects') + } + + // look for the type field + const typeArg = arg.value.fields?.find((arg) => arg.name.value === 'type')?.value + // if there's no type arg, ignore it + if (!typeArg || typeArg.kind !== 'StringValue') { + return [] + } + + let type = typeArg.value + let name = arg.name.value + let required = false + let defaultValue = + arg.value.fields?.find((arg) => arg.name.value === 'default')?.value || null + + // if the name of the type ends in a ! we need to mark it as required + if (type[type.length - 1] === '!') { + type = type.slice(0, -1) + required = true + // there is no default value for a required argument + defaultValue = null + } + + return [ + { + name, + type, + required, + defaultValue, + }, + ] + }) || [] + ) } function collectDefaultArgumentValues( config: Config, definition: graphql.FragmentDefinitionNode ): ValueMap | null { let result: ValueMap = {} - for (const arg of fragmentArguments(config, definition)) { - // look up the default value key - let argObject = arg.value as graphql.ObjectValueNode - - // if there is no default value, dont consider this argument - const defaultValue = argObject.fields.find((field) => field.name.value === 'default')?.value - if (!defaultValue) { + for (const { name, required, defaultValue } of fragmentArguments(config, definition)) { + // if the argument is required, there's no default value + if (required || !defaultValue) { continue } - result[arg.name.value] = defaultValue + + result[name] = defaultValue } return result diff --git a/packages/houdini/cmd/transforms/index.ts index ade3e327f..decd3b77e 100644 --- a/packages/houdini/cmd/transforms/index.ts +++ b/packages/houdini/cmd/transforms/index.ts @@ -3,4 +3,5 @@ export { default as internalSchema } from './schema' export { default as list } from './list' export { default as typename } from './typename' export { default as addID } from './addID' +export { default as paginate } from './paginate' export { default as fragmentVariables } from './fragmentVariables'
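With the arguments normalized, collectDefaultArgumentValues above reduces to a filter: required arguments are skipped (their defaultValue is forced to null) and every optional argument that declares a default contributes its raw AST node to the starting scope. As a sketch, using the default value from the fragment-variables tests and building the node inline since the helper itself is module-private:

import * as graphql from 'graphql'

// fragment QueryFragment on Query @arguments(name: {type: "String", default: "Hello"})
// seeds the scope that inlineFragmentArgs walks with:
const scope: Record<string, graphql.ValueNode> = {
	name: { kind: graphql.Kind.STRING, value: 'Hello' },
}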
diff --git a/packages/houdini/cmd/transforms/list.ts index f7767ffd1..e697c87d7 100644 --- a/packages/houdini/cmd/transforms/list.ts +++ b/packages/houdini/cmd/transforms/list.ts @@ -1,8 +1,9 @@ -// externals import { Config, parentTypeFromAncestors } from 'houdini-common' import * as graphql from 'graphql' // locals import { CollectedGraphQLDocument, HoudiniError, HoudiniErrorTodo } from '../types' +import { unwrapType } from '../utils' // addListFragments adds fragments for the fields tagged with @list export default async function addListFragments( config: Config, documents: CollectedGraphQLDocument[] ) { // collect all of the fields that have the list applied const lists: { [name: string]: { - field: graphql.FieldNode + selection: graphql.SelectionSetNode | undefined type: graphql.GraphQLNamedType filename: string } } = {} const errors: HoudiniError[] = [] // look at every document - for (const { document, filename } of documents) { - graphql.visit(document, { + for (const doc of documents) { + doc.document = graphql.visit(doc.document, { Directive(node, key, parent, path, ancestors) { - // TODO: remove @connection guard // if we found a @list applied (old applications will call this @connection) - if (node.name.value === config.listDirective) { + if ([config.listDirective, config.paginateDirective].includes(node.name.value)) { // look up the name passed to the directive const nameArg = node.arguments?.find((arg) => arg.name.value === 'name') node.loc ? [node.loc.start, node.loc.end] : null, path ), - filepath: filename, + filepath: doc.filename, } // if there is no name argument if (!nameArg) { - error.message = '@list must have a name argument' - errors.push(error) + // if we are looking at a @list we need a name argument + if (node.name.value === config.listDirective) { + error.message = `@${node.name.value} must have a name argument` + errors.push(error) + } + + // regardless, we don't need to process this node anymore return } // make sure it was a string if (nameArg.value.kind !== 'StringValue') { - error.message = '@list name must be a string' + error.message = `@${node.name.value} name must be a string` errors.push(error) return } // if we've already seen this list if (lists[nameArg.value.value]) { - error.message = '@list name must be unique' + error.message = `@${node.name.value} name must be unique` errors.push(error) } - const type = parentTypeFromAncestors(config.schema, ancestors) - // look up the parent's type - const parentType = parentTypeFromAncestors(config.schema, ancestors.slice(1)) - - // if id is not a valid field on the parent, we won't be able to add or remove - // from this list if it doesn't fall under root - if ( - !(parentType instanceof graphql.GraphQLObjectType) || - (parentType.name !== config.schema.getQueryType()?.name && - !parentType.getFields().id) - ) { - throw { - ...new graphql.GraphQLError( - 'Can only use a list field on fragment on a type with id' - ), - filepath: filename, - } - } + const parentType = parentTypeFromAncestors( + config.schema, + ancestors.slice(0, -1) + ) + + // a non-connection list can just use the selection set of the tagged field + // but if this is a connection tagged with list we need to use the selection + // of the edges.node field + const targetField = ancestors[ancestors.length - 1] as graphql.FieldNode + const targetFieldDefinition = parentType.getFields()[ + targetField.name.value + ] as graphql.GraphQLField<any, any> + + const { selection, type, connection } = connectionSelection( + config, + targetFieldDefinition, + parentTypeFromAncestors( + config.schema, + ancestors + ) as graphql.GraphQLObjectType, + (ancestors[ancestors.length - 1] as graphql.FieldNode).selectionSet + ) // add the target of the directive to the list lists[nameArg.value.value] = { - field: ancestors[ancestors.length - 1] as graphql.FieldNode, + selection, type, - filename, + filename: doc.filename, + } + + // if the list is marking a connection we need to add the flag in a place we can track when + // generating the artifact + if (connection) { + return { + ...node, + arguments: [ + ...node.arguments!, + { + kind: 'Argument', + name: { + kind: 'Name', + value: 'connection', + }, + value: { + kind: 'BooleanValue', + value: true, + }, + } as graphql.ArgumentNode, + ], + } } } },
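One detail worth calling out: when the tagged field turns out to be a connection, the visitor above rewrites the directive node itself, appending a connection: true argument so the artifact generator can tell connection-backed lists apart from plain lists without re-walking the schema. The appended node mirrors the literal in the transform, and the before/after matches the 'list flags connections' test later in this diff:

import type * as graphql from 'graphql'

// friendsByCursor @list(name: "User_Friends")
// is rewritten to
// friendsByCursor @list(name: "User_Friends", connection: true)
const connectionFlag: graphql.ArgumentNode = {
	kind: 'Argument',
	name: { kind: 'Name', value: 'connection' },
	value: { kind: 'BooleanValue', value: true },
}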
@@ -101,7 +132,7 @@ export default async function addListFragments( // we need to add a delete directive for every type that is the target of a list const listTargets = [ ...new Set( - Object.values(lists).map(({ type, field }) => { + Object.values(lists).map(({ type }) => { // only consider object types if (!(type instanceof graphql.GraphQLObjectType)) { return '' } @@ -122,33 +153,32 @@ export default async function addListFragments( const generatedDoc: graphql.DocumentNode = { kind: 'Document', definitions: Object.entries(lists).flatMap( - ([name, { field, type, filename }]) => { + ([name, { selection, type }]) => { // look up the type const schemaType = config.schema.getType(type.name) as graphql.GraphQLObjectType // if there is no selection set - if (!field.selectionSet) { + if (!selection) { throw new HoudiniErrorTodo('Lists must have a selection') } // we need a copy of the field's selection set that we can mutate - const selection: graphql.SelectionSetNode = { + const fragmentSelection: graphql.SelectionSetNode = { kind: 'SelectionSet', - selections: [...field.selectionSet.selections], - loc: field.selectionSet.loc, + selections: [...selection.selections], } // is there no id selection if ( schemaType && - selection && - !selection?.selections.find( - (selection) => selection.kind === 'Field' && selection.name.value === 'id' + fragmentSelection && + !fragmentSelection?.selections.find( + (field) => field.kind === 'Field' && field.name.value === 'id' ) ) { // add the id field to the selection - selection.selections = [ - ...selection.selections, + fragmentSelection.selections = [ + ...fragmentSelection.selections, { kind: 'Field', name: { @@ -163,14 +193,14 @@ export default async function addListFragments( return [ // a fragment to insert items into this list { - kind: graphql.Kind.FRAGMENT_DEFINITION, - // in order to insert an item into this list, it must - // have the same selection as the field - selectionSet: selection, name: { - kind: 'Name', value: config.listInsertFragment(name), + kind: 'Name', }, + kind: graphql.Kind.FRAGMENT_DEFINITION, + // in order to insert an item into this list, it must + // have the same selection as the field + selectionSet: fragmentSelection, typeCondition: { kind: 'NamedType', name: { @@ -183,8 +213,8 @@ export default async function addListFragments( { kind: graphql.Kind.FRAGMENT_DEFINITION, name: { - kind: 'Name', value: config.listRemoveFragment(name), + kind: 'Name', }, // deleting an entity just takes its id and the parent selectionSet: { @@ -233,9 +263,71 @@ export default async function addListFragments( documents.push({ name: 'generated::lists', - generated: true, + generate: false, document: generatedDoc, originalDocument: generatedDoc, - filename: '__generated__', + filename: 'generated::lists', }) } + +// a field is considered a connection if it has one of the required connection arguments +// as well as an edges > node selection +export function connectionSelection( + config: Config, + field: graphql.GraphQLField<any, any>, + type: graphql.GraphQLObjectType, + selection: graphql.SelectionSetNode | undefined +): { + selection: graphql.SelectionSetNode | undefined + type: graphql.GraphQLObjectType + connection: boolean +} { + // make sure the field has the fields for either forward or backwards pagination + const fieldArgs = field.args.reduce<Record<string, string>>( + (args, arg) => ({ + ...args, + [arg.name]: unwrapType(config, arg.type).type.name, + }), + {} + ) + const forwardPagination = fieldArgs['first'] === 'Int' && fieldArgs['after'] === 'String' + const 
backwardsPagination = fieldArgs['last'] === 'Int' && fieldArgs['before'] === 'String' + if (!forwardPagination && !backwardsPagination) { + return { selection, type, connection: false } + } + + // we need to make sure that there is an edges field + const edgesField = selection?.selections.find( + (selection) => selection.kind === 'Field' && selection.name.value === 'edges' + ) as graphql.FieldNode + if (!edgesField) { + return { selection, type, connection: false } + } + + const nodeSelection = edgesField.selectionSet?.selections.find( + (selection) => selection.kind === 'Field' && selection.name.value === 'node' + ) as graphql.FieldNode + if (!nodeSelection.selectionSet) { + return { selection, type, connection: false } + } + + // now that we have the correct selection, we have to lookup node type + // we need to make sure that there is an edges field + const edgeField = (unwrapType(config, field.type) + .type as graphql.GraphQLObjectType).getFields()['edges'] + const { list, type: edgeFieldType } = unwrapType(config, edgeField.type) + if (!list) { + return { selection, type, connection: false } + } + + const nodeField = (edgeFieldType as graphql.GraphQLObjectType).getFields()['node'] + if (!nodeField) { + return { selection, type, connection: false } + } + + return { + selection: nodeSelection.selectionSet, + type: unwrapType(config, nodeField.type).type as graphql.GraphQLObjectType, + connection: true, + } +} diff --git a/packages/houdini/cmd/transforms/lists.test.ts b/packages/houdini/cmd/transforms/lists.test.ts index 1b126ca56..03c95ddbf 100644 --- a/packages/houdini/cmd/transforms/lists.test.ts +++ b/packages/houdini/cmd/transforms/lists.test.ts @@ -35,8 +35,8 @@ test('insert fragments on query selection set', async function () { const config = testConfig() await runPipeline(config, docs) - expect(graphql.print(docs[0].document)).toMatchInlineSnapshot(` - "mutation UpdateUser { + expect(docs[0].document).toMatchInlineSnapshot(` + mutation UpdateUser { updateUser { ...User_Friends_insert id @@ -47,7 +47,7 @@ test('insert fragments on query selection set', async function () { firstName id } - " + `) }) @@ -80,8 +80,8 @@ test('delete fragments on query selection set', async function () { const config = testConfig() await runPipeline(config, docs) - expect(graphql.print(docs[0].document)).toMatchInlineSnapshot(` - "mutation UpdateUser { + expect(docs[0].document).toMatchInlineSnapshot(` + mutation UpdateUser { updateUser { ...User_Friends_remove id @@ -91,7 +91,7 @@ test('delete fragments on query selection set', async function () { fragment User_Friends_remove on User { id } - " + `) }) @@ -122,10 +122,10 @@ test('list fragments on fragment selection set', async function () { const config = testConfig() await runPipeline(config, docs) - expect(graphql.print(docs[0].document)).toMatchInlineSnapshot(` - "mutation UpdateUser { + expect(docs[0].document).toMatchInlineSnapshot(` + mutation UpdateUser { updateUser { - ...User_Friends_insert @prepend(parentID: \\"1234\\") + ...User_Friends_insert @prepend(parentID: "1234") id } } @@ -134,7 +134,7 @@ test('list fragments on fragment selection set', async function () { firstName id } - " + `) }) @@ -165,6 +165,43 @@ test('delete node', async function () { await expect(runPipeline(testConfig(), docs)).resolves.toBeUndefined() }) +test('delete node from connection', async function () { + const docs = [ + mockCollectedDoc( + ` + mutation DeleteUser { + deleteUser(id: "1234") { + userID @User_delete + } + } + ` + ), + mockCollectedDoc( + ` + 
fragment AllUsers on User{ + friendsByCursor @list(name:"User_Friends") { + edges { + node { + firstName + id + } + } + } + } + ` + ), + ] + + expect(docs[0].document).toMatchInlineSnapshot(` + mutation DeleteUser { + deleteUser(id: "1234") { + userID @User_delete + } + } + + `) +}) + test('list fragments must be unique', async function () { const docs = [ mockCollectedDoc( @@ -239,10 +276,60 @@ test('includes `id` in list fragment', async function () { const config = testConfig() await runPipeline(config, docs) - expect(graphql.print(docs[0].document)).toMatchInlineSnapshot(` - "mutation UpdateUser { + expect(docs[0].document).toMatchInlineSnapshot(` + mutation UpdateUser { + updateUser { + ...User_Friends_insert @prepend(parentID: "1234") + id + } + } + + fragment User_Friends_insert on User { + id + firstName + } + + `) +}) + +test('includes node selection on connection', async function () { + const docs = [ + mockCollectedDoc( + ` + mutation UpdateUser { + updateUser { + ...User_Friends_insert @prepend(parentID: "1234") + } + } + ` + ), + mockCollectedDoc( + ` + fragment AllUsers on User{ + friendsByCursor @list(name:"User_Friends") { + edges { + node { + id + firstName + friends { + id + } + } + } + } + } + ` + ), + ] + + // run the pipeline + const config = testConfig() + await runPipeline(config, docs) + + expect(docs[0].document).toMatchInlineSnapshot(` + mutation UpdateUser { updateUser { - ...User_Friends_insert @prepend(parentID: \\"1234\\") + ...User_Friends_insert @prepend(parentID: "1234") id } } @@ -250,8 +337,54 @@ test('includes `id` in list fragment', async function () { fragment User_Friends_insert on User { id firstName + friends { + id + } } - " + + `) +}) + +test('list flags connections', async function () { + const docs = [ + mockCollectedDoc( + ` + fragment AllUsers on User{ + friendsByCursor @list(name:"User_Friends") { + edges { + node { + id + firstName + friends { + id + } + } + } + } + } + ` + ), + ] + + // run the pipeline + const config = testConfig() + await runPipeline(config, docs) + + expect(docs[0].document).toMatchInlineSnapshot(` + fragment AllUsers on User { + friendsByCursor @list(name: "User_Friends", connection: true) { + edges { + node { + id + firstName + friends { + id + } + } + } + } + } + `) }) @@ -276,3 +409,56 @@ test('cannot use list directive if id is not a valid field', async function () { const config = testConfig() await expect(runPipeline(config, docs)).rejects.toBeTruthy() }) + +test('paginate with name also gets treated as a list', async function () { + const docs = [ + mockCollectedDoc( + ` + mutation UpdateUser { + updateUser { + ...User_Friends_insert @prepend(parentID: "1234") + } + } + ` + ), + mockCollectedDoc( + ` + fragment AllUsers on User{ + friendsByCursor(first: 10) @paginate(name:"User_Friends") { + edges { + node { + id + firstName + friends { + id + } + } + } + } + } + ` + ), + ] + + // run the pipeline + const config = testConfig() + await runPipeline(config, docs) + + expect(docs[0].document).toMatchInlineSnapshot(` + mutation UpdateUser { + updateUser { + ...User_Friends_insert @prepend(parentID: "1234") + id + } + } + + fragment User_Friends_insert on User { + id + firstName + friends { + id + } + } + + `) +}) diff --git a/packages/houdini/cmd/transforms/paginate.test.ts b/packages/houdini/cmd/transforms/paginate.test.ts new file mode 100644 index 000000000..e8e4d6b81 --- /dev/null +++ b/packages/houdini/cmd/transforms/paginate.test.ts @@ -0,0 +1,1228 @@ +// external imports +import { testConfig } from 
'houdini-common' +// local imports +import '../../../../jest.setup' +import { runPipeline } from '../generate' +import { mockCollectedDoc } from '../testUtils' + +test('adds pagination info to full', async function () { + const docs = [ + mockCollectedDoc( + ` + fragment UserFriends on Query { + usersByCursor(first: 10) @paginate { + edges { + node { + id + } + } + } + } + ` + ), + ] + + // run the pipeline + const config = testConfig() + await runPipeline(config, docs) + + // load the contents of the file + expect(docs[0].document).toMatchInlineSnapshot(` + fragment UserFriends on Query @arguments(first: {type: "Int", default: 10}, after: {type: "String"}) { + usersByCursor(first: $first, after: $after) @paginate { + edges { + node { + id + } + } + edges { + cursor + node { + __typename + } + } + pageInfo { + hasPreviousPage + hasNextPage + startCursor + endCursor + } + } + } + + `) + + expect(docs[0].refetch).toMatchInlineSnapshot(` + { + "update": "append", + "path": [ + "usersByCursor" + ], + "method": "cursor", + "pageSize": 10, + "embedded": false + } + `) +}) + +test('paginated fragments on node pull data from one field deeper', async function () { + const docs = [ + mockCollectedDoc( + ` + fragment UserFriends on User { + friendsByCursor(first: 10) @paginate { + edges { + node { + id + } + } + } + } + ` + ), + ] + + // run the pipeline + const config = testConfig() + await runPipeline(config, docs) + + expect(docs[0].refetch).toMatchInlineSnapshot(` + { + "update": "append", + "path": [ + "friendsByCursor" + ], + "method": "cursor", + "pageSize": 10, + "embedded": true + } + `) +}) + +test("doesn't add pagination info to offset pagination", async function () { + const docs = [ + mockCollectedDoc( + ` + fragment UserFriends on Query { + usersByOffset(limit: 10) @paginate { + id + } + } + ` + ), + ] + + // run the pipeline + const config = testConfig() + await runPipeline(config, docs) + + // load the contents of the file + expect(docs[0].document).toMatchInlineSnapshot(` + fragment UserFriends on Query @arguments(limit: {type: "Int", default: 10}, offset: {type: "Int"}) { + usersByOffset(limit: $limit, offset: $offset) @paginate { + id + } + } + + `) +}) + +test('paginate adds forwards cursor args to the full cursor fragment', async function () { + const docs = [ + mockCollectedDoc( + ` + fragment UserFriends on Query { + usersByCursor(first: 10) @paginate { + edges { + node { + id + } + } + } + } + ` + ), + ] + + // run the pipeline + const config = testConfig() + await runPipeline(config, docs) + + // load the contents of the file + expect(docs[0].document).toMatchInlineSnapshot(` + fragment UserFriends on Query @arguments(first: {type: "Int", default: 10}, after: {type: "String"}) { + usersByCursor(first: $first, after: $after) @paginate { + edges { + node { + id + } + } + edges { + cursor + node { + __typename + } + } + pageInfo { + hasPreviousPage + hasNextPage + startCursor + endCursor + } + } + } + + `) +}) + +test('paginate adds backwards cursor args to the full cursor fragment', async function () { + const docs = [ + mockCollectedDoc( + ` + fragment UserFriends on Query { + usersByCursor(last: 10) @paginate { + edges { + node { + id + } + } + } + } + ` + ), + ] + + // run the pipeline + const config = testConfig() + await runPipeline(config, docs) + + // load the contents of the file + expect(docs[0].document).toMatchInlineSnapshot(` + fragment UserFriends on Query @arguments(last: {type: "Int", default: 10}, before: {type: "String"}) { + usersByCursor(last: $last, before: 
$before) @paginate { + edges { + node { + id + } + } + edges { + cursor + node { + __typename + } + } + pageInfo { + hasPreviousPage + hasNextPage + startCursor + endCursor + } + } + } + + `) +}) + +test('paginate adds forwards cursor args to the fragment', async function () { + const docs = [ + mockCollectedDoc( + ` + fragment UserFriends on Query { + usersByForwardsCursor(first: 10) @paginate { + edges { + node { + id + } + } + } + } + ` + ), + ] + + // run the pipeline + const config = testConfig() + await runPipeline(config, docs) + + // load the contents of the file + expect(docs[0].document).toMatchInlineSnapshot(` + fragment UserFriends on Query @arguments(first: {type: "Int", default: 10}, after: {type: "String"}) { + usersByForwardsCursor(first: $first, after: $after) @paginate { + edges { + node { + id + } + } + edges { + cursor + node { + __typename + } + } + pageInfo { + hasPreviousPage + hasNextPage + startCursor + endCursor + } + } + } + + `) +}) + +test('paginate adds backwards cursor args to the fragment', async function () { + const docs = [ + mockCollectedDoc( + ` + fragment UserFriends on Query { + usersByBackwardsCursor(last: 10) @paginate { + edges { + node { + id + } + } + } + } + ` + ), + ] + + // run the pipeline + const config = testConfig() + await runPipeline(config, docs) + + // load the contents of the file + expect(docs[0].document).toMatchInlineSnapshot(` + fragment UserFriends on Query @arguments(last: {type: "Int", default: 10}, before: {type: "String"}) { + usersByBackwardsCursor(last: $last, before: $before) @paginate { + edges { + node { + id + } + } + edges { + cursor + node { + __typename + } + } + pageInfo { + hasPreviousPage + hasNextPage + startCursor + endCursor + } + } + } + + `) +}) + +test('sets before with default value', async function () { + const docs = [ + mockCollectedDoc( + ` + fragment UserFriends on Query { + usersByCursor(last: 10, before: "cursor") @paginate { + edges { + node { + id + } + } + } + } + ` + ), + // mockCollectedDoc('') + ] + + // run the pipeline + const config = testConfig() + await runPipeline(config, docs) + + // load the contents of the file + expect(docs[0].document).toMatchInlineSnapshot(` + fragment UserFriends on Query @arguments(last: {type: "Int", default: 10}, before: {type: "String", default: "cursor"}) { + usersByCursor(last: $last, before: $before) @paginate { + edges { + node { + id + } + } + edges { + cursor + node { + __typename + } + } + pageInfo { + hasPreviousPage + hasNextPage + startCursor + endCursor + } + } + } + + `) +}) + +test('embeds pagination query as a separate document', async function () { + const docs = [ + mockCollectedDoc( + ` + fragment UserFriends on Query { + usersByForwardsCursor(first: 10) @paginate { + edges { + node { + id + } + } + } + } + ` + ), + ] + + // run the pipeline + const config = testConfig() + await runPipeline(config, docs) + + // load the contents of the file + expect(docs[1]?.document).toMatchInlineSnapshot(` + query UserFriends_Houdini_Paginate($first: Int = 10, $after: String) { + ...UserFriends_jrGTj @with(first: $first, after: $after) + } + + fragment UserFriends_jrGTj on Query @arguments(first: {type: "Int", default: 10}, after: {type: "String"}) { + usersByForwardsCursor(first: $first, after: $after) @paginate { + edges { + node { + id + } + } + edges { + cursor + node { + __typename + } + } + pageInfo { + hasPreviousPage + hasNextPage + startCursor + endCursor + } + } + } + + `) +}) + +test('embeds node pagination query as a separate document', async 
function () { + const docs = [ + mockCollectedDoc( + ` + fragment UserFriends on User { + friendsByForwardsCursor(first: 10) @paginate { + edges { + node { + id + } + } + } + } + ` + ), + ] + + // run the pipeline + const config = testConfig() + await runPipeline(config, docs) + + // load the contents of the file + await expect(docs[1]).toMatchArtifactSnapshot(` + module.exports = { + name: "UserFriends_Pagination_Query", + kind: "HoudiniQuery", + + refetch: { + update: "append", + path: ["friendsByForwardsCursor"], + method: "cursor", + pageSize: 10, + embedded: true + }, + + raw: \`query UserFriends_Houdini_Paginate($first: Int = 10, $after: String, $id: ID!) { + node(id: $id) { + ...UserFriends_jrGTj + } + } + + fragment UserFriends_jrGTj on User { + friendsByForwardsCursor(first: $first, after: $after) { + edges { + node { + id + } + } + edges { + cursor + node { + __typename + } + } + pageInfo { + hasPreviousPage + hasNextPage + startCursor + endCursor + } + } + } + \`, + + rootType: "Query", + + selection: { + node: { + type: "Node", + keyRaw: "node(id: $id)", + + fields: { + friendsByForwardsCursor: { + type: "UserConnection", + keyRaw: "friendsByForwardsCursor::paginated", + + fields: { + edges: { + type: "UserEdge", + keyRaw: "edges", + + fields: { + cursor: { + type: "String", + keyRaw: "cursor" + }, + + node: { + type: "User", + keyRaw: "node", + + fields: { + __typename: { + type: "String", + keyRaw: "__typename" + }, + + id: { + type: "ID", + keyRaw: "id" + } + } + } + }, + + update: "append" + }, + + pageInfo: { + type: "PageInfo", + keyRaw: "pageInfo", + + fields: { + hasPreviousPage: { + type: "Boolean", + keyRaw: "hasPreviousPage" + }, + + hasNextPage: { + type: "Boolean", + keyRaw: "hasNextPage" + }, + + startCursor: { + type: "String", + keyRaw: "startCursor" + }, + + endCursor: { + type: "String", + keyRaw: "endCursor" + } + } + } + } + } + }, + + abstract: true + } + }, + + input: { + fields: { + first: "Int", + after: "String", + id: "ID" + }, + + types: {} + } + }; + `) +}) + +test('query with forwards cursor paginate', async function () { + const docs = [ + mockCollectedDoc( + ` + query Users { + usersByForwardsCursor(first: 10) @paginate { + edges { + node { + id + } + } + } + } + ` + ), + ] + + // run the pipeline + const config = testConfig() + await runPipeline(config, docs) + + // load the contents of the file + expect(docs[0]?.document).toMatchInlineSnapshot(` + query Users($first: Int = 10, $after: String) { + usersByForwardsCursor(first: $first, after: $after) @paginate { + edges { + node { + id + } + } + edges { + cursor + node { + __typename + } + } + pageInfo { + hasPreviousPage + hasNextPage + startCursor + endCursor + } + } + } + + `) +}) + +test('query with backwards cursor paginate', async function () { + const docs = [ + mockCollectedDoc( + ` + query Users { + usersByBackwardsCursor(last: 10) @paginate { + edges { + node { + id + } + } + } + } + ` + ), + ] + + // run the pipeline + const config = testConfig() + await runPipeline(config, docs) + + // load the contents of the file + expect(docs[0]?.document).toMatchInlineSnapshot(` + query Users($last: Int = 10, $before: String) { + usersByBackwardsCursor(last: $last, before: $before) @paginate { + edges { + node { + id + } + } + edges { + cursor + node { + __typename + } + } + pageInfo { + hasPreviousPage + hasNextPage + startCursor + endCursor + } + } + } + + `) +}) + +test('query with offset paginate', async function () { + const docs = [ + mockCollectedDoc( + ` + query Users { + 
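+					# usersByOffset takes limit/offset Int args in the test schema, so the
+					# transform hoists them into $limit (default 10) and an optional $offset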
usersByOffset(limit: 10) @paginate { + id + } + } + ` + ), + ] + + // run the pipeline + const config = testConfig() + await runPipeline(config, docs) + + // load the contents of the file + expect(docs[0]?.document).toMatchInlineSnapshot(` + query Users($limit: Int = 10, $offset: Int) { + usersByOffset(limit: $limit, offset: $offset) @paginate { + id + } + } + + `) +}) + +test('query with backwards cursor on full paginate', async function () { + const docs = [ + mockCollectedDoc( + ` + query Users { + usersByCursor(last: 10) @paginate { + edges { + node { + id + } + } + } + } + ` + ), + ] + + // run the pipeline + const config = testConfig() + await runPipeline(config, docs) + + // load the contents of the file + expect(docs[0]?.document).toMatchInlineSnapshot(` + query Users($last: Int = 10, $before: String) { + usersByCursor(last: $last, before: $before) @paginate { + edges { + node { + id + } + } + edges { + cursor + node { + __typename + } + } + pageInfo { + hasPreviousPage + hasNextPage + startCursor + endCursor + } + } + } + + `) +}) + +test('query with forwards cursor on full paginate', async function () { + const docs = [ + mockCollectedDoc( + ` + query Users { + usersByCursor(first: 10) @paginate { + edges { + node { + id + } + } + } + } + ` + ), + ] + + // run the pipeline + const config = testConfig() + await runPipeline(config, docs) + + // load the contents of the file + expect(docs[0]?.document).toMatchInlineSnapshot(` + query Users($first: Int = 10, $after: String) { + usersByCursor(first: $first, after: $after) @paginate { + edges { + node { + id + } + } + edges { + cursor + node { + __typename + } + } + pageInfo { + hasPreviousPage + hasNextPage + startCursor + endCursor + } + } + } + + `) +}) + +test("forwards cursor paginated query doesn't overlap variables", async function () { + const docs = [ + mockCollectedDoc( + ` + query Users($first: Int!) { + usersByCursor(first: $first) @paginate { + edges { + node { + id + } + } + } + } + ` + ), + ] + + // run the pipeline + const config = testConfig() + await runPipeline(config, docs) + + // load the contents of the file + expect(docs[0]?.document).toMatchInlineSnapshot(` + query Users($first: Int!, $after: String) { + usersByCursor(first: $first, after: $after) @paginate { + edges { + node { + id + } + } + edges { + cursor + node { + __typename + } + } + pageInfo { + hasPreviousPage + hasNextPage + startCursor + endCursor + } + } + } + + `) +}) + +test("backwards cursor paginated query doesn't overlap variables", async function () { + const docs = [ + mockCollectedDoc( + ` + query Users($last: Int!) { + usersByCursor(last: $last) @paginate { + edges { + node { + id + } + } + } + } + ` + ), + ] + + // run the pipeline + const config = testConfig() + await runPipeline(config, docs) + + // load the contents of the file + expect(docs[0]?.document).toMatchInlineSnapshot(` + query Users($last: Int!, $before: String) { + usersByCursor(last: $last, before: $before) @paginate { + edges { + node { + id + } + } + edges { + cursor + node { + __typename + } + } + pageInfo { + hasPreviousPage + hasNextPage + startCursor + endCursor + } + } + } + + `) +}) + +test("offset paginated query doesn't overlap variables", async function () { + const docs = [ + mockCollectedDoc( + ` + query Users($limit: Int! 
= 10) { + usersByOffset(limit: $limit) @paginate { + id + } + } + ` + ), + ] + + // run the pipeline + const config = testConfig() + await runPipeline(config, docs) + + // load the contents of the file + expect(docs[0]?.document).toMatchInlineSnapshot(` + query Users($limit: Int! = 10, $offset: Int) { + usersByOffset(limit: $limit, offset: $offset) @paginate { + id + } + } + + `) +}) + +test('refetch specification with backwards pagination', async function () { + const docs = [ + mockCollectedDoc( + ` + fragment UserFriends on Query { + usersByCursor(last: 10) @paginate { + edges { + node { + id + } + } + } + } + ` + ), + ] + + // run the pipeline + const config = testConfig() + await runPipeline(config, docs) + + expect(docs[0].refetch).toMatchInlineSnapshot(` + { + "update": "prepend", + "path": [ + "usersByCursor" + ], + "method": "cursor", + "pageSize": 10, + "embedded": false + } + `) +}) + +test('refetch entry with initial backwards', async function () { + const docs = [ + mockCollectedDoc( + ` + fragment UserFriends on Query { + usersByCursor(last: 10, before: "1234") @paginate { + edges { + node { + id + } + } + } + } + ` + ), + ] + + // run the pipeline + const config = testConfig() + await runPipeline(config, docs) + + expect(docs[0].refetch).toMatchInlineSnapshot(` + { + "update": "prepend", + "path": [ + "usersByCursor" + ], + "method": "cursor", + "pageSize": 10, + "embedded": false, + "start": "1234" + } + `) +}) + +test('refetch entry with initial forwards', async function () { + const docs = [ + mockCollectedDoc( + ` + fragment UserFriends on Query { + usersByCursor(first: 10, after: "1234") @paginate { + edges { + node { + id + } + } + } + } + ` + ), + ] + + // run the pipeline + const config = testConfig() + await runPipeline(config, docs) + + expect(docs[0].refetch).toMatchInlineSnapshot(` + { + "update": "append", + "path": [ + "usersByCursor" + ], + "method": "cursor", + "pageSize": 10, + "embedded": false, + "start": "1234" + } + `) +}) + +test('generated query has same refetch spec', async function () { + const docs = [ + mockCollectedDoc( + ` + fragment UserFriends on Query { + usersByCursor(first: 10, after: "1234") @paginate { + edges { + node { + id + } + } + } + } + ` + ), + ] + + // run the pipeline + const config = testConfig() + await runPipeline(config, docs) + + await expect(docs[1]).toMatchArtifactSnapshot(` + module.exports = { + name: "UserFriends_Pagination_Query", + kind: "HoudiniQuery", + + refetch: { + update: "append", + path: ["usersByCursor"], + method: "cursor", + pageSize: 10, + embedded: false, + start: "1234" + }, + + raw: \`query UserFriends_Houdini_Paginate($first: Int = 10, $after: String = "1234") { + ...UserFriends_jrGTj + } + + fragment UserFriends_jrGTj on Query { + usersByCursor(first: $first, after: $after) { + edges { + node { + id + } + } + edges { + cursor + node { + __typename + } + } + pageInfo { + hasPreviousPage + hasNextPage + startCursor + endCursor + } + } + } + \`, + + rootType: "Query", + + selection: { + usersByCursor: { + type: "UserConnection", + keyRaw: "usersByCursor::paginated", + + fields: { + edges: { + type: "UserEdge", + keyRaw: "edges", + + fields: { + cursor: { + type: "String", + keyRaw: "cursor" + }, + + node: { + type: "User", + keyRaw: "node", + + fields: { + __typename: { + type: "String", + keyRaw: "__typename" + }, + + id: { + type: "ID", + keyRaw: "id" + } + } + } + }, + + update: "append" + }, + + pageInfo: { + type: "PageInfo", + keyRaw: "pageInfo", + + fields: { + hasPreviousPage: { + type: 
"Boolean", + keyRaw: "hasPreviousPage" + }, + + hasNextPage: { + type: "Boolean", + keyRaw: "hasNextPage" + }, + + startCursor: { + type: "String", + keyRaw: "startCursor" + }, + + endCursor: { + type: "String", + keyRaw: "endCursor" + } + } + } + } + } + }, + + input: { + fields: { + first: "Int", + after: "String" + }, + + types: {} + } + }; + `) +}) + +test('refetch specification with offset pagination', async function () { + const docs = [ + mockCollectedDoc( + ` + fragment UserFriends on Query { + usersByOffset(limit: 10) @paginate { + id + } + } + ` + ), + ] + + // run the pipeline + const config = testConfig() + await runPipeline(config, docs) + + expect(docs[0].refetch).toMatchInlineSnapshot(` + { + "update": "append", + "path": [ + "usersByOffset" + ], + "method": "offset", + "pageSize": 10, + "embedded": false + } + `) +}) + +test('refetch specification with initial offset', async function () { + const docs = [ + mockCollectedDoc( + ` + fragment UserFriends on Query { + usersByOffset(limit: 10, offset: 10) @paginate { + id + } + } + ` + ), + ] + + // run the pipeline + const config = testConfig() + await runPipeline(config, docs) + + expect(docs[0].refetch).toMatchInlineSnapshot(` + { + "update": "append", + "path": [ + "usersByOffset" + ], + "method": "offset", + "pageSize": 10, + "embedded": false, + "start": 10 + } + `) +}) diff --git a/packages/houdini/cmd/transforms/paginate.ts b/packages/houdini/cmd/transforms/paginate.ts new file mode 100644 index 000000000..5b9998b5b --- /dev/null +++ b/packages/houdini/cmd/transforms/paginate.ts @@ -0,0 +1,673 @@ +// externals +import * as graphql from 'graphql' +import { Config, parentTypeFromAncestors } from 'houdini-common' +// locals +import { CollectedGraphQLDocument, RefetchUpdateMode } from '../types' + +// the paginate transform is responsible for preparing a fragment marked for pagination +// to be embedded in the query that will be used to fetch additional data. That means it +// is responsible for adding additional arguments to the paginated field and hoisting +// all of the pagination args to arguments of the fragment itself. It then generates +// a query that threads query variables to the updated fragment and lets the fragment +// argument transform do the rest. This whole process happens in a few steps: + +// - walk through the document and look for a field marked for pagination. if one is found, +// add the necessary arguments to the field, referencing variables that will be injected +// and compute what kind of pagination (toggling an object of flags) +// - if the @paginate directive was found, add the @arguments directive to the fragment +// definition and use any fields that were previously set as the default value. 
+// That will cause the fragment arguments directive to inline the default values if one isn't
+// given, preserving the original definition for the first query
+// - generate the query with the fragment embedded using @with to pass query variables through
+
+type PaginationFlags = {
+	[fieldName: string]: { enabled: boolean; type: 'String' | 'Int'; defaultValue?: any }
+}
+
+// the paginate transform adds the necessary fields for a paginated field
+export default async function paginate(
+	config: Config,
+	documents: CollectedGraphQLDocument[]
+): Promise<void> {
+	// we're going to have to add documents to the list so collect them here and we'll add them when we're done
+	const newDocs: CollectedGraphQLDocument[] = []
+
+	// visit every document
+	for (const doc of documents) {
+		// remember if we ran into a paginate directive
+		let paginated = false
+
+		// store the pagination state to coordinate what we define as args to the field and the argument definitions of
+		// the fragment and operation. we'll fill in the enabled state and default values once we encounter @paginate
+		const flags: PaginationFlags = {
+			first: {
+				enabled: false,
+				type: 'Int',
+			},
+			after: {
+				enabled: false,
+				type: 'String',
+			},
+			last: {
+				enabled: false,
+				type: 'Int',
+			},
+			before: {
+				enabled: false,
+				type: 'String',
+			},
+			limit: {
+				enabled: false,
+				type: 'Int',
+			},
+			offset: {
+				enabled: false,
+				type: 'Int',
+			},
+		}
+
+		// we need to know the path where the paginate directive shows up so we can distinguish updated
+		// values from data that needs to be added to the list
+		let paginationPath: string[] = []
+
+		// we need to add page info to the selection
+		doc.document = graphql.visit(doc.document, {
+			Field(node, _, __, ___, ancestors) {
+				// if there's no paginate directive, ignore the field
+				const paginateDirective = node.directives?.find(
+					(directive) => directive.name.value === config.paginateDirective
+				)
+				if (!paginateDirective || !node.selectionSet) {
+					return
+				}
+
+				// remember we saw this directive
+				paginated = true
+
+				// loop over the args of the field once so we can check their existence
+				const args = new Set(
+					(parentTypeFromAncestors(config.schema, ancestors) as
+						| graphql.GraphQLObjectType
+						| graphql.GraphQLInterfaceType)
+						.getFields()
+						[node.name.value].args.map((arg) => arg.name)
+				)
+
+				// also look at which direction the user specified explicitly
+				const passedArgs = new Set(node.arguments?.map((arg) => arg.name.value))
+				const specifiedForwards = passedArgs.has('first')
+				const specifiedBackwards = passedArgs.has('last')
+
+				// figure out what kind of pagination the field supports
+				const forwardPagination =
+					!specifiedBackwards && args.has('first') && args.has('after')
+				const backwardsPagination =
+					!specifiedForwards && args.has('last') && args.has('before')
+				const offsetPagination =
+					!forwardPagination &&
+					!backwardsPagination &&
+					args.has('offset') &&
+					args.has('limit')
+
+				// update the flags based on what the tagged field supports
+				flags.first.enabled = forwardPagination
+				flags.after.enabled = forwardPagination
+				flags.last.enabled = backwardsPagination
+				flags.before.enabled = backwardsPagination
+				flags.offset.enabled = offsetPagination
+				flags.limit.enabled = offsetPagination
+
+				paginationPath = (ancestors
+					.filter(
+						(ancestor) =>
+							// @ts-ignore
+							!Array.isArray(ancestor) && ancestor.kind === graphql.Kind.FIELD
+					)
+					.concat(node) as graphql.FieldNode[]).map(
+					(field) => field.alias?.value || field.name.value
+				)
+
+				// if the field supports cursor-based pagination we need to make sure we have the
+				// page info field
+				return {
+					...node,
+					// any pagination arguments we run into will need to be replaced with variables
+					// since they will be hoisted into the arguments for the fragment or query
+					arguments: replaceArgumentsWithVariables(node.arguments, flags),
+					selectionSet: offsetPagination
+						? // no need to add any fields to the selection if we're dealing with offset pagination
+						  node.selectionSet
+						: // add the page info if we are dealing with cursor-based pagination
+						  {
+								...node.selectionSet,
+								selections: [...node.selectionSet.selections, ...pageInfoSelection],
+						  },
+				}
+			},
+		})
+
+		// if we saw the paginate directive we need to add arguments to the fragment or query that contains the
+		// field that is marked for pagination
+		if (paginated) {
+			let fragmentName = ''
+			let refetchQueryName = ''
+			// check if we have to embed the fragment in Node
+			let nodeQuery = false
+
+			// figure out the right refetch update
+			let refetchUpdate = RefetchUpdateMode.append
+			if (flags.last.enabled) {
+				refetchUpdate = RefetchUpdateMode.prepend
+			}
+
+			// remember if we found a fragment or operation
+			let fragment = false
+
+			doc.document = graphql.visit(doc.document, {
+				// if we are dealing with a query, we'll need to add the variables to the definition
+				OperationDefinition(node) {
+					// make sure it's a query
+					if (node.operation !== 'query') {
+						throw new Error(
+							`@${config.paginateDirective} can only show up in a query or fragment document`
+						)
+					}
+
+					refetchQueryName = node.name?.value || ''
+
+					// build a map from existing variables to their value so we can compare with the ones we need to inject
+					const operationVariables: Record<string, graphql.VariableDefinitionNode> =
+						node.variableDefinitions?.reduce(
+							(vars, definition) => ({
+								...vars,
+								[definition.variable.name.value]: definition,
+							}),
+							{}
+						) || {}
+
+					// figure out the variables we want on the query
+					let newVariables: Record<
+						string,
+						graphql.VariableDefinitionNode
+					> = Object.fromEntries(
+						Object.entries(flags)
+							.filter(([, spec]) => spec.enabled)
+							.map(([fieldName, spec]) => [
+								fieldName,
+								staticVariableDefinition(fieldName, spec.type, spec.defaultValue),
+							])
+					)
+
+					// the full list of variables comes from both sources
+					const variableNames = new Set(
+						Object.keys(operationVariables).concat(Object.keys(newVariables))
+					)
+
+					// we need to build a unique set of variable definitions
+					const finalVariables = [...variableNames].map(
+						(name) => operationVariables[name] || newVariables[name]
+					)
+
+					return {
+						...node,
+						variableDefinitions: finalVariables,
+					} as graphql.OperationDefinitionNode
+				},
+				// if we are dealing with a fragment definition we'll need to add the arguments directive if it doesn't exist
+				FragmentDefinition(node) {
+					fragment = true
+
+					fragmentName = node.name.value
+					refetchQueryName = fragmentName + '_Houdini_Paginate'
+
+					// a fragment has to be embedded in Node if it's not on the query type
+					nodeQuery = node.typeCondition.name.value !== config.schema.getQueryType()?.name
+
+					// look at the fragment definition for an arguments directive
+					const argDirective = node.directives?.find(
+						(directive) => directive.name.value === config.argumentsDirective
+					)
+
+					// if there isn't an arguments directive, add it and we'll add arguments to it when
+					// we run into it again
+					if (!argDirective) {
+						return {
+							...node,
+							directives: [
+								...(node.directives || []),
+								{
+									kind: 'Directive',
+									name: {
+										kind: 'Name',
+										value: config.argumentsDirective,
+									},
+								},
+							] as graphql.DirectiveNode[],
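+							// the Directive visitor below fills this empty @arguments directive in;
+							// per the tests above, a forward-cursor field ends up carrying
+							// @arguments(first: { type: "Int", default: 10 }, after: { type: "String" })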
+ } + } + }, + Directive(node) { + // if we are not looking at the arguments directive, ignore it + if (node.name.value !== config.argumentsDirective) { + return + } + + // turn the set of enabled pagination args into arg definitions for the directive + let newArgs = [ + ...Object.entries(flags) + .filter(([, spec]) => spec.enabled) + .map(([key, spec]) => + argumentNode(key, [spec.type, spec.defaultValue]) + ), + ] + + // add non-null versions of the arguments we'll use to paginate + return { + ...node, + arguments: [...(node.arguments || []), ...newArgs], + } as graphql.DirectiveNode + }, + }) + + // now that we've mutated the document to be flexible for @paginate's needs + // we need to add a document to perform the query if we are paginating on a + // fragment + + // add the paginate info to the collected document + doc.refetch = { + update: refetchUpdate, + path: paginationPath, + method: flags.first.enabled || flags.last.enabled ? 'cursor' : 'offset', + pageSize: 0, + embedded: nodeQuery, + } + + // add the correct default page size + if (flags.first.enabled) { + doc.refetch.pageSize = flags.first.defaultValue + doc.refetch.start = flags.after.defaultValue + } else if (flags.last.enabled) { + doc.refetch.pageSize = flags.last.defaultValue + doc.refetch.start = flags.before.defaultValue + } else if (flags.limit.enabled) { + doc.refetch.pageSize = flags.limit.defaultValue + doc.refetch.start = flags.offset.defaultValue + } + + // if we're not paginating a fragment, there's nothing more to do. we mutated + // the query's definition to contain the arguments we need to get more data + // and we can just use it for refetches + if (!fragment) { + continue + } + // grab the enabled fields to create the list of arguments for the directive + const paginationArgs = Object.entries(flags) + .filter(([_, { enabled }]) => enabled) + .map(([key, value]) => ({ name: key, ...value })) + + const fragmentSpreadSelection = [ + { + kind: 'FragmentSpread', + name: { + kind: 'Name', + value: fragmentName, + }, + directives: [ + { + kind: 'Directive', + name: { + kind: 'Name', + value: config.withDirective, + }, + arguments: paginationArgs.map(({ name }) => variableAsArgument(name)), + }, + ], + }, + ] as graphql.SelectionNode[] + + const queryDoc: graphql.DocumentNode = { + kind: 'Document', + definitions: [ + { + kind: 'OperationDefinition', + name: { + kind: 'Name', + value: refetchQueryName, + }, + operation: 'query', + variableDefinitions: paginationArgs + .map( + (arg) => + ({ + kind: 'VariableDefinition', + type: { + kind: 'NamedType', + name: { + kind: 'Name', + value: arg.type, + }, + }, + variable: { + kind: 'Variable', + name: { + kind: 'Name', + value: arg.name, + }, + }, + defaultValue: !flags[arg.name].defaultValue + ? undefined + : { + kind: (arg.type + 'Value') as + | 'IntValue' + | 'StringValue', + value: flags[arg.name].defaultValue, + }, + } as graphql.VariableDefinitionNode) + ) + .concat( + !nodeQuery + ? [] + : [ + { + kind: 'VariableDefinition', + type: { + kind: 'NonNullType', + type: { + kind: 'NamedType', + name: { + kind: 'Name', + value: 'ID', + }, + }, + }, + variable: { + kind: 'Variable', + name: { + kind: 'Name', + value: 'id', + }, + }, + }, + ] + ), + selectionSet: { + kind: 'SelectionSet', + selections: !nodeQuery + ? 
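+						  // a fragment on the Query type is spread at the top level of the
+						  // generated query; anything else (below) is wrapped in node(id: $id)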
fragmentSpreadSelection
+						: [
+								{
+									kind: 'Field',
+									name: {
+										kind: 'Name',
+										value: 'node',
+									},
+									arguments: [
+										{
+											kind: 'Argument',
+											name: {
+												kind: 'Name',
+												value: 'id',
+											},
+											value: {
+												kind: 'Variable',
+												name: {
+													kind: 'Name',
+													value: 'id',
+												},
+											},
+										},
+									],
+									selectionSet: {
+										kind: 'SelectionSet',
+										selections: fragmentSpreadSelection,
+									},
+								},
+						  ],
+				},
+			},
+		],
+	}
+
+	// add a document to the list
+	newDocs.push({
+		filename: doc.filename,
+		name: config.paginationQueryName(fragmentName),
+		document: queryDoc,
+		originalDocument: queryDoc,
+		generate: true,
+		refetch: doc.refetch,
+	})
+		}
+	}
+
+	// add every new doc we generated to the list
+	documents.push(...newDocs)
+}
+
+function replaceArgumentsWithVariables(
+	args: readonly graphql.ArgumentNode[] | undefined,
+	flags: PaginationFlags
+): graphql.ArgumentNode[] {
+	const seenArgs: Record<string, boolean> = {}
+
+	const newArgs = (args || []).map((arg) => {
+		// the specification for this variable
+		const spec = flags[arg.name.value]
+		// if the arg is not something we care about or is disabled we need to leave it alone
+		if (!spec || !spec.enabled) {
+			return arg
+		}
+
+		// if the argument isn't being passed a variable, we will need to set a default value
+		if (arg.value.kind !== 'Variable') {
+			const oldValue = (arg.value as graphql.StringValueNode).value
+
+			// transform the value if we have to and save the default value
+			flags[arg.name.value].defaultValue = spec.type === 'Int' ? parseInt(oldValue) : oldValue
+		}
+
+		seenArgs[arg.name.value] = true
+
+		// turn the field into a variable
+		return variableAsArgument(arg.name.value)
+	})
+
+	// any fields that are enabled but don't have values need to have variable references added
+	for (const name of Object.keys(flags)) {
+		// the specification for this variable
+		const spec = flags[name]
+
+		// if we have a value or it's disabled, ignore it
+		if (flags[name].defaultValue || !spec.enabled || seenArgs[name]) {
+			continue
+		}
+
+		// if we are looking at forward pagination args when backwards is enabled, ignore it
+		if (['first', 'after'].includes(name) && flags['before'].enabled) {
+			continue
+		}
+		// same but opposite for backwards pagination
+		if (['last', 'before'].includes(name) && flags['first'].enabled) {
+			continue
+		}
+
+		// we need to add a variable referencing the argument
+		newArgs.push(variableAsArgument(name))
+	}
+
+	return newArgs
+}
+
+function variableAsArgument(name: string): graphql.ArgumentNode {
+	return {
+		kind: 'Argument',
+		name: {
+			kind: 'Name',
+			value: name,
+		},
+		value: {
+			kind: 'Variable',
+			name: {
+				kind: 'Name',
+				value: name,
+			},
+		},
+	}
+}
+
+function staticVariableDefinition(name: string, type: string, defaultValue?: string) {
+	return {
+		kind: 'VariableDefinition',
+		type: {
+			kind: 'NamedType',
+			name: {
+				kind: 'Name',
+				value: type,
+			},
+		},
+		variable: {
+			kind: 'Variable',
+			name: {
+				kind: 'Name',
+				value: name,
+			},
+		},
+		defaultValue: !defaultValue
+			?
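+			  // the value node's kind is derived from the declared type:
+			  // 'Int' -> 'IntValue', 'String' -> 'StringValue'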
undefined + : { + kind: (type + 'Value') as 'IntValue' | 'StringValue', + value: defaultValue, + }, + } as graphql.VariableDefinitionNode +} + +function argumentNode( + name: string, + value: [string, number | string | undefined] +): graphql.ArgumentNode { + return { + kind: 'Argument', + name: { + kind: 'Name', + value: name, + }, + value: objectNode(value), + } +} + +function objectNode([type, defaultValue]: [ + string, + number | string | undefined +]): graphql.ObjectValueNode { + const node = { + kind: 'ObjectValue' as 'ObjectValue', + fields: [ + { + kind: 'ObjectField', + name: { + kind: 'Name', + value: 'type', + }, + value: { + kind: 'StringValue', + value: type, + }, + }, + ] as graphql.ObjectFieldNode[], + } + + // if there's a default value, add it + if (defaultValue) { + node.fields.push({ + kind: 'ObjectField', + name: { kind: 'Name', value: 'default' } as graphql.NameNode, + value: { + kind: typeof defaultValue === 'number' ? 'IntValue' : 'StringValue', + value: defaultValue.toString(), + }, + } as graphql.ObjectFieldNode) + } + + return node +} + +const pageInfoSelection = [ + { + kind: 'Field', + name: { + kind: 'Name', + value: 'edges', + }, + selectionSet: { + kind: 'SelectionSet', + selections: [ + { + kind: 'Field', + name: { + kind: 'Name', + value: 'cursor', + }, + }, + { + kind: 'Field', + name: { + kind: 'Name', + value: 'node', + }, + selectionSet: { + kind: 'SelectionSet', + selections: [ + { + kind: 'Field', + name: { + kind: 'Name', + value: '__typename', + }, + }, + ], + }, + }, + ], + }, + }, + { + kind: 'Field', + name: { + kind: 'Name', + value: 'pageInfo', + }, + selectionSet: { + kind: 'SelectionSet', + selections: [ + { + kind: 'Field', + name: { + kind: 'Name', + value: 'hasPreviousPage', + }, + }, + { + kind: 'Field', + name: { + kind: 'Name', + value: 'hasNextPage', + }, + }, + { + kind: 'Field', + name: { + kind: 'Name', + value: 'startCursor', + }, + }, + { + kind: 'Field', + name: { + kind: 'Name', + value: 'endCursor', + }, + }, + ], + }, + }, +] diff --git a/packages/houdini/cmd/transforms/schema.test.ts b/packages/houdini/cmd/transforms/schema.test.ts index 4f8688777..e0f500136 100644 --- a/packages/houdini/cmd/transforms/schema.test.ts +++ b/packages/houdini/cmd/transforms/schema.test.ts @@ -62,7 +62,7 @@ describe('schema transform', function () { ` mutation Update { updateUser { - ...A @prepend(when: { argument: "value", value: "value" }) + ...A @prepend(when: { value: "value" }) } } `, @@ -116,7 +116,7 @@ describe('schema transform', function () { ` mutation Update { updateUser { - ...A @append(when: { argument: "value", value: "value" }) + ...A @append(when: { value: "value" }) } } `, @@ -131,6 +131,6 @@ describe('schema transform', function () { ] for (const row of table) { - pipelineTest(row.title, row.documents, row.pass, null) + pipelineTest(row.title, row.documents, row.pass) } }) diff --git a/packages/houdini/cmd/transforms/schema.ts b/packages/houdini/cmd/transforms/schema.ts index 2192432b8..397451235 100644 --- a/packages/houdini/cmd/transforms/schema.ts +++ b/packages/houdini/cmd/transforms/schema.ts @@ -17,30 +17,23 @@ export default async function graphqlExtensions( config.schema, graphql.buildSchema(` - input HoudiniListWhen { - argument: String - value: String - } - """ @${config.listDirective} is used to mark a field for the runtime as a place to add or remove entities in mutations """ - directive @${config.listDirective}(${config.listNameArg}: String!) 
on FIELD
+	directive @${config.listDirective}(${config.listNameArg}: String!, connection: Boolean) on FIELD

	"""
		@${config.listPrependDirective} is used to tell the runtime to add the result to the start of the list
	"""
	directive @${config.listPrependDirective}(
-		${config.listDirectiveParentIDArg}: ID,
-		when: HoudiniListWhen,
-		when_not: HoudiniListWhen
+		${config.listDirectiveParentIDArg}: ID
	) on FRAGMENT_SPREAD

	"""
		@${config.listAppendDirective} is used to tell the runtime to add the result to the end of the list
	"""
-	directive @${config.listAppendDirective}(${config.listDirectiveParentIDArg}: ID, when: HoudiniListWhen, when_not: HoudiniListWhen) on FRAGMENT_SPREAD
+	directive @${config.listAppendDirective}(${config.listDirectiveParentIDArg}: ID) on FRAGMENT_SPREAD

	"""
		@${config.listParentDirective} is used to provide a parentID without specifying position or in situations
@@ -51,12 +44,12 @@ export default async function graphqlExtensions(
	"""
		@${config.whenDirective} is used to provide a conditional that must be met for the operation to be applied to a list
	"""
-	directive @${config.whenDirective}(argument: String!, value: String!) on FRAGMENT_SPREAD
+	directive @${config.whenDirective} on FRAGMENT_SPREAD

	"""
		@${config.whenNotDirective} is used to provide a conditional that must not be met for the operation to be applied to a list
	"""
-	directive @${config.whenNotDirective}(argument: String!, value: String!) on FRAGMENT_SPREAD
+	directive @${config.whenNotDirective} on FRAGMENT_SPREAD

	"""
		@${config.argumentsDirective} is used to define the arguments of a fragment
diff --git a/packages/houdini/cmd/transforms/typename.test.ts b/packages/houdini/cmd/transforms/typename.test.ts
index 21cac6cea..271cab166 100644
--- a/packages/houdini/cmd/transforms/typename.test.ts
+++ b/packages/houdini/cmd/transforms/typename.test.ts
@@ -28,8 +28,8 @@ test('adds __typename on interface selection sets under query', async function (
	const config = testConfig()
	await runPipeline(config, docs)

-	expect(graphql.print(docs[0].document)).toMatchInlineSnapshot(`
-		"query Friends {
+	expect(docs[0].document).toMatchInlineSnapshot(`
+		query Friends {
		  friends {
		    ... on Cat {
		      id
@@ -40,7 +40,7 @@ test('adds __typename on interface selection sets under query', async function (
		    __typename
		  }
		}
-		"
+
	`)
})

@@ -69,9 +69,9 @@ test('adds __typename on interface selection sets under an object', async functi
	const config = testConfig()
	await runPipeline(config, docs)

-	expect(graphql.print(docs[0].document)).toMatchInlineSnapshot(`
-		"query Friends {
-		  users(stringValue: \\"hello\\") {
+	expect(docs[0].document).toMatchInlineSnapshot(`
+		query Friends {
+		  users(stringValue: "hello") {
		    friendsInterface {
		      ... on Cat {
		        id
@@ -85,7 +85,7 @@ test('adds __typename on interface selection sets under an object', async functi
		    id
		  }
		}
-		"
+
	`)
})

@@ -111,8 +111,8 @@ test('adds __typename on unions', async function () {
	const config = testConfig()
	await runPipeline(config, docs)

-	expect(graphql.print(docs[0].document)).toMatchInlineSnapshot(`
-		"query Friends {
+	expect(docs[0].document).toMatchInlineSnapshot(`
+		query Friends {
		  entities {
		    ...
on Cat { id @@ -123,6 +123,6 @@ test('adds __typename on unions', async function () { __typename } } - " + `) }) diff --git a/packages/houdini/cmd/types.ts b/packages/houdini/cmd/types.ts index 07b690e92..ea84a0bc7 100644 --- a/packages/houdini/cmd/types.ts +++ b/packages/houdini/cmd/types.ts @@ -3,6 +3,7 @@ import type * as graphql from 'graphql' export type { ConfigFile } from 'houdini-common' export * from '../runtime/types' +import { BaseCompiledDocument } from '../runtime/types' // the result of collecting documents from source code export type CollectedGraphQLDocument = { @@ -10,7 +11,8 @@ export type CollectedGraphQLDocument = { name: string document: graphql.DocumentNode originalDocument: graphql.DocumentNode - generated: boolean + generate: boolean + refetch?: BaseCompiledDocument['refetch'] } // an error pertaining to a specific graphql document diff --git a/packages/houdini/cmd/validators/typeCheck.test.ts b/packages/houdini/cmd/validators/typeCheck.test.ts index 5a30cc802..19657c103 100755 --- a/packages/houdini/cmd/validators/typeCheck.test.ts +++ b/packages/houdini/cmd/validators/typeCheck.test.ts @@ -388,6 +388,34 @@ const table: Row[] = [ `, ], }, + { + title: 'known connection directives', + pass: true, + // note: we pass parentID here to ensure we're not getting caught on the + // free lists check + documents: [ + ` + query UserFriends { + user { + friendsByCursor @list(name: "Friends") { + edges { + node { + id + } + } + } + } + } + `, + ` + mutation Bar { + deleteUser(id: "2") { + userID @User_delete + } + } + `, + ], + }, { title: 'unknown list directives errors before generation', pass: false, @@ -415,7 +443,7 @@ const table: Row[] = [ pass: false, documents: [ ` - fragment Foo on Query @arguments(name: { type: "String"}) { + fragment Foo on Query @arguments(name: { type: "String!" 
}) { users(stringValue: $name) { id } } `, @@ -436,7 +464,7 @@ const table: Row[] = [ pass: false, documents: [ ` - fragment Foo on Query @arguments(name: { type: "String"}) { + fragment Foo on Query @arguments(name: { type: "String" }) { users(stringValue: $name) { id } } `, @@ -457,7 +485,7 @@ const table: Row[] = [ pass: false, documents: [ ` - fragment Foo on Query @arguments(name: { type: "String"}) { + fragment Foo on Query @arguments(name: { type: "String" }) { users(stringValue: $name) { id } } `, @@ -478,17 +506,245 @@ const table: Row[] = [ pass: false, documents: [ ` - fragment Foo on Query @arguments(name: { type: "String", default: true}) { + fragment FooA on Query @arguments(name: { type: "String", default: true}) { users(stringValue: $name) { id } } `, ` - fragment Foo on Query @arguments(name: { type: "String", default: true}) { + fragment FooB on Query @arguments(name: { type: "String", default: true}) { users(stringValue: $name) { id } } `, ], }, + { + title: '@paginate offset happy path', + pass: true, + documents: [ + ` + fragment UserPaginatedA on User { + friendsByOffset(limit: 10) @paginate { + id + } + } + `, + ` + fragment UserPaginatedB on User { + friendsByOffset(limit: 10) @paginate { + id + } + } + `, + ], + }, + { + title: '@paginate cursor happy path', + pass: true, + documents: [ + ` + fragment UserPaginatedA on User { + friendsByCursor(first: 10) @paginate { + edges { + node { + id + } + } + } + } + `, + ` + fragment UserPaginatedB on User { + friendsByCursor(first: 10) @paginate { + edges { + node { + id + } + } + } + } + `, + ], + }, + { + title: 'cursor pagination requires first', + pass: false, + documents: [ + ` + fragment UserCursorPaginatedA on User { + friendsByCursor @paginate { + edges { + node { + id + } + } + } + } + `, + ` + fragment UserCursorPaginatedB on User { + friendsByCursor @paginate { + edges { + node { + id + } + } + } + } + `, + ` + fragment UserCursorPaginatedC on User { + friendsByCursor(first: 10) @paginate { + edges { + node { + id + } + } + } + } + `, + ], + }, + { + title: "@paginate cursor can't go both ways", + pass: false, + documents: [ + ` + fragment UserPaginatedA on User { + friendsByCursor(first: 10, last: 10) @paginate { + edges { + node { + id + } + } + } + } + `, + ` + fragment UserPaginatedB on User { + friendsByCursor(first: 10, last: 10) @paginate { + edges { + node { + id + } + } + } + } + `, + ], + }, + { + title: "@paginate can't show up in a document with required args", + pass: false, + documents: [ + ` + fragment UserPaginatedA on User @arguments(foo: { type: "String!" }) { + friendsByCursor(first: 10) @paginate { + edges { + node { + id + } + } + } + } + `, + ` + fragment UserPaginatedB on User @arguments(foo: { type: "String!" 
}) { + friendsByCursor(first: 10) @paginate { + edges { + node { + id + } + } + } + } + `, + ], + }, + { + title: 'offset pagination requires limit', + pass: false, + documents: [ + ` + fragment UserPaginatedA on User { + friendsByOffset @paginate { + id + } + } + `, + ` + fragment UserPaginatedB on User { + friendsByOffset @paginate { + id + } + } + `, + ` + fragment UserPaginatedC on User { + friendsByOffset(limit: 10) @paginate { + id + } + } + `, + ], + }, + { + title: 'multiple @paginate', + pass: false, + documents: [ + ` + fragment UserPaginatedA on User { + friendsByOffset(limit: 10) @paginate { + id + } + friendsByCursor(first: 10) @paginate { + edges { + node { + id + } + } + } + } + `, + ` + fragment UserPaginatedB on User { + friendsByOffset(limit: 10) @paginate { + id + } + friendsByCursor(first: 10) @paginate { + edges { + node { + id + } + } + } + } + `, + ], + }, + { + title: "@paginate can't fall under lists", + pass: false, + documents: [ + ` + fragment UserPaginatedA on User { + friends { + friendsByOffset(limit: 10) @paginate { + id + } + } + } + `, + ` + fragment UserPaginatedB on User { + friends { + friendsByOffset(limit: 10) @paginate { + id + } + } + } + `, + ], + }, ] type Row = diff --git a/packages/houdini/cmd/validators/typeCheck.ts b/packages/houdini/cmd/validators/typeCheck.ts index cdd74641c..d88af795a 100755 --- a/packages/houdini/cmd/validators/typeCheck.ts +++ b/packages/houdini/cmd/validators/typeCheck.ts @@ -1,13 +1,15 @@ // externals -import { Config, parentTypeFromAncestors } from 'houdini-common' +import { Config, definitionFromAncestors, parentTypeFromAncestors } from 'houdini-common' import * as graphql from 'graphql' // locals import { CollectedGraphQLDocument, HoudiniError, HoudiniErrorTodo } from '../types' import { + FragmentArgument, fragmentArguments as collectFragmentArguments, withArguments, } from '../transforms/fragmentVariables' import { unwrapType } from '../utils' +import { connectionSelection } from '../transforms/list' // typeCheck verifies that the documents are valid instead of waiting // for the compiler to fail later down the line. @@ -18,6 +20,9 @@ export default async function typeCheck( // wrap the errors we run into in a HoudiniError const errors: HoudiniError[] = [] + // verify the node interface (if it exists) + verifyNodeInterface(config) + // we need to catch errors in the list API. this means that a user // must provide parentID if they are using a list that is not all-objects // from root. figure out which lists are "free" (ie, can be applied without a parentID arg) @@ -36,26 +41,10 @@ export default async function typeCheck( fragments[definition.name.value] = definition }, [graphql.Kind.DIRECTIVE](directive, _, parent, __, ancestors) { - // if the fragment is a list fragment - if (directive.name.value !== config.listDirective) { - return - } - - // look up the name of the list - const nameArg = directive.arguments?.find( - ({ name }) => name.value === config.listNameArg - ) - - if (!nameArg) { - errors.push(new HoudiniErrorTodo('Could not find name arg')) - return - } - if (nameArg.value.kind !== 'StringValue') { - errors.push( - new HoudiniErrorTodo( - 'Name arg must be a static string, it cannot be set to a variable.' 
- ) - ) + // only consider @paginate or @list + if ( + ![config.listDirective, config.paginateDirective].includes(directive.name.value) + ) { return } @@ -81,7 +70,7 @@ export default async function typeCheck( } // look at the list of ancestors to see if we required a parent ID - let needsParent = definition.kind === 'FragmentDefinition' + let needsParent = false // if we are looking at an operation that's not query if ( @@ -89,7 +78,11 @@ export default async function typeCheck( definition.kind !== 'FragmentDefinition') || (definition.kind === 'OperationDefinition' && definition.operation !== 'query') ) { - errors.push(new Error('@list can only appear in queries or fragments')) + errors.push( + new Error( + `@${directive.name.value} can only appear in queries or fragments` + ) + ) return } @@ -148,7 +141,45 @@ export default async function typeCheck( rootType = rootType?.getFields()[parent.name.value].type } - const parentType = parentTypeFromAncestors(config.schema, ancestors) + // if we found a pagination directive, make sure that it doesn't + // fall under a list (same logic as @list needing a parent) + if (directive.name.value === config.paginateDirective) { + // if we need a parent, we can't paginate it + if (needsParent) { + errors.push( + new HoudiniErrorTodo( + `@${config.paginateDirective} cannot be below a list` + ) + ) + } + } + + // if we got this far, we need a parent if we're under any fragment + // since a list mutation can't compute the parent from the owner of the fragment + needsParent = needsParent || definition.kind === 'FragmentDefinition' + + // look up the name of the list + const nameArg = directive.arguments?.find( + ({ name }) => name.value === config.listNameArg + ) + + if (!nameArg) { + // if we are looking at @list there is an error + if (directive.name.value === config.listDirective) { + errors.push(new HoudiniErrorTodo('Could not find name arg')) + } + + // regardless there's nothing more to process + return + } + if (nameArg.value.kind !== 'StringValue') { + errors.push( + new HoudiniErrorTodo( + 'Name arg must be a static string, it cannot be set to a variable.' 
+ ) + ) + return + } // if we have already seen the list name there's a problem const listName = nameArg.value.value @@ -157,9 +188,24 @@ export default async function typeCheck( return } + // in order to figure out the targets for the list we need to look at the field + // definition + const parentType = parentTypeFromAncestors(config.schema, ancestors.slice(0, -1)) + const targetField = ancestors[ancestors.length - 1] as graphql.FieldNode + const targetFieldDefinition = parentType.getFields()[ + targetField.name.value + ] as graphql.GraphQLField + + const { type } = connectionSelection( + config, + targetFieldDefinition, + parentTypeFromAncestors(config.schema, ancestors) as graphql.GraphQLObjectType, + targetField.selectionSet + ) + // add the list to the list lists.push(listName) - listTypes.push(parentType.name) + listTypes.push(type.name) // if we still don't need a parent by now, add it to the list of free lists if (!needsParent) { @@ -201,10 +247,14 @@ export default async function typeCheck( listTypes, fragments, }), + // pagination directive can only show up on nodes or the query type + nodeDirectives(config, [config.paginateDirective]), // this replaces KnownArgumentNamesRule knownArguments(config), // validate any fragment arguments - fragmentArguments(config, fragments) + validateFragmentArguments(config, fragments), + // make sure there are pagination args on fields marked with @paginate + paginateArgs(config) ) for (const { filename, document: parsed } of docs) { @@ -371,7 +421,16 @@ function knownArguments(config: Config) { // if the directive points to the arguments or with directive, we don't // need the arguments to be defined - if ([config.argumentsDirective, config.withDirective].includes(directiveName)) { + if ( + [ + config.argumentsDirective, + config.withDirective, + config.whenDirective, + config.whenNotDirective, + config.listAppendDirective, + config.listPrependDirective, + ].includes(directiveName) + ) { return false } @@ -382,7 +441,7 @@ function knownArguments(config: Config) { } } -function fragmentArguments( +function validateFragmentArguments( config: Config, fragments: Record ) { @@ -391,7 +450,7 @@ function fragmentArguments( // map fragment name to the list of all the args const fragmentArgumentNames: Record = {} // map fragment names to the argument nodes - const fragmentArguments: Record = {} + const fragmentArguments: Record = {} return function (ctx: graphql.ValidationContext): graphql.ASTVisitor { return { @@ -466,19 +525,20 @@ function fragmentArguments( // if we haven't computed the required arguments for the fragment, do it now if (!requiredArgs[fragmentName]) { - // look up the arguments for the fragment - const args = collectFragmentArguments(config, fragments[fragmentName]) + let args: FragmentArgument[] + try { + // look up the arguments for the fragment + args = collectFragmentArguments(config, fragments[fragmentName]) + } catch (e) { + ctx.reportError(new graphql.GraphQLError((e as Error).message)) + return + } fragmentArguments[fragmentName] = args requiredArgs[fragmentName] = args - .filter( - (arg) => - arg.value.kind === 'ObjectValue' && - // any arg without a default value key in its body is required - !arg.value.fields.find((field) => field.name.value === 'default') - ) - .map((arg) => arg.name.value) - fragmentArgumentNames[fragmentName] = args.map((arg) => arg.name.value) + .filter((arg) => arg && arg.required) + .map((arg) => arg.name) + fragmentArgumentNames[fragmentName] = args.map((arg) => arg.name) } // get the arguments 
applied through @with
@@ -506,6 +566,7 @@
					JSON.stringify(missing)
				)
			)
+			return
		}

		// look for any args that we don't recognize
@@ -524,12 +585,10 @@
				// zip together the provided argument with the one in the fragment definition
				const zipped: [
					graphql.ArgumentNode,
-					graphql.ArgumentNode
+					string
				][] = appliedArgumentNames.map((name) => [
					appliedArguments[name],
-					fragmentArguments[fragmentName].find(
-						(arg) => arg.name.value === name
-					) as graphql.ArgumentNode,
+					fragmentArguments[fragmentName].find((arg) => arg.name === name)!.type,
				])

				for (const [applied, target] of zipped) {
@@ -549,31 +608,256 @@
					applied.value.kind.length - 'Value'.length
				)

-				// find the type argument
-				const typeField = (target.value as graphql.ObjectValueNode).fields.find(
-					(field) => field.name.value === 'type'
-				)?.value
-				if (typeField?.kind !== 'StringValue') {
+				// if the two don't match up, it's not a valid argument type
+				if (appliedType !== target) {
					ctx.reportError(
						new graphql.GraphQLError(
-							'type field of @arguments must be a string'
+							`Invalid argument type. Expected ${target}, found ${appliedType}`
						)
					)
-					return
				}
-				const targetType = typeField.value
+				}
+			}
+		},
+	}
+	}
+}

-				// if the two don't match up, its not a valid argument type
-				if (appliedType !== targetType) {
-					ctx.reportError(
-						new graphql.GraphQLError(
-							`Invalid argument type. Expected ${targetType}, found ${appliedType}`
-						)
+
+function paginateArgs(config: Config) {
+	return function (ctx: graphql.ValidationContext): graphql.ASTVisitor {
+		// track if we have seen a paginate directive (to error on the second one)
+		let alreadyPaginated = false
+
+		return {
+			Directive(node, _, __, ___, ancestors) {
+				// only consider pagination directives
+				if (node.name.value !== config.paginateDirective) {
+					return
+				}
+
+				// if we have already run into a paginated field, yell loudly
+				if (alreadyPaginated) {
+					ctx.reportError(
+						new graphql.GraphQLError(
+							`@${config.paginateDirective} can only appear in a document once.`
+						)
+					)
+				}
+
+				// make sure we fail if we see another paginated field
+				alreadyPaginated = true
+
+				// find the definition containing the directive
+				const definition = definitionFromAncestors(ancestors)
+
+				// look at the fragment arguments
+				const definitionArgs = collectFragmentArguments(
+					config,
+					definition as graphql.FragmentDefinitionNode
+				)
+
+				// a fragment marked for pagination can't have required args
+				const hasRequiredArgs = definitionArgs.find((arg) => arg.required)
+				if (hasRequiredArgs) {
+					ctx.reportError(
+						new graphql.GraphQLError(
+							'@paginate cannot appear on a document with required args'
+						)
+					)
+					return
+				}
+
+				// look at the field the directive is applied to
+				const targetFieldType = parentTypeFromAncestors(
+					config.schema,
+					ancestors.slice(0, -1)
+				)
+				const targetField = ancestors.slice(-1)[0] as graphql.FieldNode
+
+				// look at the possible args for the field to figure out if it's a cursor-based field
+				const type = targetFieldType.getFields()[
+					targetField.name.value
+				] as graphql.GraphQLField<any, any>
+
+				// if the type doesn't exist, don't do anything; someone else will pick up the error
+				if (!type) {
+					return
+				}
+
+				// get a summary of the types defined on the field
+				const fieldArgs = type.args.reduce<Record<string, string>>(
+					(args, arg) => ({
+						...args,
+						[arg.name]: unwrapType(config, arg.type).type.name,
+					}),
+					{}
+				)
+
+				const forwardPagination =
+					fieldArgs['first'] === 'Int' && fieldArgs['after'] === 'String'
+
+				const backwardsPagination =
+					fieldArgs['last'] === 'Int' && fieldArgs['before'] === 'String'
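+				// e.g. the test schema's usersByCursor(first: Int, after: String, last: Int,
+				// before: String) summarizes to { first: "Int", after: "String", last: "Int",
+				// before: "String" }, so the field supports both directions and the applied
+				// arguments decide which one a given document uses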
+function nodeDirectives(config: Config, directives: string[]) { + const queryType = config.schema.getQueryType() + + let possibleNodes = [queryType?.name || ''] + // check if there's a node interface + const nodeInterface = config.schema.getType('Node') as graphql.GraphQLInterfaceType + if (nodeInterface) { + const { objects, interfaces } = config.schema.getImplementations(nodeInterface) + possibleNodes.push( + ...objects.map((object) => object.name), + ...interfaces.map((object) => object.name), + 'Node' + ) + } + + return function (ctx: graphql.ValidationContext): graphql.ASTVisitor { + return { + Directive(node, _, __, ___, ancestors) { + // only look at the target directives + if (!directives.includes(node.name.value)) { + return + } + + // look through the ancestor list for the definition node + let definition = definitionFromAncestors(ancestors) + + // if the definition points to an operation, it must point to a query + let definitionType = '' + if (definition.kind === 'OperationDefinition') { + // if the definition is for something other than a query + if (definition.operation !== 'query') { + ctx.reportError( + new graphql.GraphQLError( + `@${node.name.value} must fall on a fragment or query document` + ) + ) + return + } + definitionType = config.schema.getQueryType()?.name || '' + } else if (definition.kind === 'FragmentDefinition') { + definitionType = definition.typeCondition.name.value + } + + // if the fragment is not on the query type or an implementor of node + if (!possibleNodes.includes(definitionType)) { + ctx.reportError( + new graphql.GraphQLError( + `@${node.name.value} must be applied to the query type or Node.` + ) + ) + } + }, + } + } +} +
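The `verifyNodeInterface` function below encodes the contract that makes fragments refetchable under `@paginate`: a `Node` interface whose implementors can be looked up through a `node(id: ID)` field on the query type. A schema that passes every one of its checks might look like this (a hedged sketch; the `User` type is an assumption for illustration):

```ts
// a minimal schema, written as SDL in a string, that satisfies verifyNodeInterface
const exampleSchema = `
	interface Node {
		id: ID!
	}

	type User implements Node {
		id: ID!
		firstName: String!
	}

	type Query {
		node(id: ID!): Node
	}
`
```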
+function verifyNodeInterface(config: Config) { + const { schema } = config + + // look for Node + const nodeInterface = schema.getType('Node') + + // if there is no node interface, don't do anything else + if (!nodeInterface) { + return + } + + // make sure it's an interface + if (!graphql.isInterfaceType(nodeInterface)) { + throw new Error('Node must be an interface') + } + + // look for a field on the query type to look up a node by id + const queryType = schema.getQueryType() + if (!queryType) { + throw new Error('There must be a query type if you define a Node interface') + } + + // look for a node field + const nodeField = queryType.getFields()['node'] + if (!nodeField) { + throw new Error('There must be a node field if you define a Node interface') + } + + // there needs to be an arg on the field called id + const args = nodeField.args + if (args.length === 0) { + throw new Error('The node field must have args') + } + + // look for the id arg + const idArg = args.find((arg) => arg.name === 'id') + if (!idArg) { + throw new Error('The node field must have an id argument') + } + + // make sure that the id arg takes an ID + const idType = unwrapType(config, idArg.type) + if (idType.type.name !== 'ID') { + throw new Error('The id arg of the node field must be an ID') + } + + // make sure that the node field returns a Node + const fieldReturnType = unwrapType(config, nodeField.type) + if (fieldReturnType.type.name !== 'Node') { + throw new Error('The node field must return a Node') + } +} diff --git a/packages/houdini/package-lock.json b/packages/houdini/package-lock.json index 82f36800a..93c308042 100644 --- a/packages/houdini/package-lock.json +++ b/packages/houdini/package-lock.json @@ -1,6 +1,6 @@ { "name": "houdini", - "version": "0.9.11", + "version": "0.10.0-alpha.13", "lockfileVersion": 2, "requires": true, "packages": { diff --git a/packages/houdini/package.json b/packages/houdini/package.json index aa92426db..76634ccbc 100755 --- a/packages/houdini/package.json +++ b/packages/houdini/package.json @@ -1,6 +1,6 @@ { "name": "houdini", - "version": "0.9.11", + "version": "0.10.0-alpha.13", "description": "The disappearing graphql client for SvelteKit", "scripts": { "build:runtime": "npm run build:runtime:cjs && npm run build:runtime:esm", @@ -49,7 +49,7 @@ "estree-walker": "^2.0.2", "glob": "^7.1.6", "graphql": "^15.5.0", - "houdini-common": "^0.9.0", + "houdini-common": "^0.10.0-alpha.8", "inquirer": "^7.3.3", "mkdirp": "^1.0.4", "node-fetch": "^2.6.1", @@ -57,5 +57,5 @@ "rollup-plugin-preserve-shebangs": "^0.2.0", "svelte": "^3.34.0" }, - "gitHead": "5c8d7507e445cdfb5db7e31ce724a9be9672452c" + "gitHead": "53c9c521029f9539e63fdaf5a9cd11244ef12cd5" } diff --git a/packages/houdini/runtime/cache/cache.test.ts b/packages/houdini/runtime/cache/cache.test.ts index 2e74db868..99299e59f 100644 --- a/packages/houdini/runtime/cache/cache.test.ts +++ b/packages/houdini/runtime/cache/cache.test.ts @@ -29,8 +29,8 @@ test('save root object', function () { firstName: 'bob', }, } - cache.write( - { + cache.write({ + selection: { viewer: { type: 'User', keyRaw: 'viewer', @@ -47,8 +47,7 @@ }, }, data, - {} - ) + }) // make sure we can get back what we
wrote expect(cache.internal.getRecord(cache.id('User', '1')!)?.fields).toEqual({ @@ -127,8 +124,8 @@ test('linked records with updates', function () { const cache = new Cache(config) // save the data - cache.write( - { + cache.write({ + selection: { viewer: { type: 'User', keyRaw: 'viewer', @@ -158,7 +155,7 @@ test('linked records with updates', function () { }, }, }, - { + data: { viewer: { id: '1', firstName: 'bob', @@ -168,8 +165,7 @@ test('linked records with updates', function () { }, }, }, - {} - ) + }) // check user 1 const user1 = cache.internal.getRecord(cache.id('User', '1')!) @@ -191,8 +187,8 @@ test('linked records with updates', function () { expect(user2?.linkedRecord('parent')).toBeFalsy() // associate user2 with a new parent - cache.write( - { + cache.write({ + selection: { viewer: { type: 'User', keyRaw: 'viewer', @@ -222,7 +218,7 @@ test('linked records with updates', function () { }, }, }, - { + data: { viewer: { id: '2', firstName: 'jane-prime', @@ -232,8 +228,7 @@ test('linked records with updates', function () { }, }, }, - {} - ) + }) // make sure we updated user 2 expect(user2?.fields).toEqual({ @@ -251,8 +246,8 @@ test('linked lists', function () { const cache = new Cache(config) // add some data to the cache - cache.write( - { + cache.write({ + selection: { viewer: { type: 'User', keyRaw: 'viewer', @@ -282,7 +277,7 @@ test('linked lists', function () { }, }, }, - { + data: { viewer: { id: '1', firstName: 'bob', @@ -298,8 +293,7 @@ test('linked lists', function () { ], }, }, - {} - ) + }) // make sure we can get the linked lists back const friendData = cache.internal @@ -323,8 +317,8 @@ test('list as value with args', function () { const cache = new Cache(config) // add some data to the cache - cache.write( - { + cache.write({ + selection: { viewer: { type: 'User', keyRaw: 'viewer', @@ -344,15 +338,14 @@ test('list as value with args', function () { }, }, }, - { + data: { viewer: { id: '1', firstName: 'bob', favoriteColors: ['red', 'green', 'blue'], }, }, - {} - ) + }) // look up the value expect( @@ -386,17 +379,16 @@ test('root subscribe - field change', function () { } // write some data - cache.write( + cache.write({ selection, - { + data: { viewer: { id: '1', firstName: 'bob', favoriteColors: ['red', 'green', 'blue'], }, }, - {} - ) + }) // a function to spy on that will play the role of set const set = jest.fn() @@ -409,16 +401,15 @@ test('root subscribe - field change', function () { }) // somehow write a user to the cache with the same id, but a different name - cache.write( + cache.write({ selection, - { + data: { viewer: { id: '1', firstName: 'mary', }, }, - {} - ) + }) // make sure that set got called with the full response expect(set).toHaveBeenCalledWith({ @@ -456,17 +447,16 @@ test('root subscribe - linked object changed', function () { } // start off associated with one object - cache.write( + cache.write({ selection, - { + data: { viewer: { id: '1', firstName: 'bob', favoriteColors: ['red', 'green', 'blue'], }, }, - {} - ) + }) // a function to spy on that will play the role of set const set = jest.fn() @@ -479,17 +469,16 @@ test('root subscribe - linked object changed', function () { }) // somehow write a user to the cache with a different id - cache.write( + cache.write({ selection, - { + data: { viewer: { id: '2', firstName: 'mary', // ignoring favoriteColors as a sanity check (should get undefined) }, }, - {} - ) + }) // make sure that set got called with the full response expect(set).toHaveBeenCalledWith({ @@ -533,13 +522,12 @@ test("subscribing 
to null object doesn't explode", function () { } // start off associated with one object - cache.write( + cache.write({ selection, - { + data: { viewer: null, }, - {} - ) + }) // a function to spy on that will play the role of set const set = jest.fn() @@ -552,16 +540,15 @@ test("subscribing to null object doesn't explode", function () { }) // somehow write a user to the cache with a different id - cache.write( + cache.write({ selection, - { + data: { viewer: { id: '2', firstName: 'mary', }, }, - {} - ) + }) // make sure that set got called with the full response expect(set).toHaveBeenCalledWith({ @@ -606,9 +593,9 @@ test('root subscribe - linked list lost entry', function () { } // start off associated with one object - cache.write( + cache.write({ selection, - { + data: { viewer: { id: '1', friends: [ @@ -623,8 +610,7 @@ test('root subscribe - linked list lost entry', function () { ], }, }, - {} - ) + }) // a function to spy on that will play the role of set const set = jest.fn() @@ -637,9 +623,9 @@ test('root subscribe - linked list lost entry', function () { }) // somehow write a user to the cache with a new friends list - cache.write( + cache.write({ selection, - { + data: { viewer: { id: '1', friends: [ @@ -649,8 +635,7 @@ test('root subscribe - linked list lost entry', function () { ], }, }, - {} - ) + }) // make sure that set got called with the full response expect(set).toHaveBeenCalledWith({ @@ -703,9 +688,9 @@ test("subscribing to list with null values doesn't explode", function () { } // start off associated with one object - cache.write( + cache.write({ selection, - { + data: { viewer: { id: '1', friends: [ @@ -717,8 +702,7 @@ test("subscribing to list with null values doesn't explode", function () { ], }, }, - {} - ) + }) // a function to spy on that will play the role of set const set = jest.fn() @@ -731,9 +715,9 @@ test("subscribing to list with null values doesn't explode", function () { }) // somehow write a user to the cache with a new friends list - cache.write( + cache.write({ selection, - { + data: { viewer: { id: '1', friends: [ @@ -743,8 +727,7 @@ test("subscribing to list with null values doesn't explode", function () { ], }, }, - {} - ) + }) // make sure that set got called with the full response expect(set).toHaveBeenCalledWith({ @@ -792,9 +775,9 @@ test('root subscribe - linked list reorder', function () { } // start off associated with one object - cache.write( + cache.write({ selection, - { + data: { viewer: { id: '1', friends: [ @@ -809,8 +792,7 @@ test('root subscribe - linked list reorder', function () { ], }, }, - {} - ) + }) // a function to spy on that will play the role of set const set = jest.fn() @@ -823,9 +805,9 @@ test('root subscribe - linked list reorder', function () { }) // somehow write a user to the cache with the same id, but a different name - cache.write( + cache.write({ selection, - { + data: { viewer: { id: '1', friends: [ @@ -838,8 +820,7 @@ test('root subscribe - linked list reorder', function () { ], }, }, - {} - ) + }) // make sure that set got called with the full response expect(set).toHaveBeenCalledWith({ @@ -893,17 +874,16 @@ test('unsubscribe', function () { } // write some data - cache.write( + cache.write({ selection, - { + data: { viewer: { id: '1', firstName: 'bob', favoriteColors: ['red', 'green', 'blue'], }, }, - {} - ) + }) // the spec we will register/unregister const spec = { @@ -945,7 +925,11 @@ test('append in list', function () { friends: { type: 'User', keyRaw: 'friends', - list: 'All_Users', + list: { + name: 
'All_Users', + connection: false, + type: 'User', + }, fields: { id: { type: 'ID', @@ -962,9 +946,9 @@ } // start off associated with one object - cache.write( + cache.write({ selection, - { + data: { viewer: { id: '1', friends: [ @@ -975,8 +959,7 @@ ], }, }, - {} - ) + }) // a function to spy on that will play the role of set const set = jest.fn() @@ -1031,7 +1014,11 @@ test('prepend in list', function () { friends: { type: 'User', keyRaw: 'friends', - list: 'All_Users', + list: { + name: 'All_Users', + connection: false, + type: 'User', + }, fields: { id: { type: 'ID', @@ -1048,9 +1035,9 @@ } // start off associated with one object - cache.write( + cache.write({ selection, - { + data: { viewer: { id: '1', friends: [ @@ -1061,8 +1048,7 @@ ], }, }, - {} - ) + }) // a function to spy on that will play the role of set const set = jest.fn() @@ -1101,7 +1087,7 @@ test('prepend in list', function () { }) })
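From here on the tests lean on the new shape of the `list` key in a selection: what used to be a plain string name is now an object that also records whether the field is a relay-style connection and which entity type it holds. A sketch of that shape (the `ListInfo` name is an assumption for illustration; the fields come straight from these tests):

```ts
// the list metadata attached to a selection field in these tests
type ListInfo = {
	name: string // the name registered with @list / @paginate
	connection: boolean // true when the field is a relay-style connection
	type: string // the entity type the list holds
}

const flatList: ListInfo = { name: 'All_Users', connection: false, type: 'User' }
const connectionList: ListInfo = { name: 'All_Users', connection: true, type: 'User' }
```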
-test('list filter - must_not positive', function () { +test('remove from connection', function () { // instantiate a cache const cache = new Cache(config) @@ -1117,21 +1103,36 @@ friends: { type: 'User', keyRaw: 'friends', - list: 'All_Users', - filters: { - foo: { - kind: 'String', - value: 'bar', - }, + list: { + name: 'All_Users', + connection: true, + type: 'User', }, fields: { - id: { - type: 'ID', - keyRaw: 'id', - }, - firstName: { - type: 'String', - keyRaw: 'firstName', + edges: { + type: 'UserEdge', + keyRaw: 'edges', + fields: { + node: { + type: 'Node', + keyRaw: 'node', + abstract: true, + fields: { + __typename: { + type: 'String', + keyRaw: '__typename', + }, + id: { + type: 'ID', + keyRaw: 'id', + }, + firstName: { + type: 'String', + keyRaw: 'firstName', + }, + }, + }, + }, }, }, }, @@ -1140,21 +1141,32 @@ } // start off associated with one object - cache.write( + cache.write({ selection, - { + data: { viewer: { id: '1', - friends: [ - { - id: '2', - firstName: 'jane', - }, - ], + friends: { + edges: [ + { + node: { + __typename: 'User', + id: '2', + firstName: 'jane', + }, + }, + { + node: { + __typename: 'User', + id: '3', + firstName: 'jane', + }, + }, + ], + }, }, }, - {} - ) + }) // a function to spy on that will play the role of set const set = jest.fn() @@ -1163,43 +1175,46 @@ cache.subscribe({ rootType: 'Query', set, - selection, + selection: selection, }) - // insert an element into the list (no parent ID) - cache - .list('All_Users') - .when({ must_not: { foo: 'not-bar' } }) - .prepend( - { - id: { type: 'ID', keyRaw: 'id' }, - firstName: { type: 'String', keyRaw: 'firstName' }, - }, - { - id: '3', - firstName: 'mary', - } - ) + // remove user 2 from the list + cache.list('All_Users').remove({ + id: '2', + }) - // make sure we got the new value + // make sure set was called with the connection after user 2 was removed expect(set).toHaveBeenCalledWith({ viewer: { id: '1', - friends: [ - { - firstName: 'mary', - id: '3', - }, - { - firstName: 'jane', - id: '2', - }, - ], + friends: { + edges: [ + { + node: { + __typename: 'User', + id: '3', + firstName: 'jane', + }, + }, + ], + }, }, }) + + // make sure we aren't subscribing to user 2 any more + expect( + cache.internal.getRecord(cache.id('User', '2')!)?.getSubscribers('firstName') + ).toHaveLength(0) + // but we're still subscribing to user 3 + expect( + cache.internal.getRecord(cache.id('User', '3')!)?.getSubscribers('firstName') + ).toHaveLength(1) + // make sure we deleted the edge holding onto this information + expect(cache.internal.getRecord('User:1.friends.edges[0]#User:2')).toBeUndefined() }) -test('list filter - must_not negative', function () { +test('append in connection', function () { // instantiate a cache const cache = new Cache(config) @@ -1215,21 +1230,36 @@ friends: { type: 'User', keyRaw: 'friends', - list: 'All_Users', - filters: { - foo: { - kind: 'String', - value: 'bar', - }, + list: { + name: 'All_Users', + connection: true, + type: 'User', }, fields: { - id: { - type: 'ID', - keyRaw: 'id', - }, - firstName: { - type: 'String', - keyRaw: 'firstName', + edges: { + type: 'UserEdge', + keyRaw: 'edges', + fields: { + node: { + type: 'Node', + keyRaw: 'node', + abstract: true, + fields: { + __typename: { + type: 'String', + keyRaw: '__typename', + }, + id: { + type: 'ID', + keyRaw: 'id', + }, + firstName: { + type: 'String', + keyRaw: 'firstName', + }, + }, + }, + }, }, }, }, @@ -1238,21 +1268,25 @@ } // start off associated with one object - cache.write( + cache.write({ selection, - { + data: { viewer: { id: '1', - friends: [ - { - id: '2', - firstName: 'jane', - }, - ], + friends: { + edges: [ + { + node: { + __typename: 'User', + id: '2', + firstName: 'jane', + }, + }, + ], + }, }, }, - {} - ) + }) // a function to spy on that will play the role of set const set = jest.fn() @@ -1265,25 +1299,229 @@ }) // insert an element into the list (no parent ID) - cache - .list('All_Users') - .when({ must_not: { foo: 'bar' } }) - .prepend( - { - id: { type: 'ID', keyRaw: 'id' }, - firstName: { type: 'String', keyRaw: 'firstName' }, - }, - { - id: '3', - firstName: 'mary', - } - ) + cache.list('All_Users').append( + { id: { type: 'ID', keyRaw: 'id' }, firstName: { type: 'String', keyRaw: 'firstName' } }, + { + id: '3', + firstName: 'mary', + } + ) // make sure we got the new value - expect(set).not.toHaveBeenCalled() + expect(set).toHaveBeenCalledWith({ + viewer: { + id: '1', + friends: { + edges: [ + { + node: { + __typename: 'User', + id: '2', + firstName: 'jane', + }, + }, + { + node: { + __typename: 'User', + id: '3', + firstName: 'mary', + }, + }, + ], + }, + }, + }) }) -test('list filter - must positive', function () { +test('inserting data with an update overwrites a record inserted with list.append', function () { + // instantiate a cache + const cache = new Cache(config) + + const selection: SubscriptionSelection = { + viewer: { + type: 'User', + keyRaw: 'viewer', + fields: { + id: { + type: 'ID', + keyRaw: 'id', + }, + friends: { + type: 'User', + keyRaw: 'friends', + list: { + name: 'All_Users', + connection: true, + type: 'User', + }, + fields: { + edges: { + type: 'UserEdge', + keyRaw: 'edges', + fields: {
+ node: { + type: 'Node', + keyRaw: 'node', + abstract: true, + fields: { + __typename: { + type: 'String', + keyRaw: '__typename', + }, + id: { + type: 'ID', + keyRaw: 'id', + }, + firstName: { + type: 'String', + keyRaw: 'firstName', + }, + }, + }, + }, + }, + }, + }, + }, + }, + } + + // start off associated with one object + cache.write({ + selection, + data: { + viewer: { + id: '1', + friends: { + edges: [ + { + node: { + __typename: 'User', + id: '2', + firstName: 'jane', + }, + }, + ], + }, + }, + }, + }) + + // a function to spy on that will play the role of set + const set = jest.fn() + + // subscribe to the fields + cache.subscribe({ + rootType: 'Query', + set, + selection, + }) + + // insert an element into the list (no parent ID) + cache.list('All_Users').append( + { id: { type: 'ID', keyRaw: 'id' }, firstName: { type: 'String', keyRaw: 'firstName' } }, + { + id: '3', + firstName: 'mary', + } + ) + + // insert a record with a query update + cache.write({ + applyUpdates: true, + data: { + viewer: { + id: '1', + firstName: 'John', + friends: { + edges: [ + { + cursor: '1234', + node: { + id: '3', + firstName: 'mary', + }, + }, + ], + }, + }, + }, + selection: { + viewer: { + type: 'User', + keyRaw: 'viewer', + fields: { + id: { + type: 'ID', + keyRaw: 'id', + }, + firstName: { + type: 'String', + keyRaw: 'firstName', + }, + friends: { + type: 'User', + keyRaw: 'friends', + fields: { + edges: { + type: 'UserEdge', + keyRaw: 'edges', + update: 'append', + fields: { + cursor: { + type: 'String', + keyRaw: 'cursor', + }, + node: { + type: 'User', + keyRaw: 'node', + fields: { + id: { + type: 'ID', + keyRaw: 'id', + }, + firstName: { + type: 'String', + keyRaw: 'firstName', + }, + }, + }, + }, + }, + }, + }, + }, + }, + } as SubscriptionSelection, + }) + + // make sure the duplicate has been removed + expect(set).toHaveBeenNthCalledWith(2, { + viewer: { + id: '1', + friends: { + edges: [ + { + node: { + __typename: 'User', + id: '2', + firstName: 'jane', + }, + }, + { + node: { + __typename: 'User', + id: '3', + firstName: 'mary', + }, + }, + ], + }, + }, + }) +}) + +test('list filter - must_not positive', function () { // instantiate a cache const cache = new Cache(config) @@ -1299,7 +1537,11 @@ test('list filter - must positive', function () { friends: { type: 'User', keyRaw: 'friends', - list: 'All_Users', + list: { + name: 'All_Users', + connection: false, + type: 'User', + }, filters: { foo: { kind: 'String', @@ -1322,9 +1564,9 @@ test('list filter - must positive', function () { } // start off associated with one object - cache.write( + cache.write({ selection, - { + data: { viewer: { id: '1', friends: [ @@ -1335,8 +1577,7 @@ test('list filter - must positive', function () { ], }, }, - {} - ) + }) // a function to spy on that will play the role of set const set = jest.fn() @@ -1351,7 +1592,7 @@ test('list filter - must positive', function () { // insert an element into the list (no parent ID) cache .list('All_Users') - .when({ must: { foo: 'bar' } }) + .when({ must_not: { foo: 'not-bar' } }) .prepend( { id: { type: 'ID', keyRaw: 'id' }, @@ -1381,7 +1622,7 @@ test('list filter - must positive', function () { }) }) -test('list filter - must negative', function () { +test('list filter - must_not negative', function () { // instantiate a cache const cache = new Cache(config) @@ -1397,7 +1638,11 @@ test('list filter - must negative', function () { friends: { type: 'User', keyRaw: 'friends', - list: 'All_Users', + list: { + name: 'All_Users', + connection: false, + type: 'User', + }, 
filters: { foo: { kind: 'String', @@ -1420,9 +1665,9 @@ test('list filter - must negative', function () { } // start off associated with one object - cache.write( + cache.write({ selection, - { + data: { viewer: { id: '1', friends: [ @@ -1433,8 +1678,7 @@ test('list filter - must negative', function () { ], }, }, - {} - ) + }) // a function to spy on that will play the role of set const set = jest.fn() @@ -1449,7 +1693,7 @@ test('list filter - must negative', function () { // insert an element into the list (no parent ID) cache .list('All_Users') - .when({ must: { foo: 'not-bar' } }) + .when({ must_not: { foo: 'bar' } }) .prepend( { id: { type: 'ID', keyRaw: 'id' }, @@ -1465,11 +1709,11 @@ test('list filter - must negative', function () { expect(set).not.toHaveBeenCalled() }) -test('subscribe to new list nodes', function () { +test('list filter - must positive', function () { // instantiate a cache const cache = new Cache(config) - const selection = { + const selection: SubscriptionSelection = { viewer: { type: 'User', keyRaw: 'viewer', @@ -1481,7 +1725,17 @@ test('subscribe to new list nodes', function () { friends: { type: 'User', keyRaw: 'friends', - list: 'All_Users', + list: { + name: 'All_Users', + connection: false, + type: 'User', + }, + filters: { + foo: { + kind: 'String', + value: 'bar', + }, + }, fields: { id: { type: 'ID', @@ -1498,9 +1752,9 @@ test('subscribe to new list nodes', function () { } // start off associated with one object - cache.write( + cache.write({ selection, - { + data: { viewer: { id: '1', friends: [ @@ -1511,8 +1765,7 @@ test('subscribe to new list nodes', function () { ], }, }, - {} - ) + }) // a function to spy on that will play the role of set const set = jest.fn() @@ -1525,59 +1778,43 @@ test('subscribe to new list nodes', function () { }) // insert an element into the list (no parent ID) - cache.list('All_Users').append( - { id: { type: 'ID', keyRaw: 'id' }, firstName: { type: 'String', keyRaw: 'firstName' } }, - { - id: '3', - firstName: 'mary', - } - ) - - // update the user we just added - cache.write( - selection, - { - viewer: { - id: '1', - friends: [ - { - id: '2', - firstName: 'jane', - }, - { - id: '3', - firstName: 'mary-prime', - }, - ], + cache + .list('All_Users') + .when({ must: { foo: 'bar' } }) + .prepend( + { + id: { type: 'ID', keyRaw: 'id' }, + firstName: { type: 'String', keyRaw: 'firstName' }, }, - }, - {} - ) + { + id: '3', + firstName: 'mary', + } + ) - // the first time set was called, a new entry was added. 
- // the second time it's called, we get a new value for mary-prime - expect(set).toHaveBeenNthCalledWith(2, { + // make sure we got the new value + expect(set).toHaveBeenCalledWith({ viewer: { id: '1', friends: [ { - firstName: 'jane', - id: '2', + firstName: 'mary', + id: '3', }, { - firstName: 'mary-prime', - id: '3', + firstName: 'jane', + id: '2', }, ], }, }) }) -test('remove from list', function () { +test('list filter - must negative', function () { // instantiate a cache const cache = new Cache(config) - const selection = { + const selection: SubscriptionSelection = { viewer: { type: 'User', keyRaw: 'viewer', @@ -1589,7 +1826,17 @@ test('remove from list', function () { friends: { type: 'User', keyRaw: 'friends', - list: 'All_Users', + list: { + name: 'All_Users', + connection: false, + type: 'User', + }, + filters: { + foo: { + kind: 'String', + value: 'bar', + }, + }, fields: { id: { type: 'ID', @@ -1606,9 +1853,9 @@ test('remove from list', function () { } // start off associated with one object - cache.write( + cache.write({ selection, - { + data: { viewer: { id: '1', friends: [ @@ -1619,8 +1866,7 @@ test('remove from list', function () { ], }, }, - {} - ) + }) // a function to spy on that will play the role of set const set = jest.fn() @@ -1629,30 +1875,29 @@ test('remove from list', function () { cache.subscribe({ rootType: 'Query', set, - selection: selection, - }) - - // remove user 2 from the list - cache.list('All_Users').remove({ - id: '2', + selection, }) - // the first time set was called, a new entry was added. - // the second time it's called, we get a new value for mary-prime - expect(set).toHaveBeenCalledWith({ - viewer: { - id: '1', - friends: [], - }, - }) + // insert an element into the list (no parent ID) + cache + .list('All_Users') + .when({ must: { foo: 'not-bar' } }) + .prepend( + { + id: { type: 'ID', keyRaw: 'id' }, + firstName: { type: 'String', keyRaw: 'firstName' }, + }, + { + id: '3', + firstName: 'mary', + } + ) - // make sure we aren't subscribing to user 2 any more - expect( - cache.internal.getRecord(cache.id('User', '2')!)?.getSubscribers('firstName') - ).toHaveLength(0) + // make sure we got the new value + expect(set).not.toHaveBeenCalled() }) -test('delete node', function () { +test('subscribe to new list nodes', function () { // instantiate a cache const cache = new Cache(config) @@ -1668,7 +1913,11 @@ test('delete node', function () { friends: { type: 'User', keyRaw: 'friends', - list: 'All_Users', + list: { + name: 'All_Users', + connection: false, + type: 'User', + }, fields: { id: { type: 'ID', @@ -1685,9 +1934,9 @@ test('delete node', function () { } // start off associated with one object - cache.write( + cache.write({ selection, - { + data: { viewer: { id: '1', friends: [ @@ -1698,8 +1947,7 @@ test('delete node', function () { ], }, }, - {} - ) + }) // a function to spy on that will play the role of set const set = jest.fn() @@ -1711,60 +1959,75 @@ test('delete node', function () { selection, }) - // remove user 2 from the list - cache.delete( - cache.id('User', { - id: '2', - })! 
+ // insert an element into the list (no parent ID) + cache.list('All_Users').append( + { id: { type: 'ID', keyRaw: 'id' }, firstName: { type: 'String', keyRaw: 'firstName' } }, + { + id: '3', + firstName: 'mary', + } ) - // we should have been updated with an empty list - expect(set).toHaveBeenCalledWith({ + // update the user we just added + cache.write({ + selection, + data: { + viewer: { + id: '1', + friends: [ + { + id: '2', + firstName: 'jane', + }, + { + id: '3', + firstName: 'mary-prime', + }, + ], + }, + }, + }) + + // the first time set was called, a new entry was added. + // the second time it's called, we get a new value for mary-prime + expect(set).toHaveBeenNthCalledWith(2, { viewer: { id: '1', - friends: [], + friends: [ + { + firstName: 'jane', + id: '2', + }, + { + firstName: 'mary-prime', + id: '3', + }, + ], }, }) - - // make sure its empty now - expect(cache.internal.getRecord('User:2')).toBeFalsy() }) -test('append operation', function () { +test('remove from list', function () { // instantiate a cache const cache = new Cache(config) - // create a list we will add to - cache.write( - { - viewer: { - type: 'User', - keyRaw: 'viewer', - fields: { - id: { - type: 'ID', - keyRaw: 'id', - }, + const selection = { + viewer: { + type: 'User', + keyRaw: 'viewer', + fields: { + id: { + type: 'ID', + keyRaw: 'id', }, - }, - }, - { - viewer: { - id: '1', - }, - }, - {} - ) - - // subscribe to the data to register the list - cache.subscribe( - { - rootType: 'User', - selection: { friends: { type: 'User', keyRaw: 'friends', - list: 'All_Users', + list: { + name: 'All_Users', + connection: false, + type: 'User', + }, fields: { id: { type: 'ID', @@ -1777,89 +2040,75 @@ }, }, }, - parentID: cache.id('User', '1')!, - set: jest.fn(), }, - {} - ) + } - // write some data to a different location with a new user - // that should be added to the list - cache.write( - { - newUser: { - type: 'User', - keyRaw: 'newUser', - operations: [ + // start off associated with one object + cache.write({ + selection, + data: { + viewer: { + id: '1', + friends: [ { - action: 'insert', - list: 'All_Users', - parentID: { - kind: 'String', - value: cache.id('User', '1')!, - }, + id: '2', + firstName: 'jane', }, ], - fields: { - id: { - type: 'ID', - keyRaw: 'id', - }, - }, }, }, - { - newUser: { - id: '3', - }, + }) + + // a function to spy on that will play the role of set + const set = jest.fn() + + // subscribe to the fields + cache.subscribe({ + rootType: 'Query', + set, + selection: selection, + }) + + // remove user 2 from the list + cache.list('All_Users').remove({ + id: '2', + }) + + // make sure set was called with the list after user 2 was removed expect(set).toHaveBeenCalledWith({ viewer: { id: '1', friends: [], }, - {} - ) + }) - // make sure we just added to the list - expect([...cache.list('All_Users', cache.id('User', '1')!)]).toHaveLength(1) + // make sure we aren't subscribing to user 2 any more + expect( + cache.internal.getRecord(cache.id('User', '2')!)?.getSubscribers('firstName') + ).toHaveLength(0) })
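The next two tests exercise the other half of the new cache API: `cache.delete` now takes the type name explicitly as its first argument instead of deriving it from the record id. A hedged sketch of the call as these tests use it (the import path and helper name are assumptions for illustration):

```ts
import { Cache } from './cache' // assumed path; the tests construct new Cache(config) directly

// deleting a record removes it from the cache and from any lists that reference it
function deleteUser(cache: Cache, id: string) {
	// cache.id builds the cache key for the record (e.g. 'User:2')
	cache.delete('User', cache.id('User', { id })!)
}
```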
-test('append when operation', function () { +test('delete node', function () { // instantiate a cache const cache = new Cache(config) - // create a list we will add to - cache.write( - { - viewer: { - type: 'User', - keyRaw: 'viewer', - fields: { - id: { - type: 'ID', - keyRaw: 'id', - }, + const selection = { + viewer: { + type: 'User', + keyRaw: 'viewer', + fields: { + id: { + type: 'ID', + keyRaw: 'id', }, - }, - }, - { - viewer: { - id: '1', - }, - }, - {} - ) - - // subscribe to the data to register the list - cache.subscribe( - { - rootType: 'User', - selection: { friends: { type: 'User', keyRaw: 'friends', - list: 'All_Users', - filters: { - value: { - kind: 'String', - value: 'foo', - }, + list: { + name: 'All_Users', + connection: false, + type: 'User', }, fields: { id: { @@ -1873,61 +2122,168 @@ }, }, }, - parentID: cache.id('User', '1')!, - set: jest.fn(), }, - {} - ) + } - // write some data to a different location with a new user - // that should be added to the list - cache.write( - { - newUser: { - type: 'User', - keyRaw: 'newUser', - operations: [ + // start off associated with one object + cache.write({ + selection, + data: { + viewer: { + id: '1', + friends: [ { - action: 'insert', - list: 'All_Users', - parentID: { - kind: 'String', - value: cache.id('User', '1')!, - }, - when: { - must: { - value: 'not-foo', - }, - }, + id: '2', + firstName: 'jane', }, ], - fields: { - id: { - type: 'ID', - keyRaw: 'id', + }, + }, + }) + + // a function to spy on that will play the role of set + const set = jest.fn() + + // subscribe to the fields + cache.subscribe({ + rootType: 'Query', + set, + selection, + }) + + // remove user 2 from the list + cache.delete( + 'User', + cache.id('User', { + id: '2', + })! + ) + + // we should have been updated with an empty list + expect(set).toHaveBeenCalledWith({ + viewer: { + id: '1', + friends: [], + }, + }) + + // make sure it's empty now + expect(cache.internal.getRecord('User:2')).toBeFalsy() +}) + +test('delete node from connection', function () { + // instantiate a cache + const cache = new Cache(config) + + const selection: SubscriptionSelection = { + viewer: { + type: 'User', + keyRaw: 'viewer', + fields: { + id: { + type: 'ID', + keyRaw: 'id', + }, + friends: { + type: 'User', + keyRaw: 'friends', + list: { + name: 'All_Users', + connection: true, + type: 'User', + }, + fields: { + edges: { + type: 'UserEdge', + keyRaw: 'edges', + fields: { + node: { + type: 'Node', + keyRaw: 'node', + abstract: true, + fields: { + __typename: { + type: 'String', + keyRaw: '__typename', + }, + id: { + type: 'ID', + keyRaw: 'id', + }, + firstName: { + type: 'String', + keyRaw: 'firstName', + }, + }, + }, + }, + }, + }, + }, + }, + }, + } + + // start off associated with one object + cache.write({ + selection, + data: { + viewer: { + id: '1', + friends: { + edges: [ + { + node: { + __typename: 'User', + id: '2', + firstName: 'jane', + }, + }, + ], + }, + }, + }, + }) + + // a function to spy on that will play the role of set + const set = jest.fn() + + // subscribe to the fields + cache.subscribe({ + rootType: 'Query', + set, + selection, + }) + + // remove user 2 from the list + cache.delete( + 'User', + cache.id('User', { + id: '2', + })! ) - // make sure we just added to the list - expect([...cache.list('All_Users', cache.id('User', '1')!)]).toHaveLength(0) + // we should have been updated with an empty list + expect(set).toHaveBeenCalledWith({ + viewer: { + id: '1', + friends: { + edges: [], + }, + }, + }) + + // make sure it's empty now + expect(cache.internal.getRecord('User:2')).toBeFalsy() }) -test('prepend when operation', function () { +test('append operation', function () { // instantiate a cache const cache = new Cache(config)
// create a list we will add to - cache.write( - { + cache.write({ + selection: { viewer: { type: 'User', keyRaw: 'viewer', @@ -2038,36 +2384,15 @@ test('prepend operation', function () { type: 'ID', keyRaw: 'id', }, - friends: { - type: 'User', - keyRaw: 'friends', - fields: { - id: { - type: 'String', - keyRaw: 'id', - }, - firstName: { - type: 'String', - keyRaw: 'firstName', - }, - }, - }, }, }, }, - { + data: { viewer: { id: '1', - friends: [ - { - id: '2', - firstName: 'mary', - }, - ], }, }, - {} - ) + }) // subscribe to the data to register the list cache.subscribe( @@ -2077,7 +2402,17 @@ test('prepend operation', function () { friends: { type: 'User', keyRaw: 'friends', - list: 'All_Users', + list: { + name: 'All_Users', + connection: false, + type: 'User', + }, + filters: { + value: { + kind: 'String', + value: 'foo', + }, + }, fields: { id: { type: 'ID', @@ -2098,8 +2433,8 @@ test('prepend operation', function () { // write some data to a different location with a new user // that should be added to the list - cache.write( - { + cache.write({ + selection: { newUser: { type: 'User', keyRaw: 'newUser', @@ -2111,7 +2446,11 @@ test('prepend operation', function () { kind: 'String', value: cache.id('User', '1')!, }, - position: 'first', + when: { + must: { + value: 'not-foo', + }, + }, }, ], fields: { @@ -2122,27 +2461,24 @@ test('prepend operation', function () { }, }, }, - { + data: { newUser: { id: '3', }, }, - {} - ) + }) // make sure we just added to the list - expect( - [...cache.list('All_Users', cache.id('User', '1')!)].map((record) => record!.fields.id) - ).toEqual(['3', '2']) + expect([...cache.list('All_Users', cache.id('User', '1')!)]).toHaveLength(0) }) -test('remove operation', function () { +test('prepend when operation', function () { // instantiate a cache const cache = new Cache(config) // create a list we will add to - cache.write( - { + cache.write({ + selection: { viewer: { type: 'User', keyRaw: 'viewer', @@ -2151,31 +2487,15 @@ test('remove operation', function () { type: 'ID', keyRaw: 'id', }, - friends: { - type: 'User', - keyRaw: 'friends', - fields: { - id: { - type: 'ID', - keyRaw: 'id', - }, - firstName: { - type: 'String', - keyRaw: 'firstName', - }, - }, - }, }, }, }, - { + data: { viewer: { id: '1', - friends: [{ id: '2', firstName: 'jane' }], }, }, - {} - ) + }) // subscribe to the data to register the list cache.subscribe( @@ -2185,7 +2505,17 @@ test('remove operation', function () { friends: { type: 'User', keyRaw: 'friends', - list: 'All_Users', + list: { + name: 'All_Users', + connection: false, + type: 'User', + }, + filters: { + value: { + kind: 'String', + value: 'foo', + }, + }, fields: { id: { type: 'ID', @@ -2205,20 +2535,26 @@ test('remove operation', function () { ) // write some data to a different location with a new user - // that should be removed from the operation - cache.write( - { + // that should be added to the list + cache.write({ + selection: { newUser: { type: 'User', keyRaw: 'newUser', operations: [ { - action: 'remove', + action: 'insert', list: 'All_Users', parentID: { kind: 'String', value: cache.id('User', '1')!, }, + position: 'first', + when: { + must: { + value: 'not-foo', + }, + }, }, ], fields: { @@ -2229,14 +2565,235 @@ test('remove operation', function () { }, }, }, + data: { + newUser: { + id: '3', + }, + }, + }) + + // make sure we just added to the list + expect([...cache.list('All_Users', cache.id('User', '1')!)]).toHaveLength(0) +}) + +test('prepend operation', function () { + // instantiate a cache + const 
cache = new Cache(config) + + // create a list we will add to + cache.write({ + selection: { + viewer: { + type: 'User', + keyRaw: 'viewer', + fields: { + id: { + type: 'ID', + keyRaw: 'id', + }, + friends: { + type: 'User', + keyRaw: 'friends', + fields: { + id: { + type: 'String', + keyRaw: 'id', + }, + firstName: { + type: 'String', + keyRaw: 'firstName', + }, + }, + }, + }, + }, + }, + data: { + viewer: { + id: '1', + friends: [ + { + id: '2', + firstName: 'mary', + }, + ], + }, + }, + }) + + // subscribe to the data to register the list + cache.subscribe( { + rootType: 'User', + selection: { + friends: { + type: 'User', + keyRaw: 'friends', + list: { + name: 'All_Users', + connection: false, + type: 'User', + }, + fields: { + id: { + type: 'ID', + keyRaw: 'id', + }, + firstName: { + type: 'String', + keyRaw: 'firstName', + }, + }, + }, + }, + parentID: cache.id('User', '1')!, + set: jest.fn(), + }, + {} + ) + + // write some data to a different location with a new user + // that should be added to the list + cache.write({ + selection: { newUser: { - id: '2', + type: 'User', + keyRaw: 'newUser', + operations: [ + { + action: 'insert', + list: 'All_Users', + parentID: { + kind: 'String', + value: cache.id('User', '1')!, + }, + position: 'first', + }, + ], + fields: { + id: { + type: 'ID', + keyRaw: 'id', + }, + }, + }, + }, + data: { + newUser: { + id: '3', + }, + }, + }) + + // make sure we just added to the list + expect( + [...cache.list('All_Users', cache.id('User', '1')!)].map((record) => record!.fields.id) + ).toEqual(['3', '2']) +}) + +test('remove operation', function () { + // instantiate a cache + const cache = new Cache(config) + + // create a list we will add to + cache.write({ + selection: { + viewer: { + type: 'User', + keyRaw: 'viewer', + fields: { + id: { + type: 'ID', + keyRaw: 'id', + }, + friends: { + type: 'User', + keyRaw: 'friends', + fields: { + id: { + type: 'ID', + keyRaw: 'id', + }, + firstName: { + type: 'String', + keyRaw: 'firstName', + }, + }, + }, + }, + }, + }, + data: { + viewer: { + id: '1', + friends: [{ id: '2', firstName: 'jane' }], + }, + }, + }) + + // subscribe to the data to register the list + cache.subscribe( + { + rootType: 'User', + selection: { + friends: { + type: 'User', + keyRaw: 'friends', + list: { + name: 'All_Users', + connection: false, + type: 'User', + }, + fields: { + id: { + type: 'ID', + keyRaw: 'id', + }, + firstName: { + type: 'String', + keyRaw: 'firstName', + }, + }, + }, }, + parentID: cache.id('User', '1')!, + set: jest.fn(), }, {} ) + // write some data to a different location with a new user + // that should be removed from the operation + cache.write({ + selection: { + newUser: { + type: 'User', + keyRaw: 'newUser', + operations: [ + { + action: 'remove', + list: 'All_Users', + parentID: { + kind: 'String', + value: cache.id('User', '1')!, + }, + }, + ], + fields: { + id: { + type: 'ID', + keyRaw: 'id', + }, + }, + }, + }, + data: { + newUser: { + id: '2', + }, + }, + }) + // make sure we removed the element from the list expect([...cache.list('All_Users', cache.id('User', '1')!)]).toHaveLength(0) }) @@ -2246,8 +2803,8 @@ test('delete operation', function () { const cache = new Cache(config) // create a list we will add to - cache.write( - { + cache.write({ + selection: { viewer: { type: 'User', keyRaw: 'viewer', @@ -2273,14 +2830,13 @@ test('delete operation', function () { }, }, }, - { + data: { viewer: { id: '1', friends: [{ id: '2', firstName: 'jane' }], }, }, - {} - ) + }) // subscribe to the data to register the 
list cache.subscribe( @@ -2290,7 +2846,11 @@ test('delete operation', function () { friends: { type: 'User', keyRaw: 'friends', - list: 'All_Users', + list: { + name: 'All_Users', + connection: false, + type: 'User', + }, fields: { id: { type: 'ID', @@ -2311,8 +2871,8 @@ test('delete operation', function () { // write some data to a different location with a new user // that should be added to the list - cache.write( - { + cache.write({ + selection: { deleteUser: { type: 'User', keyRaw: 'deleteUser', @@ -2330,13 +2890,12 @@ test('delete operation', function () { }, }, }, - { + data: { deleteUser: { id: '2', }, }, - {} - ) + }) // make sure we removed the element from the list expect([...cache.list('All_Users', cache.id('User', '1')!)]).toHaveLength(0) @@ -2360,7 +2919,11 @@ test('variables in query and subscription', function () { friends: { type: 'User', keyRaw: 'friends(filter: $filter)', - list: 'All_Users', + list: { + name: 'All_Users', + connection: false, + type: 'User', + }, fields: { id: { type: 'ID', @@ -2377,9 +2940,9 @@ test('variables in query and subscription', function () { } // start off associated with one object - cache.write( + cache.write({ selection, - { + data: { viewer: { id: '1', friends: [ @@ -2394,10 +2957,10 @@ test('variables in query and subscription', function () { ], }, }, - { + variables: { filter: 'foo', - } - ) + }, + }) // a function to spy on that will play the role of set const set = jest.fn() @@ -2419,9 +2982,9 @@ test('variables in query and subscription', function () { expect(cache.list('All_Users').key).toEqual('friends(filter: "foo")') // somehow write a user to the cache with a new friends list - cache.write( + cache.write({ selection, - { + data: { viewer: { id: '1', friends: [ @@ -2431,10 +2994,10 @@ test('variables in query and subscription', function () { ], }, }, - { + variables: { filter: 'foo', - } - ) + }, + }) // make sure that set got called with the full response expect(set).toHaveBeenCalledWith({ @@ -2471,7 +3034,11 @@ test('deleting a node removes nested subscriptions', function () { friends: { type: 'User', keyRaw: 'friends', - list: 'All_Users', + list: { + name: 'All_Users', + connection: false, + type: 'User', + }, fields: { id: { type: 'ID', @@ -2488,15 +3055,18 @@ test('deleting a node removes nested subscriptions', function () { } // start off associated with one object - cache.write(selection, { - viewer: { - id: '1', - friends: [ - { - id: '2', - firstName: 'jane', - }, - ], + cache.write({ + selection, + data: { + viewer: { + id: '1', + friends: [ + { + id: '2', + firstName: 'jane', + }, + ], + }, }, }) @@ -2514,7 +3084,7 @@ test('deleting a node removes nested subscriptions', function () { expect(cache.internal.getRecord('User:2')?.getSubscribers('firstName')).toHaveLength(1) // delete the parent - cache.delete('User:1') + cache.delete('User', 'User:1') // sanity check expect(cache.internal.getRecord('User:2')?.getSubscribers('firstName')).toHaveLength(0) @@ -2540,7 +3110,11 @@ test('same record twice in a query survives one unsubscribe (reference counting) friends: { type: 'User', keyRaw: 'friends', - list: 'All_Users', + list: { + name: 'All_Users', + connection: false, + type: 'User', + }, fields: { id: { type: 'ID', @@ -2557,9 +3131,9 @@ test('same record twice in a query survives one unsubscribe (reference counting) } // start off associated with one object - cache.write( + cache.write({ selection, - { + data: { viewer: { id: '1', firstName: 'bob', @@ -2571,10 +3145,10 @@ test('same record twice in a query survives one 
unsubscribe (reference counting) ], }, }, - { + variables: { filter: 'foo', - } - ) + }, + }) // a function to spy on that will play the role of set const set = jest.fn() @@ -2645,24 +3219,27 @@ test('embedded references', function () { } // write an embedded list of embedded objects holding references to an object - cache.write(selection, { - viewer: { - id: '1', - friends: { - edges: [ - { - node: { - id: '2', - firstName: 'jane', + cache.write({ + selection, + data: { + viewer: { + id: '1', + friends: { + edges: [ + { + node: { + id: '2', + firstName: 'jane', + }, }, - }, - { - node: { - id: '3', - firstName: 'mary', + { + node: { + id: '3', + firstName: 'mary', + }, }, - }, - ], + ], + }, }, }, }) @@ -2683,8 +3260,8 @@ test('embedded references', function () { ) // update one of the embedded references - cache.write( - { + cache.write({ + selection: { user: { type: 'User', keyRaw: 'user', @@ -2700,13 +3277,13 @@ test('embedded references', function () { }, }, }, - { + data: { user: { id: '2', firstName: 'not-jane', }, - } - ) + }, + }) // make sure we got the updated data expect(set).toHaveBeenCalledWith({ @@ -2781,8 +3358,8 @@ test('writing abstract objects', function () { firstName: 'bob', }, } - cache.write( - { + cache.write({ + selection: { viewer: { type: 'Node', abstract: true, @@ -2804,8 +3381,7 @@ test('writing abstract objects', function () { }, }, data, - {} - ) + }) // make sure we can get back what we wrote expect(cache.internal.getRecord(cache.id('User', data.viewer)!)?.fields).toEqual({ @@ -2834,8 +3410,8 @@ test('writing abstract lists', function () { }, ], } - cache.write( - { + cache.write({ + selection: { nodes: { type: 'Node', abstract: true, @@ -2857,8 +3433,7 @@ test('writing abstract lists', function () { }, }, data, - {} - ) + }) // make sure we can get back what we wrote expect(cache.internal.getRecord(cache.id('User', data.nodes[0])!)?.fields).toEqual({ @@ -2899,7 +3474,7 @@ test('extracting data with custom scalars unmarshals the value', () => { } // write the data to cache - cache.write(selection, data, {}) + cache.write({ selection, data }) // pull the data out of the cache expect(cache.internal.getData(cache.internal.record(rootID), selection, {})).toEqual({ @@ -2930,29 +3505,529 @@ test('can pull enum from cached values', function () { }, }, }, - } + } + + // save the data + const data = { + node: { + id: '1', + enumValue: 'Hello', + }, + } + + // write the data to cache + cache.write({ selection, data }) + + // pull the data out of the cache + expect(cache.internal.getData(cache.internal.record(rootID), selection, {})).toEqual({ + node: { + id: '1', + enumValue: 'Hello', + }, + }) +}) + +test('can store and retrieve lists with null values', function () { + // instantiate the cache + const cache = new Cache(config) + + const selection = { + viewer: { + type: 'User', + keyRaw: 'viewer', + fields: { + id: { + type: 'ID', + keyRaw: 'id', + }, + firstName: { + type: 'String', + keyRaw: 'firstName', + }, + friends: { + type: 'User', + keyRaw: 'friends', + fields: { + id: { + type: 'ID', + keyRaw: 'id', + }, + firstName: { + type: 'String', + keyRaw: 'firstName', + }, + }, + }, + }, + }, + } + + // add some data to the cache + cache.write({ + selection, + data: { + viewer: { + id: '1', + firstName: 'bob', + friends: [ + { + id: '2', + firstName: 'jane', + }, + null, + ], + }, + }, + }) + + // make sure we can get the linked lists back + expect(cache.internal.getData(cache.internal.record(rootID), selection, {})).toEqual({ + viewer: { + id: '1', + firstName: 
'bob', + friends: [ + { + id: '2', + firstName: 'jane', + }, + null, + ], + }, + }) +}) + +test('can store and retrieve links with null values', function () { + // instantiate the cache + const cache = new Cache(config) + + const selection = { + viewer: { + type: 'User', + keyRaw: 'viewer', + fields: { + id: { + type: 'ID', + keyRaw: 'id', + }, + firstName: { + type: 'String', + keyRaw: 'firstName', + }, + friends: { + type: 'User', + keyRaw: 'friends', + fields: { + id: { + type: 'ID', + keyRaw: 'id', + }, + firstName: { + type: 'String', + keyRaw: 'firstName', + }, + }, + }, + }, + }, + } + + // add some data to the cache + cache.write({ + selection, + data: { + viewer: null, + }, + }) + + // make sure we can get the linked record back + expect(cache.internal.getData(cache.internal.record(rootID), selection, {})).toEqual({ + viewer: null, + }) +}) + +test('can write list of just null', function () { + // instantiate the cache + const cache = new Cache(config) + + const selection = { + viewer: { + type: 'User', + keyRaw: 'viewer', + fields: { + id: { + type: 'ID', + keyRaw: 'id', + }, + firstName: { + type: 'String', + keyRaw: 'firstName', + }, + friends: { + type: 'User', + keyRaw: 'friends', + fields: { + id: { + type: 'ID', + keyRaw: 'id', + }, + firstName: { + type: 'String', + keyRaw: 'firstName', + }, + }, + }, + }, + }, + } + + // add some data to the cache + cache.write({ + selection, + data: { + viewer: { + id: '1', + firstName: 'bob', + friends: [null], + }, + }, + }) + + // make sure we can get the linked lists back + expect(cache.internal.getData(cache.internal.record(rootID), selection, {})).toEqual({ + viewer: { + id: '1', + firstName: 'bob', + friends: [null], + }, + }) +}) + +test('can write list of scalars', function () { + // instantiate the cache + const cache = new Cache(config) + + const selection = { + viewer: { + type: 'User', + keyRaw: 'viewer', + fields: { + id: { + type: 'ID', + keyRaw: 'id', + }, + firstName: { + type: 'String', + keyRaw: 'firstName', + }, + friends: { + type: 'Int', + keyRaw: 'friends', + }, + }, + }, + } + + // add some data to the cache + cache.write({ + selection, + data: { + viewer: { + id: '1', + firstName: 'bob', + friends: [1], + }, + }, + }) + + // make sure we can get the linked lists back + expect(cache.internal.getData(cache.internal.record(rootID), selection, {})).toEqual({ + viewer: { + id: '1', + firstName: 'bob', + friends: [1], + }, + }) +}) + +test('writing a scalar marked with a disabled update overwrites', function () { + // instantiate the cache + const cache = new Cache(config) + + const selection = { + viewer: { + type: 'User', + keyRaw: 'viewer', + fields: { + id: { + type: 'ID', + keyRaw: 'id', + }, + firstName: { + type: 'String', + keyRaw: 'firstName', + }, + friends: { + type: 'Int', + keyRaw: 'friends', + update: 'append', + }, + }, + }, + } as SubscriptionSelection + + // add some data to the cache + cache.write({ + selection, + data: { + viewer: { + id: '1', + firstName: 'bob', + friends: [1], + }, + }, + }) + + // make sure we can get the linked lists back + expect(cache.internal.getData(cache.internal.record(rootID), selection, {})).toEqual({ + viewer: { + id: '1', + firstName: 'bob', + friends: [1], + }, + }) + + // add some data to the cache + cache.write({ + selection, + data: { + viewer: { + id: '1', + firstName: 'bob', + friends: [2], + }, + }, + }) + + // make sure we can get the updated lists back + expect(cache.internal.getData(cache.internal.record(rootID), selection, {})).toEqual({ + viewer: { + id: 
'1', + firstName: 'bob', + friends: [2], + }, + }) +}) + +test('writing a scalar marked with a prepend', function () { + // instantiate the cache + const cache = new Cache(config) + + const selection = { + viewer: { + type: 'User', + keyRaw: 'viewer', + fields: { + id: { + type: 'ID', + keyRaw: 'id', + }, + firstName: { + type: 'String', + keyRaw: 'firstName', + }, + friends: { + type: 'Int', + keyRaw: 'friends', + update: 'prepend', + }, + }, + }, + } as SubscriptionSelection + + // add some data to the cache + cache.write({ + selection, + data: { + viewer: { + id: '1', + firstName: 'bob', + friends: [1], + }, + }, + }) + + // make sure we can get the linked lists back + expect(cache.internal.getData(cache.internal.record(rootID), selection, {})).toEqual({ + viewer: { + id: '1', + firstName: 'bob', + friends: [1], + }, + }) + + // add some data to the cache + cache.write({ + selection, + data: { + viewer: { + id: '1', + firstName: 'bob', + friends: [2], + }, + }, + applyUpdates: true, + }) + + // make sure we can get the updated lists back + expect(cache.internal.getData(cache.internal.record(rootID), selection, {})).toEqual({ + viewer: { + id: '1', + firstName: 'bob', + friends: [2, 1], + }, + }) +}) + +test('writing a scalar marked with an append', function () { + // instantiate the cache + const cache = new Cache(config) + + const selection = { + viewer: { + type: 'User', + keyRaw: 'viewer', + fields: { + id: { + type: 'ID', + keyRaw: 'id', + }, + firstName: { + type: 'String', + keyRaw: 'firstName', + }, + friends: { + type: 'Int', + keyRaw: 'friends', + update: 'append', + }, + }, + }, + } as SubscriptionSelection + + // add some data to the cache + cache.write({ + selection, + data: { + viewer: { + id: '1', + firstName: 'bob', + friends: [1], + }, + }, + }) + + // make sure we can get the linked lists back + expect(cache.internal.getData(cache.internal.record(rootID), selection, {})).toEqual({ + viewer: { + id: '1', + firstName: 'bob', + friends: [1], + }, + }) + + // add some data to the cache + cache.write({ + selection, + data: { + viewer: { + id: '1', + firstName: 'bob', + friends: [2], + }, + }, + applyUpdates: true, + }) + + // make sure we can get the updated lists back + expect(cache.internal.getData(cache.internal.record(rootID), selection, {})).toEqual({ + viewer: { + id: '1', + firstName: 'bob', + friends: [1, 2], + }, + }) +}) + +test('writing a scalar marked with replace', function () { + // instantiate the cache + const cache = new Cache(config) + + const selection = { + viewer: { + type: 'User', + keyRaw: 'viewer', + fields: { + id: { + type: 'ID', + keyRaw: 'id', + }, + firstName: { + type: 'String', + keyRaw: 'firstName', + }, + friends: { + type: 'Int', + keyRaw: 'friends', + update: 'append', + }, + }, + }, + } as SubscriptionSelection + + // add some data to the cache + cache.write({ + selection, + data: { + viewer: { + id: '1', + firstName: 'bob', + friends: [1], + }, + }, + }) - // save the data - const data = { - node: { + // make sure we can get the linked lists back + expect(cache.internal.getData(cache.internal.record(rootID), selection, {})).toEqual({ + viewer: { id: '1', - enumValue: 'Hello', + firstName: 'bob', + friends: [1], }, - } + }) - // write the data to cache - cache.write(selection, data, {}) + // add some data to the cache + cache.write({ + selection, + data: { + viewer: { + id: '1', + firstName: 'bob', + friends: [2], + }, + }, + }) - // pull the data out of the cache + // make sure we can get the updated lists back 
expect(cache.internal.getData(cache.internal.record(rootID), selection, {})).toEqual({ - node: { + viewer: { id: '1', - enumValue: 'Hello', + firstName: 'bob', + friends: [2], }, }) }) -test('can store and retrieve lists with null values', function () { +test('disabled linked lists update', function () { // instantiate the cache const cache = new Cache(config) @@ -2972,6 +4047,7 @@ test('can store and retrieve lists with null values', function () { friends: { type: 'User', keyRaw: 'friends', + update: 'append', fields: { id: { type: 'ID', @@ -2985,12 +4061,12 @@ test('can store and retrieve lists with null values', function () { }, }, }, - } + } as SubscriptionSelection // add some data to the cache - cache.write( + cache.write({ selection, - { + data: { viewer: { id: '1', firstName: 'bob', @@ -2999,30 +4075,72 @@ test('can store and retrieve lists with null values', function () { id: '2', firstName: 'jane', }, - null, + { + id: '3', + firstName: 'mary', + }, ], }, }, - {} - ) + }) // make sure we can get the linked lists back - expect(cache.internal.getData(cache.internal.record(rootID), selection, {})).toEqual({ - viewer: { - id: '1', - firstName: 'bob', - friends: [ - { - id: '2', - firstName: 'jane', - }, - null, - ], + expect( + cache.internal + .getRecord(cache.id('User', '1')!) + ?.linkedList('friends') + .map((data) => data!.fields) + ).toEqual([ + { + id: '2', + firstName: 'jane', + }, + { + id: '3', + firstName: 'mary', + }, + ]) + + // add some data to the cache + cache.write({ + selection, + data: { + viewer: { + id: '1', + firstName: 'bob', + friends: [ + { + id: '3', + firstName: 'jane', + }, + { + id: '4', + firstName: 'mary', + }, + ], + }, }, }) + + // make sure we can get the linked lists back + expect( + cache.internal + .getRecord(cache.id('User', '1')!) + ?.linkedList('friends') + .map((data) => data!.fields) + ).toEqual([ + { + id: '3', + firstName: 'jane', + }, + { + id: '4', + firstName: 'mary', + }, + ]) }) -test('can store and retrieve links with null values', function () { +test('append linked lists update', function () { // instantiate the cache const cache = new Cache(config) @@ -3042,6 +4160,7 @@ test('can store and retrieve links with null values', function () { friends: { type: 'User', keyRaw: 'friends', + update: 'append', fields: { id: { type: 'ID', @@ -3055,24 +4174,95 @@ test('can store and retrieve links with null values', function () { }, }, }, - } + } as SubscriptionSelection // add some data to the cache - cache.write( + cache.write({ selection, + data: { + viewer: { + id: '1', + firstName: 'bob', + friends: [ + { + id: '2', + firstName: 'jane', + }, + { + id: '3', + firstName: 'mary', + }, + ], + }, + }, + }) + + // make sure we can get the linked lists back + expect( + cache.internal + .getRecord(cache.id('User', '1')!) + ?.linkedList('friends') + .map((data) => data!.fields) + ).toEqual([ { - viewer: null, + id: '2', + firstName: 'jane', }, - {} - ) + { + id: '3', + firstName: 'mary', + }, + ]) - // make sure we can get the linked record back - expect(cache.internal.getData(cache.internal.record(rootID), selection, {})).toEqual({ - viewer: null, + // add some data to the cache + cache.write({ + selection, + data: { + viewer: { + id: '1', + firstName: 'bob', + friends: [ + { + id: '4', + firstName: 'jane', + }, + { + id: '5', + firstName: 'mary', + }, + ], + }, + }, + applyUpdates: true, }) + + // make sure we can get the linked lists back + expect( + cache.internal + .getRecord(cache.id('User', '1')!) 
+ ?.linkedList('friends') + .map((data) => data!.fields) + ).toEqual([ + { + id: '2', + firstName: 'jane', + }, + { + id: '3', + firstName: 'mary', + }, + { + id: '4', + firstName: 'jane', + }, + { + id: '5', + firstName: 'mary', + }, + ]) }) -test('can write list of just null', function () { +test('prepend linked lists update', function () { // instantiate the cache const cache = new Cache(config) @@ -3092,6 +4282,7 @@ test('can write list of just null', function () { friends: { type: 'User', keyRaw: 'friends', + update: 'prepend', fields: { id: { type: 'ID', @@ -3105,77 +4296,93 @@ test('can write list of just null', function () { }, }, }, - } + } as SubscriptionSelection // add some data to the cache - cache.write( + cache.write({ selection, - { + data: { viewer: { id: '1', firstName: 'bob', - friends: [null], + friends: [ + { + id: '2', + firstName: 'jane', + }, + { + id: '3', + firstName: 'mary', + }, + ], }, }, - {} - ) + applyUpdates: true, + }) // make sure we can get the linked lists back - expect(cache.internal.getData(cache.internal.record(rootID), selection, {})).toEqual({ - viewer: { - id: '1', - firstName: 'bob', - friends: [null], + expect( + cache.internal + .getRecord(cache.id('User', '1')!) + ?.linkedList('friends') + .map((data) => data!.fields) + ).toEqual([ + { + id: '2', + firstName: 'jane', }, - }) -}) - -test('can write list of scalars', function () { - // instantiate the cache - const cache = new Cache(config) - - const selection = { - viewer: { - type: 'User', - keyRaw: 'viewer', - fields: { - id: { - type: 'ID', - keyRaw: 'id', - }, - firstName: { - type: 'String', - keyRaw: 'firstName', - }, - friends: { - type: 'Int', - keyRaw: 'friends', - }, - }, + { + id: '3', + firstName: 'mary', }, - } + ]) // add some data to the cache - cache.write( + cache.write({ selection, - { + data: { viewer: { id: '1', firstName: 'bob', - friends: [1], + friends: [ + { + id: '4', + firstName: 'jane', + }, + { + id: '5', + firstName: 'mary', + }, + ], }, }, - {} - ) + applyUpdates: true, + }) // make sure we can get the linked lists back - expect(cache.internal.getData(cache.internal.record(rootID), selection, {})).toEqual({ - viewer: { - id: '1', - firstName: 'bob', - friends: [1], + expect( + cache.internal + .getRecord(cache.id('User', '1')!) 
+ ?.linkedList('friends') + .map((data) => data!.fields) + ).toEqual([ + { + id: '4', + firstName: 'jane', }, - }) + { + id: '5', + firstName: 'mary', + }, + { + id: '2', + firstName: 'jane', + }, + { + id: '3', + firstName: 'mary', + }, + ]) }) test('self-referencing linked lists can be unsubscribed (avoid infinite recursion)', function () { @@ -3228,9 +4435,9 @@ test('self-referencing linked lists can be unsubscribed (avoid infinite recursio } // add some data to the cache - cache.write( + cache.write({ selection, - { + data: { viewer: { id: '1', firstName: 'bob', @@ -3248,8 +4455,7 @@ test('self-referencing linked lists can be unsubscribed (avoid infinite recursio ], }, }, - {} - ) + }) // subscribe to the list const spec = { @@ -3330,9 +4536,9 @@ test('self-referencing links can be unsubscribed (avoid infinite recursion)', fu } // add some data to the cache - cache.write( + cache.write({ selection, - { + data: { viewer: { id: '1', firstName: 'bob', @@ -3350,8 +4556,7 @@ test('self-referencing links can be unsubscribed (avoid infinite recursion)', fu }, }, }, - {} - ) + }) // subscribe to the list const spec = { diff --git a/packages/houdini/runtime/cache/cache.ts b/packages/houdini/runtime/cache/cache.ts index 6f2290aa5..fc604d217 100644 --- a/packages/houdini/runtime/cache/cache.ts +++ b/packages/houdini/runtime/cache/cache.ts @@ -25,22 +25,47 @@ export class Cache { private _lists: Map> = new Map() // save the response in the local store and notify any subscribers - write( - selection: SubscriptionSelection, - data: { [key: string]: GraphQLValue }, - variables: {} = {}, - id?: string - ) { + write({ + selection, + data, + variables = {}, + parent = rootID, + applyUpdates = false, + }: { + selection: SubscriptionSelection + data: { [key: string]: GraphQLValue } + variables?: {} + parent?: string + applyUpdates?: boolean + }) { const specs: SubscriptionSpec[] = [] - const parentID = id || rootID - // recursively walk down the payload and update the store. calls to update atomic fields // will build up different specs of subscriptions that need to be run against the current state - this._write(parentID, parentID, selection, parentID, data, variables, specs) + this._write({ + rootID: parent, + selection, + recordID: parent, + data, + variables, + specs, + applyUpdates, + }) + + // the same spec will likely need to be updated multiple times, create the unique list by using the set + // function's identity + const uniqueSpecs: SubscriptionSpec[] = [] + const assignedSets: SubscriptionSpec['set'][] = [] + for (const spec of specs) { + // if we haven't added the set yet + if (!assignedSets.includes(spec.set)) { + uniqueSpecs.push(spec) + assignedSets.push(spec.set) + } + } // compute new values for every spec that needs to be run - this.notifySubscribers(specs, variables) + this.notifySubscribers(uniqueSpecs, variables) } // returns the global id of the specified field (used to access the record in the cache) @@ -50,7 +75,7 @@ export class Cache { id(type: string, id: string): string | null id(type: string, data: any): string | null { // try to compute the id of the record - const id = typeof data === 'string' ? data : this.computeID(data) + const id = typeof data === 'string' ? 
data : this.computeID(type, data) if (!id) { return null } @@ -101,22 +126,27 @@ export class Cache { } // remove the record from every list we know of and the cache itself - delete(id: string, variables: {} = {}): boolean { + delete(type: string, id: string, variables: {} = {}): boolean { const record = this.record(id) // remove any related subscriptions record.removeAllSubscribers() - for (const { name, parentID } of record.lists) { - // look up the list - const list = this.list(name, parentID) + // look at every list we know of with the matching type + for (const parentMap of this._lists.values()) { + for (const handler of parentMap.values()) { + // only consider the list if it holds the matching type + if (handler.listType !== type) { + continue + } - // remove the entity from the list - list.removeID(id, variables) + // remove the id from the list + handler.removeID(id, variables) + } } // remove the entry from the cache - return this._data.delete(id) + return this.clear(id) } // grab the record specified by {id}. @@ -141,10 +171,12 @@ export class Cache { record: this.record.bind(this), getRecord: this.getRecord.bind(this), getData: this.getData.bind(this), + clear: this.clear.bind(this), + computeID: this.computeID.bind(this), } } - private computeID(data: { [key: string]: GraphQLValue }) { + private computeID(type: string, data: { [key: string]: GraphQLValue }) { return data.id } @@ -236,19 +268,20 @@ export class Cache { // if this field is marked as a list, register it if (list && fields) { // if we haven't seen this list before - if (!this._lists.has(list)) { - this._lists.set(list, new Map()) + if (!this._lists.has(list.name)) { + this._lists.set(list.name, new Map()) } // if we haven't already registered a handler to this list in the cache - this._lists.get(list)?.set( + this._lists.get(list.name)?.set( spec.parentID || rootID, new ListHandler({ - name: list, + name: list.name, + connection: list.connection, parentID: spec.parentID, cache: this, record: rootRecord, - listType: type, + listType: list.type, key, selection: fields, filters: Object.entries(filters || {}).reduce( @@ -276,15 +309,6 @@ export class Cache { continue } - // the children of a list need the reference back - if (list) { - // add the list reference to record - child.addListReference({ - name: list, - parentID: spec.parentID, - }) - } - // make sure the children update this subscription this.addSubscribers(child, spec, fields, variables) } @@ -305,13 +329,9 @@ export class Cache { // remove the subscriber to the field rootRecord.forgetSubscribers(spec) - // if this field is marked as a list remove it from teh cache + // if this field is marked as a list remove it from the cache if (list) { - this._lists.delete(list) - rootRecord.removeListReference({ - name: list, - parentID: spec.parentID, - }) + this._lists.delete(list.name) } // if the field points to a link, we need to remove any subscribers on any fields of that @@ -338,15 +358,23 @@ export class Cache { } } - private _write( - rootID: string, // the ID that anchors any lists - parentID: string, // the ID that can be used to build up the key for embedded data - selection: SubscriptionSelection, - recordID: string, // the ID of the record that we are updating in cache - data: { [key: string]: GraphQLValue }, - variables: { [key: string]: GraphQLValue }, + private _write({ + rootID, + selection, + recordID, + data, + variables, + specs, + applyUpdates, + }: { + rootID: string // the ID that anchors any lists + selection: SubscriptionSelection + recordID: 
string // the ID of the record that we are updating in cache + data: { [key: string]: GraphQLValue } + variables: { [key: string]: GraphQLValue } specs: SubscriptionSpec[] - ) { + applyUpdates: boolean + }) { // the record we are storing information about this object const record = this.record(recordID) @@ -361,14 +389,15 @@ export class Cache { '' ) } + // look up the field in our schema let { type: linkedType, keyRaw, fields, operations, - list, abstract: isAbstract, + update, } = selection[field] const key = this.evaluateKey(keyRaw, variables) @@ -412,7 +441,7 @@ export class Cache { ).length > 0 // figure out the id of the new linked record - const linkedID = !embedded ? this.id(linkedType, value) : `${parentID}.${key}` + const linkedID = !embedded ? this.id(linkedType, value) : `${recordID}.${key}` // if we are now linked to a new object we need to record the new value if (linkedID && oldID !== linkedID) { @@ -432,32 +461,42 @@ export class Cache { // only update the data if there is an id for the record if (linkedID) { // update the linked fields too - this._write(rootID, recordID, fields, linkedID, value, variables, specs) + this._write({ + rootID, + selection: fields, + recordID: linkedID, + data: value, + variables, + specs, + applyUpdates, + }) } } // the value could be a list else if (!isScalar(this._config, linkedType) && Array.isArray(value) && fields) { - // build up the list of linked ids - const linkedIDs: (string | null)[] = [] // look up the current known link id - const oldIDs = record.linkedListIDs(this.evaluateKey(key, variables)) + let oldIDs = record.linkedListIDs(this.evaluateKey(key, variables)) + + // if we are supposed to prepend or append and the update is enabled, + // the new list of IDs for this link will start with the existing values - // the ids that have been added since the last time - const newIDs: string[] = [] + // build up the list of linked ids + let linkedIDs: (string | null)[] = [] - // figure out if this is an embedded list or a linked one by looking for all of the fields marked as - // required to compute the entity's id in the first non-null value we can find + // keep track of the records we are adding + const newIDs: (string | null)[] = [] // visit every entry in the list for (const [i, entry] of value.entries()) { // if the entry is a null value, just add it to the list if (entry === null) { - linkedIDs.push(null) + newIDs.push(null) continue } - // figure out if this record is embedded + // figure out if this is an embedded list or a linked one by looking for all of the fields marked as + // required to compute the entity's id const embedded = this.idFields(linkedType)?.filter( (field) => typeof (entry as GraphQLObject)[field] === 'undefined' @@ -482,9 +521,29 @@ export class Cache { } // build up an id - const linkedID = !embedded + let linkedID = !embedded ? this.id(innerType, entry) - : `${parentID}.${key}[${i}]` + : `${recordID}.${key}[${i}]` + + // if the field is marked for pagination and we are looking at edges, we need + // to use the underlying node for the id because the embedded key will conflict + // with entries in the previously loaded value. + // NOTE: this approach might cause weird behavior if a node is loaded in the same + // location in two different pages. In practice, nodes rarely show up twice in the same + // connection so it might not be a problem.
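+ // for example: an edge embedded at "User:1.friends.edges[0]" whose node is "User:2" + // would be keyed "User:1.friends.edges[0]#User:2", so the edge a later page writes at + // index 0 gets a different key (the exact key shapes here are illustrative)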
+ if ( + key === 'edges' && + entry['node'] && + (entry['node'] as { __typename: string }).__typename + ) { + const node = entry['node'] as {} + // @ts-ignore + const typename = node.__typename + let nodeID = this.id(typename, node) + if (nodeID) { + linkedID += '#' + nodeID + } + } // if we couldn't compute the id, just move on if (!linkedID) { @@ -492,21 +551,88 @@ // update the linked fields too - this._write(rootID, recordID, fields, linkedID, entry, variables, specs) + this._write({ + rootID, + selection: fields, + recordID: linkedID, + data: entry, + variables, + specs, + applyUpdates, + }) // add the id to the list - linkedIDs.push(linkedID) - // hold onto the new ids - if (!oldIDs.includes(linkedID)) { - newIDs.push(linkedID) - - if (list) { - this.record(linkedID).addListReference({ - parentID: rootID, - name: list, - }) + newIDs.push(linkedID) + } + + // if we're supposed to apply this write as an update, we need to figure out how + if (applyUpdates && update) { + // it's possible that one of the ids in the field corresponds to an entry + // that was added as part of a mutation operation on this list. + // ideally we want to remove the old reference and leave the new one behind. + // In order to pull this off, we have to rely on the fact that a mutation operation + // doesn't leave a cursor behind, so we need to look at the old list of edges, + // track if there's a cursor value, get their node id, and remove any node ids + // that show up in the new list + if (key === 'edges') { + // build up a list of the ids found in the new list + const newNodeIDs: string[] = [] + for (const id of newIDs) { + if (!id) { + continue + } + + // look up the linked node record + const node = this.record(id).linkedRecord('node') + if (!node || !node.fields.__typename) { + continue + } + + newNodeIDs.push(node.id) } + + // filter out any old ids that point to edges with no cursor and a node that is found in the new list + oldIDs = oldIDs.filter((id) => { + if (!id) { + return true + } + + // look up the edge record + const edge = this.record(id) + + // if there is a cursor, keep it + if (edge.fields['cursor']) { + return true + } + + // look up the linked node + const node = edge.linkedRecord('node') + // if there isn't one, keep the edge + if (!node) { + return true + } + + // only keep the edge if the node's id doesn't show up in the new list + return !newNodeIDs.includes(node.id) + }) + } + + // if we have to prepend it, do so + if (update === 'prepend') { + linkedIDs = newIDs.concat(oldIDs) + } + // otherwise we might have to append it + else if (update === 'append') { + linkedIDs = oldIDs.concat(newIDs) } + // if the update is a replace, just use the new list + else if (update === 'replace') { + linkedIDs = newIDs + } + } + // we're not supposed to apply this write as an update, just use the new value + else { + linkedIDs = newIDs } // we have to notify the subscribers if a few things happen: @@ -553,9 +679,21 @@ // the value is neither an object or a list so its a scalar else { // if the value is different - if (value !== record.getField(key)) { + if (JSON.stringify(value) !== JSON.stringify(record.getField(key))) { + let newValue = value + // if the value is an array, we might have to apply updates + if (Array.isArray(value) && applyUpdates && update) { + // if we have to append the new value onto the old one + if (update === 'append') { + newValue = ((record.getField(key) as any[]) || []).concat(value) + } + // we might have to prepend our value onto the old
one + else if (update === 'prepend') { + newValue = value.concat(record.getField(key) || []) + } + } // update the cached value - record.writeField(key, value) + record.writeField(key, newValue) // add every subscriber to the list of specs to change specs.push(...subscribers) @@ -571,12 +709,12 @@ export class Cache { if (operation.parentID.kind !== 'Variable') { parentID = operation.parentID.value } else { - const value = variables[operation.parentID.value] - if (typeof value !== 'string') { + const id = variables[operation.parentID.value] + if (typeof id !== 'string') { throw new Error('parentID value must be a string') } - parentID = value + parentID = id } } @@ -616,8 +754,7 @@ export class Cache { if (!targetID) { continue } - - this.delete(targetID, variables) + this.delete(operation.type, targetID, variables) } } } @@ -694,7 +831,10 @@ export class Cache { // if there are fields under this if (fields) { // figure out who else needs subscribers - const children = record.linkedList(key) || [record.linkedRecord(key)] + const children = + record.linkedList(key).length > 0 + ? record.linkedList(key) + : [record.linkedRecord(key)] for (const linkedRecord of children) { // avoid null records @@ -757,6 +897,10 @@ export class Cache { return evaluated } + + private clear(id: string): boolean { + return this._data.delete(id) + } } // the list of characters that make up a valid graphql variable name @@ -772,6 +916,8 @@ export type CacheProxy = { evaluateKey: Cache['evaluateKey'] getRecord: Cache['getRecord'] getData: Cache['getData'] + clear: Cache['clear'] + computeID: Cache['computeID'] } // id that we should use to refer to things in root diff --git a/packages/houdini/runtime/cache/index.ts b/packages/houdini/runtime/cache/index.ts index e63b73f65..58992bdc8 100644 --- a/packages/houdini/runtime/cache/index.ts +++ b/packages/houdini/runtime/cache/index.ts @@ -1,4 +1,4 @@ import { Cache } from './cache' // @ts-ignore: config will be defined by the generator -export default new Cache(config) +export default new Cache(config || {}) diff --git a/packages/houdini/runtime/cache/list.ts b/packages/houdini/runtime/cache/list.ts index 23fe3eb3e..4ffc7a3f8 100644 --- a/packages/houdini/runtime/cache/list.ts +++ b/packages/houdini/runtime/cache/list.ts @@ -1,5 +1,5 @@ // local imports -import { SubscriptionSelection, ListWhen, SubscriptionSpec } from '../types' +import { SubscriptionSelection, ListWhen, SubscriptionSpec, RefetchUpdateMode } from '../types' import { Cache } from './cache' import { Record } from './record' @@ -13,6 +13,7 @@ export class ListHandler { private filters?: { [key: string]: number | boolean | string } readonly name: string readonly parentID: SubscriptionSpec['parentID'] + private connection: boolean constructor({ name, @@ -24,8 +25,10 @@ export class ListHandler { when, filters, parentID, + connection, }: { name: string + connection: boolean cache: Cache record: Record key: string @@ -44,6 +47,7 @@ export class ListHandler { this.filters = filters this.name = name this.parentID = parentID + this.connection = connection } // when applies a when condition to a new list pointing to the same spot @@ -58,6 +62,7 @@ export class ListHandler { filters: this.filters, parentID: this.parentID, name: this.name, + connection: this.connection, }) } @@ -83,36 +88,84 @@ export class ListHandler { return } - // update the cache with the data we just found - this.cache.write(selection, data, variables, dataID) + // we are going to implement the insert as a write with an update flag on a field 
+ // that matches the key of the list. We'll have to embed the list's data and selection + // in the appropriate objects + let insertSelection = selection + let insertData = data - if (where === 'first') { - // add the record we just created to the list - this.record.prependLinkedList(this.key, dataID) + // if we are wrapping a connection, we have to embed the data under edges > node + if (this.connection) { + insertSelection = { + newEntry: { + keyRaw: this.key, + type: 'Connection', + fields: { + edges: { + keyRaw: 'edges', + type: 'ConnectionEdge', + update: (where === 'first' ? 'prepend' : 'append') as RefetchUpdateMode, + fields: { + node: { + type: this.listType, + keyRaw: 'node', + fields: { + ...selection, + __typename: { + keyRaw: '__typename', + type: 'String', + }, + }, + }, + }, + }, + }, + }, + } + insertData = { + newEntry: { + edges: [{ node: { ...data, __typename: this.listType } }], + }, + } } else { - // add the record we just created to the list - this.record.appendLinkedList(this.key, dataID) + insertSelection = { + newEntries: { + keyRaw: this.key, + type: this.listType, + update: (where === 'first' ? 'prepend' : 'append') as RefetchUpdateMode, + fields: { + ...selection, + __typename: { + keyRaw: '__typename', + type: 'String', + }, + }, + }, + } + insertData = { + newEntries: [{ ...data, __typename: this.listType }], + } } // get the list of specs that are subscribing to the list const subscribers = this.record.getSubscribers(this.key) - // notify the subscribers we care about - this.cache.internal.notifySubscribers(subscribers, variables) - // look up the new record in the cache const newRecord = this.cache.internal.record(dataID) - // add the list reference - newRecord.addListReference({ - parentID: this.parentID, - name: this.name, - }) - // walk down the list fields relative to the new record // and make sure all of the list's subscribers are listening // to that object - this.cache.internal.insertSubscribers(newRecord, this.selection, variables, ...subscribers) + this.cache.internal.insertSubscribers(newRecord, selection, variables, ...subscribers) + + // update the cache with the data we just found + this.cache.write({ + selection: insertSelection, + data: insertData, + variables, + parent: this.record.id, + applyUpdates: true, + }) removeID(id: string, variables: {} = {}) { @@ -121,22 +174,61 @@ if (!this.validateWhen()) { return } - // add the record we just created to the list - this.record.removeFromLinkedList(this.key, id) + // if we are removing from a connection, the id we are removing from + // has to be computed + let parentID = this.record.id + let targetID = id + let targetKey = this.key + + // if we are removing a record from a connection we have to walk through + // some embedded references first + if (this.connection) { + const embeddedConnection = this.record.linkedRecord(this.key) + if (!embeddedConnection) { + return + } + // look at every embedded edge for the one with a node corresponding to the element + // we want to delete + for (const edge of embeddedConnection.linkedList('edges') || []) { + if (!edge) { + continue + } + // look at the edge's node + const node = edge.linkedRecord('node') + if (!node) { + continue + } + // if we found the node + if (node.id === id) { + targetID = edge.id + } + } + parentID = embeddedConnection.id + targetKey = 'edges' + } // get the list of specs that are subscribing to the list const subscribers = this.record.getSubscribers(this.key) - // notify the subscribers about the change -
this.cache.internal.notifySubscribers(subscribers, variables) - // disconnect record from any subscriptions associated with the list this.cache.internal.unsubscribeSelection( - this.cache.internal.record(id), - this.selection, + this.cache.internal.record(targetID), + // if we're unsubscribing from a connection, only unsubscribe from the target + this.connection ? this.selection.edges.fields! : this.selection, variables, ...subscribers.map(({ set }) => set) ) + + // remove the target from the parent + this.cache.internal.record(parentID).removeFromLinkedList(targetKey, targetID) + + // notify the subscribers about the change + this.cache.internal.notifySubscribers(subscribers, variables) + + // if we are removing from a connection, delete the embedded edge holding the record + if (this.connection) { + this.cache.internal.clear(targetID) + } } remove(data: {}, variables: {} = {}) { diff --git a/packages/houdini/runtime/cache/record.ts b/packages/houdini/runtime/cache/record.ts index 9323078bb..12dca395a 100644 --- a/packages/houdini/runtime/cache/record.ts +++ b/packages/houdini/runtime/cache/record.ts @@ -13,7 +13,7 @@ export class Record { fields: { [key: string]: GraphQLValue } = {} keyVersions: { [key: string]: Set } = {} - id: string + readonly id: string private subscribers: { [key: string]: SubscriptionSpec[] } = {} private recordLinks: { [key: string]: string | null } = {} private listLinks: { [key: string]: (string | null)[] } = {} @@ -21,7 +21,6 @@ export class Record { private referenceCounts: { [fieldName: string]: Map } = {} - lists: List[] = [] constructor(cache: Cache, id: string) { this.cache = cache @@ -132,16 +131,6 @@ export class Record { this.forgetSubscribers(...this.allSubscribers()) } - addListReference(ref: List) { - this.lists.push(ref) - } - - removeListReference(ref: List) { - this.lists = this.lists.filter( - (conn) => !(conn.name === ref.name && conn.parentID === ref.parentID) - ) - } - removeAllSubscriptionVersions(keyRaw: string, spec: SubscriptionSpec) { // visit every version of the key we've seen and remove the spec from the list of subscribers const versions = this.keyVersions[keyRaw] diff --git a/packages/houdini/runtime/fragment.ts b/packages/houdini/runtime/fragment.ts index 9a0f74cb0..d39f3cf89 100644 --- a/packages/houdini/runtime/fragment.ts +++ b/packages/houdini/runtime/fragment.ts @@ -1,15 +1,13 @@ // externals import { readable, Readable } from 'svelte/store' import { onMount } from 'svelte' -import type { Config } from 'houdini-common' // locals import type { Fragment, FragmentArtifact, GraphQLTagResult, SubscriptionSpec } from './types' import cache from './cache' import { getVariables } from './context' -import { unmarshalSelection } from './scalars' // fragment returns the requested data from the reference -export default function fragment<_Fragment extends Fragment>( +export function fragment<_Fragment extends Fragment>( fragment: GraphQLTagResult, initialValue: _Fragment ): Readable<_Fragment['shape']> { @@ -28,13 +26,15 @@ export default function fragment<_Fragment extends Fragment>( // @ts-ignore: isn't properly typed yet to know if initialValue has the right values const parentID = cache.id(artifact.rootType, initialValue) + // a fragment has to subscribe individually because svelte can't detect that a prop has changed + // if there is an object passed + // wrap the result in a store we can use to keep this query up to date const value = readable(initialValue, (set) => { // if we couldn't compute the parent of the fragment if 
(!parentID) { return } - const subscriptionSpec = { rootType: artifact.rootType, selection: artifact.selection, @@ -42,13 +42,11 @@ export default function fragment<_Fragment extends Fragment>( parentID, variables: queryVariables, } - // when the component mounts onMount(() => { // stay up to date cache.subscribe(subscriptionSpec, queryVariables()) }) - // the function used to clean up the store return () => { // if we subscribed to something we'll need to clean up diff --git a/packages/houdini/runtime/index.ts b/packages/houdini/runtime/index.ts index 66207592c..f525bd588 100644 --- a/packages/houdini/runtime/index.ts +++ b/packages/houdini/runtime/index.ts @@ -3,10 +3,11 @@ import { GraphQLTagResult } from './types' export * from './network' export * from './types' -export { default as query, routeQuery, componentQuery } from './query' -export { default as mutation } from './mutation' -export { default as fragment } from './fragment' -export { default as subscription } from './subscription' +export { query, routeQuery, componentQuery } from './query' +export { mutation } from './mutation' +export { fragment } from './fragment' +export { subscription } from './subscription' +export { paginatedQuery } from './pagination' // this template tag gets removed by the preprocessor so it should never be invoked. // this function needs to return the same value as what the preprocessor leaves behind for type consistency diff --git a/packages/houdini/runtime/mutation.ts b/packages/houdini/runtime/mutation.ts index bfc3a5014..ceb6e3b50 100644 --- a/packages/houdini/runtime/mutation.ts +++ b/packages/houdini/runtime/mutation.ts @@ -12,7 +12,7 @@ import { getSession } from './adapter.mjs' // mutation returns a handler that will send the mutation to the server when // invoked -export default function mutation<_Mutation extends Operation>( +export function mutation<_Mutation extends Operation>( document: GraphQLTagResult ): (_input: _Mutation['input']) => Promise<_Mutation['result']> { // make sure we got a query document @@ -45,7 +45,11 @@ export default function mutation<_Mutation extends Operation>( sessionStore ) - cache.write(artifact.selection, result.data, queryVariables()) + cache.write({ + selection: artifact.selection, + data: result.data, + variables: queryVariables(), + }) // unmarshal any scalars on the body return unmarshalSelection(config, artifact.selection, result.data) diff --git a/packages/houdini/runtime/pagination.test.ts b/packages/houdini/runtime/pagination.test.ts new file mode 100644 index 000000000..a0ad44862 --- /dev/null +++ b/packages/houdini/runtime/pagination.test.ts @@ -0,0 +1,27 @@ +import { extractPageInfo } from './utils' + +test('can extract current page info', function () { + const data = { + user: { + friends: { + pageInfo: { + startCursor: '1', + endCursor: '2', + hasNextPage: true, + hasPreviousPage: false, + }, + edges: [ + { + node: { + id: '1', + }, + }, + ], + }, + }, + } + + const path = ['user', 'friends'] + + expect(extractPageInfo(data, path)).toEqual(data.user.friends.pageInfo) +}) diff --git a/packages/houdini/runtime/pagination.ts b/packages/houdini/runtime/pagination.ts new file mode 100644 index 000000000..d505cfc25 --- /dev/null +++ b/packages/houdini/runtime/pagination.ts @@ -0,0 +1,387 @@ +// externals +import { derived, readable, Readable, Writable, writable } from 'svelte/store' +// locals +import { + Operation, + GraphQLTagResult, + Fragment, + GraphQLObject, + QueryArtifact, + TaggedGraphqlQuery, + FragmentArtifact, +} from './types' +import { 
query, QueryResponse } from './query' +import { fragment } from './fragment' +import { getVariables } from './context' +import { executeQuery } from './network' +import cache from './cache' +// @ts-ignore: this file will get generated and does not exist in the source code +import { getSession } from './adapter.mjs' +// this has to be in a separate file since config isn't defined in cache/index.ts +import { extractPageInfo, PageInfo } from './utils' + +export function paginatedQuery<_Query extends Operation>( + document: GraphQLTagResult +): QueryResponse<_Query['result'], _Query['input']> & PaginatedHandlers { + // make sure we got a query document + if (document.kind !== 'HoudiniQuery') { + throw new Error('paginatedQuery() must be passed a query document') + } + + // @ts-ignore: typing esm/cjs interop is hard + const artifact: QueryArtifact = document.artifact.default || document.artifact + + // if there's no refetch config for the artifact there's a problem + if (!artifact.refetch) { + throw new Error('paginatedQuery must be passed a query with @paginate.') + } + + // pass the artifact to the base query operation + const { data, loading, ...restOfQueryResponse } = query(document) + + return { + data, + ...paginationHandlers({ + initialValue: document.initialValue.data, + store: data, + artifact, + queryVariables: document.variables, + documentLoading: loading, + }), + ...restOfQueryResponse, + } +} + +export function paginatedFragment<_Fragment extends Fragment>( + document: GraphQLTagResult, + initialValue: _Fragment +): { data: Readable<_Fragment['shape']> } & PaginatedHandlers { + // make sure we got a fragment document + if (document.kind !== 'HoudiniFragment') { + throw new Error('paginatedFragment() must be passed a fragment document') + } + // if we don't have a pagination fragment there is a problem + if (!document.paginationArtifact) { + throw new Error('paginatedFragment must be passed a fragment with @paginate') + } + + // pass the inputs to the normal fragment function + const data = fragment(document, initialValue) + + // @ts-ignore: typing esm/cjs interop is hard + const fragmentArtifact: FragmentArtifact = document.artifact.default || document.artifact + + const paginationArtifact: QueryArtifact = + // @ts-ignore: typing esm/cjs interop is hard + document.paginationArtifact.default || document.paginationArtifact + + return { + data, + ...paginationHandlers({ + initialValue, + store: data, + artifact: paginationArtifact, + queryVariables: paginationArtifact.refetch!.embedded + ?
{ id: cache.internal.computeID(fragmentArtifact.rootType, initialValue) } + : {}, + }), + } +} + +function paginationHandlers({ + initialValue, + artifact, + store, + queryVariables, + documentLoading, +}: { + initialValue: GraphQLObject + artifact: QueryArtifact + store: Readable + queryVariables?: {} + documentLoading?: Readable +}): PaginatedHandlers { + // start with the defaults and no meaningful page info + let loadPreviousPage = defaultLoadPreviousPage + let loadNextPage = defaultLoadNextPage + let pageInfo = readable( + { + startCursor: null, + endCursor: null, + hasNextPage: false, + hasPreviousPage: false, + }, + () => {} + ) + + let paginationLoadingState = writable(false) + + // if the artifact supports cursor based pagination + if (artifact.refetch?.method === 'cursor') { + // generate the cursor handlers + const cursor = cursorHandlers({ + initialValue, + artifact, + store, + queryVariables, + loading: paginationLoadingState, + }) + // always track pageInfo + pageInfo = cursor.pageInfo + + // if we are implementing forward pagination + if (artifact.refetch.update === 'append') { + loadNextPage = cursor.loadNextPage + } + // the artifact implements backwards pagination + else { + loadPreviousPage = cursor.loadPreviousPage + } + } + // the artifact supports offset-based pagination, only loadNextPage is valid + else { + loadNextPage = offsetPaginationHandler({ + artifact, + queryVariables, + loading: paginationLoadingState, + }) + } + + // if no loading state was provided just use a store that's always false + if (!documentLoading) { + documentLoading = readable(false, () => {}) + } + + // merge the pagination and document loading state + const loading = derived( + [paginationLoadingState, documentLoading], + ($loadingStates) => $loadingStates[0] || $loadingStates[1] + ) + + return { loadNextPage, loadPreviousPage, pageInfo, loading } +} + +function cursorHandlers({ + initialValue, + artifact, + store, + queryVariables: extraVariables, + loading, +}: { + initialValue: GraphQLObject + artifact: QueryArtifact + store: Readable + queryVariables?: {} + loading: Writable +}): { + loadNextPage: PaginatedHandlers['loadNextPage'] + loadPreviousPage: PaginatedHandlers['loadPreviousPage'] + pageInfo: PaginatedHandlers['pageInfo'] +} { + // pull out the context accessors + const variables = getVariables() + const sessionStore = getSession() + + // track the current page info in an easy-to-reach store + const initialPageInfo = initialValue + ? 
extractPageInfo(initialValue, artifact.refetch!.path) + : { + startCursor: null, + endCursor: null, + hasNextPage: false, + hasPreviousPage: false, + } + const pageInfo = writable(initialPageInfo) + + // hold onto the current value + let value: GraphQLObject + store.subscribe((val) => { + pageInfo.set(extractPageInfo(val, artifact.refetch!.path)) + value = val + }) + + // dry up the page-loading logic + const loadPage = async ({ + pageSizeVar, + input, + functionName, + }: { + pageSizeVar: string + functionName: string + input: {} + }) => { + // set the loading state to true + loading.set(true) + + // build up the variables to pass to the query + const queryVariables: Record = { + ...extraVariables, + ...variables(), + ...input, + } + + // if we don't have a value for the page size, tell the user + if (!queryVariables[pageSizeVar] && !artifact.refetch!.pageSize) { + throw missingPageSizeError(functionName) + } + + // send the query + const result = await executeQuery(artifact, queryVariables, sessionStore) + + // if the query is embedded in a node field (paginated fragments) + // make sure we look down one more for the updated page info + const resultPath = [...artifact.refetch!.path] + if (artifact.refetch!.embedded) { + resultPath.unshift('node') + } + + // we need to find the connection object holding the current page info + pageInfo.set(extractPageInfo(result.data, resultPath)) + + // update cache with the result + cache.write({ + selection: artifact.selection, + data: result.data, + variables: queryVariables, + applyUpdates: true, + }) + + // we're not loading any more + loading.set(false) + } + + return { + loadNextPage: async (pageCount?: number) => { + // we need to find the connection object holding the current page info + const currentPageInfo = extractPageInfo(value, artifact.refetch!.path) + + // if there is no next page, we're done + if (!currentPageInfo.hasNextPage) { + return + } + + // only specify the page count if we're given one + const input: Record = { + after: currentPageInfo.endCursor, + } + if (pageCount) { + input.first = pageCount + } + + // load the page + return await loadPage({ + pageSizeVar: 'first', + functionName: 'loadNextPage', + input, + }) + }, + loadPreviousPage: async (pageCount?: number) => { + // we need to find the connection object holding the current page info + const currentPageInfo = extractPageInfo(value, artifact.refetch!.path) + + // if there is no previous page, we're done + if (!currentPageInfo.hasPreviousPage) { + return + } + + // only specify the page count if we're given one + const input: Record = { + before: currentPageInfo.startCursor, + } + if (pageCount) { + input.last = pageCount + } + + // load the page + return await loadPage({ + pageSizeVar: 'last', + functionName: 'loadPreviousPage', + input, + }) + }, + pageInfo: { subscribe: pageInfo.subscribe }, + } +} + +function offsetPaginationHandler({ + artifact, + queryVariables: extraVariables, + loading, +}: { + artifact: QueryArtifact + queryVariables?: {} + loading: Writable +}): PaginatedHandlers['loadNextPage'] { + // we need to track the most recent offset for this handler + let currentOffset = (artifact.refetch?.start as number) || 0 + + // grab the context getters + const variables = getVariables() + const sessionStore = getSession() + + return async (limit?: number) => { + // build up the variables to pass to the query + const queryVariables: Record = { + ...variables(), + ...extraVariables, + offset: currentOffset, + } + if (limit) { + queryVariables.limit = limit + } + + // if
we made it this far without a limit argument and there's no default page size, + // they made a mistake + if (!queryVariables.limit && !artifact.refetch!.pageSize) { + throw missingPageSizeError('loadNextPage') + } + + // set the loading state to true + loading.set(true) + + // send the query + const result = await executeQuery(artifact, queryVariables, sessionStore) + + // update cache with the result + cache.write({ + selection: artifact.selection, + data: result.data, + variables: queryVariables, + applyUpdates: true, + }) + + // add the page size to the offset so we load the next page next time + const pageSize = queryVariables.limit || artifact.refetch!.pageSize + currentOffset += pageSize + + // we're not loading any more + loading.set(false) + } +} + +type PaginatedHandlers = { + loadNextPage(pageCount?: number, after?: string | number): Promise + loadPreviousPage(pageCount?: number, before?: string): Promise + loading: Readable + pageInfo: Readable +} + +function defaultLoadNextPage(): Promise { + throw new Error( + 'loadNextPage() only works on fields marked @paginate that implement forward cursor or offset pagination.' + ) +} + +function defaultLoadPreviousPage(): Promise { + throw new Error( + 'loadPreviousPage() only works on fields marked @paginate that implement backward cursor pagination.' + ) +} + +function missingPageSizeError(fnName: string) { + return new Error( + 'Loading a page with no page size. If you are paginating a field with a variable page size, ' + + `you have to pass a value to \`${fnName}\`. If you don't care to have the page size vary, ` + + 'consider passing a fixed value to the field instead.' + ) +} diff --git a/packages/houdini/runtime/query.ts b/packages/houdini/runtime/query.ts index 37cd19e6f..41915f95c 100644 --- a/packages/houdini/runtime/query.ts +++ b/packages/houdini/runtime/query.ts @@ -12,7 +12,7 @@ import { marshalInputs, unmarshalSelection } from './scalars' // @ts-ignore: this file will get generated and does not exist in the source code import { getSession, goTo } from './adapter.mjs' -export default function query<_Query extends Operation>( +export function query<_Query extends Operation>( document: GraphQLTagResult ): QueryResponse<_Query['result'], _Query['input']> { // make sure we got a query document @@ -55,7 +55,11 @@ export default function query<_Query extends Operation>( // if we were given data on mount if (initialValue) { // update the cache with the data that we just ran into - cache.write(artifact.selection, initialValue, variables) + cache.write({ + selection: artifact.selection, + data: initialValue, + variables, + }) // stay up to date if (subscriptionSpec) { @@ -81,6 +85,13 @@ export default function query<_Query extends Operation>( const sessionStore = getSession() function writeData(newData: RequestPayload<_Query['result']>, newVariables: _Query['input']) { + // write the data we received + cache.write({ + selection: artifact.selection, + data: newData.data, + variables: newVariables, + }) + // if the variables changed we need to unsubscribe from the old fields and // listen to the new ones if (subscriptionSpec && JSON.stringify(variables) !== JSON.stringify(newVariables)) { @@ -88,14 +99,11 @@ export default function query<_Query extends Operation>( cache.subscribe(subscriptionSpec, newVariables) } - // save the new variables - variables = newVariables || {} - // update the local store store.set(unmarshalSelection(config, artifact.selection, newData.data)) - // write the data we received - 
cache.write(artifact.selection, newData.data, variables) + // save the new variables + variables = newVariables || {} } return { @@ -131,7 +139,7 @@ export default function query<_Query extends Operation>( // we need to wrap the response from a query in something that we can // use as a proxy to the query for refetches, writing to the cache, etc -type QueryResponse<_Data, _Input> = { +export type QueryResponse<_Data, _Input> = { data: Readable<_Data> writeData: (data: RequestPayload<_Data>, variables: _Input) => void refetch: (newVariables?: _Input) => Promise diff --git a/packages/houdini/runtime/scalars.test.ts b/packages/houdini/runtime/scalars.test.ts index d47e29371..ff5986723 100644 --- a/packages/houdini/runtime/scalars.test.ts +++ b/packages/houdini/runtime/scalars.test.ts @@ -77,11 +77,19 @@ const artifact: QueryArtifact = { }, }, - list: 'All_Items', + list: { + name: 'All_Items', + type: 'User', + connection: false, + }, }, }, - list: 'All_Items', + list: { + name: 'All_Items', + type: 'User', + connection: false, + }, }, }, rootType: 'Query', @@ -362,7 +370,11 @@ describe('unmarshal selection', function () { }, }, - list: 'All_Items', + list: { + name: 'All_Items', + type: 'User', + connection: false, + }, }, }, }, diff --git a/packages/houdini/runtime/subscription.ts b/packages/houdini/runtime/subscription.ts index 78250f029..4c2e5a97c 100644 --- a/packages/houdini/runtime/subscription.ts +++ b/packages/houdini/runtime/subscription.ts @@ -11,7 +11,7 @@ import { marshalInputs, unmarshalSelection } from './scalars' // subscription holds open a live connection to the server. it returns a store // containing the requested data. Houdini will also update the cache with any // information that it encounters in the response. -export default function subscription<_Subscription extends Operation>( +export function subscription<_Subscription extends Operation>( document: GraphQLTagResult, variables?: _Subscription['input'] ): { @@ -80,7 +80,11 @@ export default function subscription<_Subscription extends Operation>( // if we got a result if (data) { // update the cache with the result - cache.write(selection, data, marshaledVariables) + cache.write({ + selection, + data, + variables: marshaledVariables, + }) // update the local store store.set(unmarshalSelection(config, artifact.selection, data)) diff --git a/packages/houdini/runtime/types.ts b/packages/houdini/runtime/types.ts index 04a934bde..886c13b44 100644 --- a/packages/houdini/runtime/types.ts +++ b/packages/houdini/runtime/types.ts @@ -1,4 +1,5 @@ import type { Config } from 'houdini-common' +import { Readable } from 'svelte/store' export type Fragment<_Result> = { readonly shape?: _Result @@ -36,15 +37,31 @@ export type SubscriptionArtifact = BaseCompiledDocument & { kind: 'HoudiniSubscription' } -type BaseCompiledDocument = { +export enum RefetchUpdateMode { + append = 'append', + prepend = 'prepend', + replace = 'replace', +} + +export type InputObject = { + fields: Record + types: Record> +} + +export type BaseCompiledDocument = { name: string raw: string hash: string selection: SubscriptionSelection rootType: string - input?: { - fields: Record - types: Record> + input?: InputObject + refetch?: { + update: RefetchUpdateMode + path: string[] + method: 'cursor' | 'offset' + pageSize: number + start?: string | number + embedded: boolean } } @@ -59,6 +76,7 @@ export type TaggedGraphqlFragment = { kind: 'HoudiniFragment' artifact: FragmentArtifact config: Config + paginationArtifact?: QueryArtifact } // the result of tagging an 
operation @@ -130,7 +148,12 @@ export type SubscriptionSelection = { type: string keyRaw: string operations?: MutationOperation[] - list?: string + list?: { + name: string + connection: boolean + type: string + } + update?: RefetchUpdateMode filters?: { [key: string]: { kind: 'Boolean' | 'String' | 'Float' | 'Int' | 'Variable' diff --git a/packages/houdini/runtime/utils.ts b/packages/houdini/runtime/utils.ts new file mode 100644 index 000000000..4e67fbd18 --- /dev/null +++ b/packages/houdini/runtime/utils.ts @@ -0,0 +1,19 @@ +import { GraphQLObject } from './types' + +export type PageInfo = { + startCursor: string | null + endCursor: string | null + hasNextPage: boolean + hasPreviousPage: boolean +} + +export function extractPageInfo(data: GraphQLObject, path: string[]): PageInfo { + let localPath = [...path] + // walk down the object until we get to the end + let current = data + while (localPath.length > 0) { + current = current[localPath.shift() as string] as GraphQLObject + } + + return current.pageInfo as PageInfo +} diff --git a/yarn.lock b/yarn.lock index 2e539de94..90c59d4c4 100644 --- a/yarn.lock +++ b/yarn.lock @@ -5703,8 +5703,9 @@ __metadata: "@sveltejs/kit": 1.0.0-next.107 apollo-server: ^2.24.0 graphql: 15.5.0 - houdini: ^0.9.7 - houdini-preprocess: ^0.9.7 + graphql-relay: ^0.8.0 + houdini: ^0.10.0-alpha.13 + houdini-preprocess: ^0.10.0-alpha.13 subscriptions-transport-ws: ^0.9.18 svelte: ^3.38.2 svelte-preprocess: ^4.0.0 @@ -6491,6 +6492,15 @@ __metadata: languageName: node linkType: hard +"graphql-relay@npm:^0.8.0": + version: 0.8.0 + resolution: "graphql-relay@npm:0.8.0" + peerDependencies: + graphql: 15.5.1 + checksum: 3986b64ca5126e2ad8cbfaf2c7c64161ce00a8f9832c70e226d096dcb47121a7adc652022a746706527661663537d6f721b313d0ad3ff651c8babe339b5b2469 + languageName: node + linkType: hard + "graphql-subscriptions@npm:^1.0.0": version: 1.2.1 resolution: "graphql-subscriptions@npm:1.2.1" @@ -6706,7 +6716,7 @@ __metadata: languageName: node linkType: hard -"houdini-common@^0.9.0, houdini-common@workspace:packages/houdini-common": +"houdini-common@^0.10.0-alpha.8, houdini-common@workspace:packages/houdini-common": version: 0.0.0-use.local resolution: "houdini-common@workspace:packages/houdini-common" dependencies: @@ -6751,7 +6761,7 @@ __metadata: languageName: unknown linkType: soft -"houdini-preprocess@^0.9.7, houdini-preprocess@workspace:packages/houdini-preprocess": +"houdini-preprocess@^0.10.0-alpha.13, houdini-preprocess@workspace:packages/houdini-preprocess": version: 0.0.0-use.local resolution: "houdini-preprocess@workspace:packages/houdini-preprocess" dependencies: @@ -6765,8 +6775,8 @@ __metadata: babylon: ^7.0.0-beta.47 estree-walker: ^2.0.2 graphql: 15.5.0 - houdini: ^0.9.7 - houdini-common: ^0.9.0 + houdini: ^0.10.0-alpha.13 + houdini-common: ^0.10.0-alpha.8 jest: ^26.6.3 mkdirp: ^1.0.4 prettier: "*" @@ -6777,7 +6787,7 @@ __metadata: languageName: unknown linkType: soft -"houdini@^0.9.7, houdini@workspace:packages/houdini": +"houdini@^0.10.0-alpha.13, houdini@workspace:packages/houdini": version: 0.0.0-use.local resolution: "houdini@workspace:packages/houdini" dependencies: @@ -6798,7 +6808,7 @@ __metadata: estree-walker: ^2.0.2 glob: ^7.1.6 graphql: ^15.5.0 - houdini-common: ^0.9.0 + houdini-common: ^0.10.0-alpha.8 inquirer: ^7.3.3 jest: ^26.6.3 mkdirp: ^1.0.4