diff --git a/docs/blog/2022-01-19-release-pg-bulk.md b/docs/blog/2022-01-19-release-pg-bulk.md deleted file mode 100644 index f7922a47..00000000 --- a/docs/blog/2022-01-19-release-pg-bulk.md +++ /dev/null @@ -1,98 +0,0 @@ ---- -title: 'A new utility for really big Postgres operations' -author: Forbes Lindesay -authorURL: http://twitter.com/ForbesLindesay -authorTwitter: ForbesLindesay ---- - -Postgres is very fast, but one limitation you may run into when dealing with big quantities of data is that Postgres fails if you have more than about 90,000 parameters in your query. It is also extremely fiddly to update multiple different records to different values in the same statement. People often resort to using hacks like intentionally triggering a conflict so you can use `ON CONFLICT DO UPDATE`. - - - -That all changes today. The newly released [`@databases/pg-bulk`](https://www.atdatabases.org/docs/pg-bulk) lets you use the powerful `UNNEST` utility to write very high performance queries without all the difficult parts. For example, you can update multiple users in a single statement: - -```typescript -async function updateUsers() { - await bulkUpdate({ - database, - tableName: `users`, - columnTypes: { - email: sql`TEXT`, - favorite_color: sql`TEXT`, - }, - whereColumnNames: [`email`], - setColumnNames: [`favorite_color`], - updates: [ - {where: {email: `joe@example.com`}, set: {favorite_color: `Burgundy`}}, - {where: {email: `clare@example.com`}, set: {favorite_color: `Cream`}}, - {where: {email: `eleanor@example.com`}, set: {favorite_color: `Fuchsia`}}, - ], - }); -} -``` - -To do this without `bulkUpdate` you'd either have to do 3 separate SQL statements, or hand code SQL while trying to remember how `UNNEST` works. 
- -> If you'd like all the details of how `@databases/pg-bulk` uses `UNNEST` to make these statements work, you can find examples with the SQL statements that they would run in [the documentation for `@databases/pg-bulk`](https://www.atdatabases.org/docs/pg-bulk) - -## pg-typed - -If you're using TypeScript, you can make this even easier by using [`@databases/pg-typed`](https://www.atdatabases.org/docs/pg-typed). `pg-typed` exposes all the same APIs as `bulkUpdate` but with the added bonus of type safety and being able to auto-generate the `columnTypes` using `pg-schema-cli`. If you use that, the above example simplifies to just: - -```typescript -async function updateUsers() { - await users(database).bulkUpdate({ - whereColumnNames: [`email`], - setColumnNames: [`favorite_color`], - updates: [ - {where: {email: `joe@example.com`}, set: {favorite_color: `Burgundy`}}, - {where: {email: `clare@example.com`}, set: {favorite_color: `Cream`}}, - {where: {email: `eleanor@example.com`}, set: {favorite_color: `Fuchsia`}}, - ], - }); -} -``` - -## Other new pg-typed features - -In addition to the new `bulkOperation` methods, [`@databases/pg-typed`](https://www.atdatabases.org/docs/pg-typed) gained a few other small features. - -You can now use `.findOneRequired` in place of `.findOne` to throw an error instead of returning `null` when a record is missing. This can simplify/clean up your app code for cases where you can be sure that the record does exist. - -```typescript -async function getPostWithAuthor(id: DbPost['id']) { - const post = await posts(database).findOne({id}); - if (!post) return null; - const author = await users(database).findOneRequired({id: post.author_id}); - return {post, author}; -} -``` - -You can append `.one()` or `.oneRequired()` onto `SelectQuery`s. This lets you more easily load a subset of a single record. e.g. 
-
-```typescript
-async function getPostWithAuthorName(id: DbPost['id']) {
-  const post = await posts(database).findOne({id});
-  if (!post) return null;
-  const author = await users(database)
-    .find({id: post.author_id})
-    .select(`name`)
-    .oneRequired();
-  return {post, author: author.name};
-}
-```
-
-There's also a new type safe alternative to `inQueryResults`, making it easier to load related records in a single database round trip. e.g.
-
-```typescript
-async function getPostAuthor(postId: DbPost['id']) {
-  return await users(database).findOne({
-    // find users where the id matches the
-    // author_id field in posts where the post's
-    // id is postId
-    id: posts.key(`author_id`, {id: postId}),
-  });
-}
-```
-
-> P.S. did you know that all these utilities can be used in transactions. Just wrap any of the examples in `await database.tx(async database => {/* ... */})`. Passing the `database` in the callback ensures that all queries are run on that transaction.
diff --git a/docs/blog/2022-01-21-optimizing-postgres-using-unnest.md b/docs/blog/2022-01-21-optimizing-postgres-using-unnest.md
new file mode 100644
index 00000000..213c599b
--- /dev/null
+++ b/docs/blog/2022-01-21-optimizing-postgres-using-unnest.md
@@ -0,0 +1,180 @@
+---
+title: 'Postgres UNNEST cheat sheet for bulk operations'
+author: Forbes Lindesay
+authorURL: http://twitter.com/ForbesLindesay
+authorTwitter: ForbesLindesay
+googleSummary: UNNEST is the only way to make Postgres queries fast and reliable, if you want to interact with thousands of rows at a time.
+ogSummary: UNNEST is the only way to make Postgres queries fast and reliable, if you want to interact with thousands of rows at a time.
+---
+
+Postgres is normally very fast, but it can become slow (or even fail completely) if you have too many parameters in your queries. When it comes to operating on data in bulk, `UNNEST` is the only way to achieve fast, reliable queries.
This post has examples for using `UNNEST` to do all types of bulk transactions. + + + +All the examples in this article assume a database schema that looks like: + +```sql +CREATE TABLE users ( + email TEXT NOT NULL PRIMARY KEY, + favorite_color TEXT NOT NULL +) +``` + +## INSERT thousands of records in one go + +To insert many records into a Postgres table in one go, the most efficient method is to provide each column as a separate array and then use `UNNEST` to construct the rows to insert. + +You can run the following query: + +```sql +INSERT INTO users (email, favorite_color) +SELECT + UNNEST(?::TEXT[]), + UNNEST(?::TEXT[]) +``` + +With parameters like: + +```json +[ + ["joe@example.com", "ben@example.com", "mary@example.com"], + ["red", "green", "indigo"] +] +``` + +Notice that you're only passing 2 parameters, no matter how many rows you want to insert. You're also using the same query text no matter how many rows you want to insert. This is what keeps the query so efficient. + +The resulting table would look like: + +| email | favorite_color | +| ---------------- | -------------- | +| joe@example.com | red | +| ben@example.com | green | +| mary@example.com | indigo | + +## UPDATE multiple records to different values in a single query + +One of the most powerful use cases of `UNNEST` is to update multiple records in a single query. The normal `UPDATE` statement only really lets you update multiple records in one go if you want to set them all to the same value, but this approach is way more flexible. 
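A note that applies to every query in this post: application code usually holds an array of row objects, while these statements want one array per column. Here is a tiny TypeScript sketch of that transposition (the helper name and data shapes are my own, purely illustrative — this is not part of Postgres or of any particular library):

```typescript
// Turn row objects into one array per column, the parameter shape that
// UNNEST-based queries take. Purely illustrative helper, not a library API.
function toColumnArrays<T extends Record<string, unknown>>(
  rows: readonly T[],
  columns: readonly (keyof T)[],
): unknown[][] {
  return columns.map((column) => rows.map((row) => row[column]));
}

const rows = [
  {email: 'joe@example.com', favorite_color: 'purple'},
  {email: 'ben@example.com', favorite_color: 'violet'},
  {email: 'mary@example.com', favorite_color: 'orange'},
];

const [emails, colors] = toColumnArrays(rows, ['email', 'favorite_color']);
// emails: ['joe@example.com', 'ben@example.com', 'mary@example.com']
// colors: ['purple', 'violet', 'orange']
```

However many rows you start with, the result is always the same fixed number of parameters — one array per column — which is what keeps the query text stable and fast.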
You can run the following query:
+
+```sql
+UPDATE users
+SET
+  favorite_color=bulk_query.updated_favorite_color
+FROM
+  (
+    SELECT
+      UNNEST(?::TEXT[])
+      AS email,
+      UNNEST(?::TEXT[])
+      AS updated_favorite_color
+  ) AS bulk_query
+WHERE
+  users.email=bulk_query.email
+```
+
+With parameters like:
+
+```json
+[
+  ["joe@example.com", "ben@example.com", "mary@example.com"],
+  ["purple", "violet", "orange"]
+]
+```
+
+The resulting table would then look like:
+
+| email            | favorite_color |
+| ---------------- | -------------- |
+| joe@example.com  | purple         |
+| ben@example.com  | violet         |
+| mary@example.com | orange         |
+
+Not only does this let you update all these records in one statement, but the number of parameters remains fixed at 2 no matter how many rows you want to update.
+
+## SELECT with thousands of different conditions in one go
+
+You can always build a very large query by combining `OR` and `AND`, but eventually, if you have enough parameters, this may start to become slow.
+
+You can run the following query:
+
+```sql
+SELECT * FROM users
+WHERE (email, favorite_color) IN (
+  SELECT
+    UNNEST(?::TEXT[]),
+    UNNEST(?::TEXT[])
+)
+```
+
+With parameters like:
+
+```json
+[
+  ["joe@example.com", "ben@example.com", "mary@example.com"],
+  ["purple", "violet", "orange"]
+]
+```
+
+and it will be equivalent to running:
+
+```sql
+SELECT * FROM users
+WHERE
+  (email='joe@example.com' AND favorite_color='purple')
+  OR (email='ben@example.com' AND favorite_color='violet')
+  OR (email='mary@example.com' AND favorite_color='orange')
+```
+
+Using `UNNEST` here lets us keep the query constant, and use only 2 parameters, regardless of how many conditions we want to add.
+
+An alternative, if you need more control, is to use an `INNER JOIN` instead of the `IN` part of the query.
For example, if you needed the comparisons to be case-insensitive you could do:
+
+```sql
+SELECT users.* FROM users
+INNER JOIN (
+  SELECT
+    UNNEST(?::TEXT[]) AS email,
+    UNNEST(?::TEXT[]) AS favorite_color
+) AS unnest_query
+ON (LOWER(users.email) = LOWER(unnest_query.email) AND LOWER(users.favorite_color) = LOWER(unnest_query.favorite_color))
+```
+
+## DELETE with thousands of different conditions in one go
+
+Just like `SELECT`, `DELETE` queries can become slow if the complexity of your conditions grows too extreme.
+
+You can run the following query:
+
+```sql
+DELETE FROM users
+WHERE (email, favorite_color) IN (
+  SELECT
+    UNNEST(?::TEXT[]),
+    UNNEST(?::TEXT[])
+)
+```
+
+With parameters like:
+
+```json
+[
+  ["joe@example.com", "ben@example.com", "mary@example.com"],
+  ["purple", "violet", "orange"]
+]
+```
+
+and it will be equivalent to running:
+
+```sql
+DELETE FROM users
+WHERE
+  (email='joe@example.com' AND favorite_color='purple')
+  OR (email='ben@example.com' AND favorite_color='violet')
+  OR (email='mary@example.com' AND favorite_color='orange')
+```
+
+Just like with `SELECT`, using `UNNEST` here lets us keep the query constant, and use only 2 parameters, regardless of how many conditions we want to add.
+
+> If you're using node.js, you can do all these operations without having to memorize the syntax by using [`@databases/pg-typed`](https://www.atdatabases.org/docs/pg-guide-typescript) or [`@databases/pg-bulk`](https://www.atdatabases.org/docs/pg-bulk).
diff --git a/docs/pg-typed.md b/docs/pg-typed.md
index 4ea135e0..66c37bff 100644
--- a/docs/pg-typed.md
+++ b/docs/pg-typed.md
@@ -292,7 +292,7 @@ async function getPosts() {

### bulkInsert(options)

-To insert thousands of records at a time, you can use the bulk insert API. This requires you to specify any optional fields that you want to pass in. Any required (i.e. `NOT NULL` and no default value) fields are automatically expected.
You can find more details on how this API works in [@databases/pg-bulk](pg-bulk.md). +To insert thousands of records at a time, you can use the bulk insert API. This requires you to specify any optional fields that you want to pass in. Any required (i.e. `NOT NULL` and no default value) fields are automatically expected. You can find more details on how this API works in [@databases/pg-bulk](pg-bulk.md). `bulkInsert` also returns the inserted records. ```typescript async function insertUsers() { @@ -303,7 +303,7 @@ async function insertUsers() { {email: `joe@example.com`, favorite_color: `red`}, {email: `ben@example.com`, favorite_color: `green`}, {email: `tom@example.com`, favorite_color: `blue`}, - {email: `clare@example.com`, favorite_color: `indigo`}, + {email: `mary@example.com`, favorite_color: `indigo`}, ], }); } @@ -311,7 +311,7 @@ async function insertUsers() { ### bulkUpdate(options) -Updating multiple records in one go, where each record needs to be updated to a different value can be tricky to do efficiently. If there is a unique constraint, it may be possible to use `insertOrUpdate`, but failing that you'll want to use this bulk API. You can find more details on how this API works in [@databases/pg-bulk](pg-bulk.md). +Updating multiple records in one go, where each record needs to be updated to a different value can be tricky to do efficiently. If there is a unique constraint, it may be possible to use `insertOrUpdate`, but failing that you'll want to use this bulk API. You can find more details on how this API works in [@databases/pg-bulk](pg-bulk.md). `bulkUpdate` also returns the updated records. 
```typescript
async function updateUsers() {
@@ -323,7 +323,7 @@
      {where: {email: `joe@example.com`}, set: {favorite_color: `green`}},
      {where: {email: `ben@example.com`}, set: {favorite_color: `blue`}},
      {where: {email: `tom@example.com`}, set: {favorite_color: `indigo`}},
-      {where: {email: `clare@example.com`}, set: {favorite_color: `green`}},
+      {where: {email: `mary@example.com`}, set: {favorite_color: `green`}},
    ],
  });
}
@@ -497,6 +497,28 @@ You can use `isNoResultFoundError` to test a caught exception to see if it is th

### allOf(valuesOrFieldQueries)

+Match all of the supplied values. For example, to get posts within a time range:
+
+```typescript
+import {allOf, anyOf, greaterThan, lessThan} from '@databases/pg-typed';
+import db, {posts} from './database';
+
+/**
+ * Get posts where:
+ *
+ *   timestamp >= start AND timestamp < end
+ */
+async function getPostsBetween(start: Date, end: Date) {
+  return await posts(db)
+    .find({
+      timestamp: allOf([anyOf([greaterThan(start), start]), lessThan(end)]),
+    })
+    .all();
+}
+```
+
+### anyOf(valuesOrFieldQueries)
+
Match any of the supplied values. For example, to get users who like blue or green:

```typescript
import {anyOf} from '@databases/pg-typed';
import db, {users} from './database';

export async function getUsersWhoLikeBlueOrGreen() {
  return await users(db)
    .find({
      favorite_color: anyOf([`blue`, `green`]),
    })
    .all();
}
```

@@ -513,6 +535,51 @@
+### caseInsensitive(valueOrFieldQuery)
+
+Match the supplied string while ignoring case.
+
+```typescript
+import {caseInsensitive} from '@databases/pg-typed';
+import db, {users} from './database';
+
+/**
+ * Return true if there is a user with this username, ignoring
+ * the case of the username, so ForbesLindesay would be equivalent
+ * to forbeslindesay
+ */
+async function userExists(username: string) {
+  return (
+    0 !==
+    (await users(db).count({
+      username: caseInsensitive(username),
+    }))
+  );
+}
+```
+
+### jsonPath(path, value)
+
+Match the supplied value against the given path in a `JSON`/`JSONB` column.
```typescript
+import {jsonPath} from '@databases/pg-typed';
+import db, {events} from './database';
+
+/**
+ * return events where:
+ *
+ *   event_data.type = 'feedback'
+ */
+async function getFeedbackEvents() {
+  return await events(db)
+    .find({
+      event_data: jsonPath(['type'], 'feedback'),
+    })
+    .all();
+}
+```

### not(valueOrFieldQuery)

Match any value except the supplied value. You can combine this with any of the other FieldQuery utilities.

@@ -611,3 +678,73 @@ export async function getPostsByUserEmail(email: string) {
    .all();
}
```
+
+## or(conditions) / and(conditions)
+
+To `or`/`and` values/conditions for a single field, you can use `anyOf`/`allOf`, but the `or` utility helps if you want to have multiple distinct queries. If you anticipate many conditions in an `or`, you may get better performance by using `.bulkFind`/`.bulkDelete` instead of `or`.
+
+```typescript
+import {or, and, greaterThan} from '@databases/pg-typed';
+import db, {posts, User} from './database';
+
+/**
+ * return posts where:
+ *
+ *   user_id=${authorId}
+ *   AND (
+ *     (is_public IS TRUE AND view_count > 1000)
+ *     OR (is_public IS FALSE AND view_count > 100)
+ *   )
+ */
+async function getPopularPostsByAuthor(authorId: User['id']) {
+  return await posts(db)
+    .find(
+      and(
+        {user_id: authorId},
+        or(
+          {
+            is_public: true,
+            view_count: greaterThan(1_000),
+          },
+          {
+            is_public: false,
+            view_count: greaterThan(100),
+          },
+        ),
+      ),
+    )
+    .all();
+}
+```
+
+You can often avoid using the `and` helper by simply duplicating some fields in the query:
+
+```typescript
+import {or, greaterThan} from '@databases/pg-typed';
+import db, {posts, User} from './database';
+
+/**
+ * return posts where:
+ *
+ *   (user_id=${authorId} AND is_public IS TRUE AND view_count > 1000)
+ *   OR (user_id=${authorId} AND is_public IS FALSE AND view_count > 100)
+ */
+async function getPopularPostsByAuthor(authorId: User['id']) {
+  return await posts(db)
+    .find(
+      or(
+        {
+          user_id: authorId,
+          is_public:
true, + view_count: greaterThan(1_000), + }, + { + user_id: authorId, + is_public: false, + view_count: greaterThan(100), + }, + ), + ) + .all(); +} +``` diff --git a/packages/pg-bulk/src/index.ts b/packages/pg-bulk/src/index.ts index b1764771..96b9c46c 100644 --- a/packages/pg-bulk/src/index.ts +++ b/packages/pg-bulk/src/index.ts @@ -78,12 +78,18 @@ function select( )}`; } +export async function bulkInsert( + options: BulkInsertOptions & {returning: SQLQuery}, +): Promise; export async function bulkInsert( options: BulkInsertOptions, -): Promise { - const {database, columnsToInsert, records} = options; +): Promise; +export async function bulkInsert( + options: BulkInsertOptions & {returning?: SQLQuery}, +): Promise { + const {database, columnsToInsert, records, returning} = options; const {sql} = database; - await database.query( + return await database.query( sql`INSERT INTO ${tableId(options)} (${sql.join( columnsToInsert.map((columnName) => sql.ident(columnName)), `,`, @@ -91,7 +97,7 @@ export async function bulkInsert( columnsToInsert.map((name) => ({name})), records, options, - )}`, + )}${returning ? 
sql` RETURNING ${returning}` : sql``}`, ); } @@ -154,11 +160,29 @@ export async function bulkSelect( export async function bulkUpdate< TWhereColumn extends ColumnName, TSetColumn extends ColumnName, ->(options: BulkUpdateOptions): Promise { - const {database, tableName, whereColumnNames, setColumnNames, updates} = - options; +>( + options: BulkUpdateOptions & {returning: SQLQuery}, +): Promise; +export async function bulkUpdate< + TWhereColumn extends ColumnName, + TSetColumn extends ColumnName, +>(options: BulkUpdateOptions): Promise; +export async function bulkUpdate< + TWhereColumn extends ColumnName, + TSetColumn extends ColumnName, +>( + options: BulkUpdateOptions & {returning?: SQLQuery}, +): Promise { + const { + database, + tableName, + whereColumnNames, + setColumnNames, + updates, + returning, + } = options; const {sql} = database; - await database.query( + return await database.query( sql`UPDATE ${tableId(options)} SET ${sql.join( setColumnNames.map( (columnName) => @@ -192,7 +216,7 @@ export async function bulkUpdate< )}`, ), ` AND `, - )}`, + )}${returning ? 
sql` RETURNING ${returning}` : sql``}`, ); } diff --git a/packages/pg-schema-print-types/src/__tests__/print.test.pg.ts b/packages/pg-schema-print-types/src/__tests__/print.test.pg.ts index 5857440e..c158efc3 100644 --- a/packages/pg-schema-print-types/src/__tests__/print.test.pg.ts +++ b/packages/pg-schema-print-types/src/__tests__/print.test.pg.ts @@ -19,14 +19,18 @@ test('getClasses', async () => { id BIGSERIAL NOT NULL PRIMARY KEY, screen_name TEXT UNIQUE NOT NULL, bio TEXT, - age INT + age INT, + created_at TIMESTAMPTZ, + updated_at TIMESTAMPTZ ); CREATE TABLE print_types.photos ( id BIGSERIAL NOT NULL PRIMARY KEY, owner_user_id BIGINT NOT NULL REFERENCES print_types.users(id), cdn_url TEXT NOT NULL, caption TEXT NULL, - metadata JSONB NOT NULL + metadata JSONB NOT NULL, + created_at TIMESTAMPTZ, + updated_at TIMESTAMPTZ ); CREATE MATERIALIZED VIEW print_types.view_a AS SELECT * FROM print_types.users; CREATE VIEW print_types.view_b AS SELECT * FROM print_types.photos; @@ -99,12 +103,14 @@ test('getClasses', async () => { interface Photo { caption: (string) | null cdn_url: string & {__brand?: \\"url\\"} + created_at: (Date) | null /** * @default nextval('print_types.photos_id_seq'::regclass) */ id: number & {readonly __brand?: 'photos_id'} metadata: unknown owner_user_id: User['id'] + updated_at: (Date) | null } export default Photo; @@ -114,12 +120,14 @@ test('getClasses', async () => { interface Photos_InsertParameters { caption?: (string) | null cdn_url: string & {__brand?: \\"url\\"} + created_at?: (Date) | null /** * @default nextval('print_types.photos_id_seq'::regclass) */ id?: number & {readonly __brand?: 'photos_id'} metadata: unknown owner_user_id: User['id'] + updated_at?: (Date) | null } export type {Photos_InsertParameters} ", @@ -129,22 +137,26 @@ test('getClasses', async () => { "content": "interface User { age: (number) | null bio: (string) | null + created_at: (Date) | null /** * @default nextval('print_types.users_id_seq'::regclass) */ id: 
number & {readonly __brand?: 'users_id'} screen_name: string + updated_at: (Date) | null } export default User; interface Users_InsertParameters { age?: (number) | null bio?: (string) | null + created_at?: (Date) | null /** * @default nextval('print_types.users_id_seq'::regclass) */ id?: number & {readonly __brand?: 'users_id'} screen_name: string + updated_at?: (Date) | null } export type {Users_InsertParameters} ", @@ -169,6 +181,13 @@ test('getClasses', async () => { \\"typeId\\": 25, \\"typeName\\": \\"TEXT\\" }, + { + \\"name\\": \\"created_at\\", + \\"isNullable\\": true, + \\"hasDefault\\": false, + \\"typeId\\": 1184, + \\"typeName\\": \\"TIMESTAMPTZ\\" + }, { \\"name\\": \\"id\\", \\"isNullable\\": false, @@ -189,6 +208,13 @@ test('getClasses', async () => { \\"hasDefault\\": false, \\"typeId\\": 20, \\"typeName\\": \\"BIGINT\\" + }, + { + \\"name\\": \\"updated_at\\", + \\"isNullable\\": true, + \\"hasDefault\\": false, + \\"typeId\\": 1184, + \\"typeName\\": \\"TIMESTAMPTZ\\" } ] }, @@ -209,6 +235,13 @@ test('getClasses', async () => { \\"typeId\\": 25, \\"typeName\\": \\"TEXT\\" }, + { + \\"name\\": \\"created_at\\", + \\"isNullable\\": true, + \\"hasDefault\\": false, + \\"typeId\\": 1184, + \\"typeName\\": \\"TIMESTAMPTZ\\" + }, { \\"name\\": \\"id\\", \\"isNullable\\": false, @@ -222,6 +255,13 @@ test('getClasses', async () => { \\"hasDefault\\": false, \\"typeId\\": 25, \\"typeName\\": \\"TEXT\\" + }, + { + \\"name\\": \\"updated_at\\", + \\"isNullable\\": true, + \\"hasDefault\\": false, + \\"typeId\\": 1184, + \\"typeName\\": \\"TIMESTAMPTZ\\" } ] } diff --git a/packages/pg-typed/package.json b/packages/pg-typed/package.json index 8ac5bf5e..d75bcb45 100644 --- a/packages/pg-typed/package.json +++ b/packages/pg-typed/package.json @@ -9,7 +9,8 @@ }, "devDependencies": { "@databases/pg": "^0.0.0", - "@databases/pg-schema-print-types": "^0.0.0" + "@databases/pg-schema-print-types": "^0.0.0", + "assert-never": "^1.2.1" }, "peerDependencies": { 
"@databases/pg": "*" diff --git a/packages/pg-typed/src/__tests__/__generated__/photos.ts b/packages/pg-typed/src/__tests__/__generated__/photos.ts index fa07dbcf..f5d6e805 100644 --- a/packages/pg-typed/src/__tests__/__generated__/photos.ts +++ b/packages/pg-typed/src/__tests__/__generated__/photos.ts @@ -2,10 +2,10 @@ * !!! This file is autogenerated do not edit by hand !!! * * Generated by: @databases/pg-schema-print-types - * Checksum: K/T5GcDa4h6qf9BX0eL8LPkh2BjljQyWOVrcLBBbbrtM2H0Itms4v6nsuQelT9BEdVIetzlSj/XGs/33Q0wm6g== + * Checksum: RaxnJplurntBTC0aUsQPE668GzCYbEs1e3mzJNCfVEd01+oIYkxGUG6sKyRB/W3rsxGishnyNbr3rN5x5IInSw== */ -// eslint:disable +/* eslint-disable */ // tslint:disable import User from './users'; @@ -16,12 +16,14 @@ import User from './users'; interface Photo { caption: string | null; cdn_url: string & {__brand?: 'url'}; + created_at: Date | null; /** * @default nextval('print_types.photos_id_seq'::regclass) */ id: number & {readonly __brand?: 'photos_id'}; metadata: unknown; owner_user_id: User['id']; + updated_at: Date | null; } export default Photo; @@ -31,11 +33,13 @@ export default Photo; interface Photos_InsertParameters { caption?: string | null; cdn_url: string & {__brand?: 'url'}; + created_at?: Date | null; /** * @default nextval('print_types.photos_id_seq'::regclass) */ id?: number & {readonly __brand?: 'photos_id'}; metadata: unknown; owner_user_id: User['id']; + updated_at?: Date | null; } export type {Photos_InsertParameters}; diff --git a/packages/pg-typed/src/__tests__/__generated__/schema.json b/packages/pg-typed/src/__tests__/__generated__/schema.json index b9b787d6..3aedc797 100644 --- a/packages/pg-typed/src/__tests__/__generated__/schema.json +++ b/packages/pg-typed/src/__tests__/__generated__/schema.json @@ -16,6 +16,13 @@ "typeId": 25, "typeName": "TEXT" }, + { + "name": "created_at", + "isNullable": true, + "hasDefault": false, + "typeId": 1184, + "typeName": "TIMESTAMPTZ" + }, { "name": "id", "isNullable": false, @@ 
-36,6 +43,13 @@ "hasDefault": false, "typeId": 20, "typeName": "BIGINT" + }, + { + "name": "updated_at", + "isNullable": true, + "hasDefault": false, + "typeId": 1184, + "typeName": "TIMESTAMPTZ" } ] }, @@ -56,6 +70,13 @@ "typeId": 25, "typeName": "TEXT" }, + { + "name": "created_at", + "isNullable": true, + "hasDefault": false, + "typeId": 1184, + "typeName": "TIMESTAMPTZ" + }, { "name": "id", "isNullable": false, @@ -69,6 +90,13 @@ "hasDefault": false, "typeId": 25, "typeName": "TEXT" + }, + { + "name": "updated_at", + "isNullable": true, + "hasDefault": false, + "typeId": 1184, + "typeName": "TIMESTAMPTZ" } ] } diff --git a/packages/pg-typed/src/__tests__/__generated__/users.ts b/packages/pg-typed/src/__tests__/__generated__/users.ts index 40c64899..e66704dc 100644 --- a/packages/pg-typed/src/__tests__/__generated__/users.ts +++ b/packages/pg-typed/src/__tests__/__generated__/users.ts @@ -2,30 +2,34 @@ * !!! This file is autogenerated do not edit by hand !!! * * Generated by: @databases/pg-schema-print-types - * Checksum: xcD+WCgqjXJHycRnxZpcKF96M4VAUj6PxRRELyI5DcAnUWxyaZYawz2B9XGO1KVlylmRojbdoE8ruySe0eAu5A== + * Checksum: j+3qGf2A7qh66mBdMgbG2zm2tS4/9DZvvpuPV6hggsfbAJIDjB7we0FoYOGNdZ0W+dltixV1QPj6Odx503aykA== */ -// eslint:disable +/* eslint-disable */ // tslint:disable interface User { age: number | null; bio: string | null; + created_at: Date | null; /** * @default nextval('print_types.users_id_seq'::regclass) */ id: number & {readonly __brand?: 'users_id'}; screen_name: string; + updated_at: Date | null; } export default User; interface Users_InsertParameters { age?: number | null; bio?: string | null; + created_at?: Date | null; /** * @default nextval('print_types.users_id_seq'::regclass) */ id?: number & {readonly __brand?: 'users_id'}; screen_name: string; + updated_at?: Date | null; } export type {Users_InsertParameters}; diff --git a/packages/pg-typed/src/__tests__/advancedQueries.test.pg.ts b/packages/pg-typed/src/__tests__/advancedQueries.test.pg.ts 
index 1b7fb32c..63b092e8 100644 --- a/packages/pg-typed/src/__tests__/advancedQueries.test.pg.ts +++ b/packages/pg-typed/src/__tests__/advancedQueries.test.pg.ts @@ -6,14 +6,43 @@ import defineTables, { greaterThan, lessThan, inQueryResults, + jsonPath, + allOf, + or, + caseInsensitive, + and, } from '..'; +import User from './__generated__/users'; const {users, photos} = defineTables({ schemaName: 'typed_queries_advanced_tests', databaseSchema: require('./__generated__/schema.json'), }); -const db = connect({bigIntMode: 'number'}); +let queries: {readonly text: string; readonly values: readonly any[]}[] = []; +const db = connect({ + bigIntMode: 'number', + onQueryStart(_q, q) { + queries.push({ + text: q.text.split(`"typed_queries_advanced_tests".`).join(``), + values: q.values, + }); + }, +}); +function expectQueries(fn: () => Promise) { + return expect( + (async () => { + try { + queries = []; + await fn(); + return queries; + } catch (ex) { + console.error(queries); + throw ex; + } + })(), + ).resolves; +} afterAll(async () => { await db.dispose(); @@ -27,14 +56,18 @@ test('create schema', async () => { id BIGSERIAL NOT NULL PRIMARY KEY, screen_name TEXT UNIQUE NOT NULL, bio TEXT, - age INT + age INT, + created_at TIMESTAMPTZ, + updated_at TIMESTAMPTZ ); CREATE TABLE typed_queries_advanced_tests.photos ( id BIGSERIAL NOT NULL PRIMARY KEY, owner_user_id BIGINT NOT NULL REFERENCES typed_queries_advanced_tests.users(id), cdn_url TEXT NOT NULL, caption TEXT NULL, - metadata JSONB NOT NULL + metadata JSONB NOT NULL, + created_at TIMESTAMPTZ, + updated_at TIMESTAMPTZ ); `, ); @@ -189,7 +222,12 @@ test('JSON values can be of any type', async () => { }, { cdn_url: 'http://example.com/f', - metadata: {whatever: 'this is great'}, + metadata: {whatever: 'this is GREAT'}, + owner_user_id: user.id, + }, + { + cdn_url: 'http://example.com/g', + metadata: {whatever: 'this is GREAT', also: {this: 'is super GREAT'}}, owner_user_id: user.id, }, ); @@ -199,6 +237,162 @@ 
test('JSON values can be of any type', async () => { ['http://example.com/c', true], ['http://example.com/d', [1, 2, 3]], ['http://example.com/e', ['foo', 'bar', 'baz']], - ['http://example.com/f', {whatever: 'this is great'}], + ['http://example.com/f', {whatever: 'this is GREAT'}], + [ + 'http://example.com/g', + {whatever: 'this is GREAT', also: {this: 'is super GREAT'}}, + ], + ]); + + expect( + await photos(db) + .find({ + metadata: anyOf([ + jsonPath(['1'], 2), + allOf([ + jsonPath(['whatever'], caseInsensitive('this is great')), + jsonPath(['also', 'this'], caseInsensitive('is super great')), + ]), + ]), + }) + .orderByAsc(`cdn_url`) + .select(`cdn_url`, `metadata`) + .all(), + ).toEqual([ + { + cdn_url: 'http://example.com/d', + metadata: [1, 2, 3], + }, + { + cdn_url: 'http://example.com/g', + metadata: { + also: {this: 'is super GREAT'}, + whatever: 'this is GREAT', + }, + }, + ]); + + expect( + await photos(db) + .find( + or( + { + cdn_url: anyOf(['http://example.com/d', 'http://example.com/g']), + metadata: jsonPath(['1'], 2), + }, + { + cdn_url: 'http://example.com/f', + metadata: jsonPath(['whatever'], caseInsensitive('this is great')), + }, + ), + ) + .orderByAsc(`cdn_url`) + .select(`cdn_url`, `metadata`) + .all(), + ).toEqual([ + { + cdn_url: 'http://example.com/d', + metadata: [1, 2, 3], + }, + { + cdn_url: 'http://example.com/f', + metadata: {whatever: 'this is GREAT'}, + }, + ]); +}); + +test('case insensitive', async () => { + const USER_NAME_A = `USERwithMIXEDcaseNAME_A`; + const USER_NAME_B = `USERwithMIXEDcaseNAME_B`; + const [USER_A, USER_B] = await users(db).insert( + {screen_name: 'userWITHmixedCASEname_A', age: 1}, + {screen_name: 'userWITHmixedCASEname_B', age: 2}, + ); + + expect( + await users(db) + .find({screen_name: anyOf([USER_NAME_A, USER_NAME_B])}) + .all(), + ).toEqual([]); + + await expectQueries(async () => { + expect( + await users(db) + .find({ + screen_name: anyOf([ + caseInsensitive(USER_NAME_A), + 
caseInsensitive(USER_NAME_B), + ]), + }) + .all(), + ).toEqual([USER_A, USER_B]); + }).toEqual([ + { + text: `SELECT * FROM "users" WHERE LOWER(CAST("screen_name" AS TEXT)) = ANY($1)`, + values: [['userwithmixedcasename_a', 'userwithmixedcasename_b']], + }, + ]); + + await expectQueries(async () => { + expect( + await users(db) + .find({ + screen_name: caseInsensitive(anyOf([USER_NAME_A, USER_NAME_B])), + }) + .all(), + ).toEqual([USER_A, USER_B]); + }).toEqual([ + { + text: `SELECT * FROM "users" WHERE LOWER(CAST("screen_name" AS TEXT)) = ANY($1)`, + values: [['userwithmixedcasename_a', 'userwithmixedcasename_b']], + }, + ]); + + await expectQueries(async () => { + expect( + await users(db) + .find({ + screen_name: anyOf([ + caseInsensitive(USER_NAME_A), + caseInsensitive(USER_NAME_B), + ]), + age: allOf([not(1), not(3)]), + }) + .all(), + ).toEqual([USER_B]); + }).toEqual([ + { + text: `SELECT * FROM "users" WHERE LOWER(CAST("screen_name" AS TEXT)) = ANY($1) AND NOT ("age" = ANY($2))`, + values: [ + ['userwithmixedcasename_a', 'userwithmixedcasename_b'], + [1, 3], + ], + }, + ]); + + await expectQueries(async () => { + expect( + await users(db) + .find( + and( + { + screen_name: anyOf([ + caseInsensitive(USER_NAME_A), + caseInsensitive(USER_NAME_B), + ]), + age: anyOf([not(1), not(2)]), + }, + { + age: anyOf([not(greaterThan(5)), not(lessThan(10))]), + }, + ), + ) + .all(), + ).toEqual([USER_A, USER_B]); + }).toEqual([ + { + text: `SELECT * FROM "users" WHERE LOWER(CAST("screen_name" AS TEXT)) = ANY($1) AND NOT (("age" > $2 AND "age" < $3))`, + values: [['userwithmixedcasename_a', 'userwithmixedcasename_b'], 5, 10], + }, ]); }); diff --git a/packages/pg-typed/src/__tests__/bulk-insert.test.pg.ts b/packages/pg-typed/src/__tests__/bulk-insert.test.pg.ts index a017d65c..44500b39 100644 --- a/packages/pg-typed/src/__tests__/bulk-insert.test.pg.ts +++ b/packages/pg-typed/src/__tests__/bulk-insert.test.pg.ts @@ -21,7 +21,9 @@ test('create schema', async () => { id 
BIGSERIAL NOT NULL PRIMARY KEY, screen_name TEXT UNIQUE NOT NULL, bio TEXT, - age INT DEFAULT 42 + age INT DEFAULT 42, + created_at TIMESTAMPTZ, + updated_at TIMESTAMPTZ ); `, ); @@ -38,20 +40,26 @@ test('create multiple users per statement using non-bulk API', async () => { Object { "age": 42, "bio": null, + "created_at": null, "id": 1, "screen_name": "Forbes", + "updated_at": null, }, Object { "age": 10, "bio": null, + "created_at": null, "id": 2, "screen_name": "Ellie", + "updated_at": null, }, Object { "age": 42, "bio": "Hello world", + "created_at": null, "id": 3, "screen_name": "John", + "updated_at": null, }, ] `); @@ -66,8 +74,10 @@ test('create multiple users per statement using non-bulk API', async () => { Object { "age": 42, "bio": null, + "created_at": null, "id": 5, "screen_name": "Martin", + "updated_at": null, }, undefined, undefined, @@ -86,26 +96,34 @@ test('create multiple users per statement using non-bulk API', async () => { Object { "age": 20, "bio": null, + "created_at": null, "id": 1, "screen_name": "Forbes", + "updated_at": null, }, Object { "age": 42, "bio": null, + "created_at": null, "id": 5, "screen_name": "Martin", + "updated_at": null, }, Object { "age": 5, "bio": null, + "created_at": null, "id": 2, "screen_name": "Ellie", + "updated_at": null, }, Object { "age": 42, "bio": "Whatever world", + "created_at": null, "id": 3, "screen_name": "John", + "updated_at": null, }, ] `); @@ -116,13 +134,14 @@ test('create users in bulk', async () => { for (let i = 0; i < 50_000; i++) { names.push(`bulk_insert_name_${i}`); } - await users(db).bulkInsert({ + const inserted = await users(db).bulkInsert({ columnsToInsert: [`age`], records: names.map((n) => ({screen_name: n, age: 42})), }); - // await users(db).insert( - // ...names.map((n) => ({screen_name: n, age: 42})), - // ); + + expect(inserted.map((i) => i.screen_name)).toEqual(names); + expect(inserted.map((i) => i.age)).toEqual(names.map(() => 42)); + const records = await users(db) 
.find({screen_name: anyOf(names)}) .orderByAsc(`screen_name`) @@ -152,7 +171,7 @@ test('query users in bulk', async () => { }); test('update users in bulk', async () => { - await users(db).bulkUpdate({ + const updated = await users(db).bulkUpdate({ whereColumnNames: [`screen_name`, `age`], setColumnNames: [`age`], updates: [ @@ -161,6 +180,10 @@ test('update users in bulk', async () => { {where: {screen_name: `bulk_insert_name_12`, age: 32}, set: {age: 3}}, ], }); + expect(updated.map(({screen_name, age}) => ({screen_name, age}))).toEqual([ + {screen_name: `bulk_insert_name_10`, age: 1}, + {screen_name: `bulk_insert_name_11`, age: 2}, + ]); expect( await users(db) .find({ diff --git a/packages/pg-typed/src/__tests__/count.test.pg.ts b/packages/pg-typed/src/__tests__/count.test.pg.ts index f08f2ae9..f5631751 100644 --- a/packages/pg-typed/src/__tests__/count.test.pg.ts +++ b/packages/pg-typed/src/__tests__/count.test.pg.ts @@ -24,7 +24,9 @@ test('create schema', async () => { id BIGSERIAL NOT NULL PRIMARY KEY, screen_name TEXT UNIQUE NOT NULL, bio TEXT, - age INT + age INT, + created_at TIMESTAMPTZ, + updated_at TIMESTAMPTZ ); `, ); diff --git a/packages/pg-typed/src/__tests__/index.test.pg.ts b/packages/pg-typed/src/__tests__/index.test.pg.ts index 0284d3fe..61f586de 100644 --- a/packages/pg-typed/src/__tests__/index.test.pg.ts +++ b/packages/pg-typed/src/__tests__/index.test.pg.ts @@ -20,14 +20,18 @@ test('create schema', async () => { id BIGSERIAL NOT NULL PRIMARY KEY, screen_name TEXT UNIQUE NOT NULL, bio TEXT, - age INT + age INT, + created_at TIMESTAMPTZ, + updated_at TIMESTAMPTZ ); CREATE TABLE typed_queries_tests.photos ( id BIGSERIAL NOT NULL PRIMARY KEY, owner_user_id BIGINT NOT NULL REFERENCES typed_queries_tests.users(id), cdn_url TEXT NOT NULL, caption TEXT NULL, - metadata JSONB NOT NULL + metadata JSONB NOT NULL, + created_at TIMESTAMPTZ, + updated_at TIMESTAMPTZ ); `, ); @@ -264,3 +268,60 @@ test('use a default connection', async () => { `"You must 
either provide a \\"defaultConnection\\" to pg-typed, or specify a connection when accessing the table."`, ); }); + +test('insert or update', async () => { + const time1 = new Date('2021-01-10T11:00:00.000Z'); + const time2 = new Date('2021-01-10T20:00:00.000Z'); + const time3 = new Date('2021-01-10T23:00:00.000Z'); + + const USER_A = 'insert_or_update_a'; + const USER_B = 'insert_or_update_b'; + + function exp(user: any, {created, updated}: {created: Date; updated: Date}) { + expect({ + created: user.created_at.toISOString(), + updated: user.updated_at.toISOString(), + }).toEqual({ + created: created.toISOString(), + updated: updated.toISOString(), + }); + } + + const [userA1] = await users(db).insert({ + screen_name: USER_A, + created_at: time1, + updated_at: time1, + }); + exp(userA1, {created: time1, updated: time1}); + const [userA2, userB2] = await users(db).insertOrUpdate( + {onConflict: [`screen_name`], set: ['screen_name', 'updated_at']}, + { + screen_name: USER_A, + created_at: time2, + updated_at: time2, + }, + { + screen_name: USER_B, + created_at: time2, + updated_at: time2, + }, + ); + exp(userA2, {created: time1, updated: time2}); + exp(userB2, {created: time2, updated: time2}); + + const [userA3, userB3] = await users(db).insertOrUpdate( + {onConflict: [`screen_name`], doNotSet: ['created_at']}, + { + screen_name: USER_A, + created_at: time3, + updated_at: time3, + }, + { + screen_name: USER_B, + created_at: time3, + updated_at: time3, + }, + ); + exp(userA3, {created: time1, updated: time3}); + exp(userB3, {created: time2, updated: time3}); +}); diff --git a/packages/pg-typed/src/index.ts b/packages/pg-typed/src/index.ts index 6fd6d0c1..c3f4ce71 100644 --- a/packages/pg-typed/src/index.ts +++ b/packages/pg-typed/src/index.ts @@ -1,3 +1,4 @@ +import assertNever from 'assert-never'; import {SQLQuery, Queryable} from '@databases/pg'; import { bulkInsert, @@ -74,20 +75,31 @@ export interface OrderedSelectQuery extends SelectQuery { limit(count: 
number): Promise; } +type SpecialFieldQuery = + | { + type: 'json_path'; + path: readonly string[]; + query: T | FieldQuery; + } + | {type: 'case_insensitive'; query: T | FieldQuery} + | {type: 'not'; query: T | FieldQuery}; class FieldQuery { protected readonly __query: ( - columnName: string, + columnName: SQLQuery, sql: Queryable['sql'], - toValue: (columnName: string, value: unknown) => unknown, + toValue: (value: unknown) => unknown, ) => SQLQuery; + protected readonly __special: SpecialFieldQuery | undefined; constructor( query: ( - columnName: string, + columnName: SQLQuery, sql: Queryable['sql'], - toValue: (columnName: string, value: unknown) => unknown, + toValue: (value: unknown) => unknown, ) => SQLQuery, + special?: SpecialFieldQuery, ) { this.__query = query; + this.__special = special; } protected __checkFieldType(): T { throw new Error( @@ -95,34 +107,48 @@ class FieldQuery { ); } static query( - columnName: string, + columnName: SQLQuery, q: FieldQuery | unknown, sql: Queryable['sql'], - toValue: (columnName: string, value: unknown) => unknown, + toValue: (value: unknown) => unknown, ): SQLQuery { if (q === null) { - return sql`${sql.ident(columnName)} IS NULL`; + return sql`${columnName} IS NULL`; } if (q && q instanceof FieldQuery) { return q.__query(columnName, sql, toValue); } - return sql`${sql.ident(columnName)} = ${toValue(columnName, q)}`; + return sql`${columnName} = ${toValue(q)}`; + } + static getSpecial(q: T | FieldQuery) { + if (q && q instanceof FieldQuery) { + return q.__special; + } else { + return undefined; + } } } -export type {FieldQuery}; +const FALSE_FIELD_QUERY = new FieldQuery((_columnName, sql) => sql`FALSE`); +const TRUE_FIELD_QUERY = new FieldQuery((_columnName, sql) => sql`TRUE`); -export type WhereCondition = Partial< - {readonly [key in keyof TRecord]: TRecord[key] | FieldQuery} ->; +export type {FieldQuery}; export function anyOf(values: { [Symbol.iterator](): IterableIterator>; -}) { - const valuesArray: any[] = []; 
+}): T | FieldQuery { + const valuesSet = new Set(); const parts: FieldQuery[] = []; + const caseInsensitiveParts: (T | FieldQuery)[] = []; + const negatedParts: (T | FieldQuery)[] = []; for (const value of values) { + if (value === TRUE_FIELD_QUERY) { + return TRUE_FIELD_QUERY; + } + if (value === FALSE_FIELD_QUERY) { + continue; + } if (value === null) { parts.push( new FieldQuery((columnName, sql, toValue) => @@ -130,23 +156,50 @@ export function anyOf(values: { ), ); } else if (value instanceof FieldQuery) { - parts.push(value); + const special = FieldQuery.getSpecial(value); + if (special?.type === 'case_insensitive') { + caseInsensitiveParts.push(special.query); + } else if (special?.type === 'not') { + negatedParts.push(special.query); + } else { + parts.push(value); + } } else { - valuesArray.push(value); + valuesSet.add(value); } } - if (valuesArray.length) { - parts.push( - new FieldQuery( - (columnName, sql, toValue) => - sql`${sql.ident(columnName)} = ANY(${valuesArray.map((v) => - toValue(columnName, v), - )})`, - ), - ); + if (caseInsensitiveParts.length) { + parts.push(caseInsensitive(anyOf(caseInsensitiveParts) as any) as any); + } + if (negatedParts.length) { + const negated = not(allOf(negatedParts)); + if (negated && negated instanceof FieldQuery) { + parts.push(negated); + } else { + valuesSet.add(negated); + } + } + if (valuesSet.size) { + if (valuesSet.size === 1 && parts.length === 0) { + return [...valuesSet][0]; + } + if (valuesSet.size === 1) { + parts.push( + new FieldQuery((columnName, sql, toValue) => + FieldQuery.query(columnName, [...valuesSet][0], sql, toValue), + ), + ); + } else { + parts.push( + new FieldQuery( + (columnName, sql, toValue) => + sql`${columnName} = ANY(${[...valuesSet].map((v) => toValue(v))})`, + ), + ); + } } if (parts.length === 0) { - return new FieldQuery((_columnName, sql) => sql`FALSE`); + return FALSE_FIELD_QUERY; } if (parts.length === 1) { return parts[0]; @@ -160,37 +213,236 @@ export function 
anyOf(values: { ); } -export function not(value: T | FieldQuery) { +export function allOf(values: { + [Symbol.iterator](): IterableIterator>; +}): T | FieldQuery { + const valuesSet = new Set(); + const ordinaryParts: FieldQuery[] = []; + const negated: (T | FieldQuery)[] = []; + for (const q of values) { + if (q === FALSE_FIELD_QUERY) { + return FALSE_FIELD_QUERY; + } + if (q === TRUE_FIELD_QUERY) { + continue; + } + if (q && q instanceof FieldQuery) { + const special = FieldQuery.getSpecial(q); + if (special?.type === 'not') { + negated.push(special.query); + } else { + ordinaryParts.push(q); + } + } else { + valuesSet.add(q); + } + } + if (negated.length) { + const n = not(anyOf(negated)); + if (n && n instanceof FieldQuery) { + ordinaryParts.push(n); + } else { + valuesSet.add(n); + } + } + if (valuesSet.size > 1) { + return FALSE_FIELD_QUERY; + } else if (valuesSet.size) { + ordinaryParts.push( + new FieldQuery((columnName, sql, toValue) => + FieldQuery.query(columnName, [...valuesSet][0], sql, toValue), + ), + ); + } + if (ordinaryParts.length === 0) { + return TRUE_FIELD_QUERY; + } + if (ordinaryParts.length === 1) { + return ordinaryParts[0]; + } + return new FieldQuery( + (columnName, sql, toValue) => + sql`(${sql.join( + ordinaryParts.map((p) => FieldQuery.query(columnName, p, sql, toValue)), + ' AND ', + )})`, + ); +} + +export function not(value: T | FieldQuery): T | FieldQuery { + if (value === TRUE_FIELD_QUERY) { + return FALSE_FIELD_QUERY; + } else if (value === FALSE_FIELD_QUERY) { + return TRUE_FIELD_QUERY; + } + const special = FieldQuery.getSpecial(value); + if (special?.type === 'not') { + return special.query; + } return new FieldQuery( (columnName, sql, toValue) => sql`NOT (${FieldQuery.query(columnName, value, sql, toValue)})`, + {type: 'not', query: value}, ); } -function internalInQueryResults(query: (sql: Queryable['sql']) => SQLQuery) { +function internalInQueryResults( + query: (sql: Queryable['sql']) => SQLQuery | FieldQuery, +): 
FieldQuery { return new FieldQuery( - (columnName, sql) => sql`${sql.ident(columnName)} IN (${query(sql)})`, + (columnName, sql) => sql`${columnName} IN (${query(sql)})`, ); } -export function inQueryResults(query: SQLQuery) { +export function inQueryResults(query: SQLQuery): FieldQuery { return internalInQueryResults(() => query); } -export function lessThan(value: T) { +export function lessThan(value: T): FieldQuery { return new FieldQuery( - (columnName, sql, toValue) => - sql`${sql.ident(columnName)} < ${toValue(columnName, value)}`, + (columnName, sql, toValue) => sql`${columnName} < ${toValue(value)}`, ); } -export function greaterThan(value: T) { +export function greaterThan(value: T): FieldQuery { return new FieldQuery( + (columnName, sql, toValue) => sql`${columnName} > ${toValue(value)}`, + ); +} + +export function jsonPath( + path: readonly string[], + query: any | FieldQuery, +): FieldQuery { + return new FieldQuery( + (columnName, sql, toValue) => + FieldQuery.query(sql`${columnName}#>${path}`, query, sql, toValue), + {type: 'json_path', path, query}, + ); +} + +export function caseInsensitive( + query: string | FieldQuery, +): FieldQuery { + const special = FieldQuery.getSpecial(query); + if (special?.type === 'json_path') { + return jsonPath(special.path, caseInsensitive(special.query)); + } + return new FieldQuery( (columnName, sql, toValue) => - sql`${sql.ident(columnName)} > ${toValue(columnName, value)}`, + FieldQuery.query( + sql`LOWER(CAST(${columnName} AS TEXT))`, + query, + sql, + (v) => `${toValue(v)}`.toLowerCase(), + ), + {type: 'case_insensitive', query}, ); } +class WhereCombinedCondition { + protected readonly __conditions: readonly WhereCondition[]; + protected readonly __combiner: 'AND' | 'OR'; + constructor( + conditions: readonly WhereCondition[], + combiner: 'AND' | 'OR', + ) { + this.__conditions = conditions; + this.__combiner = combiner; + } + static query( + q: WhereCombinedCondition | WhereCondition, + sql: Queryable['sql'], + 
toValue: (columnName: string, value: unknown) => unknown, + parentType?: 'AND' | 'OR', + ): SQLQuery | 'TRUE' | 'FALSE' { + if (q instanceof WhereCombinedCondition) { + const conditions = q.__conditions.map((c) => + WhereCombinedCondition.query(c, sql, toValue, q.__combiner), + ); + const significantConditions: SQLQuery[] = []; + switch (q.__combiner) { + case 'AND': { + for (const c of conditions) { + if (c === 'FALSE') { + return 'FALSE'; + } else if (c !== 'TRUE') { + significantConditions.push(c); + } + } + if (!significantConditions.length) { + return 'TRUE'; + } + if (significantConditions.length === 1) { + return significantConditions[0]; + } + const query = sql.join(significantConditions, sql` AND `); + return parentType === 'OR' ? sql`(${query})` : query; + } + case 'OR': { + for (const c of conditions) { + if (c === 'TRUE') { + return 'TRUE'; + } else if (c !== 'FALSE') { + significantConditions.push(c); + } + } + if (!significantConditions.length) { + return 'FALSE'; + } + if (significantConditions.length === 1) { + return significantConditions[0]; + } + const query = sql.join(significantConditions, sql` OR `); + return parentType === 'AND' ? sql`(${query})` : query; + } + default: + return assertNever(q.__combiner); + } + } + + const entries = Object.entries(q); + const fieldTests: SQLQuery[] = []; + for (const [columnName, value] of entries) { + if (value === FALSE_FIELD_QUERY) { + return 'FALSE'; + } else if (value !== TRUE_FIELD_QUERY) { + fieldTests.push( + FieldQuery.query(sql.ident(columnName), value, sql, (v) => + toValue(columnName, v), + ), + ); + } + } + if (fieldTests.length === 0) { + return 'TRUE'; + } + if (fieldTests.length === 1) { + return fieldTests[0]; + } + const query = sql.join(fieldTests, sql` AND `); + return parentType === 'OR' ? 
sql`(${query})` : query; + } +} +export type {WhereCombinedCondition}; + +export type WhereCondition = + | Partial< + {readonly [key in keyof TRecord]: TRecord[key] | FieldQuery} + > + | WhereCombinedCondition; + +export function and( + ...conditions: readonly WhereCondition[] +): WhereCondition { + return new WhereCombinedCondition(conditions, 'AND'); +} +export function or( + ...conditions: readonly WhereCondition[] +): WhereCondition { + return new WhereCombinedCondition(conditions, 'OR'); +} + interface SelectQueryOptions { selectColumnNames: readonly string[] | undefined; distinctColumnNames: readonly string[]; @@ -358,31 +610,6 @@ type BulkInsertRecord< TKey extends keyof TInsertParameters, > = BulkRecord>; -function rowToCondition( - row: WhereCondition, - sql: Queryable['sql'], - toValue: (columnName: string, value: unknown) => unknown, -): SQLQuery | null { - const entries = Object.entries(row).filter((row) => row[1] !== undefined); - if (entries.length === 0) { - return null; - } - return sql.join( - entries.map(([columnName, value]) => - FieldQuery.query(columnName, value, sql, toValue), - ), - sql` AND `, - ); -} -function rowToWhere( - row: WhereCondition, - sql: Queryable['sql'], - toValue: (columnName: string, value: unknown) => unknown, -): SQLQuery { - const condition = rowToCondition(row, sql, toValue); - return condition ? 
sql`WHERE ${condition}` : sql``; -} - type BulkOperationOptionsBase< TColumnName extends string | number | symbol, TInsertColumnName extends string | number | symbol, @@ -463,8 +690,12 @@ class Table { TInsertParameters, TColumnsToInsert[number] >[]; - }) { - await bulkInsert({ + }): Promise { + if (records.length === 0) { + return []; + } + const {sql} = this._underlyingDb; + return await bulkInsert({ ...this._getBulkOperationOptions(), columnsToInsert: [ ...new Set([ @@ -473,6 +704,7 @@ class Table { ]), ].sort(), records, + returning: sql`${this.tableId}.*`, }); } @@ -487,12 +719,14 @@ class Table { >[]; }): UnorderedSelectQuery { const bulkOperationOptions = this._getBulkOperationOptions(); - return this.findUntyped( - bulkCondition({ - ...bulkOperationOptions, - whereColumnNames, - whereConditions, - }), + return this._findUntyped( + whereConditions.length + ? bulkCondition({ + ...bulkOperationOptions, + whereColumnNames, + whereConditions, + }) + : 'FALSE', ); } @@ -510,12 +744,17 @@ class Table { readonly where: BulkRecord; readonly set: BulkRecord; }[]; - }) { - await bulkUpdate({ + }): Promise { + if (updates.length === 0) { + return []; + } + const {sql} = this._underlyingDb; + return await bulkUpdate({ ...this._getBulkOperationOptions(), whereColumnNames, setColumnNames, updates, + returning: sql`${this.tableId}.*`, }); } @@ -531,6 +770,9 @@ class Table { TWhereColumns[number] >[]; }) { + if (whereConditions.length === 0) { + return; + } await bulkDelete({ ...this._getBulkOperationOptions(), whereColumnNames, @@ -605,7 +847,18 @@ class Table { } async insertOrUpdate( - conflictKeys: readonly [keyof TRecord, ...(keyof TRecord)[]], + options: + | readonly [keyof TRecord, ...(keyof TRecord)[]] + | { + onConflict: readonly [keyof TRecord, ...(keyof TRecord)[]]; + set?: readonly [keyof TRecord, ...(keyof TRecord)[]]; + doNotSet?: undefined; + } + | { + onConflict: readonly [keyof TRecord, ...(keyof TRecord)[]]; + set?: undefined; + doNotSet?: readonly [keyof 
TRecord, ...(keyof TRecord)[]]; + }, ...rows: keyof TRecordsToInsert[number] extends keyof TInsertParameters ? TRecordsToInsert : readonly ({[key in keyof TInsertParameters]: TInsertParameters[key]} & @@ -616,20 +869,36 @@ class Table { >]: never; })[] ): Promise<{-readonly [key in keyof TRecordsToInsert]: TRecord}> { + const getOption = ( + k: 'onConflict' | 'set' | 'doNotSet', + ): readonly (keyof TRecord)[] | undefined => { + return Array.isArray(options) ? undefined : (options as any)[k]; + }; + const conflictKeys = + getOption('onConflict') ?? (options as readonly (keyof TRecord)[]); + const set = getOption('set'); + const doNotSet = getOption('doNotSet'); + const {sql} = this._underlyingDb; - return this._insert( - (columnNames) => - sql`ON CONFLICT (${sql.join( - conflictKeys.map((k) => sql.ident(k)), - sql`, `, - )}) DO UPDATE SET ${sql.join( - columnNames.map( - (key) => sql`${sql.ident(key)}=EXCLUDED.${sql.ident(key)}`, - ), - sql`, `, - )}`, - ...rows, - ) as any; + return this._insert((columnNames) => { + let updateKeys: readonly (string | number | symbol)[] = columnNames; + if (set) { + updateKeys = set; + } + if (doNotSet) { + const keysNotToSet = new Set(doNotSet); + updateKeys = updateKeys.filter((key) => !keysNotToSet.has(key)); + } + return sql`ON CONFLICT (${sql.join( + conflictKeys.map((k) => sql.ident(k)), + sql`, `, + )}) DO UPDATE SET ${sql.join( + updateKeys.map( + (key) => sql`${sql.ident(key)}=EXCLUDED.${sql.ident(key)}`, + ), + sql`, `, + )}`; + }, ...rows) as any; } async insertOrIgnore( @@ -654,7 +923,14 @@ class Table { updateValues: Partial, ): Promise { const {sql} = this._underlyingDb; - const where = rowToWhere(whereValues, sql, this._value); + const whereConditions = WhereCombinedCondition.query( + whereValues, + sql, + this._value, + ); + if (whereConditions === `FALSE`) { + return []; + } const setClause = sql.join( Object.entries(updateValues).map(([columnName, value]) => { return sql`${sql.ident(columnName)} = ${this._value( 
@@ -664,15 +940,31 @@ class Table { }), sql`, `, ); - return await this.untypedQuery( - sql`UPDATE ${this.tableId} SET ${setClause} ${where} RETURNING *`, - ); + if (whereConditions === 'TRUE') { + return await this.untypedQuery( + sql`UPDATE ${this.tableId} SET ${setClause} RETURNING *`, + ); + } else { + return await this.untypedQuery( + sql`UPDATE ${this.tableId} SET ${setClause} WHERE ${whereConditions} RETURNING *`, + ); + } } async delete(whereValues: WhereCondition): Promise { const {sql} = this._underlyingDb; - const where = rowToWhere(whereValues, sql, this._value); - await this.untypedQuery(sql`DELETE FROM ${this.tableId} ${where}`); + const whereConditions = WhereCombinedCondition.query( + whereValues, + sql, + this._value, + ); + if (whereConditions === 'TRUE') { + await this.untypedQuery(sql`DELETE FROM ${this.tableId}`); + } else if (whereConditions !== 'FALSE') { + await this.untypedQuery( + sql`DELETE FROM ${this.tableId} WHERE ${whereConditions}`, + ); + } } /** @@ -684,7 +976,9 @@ class Table { return this.find(whereValues); } - findUntyped(whereCondition: SQLQuery | null): UnorderedSelectQuery { + private _findUntyped( + whereCondition: SQLQuery | 'TRUE' | 'FALSE', + ): UnorderedSelectQuery { const {sql} = this._underlyingDb; return new SelectQueryImplementation( this.tableName, @@ -694,6 +988,7 @@ class Table { limit: limitCount, distinctColumnNames, }) => { + if (whereCondition === 'FALSE') return []; return await this._underlyingDb.query( sql.join( [ @@ -711,7 +1006,7 @@ class Table { ) : sql`*`, sql`FROM ${this.tableId}`, - whereCondition ? sql`WHERE ${whereCondition}` : null, + whereCondition !== 'TRUE' ? sql`WHERE ${whereCondition}` : null, orderByQueries.length ? 
sql`ORDER BY ${sql.join( orderByQueries.map((q) => @@ -735,7 +1030,9 @@ class Table { whereValues: WhereCondition = {}, ): UnorderedSelectQuery { const {sql} = this._underlyingDb; - return this.findUntyped(rowToCondition(whereValues, sql, this._value)); + return this._findUntyped( + WhereCombinedCondition.query(whereValues, sql, this._value), + ); } /** @@ -758,11 +1055,24 @@ class Table { async count(whereValues: WhereCondition = {}): Promise { const {sql} = this._underlyingDb; - const where = rowToWhere(whereValues, sql, this._value); - const [result] = await this._underlyingDb.query( - sql`SELECT count(*) AS count FROM ${this.tableId} ${where}`, + const whereCondition = WhereCombinedCondition.query( + whereValues, + sql, + this._value, ); - return parseInt(result.count, 10); + if (whereCondition === `FALSE`) { + return 0; + } else if (whereCondition === `TRUE`) { + const [result] = await this._underlyingDb.query( + sql`SELECT count(*) AS count FROM ${this.tableId}`, + ); + return parseInt(result.count, 10); + } else { + const [result] = await this._underlyingDb.query( + sql`SELECT count(*) AS count FROM ${this.tableId} WHERE ${whereCondition}`, + ); + return parseInt(result.count, 10); + } } async untypedQuery(query: SQLQuery): Promise { @@ -832,14 +1142,25 @@ function getTable( fieldName: TKey, condition: WhereCondition = {}, ): FieldQuery => - internalInQueryResults( - (sql) => - sql`SELECT ${sql.ident(fieldName)} FROM ${ - schemaName - ? sql.ident(schemaName, tableName) - : sql.ident(tableName) - } ${rowToWhere(condition, sql, serializeValue)}`, - ), + internalInQueryResults((sql) => { + const whereCondition = WhereCombinedCondition.query( + condition, + sql, + serializeValue, + ); + if (whereCondition === 'FALSE') { + return FALSE_FIELD_QUERY; + } + const tableId = schemaName + ? 
sql.ident(schemaName, tableName) + : sql.ident(tableName); + const fieldId = sql.ident(fieldName); + if (whereCondition === 'TRUE') { + return sql`SELECT ${fieldId} FROM ${tableId}`; + } else { + return sql`SELECT ${fieldId} FROM ${tableId} WHERE ${whereCondition}`; + } + }), }, ); } @@ -999,10 +1320,15 @@ function getTableSerializeValue( module.exports = Object.assign(defineTables, { default: defineTables, anyOf, + allOf, not, inQueryResults, lessThan, + jsonPath, + caseInsensitive, greaterThan, + and, + or, isNoResultFoundError, isMultipleResultsFoundError, });
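
The new `anyOf`/`allOf`/`not` implementations in this diff collapse constant-true and constant-false conditions before any SQL is generated: `TRUE_FIELD_QUERY` absorbs an OR and is dropped from an AND, `FALSE_FIELD_QUERY` does the reverse, double negation cancels, and an empty `anyOf` matches nothing while an empty `allOf` matches everything (which is why `bulkFind` with no conditions short-circuits to `FALSE` and `find({})` emits no `WHERE` clause). A minimal standalone sketch of those simplification rules, using a hypothetical `Cond` discriminated union in place of the real `FieldQuery` class (the names below are illustrative, not the pg-typed API):

```typescript
// Illustrative condition type standing in for pg-typed's FieldQuery.
type Cond =
  | {kind: 'TRUE'}
  | {kind: 'FALSE'}
  | {kind: 'atom'; sql: string}
  | {kind: 'not'; inner: Cond}
  | {kind: 'or'; parts: Cond[]}
  | {kind: 'and'; parts: Cond[]};

const TRUE: Cond = {kind: 'TRUE'};
const FALSE: Cond = {kind: 'FALSE'};
const atom = (sql: string): Cond => ({kind: 'atom', sql});

function not(c: Cond): Cond {
  if (c.kind === 'TRUE') return FALSE; // NOT TRUE => FALSE
  if (c.kind === 'FALSE') return TRUE; // NOT FALSE => TRUE
  if (c.kind === 'not') return c.inner; // not(not(q)) => q
  return {kind: 'not', inner: c};
}

function anyOf(parts: Cond[]): Cond {
  const kept: Cond[] = [];
  for (const p of parts) {
    if (p.kind === 'TRUE') return TRUE; // TRUE short-circuits an OR
    if (p.kind !== 'FALSE') kept.push(p); // FALSE contributes nothing to an OR
  }
  if (kept.length === 0) return FALSE; // empty OR matches no rows
  return kept.length === 1 ? kept[0] : {kind: 'or', parts: kept};
}

function allOf(parts: Cond[]): Cond {
  const kept: Cond[] = [];
  for (const p of parts) {
    if (p.kind === 'FALSE') return FALSE; // FALSE short-circuits an AND
    if (p.kind !== 'TRUE') kept.push(p); // TRUE contributes nothing to an AND
  }
  if (kept.length === 0) return TRUE; // empty AND matches every row
  return kept.length === 1 ? kept[0] : {kind: 'and', parts: kept};
}
```

Simplifying eagerly like this is what lets the real implementation return a single-element condition unchanged (`anyOf([x])` is just `x`) and skip the query entirely when the combined condition reduces to `FALSE`, as `_findUntyped`, `update`, `delete`, and `count` now do.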