Add pg-mig README and reuse common files, reshuffle files to internal/
dimikot committed Jan 15, 2024
1 parent 2f8abbd commit db5d09a
Showing 38 changed files with 1,166 additions and 103 deletions.
435 changes: 435 additions & 0 deletions .eslintrc.base.js

Large diffs are not rendered by default.

8 changes: 6 additions & 2 deletions .eslintrc.js
@@ -1,5 +1,9 @@
"use strict";
-module.exports = require("../../.eslintrc.base.js")(__dirname, {
+module.exports = require("./.eslintrc.base.js")(__dirname, {
  "import/no-extraneous-dependencies": "error",
+  "lodash/import-scope": ["error", "method"],
+  "@typescript-eslint/explicit-function-return-type": [
+    "error",
+    { allowExpressions: true, allowedNames: ["configure"] },
+  ],
-  "lodash/import-scope": ["error", "method"]
});
26 changes: 26 additions & 0 deletions .github/workflows/ci.yml
@@ -0,0 +1,26 @@
name: "CI Full Run"
on:
  pull_request:
    branches:
      - main
      - grok/*/*
  push:
    branches:
      - main
jobs:
  test:
    runs-on: ubuntu-latest
    strategy:
      matrix:
        node-version: ["20.x"]
    steps:
      - uses: actions/checkout@v4
      - name: Use Node.js ${{ matrix.node-version }}
        uses: actions/setup-node@v3
        with:
          node-version: ${{ matrix.node-version }}
      - run: npm install -g pnpm --force
      - run: pnpm install
      - run: pnpm run build
      - run: pnpm run lint
      - run: pnpm run test
36 changes: 36 additions & 0 deletions .github/workflows/semgrep.yml
@@ -0,0 +1,36 @@
# Name of this GitHub Actions workflow.
name: Semgrep

on:
  # Scan changed files in PRs (diff-aware scanning):
  pull_request:
    branches: ['main']

  # Schedule the CI job (this method uses cron syntax):
  schedule:
    - cron: '0 0 * * MON-FRI'

jobs:
  semgrep:
    # User definable name of this GitHub Actions job.
    name: Scan
    # If you are self-hosting, change the following `runs-on` value:
    runs-on: ubuntu-latest

    container:
      # A Docker image with Semgrep installed. Do not change this.
      image: returntocorp/semgrep@sha256:6c7ab81e4d1fd25a09f89f1bd52c984ce107c6ff33affef6ca3bc626a4cc479b

    # Skip any PR created by dependabot to avoid permission issues:
    if: (github.actor != 'dependabot[bot]')

    steps:
      # Fetch project source with GitHub Actions Checkout.
      - uses: actions/checkout@f43a0e5ff2bd294095638e18286ca9a3d1956744 # v3.6.0
      # Run the "semgrep ci" command on the command line of the docker image.
      - run: semgrep ci
        env:
          # Connect to Semgrep Cloud Platform through your SEMGREP_APP_TOKEN.
          # Generate a token from Semgrep Cloud Platform > Settings
          # and add it to your GitHub secrets.
          SEMGREP_APP_TOKEN: ${{ secrets.SEMGREP_APP_TOKEN }}
11 changes: 11 additions & 0 deletions .gitignore
@@ -0,0 +1,11 @@
dist

# Common in both .gitignore and .npmignore
node_modules
package-lock.json
yarn.lock
pnpm-lock.yaml
.DS_Store
*.log
*.tmp
*.swp
14 changes: 14 additions & 0 deletions .npmignore
@@ -0,0 +1,14 @@
dist/__tests__
dist/**/__tests__
dist/tsconfig.tsbuildinfo
.npmrc

# Common in both .gitignore and .npmignore
node_modules
package-lock.json
yarn.lock
pnpm-lock.yaml
.DS_Store
*.log
*.tmp
*.swp
1 change: 1 addition & 0 deletions .npmrc
@@ -0,0 +1 @@
# Published to https://www.npmjs.com
8 changes: 8 additions & 0 deletions .vscode/extensions.json
@@ -0,0 +1,8 @@
{
  "recommendations": [
    "dbaeumer.vscode-eslint",
    "esbenp.prettier-vscode",
    "mhutchie.git-graph",
    "trentrand.git-rebase-shortcuts"
  ]
}
20 changes: 20 additions & 0 deletions .vscode/tasks.json
@@ -0,0 +1,20 @@
{
  "version": "2.0.0",
  "tasks": [
    {
      "label": "git grok: push local commits as individual PRs",
      "detail": "Install git-grok first: https://github.com/dimikot/git-grok",
      "type": "shell",
      "command": "git grok",
      "problemMatcher": [],
      "hide": false
    },
    {
      "label": "git rebase --interactive",
      "detail": "Opens a UI for interactive rebase (install \"Git rebase shortcuts\" extension).",
      "type": "shell",
      "command": "GIT_EDITOR=\"code --wait\" git rebase -i",
      "problemMatcher": []
    }
  ]
}
126 changes: 126 additions & 0 deletions README.md
@@ -0,0 +1,126 @@
# @clickup/pg-mig: PostgreSQL schema migration tool with micro-sharding and clustering support

See also [Full API documentation](https://github.com/clickup/pg-mig/blob/master/docs/modules.md).

![CI run](https://github.com/clickup/pg-mig/actions/workflows/ci.yml/badge.svg?branch=main)

The tool lets you create a PostgreSQL database schema (with tables, indexes,
sequences, functions, etc.) and apply it consistently across multiple PG hosts,
and even across multiple micro-shard schemas on those hosts. The behavior is
transactional per micro-shard per version ("all or nothing").

In other words, **pg-mig** helps keep your database clusters' schemas identical:
each micro-shard schema ends up with exactly the same DDL structure as any other
schema on all other PG hosts.

# Usage

```
pg-mig
  [--migdir=path/to/my-migrations/directory]
  [--hosts=master1,master2,...]
  [--port=5432]
  [--user=user-which-can-apply-ddl]
  [--pass=password]
  [--db=my-database-name]
  [--undo=20191107201239.my-migration-name.sh]
  [--make=my-migration-name@sh]
  [--parallelism=8]
  [--dry]
  [--list]
  [--ci]
```

All of the arguments are optional: the tool falls back to the `PGHOST`,
`PGPORT`, `PGUSER`, `PGPASSWORD`, and `PGDATABASE` environment variables, which
are standard for tools like `psql`.

It also uses the `PGMIGDIR` environment variable as the default value for the
`--migdir` option.

When run in the default mode, **pg-mig** reads the migration version `*.up.sql`
files from the migration directory and applies them, in order, to all of the
hosts passed, first checking whether each version has already been applied. See
below for more details.
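
For example, a typical invocation might rely entirely on the environment. This
is a minimal sketch; the host, credentials, and directory below are
placeholders:

```
# Standard libpq-style variables, used when the corresponding
# command-line options are omitted.
export PGHOST=127.0.0.1
export PGPORT=5432
export PGUSER=postgres
export PGPASSWORD=postgres
export PGDATABASE=mydb

# Default value for the --migdir option.
export PGMIGDIR=./migrations

# With no arguments, pg-mig applies all pending *.up.sql migrations.
pg-mig
```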

## Migration Version File Format

Migration version file names have the following format (examples):

```
20191107201239.add-table-abc.sh0000.up.sql
20191107201239.add-table-abc.sh0000.dn.sql
20231317204837.some-other-name.sh.up.sql
20231317204837.some-other-name.sh.dn.sql
20231203493744.anything-works.public.up.sql
20231203493744.anything-works.public.dn.sql
```

Here,

- the 1st part is the UTC timestamp at which the migration version file was created;
- the 2nd part is a descriptive name of the migration (it can be arbitrary);
- the 3rd part is the PostgreSQL schema name prefix (i.e. the micro-shard name prefix);
- the 4th part is either "up" (an up-migration) or "dn" (a down-migration).
  Up-migrations roll the database schema version forward, and down-migrations
  allow you to undo those changes.

Creating the up- and down-migration SQL files is the user's responsibility: you
provide the DDL SQL queries that roll the database schema forward and the
queries that roll it backward.

You can use any `psql`-specific instructions in the `*.sql` files, since they
are fed to the `psql` tool directly: e.g. environment variables, `\echo`, `\ir`
for file inclusion, etc. See the [psql
documentation](https://www.postgresql.org/docs/current/app-psql.html) for
details.
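
For instance, a pair of migration files could be created like this (a
hypothetical sketch: the timestamp, file names, and DDL are placeholders, not
something the tool prescribes):

```
# Up-migration: rolls the schema forward.
cat > migrations/20240115123000.add-users-table.sh.up.sql <<'SQL'
\echo Creating table users...
CREATE TABLE users (
  id BIGSERIAL PRIMARY KEY,
  email text NOT NULL UNIQUE
);
SQL

# Down-migration: reverts the change.
cat > migrations/20240115123000.add-users-table.sh.dn.sql <<'SQL'
\echo Dropping table users...
DROP TABLE users;
SQL
```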

## Applying the Migrations

Each migration version will be applied (in order) to all PG schemas (aka
micro-shards) whose names start with the provided prefix, on all hosts. If
multiple migration files match some schema, only the file with the longest
prefix is used; in the example above, the prefix "sh" effectively works as an
"sh*, except sh0000" wildcard.

The main idea is that, if applying a migration file succeeds, that fact is
remembered on the corresponding PG host, in the corresponding schema
(micro-shard) itself. So the next time you run the tool, it knows that the
migration version has already been applied and won't try to apply it again.

While the tool runs, it prints a live-updating progress report showing which
migration version file is being applied on which PG host and in which schema
(micro-shard). At the end, it prints the final version map across all of the
hosts and schemas.
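
As a sketch, an explicit run against two masters could look like this (the host
names, credentials, and database are placeholders):

```
# Apply all pending migrations on both masters; --parallelism presumably
# bounds how much work the tool performs concurrently.
pg-mig \
  --migdir=./migrations \
  --hosts=pg-master-1,pg-master-2 \
  --port=5432 \
  --user=postgres \
  --pass=postgres \
  --db=mydb \
  --parallelism=8
```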

## Undoing the Migrations

If the `--undo` argument is used, the tool will try to run the down-migration
for the corresponding version everywhere. If it succeeds, that fact is
remembered on the corresponding PG host in the corresponding schema. Only the
most recently applied migration version can be undone.
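
For example, undoing the most recently applied version everywhere might look
like this (the version name is a placeholder for the latest entry in your
migration directory):

```
# Runs the corresponding *.dn.sql file on every host/schema where this
# version is recorded as applied, and remembers that it was undone.
pg-mig \
  --migdir=./migrations \
  --undo=20240115123000.add-users-table.sh
```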

Undoing migrations in production is not recommended (since the code that uses
the database may rely on its new structure), although you can do it, of course.
The main use case for undoing migrations is during development: you may want to
test your DDL statements multiple times, or you may pull from Git and receive
someone else's migration that sorts before yours, in which case you'll need to
undo your migration and recreate its files.

## Creating the New Migration Files

If the `--make` argument is used, **pg-mig** creates a new pair of empty files
in the migration directory. E.g. if you run:

```
pg-mig --migdir=my-dir --make=my-migration-name@sh
```

then it will create a pair of files that look like
`my-dir/20231203493744.my-migration-name.sh.up.sql` and
`my-dir/20231203493744.my-migration-name.sh.dn.sql`, which you can then edit.

New migration version files can only be appended at the end. If **pg-mig**
detects that you are trying to apply migrations that conflict with the
migration versions already remembered in the database, it will print an error
and refuse to continue. This is similar to Git's "fast-forward" mode.
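
As a sketch of the development round-trip described above (all file names are
placeholders): if you pull a teammate's migration that sorts before your
already-applied local one, you can undo yours, remove its files, and re-create
them so that they are appended at the end again:

```
# Undo the local migration that now breaks the "fast-forward" order.
pg-mig --migdir=./migrations --undo=20240110090000.my-local-change.sh

# Remove its files and re-create them with a fresh (newer) timestamp,
# then copy your DDL into the newly generated pair.
rm ./migrations/20240110090000.my-local-change.sh.up.sql \
   ./migrations/20240110090000.my-local-change.sh.dn.sql
pg-mig --migdir=./migrations --make=my-local-change@sh
```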
39 changes: 39 additions & 0 deletions SECURITY.md
@@ -0,0 +1,39 @@
# Security

Keeping our clients' data secure is an absolute top priority at ClickUp. Our goal is to provide a secure environment, while also being mindful of application performance and the overall user experience.

ClickUp believes effective disclosure of security vulnerabilities requires mutual trust, respect, transparency and common good between ClickUp and Security Researchers. Together, our vigilant expertise promotes the continued security and privacy of ClickUp customers, products, and services.

If you think you've found a security vulnerability in any ClickUp-owned repository, please let us know as outlined below.

ClickUp defines a security vulnerability as an unintended weakness or exposure that could be used to compromise the integrity, availability or confidentiality of our products and services.

## Our Commitment to Reporters

- **Trust**. We maintain trust and confidentiality in our professional exchanges with security researchers.
- **Respect**. We treat all researchers with respect and recognize your contribution to keeping our customers safe and secure.
- **Transparency**. We will work with you to validate and remediate reported vulnerabilities in accordance with our commitment to security and privacy.
- **Common Good**. We investigate and remediate issues in a manner consistent with protecting the safety and security of those potentially affected by a reported vulnerability.

## What We Ask of Reporters

- **Trust**. We request that you communicate about potential vulnerabilities in a responsible manner, providing sufficient time and information for our team to validate and address potential issues.
- **Respect**. We request that researchers make every effort to avoid privacy violations, degradation of user experience, disruption to production systems, and destruction of data during security testing.
- **Transparency**. We request that researchers provide the technical details and background necessary for our team to identify and validate reported issues, using the form below.
- **Common Good**. We request that researchers act for the common good, protecting user privacy and security by refraining from publicly disclosing unverified vulnerabilities until our team has had time to validate and address reported issues.

## Vulnerability Reporting

ClickUp recommends that you share the details of any suspected vulnerabilities across any asset owned, controlled, or operated by ClickUp (or that would reasonably impact the security of ClickUp and our users) using our vulnerability disclosure form at <http://clickup.com/bug-bounty>. The ClickUp Security team will acknowledge receipt of each valid vulnerability report, conduct a thorough investigation, and then take appropriate action for resolution.

## Safe Harbor

When conducting vulnerability research according to this policy, we consider this research to be:

- Authorized in accordance with the Computer Fraud and Abuse Act (CFAA) (and/or similar state laws), and we will not initiate or support legal action against you for accidental, good faith violations of this policy;
- Exempt from the Digital Millennium Copyright Act (DMCA), and we will not bring a claim against you for circumvention of technology controls;
- Exempt from restrictions in our Terms & Conditions that would interfere with conducting security research, and we waive those restrictions on a limited basis for work done under this policy; and
- Lawful, helpful to the overall security of the Internet, and conducted in good faith.

You are expected, as always, to comply with all applicable laws.

If at any time you have concerns or are uncertain whether your security research is consistent with this policy, please inquire via <[email protected]> before going any further.
1 change: 1 addition & 0 deletions docs/.nojekyll
@@ -0,0 +1 @@
TypeDoc added this file to prevent GitHub Pages from using Jekyll. You can turn off this behavior by setting the `githubPages` option to false.