Automated Database Migrations

We need a database project. Since we have separate development and production environments for our front-end and back-end code, we also need separate databases so that one can be used for testing.

Since this project could have multiple developers, multiple branches, and multiple databases, with scripts that need to move between various versions, we need some kind of migration plan in effect to make it easy for everyone. I've had experience with Ruby on Rails Active Record: it versioned the migrations and could roll back an update. Npm has a package called mysql-migrations that provides this same capability.

Basically you have a migrations folder with many files that are prefixed with timestamps. You can migrate a database from one version to another. You can upgrade and downgrade along the way based on the timestamps and a table in the database that remembers which scripts were executed. The package also lets you store a snapshot of the database as one script so that you don’t need to run all of the migration scripts to start off with a new database.
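As an illustration of that layout (the file names and timestamp prefixes here are made up), a migrations folder might look like this:

```
migrations/
  1695000001_create_function_uuid4.js
  1695000002_create_users_table.js
```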

I first set up a `.env-template` file and added `.env` to `.gitignore`:

```
PERIPLUX_DB_HOST=localhost
PERIPLUX_DB_USER=YOUR_USERNAME
PERIPLUX_DB_PASSWORD=YOUR_PASSWORD
PERIPLUX_DB_NAME=local_periplux
```

Then I set up the migration script in the scripts folder, reading the connection settings from environment variables:

```javascript
const { createPool } = require('mysql');
const { init } = require('mysql-migrations');
const path = require('path');

// Connection pool configured from environment variables
const connection = createPool({
  connectionLimit: 10,
  host: process.env.PERIPLUX_DB_HOST,
  user: process.env.PERIPLUX_DB_USER,
  password: process.env.PERIPLUX_DB_PASSWORD,
  database: process.env.PERIPLUX_DB_NAME
});

// Run migrations from the migrations folder at the project root
init(connection, path.join(__dirname, '../migrations'), () => {
  console.log('Done.');
});
```

I added a few scripts to package.json that run the migration script with the env file (Node's `--env-file` flag requires Node 20.6 or later):

```json
  "scripts": {
    "update-schema": "node --env-file=.env ./scripts/migration.js up --update-schema",
    "load-from-schema": "node --env-file=.env ./scripts/migration.js load-from-schema",
    "migrate-all": "node --env-file=.env ./scripts/migration.js up --migrate-all",
    "up": "node --env-file=.env ./scripts/migration.js up",
    "down": "node --env-file=.env ./scripts/migration.js down",
    "refresh": "node --env-file=.env ./scripts/migration.js refresh",
    "migration": "node --env-file=.env ./scripts/migration.js add migration",
    "seed": "node --env-file=.env ./scripts/migration.js add seed",
    "run": "node --env-file=.env ./scripts/migration.js add run"
  },
```

I documented what they do in readme.md:

- `update-schema` updates `schema.sql` with latest snapshot of tables
- `load-from-schema` creates a new database from `schema.sql` without migrations
- `migrate-all` migrates older scripts that have not previously run
- `up` upgrades all pending migrations to the latest version
- `up <count>` applies the next `<count>` `up` migrations from the current version
- `down` runs one `down` migration
- `run <file> <up|down>` runs a specific migration's `up` or `down` step
- `migration <description>` creates a new migration script that alters the structure of the database
- `seed <description>` creates a new migration script that alters data within the database

The migration scripts contain SQL as JavaScript strings. To get syntax highlighting, I installed the vscode-sql-template-literal extension and created a generic template literal tag function:

```javascript
// Tag function that simply interpolates the values back into the string.
// Its only purpose is to let the extension highlight the SQL.
const sql = (strings, ...values) => strings.map((str, index) =>
  str + (values.length > index ? String(values[index]) : '')
).join('');

module.exports = sql;
```
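For illustration, here is how the tag behaves. Note that the interpolation is plain string concatenation, so it is only safe for static migration scripts, never for user input:

```javascript
// Same tag function as above, repeated here so the example is self-contained.
const sql = (strings, ...values) => strings.map((str, index) =>
  str + (values.length > index ? String(values[index]) : '')
).join('');

// The `table` value is spliced directly into the resulting string.
const table = 'users';
const query = sql`SELECT id, name FROM ${table} WHERE active = 1;`;
console.log(query); // SELECT id, name FROM users WHERE active = 1;
```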

And then I created a migration with `npm run migration create_function_uuid4`. It created a file that exports an object with `up` and `down` keys. I changed the string values to add and remove my uuid4 function.

*SQL syntax highlighting of the string template via the extension*

Once we run `npm run up`, our local MariaDB database gets the new function, along with a table that records each executed script's timestamp.

*MySQL Workbench with the migrated script*

Now it's time to set up our GitHub project's Actions workflow to migrate the development database. First, we don't know which IP addresses GitHub will use to access the database, and it's not just one block of IP addresses. Until we can track this down, we need to open our database to any remote IP address.

*Allow any IP to access the database*

Next, we set up our YAML file. We can expose our secrets as environment variables. Since we are using the `--env-file` flag with node, we only need to make sure the file exists; it can be empty.

```yaml
name: Development

on:
  push:
    branches:
      - develop

env:
  PERIPLUX_DB_HOST: ${{ secrets.DEV_DB_HOST }}
  PERIPLUX_DB_USER: ${{ secrets.DEV_DB_USER }}
  PERIPLUX_DB_PASSWORD: ${{ secrets.DEV_DB_PASSWORD }}
  PERIPLUX_DB_NAME: ${{ secrets.DEV_DB_NAME }}

jobs:
  build:
    name: Migrate
    runs-on: ubuntu-latest
    timeout-minutes: 2 # Limited server resources

    steps:
      - name: Get latest code
        uses: actions/checkout@v4

      - name: Setup environment
        run: touch .env

      - name: Setup Node
        uses: actions/setup-node@v4
        with:
          node-version-file: '.nvmrc'

      - name: Install Dependencies
        run: npm ci

      - name: Run pending migrations
        run: npm run up
```

Now we run the workflow and see that our remote database has been migrated!

*Automated database deployments*
