Configure your project with Drizzle for Local & Deployed Databases

Updated April 8, 2024: Thanks to Jassi Bacha for pointing out that the migrate function is not exported from the drizzle-orm/postgres-js/migrator module for Vercel. I've updated the article to reflect the correct import path.

It was a fun Friday, and Jason Lengstorf and I both decided to try and use Drizzle on our respective projects. Jason went the SQLite route and wrote an amazing article about how he got his setup working. My approach was a bit different. I started with Vercel's Postgres + Drizzle Next.js Starter and wanted to use PostgreSQL. If you don't know what Drizzle is, it's a type-safe ORM similar to Prisma. My colleague, Dane Grant, wrote a great intro post on it, so go check his article out if you want to learn more about Drizzle.

Getting my project off the ground took longer than I expected, especially coming from a starter kit, but I figured it out. This is the article I wish I'd had at the time to get this project set up with less friction. I'll focus on using local and Vercel PostgreSQL, but the same setup should work with other databases and adapters, and I'll call out the places where it differs. While I did use Next.js here, these setup instructions work for other projects, too.

Configuring Drizzle

Every project that leverages Drizzle requires a drizzle.config file in the root. Because I'm using TypeScript, I named mine drizzle.config.ts, and to keep secrets out of the codebase, I also installed dotenv. My final file looks like this:

import type { Config } from "drizzle-kit";
import * as dotenv from "dotenv";

dotenv.config();

export default {
  schema: "./db/schema.ts",
  out: "./db/migrations",
  driver: "pg",
  dbCredentials: {
    connectionString: process.env.POSTGRES_URL as string,
  },
} satisfies Config;

The schema field identifies where your project's database schema is defined. Mine lives in a single file called schema.ts, but you can split your schema across multiple files and use glob patterns to detect all of them.
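
The contents of my schema aren't important for this setup, but for reference, a minimal schema.ts might look something like this (the users table here is purely illustrative):

import { pgTable, serial, text, timestamp } from "drizzle-orm/pg-core";

// An illustrative table; a real schema can define many tables,
// indexes, and relations across one or more files.
export const users = pgTable("users", {
  id: serial("id").primaryKey(),
  name: text("name").notNull(),
  createdAt: timestamp("created_at").defaultNow(),
});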

The out field determines where your migration outputs will be stored. I recommend putting them in a folder in the same directory as your schema to keep all your database-related information together.
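
With the out value above, the db folder ends up looking roughly like this once the first migration has been generated (exact file names will vary):

db/
  schema.ts
  migrations/
    0000_<generated-name>.sql
    meta/
      _journal.json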

Additionally, the config requires a driver and dbCredentials.connectionString to be specified so Drizzle knows what APIs to leverage and where your database lives. For the connectionString, I'm using dotenv to store the value in a secret and protect it. The connectionString should be in a valid connection format for your database. For PostgreSQL, this format is postgresql://<username>:<password>@<hostname>:<port>/<database_name>.

Getting your connection string

Now, you may be wondering how to get that connection string. If you're hosting on Vercel using their Postgres offering, you need to go to your Vercel dashboard, select their Postgres option, and attach it to your app. This sets environment variables for you that you can pull into your local development environment. This is all covered in their "Getting Started with Vercel Postgres" guide. If you use a different database hosting solution, they'll offer similar instructions for fetching your connection string.

However, I wanted a local database I could modify and blow away as needed for this project. Out of the box, Drizzle does not offer a database initialization command, and I needed something that could be easily and quickly replicated across different machines. For this, I pulled in Docker Compose and set up my docker-compose.yaml as follows:

services:
  postgres:
    image: "postgres:15.2-alpine"
    environment:
      POSTGRES_USER: postgres
      POSTGRES_PASSWORD: postgres
      POSTGRES_DB: my-local-db
    ports:
      - "5432:5432"

The most important values to note here are the ones under the environment key and the ports mapping, because they determine the connection string. For this example, it would be postgresql://postgres:postgres@localhost:5432/my-local-db, which is exactly what the POSTGRES_URL entry in .env should be set to. With the compose file in place, I ran docker-compose up -d to get the container running, which also initializes the database. Now, we can connect and operate on the database as needed.
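
To double-check that the container is reachable, a quick sanity check with the postgres client might look like this (check-db.mts is a throwaway file name of my own, not part of the final setup):

import postgres from "postgres";

// Connect using the connection string derived from the compose file
const sql = postgres("postgresql://postgres:postgres@localhost:5432/my-local-db");

// If this prints a PostgreSQL version string, the database is up and reachable
const [row] = await sql`select version()`;
console.log(row.version);

await sql.end();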

Creating the database connection

To run operations against the database from our app, we need a database connection instance. I put mine in db/drizzle.ts to keep all my database-related files together. My file looks like:

import { sql } from "@vercel/postgres";
import postgres from "postgres";
import {
  drizzle as VercelDrizzle,
  type VercelPgDatabase,
} from "drizzle-orm/vercel-postgres";
import {
  drizzle as LocalDrizzle,
  type PostgresJsDatabase,
} from "drizzle-orm/postgres-js";

let db:
  | VercelPgDatabase<Record<string, never>>
  | PostgresJsDatabase<Record<string, never>>;
if (process.env.NODE_ENV === "production") {
  db = VercelDrizzle(sql);
} else {
  // This client serves regular app queries (migrations get their own client later)
  const client = postgres(process.env.POSTGRES_URL as string);
  db = LocalDrizzle(client);
}

export { db };

This is a bit more complicated because we're using two different Drizzle adapters depending on the environment: the generic postgres-js adapter for local development and the Vercel adapter for production. While these have different initializers, they produce the same output interface, which is why this works. The same wouldn't be true if you used MySQL locally and PostgreSQL in production. If we chose RDS or a similar hosted PostgreSQL solution, we could use the same postgres adapter in both cases and only change the connection string. That's all this file does in the end: detect the environment and initialize the matching adapter.
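
For reference, that single-adapter alternative might look like this (a hypothetical sketch, assuming plain PostgreSQL in both environments):

import postgres from "postgres";
import { drizzle } from "drizzle-orm/postgres-js";

// Hypothetical: the same postgres-js adapter everywhere; only the
// connection string differs between local and deployed environments.
const client = postgres(process.env.POSTGRES_URL as string);
export const db = drizzle(client);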

If we try to use the db instance exported from drizzle.ts at this point, it won't be able to find our tables or provide type safety. This is because we haven't created our database tables yet.

Creating database tables

To get our database tables created, we're going to leverage Drizzle's migrations, which let us make atomic changes to our database as our schema evolves. To accomplish this, we define the schema changes in our schema files as specified in our config. Then we can run npx drizzle-kit generate:pg (or your script runner's equivalent) to generate the migration SQL file, which lands wherever we specified in our config's out field. You want to check this file into source control! Drizzle doesn't let you override migration names yet (they're working on it!), so if you want to make your migration file more descriptive, you need to take both of these steps:

  1. Rename the migration file. Take note of the old name.
  2. Locate _journal.json. It should be in your migration folder in a folder called meta. From here, find the old file name and replace it with the new file name.

Now, we need to run the migrations. I had some issues with top-level await and tsx like the Drizzle docs recommend, so I went a slightly different route, and I'm still not thrilled about it. I made a file called migrate.mts that I stored next to my drizzle.ts. In theory, I should have been able to import my Drizzle connection instance here and reuse it, but I ran out of time to figure that out and ended up repeating myself across files. Here's the file:

import "dotenv/config";
import { migrate as LocalMigrate } from "drizzle-orm/postgres-js/migrator";
import { migrate as VercelMigrate } from "drizzle-orm/vercel-postgres/migrator";
import postgres from "postgres";
import { sql } from "@vercel/postgres";
import { drizzle as VercelDrizzle } from "drizzle-orm/vercel-postgres";
import { drizzle as LocalDrizzle } from "drizzle-orm/postgres-js";

// Could not import from drizzle.ts due to mts v ts compatibility issues
let db;
if (process.env.NODE_ENV === "production") {
  db = VercelDrizzle(sql);
  await VercelMigrate(db, { migrationsFolder: "./db/migrations" });
} else {
  const migrationClient = postgres(process.env.POSTGRES_URL as string, {
    max: 1,
  });
  db = LocalDrizzle(migrationClient);
  await LocalMigrate(db, { migrationsFolder: "./db/migrations" });
  await migrationClient.end();
}

Here, I'm connecting to the correct database depending on the environment and then running Drizzle's migrate command. For local development, I cap the connection pool at 1. That probably isn't necessary for this use case, but when connecting to a cluster, it's a recommended best practice from the Drizzle team. For the local case, I also had to close the database connection when I was done. In both cases, though, I had to specify the migrations folder location. I could probably DRY this up a bit, but hopefully the Drizzle team will eliminate this need and read the value from the config in the future.

With the above file set and our schema generated, we can now run npx tsx db/migrate.mts and our database will have our latest schema. We can now use the db client to fetch and store data in our database.
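
As a quick illustration, reusing the hypothetical users table from earlier, a query now looks like this (paths assume the db folder layout above):

import { db } from "./db/drizzle";
import { users } from "./db/schema";

// Insert a row, then read the table back; both calls are fully typed
// against the schema, so typos in column names fail at compile time.
await db.insert(users).values({ name: "Ada Lovelace" });
const allUsers = await db.select().from(users);
console.log(allUsers);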

Note: Jason uses the push command here. That's fine for an initial database creation, but it can destructively overwrite tables as your schema evolves. The migration path is the recommended pattern for non-destructive database updates.

Conclusion

Congratulations! We can connect to our database and perform CRUD operations against our tables. We can also use Drizzle Studio to modify and inspect our data. To review, we had to:

  1. Set up a local PostgreSQL server via a tool like Docker Compose
  2. Configure the Drizzle database adapter for each environment
  3. Define a schema and generate migrations from it
  4. Create a script to execute migrations so our database stays aligned with our schema

This was my first experience with Drizzle, and I enjoyed its SQL-like interface, which made it easy to prototype my project quickly. I noticed in their Discord that they're about to have a full-time maintainer, so I'm excited to see what their future looks like. I hope you enjoy it too!
