Quick SQLite to PostgreSQL Migration with pgloader


May 1, 2025 - 18:44
pgloader is an open-source data loading tool that efficiently streams data into PostgreSQL using the COPY protocol. It handles schema discovery, data transformations, and error management automatically. For SQLite migrations, it can handle both schema and data in a single command, making it perfect for automated database migrations.
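For repeatable migrations, pgloader can also read its options from a command file instead of the CLI. A minimal sketch (the database names, credentials, and file paths are placeholders; run it with `pgloader migrate.load`):

```
LOAD DATABASE
     FROM sqlite:///data/production.sqlite3
     INTO postgresql://user:password@localhost:5432/your_db
 WITH include drop, create tables, create indexes, reset sequences;
```

The WITH clause here asks pgloader to recreate tables and indexes and fix sequence values after loading; drop any of those options if the target schema already exists.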

1. Restore SQLite from Litestream

# Set your AWS credentials and restore the backup
export LITESTREAM_ACCESS_KEY_ID="your_access_key" \
       LITESTREAM_SECRET_ACCESS_KEY="your_secret_key" && \
litestream restore -o storage/production.sqlite3 \
  s3://your-bucket/storage/production.sqlite3
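Before migrating, it's worth a quick sanity check on the restored file. A minimal sketch using the sqlite3 CLI, shown against a scratch database so it runs end to end; in practice, point the PRAGMA at storage/production.sqlite3:

```shell
# Build a throwaway database just to demonstrate the check;
# replace the path with the restored file in real use.
sqlite3 /tmp/restore-check.sqlite3 "CREATE TABLE t(id INTEGER); INSERT INTO t VALUES (1);"
sqlite3 /tmp/restore-check.sqlite3 "PRAGMA integrity_check;"   # a healthy file prints: ok
```

Anything other than "ok" means the restore is not safe to migrate from.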

2. Migrate to PostgreSQL

For cloud PostgreSQL (e.g., Supabase):

docker run --rm -it \
  -v /path/to/your/sqlite:/data/production.sqlite3 \
  dimitri/pgloader:latest \
  pgloader --with "DATA ONLY" --verbose \
    sqlite:///data/production.sqlite3 \
    postgresql://user:password@your-postgres-host:6543/postgres

For local PostgreSQL:

docker run --rm -it \
  -v /path/to/your/sqlite:/data/production.sqlite3 \
  dimitri/pgloader:latest \
  pgloader --with "DATA ONLY" --verbose \
    sqlite:///data/production.sqlite3 \
    postgresql://postgres:postgres@host.docker.internal:5432/your_db

Key points:

  • Use --with "DATA ONLY" if tables are already created in PostgreSQL; omit it to let pgloader create the schema as well
  • Use host.docker.internal to connect to local PostgreSQL from Docker
  • Ensure your SQLite file path in the volume mount matches the path in the pgloader command
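On that last point: the path after sqlite:// in the pgloader URL is the container-side path from the -v mount, which is why the URL for a file mounted at /data/production.sqlite3 ends up with three slashes. A quick sketch:

```shell
# Container path chosen in the -v mount (host_path:container_path)
CONTAINER_DB="/data/production.sqlite3"
# pgloader source URL = "sqlite://" + absolute container path, hence "sqlite:///..."
echo "sqlite://$CONTAINER_DB"   # prints: sqlite:///data/production.sqlite3
```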

Important Notes

  1. Always back up your data before migration
  2. Test the migration process in a staging environment first
  3. Plan for downtime during the migration
  4. Update your application's database connection strings after migration
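If the application reads its connection string from DATABASE_URL (a common convention; your framework may use a config file instead), the final switch can look like this, with hypothetical example values:

```shell
# Before: the app pointed at the SQLite file, e.g.
# DATABASE_URL="sqlite3:storage/production.sqlite3"
# After: point it at the migrated PostgreSQL database
# (substitute your real host, port, database name, and credentials)
export DATABASE_URL="postgresql://user:password@your-postgres-host:5432/your_db"
echo "$DATABASE_URL"
```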