Migrate your Flask SQLite DB to Postgres

SQLite is nice and lightweight for development, but it isn't necessarily desirable in production. A client-server database like PostgreSQL is generally advised for production environments where you expect many concurrent connections. Here's how to go about switching:

On your server, install PostgreSQL:

sudo apt update
sudo apt install postgresql postgresql-contrib

Change the default user's password and create a new superuser 'dbuser':

sudo -u postgres psql
ALTER USER postgres WITH PASSWORD 'postgrespasswordhere';
CREATE USER dbuser WITH PASSWORD 'dbuserpasswordhere';
ALTER USER dbuser WITH SUPERUSER;

Create the new database, myapp, and list all databases with \l to confirm it exists:

CREATE DATABASE myapp;
\l

Ensure your JSON config file has the correct SQLALCHEMY_DATABASE_URI for Postgres, like so:

{
"SQLALCHEMY_DATABASE_URI": "postgresql://dbuser:dbuserpasswordhere@localhost:5432/myapp"
}
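If you want to double-check the pieces of that URI before deploying, they can be parsed with the standard library. This is just a quick sanity-check sketch (check_database_uri is an illustrative name, not part of the project; SQLAlchemy does its own parsing when the app starts):

```python
from urllib.parse import urlsplit


def check_database_uri(uri):
    """Break a SQLALCHEMY_DATABASE_URI into its components so you
    can eyeball them before deploying."""
    parts = urlsplit(uri)
    return {
        "scheme": parts.scheme,              # should be "postgresql"
        "username": parts.username,          # "dbuser"
        "host": parts.hostname,              # "localhost"
        "port": parts.port,                  # 5432
        "database": parts.path.lstrip("/"),  # "myapp"
    }


print(check_database_uri("postgresql://dbuser:dbuserpasswordhere@localhost:5432/myapp"))
```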

And because we only want Postgres in production, config.py picks the database URI based on the DEBUG flag:

import json
import os

with open("/etc/yourproject-config.json") as config_file:
    config = json.load(config_file)


class Config:

    # Define the database
    PROJECT_ROOT = os.path.dirname(os.path.abspath(__file__))
    BASE_DIR = os.path.dirname(PROJECT_ROOT)

    DEBUG = config.get("DEBUG")

    if DEBUG:
        SQLALCHEMY_DATABASE_URI = "sqlite:///" + os.path.join(
            BASE_DIR, config.get("SQLALCHEMY_DATABASE_NAME")
        )
    else:
        SQLALCHEMY_DATABASE_URI = config.get("SQLALCHEMY_DATABASE_URI")

    DATABASE_CONNECT_OPTIONS = {}

    # Disabled: modification tracking adds overhead and is
    # rarely needed
    SQLALCHEMY_TRACK_MODIFICATIONS = False

    # Application threads. A common general assumption is
    # using 2 per available processor cores - to handle
    # incoming requests using one and performing background
    # operations using the other.
    THREADS_PER_PAGE = 2

    # Enable protection against *Cross-site Request Forgery (CSRF)*
    CSRF_ENABLED = True

    # Use a secure, unique and absolutely secret key for
    # signing the data.
    CSRF_SESSION_KEY = config.get("CSRF_SESSION_KEY")

    # Secret key for signing cookies
    SECRET_KEY = config.get("SECRET_KEY")

    # Mail, for reset password & contact form
    MAIL_SERVER = config.get("MAIL_SERVER")
    MAIL_PORT = 587
    MAIL_USE_TLS = True
    MAIL_USERNAME = config.get("MAIL_USERNAME")
    MAIL_PASSWORD = config.get("MAIL_PASSWORD")
    MAIL_RECIPIENT_USERNAME = config.get("MAIL_RECIPIENT_USERNAME")
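The DEBUG branch above boils down to a single decision: SQLite in development, Postgres in production. Here is that logic pulled out as a standalone function so both paths are easy to exercise (choose_database_uri is just an illustrative name, not part of the project):

```python
import os


def choose_database_uri(config, base_dir):
    """Mirror the Config logic: a local SQLite file when DEBUG is
    truthy, otherwise the Postgres URI from the config file."""
    if config.get("DEBUG"):
        return "sqlite:///" + os.path.join(
            base_dir, config.get("SQLALCHEMY_DATABASE_NAME")
        )
    return config.get("SQLALCHEMY_DATABASE_URI")


# Development config falls back to the local SQLite file:
dev = {"DEBUG": True, "SQLALCHEMY_DATABASE_NAME": "app.db"}
print(choose_database_uri(dev, "/home/me/yourproject"))

# Production config uses the Postgres URI directly:
prod = {
    "DEBUG": False,
    "SQLALCHEMY_DATABASE_URI": "postgresql://dbuser:dbuserpasswordhere@localhost:5432/myapp",
}
print(choose_database_uri(prod, "/home/me/yourproject"))
```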

Stop your server:

sudo supervisorctl stop yourproject

On your server, install psycopg2-binary into the project's virtual environment:

cd ~/yourproject
source venv/bin/activate
pip install psycopg2-binary

Now let's dump the old SQLite database to a .sql file:

cd
sqlite3 app.db .dump > dump.sql

To check that the dump is compatible with Postgres, you can simply attempt to load it and work through the errors:

psql -h localhost -d myapp -U dbuser -f ~/dump.sql

In my specific case, I had to change the id columns from INTEGER to SERIAL, change DATETIME to TIMESTAMP, and put quotes around references to the table "user", since user is a reserved word in Postgres.
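If you'd rather script those fixes than edit the dump by hand, something like the following works. It's a rough sketch: it assumes the primary-key column is named id and only covers the three cases I hit, so your schema may need more substitutions:

```python
import re


def fix_dump_line(line):
    """Apply the three SQLite-to-Postgres fixes to one line of dump.sql.
    Deliberately naive: assumes the primary key column is named "id"."""
    # 1. Autoincrementing primary keys: INTEGER -> SERIAL.
    line = re.sub(r'\bid INTEGER\b', 'id SERIAL', line)
    # 2. SQLite DATETIME columns map to Postgres TIMESTAMP.
    line = line.replace('DATETIME', 'TIMESTAMP')
    # 3. "user" is reserved in Postgres; quote unquoted references
    #    (the lookarounds skip names that are already quoted).
    line = re.sub(r'(?<!")\buser\b(?!")', '"user"', line)
    return line


print(fix_dump_line('id INTEGER NOT NULL,'))
print(fix_dump_line('date_posted DATETIME NOT NULL,'))
print(fix_dump_line('FOREIGN KEY(user_id) REFERENCES user (id)'))
```

Run each line of dump.sql through it and write the result to a new file before feeding that to psql.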

Restart the server:

sudo supervisorctl start yourproject

After you successfully import your data you're good to go. One caveat: the sequences behind your SERIAL columns still start at 1, so if you have, say, 10 existing records, the first 10 inserts will fail with duplicate-key errors before a new record can be saved. That's tolerable for a small data set at the start of a project, but if you are converting a large database I suggest looking for a way to set each id sequence to the next available value.
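One way to set each id sequence to the next available value, assuming your tables use Postgres's default sequence naming scheme of <table>_id_seq, is to generate a setval statement per table and run the output in psql. This is a sketch under that naming assumption, not something from the original migration:

```python
def setval_statement(table):
    """Build a setval statement that advances a table's id sequence
    past the highest imported id. Assumes the default Postgres
    sequence name <table>_id_seq; the table name is double-quoted
    so reserved words like "user" are handled."""
    return (
        "SELECT setval('{t}_id_seq', "
        'COALESCE((SELECT MAX(id) FROM "{t}"), 1));'
    ).format(t=table)


for table in ["user", "post"]:
    print(setval_statement(table))
```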