I am developing a FastAPI project using SQLAlchemy and Alembic, and I'm running everything in Docker containers with Docker Compose. Now I'm worried about something. I would like to take advantage of the `alembic revision --autogenerate` command, but it obviously requires a connection to the database, since it compares the models against the current schema.
I came up with the following solution, which I'm unhappy with:
1. I spin up just the database container: `docker-compose up -d database`
2. Then I put a connection string in a `.env` file inside the particular FastAPI service I'm working on.
3. I use `load_dotenv()` in the `env.py` file to set the `DATABASE_URL` environment variable, which is used to connect to the database (see the sketch below).
4. Now I can run `alembic revision --autogenerate`.
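For reference, the `env.py` wiring is roughly this. It's just a sketch of the URL part (the rest of the generated `env.py` is omitted), and it assumes `python-dotenv` is installed and the `.env` file contains a `DATABASE_URL` entry:

```python
# env.py (inside the alembic/ directory) -- URL wiring only
import os

from dotenv import load_dotenv  # python-dotenv
from alembic import context

# Pull DATABASE_URL from the hand-written .env file (step 2 above).
load_dotenv()

# Override sqlalchemy.url from alembic.ini so autogenerate can reach
# the database container started with docker-compose.
config = context.config
config.set_main_option("sqlalchemy.url", os.environ["DATABASE_URL"])
```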
It works, but I'm unhappy with it for several reasons. It doesn't feel very professional; there is a lot of manual work, and some things simply don't feel natural.
For example, the connection string I'm manually putting in the `.env` file is just stupid: it's something I need only while coding, and it has nothing to do with running the app. Second, the actual connection string used to run the app is already provided through the environment files in Docker Compose, so it doesn't make sense to duplicate it in an additional `.env` file.
What is the most professional way of dealing with this? How do people deal with generating migrations with Alembic when using Docker Compose?
Comments:

- You could set the `DATABASE_URL` environment variable directly, without involving a file, but otherwise what you describe seems like a reasonable enough setup.
- I run `alembic` from within my application's Docker container with `exec`; the dev database is already running in another Docker container on the same Docker network.
- Are you running your application (FastAPI) INSIDE a Docker container, or actually just locally outside of Docker? How will you run your migrations when you deploy?
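Putting the two suggestions together, here is a sketch of the file-free variant: `env.py` reads `DATABASE_URL` straight from the process environment. The fallback to the `alembic.ini` value is my own assumption, not something stated in the comments:

```python
# env.py -- URL wiring only, no .env file involved.
# The app container already receives DATABASE_URL from Docker Compose,
# so the same code works in dev and at deploy time.
import os

from alembic import context

config = context.config

database_url = os.environ.get("DATABASE_URL")
if database_url:
    # Use the URL from the environment when present...
    config.set_main_option("sqlalchemy.url", database_url)
# ...otherwise fall back to whatever alembic.ini configures.
```

With that in place, the second commenter's workflow becomes something like `docker-compose exec app alembic revision --autogenerate -m "describe change"` (assuming the FastAPI service is named `app`): the command runs inside the app container, which already has the right `DATABASE_URL` and sits on the same Docker network as the database.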