How to deploy a pet project?

A week ago, I started working on a new pet project. Since I moved from software engineering into product management 7 years ago, I have not worked on any serious project, but I have kept myself up to date with the technology, especially the infrastructure side of it. As a result, once I decided to build my project with Django, the immediate next question was how to deploy it in an automated, modern way.

TL;DR

  1. Set up a Dockerfile (see the sketch below)
  2. Test it locally using docker compose
  3. Start a DigitalOcean App, and grab its configuration
  4. Understand the config, and tweak it if needed using doctl
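For step 1, here is a minimal sketch of such a Dockerfile, assuming a standard Django project served by gunicorn; the "myproject" module name and the port are placeholders, not values from my actual repo:

    # Minimal Django Dockerfile sketch; "myproject" and the port are placeholders.
    FROM python:3.10-slim

    WORKDIR /app

    # Install dependencies first so this layer is cached between builds.
    COPY requirements.txt .
    RUN pip install --no-cache-dir -r requirements.txt

    COPY . .

    EXPOSE 8000
    CMD ["gunicorn", "--bind", "0.0.0.0:8000", "myproject.wsgi:application"]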

The Story

Where to deploy

There are many alternatives. First, instead of using any backend framework, I could have gone serverless. I am pretty familiar with both AWS and GCP, but I decided that a proper framework would help me a lot to move fast. So, I chose Django. Once I have a backend, I need some level of infrastructure management around it. What are the options?

  • AWS ECS with Fargate
  • AWS ECS with EC2
  • GKE Autopilot
  • Setting up a Nomad cluster
  • DigitalOcean App

The majority of these require containers, and actually, under the hood, all of them use containers. So, which one to choose? And how to deploy?

I have two preferences:

  1. simplicity
  2. affordability

While an Autopilot cluster is way more expensive than any other option on the list, I decided that it's still affordable if it works well.

Deployment method

Iteration 1: GitLab features

My primary SCM/issue tracker/DevOps tool is GitLab, so I use it for this project too. GitLab has two interesting initiatives that I thought I would give a try:

  • Auto DevOps
  • the 5 Minute Production App

Both failed me. As my team is responsible for Auto DevOps (I am the product manager), I knew that Auto DevOps wouldn't be a good fit for me: it requires a full-featured cluster, it does not support Autopilot, and the GitLab Kubernetes Agent does not support Auto DevOps, while I would prefer not to use any other way to connect my cluster.

The 5-minute app seemed compelling at first, and I quickly managed to start my first pipeline using the Django template. Unfortunately, the deployment job failed every time. I reached out to its developers and learned that the Django template is a work in progress.

Iteration 2: GKE Autopilot

I tried to set up a GKE Autopilot cluster using Terraform and the GitLab Kubernetes Agent for deployments. As a properly working setup would need tools like cert-manager, and Autopilot does not support third-party webhooks, I had to turn this approach down. Of course, I could still go with a full-blown GKE cluster, but I don't want to waste my time managing it.

Iteration 3: ECS with Fargate

I really hate the hundreds of atomic AWS services! I know that setting up even a simple webapp would require a huge number of resources, and I don't know the AWS ecosystem well enough to do it on my own. Thankfully, I learned that docker compose has built-in AWS support!

From its description, it seems like a really nice integration. So I thought to give it a try!

I quickly found a docker compose tutorial on creating a Django+Postgres stack and started from there. This led me to a nicely working Dockerfile, but I could not figure out how to create a Postgres instance on AWS without opening their console and getting lost in it. So, I decided to move on.
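For reference, the local stack I ended up with looked roughly like the sketch below; it follows the shape of that kind of tutorial, and the service names, credentials, and module name are placeholders:

    # Sketch of a Django + Postgres docker compose stack; names and
    # credentials are illustrative placeholders.
    services:
      db:
        image: postgres:14
        environment:
          POSTGRES_DB: app
          POSTGRES_USER: app
          POSTGRES_PASSWORD: change-me
        volumes:
          - pgdata:/var/lib/postgresql/data
      web:
        build: .
        command: gunicorn --bind 0.0.0.0:8000 myproject.wsgi:application
        ports:
          - "8000:8000"
        environment:
          # assumes the app reads DATABASE_URL (e.g. via dj-database-url)
          DATABASE_URL: postgres://app:change-me@db:5432/app
        depends_on:
          - db
    volumes:
      pgdata: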

Iteration 4: DigitalOcean App

I have a bit of experience with DigitalOcean Apps, which made me confident enough to give it a try, and it was worth the time! I am a huge fan of simple continuous deployment tools like Vercel or Netlify, and DigitalOcean managed to create a similarly easy-to-use platform for any type of service. While I can create the service using their UI, I don't have to look for the right service and won't get lost, as they have a simple wizard that drives me through the setup process in 4 steps.

Finally, the dedicated “How To Deploy a Django App on App Platform” tutorial was of great help in setting up all the environment variables in a neat way.
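To give a flavor, the environment section of an App Platform spec can look like the sketch below. The variable names follow that tutorial but are assumptions about your settings module; the ${...} values are App Platform bindable variables, filled in by the platform at deploy time:

    # Sketch of the env section of an App Platform spec for Django.
    # Keys are assumptions based on the tutorial; adjust to your settings.
    envs:
      - key: DJANGO_ALLOWED_HOSTS
        value: ${APP_DOMAIN}
      - key: DATABASE_URL
        value: ${db.DATABASE_URL}
      - key: DEBUG
        value: "False"
      - key: DJANGO_SECRET_KEY
        value: change-me
        type: SECRET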

Once my app is deployed, I can download its configuration to store it alongside my app code, edit it, and modify the running configuration from the CLI. This is a kind of infrastructure as code, and it was really useful for finishing the setup.
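With doctl, that round trip looks roughly like this (the app ID is a placeholder you get from the first command):

    # Download the running app's spec, edit it, then apply the changes.
    doctl apps list
    doctl apps spec get <app-id> > app.yaml
    # ... edit app.yaml ...
    doctl apps update <app-id> --spec app.yaml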

Final setup

Continuous Deployment setup

The DigitalOcean App has 3 components:

  • Django webapp
  • Static files
  • Postgres connection

Creating the webapp and the static files at the same time was not trivial. As my repo already contained a Dockerfile, the DigitalOcean App Platform UI forced me to use that Dockerfile for the static app too. Here, I could use the downloaded configuration to specify separate Dockerfiles for the webapp and for the static file generation.
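The relevant part of the spec ends up looking something like this sketch; the component names, file paths, and repo are placeholders, not my actual configuration:

    # Sketch: one service and one static site built from separate Dockerfiles.
    name: my-pet-project
    services:
      - name: web
        dockerfile_path: Dockerfile
        http_port: 8000
        gitlab:
          repo: my-group/my-pet-project
          branch: main
    static_sites:
      - name: static
        dockerfile_path: Dockerfile.static
        output_dir: /app/staticfiles
        gitlab:
          repo: my-group/my-pet-project
          branch: main
    databases:
      - name: db
        engine: PG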

In an ideal world, I would prefer to build everything on the GitLab side, and have only the deployment step reach out to my final platform. Unfortunately, this is not the case with this setup. Today, I run the tests in CI, then call out to DigitalOcean to start a new deployment; the containers are built inside DigitalOcean and stored in a hidden registry that I cannot access. Instead, I would prefer to build the webapp container with GitLab, store the resulting container within GitLab, update the static files without building a separate container for them, just pushing them to a storage server, and finally tell the platform what container to deploy.

Even though this ideal workflow is just a dream, I could use GitLab’s environments to have a bit more insight into the deployments. Moreover, the current setup scales relatively well, as one could easily set up even review environments.
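The “call out to DigitalOcean” part of the pipeline boils down to a job like this sketch; the DO_APP_ID and DIGITALOCEAN_ACCESS_TOKEN variable names are my own, defined in the project’s CI/CD settings:

    # Sketch of a GitLab CI job that triggers an App Platform deployment.
    # doctl authenticates via the DIGITALOCEAN_ACCESS_TOKEN environment variable.
    deploy:
      stage: deploy
      image:
        name: digitalocean/doctl:latest
        entrypoint: [""]
      script:
        - /app/doctl apps create-deployment "$DO_APP_ID"
      environment:
        name: production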

Local environment / Continuous Delivery

As the Docker configurations were ready, and I had already built a working docker-compose.yaml in “Iteration 3” above, it was really simple to modify the compose settings and use the production webapp Dockerfile locally too.
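The change amounts to pointing the local web service at the production Dockerfile, roughly like this sketch (the file names are my assumptions):

    # Sketch: reuse the production Dockerfile for local development.
    services:
      web:
        build:
          context: .
          dockerfile: Dockerfile   # the same file App Platform builds from
        ports:
          - "8000:8000"
        env_file: .env.local       # hypothetical file with local settings
        depends_on:
          - db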