Shi Han

Looking into PostgreSQL DB's tables of your Development Environment

You just joined an established project and have been given access to the source code. Sometimes it is easier to understand the structure of such a project by looking at the database tables and the relations between them. Recently I found myself in this situation with a project that can be built locally with Docker Compose. This is a note about what I did to get a glimpse of the project's tables.

Setup

To allow readers to follow along for a more hands-on experience, we are going to use the Dockerize Rails app created by Georg Ledermann as the "established project." The project already has a docker-compose.yml that connects a PostgreSQL database to a Ruby on Rails application.

GitHub: ledermann/docker-rails

Dockerize Rails 6 with ActionCable, Webpacker, Stimulus, Elasticsearch, Sidekiq

Docker-Rails

Simple Rails 6 application to demonstrate using Docker for production deployment. The application is a very simple kind of CMS (content management system) for managing posts. Besides the boring CRUD functionality, it has some non-default features.

This project aims to build a lean Docker image for use in production. Therefore it's based on the official Alpine Ruby image, uses multi-stage building and some optimizations that I described in my blog. This results in an image size of ~80MB.

Features

  • Auto refresh via ActionCable: If a displayed post gets changed by another user/instance, it refreshes automatically using the publish/subscribe pattern
  • Full text search via Elasticsearch and the Searchkick gem to find post content (with suggestions)
  • Autocompletion with autocompleter
  • Editing HTML content with the WYSIWYG JavaScript editor Trix
  • Uploading images directly to S3 with the Shrine gem and jQuery-File-Upload
  • Background jobs with ActiveJob and the Sidekiq gem (to…

Let's set up the project locally by following the Check it out! section of the repository:

$ git clone https://github.com/ledermann/docker-rails.git
$ cd docker-rails
$ docker-compose build
$ docker-compose run app yarn install
$ docker-compose up -d

At this point, you should have a few services running, which you can check with the docker-compose ps command:

$ docker-compose ps
            Name                          Command               State            Ports
-----------------------------------------------------------------------------------------------
docker-rails_app_1             docker/startup.sh                Up      0.0.0.0:32774->3000/tcp
docker-rails_db_1              docker-entrypoint.sh postgres    Up      5432/tcp
docker-rails_elasticsearch_1   /usr/local/bin/docker-entr ...   Up      9200/tcp, 9300/tcp
docker-rails_redis_1           docker-entrypoint.sh redis ...   Up      6379/tcp
docker-rails_worker_1          bundle exec sidekiq              Up      3000/tcp

If you want to check out the web application running on your host, visit http://localhost:32774. Note that the value 32774 comes from the Ports column of docker-rails_app_1 in the docker-compose ps output: the container's port 3000 is published to the host's port 32774.
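If you would rather not eyeball the Ports column, docker-compose can also print the published port for a service directly (the port number on your machine will most likely differ):

$ docker-compose port app 3000
0.0.0.0:32774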

The demo Docker-Rails app

Accessing the Database

Here we will show how to access the development database using two different tools:

  1. pgcli,
  2. SchemaSpy, for a more visual approach.

Before doing so, we need to know the credentials for the database, which can be found in docker-compose.yml. The database we are interested in is docker-rails_development. Let's export these values as environment variables:

$ export DB_USER=postgres
$ export DB_PASSWORD=foobar123
$ export DB_NAME=docker-rails_development
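If you would rather confirm these values against the running container than trust the compose file, you can inspect the db service's environment. This assumes the project configures the database through the official postgres image's POSTGRES_* variables; anything not set there falls back to the image's defaults (for example, the postgres user).

$ docker-compose exec db env | grep '^POSTGRES'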

pgcli

Install pgcli by following the instructions in the official documentation. Based on the information below, we want to connect to the database (docker-rails_db_1) on port 5432:

$ docker-compose ps db
      Name                     Command              State    Ports
--------------------------------------------------------------------
docker-rails_db_1   docker-entrypoint.sh postgres   Up      5432/tcp

However, the port is not published to the host. Therefore, the following will fail:

$ pgcli -h localhost -p 5432 -U "${DB_USER}" -d "${DB_NAME}"
could not connect to server: Connection refused
        Is the server running on host "localhost" (::1) and accepting
        TCP/IP connections on port 5432?
could not connect to server: Connection refused
        Is the server running on host "localhost" (127.0.0.1) and accepting
        TCP/IP connections on port 5432?

In this situation, we can use the container's IP address to reach the database without publishing the port to the host machine. The IP address can be obtained via:

$ docker inspect docker-rails_db_1 | grep IPAddress
            "SecondaryIPAddresses": null,
            "IPAddress": "",
                    "IPAddress": "172.24.0.4",

Using the IP address above, we can access the database via

$ pgcli -h 172.24.0.4 -p 5432 -U "${DB_USER}" -d "${DB_NAME}"
Server: PostgreSQL 11.4
Version: 2.1.1
Chat: https://gitter.im/dbcli/pgcli
Mail: https://groups.google.com/forum/#!forum/pgcli
Home: http://pgcli.com
docker-rails_development>
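From here, the usual psql-style meta-commands work in pgcli: \dt lists the tables in the current database, and \d shows the columns and indexes of a single table. (The posts table name below is my assumption, based on Rails naming conventions for the demo CMS.)

docker-rails_development> \dt
docker-rails_development> \d posts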

SchemaSpy

For SchemaSpy, we will try a different approach: using SchemaSpy's Docker container to connect to the existing database and generate the database diagrams for us. For this, we need to know the name of the network created by docker-compose for this project:

$ docker network list | grep docker-rails
27a3a7b42168        docker-rails_default   bridge              local
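If you want just the name without the extra columns, docker network ls can filter and format the output for you:

$ docker network ls --filter name=docker-rails --format '{{.Name}}'
docker-rails_default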

Now that we know the network name, we can use the --net option in the docker run command to connect the SchemaSpy container to the existing project network. The following will generate the database diagrams in the db_out/ directory.

$ export DB_HOST=docker-rails_db_1
$ export DB_NET=docker-rails_default
$ mkdir -p db_out
$ docker run --rm --net "${DB_NET}" \
  -v `pwd`/db_out:/output schemaspy/schemaspy:6.0.0 \
  -t pgsql \
  -db "${DB_NAME}" \
  -host "${DB_HOST}" \
  -port 5432 \
  -u "${DB_USER}" \
  -p "${DB_PASSWORD}"

Open db_out/index.html in a browser to check out the generated diagrams.
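On macOS and most Linux desktops, you can open the report straight from the terminal:

$ open db_out/index.html      # macOS
$ xdg-open db_out/index.html  # Linux with xdg-utils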

Using SchemaSpy

Cleanup

That's all for this post. The tricks mentioned above might seem limited to PostgreSQL, but we believe that with a little modification they can be applied to other types of databases too. Since this is a Docker Compose project, cleanup is as simple as:

$ docker-compose stop
$ docker-compose rm
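If you also want to remove the containers, the project network, and the named volumes (which wipes the database data), docker-compose down does it in one step:

$ docker-compose down -v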

Found a typo?

Thank you for reading! Found a typo? See something that could be improved or anything else that should be updated in this blog post? Thanks to this project, you can easily create a pull request on https://github.com/shihanng/dev.to to propose a fix instead of posting a comment.
