Goal
Show a simple setup of headless system testing inside a container with Ruby on Rails.
There are likely many ways to achieve this goal. However, my aim was to hook into an eventual CI system where the Rails application could be validated by building the docker image and running the test suite.
I'll break this down into 3 steps:
- Create docker image with CentOS and ruby 2.4
- Create docker images for Rails and headless chrome testing
- Configure Rails for headless chrome system testing
Create docker image with CentOS and ruby 2.4
To accomplish this part I created a simple project that provides a base image for the Rails application. I built this base image because I wanted to stick with CentOS and, at the time, couldn't find a CentOS image with ruby 2.4 on Docker Hub.
Dockerfile
FROM centos:centos7
# Install the appropriate software
RUN yum -y update && yum clean all
RUN yum -y install \
        epel-release \
        which \
    && \
    yum clean all \
    && \
    :
# Install rvm, default ruby version and bundler.
RUN gpg --keyserver hkp://keys.gnupg.net --recv-keys D39DC0E3
RUN /bin/bash -l -c "curl -L get.rvm.io | bash -s stable"
RUN /bin/bash -l -c "echo 'source /etc/profile.d/rvm.sh' >> /etc/profile"
RUN /bin/bash -l -c "rvm install 2.4.2"
RUN /bin/bash -l -c "rvm cleanup all"
RUN /bin/bash -l -c "echo 'gem: --no-ri --no-rdoc' > ~/.gemrc"
RUN /bin/bash -l -c "gem install bundler --no-ri --no-rdoc"
I then created a repository on Docker Hub and set up an automated build against the GitHub repo. This rebuilds the image automatically whenever changes are pushed to GitHub.
Create docker images for Rails and headless chrome testing
Now that the base image for ruby is built and available on Docker Hub, I can use it as the base in my example Rails application.
Here starts the tricky part:
I wanted to keep the headless chrome pieces out of the Rails application image that would go to production. There are a few ways this can be accomplished:
- Set up a container side by side with my application and integrate using prebuilt Selenium containers. I initially tried this, with the help of some Docker pros, but the test suite was just too slow (non-headless setup/impatient developer/didn't try too hard to diagnose).
- Set up a container with chrome installed alongside the Rails application. I ended up choosing this route, while accepting that the application image and the test image would not be exactly the same.
Rails Application image setup
In a build script I perform the steps that create the needed Dockerfile at build time and then execute the image build. This keeps the application-specific build steps common between this image and the test image. Note that the Dockerfile should not be checked into the repo; it is in this example application for demonstration purposes only.
Build script (rails image function)
build_rails() {
  cat Dockerfile.prefix.in \
      sdlc/Dockerfile.in \
      > Dockerfile

  time docker-compose build \
    rails
}
As seen above, I concatenate the ruby base piece (Dockerfile.prefix.in) with the common Rails part (sdlc/Dockerfile.in) into the final Dockerfile, which acts as the input for the docker-compose build. In a real-world setup the Dockerfile 'in' pieces may be a bit larger, and may include a postfix with variables defined to tag the image.
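To illustrate how a postfix piece could slot into this pattern, here is a small sketch. The demo/ paths, the stand-in file contents, and Dockerfile.postfix.in are all hypothetical; only the concatenate-the-pieces pattern comes from the build script above.

```shell
#!/bin/bash
# Sketch of the Dockerfile concatenation step, extended with a hypothetical
# postfix piece. The file contents here are stand-ins for the real 'in' files.
set -euo pipefail
mkdir -p demo/sdlc

printf 'FROM hammer098/ruby_24\n'                                  > demo/Dockerfile.prefix.in
printf '# common rails build steps\nWORKDIR /web\nCOPY . /web\n'   > demo/sdlc/Dockerfile.in
printf 'LABEL build_tag=dev\n'                                     > demo/Dockerfile.postfix.in

# Same pattern as build_rails(): concatenate the pieces into the final Dockerfile
cat demo/Dockerfile.prefix.in \
    demo/sdlc/Dockerfile.in \
    demo/Dockerfile.postfix.in \
    > demo/Dockerfile

head -n 1 demo/Dockerfile   # prints: FROM hammer098/ruby_24
```

The appeal of the approach is that each piece stays small and single-purpose, and the final Dockerfile is always regenerated rather than hand-edited.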
Final Dockerfile
FROM hammer098/ruby_24
# common rails build steps
WORKDIR /web
# Copy gem manifests and install the bundle
COPY .ruby-version /web/.ruby-version
COPY Gemfile /web/Gemfile
COPY Gemfile.lock /web/Gemfile.lock
RUN /bin/bash -l -c "bundle install"
COPY . /web
Chrome + Rails Application image setup
In the build script, similar steps craft the Dockerfile for the test image at build time.
build_rails_test() {
  cat sdlc/Dockerfile.rails-test.in \
      sdlc/Dockerfile.in \
      > sdlc/Dockerfile.rails-test

  time docker-compose build \
    rails-test
}
This function performs similar steps to the Rails application build. The steps unique to the test image are included in sdlc/Dockerfile.rails-test.in; the common Rails part is then appended, producing sdlc/Dockerfile.rails-test, which is used by the rails-test image. Do not add sdlc/Dockerfile.rails-test to the repo, as it is created on the fly by this process.
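Since both Dockerfile and sdlc/Dockerfile.rails-test are generated on the fly, it can help to keep them out of version control automatically. This is a hypothetical helper of my own, not part of the example repo:

```shell
#!/bin/bash
# Guard against committing the generated Dockerfiles: append each one to
# .gitignore only if it is not already listed, so reruns stay idempotent.
set -euo pipefail
mkdir -p sdlc

for f in Dockerfile sdlc/Dockerfile.rails-test; do
  # grep flags: -q quiet, -x whole-line match, -F fixed string (no regex)
  grep -qxF "$f" .gitignore 2>/dev/null || echo "$f" >> .gitignore
done
```

A check like this could run at the top of the build script, so a fresh clone never ends up with the generated files staged by accident.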
Final Dockerfile.rails-test
FROM hammer098/ruby_24
RUN yum -y install \
        chromedriver \
        chromium \
        gnu-free-sans-fonts \
        xorg-x11-server-Xvfb \
    && \
    yum clean all \
    && \
    :
# common rails build steps
WORKDIR /web
# Copy gem manifests and install the bundle
COPY .ruby-version /web/.ruby-version
COPY Gemfile /web/Gemfile
COPY Gemfile.lock /web/Gemfile.lock
RUN /bin/bash -l -c "bundle install"
COPY . /web
Configure Rails for headless chrome system testing
For this part, I needed to configure 2 pieces:
Rails test configuration
In this part we need to set up the headless chrome driver and enable specific chrome options.
application_system_test_case.rb
require 'test_helper'
require 'selenium-webdriver'
require 'capybara'

class ApplicationSystemTestCase < ActionDispatch::SystemTestCase
  Capybara.register_driver :headless_chrome do |app|
    # Options explained at https://peter.sh/experiments/chromium-command-line-switches/
    #   no-sandbox   - the user namespace is not enabled in the container by default
    #   headless     - run without actually launching a GUI
    #   disable-gpu  - disables graphics processing unit (GPU) hardware acceleration
    #   window-size  - sets a default window size in case the smaller default is not
    #                  enough; we do not want max either, so this is a good compromise
    capabilities = Selenium::WebDriver::Remote::Capabilities.chrome(
      chromeOptions: { args: %w[no-sandbox headless disable-gpu window-size=1400,1400] }
    )

    Capybara::Selenium::Driver.new(
      app,
      browser: :chrome,
      desired_capabilities: capabilities
    )
  end

  driven_by :headless_chrome
end
The corresponding example tests that use this configuration can be found here. An explanation of why I chose to use Rails system tests over rspec can be found here.
Xvfb install and Docker entrypoint setup for test execution
In the setup at work we needed Xvfb installed for the tests to function properly; however, we were also using an older Chrome version (v59). In this example setup I wasn't able to prove that Xvfb was required, either because of the newer Chrome version (v62) or the less complex testing setup.
Setup:
- Install the Xvfb rpm (xorg-x11-server-Xvfb), as seen above in the test docker image.
- Modify the entrypoint directive in docker-compose.yml to include xvfb-run and rails test execution steps.
docker-compose.yml rails-test-run definition
rails-test-run:
  image: rails-test
  entrypoint:
    - bash
    - --login
    - xvfb-run
    - rails
    - test:system
    - RAILS_ENV=test
In the above, running docker-compose run --rm rails-test-run tells docker to start a login shell (bash --login), run the rails system tests under xvfb-run, and then exit. The RAILS_ENV variable is set to test so that Rails doesn't default to the development db setup.
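The sdlc/test script itself isn't shown here, so the following is only a sketch of what such a wrapper might look like. The DC override is my own addition so the sketch can be dry-run without Docker installed; a real script would call docker-compose directly.

```shell
#!/bin/bash
# Hypothetical sketch of an sdlc/test wrapper. DC lets a caller substitute
# the docker-compose binary (e.g. DC=echo for a dry run without Docker).
set -euo pipefail

run_system_tests() {
  local dc="${DC:-docker-compose}"
  echo "Starting rails-test, running and exiting"
  # --rm removes the container after the run; the test suite's exit code
  # propagates through docker-compose run, so CI can gate on it.
  "$dc" run --rm rails-test-run
}

# Dry run: substitute echo for docker-compose to see the command that would run
DC=echo run_system_tests
```

Because docker-compose run returns the container command's exit status, a CI job can simply run this script and fail the build on a non-zero status.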
Here is an example output of a test run utilizing a test script:
docker-rails$ sdlc/test
Starting rails-test, running and exiting
Run options: --seed 22514
# Running:
Puma starting in single mode...
* Version 3.10.0 (ruby 2.4.2-p198), codename: Russell's Teapot
* Min threads: 0, max threads: 1
* Environment: test
* Listening on tcp://0.0.0.0:35683
Use Ctrl-C to stop
...
Finished in 8.587443s, 0.3493 runs/s, 0.4658 assertions/s.
3 runs, 4 assertions, 0 failures, 0 errors, 0 skips
Conclusion
I found this setup to be the one that worked toward our eventual goal of hooking into a CI system.
The setup, as shown here, is far from complete for a Rails application destined for production. Here are some of the things that I believe are out of scope and were left out of this article and the example application:
- Nginx/Unicorn install and setup
- This container runs as root; in our production setup, we run as a non-root user.
- Setting container to read-only and adding VOLUME directives in the Dockerfile for certain areas Rails needs for writing. Changing of permissions/ownership to those files and directories at build time.
- Mysql containers for test/development.
- Environment variable breakout.
- Handling of compiled assets and the js runtime environment
...and many more.
Top comments (3)
Hi @dougstull, thank you for this informative tutorial.
In some of the examples I have seen, testing is normally run outside the app artifact container, either in a separate container (as in the first approach mentioned) or directly in a Jenkins agent. It is beneficial because it results in a minimalist artifact image.
Thoughts?
Hi Jack - thanks!
If I understand correctly you are referencing the ability to run the testing suite outside the app container...
With the rails system tests, the testing suite is integrated in the app git repo and is hitting the backend. If I stick to that approach to perform the system tests, then the only option I can see is to have the system tests run from inside the container for the app.
However, if we were using some testing suite that was perhaps testing a frontend framework instead, I could definitely see that as being the way to go as it perhaps wouldn't be so tightly coupled to the backend code/framework.
For this example I believe the benefits of utilizing the rails system testing suite outweigh the negatives. I assume that would hold true for me up and until a split of the frontend and backend technologies were to happen.
Hi Doug,
Yes, I am referring to the ability to run integration tests outside the app container. I believe that with capybara and the headless chrome selenium driver, you can specify an external URL to run the integration tests against. This enables a CI agent to run the tests against the docker container.
Docker images, at least the production ones, are supposed to contain the bare minimum of what's needed to get the job done. I think that should probably exclude all the development and test gems, and I would also exclude the test directory.
If the purpose of the image is for test, then it is fine to run tests inside the container itself.