
Export Jenkins jobs from a server or locally, to a Docker container

By Jorge Díaz on Friday, September 27, 2019

Brief.

In some cases, when you are working on a new project for a company that already has many other projects running in production, the best approach is to avoid breaking the systems that are already running, and not to spend hours upgrading libraries or packages on the production server just to satisfy the new project's requirements.

There are several possible solutions. One option is to set up a dedicated server with the operating system and the libraries the new development needs. Another is to create a virtual machine or a dedicated Docker container inside the production server in order to isolate the software and the configuration needed to run our new application.

Introduction.

“A container is a standard unit of software that packages up code and all its dependencies so the application runs quickly and reliably from one computing environment to another. A Docker container image is a lightweight, standalone, executable package of software that includes everything needed to run an application: code, runtime, system tools, system libraries, and settings.” (1)

What Docker does is isolate the packages, software, dependencies, and libraries of each application by packing them into containers. When we run a container, it actually executes a particular image that contains only the software it was built for.

I will explain a problem I had and how I solved it.

The problem and its solution.

The project I was working on consisted of generating a new build of the software, installing it, and then performing a series of UI tests with an automated UI testing application.

The first issue I faced was the compatibility between the Ubuntu server that was in production and the requirements of the UI test application. The second was that, in order to run the UI tests, I needed a graphical environment on the production server, which it did not have. The third was that some Ubuntu libraries used in production had to be upgraded, and we know that is a big problem when we have applications in production.

The solution was to use the Jenkins LTS (Long Term Support) Docker container and install all the packages, configurations, and scripts I needed to build the project and run the UI tests via a Dockerfile.


The implementation.

Background.

First of all, I was running Jenkins on my local machine (Ubuntu 18.04 LTS). At that time I had two jobs: one for the build and another to run the UI tests once the build finished. Along with Jenkins, I had installed two plugins that helped me simulate a graphical environment. I also had a folder with all the test files needed to execute the UI tests.
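For context, a local Jenkins installation stores each job as a folder under JENKINS_HOME containing a config.xml file; those are the files we will later copy into the container. The exact path depends on how Jenkins was installed, so the ones below are only typical examples:

ls /var/lib/jenkins/jobs/<job_name>/config.xml
ls ~/.jenkins/jobs/<job_name>/config.xml

The first location is common when Jenkins was installed from the Ubuntu/Debian package; the second when Jenkins is started directly from a user account.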

Steps to install Docker CE.

Add the official Docker GPG key.

curl -fsSL https://download.docker.com/linux/ubuntu/gpg | sudo apt-key add -

 Verify the GPG key was properly installed.

sudo apt-key fingerprint 0EBFCD88

 Add Docker’s repository (stable version).

sudo add-apt-repository "deb [arch=amd64] https://download.docker.com/linux/ubuntu $(lsb_release -cs) stable"

 Update the package index.

sudo apt-get update

 Install Docker (CE version).

sudo apt-get install docker-ce

If you have any trouble installing the Docker CE version, these guides can help you: https://docs.docker.com/install/linux/docker-ce/ubuntu/ and https://docs.docker.com/v17.09/engine/installation/linux/docker-ce/ubuntu/#uninstall-old-versions

Verify Docker was installed.
 sudo docker run hello-world

 Add users who can access Docker.
sudo adduser <username> docker

 Steps to run Jenkins container.
Install Jenkins with Docker. 
sudo docker pull jenkins/jenkins:lts

This will download the latest LTS Jenkins image from Docker Hub.

Verify the Jenkins image was added correctly.
sudo docker images

The following steps are optional in case you want to use a clean installation of Jenkins.
Test the downloaded Docker image.
sudo docker run -p 49001:8080 -t jenkins/jenkins:lts

Verify if the Jenkins container is running.
sudo docker ps -a

Access container.
sudo docker exec -it <container_id> /bin/bash

 Obtain the IP address where Jenkins is running.
sudo docker inspect -f '{{range .NetworkSettings.Networks}}{{.IPAddress}}{{end}}' <container_id>

 Access Jenkins via web.
Go to <container_ip>:8080 (in this example, 172.17.0.2:8080). Since we published the container's port 8080 on port 49001 of the host, localhost:49001 also works.
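Note that a clean Jenkins installation asks for an initial administrator password before showing the setup wizard. If you need it, you can read it from the running container:

sudo docker exec <container_id> cat /var/jenkins_home/secrets/initialAdminPassword

It is also printed in the container logs (sudo docker logs <container_id>).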

At this point, we are now running a clean installation of Jenkins. What we want is to migrate the configurations from our local Jenkins to the Docker container we just created. Also, we want to install some specific packages and libraries inside the container. This is like cloning our local Jenkins environment with all the configurations and libraries that Jenkins needs to run all the jobs we have working on our local machine.

 Configure, install, and run a custom Jenkins container.

In order to accomplish the configuration, installation, and migration, we need to create a file called Dockerfile. In this file, we tell Docker which instructions to run so that our Jenkins container ends up the way we want it.

Inside a file called “Dockerfile” we will write the instructions to customize our new Jenkins container. Here is an example:
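Here is a minimal sketch of such a Dockerfile, reconstructed from the explanation that follows; the destination path for ant.xml (Jenkins keeps Ant tool definitions in hudson.tasks.Ant.xml under JENKINS_HOME) and the exact layout are assumptions, and the line numbers mentioned below refer to this sketch:

FROM jenkins/jenkins:lts
USER root

# Update the package index and install Ant
RUN apt-get update && \
    apt-get install -y ant

# Name of the job we want to import
ARG job_name="test"

# Create the job folder under JENKINS_HOME (already set by the base image)
RUN mkdir -p "$JENKINS_HOME"/jobs/"$job_name"

# Install the Ant plugin, version 1.8
RUN /usr/local/bin/install-plugins.sh ant:1.8

# Copy the Ant tool configuration (default JENKINS_HOME is /var/jenkins_home)
COPY ant.xml /var/jenkins_home/hudson.tasks.Ant.xml

# Copy the job definition exported from the local Jenkins
COPY config.xml /var/jenkins_home/jobs/test/config.xml

Because USER root is never switched back, Jenkins in this container runs as root, so the files copied in as root are readable by the server; in a more polished setup you would switch back to the jenkins user and chown the copied files.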

Explanation

On the first line, we need to specify the image that we will use as our starting point. In this case, we will use the jenkins/jenkins:lts image that we just downloaded.

In order to run and execute certain commands, we need to specify the user that will perform such actions. In this example, we will use the root user on the second line.

Lines 5 and 6 are about the packages and libraries we want to install. In this case, we first update the package index and then install Ant.

We can declare arguments as variables, such as “job_name” on line 9. In this case, we use “test” as the name of the job we want to import into the new container.

As we are using a Jenkins container, JENKINS_HOME is already set, so we can use it to create a folder for our “test” job, as described on line 12.

In some cases, we need to take advantage of the numerous plugins that Jenkins has. On line 15, I set the instruction to install the Ant plugin, version 1.8. Again, as we are using a Jenkins container, we will find the install-plugins.sh script under /usr/local/bin/ by default, and it will help us install all the plugins we want.

In this case, I had an Ant configuration file from my local Jenkins, in which I had specified the name and version of the Ant installation. This is handled on line 18. (See the ant.xml example.)

Finally, on line 21, we copy the Jenkins job from our local machine into the new Jenkins container. (See the config.xml example.)

ant.xml example:
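The contents depend on your local configuration; a minimal sketch, assuming the standard hudson.tasks.Ant.xml format and an Ant installation named “Ant 1.8” located under /usr/share/ant, could look like this:

<?xml version='1.0' encoding='UTF-8'?>
<hudson.tasks.Ant_-DescriptorImpl>
  <installations>
    <hudson.tasks.Ant_-AntInstallation>
      <name>Ant 1.8</name>
      <home>/usr/share/ant</home>
      <properties/>
    </hudson.tasks.Ant_-AntInstallation>
  </installations>
</hudson.tasks.Ant_-DescriptorImpl>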

config.xml example:
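Again, the contents depend on your local job; a minimal sketch of a freestyle job that runs an Ant target could look like the following, where the description and target are placeholders and antName must match the installation defined in ant.xml:

<?xml version='1.0' encoding='UTF-8'?>
<project>
  <description>UI test job exported from the local Jenkins</description>
  <keepDependencies>false</keepDependencies>
  <properties/>
  <scm class="hudson.scm.NullSCM"/>
  <canRoam>true</canRoam>
  <disabled>false</disabled>
  <triggers/>
  <concurrentBuild>false</concurrentBuild>
  <builders>
    <hudson.tasks.Ant>
      <targets>run-ui-tests</targets>
      <antName>Ant 1.8</antName>
    </hudson.tasks.Ant>
  </builders>
  <publishers/>
  <buildWrappers/>
</project>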

 Go inside the folder where you created the Dockerfile and run the following command to generate a custom Jenkins image (including the trailing dot).

sudo docker build -t jenkins:test .

 Verify if the custom Jenkins image has been created.
sudo docker images

 Run a Docker container using the Jenkins custom image.

sudo docker run -p 49001:8080 -t jenkins:test

 Access Jenkins via the web.
Go to localhost:49001 and you will see the job that we exported.

Verify that the Ant plugin is configured.
Go to “Manage Jenkins” -> “Global Tool Configuration” and click on “Ant installations…”

Then go back to the Dashboard and click on “Manage Jenkins” -> “Manage Plugins” -> “Installed” and you will see that Ant is installed.

 Finally, we can run the job.
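You can do this with the “Build Now” button in the UI. If you prefer the command line, the job can also be triggered through Jenkins' REST API; the user, API token, and job name below are placeholders for your own values:

curl -X POST http://localhost:49001/job/test/build --user <user>:<api_token>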

Conclusion.

Docker containers are a very powerful tool for cases where we want to keep systems isolated from each other. We can have the Dockerfile configured exactly as we want, then go anywhere, run the Docker image, and voilà! we have our system running and configured.

What we have seen here is just a small example of what we can accomplish with Docker containers. There are more advanced features we can implement, like building our own images, running Jenkins inside an Ubuntu container, and many other more complex scenarios.


References.

(1) What is a Container? | Docker. (2019). Retrieved September 25, 2019, from https://www.docker.com/resources/what-container

(2) Top Tutorials To Learn Docker To Run Distributed Applications. (2019, March 15). Retrieved September 25, 2019, from https://medium.com/quick-code/top-tutorials-to-learn-docker-to-run-distributed-applications-bce896e260ec
