

Docker Swarm Tutorial - Deploying Spring Boot Microservices to AWS | JavaInUse


In a previous tutorial we had deployed services in a docker swarm using docker stacks. We were using Play With Docker to simulate multiple nodes in Docker Swarm. In this tutorial we will be starting multiple AWS EC2 instances and deploying the microservices on them using Docker Swarm.
AWS EC2 Docker Swarm Tutorial

Docker Tutorial - Table Of Contents

  • Deploying Spring Based WAR Application to Docker
  • Deploying Spring Based JAR Application to Docker
  • Deploying Multiple Spring Boot Microservices using Docker Networking
  • Deploying Multiple Spring Boot Microservices using Docker Compose
  • Deploying Spring Boot + MYSQL Application to Docker
  • Publishing Docker Image to DockerHub
  • Deploy Docker Swarm services using Play With Docker
  • Deploy Docker Swarm services using Docker Stack
  • Deploy Docker Swarm services to multiple AWS EC2 instances
  • Docker Cheat Sheet

Video

This tutorial is also explained in the accompanying Youtube video.

Let's Begin-

Starting multiple EC2 instances for deploying services using Docker Swarm

For this you will need to register with Amazon Web Services and create an AWS account. When registering you will need to provide credit card details. AWS is free for a period of one year, but with some usage limitations; if these are exceeded, AWS will charge you. In this tutorial we will be starting two AWS EC2 instances. Once we are done with this tutorial, do remember to stop/terminate the EC2 instances.
  • Once we have registered with AWS go to the services section and select EC2-
    AWS services
    We will see the EC2 Dashboard, which shows that zero instances are running.
    EC2 services
  • From the left side menu select Security Groups
    EC2 Security Group
  • Docker documents the ports that must be open between hosts for swarm mode to work. We need to open the following ports-
    Docker Container cloud port rules



Create a new security group named docker with the following inbound and outbound rules.
Docker Container inbound rules

Docker Container outbound rules
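The rules in question are Docker's documented swarm mode ports: TCP 2377 for cluster management, TCP and UDP 7946 for communication among nodes, and UDP 4789 for overlay network traffic. As a hedged sketch, the same inbound rules can also be generated for the AWS CLI; the loop below only prints the commands as a dry run (applying them assumes a configured AWS CLI and the security group named docker created above):

```shell
# Docker swarm mode ports, printed as AWS CLI commands (dry run).
# Pipe the output to `sh` to actually apply them, assuming the AWS CLI
# is configured and the security group is named "docker".
for rule in tcp:2377 tcp:7946 udp:7946 udp:4789; do
  proto=${rule%%:*}   # part before the colon, e.g. tcp
  port=${rule##*:}    # part after the colon, e.g. 2377
  echo aws ec2 authorize-security-group-ingress \
    --group-name docker --protocol "$proto" --port "$port" --cidr 0.0.0.0/0
done
```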
  • Next go again to EC2 home page and click on Launch Instance
    EC2 services
  • Select the Amazon Linux 2 AMI (HVM) machine image.
    Amazon Linux 2 AMI
  • Select the Instance Type as t2.micro, which is the default option. Then select the Configure Instance Details button
    Amazon Linux Instance Type
  • Keep the default Configure Instance Details as provided and select Add Storage Button
    Amazon Linux Configure Instance Details
  • Keep the default Storage setting and click Add Tags Button
    Amazon Linux Storage setting
  • In the Tags section add a new tag named ec1 and select Configure Security Group Button
    Amazon Linux Tags section
  • In Configure Security Group section select the existing security group named docker that we had created previously.
    Amazon Linux Configure Security Group
  • Finally launch the new instance. Create a new key pair named ec1 and download the key file ec1.pem.
    Amazon Linux Launch
  • Again follow all the steps mentioned above to create another EC2 instance. Only add the tag as ec2, and when launching the instance don't create a new key pair but select the existing key pair named ec1. So we have launched two EC2 instances.
    AWS EC2 instances
    Next we will connect to them using Putty. For this we will first need to convert the ec1.pem key to ec1.ppk format. This is done using PuttyGen as follows-
  • Open PuttyGen
    Open PuttyGen
  • Select the ec1.pem file from where you have stored it. Select Save private key and save the key as ec1.ppk.
    Select pem key in PuttyGen
  • Next we will be connecting both the EC2 instances using Putty.
    Open Putty
  • Open a Putty instance. In the AWS portal, when you select the EC2 instance there is a Connect button which gives us the details for connecting to that instance.
    EC2 instance details
  • In Putty enter the Host from above as ec2-user@ec2-18-216-91-80.us-east-2.compute.amazonaws.com and under SSH->Auth select the ec1.ppk key. Click on the Open button.
    EC2 instance select key
  • The EC2 instance is now connected using Putty-
    EC2 instance using putty
  • Similarly connect to the second EC2 instance.
    EC2 instance
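As a side note for readers not on Windows, here is a hedged sketch of the same connection using plain OpenSSH, which can use the downloaded .pem key directly with no .ppk conversion; the hostname is the one shown on the instance's Connect page.

```shell
# Connect with OpenSSH instead of Putty (Linux/macOS). The key file must
# not be world-readable or ssh will refuse to use it.
chmod 400 ec1.pem
ssh -i ec1.pem ec2-user@ec2-18-216-91-80.us-east-2.compute.amazonaws.com
```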
  • Starting services on AWS EC2 instances using Docker Swarm

    In both instances install Docker and then start the docker service, on both EC2 instances, as follows-
     sudo yum install docker -y
     sudo service docker start
     

    EC2 instance install docker service
    In one EC2 instance which will be the leader node start docker swarm as follows-
    sudo docker swarm init
     

    EC2 instance start docker service
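docker swarm init prints the exact join command for workers, including a token and the manager's IP and port. Below is an illustrative sketch (the token and address are made up) of what that output looks like and how to fish the join line back out of a saved copy; on a real leader node, sudo docker swarm join-token worker reprints the command at any time.

```shell
# Illustrative `docker swarm init` output; the real token and IP come
# from your own leader node.
init_output='Swarm initialized: current node (abc123) is now a manager.

To add a worker to this swarm, run the following command:

    docker swarm join --token SWMTKN-1-0000000000-111111 172.31.1.10:2377

To add a manager to this swarm, run "docker swarm join-token manager" and follow the instructions.'

# Pull out the one line the worker node needs to run.
printf '%s\n' "$init_output" | grep 'docker swarm join --token'
```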
    In the second EC2 instance, which will be the worker node, use the join command printed by docker swarm init (it includes the token as well as the manager's IP and port) as follows-
     sudo docker swarm join --token <Token> <ManagerIP>:2377
     

    EC2 instance init swarm
    We can list the nodes in the docker swarm as follows-
    sudo docker node ls
     

    EC2 instance list docker service
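The MANAGER STATUS column of docker node ls is what distinguishes the leader from the workers. A small illustrative sketch (hostnames and IDs below are made up) of picking the leader's hostname out of that output:

```shell
# Sample `docker node ls` output. The trailing "Leader" marks the swarm
# leader; the "*" marks the node the command was run on.
node_ls='ID                         HOSTNAME         STATUS  AVAILABILITY  MANAGER STATUS
abc123 *                   ip-172-31-1-10   Ready   Active        Leader
def456                     ip-172-31-1-11   Ready   Active'

# Print the hostname of the leader node ($3 because of the "*" field).
printf '%s\n' "$node_ls" | awk '$NF == "Leader" {print $3}'
```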
    Now as in previous tutorial we will be creating the Docker Stack file named docker-compose.yaml as follows-
     sudo vi docker-compose.yaml
     

    EC2 instance docker stack
    The content of the file will be as follows-
     version: "3"
    services:
      consumer:
        image: javainuse/employee-consumer
        networks:
          - consumer-producer
        depends_on:
          - producer
     
      producer:
        image: javainuse/employee-producer
        ports:
          - "8080:8080"
        networks:
          - consumer-producer 
    
    networks:
      consumer-producer:
     

    EC2 instance docker stack configuration
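One swarm-specific caveat worth knowing: docker stack deploy ignores depends_on, so service start order is not guaranteed. Swarm-only settings instead live under a deploy key. A hedged sketch of how the producer service from the stack file above could be extended (the replica count and placement constraint are illustrative, not from the tutorial):

```yaml
  producer:
    image: javainuse/employee-producer
    ports:
      - "8080:8080"
    networks:
      - consumer-producer
    deploy:
      replicas: 2                  # run two producer tasks
      placement:
        constraints:
          - node.role == worker    # schedule only on worker nodes
```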
    Next deploy the Docker Stack to multiple AWS EC2 instances using the above created stack file -
     sudo docker stack deploy -c docker-compose.yaml dockTest
     

    EC2 instance docker stack deploy
    We can list the running services in docker swarm as follows-
     sudo docker service ls
     

    EC2 instance docker services list
    Also, by listing the running containers on each node we can find out on which EC2 instances the employee consumer and employee producer services are running. Below we can see that the employee consumer is running on the Manager node while the employee producer service is running on the Worker node.
     sudo docker container ls
     

    EC2 instance docker container list
    Also if we check the employee consumer logs (here l3 is the leading characters of the consumer container's id shown by docker container ls), it can be seen that the REST service exposed by the employee producer is successfully consumed by the employee consumer.
     sudo docker container logs l3
     

    EC2 instance docker container logs

    EC2 instance docker service logs
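Once done experimenting, remember the warning at the start about charges: tear the swarm down and then stop or terminate both instances. A hedged cleanup sketch (the stack name dockTest matches the deploy command above; run each command on the node named in its comment):

```shell
sudo docker stack rm dockTest      # on the manager: remove the stack's services
sudo docker swarm leave            # on the worker: leave the swarm
sudo docker swarm leave --force    # on the manager: dismantle the swarm
```

After that, stop or terminate both EC2 instances from the AWS console.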