Server-Side Load Balancing Example

About

In server-side load balancing, a central load balancer distributes incoming traffic across multiple backend servers to ensure that no single server is overwhelmed with requests.

NGINX

One of the most widely used tools for server-side load balancing is NGINX. Unlike client-side load balancing (where the client makes decisions about which server to use), server-side load balancing occurs on a dedicated machine that sits between the client and the backend services.

What is NGINX?

NGINX is a powerful open-source web server that also functions as a reverse proxy, load balancer, and HTTP cache. It is commonly used to manage incoming HTTP, TCP, and UDP traffic and distribute it efficiently across multiple backend servers.

How Does Server-Side Load Balancing Work with NGINX?

When a client sends a request to a system using server-side load balancing, the request is first received by the load balancer (NGINX in this case). NGINX then selects a backend server (from a pool of servers) based on a predefined load-balancing algorithm and forwards the request to that server. Once the backend server processes the request, it sends the response back to NGINX, which in turn forwards it to the client.

NGINX Load Balancing Architecture

  • Client: Sends requests to the load balancer (NGINX).

  • NGINX (Load Balancer): Distributes incoming requests to one of the backend servers based on a load-balancing algorithm.

  • Backend Servers: A pool of servers (e.g., API servers, web servers) that handle the actual processing of requests.

Example

Let's create a sample Maven service project, say sample-project, with a client-facing API endpoint.

Add the following dependencies to the pom.xml file
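A minimal sketch of the relevant pom.xml sections is shown below. The Spring Boot parent version is an assumption; only spring-boot-starter-web is strictly required for the REST endpoint.

    <parent>
        <groupId>org.springframework.boot</groupId>
        <artifactId>spring-boot-starter-parent</artifactId>
        <version>3.2.0</version> <!-- assumed version -->
    </parent>

    <dependencies>
        <dependency>
            <groupId>org.springframework.boot</groupId>
            <artifactId>spring-boot-starter-web</artifactId>
        </dependency>
    </dependencies>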

We also need to add the spring-boot-maven-plugin to be able to generate an executable jar file.
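The plugin goes in the build section; with the Spring Boot parent above, no explicit version is required.

    <build>
        <plugins>
            <plugin>
                <groupId>org.springframework.boot</groupId>
                <artifactId>spring-boot-maven-plugin</artifactId>
            </plugin>
        </plugins>
    </build>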

Create a sample controller class with one endpoint
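A sketch of such a controller is below. The package, class name, and response text are assumptions; the /api/hello-world path matches the URL used in the final step.

    package com.example.sampleproject;

    import org.springframework.beans.factory.annotation.Value;
    import org.springframework.web.bind.annotation.GetMapping;
    import org.springframework.web.bind.annotation.RequestMapping;
    import org.springframework.web.bind.annotation.RestController;

    @RestController
    @RequestMapping("/api")
    public class HelloWorldController {

        // Injected from the SERVER_INSTANCE environment variable (set per container
        // in docker-compose) so each instance can identify itself in the response.
        @Value("${SERVER_INSTANCE:unknown}")
        private String serverInstance;

        @GetMapping("/hello-world")
        public String helloWorld() {
            return "Hello World from instance: " + serverInstance;
        }
    }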

Notice the use of the SERVER_INSTANCE variable. We will set it in the docker-compose file as an environment variable, since we will create multiple instances (more than one) of this service to test the load-balancing feature.

Create a main application file
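A minimal Spring Boot entry point is enough here; the class name is an assumption.

    package com.example.sampleproject;

    import org.springframework.boot.SpringApplication;
    import org.springframework.boot.autoconfigure.SpringBootApplication;

    @SpringBootApplication
    public class SampleProjectApplication {

        public static void main(String[] args) {
            SpringApplication.run(SampleProjectApplication.class, args);
        }
    }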

application.yaml file
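A minimal application.yaml could look like this; port 8080 is an assumption and must match the upstream ports in the NGINX config later.

    server:
      port: 8080

    spring:
      application:
        name: sample-project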

Build the project to generate the jar file (to be used later in the docker-compose file)
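From the project root, the standard Maven build produces the jar under target/:

    mvn clean package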

We will use the generated jar file directly in docker-compose instead of creating a Docker image for the above service

Now, we need to set up NGINX.

Create an nginx.conf config file
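A sketch of the config is below. The upstream server names (sample-service-1, sample-service-2) are assumptions that must match the service names in docker-compose; NGINX uses round-robin by default, so no algorithm needs to be named explicitly.

    events {}

    http {
        # Pool of backend servers; round-robin is the default algorithm.
        upstream backend {
            server sample-service-1:8080;
            server sample-service-2:8080;
        }

        server {
            listen 80;

            # Forward every incoming request to one of the backend servers.
            location / {
                proxy_pass http://backend;
            }
        }
    }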

Create a Dockerfile for NGINX that uses the above config file to build an NGINX image

Dockerfile
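A minimal version might be:

    FROM nginx:latest

    # Overwrite the default NGINX config with the load-balancing config above.
    COPY nginx.conf /etc/nginx/nginx.conf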

Note that the config file is kept in the same directory as the Dockerfile, so a relative path is used in the COPY instruction

Now, let us build the Docker image using the above Dockerfile
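Run the build from the directory containing the Dockerfile and nginx.conf; the image tag nginx-load-balancer is an assumption and must match the name referenced in docker-compose.

    docker build -t nginx-load-balancer .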

Now we have the services ready. Let's create the docker-compose.yml file

We can keep this docker-compose file in the root of sample-project, since we need to provide a relative path to the jar file
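A sketch is below. The jar file name, the eclipse-temurin base image, and the instance labels are assumptions; the host port 90 matches the URL used in the final step, and the NGINX image name matches the one built above.

    services:
      sample-service-1:
        image: eclipse-temurin:17-jre
        volumes:
          # Jar name is an assumption; adjust to your artifactId/version.
          - ./target/sample-project-0.0.1-SNAPSHOT.jar:/app/app.jar
        environment:
          - SERVER_INSTANCE=instance-1
        command: ["java", "-jar", "/app/app.jar"]

      sample-service-2:
        image: eclipse-temurin:17-jre
        volumes:
          - ./target/sample-project-0.0.1-SNAPSHOT.jar:/app/app.jar
        environment:
          - SERVER_INSTANCE=instance-2
        command: ["java", "-jar", "/app/app.jar"]

      nginx:
        image: nginx-load-balancer
        ports:
          - "90:80"
        depends_on:
          - sample-service-1
          - sample-service-2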

Run the docker-compose file
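From the directory containing docker-compose.yml:

    docker compose up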

Call the API http://localhost:90/api/hello-world multiple times to see the load-balancing effect
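For example, a quick shell loop (the iteration count is arbitrary):

    for i in {1..6}; do
        curl http://localhost:90/api/hello-world
        echo
    done

With NGINX's default round-robin algorithm, the responses should alternate between the two instances, each reporting its own SERVER_INSTANCE value.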
