Load Balancer

Introduction

The Load Balancer offered by SeFlow allows you to distribute incoming traffic across multiple servers (Cloud VPS, Cloud Servers, or Dedicated Servers), improving availability, performance, and service continuity. By routing traffic through a load balancer connected to a redundant pool of back-end servers, you prevent overloading a single machine and ensure greater resilience for your infrastructure.

Tip: Use the load balancer for critical services (websites, APIs, business applications) that must remain online at all times, distributing traffic across at least two identical servers.

What is a Load Balancer

A SeFlow load balancer is a network service with its own public IP address and one or more rules determining how traffic should be routed to back-end servers. From the user’s perspective, the load balancer is the only access point, while behind it may lie an entire server cluster.

A load balancer allows you to:

  • distribute incoming traffic across multiple servers, preventing overloads;
  • increase reliability through redundancy of back-end servers;
  • reduce response times by routing clients to the least busy server;
  • simplify maintenance by allowing you to remove servers from the pool without service interruption.
Tip: All back-end servers behind a load balancer must run the same service and version to ensure consistent behavior for end users.

Rules Associated with a Load Balancer

Each load balancer is composed of one or more rules. A rule defines how incoming traffic should be routed and includes several key parameters:

  • Front-end IP and port of the load balancer;
  • Back-end IPs and ports of the target servers;
  • Protocol (HTTP, HTTPS, TCP);
  • Load balancing algorithm (e.g., LeastConn, Source);
  • Optional session persistence settings.
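The parameters above can be sketched as a simple data structure. This is an illustrative model only: the field names and validation are assumptions for clarity, not SeFlow's actual API.

```python
from dataclasses import dataclass

# Values taken from the rule parameters described above.
ALLOWED_PROTOCOLS = {"HTTP", "HTTPS", "TCP"}
ALLOWED_ALGORITHMS = {"LeastConn", "Source"}

@dataclass
class LoadBalancerRule:
    """Hypothetical representation of a single load balancer rule."""
    frontend_port: int                 # port the load balancer listens on
    backend_port: int                  # port on the target servers
    protocol: str                      # HTTP, HTTPS, or TCP
    algorithm: str                     # LeastConn or Source
    session_persistence: bool = False  # optional sticky sessions

    def __post_init__(self):
        if self.protocol not in ALLOWED_PROTOCOLS:
            raise ValueError(f"unsupported protocol: {self.protocol}")
        if self.algorithm not in ALLOWED_ALGORITHMS:
            raise ValueError(f"unsupported algorithm: {self.algorithm}")
```

For example, a rule that terminates HTTPS on port 443 and forwards to back-end port 80 would be `LoadBalancerRule(443, 80, "HTTPS", "LeastConn")`.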

Main Load Balancing Algorithms

  • LeastConn:
    • routes traffic to the server with the fewest active connections;
    • dynamic algorithm suited for real-time load distribution;
    • ideal for persistent HTTP/HTTPS connections.
  • Source:
    • routes traffic based on a hash of the client's source IP address (the ephemeral source port changes per connection, so hashing the IP alone is what keeps affinity stable);
    • ensures the same client is always routed to the same server;
    • suitable for certain TCP-based services requiring session consistency.
Tip: For web applications (HTTP/HTTPS), use LeastConn. For specific TCP services where session affinity is required, evaluate the Source algorithm.
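The two algorithms can be sketched in a few lines. This is a conceptual illustration of the selection logic, not SeFlow's implementation; the function names and data shapes are assumptions.

```python
import hashlib

def least_conn(servers):
    """LeastConn: pick the back-end with the fewest active connections.
    `servers` maps a back-end address to its current connection count."""
    return min(servers, key=servers.get)

def source_hash(client_ip, backends):
    """Source: deterministically map a client IP to one back-end, so the
    same client is always routed to the same server."""
    digest = hashlib.sha256(client_ip.encode()).digest()
    index = int.from_bytes(digest[:4], "big") % len(backends)
    return backends[index]
```

Note the trade-off: LeastConn reacts to live load, while Source sacrifices even distribution to guarantee that repeat connections from one client land on the same server.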

Supported Protocols

The load balancer supports the following protocols:

  • HTTP: unencrypted web traffic (default port 80);
  • HTTPS: encrypted web traffic using SSL/TLS (default port 443);
  • TCP: any custom TCP-based service (e.g., internal services, custom ports, etc.).
Tip: Whenever possible, prefer HTTPS over HTTP to protect traffic between clients and the load balancer.
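Since TCP rules can point at any custom port, it is worth verifying that each back-end actually accepts connections on the configured port before adding it to the pool. A minimal reachability check, using only the Python standard library (the function name is illustrative):

```python
import socket

def backend_is_reachable(host, port, timeout=2.0):
    """Return True if a TCP connection to host:port succeeds
    within the timeout, False otherwise."""
    try:
        with socket.create_connection((host, port), timeout=timeout):
            return True
    except OSError:
        return False
```

Run this from a machine on the same network as the back-ends; a False result usually means the service is down or a firewall is blocking the port.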

HTTPS, Public Certificate, and Private Key

To use the load balancer in HTTPS mode with the LeastConn algorithm, you must upload a public certificate and its corresponding private key to the load balancer. This enables SSL/TLS termination directly on the load balancer, which decrypts the traffic before forwarding it to the back-end servers.

What is an SSL/TLS Certificate

An SSL/TLS configuration consists of two related components:

  • Public certificate: contains the public key and identity details signed by a Certificate Authority (CA);
  • Private key: must remain secret; it is used during the TLS handshake to prove the server's identity and establish the session keys.

Common file formats include .crt, .cer, and .pem for certificates, and .key for private keys.

Tip: Never share your private key. Anyone in possession of it can impersonate your service.
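Because certificates and private keys often arrive as similar-looking PEM text files, a quick sanity check before uploading can prevent pasting the wrong file into the wrong field. A minimal sketch that classifies a PEM file by its header line (the function name is an assumption for illustration):

```python
def pem_kind(pem_text):
    """Roughly classify PEM content by its BEGIN header.
    Returns 'certificate', 'private key', or 'unknown'."""
    if "BEGIN CERTIFICATE" in pem_text:
        return "certificate"
    # Matches "RSA PRIVATE KEY", "EC PRIVATE KEY", and PKCS#8 "PRIVATE KEY"
    if "PRIVATE KEY" in pem_text:
        return "private key"
    return "unknown"
```

This only checks the header markers, not the cryptographic validity of the contents; for a full check, inspect the files with a proper TLS toolkit.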

Uploading Certificate and Key to the Load Balancer

  1. Access the rule configuration.
  2. Select HTTPS as the protocol and specify the load balancer's port (usually 443).
  3. Define the back-end server port (typically 80 if the load balancer terminates SSL, or 443 if encrypted traffic is passed through to the servers).
  4. Upload the public certificate and private key.
  5. Save changes and test your site through the browser.
Tip: After configuration, test your domain with tools such as SSL Labs to verify certificate chains and TLS compatibility.

How to Add a Load Balancer

You can create a new load balancer from the SeFlow Control Panel.

  1. Log in to the SeFlow Control Panel.
  2. Select the Data Center.
  3. Go to Public Cloud → Network → Load Balancers.
  4. Click + CREATE LOAD BALANCER.
  5. Fill in the required details:
    • Load balancer name (e.g., lb-web-production);
    • Network / VLAN;
    • Public IP to associate;
    • Optional notification settings.
  6. Confirm and wait for the load balancer to become “Active”.
Tip: Use descriptive names that identify the environment (test, staging, production) and purpose (web, api, etc.).

Managing a Load Balancer

Accessing the Details Page

  1. Go to Public Cloud → Network → Load Balancers.
  2. Select the load balancer to manage.
  3. Click on Manage to open the details page.

From this page you can:

  • activate or deactivate the load balancer;
  • link or unlink back-end servers;
  • create, modify, or delete rules;
  • view traffic statistics;
  • modify name and notifications;
  • delete the load balancer.

Activating / Deactivating a Load Balancer

Deactivate a load balancer:

  1. Open the load balancer details.
  2. Click DEACTIVATE LOAD BALANCER.
  3. Confirm.

Activate a load balancer:

  1. Open the details page.
  2. Click ACTIVATE LOAD BALANCER.
  3. Confirm.
Tip: Deactivating a load balancer makes the service unreachable. Do this only during maintenance windows.

Linking and Unlinking Servers

Linking a Server

  1. Open the load balancer details.
  2. Go to Balanced Servers.
  3. Click LINK SERVER.
  4. Select:
    • Cloud VPS or Cloud Servers;
    • Dedicated Server IPs if supported.
  5. Confirm.

Unlinking a Server

  1. Go to Balanced Servers.
  2. Select the server to remove.
  3. Click UNLINK.
  4. Confirm.
Tip: Ensure at least one server is always connected, otherwise the load balancer cannot route traffic.

Creating or Removing Rules

Add a Rule

  1. Open the load balancer details.
  2. Scroll to Rules.
  3. Click CREATE RULE.
  4. Fill in:
    • protocol (HTTP, HTTPS, TCP);
    • load balancer port;
    • back-end server port;
    • algorithm;
    • extra parameters if needed.
  5. Confirm.

Remove a Rule

  1. Go to Rules.
  2. Select the rule.
  3. Click DELETE.
  4. Confirm.
Tip: Keep at least one active rule. If replacing a rule, first create the new one, then delete the old one.

Viewing Load Statistics

Under Load History (or Statistics) you can view:

  • traffic graphs (requests/sec, throughput);
  • per-server load distribution;
  • rule-level aggregated metrics.
Tip: Use statistics to identify bottlenecks and decide when to scale your infrastructure.

Editing Name and Notifications

  1. Go to the Edit tab in the load balancer details.
  2. Modify:
    • Load balancer name;
    • Notification settings.
  3. Save changes.

Deleting a Load Balancer

You can delete a load balancer when it is no longer needed.

  1. Open the load balancer details.
  2. Deactivate it if active.
  3. Click DELETE.
  4. Confirm the operation.
Tip: Before deleting a load balancer, ensure that traffic has been migrated elsewhere to avoid downtime.

Best Practices for Using SeFlow Load Balancers

  • use at least two back-end servers for production environments;
  • keep server configurations homogeneous (OS version, application stack);
  • use HTTPS for Internet-facing services;
  • monitor statistics to adjust capacity over time;
  • document load balancer configuration details within your team.
Tip: For mission-critical environments, combine load balancers with dedicated firewalls and advanced monitoring.
