Load Balancer
Introduction
The Load Balancer offered by SeFlow allows you to distribute incoming traffic across multiple servers (Cloud VPS, Cloud Servers, or Dedicated Servers), improving availability, performance, and service continuity. By routing traffic through a load balancer connected to a redundant pool of back-end servers, you prevent overloading a single machine and ensure greater resilience for your infrastructure.
What is a Load Balancer
A SeFlow load balancer is a network service with its own public IP address and one or more rules determining how traffic should be routed to back-end servers. From the user’s perspective, the load balancer is the only access point, while behind it may lie an entire server cluster.
A load balancer allows you to:
- distribute incoming traffic across multiple servers, preventing overloads;
- increase reliability through redundancy of back-end servers;
- reduce response times by routing clients to the least busy server;
- simplify maintenance by allowing you to remove servers from the pool without service interruption.
Rules Associated with a Load Balancer
Each load balancer is composed of one or more rules. A rule defines how incoming traffic should be routed and includes several key parameters:
- front-end IP and port of the load balancer;
- back-end IPs and ports of the target servers;
- protocol (HTTP, HTTPS, TCP);
- load balancing algorithm (e.g., LeastConn, Source);
- optional session persistence settings.
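A rule is essentially a small record of these parameters. The sketch below is purely illustrative: the field names are hypothetical and do not reflect SeFlow's internal representation or any API.

```python
from dataclasses import dataclass
from typing import List, Optional

@dataclass
class LoadBalancerRule:
    """Illustrative model of a load-balancing rule (hypothetical field names)."""
    frontend_ip: str            # public IP of the load balancer
    frontend_port: int          # port the load balancer listens on
    backend_targets: List[str]  # "ip:port" of each back-end server
    protocol: str               # "HTTP", "HTTPS", or "TCP"
    algorithm: str              # "LeastConn" or "Source"
    session_persistence: Optional[str] = None  # optional persistence setting

# Example: balance HTTPS traffic across two web servers with LeastConn.
rule = LoadBalancerRule(
    frontend_ip="203.0.113.10",  # documentation-range IP, placeholder only
    frontend_port=443,
    backend_targets=["10.0.0.11:80", "10.0.0.12:80"],
    protocol="HTTPS",
    algorithm="LeastConn",
)
```

Grouping the front-end, back-end, protocol, and algorithm into one rule object mirrors how the control panel presents them: one rule, one routing decision.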
Main Load Balancing Algorithms
- LeastConn:
  - routes traffic to the server with the fewest active connections;
  - dynamic algorithm suited for real-time load distribution;
  - ideal for persistent HTTP/HTTPS connections.
- Source:
  - routes traffic based on a hash of the client's source IP address;
  - ensures the same client is always routed to the same server;
  - suitable for certain TCP-based services requiring session consistency.
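Both algorithms can be sketched in a few lines. This is a conceptual model, not SeFlow's actual implementation; the server names and connection-count bookkeeping are illustrative.

```python
import hashlib

def least_conn(active_connections):
    """LeastConn: pick the server with the fewest active connections.

    active_connections maps server name -> current connection count.
    """
    return min(active_connections, key=active_connections.get)

def source_hash(client_ip, servers):
    """Source: hash the client's source IP so the same client always
    lands on the same back-end server (while the pool is unchanged)."""
    digest = hashlib.md5(client_ip.encode()).hexdigest()
    return servers[int(digest, 16) % len(servers)]

conns = {"web-1": 12, "web-2": 3, "web-3": 7}
print(least_conn(conns))  # prints "web-2": it has the fewest connections

servers = ["web-1", "web-2", "web-3"]
# The same client IP always maps to the same server:
assert source_hash("198.51.100.7", servers) == source_hash("198.51.100.7", servers)
```

Note the trade-off the sketch makes visible: LeastConn reacts to load in real time, while Source gives stable client-to-server mapping at the cost of ignoring current load.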
Supported Protocols
The load balancer supports the following protocols:
- HTTP: unencrypted web traffic (default port 80);
- HTTPS: encrypted web traffic using SSL/TLS (default port 443);
- TCP: any custom TCP-based service (e.g., internal services, custom ports, etc.).
HTTPS, Public Certificate, and Private Key
To use the load balancer in HTTPS mode with the LeastConn algorithm, you must upload a public certificate and its corresponding private key to the load balancer. This enables SSL/TLS termination directly on the load balancer, which decrypts the traffic before forwarding it to the back-end servers.
What is an SSL/TLS Certificate
An SSL/TLS setup consists of two related components:
- Public certificate: contains the public key and identity details, signed by a Certificate Authority (CA);
- Private key: must remain secret; it is used during the TLS handshake to prove ownership of the certificate and establish the encrypted session.
Common file formats include .crt, .cer, .pem for certificates and .key for private keys.
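Before uploading, it is worth confirming that both files are in the PEM text format. The helper below is a minimal, assumed sanity check: it only verifies the PEM markers, not the cryptographic validity of the certificate/key pair, and the file contents shown are truncated placeholders.

```python
def looks_like_pem(text, kind):
    """Return True if `text` contains a PEM block of the given kind.

    kind is e.g. "CERTIFICATE", "PRIVATE KEY", or "RSA PRIVATE KEY".
    """
    return (f"-----BEGIN {kind}-----" in text
            and f"-----END {kind}-----" in text)

# Placeholder file contents for illustration (real files hold base64 data):
cert_text = """-----BEGIN CERTIFICATE-----
MIIB...example, truncated...
-----END CERTIFICATE-----
"""
key_text = """-----BEGIN PRIVATE KEY-----
MIIE...example, truncated...
-----END PRIVATE KEY-----
"""

assert looks_like_pem(cert_text, "CERTIFICATE")
assert looks_like_pem(key_text, "PRIVATE KEY")
```

If a certificate arrived in a binary format such as DER, it must be converted to PEM text before the panel will accept it.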
Uploading Certificate and Key to the Load Balancer
- Access the rule configuration.
- Select HTTPS as the protocol and specify the load balancer's port (usually 443).
- Define the back-end server port: typically 80 when the load balancer terminates SSL/TLS and forwards plain HTTP, or 443 when encrypted traffic is passed through to the back end.
- Upload the public certificate and private key.
- Save changes and test your site through the browser.
How to Add a Load Balancer
You can create a new load balancer from the SeFlow Control Panel.
- Log in to the SeFlow Control Panel.
- Select the Data Center.
- Go to Public Cloud → Network → Load Balancers.
- Click + CREATE LOAD BALANCER.
- Fill in the required details:
  - load balancer name (e.g., lb-web-production);
  - network / VLAN;
  - public IP to associate;
  - optional notification settings.
- Confirm and wait for the load balancer to become “Active”.
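Once the load balancer is “Active”, a quick way to confirm the front end is reachable is a plain TCP connection test. This is a generic sketch; the commented-out host and port are placeholders for your load balancer's public IP and rule port.

```python
import socket

def port_open(host, port, timeout=3.0):
    """Return True if a TCP connection to host:port succeeds within timeout."""
    try:
        with socket.create_connection((host, port), timeout=timeout):
            return True
    except OSError:
        return False

# Replace with your load balancer's public IP and front-end port:
# print(port_open("203.0.113.10", 443))
```

A successful connection only proves the listener is up; it does not verify that back-end servers are healthy or that the rule routes traffic correctly.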
Managing a Load Balancer
Accessing the Details Page
- Go to Public Cloud → Network → Load Balancers.
- Select the load balancer to manage.
- Click on Manage to open the details page.
From this page you can:
- activate or deactivate the load balancer;
- link or unlink back-end servers;
- create, modify, or delete rules;
- view traffic statistics;
- modify name and notifications;
- delete the load balancer.
Activating / Deactivating a Load Balancer
Deactivate a load balancer:
- Open the load balancer details.
- Click DEACTIVATE LOAD BALANCER.
- Confirm.
Activate a load balancer:
- Open the details page.
- Click ACTIVATE LOAD BALANCER.
- Confirm.
Linking and Unlinking Servers
Linking a Server
- Open the load balancer details.
- Go to Balanced Servers.
- Click LINK SERVER.
- Select:
  - Cloud VPS or Cloud Servers;
  - Dedicated Server IPs, if supported.
- Confirm.
Unlinking a Server
- Go to Balanced Servers.
- Select the server to remove.
- Click UNLINK.
- Confirm.
Creating or Removing Rules
Add a Rule
- Open the load balancer details.
- Scroll to Rules.
- Click CREATE RULE.
- Fill in:
  - protocol (HTTP, HTTPS, TCP);
  - load balancer port;
  - back-end server port;
  - algorithm;
  - extra parameters if needed.
- Confirm.
Remove a Rule
- Go to Rules.
- Select the rule.
- Click DELETE.
- Confirm.
Viewing Load Statistics
Under Load History (or Statistics) you can view:
- traffic graphs (requests/sec, throughput);
- per-server load distribution;
- rule-level aggregated metrics.
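When reading the per-server distribution, it helps to normalize raw request counts into shares of total traffic so that an imbalance stands out. A small illustrative calculation (the counter values are made up):

```python
def distribution(request_counts):
    """Convert per-server request counts into percentage shares of total traffic."""
    total = sum(request_counts.values())
    return {server: round(100 * count / total, 1)
            for server, count in request_counts.items()}

counts = {"web-1": 5200, "web-2": 4900, "web-3": 900}  # hypothetical counters
print(distribution(counts))
# A server receiving well under its expected share (here, web-3 at roughly
# 8% instead of ~33%) may be unhealthy or recently added to the pool.
```

With LeastConn, shares should stay roughly even across homogeneous servers; with Source, skew is normal when a few client IPs dominate the traffic.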
Editing Name and Notifications
- Go to the Edit tab in the load balancer details.
- Modify:
  - load balancer name;
  - notification settings.
- Save changes.
Deleting a Load Balancer
You can delete a load balancer when it is no longer needed.
- Open the load balancer details.
- Deactivate it if active.
- Click DELETE.
- Confirm the operation.
Best Practices for Using SeFlow Load Balancers
- use at least two back-end servers for production environments;
- keep server configurations homogeneous (OS version, application stack);
- use HTTPS for Internet-facing services;
- monitor statistics to adjust capacity over time;
- document load balancer configuration details within your team.
