Load Balancer

Improve the performance of your services as you grow.

Increased availability

Load Balancer is the easiest way to build a resilient platform thanks to a highly available architecture. All backend servers are monitored to ensure that traffic is distributed among healthy resources.

Peak load management

Avoid dips in performance by adding as many backend servers as necessary. Increase your processing capacity in a few clicks from the Scaleway console or configure automatic scaling using the API.

Scale your business

As your business grows, you need more resources to succeed. With Load Balancer, you can easily scale your business by adding new backend servers to improve your quality of service without any downtime.

Available zones:
Paris: PAR 1, PAR 2
Amsterdam: AMS 1, AMS 2
Warsaw: WAW 1

Load Balancer use cases

Handle peaks in e-commerce activity

Use Load Balancer to distribute workloads across multiple servers during traffic peaks on your website, ensuring continued availability and preventing servers from being overloaded.

Under the hood

  • Bandwidth: up to 4 Gbit/s

  • Multi-cloud compatibility: on LB-GP-L & LB-GP-XL offers

  • Health checks: HTTP(S), MySQL, PgSQL, LDAP, Redis, TCP

  • Balancing algorithms: round-robin, sticky, least connection, first healthy

  • Backend servers: unlimited

  • Redundancy: high availability

  • Service level: 99.99% SLA

  • HTTPS: Let’s Encrypt & custom SSL certificates

Key features

Health checks

Whether you are distributing your workload between web servers, databases, or other TCP services, you can easily set up health checks to ensure the availability of your backend servers. You can even monitor their availability in real time. If one of them fails to respond, its traffic is automatically redirected until the problem is solved.
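The idea behind a health check can be sketched in a few lines: a backend stays in rotation only while it responds. The snippet below is an illustrative Python sketch of a TCP-style check, not Scaleway's actual implementation; the function names and the backend list are hypothetical.

```python
import socket

def is_healthy(host: str, port: int, timeout: float = 2.0) -> bool:
    """Return True if a TCP connection to host:port succeeds within the timeout."""
    try:
        with socket.create_connection((host, port), timeout=timeout):
            return True
    except OSError:
        return False

def healthy_backends(backends):
    """Filter the backend pool down to servers that currently pass the check."""
    return [b for b in backends if is_healthy(*b)]
```

A managed Load Balancer runs checks like this periodically and redirects traffic away from any backend that stops responding.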

SSL - Offloading

Improve network performance by terminating SSL/TLS connections at your Load Balancer (SSL offloading), speeding up backend request processing as well as communication between servers and end users.

Balancing rules/Proxy

Use our Load Balancer to regulate traffic according to your use cases. Round-robin, sticky connections, least connection or first healthy rules are good examples of what is possible.
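Two of these rules can be illustrated with a short Python sketch. This is a conceptual example with a hypothetical backend pool, not how the Load Balancer is implemented internally: round-robin hands out backends in a fixed rotating order, while least connection picks whichever backend currently has the fewest active connections.

```python
from itertools import cycle

# Hypothetical backend pool used only for illustration.
backends = ["10.0.0.1", "10.0.0.2", "10.0.0.3"]

# Round-robin: rotate through the pool in a fixed order.
rr = cycle(backends)

def round_robin():
    return next(rr)

# Least connection: track active connections and pick the least-loaded backend.
active = {b: 0 for b in backends}

def least_connections():
    target = min(active, key=active.get)
    active[target] += 1  # the new connection is assigned to this backend
    return target
```

Sticky connections add a twist on top of either rule: a given client is pinned to the same backend across requests.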

Unlimited backends

Add as many backend servers as you want to our Load Balancer and scale your infrastructure on the fly, without any limits. With the multi-cloud offer, you can also distribute your traffic across multiple platforms.

ACL permissions

Filter the IP addresses that are allowed to reach your servers. Block unwanted visitors to keep them from consuming your bandwidth, thus increasing security.
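The filtering logic behind an IP-based ACL boils down to a subnet membership test. Here is a minimal Python sketch of an allowlist, using documentation address ranges; the `ALLOWED` networks and function name are hypothetical, not part of the product's API.

```python
from ipaddress import ip_address, ip_network

# Hypothetical ACL: allow one subnet, reject everything else.
ALLOWED = [ip_network("203.0.113.0/24")]

def is_allowed(client_ip: str) -> bool:
    """Return True if the client IP falls inside an allowed network."""
    addr = ip_address(client_ip)
    return any(addr in net for net in ALLOWED)
```

In practice you configure such rules on the Load Balancer itself, so rejected traffic never reaches your backend servers.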

Multi-cloud connections

Some offers allow you to distribute your traffic between different platforms or any on-premise server or Instance. This allows you to build a more robust infrastructure and avoids depending on a single platform.

Kubernetes-ready

Use our Load Balancer to expose your containers and pods to the internet, so they have a common DNS and IP address, and to balance workloads.

Bandwidth of up to 4Gbit/s

With a sizable bandwidth offer, there’s no use case we don’t support. And, as we do not charge for egress, you will be billed a fixed price with no surprises.

Adapt the protocol

You can configure your Load Balancer’s backend and choose the protocol (HTTP, HTTPS, or TCP) used to send and receive data.

Pricing

No egress fees

Name      Multicloud  Backend servers  Traffic    Bandwidth   Price
LB-GP-S   No          Unlimited        Unlimited  200 Mbit/s  €0.016/hour
LB-GP-M   No          Unlimited        Unlimited  500 Mbit/s  €0.037/hour
LB-GP-L   Yes         Unlimited        Unlimited  1 Gbit/s    €0.094/hour
LB-GP-XL  Yes         Unlimited        Unlimited  4 Gbit/s    €0.941/hour

Start in minutes

Increase resiliency, improve infrastructure security, and tune your traffic: add a Load Balancer to your Instances.


Get started with tutorials

  • First steps with Scaleway Load Balancer
  • Setting up a Load Balancer for WordPress
  • How to use the Proxy protocol v2 with Load Balancer

Frequently asked questions

What is a Load Balancer?

Load Balancers are highly available, fully managed Instances that distribute workloads among your servers. They let applications scale while remaining continuously available. They are commonly used to improve the performance and reliability of websites, applications, databases, and other services by distributing workloads across multiple servers.

How does a Load Balancer work?

It monitors the availability of your backend servers, detects when a server fails, and rebalances the load across the remaining servers, keeping your applications highly available for users.

What is multi-cloud?

Multi-cloud is an environment in which multiple cloud providers are used simultaneously. With our multi-cloud offers, you can add backend servers beyond Scaleway Instances, Elastic Metal, and Dedibox servers.
These can be services from other cloud platforms such as Amazon Web Services, Digital Ocean, Google Cloud, Microsoft Azure, or OVHcloud, but also on-premise servers hosted in a third-party datacenter.

Which protocols does Load Balancer support?

All TCP-based protocols are supported, including database protocols, HTTP, LDAP, IMAP, and so on. You can also specify HTTP to benefit from support and features that are exclusive to this protocol.

Can I restrict access to my Load Balancer?

Yes, you can restrict the use of a TCP port or an HTTP URL via ACLs. You can find more information in the documentation.

Does Load Balancer support IPv6?

Each Load Balancer provides external connectivity via an IPv4 address. IPv6 is not yet supported for external connections, but it can be used to communicate between the Load Balancer and your backend servers.

Do my backend servers need public IP addresses?

No, it’s not required. You can use private Scaleway IPs on your backend servers if they are hosted in the same Availability Zone (AZ) as the Load Balancer.