How our unique MultiCloud Load Balancer Works
Thanks to its redundant architecture, a Load Balancer allows you to guarantee your service’s availability. In case of primary instance failure, a secondary instance instantly takes over while a fresh instance is automatically provisioned to maintain redundancy.
The LB-GP-L offer allows you to distribute your traffic between different platforms such as Amazon Web Services, Digital Ocean, Google Cloud, Microsoft Azure, OVHcloud or any on-premises server or instance. This allows you to build a more robust infrastructure and limit your dependency on a single platform.
Scaleway is the only cloud provider to guarantee the bandwidth of your load balancers. You can count on up to 1 Gbit/s to distribute your traffic and manage your peak loads.
Add as many backend servers as you want to our Load Balancer and easily configure your balancing algorithm (round-robin, sticky or least connection). Scale your infrastructure on the fly, with no limits, and distribute your traffic across multiple platforms with the multicloud offer.
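As a minimal sketch of how these three algorithms pick a backend, here is an illustration in Python. The backend names, connection counts, and the IP-hashing rule for sticky sessions are hypothetical; the real selection happens inside the Load Balancer:

```python
import hashlib
from itertools import cycle

# Hypothetical backend pool, for illustration only.
backends = ["srv-a", "srv-b", "srv-c"]

# Round-robin: each new connection goes to the next backend in turn.
rr = cycle(backends)
rr_order = [next(rr) for _ in range(6)]  # cycles srv-a, srv-b, srv-c, ...

# Least connection: pick the backend currently handling the fewest
# active connections (the counts here are made up).
active = {"srv-a": 4, "srv-b": 1, "srv-c": 2}
least_loaded = min(active, key=active.get)  # "srv-b"

# Sticky: pin a given client to the same backend, e.g. by hashing
# its IP, so session state stays on one server.
def sticky(client_ip: str) -> str:
    digest = int(hashlib.sha256(client_ip.encode()).hexdigest(), 16)
    return backends[digest % len(backends)]
```

Round-robin spreads connections evenly, least connection adapts to uneven request costs, and sticky trades some balance for session affinity.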
The volume transferred is free and unlimited with our Load Balancer offers. In addition, you are not charged for the number of rules created or requests processed. You pay a fixed price with no surprises.
Whether you are distributing your load to web servers, databases, or other TCP services, you can easily set up health checks to ensure the availability of your backend servers. You can also monitor their availability in real-time. If one of them fails to respond, its traffic is automatically redirected until the problem is solved.
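At its simplest, a TCP health check amounts to periodically probing each backend and pulling unresponsive ones out of rotation. A minimal sketch in Python, assuming placeholder backend addresses (the Load Balancer performs the real checks for you):

```python
import socket

def tcp_health_check(host: str, port: int, timeout: float = 2.0) -> bool:
    """Return True if a TCP connection to host:port succeeds within timeout."""
    try:
        with socket.create_connection((host, port), timeout=timeout):
            return True
    except OSError:
        return False

# A load balancer runs checks like this on a schedule; backends that
# fail are removed from rotation until they respond again.
backends = [("192.0.2.10", 80), ("192.0.2.11", 80)]  # placeholder addresses
healthy = [addr for addr in backends
           if tcp_health_check(*addr, timeout=0.5)]
```

Real health checks can also go further than a raw TCP connect, for example expecting a specific HTTP status code from a backend.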
To secure your traffic with HTTPS, you can create your Let’s Encrypt certificates directly from the Scaleway console or through our API. You can also import your own SSL certificate to authenticate your website.
We ensure your services are always up and running. Our technical assistance is available 24 hours a day, 7 days a week to answer all your questions and to assist you. Simply open a new ticket in case of a problem. You can also reach our support directly by phone, or get faster responses by upgrading your support plan.
Discover our top Use Cases for Load Balancer
Load Balancer is the easiest way to build a resilient platform thanks to a primary-secondary architecture. All backend servers are monitored to ensure that traffic is distributed among healthy resources.
Avoid dips in performance by adding as many backend servers as necessary. Increase your processing capacity in a few clicks from the Scaleway console or configure automatic scaling using the API.
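One way to script scaling against the API is to derive the desired backend count from a load metric and then apply it to the backend pool. The rule, thresholds, and function name below are illustrative assumptions, not a built-in Scaleway feature:

```python
import math

def desired_backends(current_rps: float, rps_per_backend: float,
                     minimum: int = 2) -> int:
    """Hypothetical autoscaling rule: provision enough backends to absorb
    the current request rate, never dropping below `minimum` so that
    redundancy is preserved."""
    return max(minimum, math.ceil(current_rps / rps_per_backend))

# At 2500 req/s with backends sized for ~800 req/s each, this yields
# 4 servers; a script would then update the backend pool via the API.
```

Keeping a floor of two backends preserves redundancy even when traffic drops to zero.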
As your business grows, you need more resources to succeed. With Load Balancer, you can easily scale your business by adding new backend servers to improve your quality of service without any downtime.
Choose from three types of Load Balancer including a high performance Multicloud product to instantly scale your infrastructure!
Load Balancers are highly available, fully managed instances that distribute workloads across your servers. They allow your applications to scale while remaining continuously available, and are commonly used to improve the performance and reliability of websites, applications, databases, and other services by spreading workloads across multiple servers.
A Load Balancer monitors the availability of your backend servers, detects when a server fails, and rebalances the load across the remaining servers, keeping your applications highly available for users.
Multicloud means that you can add backend servers other than Scaleway Instances, Bare Metal cloud servers, and Dedibox dedicated servers.
These can be services from other cloud platforms such as Amazon Web Services, Digital Ocean, Google Cloud, Microsoft Azure or OVHcloud, but also on-premises servers hosted in a third-party datacenter.
Unlike our multicloud offer, non-multicloud offers only allow you to add backend servers that are part of the Scaleway ecosystem, which includes Instances, Bare Metal cloud servers, and Dedibox dedicated servers.
All protocols based on TCP are supported, including database protocols, HTTP, LDAP, IMAP, and so on. You can also specify HTTP to benefit from support and features that are exclusive to this protocol.
Yes, you can restrict the use of a TCP port or HTTP URL via ACLs. Find more information here.
Each Load Balancer provides external connectivity via an IPv4 address. IPv6 is not yet supported for external connections, but it can be used to communicate between the Load Balancer and your backend servers.
No, it’s not required. You can use private Scaleway IPs on your backend servers if they are hosted in the same Availability Zone (AZ) as the Load Balancer.