Serverless overview
What is serverless compute?
Serverless computing is a cloud computing model in which the cloud provider manages the infrastructure and dynamically allocates computing resources as needed.
This means that there is no need for users to provision machines, manage a cluster, or pay for servers. Your application code is executed on demand, and you only pay for the computing time consumed by your software.
This approach enables greater scalability, flexibility, and cost-effectiveness.
Key features:
- Automatic scaling: Your application can scale up or down automatically based on demand
- No server management: No need to provision, maintain, or manage servers
- Pay as you go: Only pay for the computing time you consume, making it cost-effective
The Scaleway Serverless ecosystem is not limited to Serverless Functions, which are perfect for deploying small chunks of code. You can also deploy containers directly on Serverless Containers and Serverless Jobs. Refer to the differences between Functions, Jobs, and Containers for more information on Scaleway's different Serverless products.
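For instance, a Serverless Function can be as small as a single handler. The minimal Python sketch below assumes a `handle(event, context)` entry point and an HTTP-style response dict, which matches the pattern used in Scaleway's Python function examples; check the Functions documentation for the exact contract of your runtime.

```python
# Minimal sketch of a Python Serverless Function handler.
# Assumption: the runtime calls a function named handle(event, context)
# and accepts a dict-shaped HTTP response; verify against the Functions
# documentation for your chosen runtime.
import json


def handle(event, context):
    """Return a small JSON greeting; no server or cluster to manage."""
    name = (event.get("queryStringParameters") or {}).get("name", "world")
    return {
        "statusCode": 200,
        "headers": {"Content-Type": "application/json"},
        "body": json.dumps({"message": f"Hello, {name}!"}),
    }
```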
Why Serverless?
Serverless offers numerous advantages that can significantly enhance your development and operational efficiency:
- Cost savings: You only pay for the compute time your code uses, which can lead to significant cost reductions. No idle resources are left running and incurring unnecessary costs
- Scalability: Automatically scale your applications to handle varying loads without manual intervention. It is perfect for seasonal traffic and viral apps
- Faster time to market: Focus on writing code rather than managing infrastructure, accelerating your development cycles
- Reduced operational overhead: Let the cloud provider handle server maintenance, updates, and scaling, freeing up your team to focus on innovation
- Eco-friendly: Optimized resource usage reduces wasted energy
- Focus on apps: You can put all your energy into building application value instead of managing infrastructure
How to control Serverless costs
Serverless is inherently cost-transparent. Here are some tips to optimize costs:
- Cost estimator: When deploying Serverless resources via the Scaleway console, you can try different parameters to evaluate costs
- Best practices: Use efficient code with optimized libraries, ensuring a small resource footprint
- Monitoring: Monitor usage with built-in observability, and refer to the dedicated documentation to find out how to use the cost manager
- Parameters: Define a max-scale setting suited to your expected traffic spikes (see the sketch below)
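As a rough illustration of the last point, the sketch below caps a Serverless Container's maximum scale through the HTTP API with Python's `requests`. The endpoint path, the `max_scale` field, and the placeholder ID are assumptions based on Scaleway's public API conventions; verify them against the Serverless Containers API reference before relying on them.

```python
# Hedged sketch: cap a Serverless Container's maximum scale to control costs.
# Assumptions: the endpoint path, the max_scale field, and the X-Auth-Token
# header follow the Serverless Containers API reference; the container ID is
# a placeholder.
import os

import requests

REGION = "fr-par"
CONTAINER_ID = "11111111-2222-3333-4444-555555555555"  # placeholder ID

resp = requests.patch(
    f"https://api.scaleway.com/containers/v1beta1/regions/{REGION}/containers/{CONTAINER_ID}",
    headers={"X-Auth-Token": os.environ["SCW_SECRET_KEY"]},
    json={"max_scale": 5},  # hard ceiling on concurrent instances
    timeout=10,
)
resp.raise_for_status()
print(resp.json())
```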
Serverless eliminates upfront capital expenses and reduces operational costs, giving you predictable, granular billing.
Ready to go Serverless?
By adopting Serverless, you are choosing agility, innovation, and cost savings. Whether you are a startup scaling rapidly or an enterprise modernizing legacy systems, Serverless lets you focus on what matters: delivering value to your users.
You can fully deploy your API on Serverless or use it to extend and automate your existing infrastructure. Serverless is excellent at handling traffic spikes, which makes it useful for offloading regular servers during special events.
Is my application a good fit for Serverless?
Most applications can benefit from Serverless. Check out some use cases:
- Event-driven workloads: File processing, real-time notifications, IoT data streams (see the sketch after this list)
- Microservices and APIs: Stateless, short-lived tasks (e.g., user authentication, payment processing, etc.)
- Sporadic traffic: Apps with variable usage (e.g., marketing campaigns, ticketing systems)
- Rapid prototyping: Test ideas quickly without upfront infrastructure investment
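To make the event-driven case concrete, here is a hedged sketch of a stateless handler that processes a small JSON payload, such as a notification about an uploaded file. The event shape and the `handle(event, context)` entry point are assumptions chosen for illustration; real triggers deliver provider-specific payloads.

```python
# Illustrative sketch of an event-driven, stateless handler.
# Assumption: the event body carries a JSON document describing an uploaded
# file; adapt the parsing to the actual payload of your trigger.
import json


def handle(event, context):
    payload = json.loads(event.get("body") or "{}")
    object_key = payload.get("object_key", "unknown")
    size_bytes = int(payload.get("size", 0))

    # Stateless processing: derive a result and return it immediately.
    report = {
        "object": object_key,
        "size_kib": round(size_bytes / 1024, 1),
        "status": "processed",
    }
    return {"statusCode": 200, "body": json.dumps(report)}
```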
How secure are Serverless resources?
Scaleway prioritizes strong security and isolation for Serverless products.
- Secure isolation layers: Each container runs in a secure, isolated environment. Our systems provide VM-like security while maintaining container-like performance
How do I debug and monitor applications in a Serverless environment?
Scaleway provides full observability:
- Logs and metrics: Centralized logging and real-time metrics via Cockpit
- Local debugging: Test containers locally using the Serverless CLI and emulator
- Error reporting: Automatic alerts for failed invocations or resource bottlenecks via Cockpit. See how to configure alerts for Serverless Containers
Because our Serverless environment avoids vendor lock-in, you can also easily run and debug your containers locally.
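Because a handler like the ones sketched above is plain code with no proprietary imports, the simplest local check is to call it directly with a fabricated event, as in the hedged sketch below; the module name and event shape are assumptions chosen only to exercise the handler, and the Serverless CLI and emulator provide a fuller local environment.

```python
# Local debugging sketch: invoke a handler directly with a fabricated event.
# Assumptions: the handler follows the handle(event, context) pattern shown
# earlier and lives in a module named handler.py; the fake event exists only
# to exercise it locally.
import json

from handler import handle  # hypothetical module containing your handler

fake_event = {
    "httpMethod": "GET",
    "queryStringParameters": {"name": "local-test"},
    "body": None,
}

response = handle(fake_event, context=None)
assert response["statusCode"] == 200
print(json.dumps(response, indent=2))
```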
Will Serverless lock me into the ecosystem?
No. Scaleway Serverless is designed to minimize vendor lock-in. We believe in empowering your freedom to choose, adapt, and evolve. Here is how we ensure flexibility:
Container portability
- Docker compatibility: Your container images (built with Docker or any other OCI-compatible tooling) are portable. You can redeploy them elsewhere: on-premises, on other clouds, or in hybrid environments.
- No proprietary formats: We do not modify your containers. What you build works anywhere.
No proprietary lock-in
- No forced dependencies: Some providers require you to import provider-specific libraries for your code to run properly. We do not.
- Open APIs: Manage Serverless Containers via REST APIs, Terraform, the CLI, and more, with no proprietary tooling required (see the sketch below).
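As a small illustration of that openness, the hedged sketch below lists Serverless Containers with nothing more than Python, `requests`, and an API secret key; the endpoint path, query parameter, and response fields are assumptions to verify against the API reference, and the same operation is available through Terraform or the CLI.

```python
# Hedged sketch: list Serverless Containers in a namespace over the open
# REST API, using only the requests library and an API secret key.
# Assumptions: endpoint path, namespace_id parameter, and response fields
# follow the Serverless Containers API reference; the namespace ID is a
# placeholder.
import os

import requests

REGION = "fr-par"
NAMESPACE_ID = "11111111-2222-3333-4444-555555555555"  # placeholder ID

resp = requests.get(
    f"https://api.scaleway.com/containers/v1beta1/regions/{REGION}/containers",
    headers={"X-Auth-Token": os.environ["SCW_SECRET_KEY"]},
    params={"namespace_id": NAMESPACE_ID},
    timeout=10,
)
resp.raise_for_status()
for container in resp.json().get("containers", []):
    print(container.get("name"), container.get("status"))
```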
How to migrate to a Serverless ecosystem
Scaleway's Serverless products let you migrate to Serverless gradually, offering different strategies:
Start small
- Small workloads: Migrate non-critical workloads first, like parts of APIs, automation, and scheduled tasks
- Proof of concept: Use our tutorials and check the scaleway/serverless-examples repository for inspiration
Hybrid architecture
- Coexistence: Run serverless endpoints alongside VMs, clusters, and traditional apps
Incremental refactoring
- Break monoliths: Convert microservices or stateless components to Serverless first
What specific knowledge is required to deploy Serverless projects?
Serverless is designed to eliminate infrastructure complexity, so teams can focus on innovation:
No infrastructure expertise needed
- Managed services: Scaleway handles networking, scaling, patching, and availability
- Simplified operations: No need for DevOps engineers to manage clusters or servers
Developer-centric workflow
- Familiar tools: Use Git, Docker, CI/CD pipelines, and IDEs you already know
- Language flexibility: Support for Python, Node.js, Rust, Go, PHP, and custom runtimes via Serverless Containers
Learn Serverless basics
- Minimal learning curve: Teams only need to understand event-driven architecture, container basics, and Scaleway’s serverless console/CLI
- Training resources: Free tutorials, examples, and a free tier for experimenting and testing
Is Serverless a good choice for a growing business?
Absolutely. Serverless is ideal for startups and scaling businesses due to its cost efficiency, elasticity, and operational simplicity:
Auto-scaling for traffic spikes
- Zero manual intervention: Automatically scale from zero to millions of requests during flash sales, marketing campaigns, or viral events
- Example: An e-commerce app handles Black Friday traffic seamlessly without provisioning extra resources
Pay-as-you-go cost model
- No idle costs: Growing businesses avoid overspending on underutilized infrastructure
- Predictable budgeting: Use the cost estimator to forecast costs based on expected usage (see the sketch below)
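To show what usage-based budgeting can look like, here is a hedged back-of-the-envelope estimator. The unit prices are function parameters, not Scaleway's actual rates, and the model (memory multiplied by execution time, plus a per-request charge) is a common serverless billing pattern used here purely for illustration; use the console cost estimator and the pricing page for real figures.

```python
# Back-of-the-envelope cost sketch for a pay-as-you-go workload.
# The unit prices are HYPOTHETICAL parameters, not Scaleway's rates: plug in
# the values from the pricing page or the console cost estimator.


def estimate_monthly_cost(
    requests_per_month: int,
    avg_duration_s: float,
    memory_gb: float,
    price_per_gb_s: float,          # hypothetical price per GB-second
    price_per_million_req: float,   # hypothetical price per million requests
) -> float:
    gb_seconds = requests_per_month * avg_duration_s * memory_gb
    compute_cost = gb_seconds * price_per_gb_s
    request_cost = (requests_per_month / 1_000_000) * price_per_million_req
    return compute_cost + request_cost


# Example: 2M requests/month, 120 ms average duration, 256 MB of memory,
# with placeholder unit prices purely for illustration.
print(round(estimate_monthly_cost(
    requests_per_month=2_000_000,
    avg_duration_s=0.120,
    memory_gb=0.256,
    price_per_gb_s=0.00001,
    price_per_million_req=0.15,
), 2))
```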
Focus on core innovation
- Reduce operational overhead: Teams avoid spending time on server management, freeing resources for product development
- Faster iteration: Deploy updates in minutes without downtime
Enterprise-ready as you scale
- SLAs and security: Check our SLA page
- Hybrid flexibility: Seamlessly integrate with other Scaleway services (e.g., Managed Databases, Queues, Topics and Events, and Managed Inference) to support complex workflows