Serverless: 1 year in, and we’re just getting started
Scaleway’s Serverless ecosystem is one year old today—the perfect occasion to tell you all about how Serverless was built, and the new features planned for the year ahead.
If done right, Serverless should be a no-brainer for any use-case: less time spent configuring and scaling means more time spent building applications; auto-scaling means that the same code that serves one user can serve thousands of users; and pay-as-you-go billing means no more paying for over-provisioned, under-utilized resources.
However, because Serverless is still in its infancy, it lacks the ecosystem of frameworks and libraries that exists for other cloud products. It also requires a shift in mindset, as developers get used to handing over control of their infrastructure to cloud providers. To give a better view of the state of Serverless today, we’ll discuss a few use-cases where it works well, and a few use-cases where it doesn’t work so well.
Functions-as-a-Service (FaaS) and Containers-as-a-Service (CaaS) are effective tools for building APIs, where each API endpoint is packaged as a single function or container. Rather than having to provision, configure and scale a load balancer and a set of VMs, the user can trust the Serverless provider to handle all the auto-scaling, load-balancing and networking, leaving them free to focus on their application logic. In addition, by writing each endpoint as its own function/container, the endpoints will each scale independently, thus freeing the developer from tuning their own auto-scaling logic. The application state behind the APIs can be stored in serverless storage, such as object storage, or a serverless NoSQL database.
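To make this concrete, here is a minimal sketch of one API endpoint written as a FaaS handler. The `handle(event, context)` signature, the event fields, and the in-memory “store” are illustrative assumptions following common FaaS conventions; exact names vary between providers, and a real endpoint would read from serverless storage instead.

```python
import json

# Hypothetical in-memory stand-in for a serverless database or
# object storage bucket (assumption, for illustration only).
FAKE_DB = {"42": {"id": "42", "name": "widget"}}

def handle(event, context):
    """HTTP handler for GET /products/<id>.

    The event shape (here, a dict with "pathParameters") follows a
    common FaaS convention; field names differ per provider.
    """
    product_id = event.get("pathParameters", {}).get("id")
    product = FAKE_DB.get(product_id)
    if product is None:
        return {"statusCode": 404, "body": json.dumps({"error": "not found"})}
    return {"statusCode": 200, "body": json.dumps(product)}
```

Because each endpoint is its own handler like this one, the platform can scale a hot endpoint independently of the rest of the API.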
Serverless can be used to build scalable workflows, such as downstream data processing, or image processing pipelines. By expressing each stage of the workflow as a serverless function/container and connecting them via message queues, the user does not have to determine the appropriate parallelism and resource levels for each stage, nor waste resources on over-provisioned VMs.
Distributed cloud applications often need to connect streams of data between different parts of the application. This data will frequently need to be transformed or aggregated, such as when collecting events from a web application to write to a data warehouse. Rather than managing dedicated resources for this task, developers can create serverless functions, automatically triggered by upstream events, to run on demand, and transform or aggregate the data.
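As a sketch of this kind of transformation step, the function below aggregates a batch of web events into warehouse-ready rows. The event shape (`{"page": ...}`) and the batch-triggered invocation model are assumptions for illustration.

```python
from collections import Counter

def aggregate_events(events):
    """Triggered with a batch of upstream events (shape assumed here:
    dicts with a "page" key), returns one row per page with a click
    count, ready to be written to a data warehouse."""
    counts = Counter(e["page"] for e in events)
    return [{"page": page, "clicks": n} for page, n in sorted(counts.items())]
```

Because the function only runs when events arrive, no dedicated resources sit idle between batches.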
With Serverless, the cloud provider is responsible for managing and scaling your infrastructure. The underlying platform must therefore cater for a range of different use-cases, scales, and performance expectations. This can lead to a “lowest common denominator” effect, where serverless will cater to the minimum requirements for all, but not do well in more custom or high-performance scenarios.
Certain characteristics of an application can make it unsuitable for serverless computing:
- If your application requires a high level of computational power, such as video rendering or scientific simulations, serverless computing may not be able to provide the necessary resources.
- Serverless computing is designed for short-lived, stateless functions. If your application requires long-running processes or needs to maintain state, it may be better to use a traditional server-based model.
- Serverless runs your application on shared infrastructure, i.e. side by side with other, potentially untrusted applications. Cloud providers make every effort to isolate serverless applications, and there is no reason to believe it is any less secure than any other form of cloud computing. However, it is unlikely to meet the strict regulatory requirements around sensitive data, such as that used in medical, governmental, or law-enforcement applications.
- If your application is expected to receive a high volume of traffic, the cost of running it on a serverless platform could be significantly higher than using a traditional server-based model.
- If your application is complex, it may be more difficult to break it down into smaller, independent functions that can be run on a serverless platform.
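The traffic/cost trade-off is easy to estimate with back-of-envelope arithmetic. The function below sketches the usual serverless pricing model (per-request fee plus GB-seconds of compute); the prices used are illustrative placeholders, not any provider’s real tariff.

```python
def monthly_function_cost(requests_per_month, avg_duration_s, mem_gb,
                          price_per_gb_s=0.0000100,
                          price_per_million_req=0.15):
    """Rough serverless bill: compute (GB-seconds) plus request fees.
    All prices are made-up placeholders for illustration."""
    compute = requests_per_month * avg_duration_s * mem_gb * price_per_gb_s
    requests = requests_per_month / 1_000_000 * price_per_million_req
    return compute + requests

# Compare against a hypothetical always-on VM at a flat monthly price.
VM_MONTHLY_COST = 20.0

low_traffic = monthly_function_cost(100_000, 0.2, 0.125)       # well under the VM
high_traffic = monthly_function_cost(500_000_000, 0.2, 0.125)  # well over the VM
```

With these placeholder numbers, 100k requests a month costs a few cents, while 500M requests costs an order of magnitude more than the flat-rate VM, which is the crossover effect described above.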
Once you’ve decided to embark on your Serverless journey, it can be difficult to know where to begin. All providers offer a browser-based console, where you can write code, connect services, and upload dependencies. This is great for experimenting with serverless, but can be cumbersome when it comes to managing a larger app.
Serverless Framework is an open-source project that aims to address some of the complexity around managing serverless apps. It takes a declarative approach, where you can declare multiple serverless functions and the communication between them using YAML files. Serverless Framework provides back-ends for all major cloud providers, so you can easily migrate functions between them.
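A Serverless Framework project is driven by a `serverless.yml` file along these lines. This is a hedged sketch: the plugin name, provider block, and runtime identifier are assumptions that vary by provider and framework version, and event wiring syntax differs between back-ends.

```yaml
service: my-api

plugins:
  - serverless-scaleway-functions   # provider back-end plugin (assumed name)

provider:
  name: scaleway
  runtime: python310                # runtime identifiers vary by provider

functions:
  get-product:
    handler: handler.handle         # module.function entry point
```

Migrating to another cloud is then largely a matter of swapping the plugin and provider block, while the function declarations stay the same.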
Finally, there is an emerging trend for higher-level Serverless frameworks, where the framework will transparently create Serverless resources from your code. These frameworks often adopt conventions from other frameworks, for example, the route-decorator (annotation) pattern often seen in Python web frameworks.
Once you’ve written and deployed your app, you will want to monitor it, and view logs and metrics. Unlike with other distributed systems, where this can be a job in itself, most serverless providers offer out-of-the-box observability products.
As we’ve mentioned, serverless is still a young technology, and the associated open-source ecosystem, development tools, and design patterns are still rapidly evolving. In the next 5 years or so, we can expect serverless computing to improve on many of its current weaknesses.
From the earliest days of the cloud until now, the same three principles have driven innovation: ease, scale, and cost. Today, these same three principles are embodied in the latest serverless technologies.
Serverless is currently only provided by major cloud providers… but could we imagine Serverless existing outside the big cloud platforms someday?