Is Serverless the Future of Computing?

The IT field wouldn’t be interesting if a new paradigm didn’t come along every few years or so and threaten to upset the proverbial apple cart. The latest of these is serverless computing – on-demand processing that, advocates claim, will revolutionize the way developers handle computing tasks.

The Promise

Serverless offers real advantages. It aims to spare developers the effort of managing support concerns like the server stack, load balancing, storage, and security. These advantages are being hyped as the beginning of the end for traditional programming. Using a serverless function uncouples your application from much of the baggage that weighs down development projects. “With serverless computing platforms we, software developers, can at last put aside all irrelevant technicalities and start delivering what we are paid for — business features,” writes Asher Sterkin of Hackernoon.

Developers can have their functions run on demand with nary a concern for resources. They also have the advantage of consuming only as much compute as they need when a trigger fires. So they don’t need to worry about things like always-on infrastructure or state, and can focus on developing great executable code that does what it’s supposed to do in a lean, highly effective way.
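To make this concrete, here’s a rough sketch of what such an on-demand function might look like, written against AWS Lambda’s Python handler convention. The S3 trigger, bucket, and event contents are purely illustrative, not drawn from any particular project:

```python
# A minimal, hypothetical Lambda handler in Python. The platform runs it
# only when its trigger fires (here, an S3 "object created" notification);
# nothing stays running between invocations and no state persists inside
# the function.

import json


def handler(event, context):
    # An S3 notification can batch several records into one event.
    for record in event.get("Records", []):
        bucket = record["s3"]["bucket"]["name"]
        key = record["s3"]["object"]["key"]
        print(f"New object: s3://{bucket}/{key}")

    # The return value matters only for synchronous triggers,
    # such as an HTTP call routed through API Gateway.
    return {"statusCode": 200, "body": json.dumps({"processed": True})}
```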

There’s also a philosophical appeal in abstracting functions away from computer architectures. In theory, serverless functions could let an entire app operate on an as-needed basis. With its various functions floating about in the ether, the application can run anytime and anywhere, jumping from server to server or lying dormant as needed.

Reality Check

The reality, of course, is that serverless technology comes with its own set of limitations and challenges. While it may simplify small tasks and provide effortless cloud connectivity, it can quickly turn into a logistical nightmare at scale. “I can imagine a situation where serverless functions are never deleted and new ones added because of the fear of breaking some part of an application,” wrote Steve Jones of SQLServerCentral in a recent editorial. He also pointed out the difficulty of controlling the costs of serverless functions, a problem all too familiar to organizations still feeling the pain from overruns on their cloud deployments.

In fact, the serverless model makes it nearly impossible to run software as a cohesive whole. Developers either have to streamline the functions they want performed, or they must build their application as a series of one-off functions. Each function also has its own associated trigger, which requires a start point, an end point, and the corresponding API and HTTP interactions to make it happen. This is, in effect, like wrapping each function in its own packaging, as opposed to having a single function manage multiple processes under the same workflow.
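As a rough illustration of that packaging overhead (the handler names and routes here are hypothetical), two steps that a traditional app would run as one in-process workflow each become their own deployable function behind their own HTTP endpoint:

```python
# Hypothetical sketch: two steps of a single checkout workflow, deployed as
# separate serverless functions. Each one needs its own trigger, e.g. an API
# Gateway route such as POST /cart/validate and POST /cart/charge, instead of
# being called in memory by one orchestrating function.

import json


def validate_cart(event, context):
    # Standalone function with its own HTTP trigger and its own packaging.
    cart = json.loads(event.get("body", "{}"))
    ok = bool(cart.get("items"))
    return {"statusCode": 200 if ok else 400, "body": json.dumps({"valid": ok})}


def charge_customer(event, context):
    # A separate function with a separate trigger; it cannot simply call
    # validate_cart() in memory, the request has to go back out over HTTP.
    order = json.loads(event.get("body", "{}"))
    return {"statusCode": 200, "body": json.dumps({"charged": order.get("total", 0)})}
```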

In this way, serverless “imposes too many constraints for a mid-to-large size applications/systems to be run on,” says computer science instructor Michal Boška. “… A huge drawback is that you can’t see the big picture. You only see your small function, but that small function is actually part of a larger world….”

Adding to these constraints are the somewhat limited capabilities of AWS Lambda and other serverless services. Currently, only Python, Node.js (JavaScript), and Java are supported, although it is possible to launch processes written in Ruby, Bash, or Go from within a supported runtime. Total execution time per invocation is capped at 300 seconds (5 minutes), and each function gets only 512 MB of temporary disk space. These limitations could change in the near future, but they are part of what serverless is for the time being.
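One common workaround for the short runtime list, sketched below under the assumption that you bundle your own statically linked binary with the deployment package, is to shell out to it from a thin Python handler. The tool name and arguments are made up for illustration:

```python
# Hedged sketch: invoking a bundled Go (or Ruby, or Bash) executable from a
# Python Lambda handler. The binary name and arguments are hypothetical.

import subprocess


def handler(event, context):
    # /tmp is the only writable location, and it is capped at 512 MB, so any
    # intermediate files the tool writes have to stay under that limit.
    result = subprocess.run(
        ["./bin/my-go-tool", "--input", "/tmp/payload.json"],
        capture_output=True,
        text=True,
        timeout=250,  # keep a margin under the 300-second execution cap
    )
    return {"exit_code": result.returncode, "stdout": result.stdout}
```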

The biggest barrier to implementing entire, fully functional apps via serverless, though, is support. Documentation is minimal — some would even say non-existent. “Error messages are often cryptic and the number of configuration options is large, making diagnosis time-consuming,” writes Jake Bennett. “Making matters worse, documentation and community support is still immature, so there isn’t a huge corpus of community knowledge to draw from.”

The Future

Of course, most applications as they exist now could never be translated into a 100% serverless model. That, and the inertia of the full-product mindset, will likely confine the technology to limited app functions or small services for things like IoT applications.

Many of the project examples you can find now are in the vein of Bun Alert: novelties or basic functions built to fulfill a highly specific need. But as the capabilities of serverless expand and the wheel of technology keeps turning, serverless may yet find its place in the computing world.
