
Going Serverless for the Simple Architecture.

In this article, I will focus on microservices, DevOps, and some aspects of certifications and career development, both as a cloud consumer and on the development side.

Hi folks, I am Vishwas N. In this series on Azure development I will focus more on the development side of the cloud, how the cloud infrastructure was developed and evolved, and I will also write up some questions that I feel should be answered to resolve the doubts developers have. So follow me and stay tuned for the posts I will be publishing here on dev.to, as I feel this is the best platform for you all to get the full "Azure Gyan" (#azuregyan on Twitter and LinkedIn).

Let's just see what happened in the stone age of servers and how cloud infrastructure came to be.

Before the "Cloud" it was difficult to access the physical server, difficult to analyze what is the size of the cloud infrastructure that I just buy and utilize, all these processes were done manually for the procurement of the applications.

Let's just say 15 years back (I am not that old, I am still 21-ish, but this is what I have gathered from talking to professionals who have worked in this field for years), developers faced a whole new level of complexity. Cloud infrastructure was not that mature, and virtual machines were not that sophisticated; they had to be configured manually. That is, every time you installed a new version of some software, or a new version of any compiler wrapper, onto the cloud, the system was "ghosted" (I was told it was set aside for maintenance). And whenever there was a "patch", developers would go and patch the systems, that is, apply a set of changes to a computer program or its supporting data designed to update, fix, or improve it (which can also be considered debugging). This whole process was driven mostly from a server and storage perspective.

When system architecture became more digitalized, the era of IaaS (Infrastructure as a Service) came into the picture. Virtual machines were now sturdy enough for fault tolerance and secure system access, so developers started to develop remotely, used this approach for upgrading the system architecture, and found many ways in which code could be debugged remotely over the internet.
But IaaS still had a flaw: the patching issue remained, along with questions like how to upgrade the operating system, how to use developer tools, and how to get access to the server if the key is lost (we are humans, we will lose information, so human limitations also have to be kept in mind).

For more background, please do refer to this paper on arXiv.

Seeing all this, developers were happy with what they had, but they still needed something more, so the whole process became less complex with just one layer: the PaaS (Platform as a Service) layer. (This ruled for quite a good amount of time, but there was some hidden complexity around using the infrastructure efficiently, and the deployment of apps was still fairly complex.)
The PaaS layer answered a lot of questions but left some unsolved:
What is the right size of the server?
How do I increase server utilization?
How do I scale my application?
How many servers do I need for scaling?

To answer all of these problem statements, microservices came into the picture, and this solution really brought tremendous improvement to the deployment and development of applications. Many techniques have since been developed, and much of the streaming software available today is free and open source. Thus the era of automation arrived, where developers wrote code to ease the work to be done and to utilize the cloud to its fullest potential.

(Keep in mind that deploying your machine learning models is different from the actual microservices infrastructure where the application is deployed; there you have to look at the complexity of the actual servers involved in handling the semi-structured data.)

Here I would love to ask you a question: is an Azure notebook just a simple web deployment, or an infrastructure (it could be microservices, a pub-sub deployment, or anything else)? It can be either; just tell me your thoughts in the comments section.


All these complexities led to a decrease in engineers' productivity, as they were left with less time to architect the application. Serverless became the option that gave developers the chance to architect something new: this solution leaves the developer with only one complexity to solve, namely "How do we architect a serverless application?" And in simpler terms, for developers who don't even want the complexity that comes with the cloud IT service side, the question becomes "How can I architect my application to become serverless?"

So let's jump into serverless architecture.

What is Serverless computing?

Serverless computing is a cloud computing execution model in which the cloud vendor allocates compute capacity on demand, taking care of the servers on behalf of its clients.
Some of its capabilities include:

  1. It lets developers focus on their code, without being distracted by server management or any other complexity while developing.
  2. Instant event-driven scalability: services scale based on the events arriving in real time, the compute complexity is not your concern, and resources are used only when needed.
  3. Pay per use: you are billed only for the function calls made, the code executed, and the memory used.

Why is Serverless computing beneficial?

  1. In this architecture you focus only on the business logic, not on the technology problems involved in running your application (the heavy lifting in the backend).
  2. Shorter time to market:
    • the cost model shifts from fixed to variable and is handled through the service's billing.
    • it also helps with service stability.
    • better development and testing management.
  3. This architecture is chosen mainly for flexibility: it is easy to pivot (in order to obtain a sustainable business model, software startups change direction relentlessly, or make a pivot in the Lean Startup approach; Ries defines a pivot as a strategic change designed to test a fundamental hypothesis about a product, business model, or engine of growth), easy to experiment (you can experiment with the required compute servers, compilers, and developer tools in real time), and you can scale as you desire.

All of the above factors are a natural fit for microservices, and you don't have to worry about them because they come embedded in every serverless architecture; what you need is to learn how to make your application's architecture serverless.

So you need an application to run, you need a programming style, and you need a codebase to make it run; this is where FaaS (Function as a Service) comes into the picture and demystifies the technology. This model uses functions to achieve truly serverless compute.
This is a new way of programming such applications: it can be Microsoft Azure Functions or a function offering from any other cloud provider. This technology was first made available by hook.io. It provides features like local testing and deployment, and it suits applications designed around event streams and databases.
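
To make the FaaS idea concrete, here is a minimal sketch of an HTTP-triggered Azure Function, assuming the Python v2 programming model of Azure Functions (the route name "hello" and the greeting are just illustrative):

```python
import azure.functions as func

# The function app is the only "infrastructure" the developer declares;
# the platform provisions, scales, and bills per execution.
app = func.FunctionApp(http_auth_level=func.AuthLevel.ANONYMOUS)

@app.route(route="hello")
def hello(req: func.HttpRequest) -> func.HttpResponse:
    # Read an optional query parameter and return a plain-text response.
    name = req.params.get("name", "world")
    return func.HttpResponse(f"Hello, {name}!", status_code=200)
```

You can test this locally with the Azure Functions Core Tools (`func start`), and the same code deploys unchanged to a consumption plan where you only pay per execution.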

Let's dig in with some more points about your functions (if you are able to work within the "FaaS" style of programming model):

  1. They have a single responsibility.
  2. They serve one single function or single purpose.
  3. They are a reusable piece of code that processes input and returns a single result or output.
  4. "Functions" don't run through our entire codebase; they are mainly used for one process (if necessary, or if called), and when that process is finished they terminate and free your resources for further execution. This means you need to plan for the functions in your architecture while you are programming.
  5. They are stateless: they don't hold a persistent state for long, nor do they depend on the state of any other process, which also helps in distributing work across different functions. (In Azure there are [durable functions](https://docs.microsoft.com/en-us/azure/azure-functions/durable/durable-functions-overview?tabs=csharp) available; do check them out if you need your function to persist state and then give the service the capability to call other functions in the program.)
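
To illustrate these properties, here is a minimal sketch of a single-purpose, stateless, queue-triggered function, again assuming the Azure Functions Python v2 programming model (the queue name "orders" and the connection setting are illustrative):

```python
import json
import logging

import azure.functions as func

app = func.FunctionApp()

# Single responsibility: the function wakes up only when a message lands on
# the "orders" queue, processes it, then terminates and frees its resources.
@app.queue_trigger(arg_name="msg", queue_name="orders",
                   connection="AzureWebJobsStorage")
def process_order(msg: func.QueueMessage) -> None:
    order = json.loads(msg.get_body().decode("utf-8"))
    # No state is kept between invocations; everything needed arrives in the message.
    logging.info("Processing order %s", order.get("id"))
```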

This kind of function is most useful when your system architecture has less complex nodes hitting the "load balancers" at a time. Load Balancer as a Service (LBaaS) refers to the distribution of client requests across various application servers in OpenStack environments, while cloud load balancers follow a model close to LBaaS. A cloud load balancing service helps you optimize server efficiency and stability.

OK, let me just give you a hint of what this kind of function is all about.

What is a Durable Function?

Durable Functions is an extension of Azure Functions that lets you write stateful functions in a serverless compute environment. The extension lets you define stateful workflows by writing orchestrator functions and stateful entities by writing entity functions using the Azure Functions programming model. Behind the scenes, the extension manages state, checkpoints, and restarts, allowing you to focus on your business logic and the architecture of the application. (A minimal sketch follows the list below.)

  1. They are event-driven.
  2. A function responds to a predefined event and is capable of scaling itself instantly based on the compute required; the application is invoked as requests come in. (You can also look up some of the protocols used for communication in cloud architectures; they may be a bit old, but they are still effective for understanding what the system architecture is all about.)
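
To make the Durable Functions idea concrete, here is a minimal sketch of an orchestrator with a stateless activity, assuming the azure-functions-durable Python v2 programming model (the function names and the city list are just illustrative):

```python
import azure.functions as func
import azure.durable_functions as df

app = df.DFApp(http_auth_level=func.AuthLevel.ANONYMOUS)

# The orchestrator holds the workflow state for us: the extension checkpoints
# progress after each yield and replays the function to resume where it left off.
@app.orchestration_trigger(context_name="context")
def greet_cities(context: df.DurableOrchestrationContext):
    results = []
    for city in ["Tokyo", "Seattle", "London"]:
        greeting = yield context.call_activity("say_hello", city)
        results.append(greeting)
    return results

# Activities stay stateless and single-purpose, as described earlier.
@app.activity_trigger(input_name="city")
def say_hello(city: str) -> str:
    return f"Hello, {city}!"
```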

So this is part one on serverless architecture; the next part of this article will be based on Azure Functions and the codebase, so stay tuned for that, folks. I will also be writing articles on Azure Kubernetes, Azure DevOps, and so on, so stay tuned; I will be coming up with more articles like this and will give you everything that the software development industry demands.
