Serverless apps in Kubernetes with Azure Functions

Azure Functions provides serverless computing as Functions-as-a-Service (FaaS): a platform for developing, running, and managing application functionality without the complexity of building and maintaining the infrastructure typically associated with developing and launching an app.

Azure Functions executes code in response to changes in data, incoming messages, a schedule, or an HTTP request.

Typically, you just deploy the function into an existing base container provided by Microsoft. But if you have specific needs, such as a specific language version, you can deploy your function app as a custom container into the Azure Functions service.

As an alternative to the Azure Functions service, you can deploy Azure Functions into your own Kubernetes cluster and run Functions alongside your other Kubernetes deployments.

With the Azure Functions service, you no longer need to manage disk capacity or memory. The compute requirements are handled automatically. You pay for what you use, when you use it, rather than for the fixed compute and memory sizes required by other Azure services.

In short, you can use a Docker container to deploy your function app to the Azure Functions service, or deploy your function app into your own Kubernetes cluster.

In this article, you learn about the key features of Azure Functions with containers.

Let’s get started.

Functions helps you process bulk data, integrate systems, work with IoT, and build simple APIs and microservices. At its core, code runs when triggered by specific events.

The following conceptual diagram shows the component parts of an Azure function.

Functions can be created either directly in the Azure portal or in Visual Studio, and they execute when specific events, called triggers, occur. A trigger causes a function to run.

Function triggers

A trigger defines how a function is invoked and a function must have exactly one trigger. Triggers have associated data, which is often provided as the payload of the function.

Here are some available triggers:

Trigger       Code executes on
HTTP          New HTTP request
Timer         Timer schedule
Blob          New blob added to an Azure Storage container
Queue         New message arrives in an Azure Storage queue
Service Bus   New message arrives on a Service Bus queue or topic
Event Hub     New message delivered to an Event Hub
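For example, here is a minimal sketch of an HTTP-triggered function using the Python v2 programming model; the route name and greeting logic are purely illustrative:

```python
import azure.functions as func

app = func.FunctionApp()

# HTTP trigger: the function runs on each new HTTP request to /api/hello.
@app.route(route="hello", auth_level=func.AuthLevel.ANONYMOUS)
def hello(req: func.HttpRequest) -> func.HttpResponse:
    name = req.params.get("name", "world")
    return func.HttpResponse(f"Hello, {name}!")
```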

The following illustration shows how an API call triggers an Azure Function that uploads data into a storage blob. When the upload completes, the function does one more piece of processing, adding the uploaded filename to a storage table, and then responds by returning a URL to the user.

Function bindings

A binding is an optional way of connecting a function to another resource.

Triggers and bindings let you avoid hardcoding access to other services. Your function receives data (for example, the content of a queue message) in function parameters. You send data (for example, to create a queue message) by using the return value of the function.

You can mix and match different bindings to suit your needs. Bindings are optional, and a function can have one or more input and/or output bindings.

Binding to a function is a way of declaratively connecting another resource to the function; bindings can act as input bindings, output bindings, or both. Data from bindings is provided to the function as parameters.

Input bindings carry data received by the function. Output bindings send data, either through the function's return value or through dedicated collector parameters.
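As a sketch of combining a trigger with an output binding, again using the Python v2 model (the queue name, connection setting name, and blob path are assumptions):

```python
import azure.functions as func

app = func.FunctionApp()

# Queue trigger (input) plus a blob output binding: the runtime passes the
# queue message in as a parameter, and whatever we set on `outblob` is
# written to the bound blob path.
@app.queue_trigger(arg_name="msg", queue_name="orders",
                   connection="AzureWebJobsStorage")
@app.blob_output(arg_name="outblob", path="processed/{rand-guid}.txt",
                 connection="AzureWebJobsStorage")
def process_order(msg: func.QueueMessage, outblob: func.Out[str]) -> None:
    outblob.set(msg.get_body().decode("utf-8"))
```

The `{rand-guid}` token is a binding expression that the runtime resolves to a new GUID on each invocation, so no blob paths are hardcoded in the function body.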

Key scenarios

A series of templates is available to get you started with these key scenarios.

For more information, see Azure Functions triggers and bindings concepts. Examples in the documentation show how to use functions with C#, Java, JavaScript, PowerShell, and Python.

Durable Functions

When you want to provide stateful functions in a serverless environment, use durable functions. Durable functions currently support:

  • C#
  • JavaScript
  • Python
  • F#
  • PowerShell

More languages are on the way.

The following illustration shows the conceptual pattern of durable Azure functions:

Define stateful workflows by writing orchestrator functions and stateful entities by writing entity functions using the Azure Functions programming model.

A flow with Azure Durable Functions consists of three types of Azure functions: Starter, Orchestrator, and Activity functions.

  • Starter function: A simple Azure Function that starts the orchestration by calling the orchestrator function. It uses an OrchestrationClient binding.
  • Orchestrator function: Defines a stateful workflow in code and invokes the activity functions. It sleeps during activity invocation and replays when it wakes up. The code in an orchestrator function MUST be deterministic, because the code is executed again and again during the flow until all activity functions finish. You declare a function as an orchestrator by using a DurableOrchestrationContext.
  • Activity functions: Simple Azure Functions that are part of the workflow and can receive or return data. An activity function uses an ActivityTrigger so that it can be invoked by the orchestrator.
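To make the roles concrete, here is a minimal sketch of a starter (client) function and an activity function using the Python durable functions library with the v2 programming model; the route and function names are placeholders:

```python
import azure.functions as func
import azure.durable_functions as df

app = df.DFApp(http_auth_level=func.AuthLevel.ANONYMOUS)

# Starter: an ordinary HTTP-triggered function with a durable client binding
# that kicks off the orchestration named in the route.
@app.route(route="orchestrators/{functionName}")
@app.durable_client_input(client_name="client")
async def http_start(req: func.HttpRequest, client) -> func.HttpResponse:
    instance_id = await client.start_new(req.route_params["functionName"])
    return client.create_check_status_response(req, instance_id)

# Activity: a plain function with an activity trigger so the orchestrator
# can invoke it and receive its return value.
@app.activity_trigger(input_name="name")
def say_hello(name: str) -> str:
    return f"Hello {name}!"
```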

There are some coding constraints in using durable functions. For example, orchestrator functions must be deterministic: an orchestrator function will be replayed multiple times, and it must produce the same result each time.

For more information, see What are Durable Functions? and Building serverless apps with Azure Functions.

Durable application patterns

The primary use case for Durable Functions is simplifying complex, stateful coordination requirements in serverless applications. Typical application patterns that can benefit from Durable Functions include function chaining, fan-out/fan-in, async HTTP APIs, monitoring, and human interaction.

Let’s take one of the examples, function chaining. The following diagram, provided by Microsoft, shows a sequence of functions executing in a specific order.

You can implement control flow by using normal imperative coding constructs. Code executes from the top down. The code can use existing language control-flow semantics, like conditionals and loops, and you can include error-handling logic in try/catch/finally blocks.
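Continuing the Python sketch above, a chaining orchestrator might look like this (F1, F2, and F3 are placeholder activity names):

```python
# Orchestrator: each `yield` checkpoints the workflow; on replay, completed
# activity results are served from history, so the code must be deterministic.
@app.orchestration_trigger(context_name="context")
def chaining(context: df.DurableOrchestrationContext):
    x = yield context.call_activity("F1", None)
    y = yield context.call_activity("F2", x)
    z = yield context.call_activity("F3", y)
    return z
```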

Logic Apps integration

Azure Functions integrates with Azure Logic Apps in the Logic Apps Designer. This integration lets you use the computing power of Functions in orchestrations with other Azure and third-party services.

For more information, see Create a function that integrates with Azure Logic Apps.

Deploy a container into Azure Functions

Azure Functions base images are provided for the supported languages, such as .NET, Java, Node.js, PowerShell, and Python.

If one of those images meets your needs, you do not need a custom container. But you may require a specific language version, or have a dependency or configuration that the built-in images don't provide. In that case, you need a custom container.

Deploying your function code in a custom Linux container requires a Premium plan or a Dedicated (App Service) plan for hosting.
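As a sketch, a custom container typically builds on the Microsoft-provided base image; the Python image tag below is an assumption, so pick the language and version you need:

```dockerfile
# Build a custom image from the Azure Functions Python base image.
FROM mcr.microsoft.com/azure-functions/python:4-python3.11

# Install the extra dependencies the built-in image lacks.
COPY requirements.txt /
RUN pip install -r /requirements.txt

# Copy the function app code to the location the Functions runtime expects.
COPY . /home/site/wwwroot
```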

For a tutorial, see Create a function on Linux using a custom container. The tutorial includes examples for C#, Java, JavaScript, PowerShell, Python, and TypeScript.

Function app scaling in Kubernetes

If your application exposes an HTTP interface, you can use the Kubernetes Event-driven Autoscaler (KEDA) to deploy your Azure Functions app into Kubernetes with autoscaling. Azure Functions is unique in that its runtime is open source. The runtime and your code can therefore be deployed to a custom container or deployed on your own infrastructure, including Kubernetes.

KEDA is a single-purpose, lightweight component that can be added to any Kubernetes cluster. KEDA works alongside standard Kubernetes components like the Horizontal Pod Autoscaler and extends functionality without overwriting or duplicating it. With KEDA, you explicitly map the apps you want to scale in an event-driven way, while other apps continue to function as before.

Scalers represent event sources that KEDA can scale based on:

  • ActiveMQ Artemis
  • Apache Kafka
  • AWS CloudWatch
  • AWS Kinesis Stream
  • AWS SQS Queue
  • Azure Blob Storage
  • Azure Event Hubs
  • Azure Storage Queue
  • CPU
  • Cron
  • External
  • External Push
  • Google Cloud Platform Pub/Sub
  • Huawei Cloudeye
  • IBM MQ
  • Liiklus Topic
  • Memory
  • Metrics API
  • MySQL
  • NATS Streaming
  • PostgreSQL
  • Prometheus
  • RabbitMQ Queue
  • Redis Lists
  • Redis Streams

See Scale a HTTP Triggered app up and down in Kubernetes using KEDA and Prometheus to learn how to scale a function app in Kubernetes using the Prometheus KEDA scaled object and an ingress controller.
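For example, a minimal KEDA ScaledObject that scales a function deployment based on Azure Storage queue length might look like this sketch (the deployment, queue, and connection names are placeholders):

```yaml
apiVersion: keda.sh/v1alpha1
kind: ScaledObject
metadata:
  name: orders-scaler
spec:
  scaleTargetRef:
    name: orders-function        # the Deployment running the Functions runtime
  minReplicaCount: 0             # scale to zero when the queue is empty
  maxReplicaCount: 10
  triggers:
    - type: azure-queue
      metadata:
        queueName: orders
        queueLength: "5"         # target messages per replica
        connectionFromEnv: AzureWebJobsStorage
```

The Azure Functions Core Tools can also generate the Kubernetes deployment and ScaledObject for you with the func kubernetes deploy command.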

References

Next Steps

Try: