Industry use cases of Azure Kubernetes Service

Naveen Pareek
9 min read · Oct 9, 2021

Before discussing industrial use cases of Azure Kubernetes Service (AKS), let’s first take a brief look at what the service is.

What is Azure Kubernetes Service?

Azure Kubernetes Service is a managed container orchestration service based on the open-source Kubernetes system, which is available on the Microsoft Azure public cloud. An organization can use AKS to handle critical functionality such as deploying, scaling and managing Docker containers and container-based applications.

AKS became generally available in June 2018 and is most frequently used by software developers and IT operations staff.

Kubernetes is the de-facto open-source platform for container orchestration but typically requires a lot of overhead in cluster management. AKS helps manage much of the overhead involved, reducing the complexity of deployment and management tasks. AKS is designed for organizations that want to build scalable applications with Docker and Kubernetes while using the Azure architecture.

An AKS cluster can be created using the Azure command-line interface (CLI), the Azure portal, or Azure PowerShell. Users can also opt for template-driven deployments with Azure Resource Manager (ARM) templates.
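
For teams that prefer scripting over the portal, the cluster can also be created programmatically. The following is a minimal sketch using the azure-mgmt-containerservice Python SDK; the subscription ID, resource group, cluster name, region, and VM size are placeholders you would replace with your own values.

```python
# pip install azure-identity azure-mgmt-containerservice
# Minimal sketch: create an AKS cluster with the Python management SDK.
# The subscription ID, resource group, cluster name, region, and VM size
# are placeholders -- replace them with your own values.
from azure.identity import DefaultAzureCredential
from azure.mgmt.containerservice import ContainerServiceClient
from azure.mgmt.containerservice.models import (
    ManagedCluster,
    ManagedClusterAgentPoolProfile,
    ManagedClusterIdentity,
)

client = ContainerServiceClient(DefaultAzureCredential(), "<subscription-id>")

cluster = ManagedCluster(
    location="eastus",
    dns_prefix="demo-aks",
    identity=ManagedClusterIdentity(type="SystemAssigned"),
    agent_pool_profiles=[
        ManagedClusterAgentPoolProfile(
            name="systempool",          # default (system) node pool
            mode="System",
            count=3,                    # three agent nodes to start with
            vm_size="Standard_DS2_v2",
        )
    ],
)

# begin_create_or_update returns a poller for the long-running operation;
# .result() blocks until the cluster has been provisioned.
result = client.managed_clusters.begin_create_or_update(
    "demo-rg", "demo-aks-cluster", cluster
).result()
print(result.provisioning_state)
```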

Azure Kubernetes Service Architecture

To illustrate how Kubernetes is typically implemented on Azure, Microsoft provides a reference architecture: a microservices application running in Azure Kubernetes Service (AKS). This reference architecture is a good starting point for most implementations.

The reference architecture is composed of:

  • Azure Kubernetes Service (AKS) — at the center of the architecture is AKS.
  • Kubernetes cluster — a cluster running your workloads, deployed on AKS. With AKS you only manage agent nodes; AKS assumes responsibility for the Kubernetes control plane.
  • Virtual network — AKS creates a virtual network in which agent nodes can be deployed. In advanced scenarios, you can create a virtual network first, to give you more control over the configuration of subnets, local connections, IP addresses, etc.
  • Ingress — the ingress provides an HTTP/HTTPS path to access cluster services. Behind it, you will typically deploy an API Gateway to manage authentication and authorization.
  • Azure Load Balancer — created when the NGINX ingress controller is implemented. Used to route incoming traffic to the ingress.
  • External data storage — microservices are usually stateless and save data to external data stores, such as relational databases like Azure SQL Database or NoSQL stores like Cosmos DB.
  • Azure Active Directory (AD) — AKS has its own Azure AD identity, used to create and manage Azure resources for Kubernetes deployments. In addition to this mechanism, Microsoft recommends using Azure AD to establish user authentication in client applications that use the Kubernetes cluster.
  • Azure Container Registry (ACR) — used to store your organization’s Docker images and deploy containers to the cluster from them. ACR can also use Azure AD for authentication. Another option is to store Docker images in a third-party registry, such as Docker Hub.
  • Azure Pipelines — part of the Azure DevOps service, and can help you automate the build/test/deployment cycle. Alternatively, you can use a third-party CI/CD solution like Jenkins.
  • Helm — the Kubernetes package manager. You can use it to combine Kubernetes objects into a package for easier distribution and versioning.
  • Azure Monitor — collects and stores logs from Azure services that interact with your Kubernetes cluster, including AKS controllers, nodes, and containers. You can use this data to monitor applications, configure alerts and dashboards, and analyze the root causes of errors.
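
To see how a few of these pieces fit together in practice, here is a hedged sketch that deploys a container image stored in ACR onto the cluster using the official Kubernetes Python client. The registry name, image tag, and app name are placeholders, and the kubeconfig is assumed to have been fetched beforehand (for example with az aks get-credentials).

```python
# pip install kubernetes
# Sketch: deploy a container image stored in ACR onto an AKS cluster with
# the official Kubernetes Python client. Registry, image, and app names
# are placeholders; credentials come from the local kubeconfig.
from kubernetes import client, config

config.load_kube_config()  # uses the current kubeconfig context
apps_v1 = client.AppsV1Api()

labels = {"app": "demo-api"}
deployment = client.V1Deployment(
    metadata=client.V1ObjectMeta(name="demo-api"),
    spec=client.V1DeploymentSpec(
        replicas=2,
        selector=client.V1LabelSelector(match_labels=labels),
        template=client.V1PodTemplateSpec(
            metadata=client.V1ObjectMeta(labels=labels),
            spec=client.V1PodSpec(
                containers=[
                    client.V1Container(
                        name="demo-api",
                        image="myregistry.azurecr.io/demo-api:v1",
                        ports=[client.V1ContainerPort(container_port=80)],
                    )
                ]
            ),
        ),
    ),
)

apps_v1.create_namespaced_deployment(namespace="default", body=deployment)
```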

To learn more about AKS architecture, check out Microsoft’s AKS reference architecture documentation.

Azure Kubernetes Service Features

Additional features such as advanced networking, monitoring, and Azure AD integration can also be configured. Let’s take a look at the features that Azure Kubernetes Service (AKS) offers:

Open-source environment with enterprise commitment

Over the past couple of years, Microsoft has brought on a significant number of engineers to make Kubernetes easier for businesses and developers to use and to participate in open-source projects. It has become the third-largest contributor to Kubernetes, helping make the platform more business-oriented, cloud-native, and accessible by bringing best practices and learnings from diverse customers and users back to the Kubernetes community.

Nodes and clusters

In AKS, apps and supporting services run on Kubernetes nodes, and an AKS cluster is made up of one or more nodes. These AKS nodes run on Azure Virtual Machines. Nodes with the same configuration are grouped together into a node pool, and nodes in the Kubernetes cluster are scaled up and down according to the resources required in the cluster. Nodes, clusters, and node pools are therefore the most prominent components of your Azure Kubernetes environment.
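
As an illustration of how node pools are managed in practice, the sketch below adds a user node pool with the cluster autoscaler enabled, using the azure-mgmt-containerservice Python SDK; the resource group, cluster name, pool name, and scaling bounds are placeholder values.

```python
# Sketch: add a user node pool with the cluster autoscaler enabled.
# Resource group, cluster name, pool name, and scaling bounds are placeholders.
from azure.identity import DefaultAzureCredential
from azure.mgmt.containerservice import ContainerServiceClient
from azure.mgmt.containerservice.models import AgentPool

client = ContainerServiceClient(DefaultAzureCredential(), "<subscription-id>")

user_pool = AgentPool(
    mode="User",                  # runs application workloads
    vm_size="Standard_DS2_v2",
    count=1,                      # initial node count
    enable_auto_scaling=True,     # let AKS scale the pool with demand
    min_count=1,
    max_count=5,
)

client.agent_pools.begin_create_or_update(
    "demo-rg", "demo-aks-cluster", "userpool", user_pool
).result()
```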

Role-based access control (RBAC)

AKS integrates easily with Azure Active Directory (AD) to provide role-based access control, security, and monitoring of your Kubernetes architecture on the basis of identity and group membership. You can also monitor the performance of your AKS clusters and apps.
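
As a rough sketch of what that integration looks like in code, the managed cluster resource exposes an Azure AD profile that can be set through the same Python SDK; the group object ID, resource group, and cluster name below are placeholders.

```python
# Sketch: enable managed Azure AD integration and Azure RBAC on an existing
# cluster. The admin group object ID and resource names are placeholders.
from azure.identity import DefaultAzureCredential
from azure.mgmt.containerservice import ContainerServiceClient
from azure.mgmt.containerservice.models import ManagedClusterAADProfile

client = ContainerServiceClient(DefaultAzureCredential(), "<subscription-id>")

managed_cluster = client.managed_clusters.get("demo-rg", "demo-aks-cluster")
managed_cluster.aad_profile = ManagedClusterAADProfile(
    managed=True,                 # AKS-managed Azure AD integration
    enable_azure_rbac=True,       # Azure RBAC for Kubernetes authorization
    admin_group_object_ids=["<aad-group-object-id>"],
)

client.managed_clusters.begin_create_or_update(
    "demo-rg", "demo-aks-cluster", managed_cluster
).result()
```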

Integration of development tools

Another important feature of AKS is that development tools such as Helm and Draft are seamlessly integrated with it, and Azure Dev Spaces can give developers a quicker, more iterative Kubernetes development experience. Containers can be run and debugged directly in the Azure Kubernetes environment with far less configuration effort.

AKS also supports the Docker image format and can integrate with Azure Container Registry (ACR) to provide private storage for Docker images. In addition, regular compliance with industry standards such as System and Organization Controls (SOC), the Payment Card Industry Data Security Standard (PCI DSS), the Health Insurance Portability and Accountability Act (HIPAA), and ISO makes AKS more reliable across various businesses.
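
As a small illustration of the ACR integration, the sketch below lists the repositories stored in a registry with the azure-containerregistry Python package; the registry endpoint is a placeholder.

```python
# pip install azure-containerregistry azure-identity
# Sketch: list the repositories stored in an Azure Container Registry.
# The registry endpoint is a placeholder.
from azure.containerregistry import ContainerRegistryClient
from azure.identity import DefaultAzureCredential

acr = ContainerRegistryClient(
    "https://myregistry.azurecr.io",
    DefaultAzureCredential(),
    audience="https://management.azure.com",
)

for repository in acr.list_repository_names():
    print(repository)
```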

Running any workload in Azure Kubernetes Service

You can orchestrate any type of workload in the AKS environment: move .NET apps to Windows Server containers, modernize Java apps in Linux containers, or run microservices in Azure Kubernetes Service. AKS will run any type of workload in the cluster environment.
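
To make that concrete, the snippet below is a hedged sketch of a pod spec that pins a Windows Server container (for example, a legacy .NET Framework app) to Windows nodes with a node selector, built with the same Kubernetes Python client objects as the earlier deployment example; the image name is a placeholder.

```python
# Sketch: schedule a Windows Server container (e.g. a .NET Framework app)
# onto Windows nodes in the cluster using a node selector. The image name
# is a placeholder.
from kubernetes import client

windows_pod_spec = client.V1PodSpec(
    node_selector={"kubernetes.io/os": "windows"},   # target Windows nodes
    containers=[
        client.V1Container(
            name="dotnet-legacy-app",
            image="myregistry.azurecr.io/dotnet-legacy-app:v1",
        )
    ],
)
# Linux microservices simply omit the selector (or use kubernetes.io/os: linux)
# and land on the default Linux node pools.
```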

Removes complexities

AKS removes the implementation, installation, maintenance, and security complexities of running Kubernetes on Azure. It also substantially reduces cost, since no per-cluster charges are imposed on you.

Azure Kubernetes Service Pricing

AKS is a free container service: nothing is charged for Kubernetes cluster management. You pay only for the cloud resources you consume, such as VMs, storage, and networking, which makes it one of the most cost-effective container orchestration services on the market. Microsoft Azure provides a Container Services pricing calculator to estimate the cost of the resources you consume or require.

To get started, all you need to do is create a free account, then deploy and manage your Kubernetes environment: building microservices apps, deploying Kubernetes clusters, and monitoring and managing the environment.

Conclusion

Businesses are moving from on-premises infrastructure to the cloud very quickly while building and managing modern, cloud-native applications. Kubernetes is an open-source solution that supports building and deploying cloud-native apps with complete orchestration. Azure Kubernetes Service is a robust and cost-effective container orchestration service that helps you deploy and manage containerized applications quickly, with additional resources assigned automatically and without the headache of managing additional servers.

AKS nodes are scaled out automatically as demand increases. It offers numerous benefits, such as security with role-based access control, easy integration with other development tools, and the ability to run any workload in the Kubernetes cluster environment. It also uses resources efficiently, removes complexity, scales out easily, and lets you migrate existing workloads into a containerized environment, with all containerized resources accessible via the AKS management portal or the AKS CLI.

Maersk Using Azure Kubernetes Service

A.P. Moller — Maersk is an integrated container logistics company and a member of the A.P. Moller Group. It connects and simplifies trade to help its customers grow and thrive, with a dedicated team of over 80,000 people operating in 130 countries, going all the way to enable global trade for a growing world.

As part of its digital transformation efforts, shipping giant A.P. Moller — Maersk needed to streamline IT operations and optimize the value of its IT resources. Maersk adopted Microsoft Azure, migrated key workloads to the cloud, and modernized its open-source software, which included the adoption of Kubernetes on Azure. Maersk software engineers now spend less time on container software management and more time on innovation and value-added projects. The resulting business value is savings on resource costs, faster solution delivery time, and the ability to attract expert IT talent.

Implementing a container strategy

As part of its overall cloud migration strategy, Maersk chose Azure Kubernetes Service (AKS) to handle the automation and management of its containerized applications. (A containerized application is portable runtime software that is packaged with the dependencies and configuration files it needs in order to run, all in one place.) AKS fully supports the dynamic application environment in Maersk without requiring orchestration expertise.

The company uses AKS to help set up, upgrade, and scale resources as needed, without taking its critical applications offline. “We want to focus on using containers as a way to package and run our code in the cloud, not focus on the software required to construct and run the containers,” says Rasmus Hald, head of cloud architecture at A.P. Moller — Maersk. “Using Kubernetes on Azure satisfies our objectives for efficient software development. It aligns well with our digital plans and our choice of open-source solutions for specific programming languages.”

Additionally, Maersk chose Azure over other cloud platforms because Azure offers a wider variety of available services and global scalability that supports the number and type of tasks the company wants to undertake. “The key question we ask is, ‘Where does the cloud stop and where does our work begin?’ For the Connected Vessel program, Azure made the most business sense, and it promotes agility,” says Hald. “Just the fact that we’re asking questions like this illustrates our paradigm shift to support digital transformation.”

Freeing talent to create

These examples illustrate how important it is to Maersk to deploy IT engineers more effectively. “We want engineers to be an active part of our new way of working, which means spending their time and effort where it makes the most business impact,” Hald says. “When topped with open source, Azure gives engineers freedom. Software developers have had enough of servers. They want to create. And we want them to.” Combining advanced technology with this mindset also helps Maersk better attract talented engineers who value innovation and the opportunity to positively affect the business.

For example, with increased time and talent, Maersk engineers were able to address customer requests by adding additional shipment monitoring capabilities to the company’s portfolio of solutions. Namely, they are in the process of building an Internet of Things (IoT) solution that will use AKS along with Azure IoT Hub to more closely monitor shipments and physical shipping containers (not to be confused with software containers governed by AKS), including conditions within the containers. “Running an IoT hub may not differentiate Maersk in the market but having a connected vessel — with a growing multitude of our services to support it — soon will,” says Hald.

This project demonstrates the company’s new agility, such as greatly reducing the time and bureaucracy required to get IT resources in place to start it. “Without Azure, it could be six months before we had the first server ready for developers on our Connected Vessel program,” says Hald. “With it, we completed concepting, development, testing, and deployment in six months.”

Hald summarizes the progress Maersk has made, saying, “Imagine solutions that harvest data that has never been available before — data that informs customers and ships’ crews about the condition of their cargo. Imagine data that powers a model that calculates IoT locations even when physical devices can’t be tracked. That’s the world we’ll soon live in, and Maersk will help make it possible.”

Thank You!

Keep Learning & Sharing…

If this article was useful to you, don’t forget to press the clap 👏 icon, and follow me on Medium for more such articles.

Leave a comment if you have any doubts, or connect with me on LinkedIn.
