Kubernetes | Its Use Cases | Future of Kubernetes

Abhayjit kumar
9 min read · Dec 27, 2020


What is Kubernetes?

The definition of Kubernetes keeps changing because, as it grows, Kubernetes changes the world around it. Here, then, is the Fall 2019 edition: Kubernetes is a workload distribution and orchestration mechanism for clustered servers in a data center, ensuring resource availability, accessibility, and balanced execution for multiple services running concurrently.

  • Containerization makes business software easier to manage. In the context of server-based computing, a container is a package that enables workloads to be virtualized (portable, self-contained, running in isolation) while still hosted by an operating system (as opposed to a hypervisor). Modern applications are made portable among servers by containerizing them, which is not just about packaging but also about deployment. In a containerized environment, software is retrieved or “pulled” from repositories (some public, others private), then immediately deployed and run in the production environment. This automated deployment method enables software to be improved not just every eighteen months or so, but potentially every day, and not just by its originators but by its users as well. In turn, this dramatically improves data center system integrity as well as security.
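
To make that pull-and-run workflow concrete, here is a minimal sketch using the Docker SDK for Python; the registry and image names are hypothetical, and a real pipeline would normally sit behind a CI/CD system rather than a standalone script.

```python
# Minimal sketch of the containerized "pull, then run" workflow using the
# Docker SDK for Python. The registry and image names are hypothetical.
import docker

docker_client = docker.from_env()  # connect to the local Docker daemon

# Pull the packaged application from a repository (public or private)...
docker_client.images.pull("registry.example.com/myapp", tag="1.0")

# ...then immediately deploy and run it as an isolated, self-contained
# workload hosted by the operating system (no hypervisor required).
container = docker_client.containers.run(
    "registry.example.com/myapp:1.0",
    detach=True,
    ports={"8080/tcp": 8080},
)
print(container.short_id, container.status)
```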

OK, so what specifically can Kubernetes do for us?

Here are five fundamental business capabilities that Kubernetes can drive in the enterprise, be it large or small. And to add teeth to these use cases, we have identified some real-world examples to validate the value that enterprises are getting from their Kubernetes deployments.

  1. Faster time to market
  2. IT cost optimization
  3. Improved scalability and availability
  4. Multi-cloud (and hybrid cloud) flexibility
  5. Effective migration to the cloud

1. Faster time to market

Kubernetes enables a “microservices” approach to building apps. You can break your development organization into smaller teams, each focused on a single, smaller microservice. These teams are more agile because each has a focused function. APIs between the microservices minimize the amount of cross-team communication required to build and deploy. So, ultimately, you can scale multiple small teams of specialized experts who together help support a fleet of thousands of machines.
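
As a minimal sketch of what “APIs between microservices” looks like on Kubernetes (using the official Kubernetes Python client; the service name, labels, and ports are hypothetical), one team can publish its microservice behind a stable Service name, and other teams simply call that name without coordinating deployments.

```python
# Minimal sketch: publishing one team's microservice behind a stable Service
# name so other teams depend only on the API endpoint, not on deploy timing.
# The service name, labels, and ports are hypothetical.
from kubernetes import client, config

config.load_kube_config()  # use the current kubeconfig context

service = client.V1Service(
    api_version="v1",
    kind="Service",
    metadata=client.V1ObjectMeta(name="payments-api"),
    spec=client.V1ServiceSpec(
        selector={"app": "payments"},  # routes to the payments team's pods
        ports=[client.V1ServicePort(port=80, target_port=8080)],
    ),
)

client.CoreV1Api().create_namespaced_service(namespace="default", body=service)

# Other microservices in the cluster can now call
# http://payments-api.default.svc.cluster.local, no matter how often the
# payments team redeploys the pods behind it.
```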

Real World Case Study

Airbnb’s transition from a monolithic to a microservices architecture is pretty amazing. They needed to scale continuous delivery horizontally, and the goal was to make continuous delivery available to the company’s 1,000 or so engineers so they could add new services. Airbnb adopted Kubernetes to support over 1,000 engineers concurrently configuring and deploying over 250 critical services to Kubernetes. The net result is that Airbnb can now do over 500 deploys per day on average.

Tinder: One of the best examples of accelerating time to market comes from Tinder. This blog post describes Tinder’s K8s journey well, and here’s the short version of the story: due to high traffic volume, Tinder’s engineering team faced challenges of scale and stability, and they realized that the answer to their struggle was Kubernetes. Tinder’s engineering team migrated 200 services and ran a Kubernetes cluster of 1,000 nodes, 15,000 pods, and 48,000 running containers. While the migration process wasn’t easy, the Kubernetes solution proved critical to keeping business operations running smoothly going forward.

2. IT cost optimization

Kubernetes can help your business cut infrastructure costs quite drastically if you’re operating at massive scale. Kubernetes makes a container-based architecture feasible by packing apps together optimally on your cloud and hardware investments. Before Kubernetes, administrators often over-provisioned their infrastructure to conservatively handle unexpected spikes, or simply because it was difficult and time-consuming to manually scale containerized applications. Kubernetes intelligently schedules and tightly packs containers, taking into account the available resources. It also automatically scales your application to meet business needs, thus freeing up human resources to focus on other productive tasks.
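
As a hedged illustration of how the scheduler gets the information it needs to pack containers tightly, the sketch below (official Kubernetes Python client; the image name and numbers are hypothetical) declares CPU and memory requests and limits on a container, which Kubernetes uses to bin-pack pods onto nodes and to keep one workload from starving another.

```python
# Minimal sketch: declaring resource requests and limits so the Kubernetes
# scheduler can bin-pack containers onto nodes. Image name and values are
# hypothetical.
from kubernetes import client

api_container = client.V1Container(
    name="api",
    image="example.com/api:1.0",
    resources=client.V1ResourceRequirements(
        # The scheduler uses *requests* to decide which node the pod fits on...
        requests={"cpu": "250m", "memory": "256Mi"},
        # ...and *limits* cap the container so it cannot starve its neighbours.
        limits={"cpu": "500m", "memory": "512Mi"},
    ),
)

pod = client.V1Pod(
    api_version="v1",
    kind="Pod",
    metadata=client.V1ObjectMeta(name="api", labels={"app": "api"}),
    spec=client.V1PodSpec(containers=[api_container]),
)
```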

Real World Case Study

Spotify is an early K8s adopter and has seen significant cost savings by adopting K8s, as described in this note. Leveraging K8s, Spotify has seen 2–3x better CPU utilization thanks to the orchestration capabilities of K8s, resulting in better IT spend optimization.

Pinterest is another early K8s customer. Leveraging K8s, the Pinterest IT team reclaimed over 80 percent of capacity during non-peak hours. They now use 30 percent fewer instance-hours per day compared to the static cluster.

3. Improved scalability and availability

As an orchestration system, Kubernetes is a critical management layer for “auto-magically” scaling apps and improving their performance. Suppose we have a CPU-intensive service whose user load changes dynamically based on business conditions. What we need here is a solution that can scale the app and its infrastructure up, so that new machines are automatically spawned as load increases, and scale it back down when the load subsides. Kubernetes offers exactly that capability, scaling up the application when CPU usage goes above a defined threshold (for example, 90 percent on the current machine). When the load reduces, Kubernetes can scale the application back, thus optimizing infrastructure utilization. Kubernetes auto-scaling is not limited to infrastructure metrics; resource utilization metrics and even custom metrics can be used to trigger the scaling process.
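
As a hedged sketch of the 90 percent CPU threshold described above (official Kubernetes Python client, autoscaling/v1 API; the target deployment name and replica bounds are hypothetical), a HorizontalPodAutoscaler attached to a deployment adds replicas when average CPU crosses the threshold and removes them again when load subsides.

```python
# Minimal sketch: a HorizontalPodAutoscaler (autoscaling/v1) that scales a
# Deployment out when average CPU exceeds 90% and back in when load subsides.
# The target deployment name and replica bounds are hypothetical.
from kubernetes import client, config

config.load_kube_config()

hpa = client.V1HorizontalPodAutoscaler(
    api_version="autoscaling/v1",
    kind="HorizontalPodAutoscaler",
    metadata=client.V1ObjectMeta(name="web-hpa"),
    spec=client.V1HorizontalPodAutoscalerSpec(
        scale_target_ref=client.V1CrossVersionObjectReference(
            api_version="apps/v1", kind="Deployment", name="web"),
        min_replicas=2,
        max_replicas=10,
        target_cpu_utilization_percentage=90,  # scale out above 90% average CPU
    ),
)

client.AutoscalingV1Api().create_namespaced_horizontal_pod_autoscaler(
    namespace="default", body=hpa)
```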

Real World Case Study

LendingTree: Here’s a great article from LendingTree. LendingTree has many microservices that make up its business apps. LendingTree uses Kubernetes and its horizontal scaling capability to deploy and run these services, and to ensure that its customers have access to service even during peak load. And to get visibility into these containerized and virtual services and monitor its Kubernetes deployment, LendingTree uses Sumo Logic.

4. Multi-cloud flexibility

One of the biggest benefits of Kubernetes and containers is that they help you realize the promise of hybrid and multi-cloud. Enterprises today are already running multi-cloud environments and will continue to do so in the future. Kubernetes makes it much easier to run any app on any public cloud service or any combination of public and private clouds. This allows you to put the right workloads on the right cloud and helps you avoid vendor lock-in. Getting the best fit, using the right features, and having the leverage to migrate when it makes sense all help you realize more ROI (short and longer term) from your IT investments.

Need more data to validate the multi-cloud and Kubernetes match-made-in-heaven story? This finding from the Sumo Logic Continuous Intelligence Report identifies a very interesting upward trend in K8s adoption based on the number of cloud platforms organizations use, with 86 percent of customers on all three clouds using managed or native Kubernetes solutions. Should AWS be worried? Probably not. But it may be an early sign of a level playing field for Azure and GCP, because apps deployed on K8s can be easily ported across environments.

Real World Case Study

Gannett/USA Today is a great example of a customer using Kubernetes to operate multi-cloud environments across AWS and Google Cloud Platform. At the beginning, Gannett was an AWS shop. Gannett moved to Kubernetes to support its growing scale of customers (they did 160 deployments per day during the 2016 presidential news season!), but as its business and scaling needs changed, Gannett used the fact that it was already deployed on Kubernetes in AWS to seamlessly run the same apps on GCP.

5. Seamless migration to the cloud

Since K8s runs consistently across all environments, on-premises and clouds like AWS, Azure and GCP, Kubernetes provides a more seamless and prescriptive path for porting your application from on-premises to cloud environments. Rather than deal with all the variations and complexities of the cloud environment, enterprises can follow a more prescribed path (a minimal sketch follows the list below):

  1. Migrate apps to Kubernetes on-premises. Here you are more focused on replatforming your apps to containers and bringing them under Kubernetes orchestration.
  2. Move to a cloud-based Kubernetes instance. You have many options here — run Kubernetes natively or choose a managed Kubernetes environment from the cloud vendor.
  3. Now that the application is in the cloud, you can start to optimize your application to the cloud environment and its services.
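
As a minimal sketch of why this path is so prescriptive (official Kubernetes Python client; the kubeconfig context names, image, and namespace are hypothetical), the same Deployment object can be applied first to the on-premises cluster and later to a managed cloud cluster just by switching kubeconfig contexts; the application definition itself does not change.

```python
# Minimal sketch: applying one and the same Deployment to an on-premises
# cluster and then to a managed cloud cluster by switching kubeconfig
# contexts. Context names, image, and namespace are hypothetical.
from kubernetes import client, config

def make_deployment(name: str, image: str, replicas: int) -> client.V1Deployment:
    """Build a Deployment object that is identical for every environment."""
    container = client.V1Container(
        name=name,
        image=image,
        ports=[client.V1ContainerPort(container_port=8080)],
    )
    template = client.V1PodTemplateSpec(
        metadata=client.V1ObjectMeta(labels={"app": name}),
        spec=client.V1PodSpec(containers=[container]),
    )
    spec = client.V1DeploymentSpec(
        replicas=replicas,
        selector=client.V1LabelSelector(match_labels={"app": name}),
        template=template,
    )
    return client.V1Deployment(
        api_version="apps/v1",
        kind="Deployment",
        metadata=client.V1ObjectMeta(name=name),
        spec=spec,
    )

deployment = make_deployment("web", "example.com/web:1.0", replicas=3)

# Step 1: run it on premises; step 2: run the very same object in the cloud.
# Only the kubeconfig context changes, not the application definition.
for context_name in ["onprem", "gke-managed"]:
    config.load_kube_config(context=context_name)
    client.AppsV1Api().create_namespaced_deployment(namespace="default", body=deployment)
```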

Real World Case Study

Shopify started as a data-center-based application and over the last few years has completely migrated all of its applications to Google Cloud Platform. Shopify first started running containers (Docker); the next natural step was to use K8s as a dynamic container management and orchestration system.

Pokémon GO’s Kubernetes story

How was Pokémon GO able to scale so efficiently and become so successful? A big part of the answer is Kubernetes. Pokémon GO was developed and published by Niantic Inc. and reached 500+ million downloads and 20+ million daily active users. Its engineers never expected the user base to grow exponentially and surpass expectations within such a short time; they were not ready for it, and the servers couldn’t handle that much traffic.

The challenge

Horizontal scaling was one side of it, but Pokémon GO also faced a severe challenge when it came to vertical scaling because of the real-time activity of millions of users worldwide. Niantic was not prepared for this.

The solution

The magic of containers. The application logic for the game ran on Google Container Engine (GKE), powered by the open source Kubernetes project. Niantic chose GKE for its ability to orchestrate its container cluster at planetary scale, freeing the team to focus on deploying live changes for players. In this way, Niantic used Google Cloud to turn Pokémon GO into a service for millions of players, continuously adapting and improving. This gave the team more time to concentrate on building the game’s application logic and new features rather than worrying about scaling.

Kubernetes is the future: But what does this future look like?

When industry influencers and CIOs talk about the future of computing, they typically aren’t only discussing hardware advancements or cloud-based software. Increasingly, these conversations center on transformation through application innovation: providing new predictive services to customers, driven by an integrated user experience. This could be something like inspecting customer data patterns to promote new banking services, analyzing health indicators to proactively recommend treatment, or offering an immersive interface for personalized interactions.

Gartner predicts that, by 2022, more than 75% of global organizations will be running containerized applications in production, which is a significant increase from fewer than 30% in 2019.

The future of IT is going to be about greater interactivity, seamless integrated experiences, predictive analytics, automation, decision making via machine learning, making sense of data exhaust, adding in augmented and virtual reality, and a host of other applications we cannot even imagine yet. These applications will run most effectively when they are offered the greatest flexibility and agility. Container-based, cloud-native apps orchestrated by Kubernetes offer those attributes and become the building blocks of modern IT infrastructure. The future of IT requires a platform that supports all of this, one that spans existing IT investments in data centers and clouds and embraces what is yet to come. This is why hybrid cloud approaches are coming into the picture.

Conclusion

Around the world, many CIOs and technologists have chosen to use Kubernetes, and it is expected to evolve much more in the years to come.

Containers are becoming more and more popular in the software world, and Kubernetes has become the industry standard for deploying containers in production. We can expect Kubernetes to keep growing rapidly in the future as well.
