Workloads
Your workloads are deployed in containers, which run in Pods in Kubernetes. A Pod includes one or more containers. Typically, one or more Pods that provide the same service are exposed through a Kubernetes service. Once you've deployed multiple Pods that provide the same service, you can:
- View information about the workloads running on each of your clusters using the AWS Management Console.
- Vertically scale Pods up or down with the Kubernetes Vertical Pod Autoscaler. See Adjust pod resources with Vertical Pod Autoscaler.
- Horizontally scale the number of Pods up or down to meet demand with the Kubernetes Horizontal Pod Autoscaler. See Scale pod deployments with Horizontal Pod Autoscaler.
- Create an external (for internet-accessible Pods) or an internal (for private Pods) network load balancer to balance network traffic across Pods. The load balancer routes traffic at Layer 4 of the OSI model.
- Create an Application Load Balancer to balance application traffic across Pods. See Route application and HTTP traffic with Application Load Balancers. The Application Load Balancer routes traffic at Layer 7 of the OSI model.
- If you're new to Kubernetes, see Deploy a sample application.
- You can restrict the IP addresses that can be assigned to a service with externalIPs.
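As a sketch of the Layer 4 load balancing option above: a Service of type LoadBalancer provisions a network load balancer for the Pods the Service selects. The manifest below is illustrative, not a definitive configuration; the annotations assume the AWS Load Balancer Controller is installed in the cluster, and the names and ports are hypothetical.

```yaml
# Hypothetical Service exposing Pods labeled app: my-app through a
# Network Load Balancer (routes traffic at Layer 4).
# Assumes the AWS Load Balancer Controller is installed.
apiVersion: v1
kind: Service
metadata:
  name: my-app
  annotations:
    # "internal" for private Pods; "internet-facing" for internet-accessible Pods
    service.beta.kubernetes.io/aws-load-balancer-scheme: internal
spec:
  type: LoadBalancer
  selector:
    app: my-app          # must match the labels on your Pods
  ports:
    - port: 80           # port the load balancer listens on
      targetPort: 8080   # port your containers listen on
```

Choosing internal rather than internet-facing keeps the load balancer reachable only from within your VPC.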
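To illustrate the externalIPs restriction mentioned above, a Service can list the specific external IP addresses on which it accepts traffic. This is a minimal sketch; the name, labels, and address are placeholders.

```yaml
# Hypothetical Service that accepts traffic only on the listed
# external IP address. The address shown is a documentation example.
apiVersion: v1
kind: Service
metadata:
  name: my-app
spec:
  selector:
    app: my-app
  ports:
    - port: 80
      targetPort: 8080
  externalIPs:
    - 203.0.113.10   # traffic arriving at this IP on port 80 is routed to the Pods
```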