Learn how to deploy workloads and add-ons to Amazon EKS
In Kubernetes, your workloads run in containers, which are deployed in Pods. A Pod includes one or more containers. Typically, one or more Pods that provide the same application are grouped behind a Kubernetes service. Once you’ve deployed multiple Pods that provide the same service, you can:
- View information about the workloads running on each of your clusters using the Amazon Web Services Management Console.
- Vertically scale Pods up or down with the Kubernetes Vertical Pod Autoscaler.
- Horizontally scale the number of Pods up or down to meet demand with the Kubernetes Horizontal Pod Autoscaler.
- Create an external (for internet-accessible Pods) or an internal (for private Pods) network load balancer to balance network traffic across Pods. The network load balancer routes traffic at Layer 4 of the OSI model.
- Create an Application Load Balancer to balance application traffic across Pods. The Application Load Balancer routes traffic at Layer 7 of the OSI model.
- If you’re new to Kubernetes, see Deploy a sample application to get started.
- You can restrict the IP addresses that can be assigned to a service with externalIPs.
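As a minimal sketch of the model described above, the following manifest runs three replicas of a web container in a Deployment and groups them behind a Service that balances traffic across the Pods. All names, labels, and the container image are illustrative placeholders, not values this guide prescribes.

```yaml
# Illustrative Deployment: three identical Pods, each running one container.
apiVersion: apps/v1
kind: Deployment
metadata:
  name: my-app          # placeholder name
spec:
  replicas: 3
  selector:
    matchLabels:
      app: my-app
  template:
    metadata:
      labels:
        app: my-app
    spec:
      containers:
        - name: web
          image: public.ecr.aws/nginx/nginx:latest  # placeholder image
          ports:
            - containerPort: 80
---
# Illustrative Service: routes traffic to any Pod matching the selector.
apiVersion: v1
kind: Service
metadata:
  name: my-app
spec:
  selector:
    app: my-app
  ports:
    - port: 80
      targetPort: 80
```

Because the Service selects Pods by label rather than by name, it keeps routing traffic correctly as Pods are added, removed, or replaced.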
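Horizontal scaling, mentioned above, is configured with a HorizontalPodAutoscaler object. The following is a sketch that scales a Deployment named my-app (a placeholder name) between 2 and 10 replicas based on average CPU utilization; it assumes a metrics source such as the Kubernetes Metrics Server is installed in the cluster.

```yaml
# Illustrative HorizontalPodAutoscaler targeting a Deployment.
apiVersion: autoscaling/v2
kind: HorizontalPodAutoscaler
metadata:
  name: my-app
spec:
  scaleTargetRef:
    apiVersion: apps/v1
    kind: Deployment
    name: my-app        # placeholder: the Deployment to scale
  minReplicas: 2
  maxReplicas: 10
  metrics:
    - type: Resource
      resource:
        name: cpu
        target:
          type: Utilization
          averageUtilization: 70   # add Pods when average CPU exceeds 70%
```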
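A Layer 4 load balancer, as described above, is requested by setting a Service's type to LoadBalancer. On Amazon EKS, annotations consumed by the AWS Load Balancer Controller influence how the load balancer is provisioned; the annotation and value shown here are an assumption for illustration, and the name and labels are placeholders.

```yaml
# Illustrative Service that requests an external (internet-facing)
# load balancer for the Pods matching its selector.
apiVersion: v1
kind: Service
metadata:
  name: my-app
  annotations:
    # Assumed AWS Load Balancer Controller annotation; use "internal"
    # instead of "internet-facing" for private Pods.
    service.beta.kubernetes.io/aws-load-balancer-scheme: internet-facing
spec:
  type: LoadBalancer
  selector:
    app: my-app
  ports:
    - port: 80
      targetPort: 80
```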
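The externalIPs setting mentioned above is a field in a Service spec that lists external IP addresses on which the service accepts traffic; cluster policy can restrict which addresses are allowed. A minimal sketch, using a placeholder address from the reserved documentation range:

```yaml
# Illustrative Service exposing itself on a specific external IP.
# The externalIPs list is what cluster policy can restrict.
apiVersion: v1
kind: Service
metadata:
  name: my-service     # placeholder name
spec:
  selector:
    app: my-app
  ports:
    - port: 80
  externalIPs:
    - 203.0.113.10     # documentation-range address, illustrative only
```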