
Thursday, November 30, 2023

AWS Serverless note

// EC2:

manage instances over time

patching instances

setting up scaling for instances

running them in a highly available manner




===========



// serverless

- cannot see or access the underlying infrastructure.



provisioning

scaling

high availability


all handled by AWS.




AWS LAMBDA

- serverless

- upload code to a Lambda function

- trigger via putObject => code runs in a managed environment



For example, a simple Lambda function might involve automatically resizing images uploaded to the AWS Cloud. In this case, the function triggers when a new image is uploaded. 
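A minimal sketch of that flow in Python with boto3, just to make the trigger concrete; the "resized/" prefix is hypothetical and the actual resize step is left as a placeholder copy:

# Minimal sketch: a Lambda handler fired by an S3 put event (Python runtime assumed).
import boto3

s3 = boto3.client("s3")

def handler(event, context):
    # The S3 trigger delivers the bucket name and object key in the event payload.
    for record in event["Records"]:
        bucket = record["s3"]["bucket"]["name"]
        key = record["s3"]["object"]["key"]

        # Fetch the newly uploaded image.
        body = s3.get_object(Bucket=bucket, Key=key)["Body"].read()

        # Placeholder for the real resize logic: write the object back under a
        # hypothetical "resized/" prefix just to show the event-driven flow
        # (in practice the trigger would filter out this prefix to avoid re-triggering).
        s3.put_object(Bucket=bucket, Key=f"resized/{key}", Body=body)

    return {"processed": len(event["Records"])}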





1,000 incoming triggers => Lambda will scale your function to meet demand



Lambda is designed to run code in under 15 minutes.


- not a good fit for deep learning.


- a good fit for quick processes, like a web backend handling requests or a backend expense report processing service, where each run takes less than 15 minutes to complete
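To make the 15-minute cap concrete, here is a rough boto3 sketch of creating a function with its timeout set to the maximum of 900 seconds; the function name, role ARN, and zip file are hypothetical:

# Rough sketch, not a full deployment script: create a Lambda function with boto3
# and set its timeout. 900 seconds (15 minutes) is the maximum Lambda allows.
import boto3

lambda_client = boto3.client("lambda")

with open("function.zip", "rb") as f:      # hypothetical deployment package
    zipped_code = f.read()

lambda_client.create_function(
    FunctionName="expense-report-processor",             # hypothetical name
    Runtime="python3.12",
    Role="arn:aws:iam::123456789012:role/lambda-role",   # hypothetical role ARN
    Handler="app.handler",
    Code={"ZipFile": zipped_code},
    Timeout=900,       # seconds; anything longer than this is not a Lambda use case
    MemorySize=256,
)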




goals:

- host short-running functions

- service-oriented applications

- event-driven applications

- no servers to provision or manage




==========


// container orchestration tools => Docker containers


- AMAZON ECS (Elastic Container Service) = an orchestration tool for managing containers without the hassle of running your own container orchestration software



- AMAZON EKS (Elastic Kubernetes Service) = similar to ECS, but with different tooling and features





Amazon EKS is a fully managed Kubernetes service. Kubernetes is open-source software that enables you to deploy and manage containerized applications at scale.





docker = uses OS-level virtualization to deliver software in containers




container = a package for your code // dependencies + configuration




container orchestration = managing multiple Docker containers




** ECS and EKS can run on top of EC2 

** or they can be deployed on AWS Fargate (a serverless compute platform)





goals:

run Docker container-based workloads on AWS



=========


// aws fargate :

serverless compute platform for deploying ECS / EKS workloads (a serverless environment)
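As a sketch of what "serverless containers" looks like in practice, here is a boto3 call that runs an existing ECS task definition on Fargate; the cluster name, task definition, subnet, and security group IDs are hypothetical:

# Sketch: run an ECS task on Fargate with boto3. launchType="FARGATE" means AWS
# provides the compute; launchType="EC2" would run it on your own instances instead.
import boto3

ecs = boto3.client("ecs")

ecs.run_task(
    cluster="demo-cluster",            # hypothetical cluster
    taskDefinition="web-app:1",        # hypothetical task definition
    launchType="FARGATE",
    count=1,
    networkConfiguration={
        "awsvpcConfiguration": {
            "subnets": ["subnet-0123456789abcdef0"],        # hypothetical subnet
            "securityGroups": ["sg-0123456789abcdef0"],     # hypothetical security group
            "assignPublicIp": "ENABLED",
        }
    },
)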



========



// container use case


Suppose that a company’s application developer has an environment on their computer that is different from the environment on the computers used by the IT operations staff. The developer wants to ensure that the application’s environment remains consistent regardless of deployment, so they use a containerized approach. This helps to reduce time spent debugging applications and diagnosing differences in computing environments.




// why an orchestration tool is needed


- 10 hosts with 100 containers



When running containerized applications, it’s important to consider scalability. Suppose that instead of a single host with multiple containers, you have to manage tens of hosts with hundreds of containers. Alternatively, you have to manage possibly hundreds of hosts with thousands of containers. At a large scale, imagine how much time it might take for you to monitor memory usage, security, logging, and so on.
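Part of what an orchestration service buys you at that scale is a single API over the whole fleet. As a rough boto3 sketch, assuming ECS, you can enumerate every running task in every cluster without logging into any host:

# Sketch: list every running task in every ECS cluster through one API,
# instead of inspecting tens or hundreds of hosts by hand.
import boto3

ecs = boto3.client("ecs")

for cluster_page in ecs.get_paginator("list_clusters").paginate():
    for cluster_arn in cluster_page["clusterArns"]:
        pages = ecs.get_paginator("list_tasks").paginate(
            cluster=cluster_arn, desiredStatus="RUNNING"
        )
        for task_page in pages:
            for task_arn in task_page["taskArns"]:
                print(cluster_arn, task_arn)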



=======



" just code and configuration "


=====
