AWS Compute

Lambda endpoint configuration

AWS Lambda is a serverless managed service: Lambda code runs without you managing servers or provisioning hardware. This post is not about Lambda features but about Lambda endpoint configuration, and I intend to keep it that way.

Endpoints are used to connect to AWS services programmatically. The connection uses AWS PrivateLink. If an organization does not want to expose its VPC to the public internet in order to communicate with AWS Lambda, endpoints come to the rescue. Endpoints are a regional service.

Endpoints come in two types –

  • Interface
  • Gateway

I am using an interface endpoint in my design, and I have created two Lambda functions for testing. When the endpoint is created and configured, we need to associate subnets with it. The endpoint creates a network interface in the environment and allocates an IP in every subnet it is associated with. This interface/IP is used for invoking Lambda; both Lambda functions will use the same interface IP for communication.

Architecture Diagram image-1

Note: This is just one use case for endpoints. The main use case is sharing a Lambda function, or any other service, as Software as a Service. Clients can share such a service across AWS accounts or even across organizations.

The design above could also be implemented by defining a VPC interface while creating each Lambda function. The drawback: if you have 10 Lambda functions to share in the VPC, you need 10 IPs from the subnet, which is overkill. With endpoints you need just one IP to invoke all Lambda functions for a given AWS account.
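For reference, the endpoint resource inside a CloudFormation template might look something like the sketch below. The logical names (VpcName, Subnet1, AppSecurityGroup) mirror the resource list later in this post; treat the exact property values as assumptions, not a copy of the template.

```yaml
LambdaVPCEndpoint:
  Type: AWS::EC2::VPCEndpoint
  Properties:
    VpcEndpointType: Interface          # interface endpoint, backed by an ENI
    ServiceName: !Sub "com.amazonaws.${AWS::Region}.lambda"
    VpcId: !Ref VpcName
    SubnetIds:
      - !Ref Subnet1                    # one ENI/IP per associated subnet
    SecurityGroupIds:
      - !Ref AppSecurityGroup
    PrivateDnsEnabled: true             # resolve the Lambda service name privately
```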

Source Code

Download the “lambda-endpoint-connection.yaml” file from the link below –


Create a user with access to run the CloudFormation template.

Install the AWS CLI and configure your AWS environment. I am using the “us-west-2” region. If you want to use a different region, select the appropriate AMI from the parameters. (My apologies, I have added AMIs for all regions as parameters.)

Open the “lambda-endpoint-connection.yaml” file in any text editor and change the following to your system IP. By default, SSH is allowed from all addresses.

Change the default value to your IP in /32 format – Image 2

Create and download an EC2 instance key pair, and update the key pair name in this field. Download the key file to the same location as the CloudFormation template.

Update the correct EC2 key pair name – Image 3

Add the correct VPC CIDR information; otherwise, the default will be used.


The CloudFormation template will create the following resources (not all resources are listed) –

  • VpcName : VPC for this test
  • Subnet1 : Subnet that is completely private except for SSH
  • AppInternetGateway : Internet gateway used only to connect my system to the EC2 instance
  • AppSecurityGroup : Allows port 22 from my system to EC2, and allows all communication within the VPC
  • EC2AccessLambdaRole : Role that allows the EC2 instance to invoke Lambda functions
  • LambdaRole : Role that allows the Lambda functions to create CloudWatch log groups, so print output can be checked in CloudWatch
  • RootInstanceProfile : Instance profile for the instance; uses EC2AccessLambdaRole for assumed permissions
  • EC2Instance : Instance used to invoke the Lambda functions
  • LambdaFunction : First Lambda function
  • SecoundLambdaFunction : Second Lambda function
  • LambdaVPCEndpoint : Lambda VPC endpoint

Run the following command to validate that the template is well-formed –

aws cloudformation validate-template --template-body file://lambda-endpoint-connection.yaml

Create the stack by executing the following command –

aws cloudformation create-stack --stack-name lambdaEndpoint --template-body file://lambda-endpoint-connection.yaml --capabilities CAPABILITY_IAM

Stack creation via CLI – Image 4

This creates the stack in the background and takes a couple of minutes. Check that your stack is created successfully in the Events section of CloudFormation.

Stack creation events – Image 5

Ensure Stack is created successfully.

Stack creation completed – Image 6

Stack outputs are saved as key-value pairs. Take note of the instance's public IP; we need it to SSH into the EC2 instance to check Lambda access. Also note the FirstLambdaFunction and SecondLambdaFunction values; we need them to invoke the Lambda functions.

Stack output – Image 7

Ensure the two Lambda functions were created successfully. Keep a note of both function names; we need them for invoking from our EC2 instance.

Lambda function – Image 8

The VPC endpoint configuration is created. EC2 instances connect to Lambda internally via a private DNS name, derived as <servicename>.<region_name>.

In our case the service name is “lambda”.

Endpoint configuration – Image 9

The endpoint assigns an IP address in every associated subnet. In our case we associated just one subnet, so a single IP address is assigned. The IP is part of the subnet the endpoint is associated with.

Endpoint Subnet association – Image 10

Assign a security group to the endpoint. If you need to stop certain EC2 instances from accessing the Lambda functions, this security group can be used. An IAM policy can also be used to restrict who may invoke the Lambda functions.

Endpoint security group configuration – Image 11

Policy definition: full access allows any user or service to access the Lambda functions. I highly recommend restricting access via the endpoint policy and security group.

Endpoint policy – Image 12
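As an illustration of tightening that policy, an endpoint policy that only allows invoking functions in one account might look like the sketch below (the account ID is a placeholder):

```json
{
  "Statement": [
    {
      "Effect": "Allow",
      "Principal": "*",
      "Action": "lambda:InvokeFunction",
      "Resource": "arn:aws:lambda:us-west-2:111122223333:function:*"
    }
  ]
}
```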

The endpoint creates a network interface in the VPC; the IP is assigned to this network interface.

Network interface – Image 13

The subnet IP count also shows that the available IP count is reduced for the /24 subnet.

VPC subnet configuration – Image 14

The route table has a route through the internet gateway to let my system connect via SSH.

Subnet route table configuration – Image 15

The security group only allows access to port 22 from anywhere, and all ports are open within the VPC for inbound and outbound traffic.

Security group inbound rule – Image 16
Security group outbound rule – Image 17

Log in to the newly created instance using the key pair created during the prerequisite phase –

SSH to system – image 18

Configure AWS with region “us-west-2”, or select any region you prefer.

Configure AWS using us-west-2 – Image 19

List the functions using “aws lambda list-functions”.

Lambda function – Image 20

To invoke a function, use the following commands. We don't have access to any external HTTPS connection, but we can still invoke the Lambda functions.

  • aws lambda invoke --function-name <first_function_name> first_response.json
  • aws lambda invoke --function-name <second_function_name> second_response.json

Output from the Lambda function is saved in JSON format.

Invoke lambda function – Image 21

Reading the output files: the “body” key matches the output from the Lambda function.

Output from lambda function – Image 22

CloudWatch events: if a payload is defined while invoking the function, it will be visible in the CloudWatch events as well.

Output from Cloud watch events – Image 23


To clean up, delete the stack with the following command –

  • aws cloudformation delete-stack --stack-name lambdaEndpoint


The Lambda endpoint is a new feature that connects to Lambda via AWS PrivateLink over the AWS internal network. Security is improved, since there is no need to open your VPC to external traffic for Lambda execution. It is a great way to use Lambda as a function-as-a-service, or to use Lambda across multiple AWS accounts or across an organization.

Enjoy !!! Keep Building !!!!

AWS Compute Database Featured Machine Learning Storage

Image Processing with Lambda/AWS API Gateway

We are clicking pics every day, and the image-datastore industry is spreading into our lifestyle. Massive amounts of images keep getting added every day. In this story, I would like to present a tool to search for images of a given object or celebrity, like Google Images. Don't get me wrong, this is nowhere near Google Images, which crawls web links; this story just concerns the same kind of object store.


Images will be copied and stored in an S3 bucket. I am using external tools to copy images; this can be anything like the S3 CLI or another AWS tool. An S3 state-change event triggers the Lambda function to perform image-recognition analysis. I perform two types of analysis: a general analysis of the environment/objects, and a celebrity analysis. Once the analysis is performed, the data is stored in DynamoDB, using the “keyname” from S3 as the primary key for the images. All labels generated by image recognition are stored as attributes on the newly created item.

The API gateway will be used to search for images containing any value or celebrity. It triggers the Lambda function, which generates a pre-signed URL for each image and delivers it to the client. The pre-signed URL expires in 10 minutes if the user does not download the images.

Download code 

Feel free to download code from my GitHub repo.


Parameters to be added in the parameter store –

  • myregion = Region name where the environment is set up. A multi-region setup needs configuration changes with a load balancer.
  • imagedb = DynamoDB table name
    • Create the DynamoDB table with a string primary key named “s3key”. The “s3key” attribute stores the image's S3 key name.
  • s3bucket = S3 bucket name
    • Create the S3 bucket named in the parameter store. Create an “/image” folder where all images are to be copied.


IAM Role

Create two IAM roles. The first IAM role is used with read-write access to DynamoDB, log streams, image recognition, and S3; the first function will be assigned this role. Policy information is below. I am using AWS managed policies for simplicity, but ensure you use an appropriate role with minimum access. Use the following AWS managed policies –

  • AmazonS3FullAccess
  • AmazonDynamoDBFullAccess
  • AmazonRekognitionFullAccess
  • AWSLambdaBasicExecutionRole
  • AmazonSSMReadOnlyAccess

The second role is used for the second Lambda function, which reads the DynamoDB database for the matching images and key names. The following AWS managed policies should be added to the role –

  • AmazonDynamoDBReadOnlyAccess
  • AmazonS3ReadOnlyAccess
  • AmazonSSMReadOnlyAccess
  • AWSLambdaBasicExecutionRole

Dynamodb Table

Create an empty DynamoDB table “imagerek” to store all the label information. The primary key for this table must be “s3key”; if the primary key is not named “s3key”, this solution will not work.

“s3key” holds the key name from the S3 image datastore.

Imagerek table

Lambda Function

The image function is triggered after images are uploaded to S3. The function performs two image-recognition operations. The first verifies and labels all objects discovered in the image; see the Python function definition “rek_labels”.

It adds, deletes, or updates labels in the imagerek database.

The second part of this function checks the images for any celebrities present; see the Python function definition “rek_celebrities”.

After gathering this information, the function adds it to the DynamoDB table specified in the parameter store. The primary key for each image is its “keyname” from the S3 bucket.
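The item-building step might look like the sketch below; `build_item` is a hypothetical helper, but the "Name"/"Confidence" keys follow the shape of Rekognition's detect_labels and recognize_celebrities responses.

```python
def build_item(s3_key, labels, celebrities):
    """Build a DynamoDB item keyed on the S3 object key ('s3key')."""
    item = {"s3key": s3_key}
    # Each detected label becomes an attribute on the item.
    for label in labels:
        item[label["Name"].lower()] = str(round(label["Confidence"], 2))
    # Any recognized celebrities are stored under a single attribute.
    if celebrities:
        item["celebrities"] = [c["Name"] for c in celebrities]
    return item

item = build_item("image/dog.jpg", [{"Name": "Dog", "Confidence": 98.7}], [])
```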

Lambda function

The second Lambda function is used to search for images based on input provided by the API gateway. Once inputs are received, the DynamoDB database is searched for the specific keywords.

Once the file key name is retrieved, the same function creates a “pre-signed” URL for each image and sends those links back to the API gateway as an HTML page.

The images' pre-signed URLs are sent back as an HTML page displayed via the API gateway. In a real-life scenario, images would be processed and presented by an application/web layer.

Images uploaded to S3

Use any technique to upload images to S3: the CLI, the boto SDK, the REST API, or any other custom application. Once images are uploaded, the Lambda function is triggered. Ensure you create the “image” folder in the S3 bucket and upload all images to that folder, and please ensure the Lambda functions are deployed before images are uploaded.

API Gateway

The idea of this design is centered more around solution design than application development, so I am using the API gateway to send inputs to the Lambda function. Currently the application does not support multiple inputs, but that can certainly be added. After receiving responses from the Lambda function, the API displays the images.

API gateway configuration

Search for API Gateway in the AWS console and click on Create API.
Choose the HTTP API type.
Integrate the Lambda function “search_image_funct” with the API gateway and select an appropriate API name.
Add a route for “/”.

The default stage will be used. For a better CI/CD process, try using the canary method for new version deployments.

We will use default stage.
Review API configuration

The selected URL will be used to search for images.

The search link is the API URL, then “?searchfor=”, then the thing to search for:

<API gateway url>/?searchfor=<things to search>
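Inside the search Lambda, pulling the term out of that URL might look like the sketch below (the DynamoDB lookup is elided; the HTTP API v2 event shape with "queryStringParameters" is the assumption here):

```python
import json

def lambda_handler(event, context):
    # HTTP APIs deliver query strings under "queryStringParameters".
    params = event.get("queryStringParameters") or {}
    term = params.get("searchfor", "").strip().lower()
    if not term:
        return {"statusCode": 400, "body": "missing ?searchfor= parameter"}
    # ... search DynamoDB for `term` and build pre-signed links here ...
    return {"statusCode": 200, "body": json.dumps({"searchfor": term})}

resp = lambda_handler({"queryStringParameters": {"searchfor": "Roses"}}, None)
```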

Search Examples

I am going to search for some of the images uploaded as test images.

Searching for Jeff Bezos

Searching for Sundar Pichai

Searching for water

Searching for roses

Searching for universe images

Architecture images

The images are used for educational purposes. If it is not appropriate to use any image, please post a comment and I will remove it.

AWS Compute Database Featured Management Security & Identity

Convert object-oriented data to NoSQL DynamoDB — 101

The IoT ecosystem is a buzzword and needs lots of data management. We receive data, but how we make use of that data is what matters most. This design is a very small portion of a bigger portfolio; many more applications can be integrated into it. There are many ways to perform this transformation: Athena and Glue can certainly be used here.

Design overview

Consider this design a bare-minimum requirement for converting object-oriented data into data used for analytics. I try to use managed services as much as possible in this design, but third-party tools could be used as well.

An application or IoT device will dump data into the S3 bucket. The data can have variable fields as long as the “name” field is common. S3 triggers the Lambda function upon completion of the put request. This Lambda function downloads the file from the S3 bucket into the temporary storage configured for Lambda. For historic trending of time-series data, the DynamoDB hash key can be a combination of “name” and “timestamp”.

Lambda converts the CSV file into JSON and adds each row as an item in the Amazon DynamoDB table. Upon success, Lambda sends a notification to an SNS topic. The SNS topic is configured with two types of subscriptions: “SMS” and “SQS”.
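The CSV-to-item conversion can be sketched with the standard library alone; rows missing the hash key are skipped, since DynamoDB rejects items without their partition key (the field names here are illustrative):

```python
import csv
import io

def csv_to_items(csv_text, hash_key="name"):
    """Convert CSV text into DynamoDB-style item dicts."""
    items = []
    for row in csv.DictReader(io.StringIO(csv_text)):
        if not row.get(hash_key):
            continue  # no partition key -> DynamoDB would reject the item
        items.append(dict(row))
    return items

sample = "name,state,visitors\nYosemite,CA,4000000\n,OR,123\n"
items = csv_to_items(sample)
```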

Failure events can be sent to another topic for retry.

Architecture Diag. (image 1)

Use Case:

This code can be used with a little tweaking to fetch a set of S3 data and perform analysis on it (instead of a put trigger, use copy or post events). Say some team would like to perform analytics on all data from last month: we can spin up this type of environment and provide the DynamoDB database for that specific analysis. Once the work is done, all the configuration can be torn down.

Infrastructure as code (IaC)

IaC is one of the most important application-deployment tools. It reduces errors and provides highly repeatable infrastructure, and it saves me from manually configuring parameters. All resource names are prefixed with the “appname” variable, so the same configuration can be used for different application environments or teams.

I chose Terraform to implement this so that a hybrid implementation is an option in case of customer requirements. Terraform supports all major cloud environments; obviously, we would need to change the resources appropriately.

Terraform provider information: I highly recommend setting up a profile while running the “terraform init” command, so that different environments are used with different access.
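A provider block along those lines might look like the sketch below; the profile name is hypothetical, and the region comes from a variable so “terraform.tfvars” can drive it:

```hcl
provider "aws" {
  region  = var.myregion       # set via terraform.tfvars
  profile = "terraform-dev"    # named CLI profile instead of hard-coded keys
}
```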


Avoid using “access_key” and “secret_key”. You can also create an EC2 instance with a proper IAM role for Terraform deployment.

The following resource configurations will be added to the environment for this implementation –

  • app_lambda_role

The Lambda function will use this role. Mainly, this role should include S3 read access; CloudWatch log group and stream write access; DynamoDB add/read/update item access; and SNS publish access.

  • app-lambda-cloud-watch-policy

Policy created with the above role access.

  • app-lambda-cloud-watch-policy-attachment

Attach the policy to the role.

  • allow_bucket

This permission will be used to trigger the Lambda function from S3.

  • app-lambda-func

The Lambda function that runs after being triggered by S3.

  • bucket_notification

S3 notification resource that triggers the Lambda function on the events specified in the notification. The “prefix” and “suffix” configuration can be used for different types of environments.

  • app-snstopic

SNS topic where the Lambda function sends notifications of successful events. P.S. I have not configured notification of failure events; create another topic for that and update the Lambda code accordingly.

  • app-sns-target

The SNS target connects “SQS” as a subscription to “app-snstopic”.

  • app-snstopic-sms

An SMS topic is created. We could club the topics together with just another SMS subscription, but I wanted different topics so we can send different kinds of data: to SQS we can send information about which rows failed and retry those rows, while the SMS topic carries concise information.

  • app-sms-target

The SMS target connects the SMS phone number, or list of phone numbers, to which events are sent.

  • app-sqs

Queue with the information. This can be used for notifications that were not successful; a Lambda function could be triggered to resolve those issues or try again. I have not added that functionality.

  • app-dynamodb-table

The table is created per the input schema. The hash key is important: all input data should include it. If the hash key is not present, the item will not be inserted into the DynamoDB NoSQL database. In my input, the “name” field is used as the hash key.

Source code

Download the source code from the GitHub link below –

Download the zip file and terraform.tfvars, and change the appropriate values in the “terraform.tfvars” file.


Download the zip file to the same location as the Terraform code. The Lambda function is created by the Terraform resource below.


Terraform apply command output

The following resources will be created using the “terraform apply -auto-approve” command. All 12 resources will be created.


Lambda function created.


Input file uploaded to s3.


Input file format.


Dynamodb table created by terraform.


The Lambda function triggers after the input file is uploaded.


Data inserted into the nps_parks table by the insertS3intoDynamodb Lambda function


SNS topic created


SQS queue

The message is posted into SQS queue


Next Step

  • Add an application to analyze data from DynamoDB and present visualization information.
  • Add realistic, changeable data, not the static data I used in my case study.


1. The code is available under the Apache license agreement.

2. Do not use this code in production; educational purposes only.

3. Security around the environment needs improvement.

4. The IAM policies must be tightened for production use.

5. I have not created a topic for failure events.

6. Failure domains are not considered in this design.

7. The Lambda function contains bare-minimum code and does not perform data validation.

AWS Compute

AWS EC2 instance management — Starting and stopping instances using tagging

Shut down and start instances during non-business hours to reduce operating cost.

Architecture Diagram

The AWS Lambda function is called by a CloudWatch rule at regular intervals, every hour. The Lambda function reviews the tags of each instance and verifies whether the instance needs a shutdown or startup depending on the local time. The Lambda code can be placed on S3 for safekeeping and imported from there.

A new log stream and group are created to keep track of shutdown and startup events. Amazon SNS topics can be used for event notification. A custom role is created to get EC2 instance information and write logs to CloudWatch.

Instance tagging determines whether an instance needs to shut down or start up.

Architecture Diag. Image 1

Tagging format

Key name: “shutdown”

Values format :

weekday:<hours to shut down>;weekend:<hours to shut down>;

weekday: Monday to Friday

weekend: Saturday and Sunday

This format can be changed, but a code change is needed for that.

P.S. The keywords include a colon (:) and a semicolon (;).
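Parsing that tag value in Python might look like the sketch below. I am assuming the hours are comma-separated, since the post does not pin down the separator:

```python
def parse_shutdown_tag(value):
    """Parse e.g. 'weekday:18,19;weekend:0;' into {'weekday': [18, 19], 'weekend': [0]}."""
    schedule = {}
    for part in value.split(";"):
        if not part.strip():
            continue  # the trailing semicolon leaves an empty chunk
        day_type, _, hours = part.partition(":")
        schedule[day_type.strip()] = [int(h) for h in hours.split(",") if h.strip()]
    return schedule
```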

Role Policy

Create a role policy with the following permissions.

image 2

Create a role and attach the newly created policy to it.

image 3

Lambda Function

Create a Lambda function with runtime Python 3.8. Select the newly created role for the Lambda function.

image 4

Download the code from my GitHub link below –

To add the function code, click on Actions and upload a zip file.

image 5

After uploading the zip file you will see the actual script for the instances. I am using the default handler in the script. If the handler information is changed, please make appropriate changes to either the function name or the file name.


Python function name: lambda_handler

image 6

I am using the “pytz” module to get correct timezone conversions, so that module's code needs to be uploaded as well.

For region-to-timezone mapping, I have created a dictionary. If your region is not in the list, feel free to add a new key with the region name and its timezone.

image 7

On our magical Earth, sunset and sunrise happen at the same moment in different regions, so I call a function to perform both activities simultaneously. “system_function” generates the lists of all instances that need to stop and start.

image 8

In case of a large instance count, I am using a paginator.

image 9

The paginator reduces the list by filtering on the tag value: if the “shutdown” tag is not set, the instance will not be part of the list. The same function is used for starting and stopping instances, so when shutting down I also filter for running instances, and when starting I filter for stopped instances.

image 10

Set a timeout value of more than 3 seconds. It normally takes me 10 to 15 seconds; this value can vary with the environment and the number of instances.

Please use the following handler or make any appropriate changes.

Lambda handler = lambda_function.lambda_handler

image 11

Lambda Permissions :

Lambda permissions for creating the log stream/log group and adding events to CloudWatch.

image 12

The Lambda function also has permission to get instance status, read instance tags, and start and stop instances.

image 13

Cloud watch Rule

Create a CloudWatch rule that triggers the Lambda function at the start of every hour, every day.

image 14
image 15
image 16
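For reference, a schedule expression that fires at the top of every hour would be:

```
cron(0 * * * ? *)
```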


Current time: Fri 4:18 pm PST

In the Oregon region, instance 1 will be shut down and instance 2 will be started.

image 17

In the North Virginia region, east 2 and east 3 should be shut down (+3 hr).

image 18

Manually run Lambda function

image 19

CloudWatch log events

image 20

Oregon instances started and stopped as expected

image 21

North Virginia instances stopped

image 22

Mumbai region instance stopped as well

image 23

Disclaimer :

An SNS topic is not triggered, but one can be added for any stopping and starting of instances.

A database could be used to track the number of hours saved by this EC2 start/stop script.

AWS Compute

AWS Lambda 101

Lambda function Introduction

The Lambda function is AWS's offering commonly known as Function as a Service. AWS Lambda helps run code without provisioning or managing the underlying servers. Many languages are supported by Lambda and the list keeps growing: as of July 2020, .NET, Go, Java, Node.js, Ruby, and my personal favorite, Python, are among the supported languages. Lambda is built with high availability in mind and can scale during bursts of requests.

We need to grant the Lambda function access per its use. Normally, access is granted via an IAM role.

IAM Policy

Create a policy that can create a log group and log stream. This is the basic execution access required for a Lambda function: without it, the function will not be able to generate logs. Logging can be used for custom trigger events or for tracking/debugging purposes.
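A policy along those lines might look like the sketch below (the log actions are the standard CloudWatch Logs ones; scope the Resource down for real use):

```json
{
  "Version": "2012-10-17",
  "Statement": [
    {
      "Effect": "Allow",
      "Action": [
        "logs:CreateLogGroup",
        "logs:CreateLogStream",
        "logs:PutLogEvents"
      ],
      "Resource": "arn:aws:logs:*:*:*"
    }
  ]
}
```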


IAM Role

A role is created to grant permission for specific tasks. If the function needs, say, S3 get access, add the appropriate policy to the role or create a custom policy. Always grant least privilege, per AWS security best practice.


Select newly created policy


Attach the appropriate policy (image-5). Adding a role description and tags is good IAM practice. Click on Create Role.


Lambda function

To create a Lambda function, go to Services and select Lambda.

Click on Create function.


We have three options to choose from. The simplest is “Author from scratch”. With this option we will create a “Hello world” function and verify that logging works as expected.

“Use a blueprint”: AWS has already created lots of useful functions we can use to get started, for example returning the current status of an AWS Batch job, or retrieving an object from S3.

“Browse serverless app repository”: this deploys a sample Lambda application from various application repositories. We can also use a private repository to pull code from.

Select an appropriate runtime environment, and select the role we created in image-5.


The designer shows how the Lambda function is triggered. It can be triggered by different events: SNS topics, SQS, or even CloudWatch Logs. There are many different ways to trigger a Lambda function.

If the Lambda function is used for batch-oriented work or scripting, you can use a CloudWatch rule with a crontab expression to trigger the function at scheduled intervals.

A destination can be an SQS queue, an SNS topic, or even a CloudWatch log stream. We can also read a file from S3 and upload it back to S3 after performing transformations within the Lambda function.


This Hello world function is very simple: if it is invoked from a webhook it returns status code 200 with the body “Hello from Lambda!”. It also writes logs into the log stream. “event” and “context” are used to get input values as well as HTTP context information about what invoked the Lambda function.
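A handler in that spirit might look like the sketch below; the exact body of the console's generated code may differ:

```python
import json

def lambda_handler(event, context):
    # Anything printed here ends up in the function's CloudWatch log stream.
    print("Received event:", json.dumps(event))
    return {"statusCode": 200, "body": json.dumps("Hello from Lambda!")}

resp = lambda_handler({}, None)
```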

An environment variable can be used to pass static parameters to the function, for example the bucket name when downloading a file from an S3 bucket, or the table name when writing data into a DynamoDB database.

For security reasons, do not add your “Access Key” or “Secret Key” values as environment variables; use an encrypted parameter store for this purpose.


The handler is the most important config parameter in the Lambda function. It has two parts separated by a period (.) (image-10): the first part is the file name, and the second part is the function definition that is invoked when the Lambda function runs (image-8a). I kept the handler value at its default, but I always recommend giving it a meaningful name.

Memory is the amount of memory dedicated to the Lambda function; it depends on the activity you are performing and can be changed.

The timeout value determines how long the function runs before timing out. If the activity you want to perform takes longer than the specified timeout, the Lambda function is abruptly stopped, so give yourself some buffer in the timeout value.


By default, the Lambda function does not need to be part of any VPC, but if the function needs to communicate with EC2 instances or an on-premise environment, or be invoked from an EC2 instance over private networking, we need the VPC configuration. You can still trigger the Lambda function using the AWS SDK with a VPC configuration.

It's very common to store output data generated by a Lambda function in Elastic File System (EFS), since the Lambda function does not have persistent storage: its ephemeral storage lasts only for the execution of the function, so EFS can be used to store all output.


The Permissions tab lets you verify the actions permitted for the Lambda function on a given resource. The dropdown can be used to select among multiple resources. Per the screenshot below, the Lambda function can create a log group and log stream, and put log messages into Amazon CloudWatch Logs.


We can trigger the Lambda function with different triggers. Here, I am creating a test event that will trigger the function: click on “Configure test events”. We are not sending any input values to the function while invoking it; key-value pairs can be used (image-14) to send values to the Lambda function.


Once the event is created, click “Test” to invoke the Lambda function.


The log output is shown below. Click on the “logs” URL to open the CloudWatch log group created by the Lambda function.


A Lambda event either creates a new log stream or appends to an existing one. Logs are put into the log stream (image-17); each log stream consists of many log events.


Deleting Lambda function

To delete the Lambda function, just select it, click on Actions, and then Delete.


The Lambda function is successfully deleted.


The CloudWatch log group and streams are not deleted by default. You can delete the logs from CloudWatch or export them to S3 for cheaper storage.

To delete the log group, go to CloudWatch, select Logs -> Log groups, and select the appropriate log group name in the format below. Click on Actions and delete the log group.




The Lambda function is the best way to run short jobs or scheduled scripts. It can work with a webhook API or an SNS/SQS environment. Have fun exploring the many uses of Lambda functions!