The IoT ecosystem is full of buzzwords, and it needs a lot of data management. We receive data easily, but making use of that data is what matters most. This design is a very small portion of a bigger portfolio, and many more applications can be integrated into it. There are many ways to perform this transformation; Athena and Glue could certainly be used here as well.
Consider this design a bare-minimum pipeline for converting object-based data into data usable for analytics. I try to use managed services as much as possible in this design, but third-party tools could be substituted.
An application or IoT device dumps data into the S3 bucket. The data can have variable fields as long as the common “name” field is present. S3 triggers the Lambda function when the PUT request completes. The Lambda function downloads the file from the S3 bucket into the temporary storage configured for the Lambda. For historic trending of time-series data, a composite DynamoDB key can be used, combining “name” and “timestamp”.
Lambda converts the CSV file into JSON and adds each row as an item in the Amazon DynamoDB table. On success, Lambda publishes a notification to the SNS topics, which are configured with two types of subscriptions: “SMS” and “SQS”.
Failure events can be sent to a separate topic for retry processing.
With a small tweak, this code can be used to fetch a set of existing S3 data and perform analysis on it (instead of the PUT trigger, use the COPY or POST event). Say a team wants to run analytics on all of last month’s data: we can stand up this type of environment, provide the DynamoDB database for that specific analysis, and tear down the whole configuration once the work is done.
Infrastructure as code (IaC)
IaC is one of the most important application deployment practices. It reduces errors and provides highly repeatable infrastructure, which spares me from configuring parameters manually. All resource names are prefixed with the “appname” variable, so the same configuration can be reused for different application environments or teams.
I chose Terraform to implement this so that a hybrid implementation remains an option should a customer require it. Terraform supports all major cloud environments; obviously, the resources would need to be changed appropriately.
Terraform provider information: I highly recommend setting up a named profile when running the “terraform init” command so that different environments are used with different access credentials.
Avoid using “access_key” and “secret_key” directly in the configuration. You can also create an EC2 instance with a proper IAM role for the Terraform deployment.
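Here is a minimal sketch of such a provider configuration; the profile name and region are placeholders, not values from this project:

```hcl
# Minimal provider sketch; "appprofile" and the region are placeholders.
# A named CLI profile keeps credentials out of the configuration entirely.
provider "aws" {
  region  = "us-east-1"
  profile = "appprofile"
}
```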
The following resource configurations will be added to the environment for this implementation –
The Lambda function will use this role internally. Mainly, the role should include S3 read access, CloudWatch log group and stream write access, DynamoDB add/read/update item access, and SNS publish access.
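A sketch of the execution role, assuming illustrative resource names (“appname” is the only variable the text defines):

```hcl
# Execution role that the Lambda service is allowed to assume.
resource "aws_iam_role" "app_lambda_role" {
  name = "${var.appname}-lambda-role"

  assume_role_policy = jsonencode({
    Version = "2012-10-17"
    Statement = [{
      Effect    = "Allow"
      Principal = { Service = "lambda.amazonaws.com" }
      Action    = "sts:AssumeRole"
    }]
  })
}
```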
A policy is created with the access described above.
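A sketch of that policy, granting only the four access types listed above. The referenced bucket, table, and topic resources (aws_s3_bucket.app_bucket and friends) are assumed names from the other sketches in this post:

```hcl
# Policy covering S3 read, CloudWatch Logs write, DynamoDB item
# add/read/update, and SNS publish -- nothing more.
resource "aws_iam_policy" "app_lambda_policy" {
  name = "${var.appname}-lambda-policy"

  policy = jsonencode({
    Version = "2012-10-17"
    Statement = [
      {
        Effect   = "Allow"
        Action   = ["s3:GetObject"]
        Resource = "${aws_s3_bucket.app_bucket.arn}/*"
      },
      {
        Effect   = "Allow"
        Action   = ["logs:CreateLogGroup", "logs:CreateLogStream", "logs:PutLogEvents"]
        Resource = "*"
      },
      {
        Effect   = "Allow"
        Action   = ["dynamodb:PutItem", "dynamodb:GetItem", "dynamodb:UpdateItem"]
        Resource = aws_dynamodb_table.app_table.arn
      },
      {
        Effect   = "Allow"
        Action   = ["sns:Publish"]
        Resource = [aws_sns_topic.app_snstopic.arn, aws_sns_topic.app_smstopic.arn]
      }
    ]
  })
}
```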
Attach the policy to the role.
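For example:

```hcl
# Bind the policy above to the execution role.
resource "aws_iam_role_policy_attachment" "app_lambda_attach" {
  role       = aws_iam_role.app_lambda_role.name
  policy_arn = aws_iam_policy.app_lambda_policy.arn
}
```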
This permission will be used to allow S3 to trigger the Lambda function; the function runs whenever the configured S3 event fires.
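A sketch of that permission; the function and bucket references are the assumed names used in the other sketches:

```hcl
# Grant the S3 service permission to invoke the function.
resource "aws_lambda_permission" "allow_s3" {
  statement_id  = "AllowExecutionFromS3"
  action        = "lambda:InvokeFunction"
  function_name = aws_lambda_function.app_lambda.function_name
  principal     = "s3.amazonaws.com"
  source_arn    = aws_s3_bucket.app_bucket.arn
}
```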
The S3 notification resource triggers the Lambda function on the events specified in the notification. The “prefix” and “suffix” configuration can be used to separate different types of environments.
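A sketch of the notification resource, assuming a bucket resource named aws_s3_bucket.app_bucket; the filter values are illustrative, and swapping the event to Copy or Post gives the batch-analysis variant described earlier:

```hcl
resource "aws_s3_bucket_notification" "app_bucket_notification" {
  bucket = aws_s3_bucket.app_bucket.id

  lambda_function {
    lambda_function_arn = aws_lambda_function.app_lambda.arn
    events              = ["s3:ObjectCreated:Put"]  # or :Copy / :Post
    filter_prefix       = "input/"                  # separate environments by prefix
    filter_suffix       = ".csv"
  }

  # The invoke permission must exist before the notification is created.
  depends_on = [aws_lambda_permission.allow_s3]
}
```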
The SNS topic to which the Lambda function sends notifications of successful events. Note that I have not configured notifications for failure events; create another topic for those and update the Lambda code accordingly.
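A sketch of the success topic, following the appname-prefix convention:

```hcl
resource "aws_sns_topic" "app_snstopic" {
  name = "${var.appname}-snstopic"
}
```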
The SNS target connects “SQS” as a subscription to “app-snstopic”.
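For example:

```hcl
# Subscribe the queue to the success topic; the endpoint is the queue ARN.
resource "aws_sns_topic_subscription" "sns_target" {
  topic_arn = aws_sns_topic.app_snstopic.arn
  protocol  = "sqs"
  endpoint  = aws_sqs_queue.app_queue.arn
}
```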
The SMS topic is created next. We could club the topics together with just another SMS subscription, but I wanted separate topics so that different kinds of data can be sent to each: via SQS we can send information about which rows failed and retry with that row information, while the SMS topic carries only concise information.
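A sketch of the separate SMS topic:

```hcl
resource "aws_sns_topic" "app_smstopic" {
  name = "${var.appname}-smstopic"
}
```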
The SMS target connects an SMS phone number, or a list of phone numbers, to which events are sent.
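A sketch of the subscription; the phone number is a placeholder in E.164 format and would normally come from a variable:

```hcl
resource "aws_sns_topic_subscription" "sms_target" {
  topic_arn = aws_sns_topic.app_smstopic.arn
  protocol  = "sms"
  endpoint  = "+15551234567"  # placeholder; pass a real number via a variable
}
```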
A queue holding the notification information. It can be used to capture events that were not successful; a Lambda function could then be triggered to resolve those issues or try again, although I have not added that functionality.
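A sketch of the queue, plus the queue policy SNS needs in order to deliver messages into it (the policy is implied by the design rather than stated explicitly):

```hcl
resource "aws_sqs_queue" "app_queue" {
  name = "${var.appname}-queue"
}

# Allow the success topic to send messages into the queue.
resource "aws_sqs_queue_policy" "app_queue_policy" {
  queue_url = aws_sqs_queue.app_queue.id

  policy = jsonencode({
    Version = "2012-10-17"
    Statement = [{
      Effect    = "Allow"
      Principal = { Service = "sns.amazonaws.com" }
      Action    = "sqs:SendMessage"
      Resource  = aws_sqs_queue.app_queue.arn
      Condition = { ArnEquals = { "aws:SourceArn" = aws_sns_topic.app_snstopic.arn } }
    }]
  })
}
```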
The table will be created as per the input schema. The hash key is important: all input data should include it, and if the hash key is not present, the item will not be inserted into the DynamoDB NoSQL database. In my input, the “name” field is used as the hash key.
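A sketch of the table; the name matches the example table shown later, and the billing mode is an assumption:

```hcl
resource "aws_dynamodb_table" "app_table" {
  name         = "nps_parks"        # example table from this walkthrough
  billing_mode = "PAY_PER_REQUEST"  # assumption; provisioned mode also works
  hash_key     = "name"

  attribute {
    name = "name"
    type = "S"
  }

  # For historic time-series trending, add range_key = "timestamp"
  # together with a matching attribute block.
}
```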
Download the source code from the GitHub link below –
Download the zip file, main.tf, and terraform.tfvars. Change the appropriate values in the “terraform.tfvars” file.
Place the zip file in the same location as the Terraform files. The Lambda function will be created by the Terraform resource below.
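A sketch of that resource; the zip file name, handler, runtime, and timeout are assumptions, and the environment variables are one hypothetical way to pass the table and topic to the code:

```hcl
resource "aws_lambda_function" "app_lambda" {
  function_name = "insertS3intoDynamodb"           # name used in this walkthrough
  filename      = "insertS3intoDynamodb.zip"       # the zip placed next to main.tf (name assumed)
  role          = aws_iam_role.app_lambda_role.arn
  handler       = "lambda_function.lambda_handler" # assumption
  runtime       = "python3.9"                      # assumption
  timeout       = 60

  environment {
    variables = {
      TABLE_NAME = aws_dynamodb_table.app_table.name
      TOPIC_ARN  = aws_sns_topic.app_snstopic.arn
    }
  }
}
```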
Terraform apply command output
The following resources will be created using the “terraform apply -auto-approve” command. All 12 resources will be created.
Lambda function created.
Input file uploaded to S3.
Input file format.
DynamoDB table created by Terraform.
Lambda function triggered after the input file is uploaded.
Data inserted into the nps_parks table by the insertS3intoDynamodb Lambda function.
SNS topic created.
The message posted to the SQS queue.
- Add an application to analyze data from DynamoDB and present visualizations of the information.
- Add realistic, changing data instead of the static data I used in this case study.
1. The code is available under the Apache license agreement.
2. Do not use this code in production; it is for educational purposes only.
3. Security around the environment needs to be improved.
4. The IAM policy must be tightened for production use.
5. I have not created a topic for failure events.
6. Failure domains are not considered in this design.
7. The Lambda function is created with bare-minimum code and performs no data validation.