In this blog post, I’ll walk you through a hands-on experience of creating an AWS Lambda function and setting it up with Amazon API Gateway using Terraform.
Getting started
The goal here is to replicate the AWS tutorial for creating a Lambda function triggered by API Gateway, but using Terraform for the infrastructure setup. The AWS tutorial guides you through creating a simple Lambda function that performs CRUD operations on a DynamoDB table and can be triggered via an HTTP request through API Gateway.
Packaging the Lambda function
I began by packaging the Lambda function using Terraform’s archive_file data source, which creates a zip file of the Lambda function code and stores it as function.zip.
data "archive_file" "lambda_zip" {
  type        = "zip"
  source_file = "LambdaFunctionOverHttps.py"
  output_path = "function.zip"
}
Setting up IAM role and policies
For the Lambda function to interact with other AWS services (DynamoDB for database operations), I defined an IAM policy granting the necessary permissions. This policy was then attached to an IAM role, which the Lambda function would assume.
resource "aws_iam_policy" "lambda_apigateway_policy" {
  name = "lambda-apigateway-policy"
  policy = jsonencode({
    "Version" : "2012-10-17",
    "Statement" : [
      {
        "Action" : [
          "dynamodb:DeleteItem",
          "dynamodb:GetItem",
          "dynamodb:PutItem",
          "dynamodb:Query",
          "dynamodb:Scan",
          "dynamodb:UpdateItem"
        ],
        "Effect" : "Allow",
        "Resource" : "*"
      },
      {
        "Action" : [
          "logs:CreateLogGroup",
          "logs:CreateLogStream",
          "logs:PutLogEvents"
        ],
        "Effect" : "Allow",
        "Resource" : "*"
      }
    ]
  })
}
resource "aws_iam_role" "lambda_execution_role" {
  name = "lambda-apigateway-role"
  assume_role_policy = jsonencode({
    Version = "2012-10-17",
    Statement = [
      {
        Action = "sts:AssumeRole",
        Effect = "Allow",
        Principal = {
          Service = "lambda.amazonaws.com",
        },
      },
    ],
  })
}
resource "aws_iam_role_policy_attachment" "lambda_policy_attach" {
  role       = aws_iam_role.lambda_execution_role.name
  policy_arn = aws_iam_policy.lambda_apigateway_policy.arn
}
Deploying the Lambda Function
With the IAM roles and policies in place, I moved on to deploying the Lambda function itself. I specified the runtime, handler, and the role the function should use, alongside the zipped source code.
resource "aws_lambda_function" "lambda_function_over_https" {
  function_name    = "LambdaFunctionOverHttps"
  handler          = "LambdaFunctionOverHttps.handler"
  filename         = data.archive_file.lambda_zip.output_path
  source_code_hash = data.archive_file.lambda_zip.output_base64sha256
  role             = aws_iam_role.lambda_execution_role.arn
  runtime          = "python3.9"
}
This step registers the Lambda function in AWS, ready to be invoked.
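As a point of reference, the handler value above means “the function named handler inside LambdaFunctionOverHttps.py”. A minimal sketch of such an entry point is shown below — the tutorial’s actual function does much more (this stub only echoes the parsed request body):

```python
import json

# Minimal sketch of the entry point the handler string
# "LambdaFunctionOverHttps.handler" refers to: a function named
# `handler` in LambdaFunctionOverHttps.py. The real tutorial function
# dispatches DynamoDB operations; this stub just echoes the request.
def handler(event, context):
    body = json.loads(event.get('body') or '{}')
    return {
        'statusCode': 200,
        'body': json.dumps({'received': body})
    }
```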
Integrating with API Gateway
To expose the Lambda function via HTTP, I set up an API Gateway. This involved creating a REST API, defining a resource (endpoint), and specifying the method (POST in this case) that the API Gateway would accept.
resource "aws_api_gateway_rest_api" "DynamoDBOperations" {
  name = "DynamoDBOperations"
}

resource "aws_api_gateway_resource" "DynamoDBManager" {
  rest_api_id = aws_api_gateway_rest_api.DynamoDBOperations.id
  parent_id   = aws_api_gateway_rest_api.DynamoDBOperations.root_resource_id
  path_part   = "DynamoDBManager"
}

resource "aws_api_gateway_method" "DynamoDBManagerPost" {
  rest_api_id   = aws_api_gateway_rest_api.DynamoDBOperations.id
  resource_id   = aws_api_gateway_resource.DynamoDBManager.id
  http_method   = "POST"
  authorization = "NONE"
}
I then integrated this API with the Lambda function using AWS_PROXY integration, allowing for seamless communication between the API Gateway and Lambda.
resource "aws_api_gateway_integration" "LambdaIntegration" {
  rest_api_id             = aws_api_gateway_rest_api.DynamoDBOperations.id
  resource_id             = aws_api_gateway_resource.DynamoDBManager.id
  http_method             = aws_api_gateway_method.DynamoDBManagerPost.http_method
  integration_http_method = "POST"
  type                    = "AWS_PROXY"
  uri                     = aws_lambda_function.lambda_function_over_https.invoke_arn
}
This setup enables the API Gateway to forward requests to the Lambda function and return responses to the caller.
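One consequence of AWS_PROXY (Lambda proxy) integration is a fixed response contract: API Gateway expects the function to return an integer statusCode and a string body, otherwise the caller sees a 502. A small helper can enforce that shape — the function name here is my own convenience, not part of the tutorial:

```python
import json

# With Lambda proxy integration, API Gateway expects the return value
# to contain an integer statusCode and a *string* body; returning a raw
# dict as the body fails. This hypothetical helper builds that shape.
def proxy_response(status_code, payload):
    return {
        'statusCode': status_code,
        'headers': {'Content-Type': 'application/json'},
        'body': json.dumps(payload),
    }
```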
DynamoDB and deployment
To store and manage data, I created a DynamoDB table with a simple schema. This table serves as the backend for the Lambda function’s database operations.
resource "aws_dynamodb_table" "lambda_apigateway" {
  name           = "lambda-apigateway"
  billing_mode   = "PROVISIONED"
  read_capacity  = 1
  write_capacity = 1
  hash_key       = "id"

  attribute {
    name = "id"
    type = "S"
  }
}
Moreover, I granted the Lambda function permission to be invoked by API Gateway. Finally, I defined a deployment for the API Gateway to make the changes live and accessible over the internet, including a stage name for the deployment environment.
resource "aws_lambda_permission" "allow_apigateway" {
  statement_id  = "AllowExecutionFromAPIGateway"
  action        = "lambda:InvokeFunction"
  function_name = aws_lambda_function.lambda_function_over_https.function_name
  principal     = "apigateway.amazonaws.com"
  source_arn    = "${aws_api_gateway_rest_api.DynamoDBOperations.execution_arn}/*/POST/DynamoDBManager"
}

resource "aws_api_gateway_deployment" "v1_deployment" {
  depends_on = [
    aws_api_gateway_integration.LambdaIntegration
  ]
  rest_api_id = aws_api_gateway_rest_api.DynamoDBOperations.id
  stage_name  = "v1"

  lifecycle {
    create_before_destroy = true
  }
}
Hiccups
- When I first tested the API Gateway endpoint, I encountered an Internal Server Error.
Response body:
{"message": "Internal server error"}
Response headers:
{
  "x-amzn-ErrorType": "InternalServerErrorException"
}
The code provided in the tutorial was not parsing the body from the event. I had to modify the Lambda function code to extract the body from the event and then read operation and payload from it.
Furthermore, I modified the CRUD operations slightly to return appropriate responses and handle errors gracefully.
# Attempt to parse the stringified 'body' from the event
try:
    body = json.loads(event.get('body', '{}'))
except json.JSONDecodeError:
    print("Body parsing failed")
    return {
        'statusCode': 400,
        'body': json.dumps({'error': "Could not decode the request body"})
    }

# Extract 'operation' and 'payload' from the parsed body
operation = body.get('operation')
payload = body.get('payload')
if not operation or not payload:
    return {
        'statusCode': 400,
        'body': json.dumps({'error': "Missing 'operation' or 'payload' keys"})
    }
- The second issue I faced was Python’s json module not recognizing the Decimal type returned by DynamoDB, so I had to convert these values before serializing them. The error I received:
{"error": "Object of type Decimal is not JSON serializable"}
To fix the issue, I converted Decimal values to a format that can be JSON-serialized, casting whole numbers to int and everything else to float.
from decimal import Decimal

def ddb_read(x):
    # 'dynamo' is the boto3 DynamoDB resource defined elsewhere in the function
    response = dynamo.get_item(**x)
    item = response.get('Item', {})
    # Convert all Decimal values to int or float so json.dumps can serialize them
    for key, value in item.items():
        if isinstance(value, Decimal):
            item[key] = int(value) if value % 1 == 0 else float(value)
    return {'item': item}
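A reusable alternative to converting values by hand is a custom JSON encoder: subclass json.JSONEncoder and override default, which json.dumps calls for any type it cannot serialize. The class name here is my own:

```python
import json
from decimal import Decimal

# json.dumps invokes default() for types it cannot serialize natively,
# so converting Decimal there handles every DynamoDB response uniformly.
class DecimalEncoder(json.JSONEncoder):
    def default(self, obj):
        if isinstance(obj, Decimal):
            return int(obj) if obj % 1 == 0 else float(obj)
        return super().default(obj)
```

Passing cls=DecimalEncoder to json.dumps then serializes DynamoDB items without the error above, e.g. json.dumps(item, cls=DecimalEncoder).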
Testing
- Create an item in the DynamoDB table
curl -X POST https://ez5lh0ve9c.execute-api.us-east-1.amazonaws.com/test/DynamoDBManager \
-d '{"operation": "create", "payload": {"Item": {"id": "a8hd8dfGH", "number": 187}}}'
{"message": "Item created successfully", "id": "a8hd8dfGH"}
- Update the item in the DynamoDB table
curl -X POST https://ez5lh0ve9c.execute-api.us-east-1.amazonaws.com/test/DynamoDBManager \
-d '{"operation": "update", "payload": {"Key": {"id": "a8hd8dfGH"}, "UpdateExpression": "set #num = :number", "ExpressionAttributeNames": {"#num": "number"}, "ExpressionAttributeValues": {":number": 200}}}'
{"message": "Item updated successfully"}
- Read the item from the DynamoDB table
curl -X POST https://ez5lh0ve9c.execute-api.us-east-1.amazonaws.com/test/DynamoDBManager \
-d '{"operation": "read", "payload": {"Key": {"id": "a8hd8dfGH"}}}'
{"Item": {"id": "a8hd8dfGH", "number": 200}}
- Delete the item from the DynamoDB table
curl -X POST https://ez5lh0ve9c.execute-api.us-east-1.amazonaws.com/test/DynamoDBManager \
-d '{"operation": "delete", "payload": {"Key": {"id": "a8hd8dfGH"}}}'
{"message": "Item deleted successfully"}
Conclusion
You can peruse the full Terraform code for this setup in the GitHub repository. This hands-on experience helped me understand the intricacies of setting up a Lambda function with API Gateway using Terraform. I encountered and resolved common issues faced during such setups, such as handling request payloads and JSON serialization. This exercise was a great learning experience, and I hope this blog post helps you in your Lambda and API Gateway journey.