March 18, 2024

Using Lambda with API Gateway Tutorial

In this blog post, I’ll walk you through a hands-on experience of creating an AWS Lambda function and setting it up with Amazon API Gateway using Terraform.

Getting started

The goal here is to replicate the AWS tutorial for creating a Lambda function triggered by API Gateway, but using Terraform for the infrastructure setup. The AWS tutorial guides you through creating a simple Lambda function that performs CRUD operations on a DynamoDB table and can be triggered via an HTTP request through API Gateway.

Package the Lambda function

I began by packaging the Lambda function using Terraform's archive_file data source. This data source creates a zip archive of the Lambda function code and exposes both the archive's path and a base64-encoded SHA-256 hash of its contents, which Terraform uses to detect code changes.

data "archive_file" "lambda_zip" {
  type        = "zip"
  source_file = "${path.module}/src/LambdaFunctionOverHttps.py" # adjust to your source path
  output_path = "${path.module}/build/lambda.zip"               # adjust as needed
}
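The output_base64sha256 attribute referenced later as source_code_hash is just the base64-encoded SHA-256 digest of the zip file. A small Python sketch of the equivalent computation (the function name is my own):

```python
import base64
import hashlib

def base64_sha256(path):
    """Return the base64-encoded SHA-256 digest of a file,
    matching Terraform's output_base64sha256 attribute."""
    with open(path, "rb") as f:
        digest = hashlib.sha256(f.read()).digest()
    return base64.b64encode(digest).decode("ascii")
```

Changing a single byte in the zip changes this hash, which is what triggers Terraform to update the deployed function code.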

Setting up IAM role and policies

For the Lambda function to interact with other AWS services (DynamoDB for database operations), I defined an IAM policy granting the necessary permissions. This policy was then attached to an IAM role, which the Lambda function would assume.

resource "aws_iam_policy" "lambda_apigateway_policy" {
  name = "lambda-apigateway-policy"
  policy = jsonencode({
    "Version" : "2012-10-17",
    "Statement" : [
      {
        # DynamoDB permissions for the CRUD operations (actions as in the AWS tutorial)
        "Action" : ["dynamodb:DeleteItem", "dynamodb:GetItem", "dynamodb:PutItem",
                    "dynamodb:Query", "dynamodb:Scan", "dynamodb:UpdateItem"],
        "Effect" : "Allow",
        "Resource" : "*"
      },
      {
        # CloudWatch Logs permissions so the function can write logs
        "Action" : ["logs:CreateLogGroup", "logs:CreateLogStream", "logs:PutLogEvents"],
        "Effect" : "Allow",
        "Resource" : "*"
      }
    ]
  })
}

resource "aws_iam_role" "lambda_execution_role" {
  name = "lambda-apigateway-role"
  assume_role_policy = jsonencode({
    Version = "2012-10-17",
    Statement = [
      {
        Action = "sts:AssumeRole",
        Effect = "Allow",
        Principal = {
          Service = "lambda.amazonaws.com"
        }
      }
    ]
  })
}

resource "aws_iam_role_policy_attachment" "lambda_policy_attach" {
  role       = aws_iam_role.lambda_execution_role.name
  policy_arn = aws_iam_policy.lambda_apigateway_policy.arn
}

Deploying the Lambda Function

With the IAM roles and policies in place, I moved on to deploying the Lambda function itself. I specified the runtime, handler, and the role the function should use, alongside the zipped source code.

resource "aws_lambda_function" "lambda_function_over_https" {
  function_name    = "LambdaFunctionOverHttps"
  handler          = "LambdaFunctionOverHttps.handler"
  filename         = data.archive_file.lambda_zip.output_path
  source_code_hash = data.archive_file.lambda_zip.output_base64sha256
  role             = aws_iam_role.lambda_execution_role.arn
  runtime          = "python3.9"
}

This step registers the Lambda function in AWS, ready to be invoked.

Integrating with API Gateway

To expose the Lambda function via HTTP, I set up an API Gateway. This involved creating a REST API, defining a resource (endpoint), and specifying the method (POST in this case) that the API Gateway would accept.

resource "aws_api_gateway_rest_api" "DynamoDBOperations" {
  name = "DynamoDBOperations"
}

resource "aws_api_gateway_resource" "DynamoDBManager" {
  rest_api_id = aws_api_gateway_rest_api.DynamoDBOperations.id
  parent_id   = aws_api_gateway_rest_api.DynamoDBOperations.root_resource_id
  path_part   = "DynamoDBManager"
}

resource "aws_api_gateway_method" "DynamoDBManagerPost" {
  rest_api_id   = aws_api_gateway_rest_api.DynamoDBOperations.id
  resource_id   = aws_api_gateway_resource.DynamoDBManager.id
  http_method   = "POST"
  authorization = "NONE"
}

I then integrated this API with the Lambda function using the AWS_PROXY integration type, which forwards the entire HTTP request to the function and returns the function's response to the caller unchanged.

resource "aws_api_gateway_integration" "LambdaIntegration" {
  rest_api_id = aws_api_gateway_rest_api.DynamoDBOperations.id
  resource_id = aws_api_gateway_resource.DynamoDBManager.id
  http_method = aws_api_gateway_method.DynamoDBManagerPost.http_method

  integration_http_method = "POST"
  type                    = "AWS_PROXY"
  uri                     = aws_lambda_function.lambda_function_over_https.invoke_arn
}

This setup enables the API Gateway to forward requests to the Lambda function and return responses to the caller.
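With AWS_PROXY, the function receives the whole HTTP request as a structured event, and notably the request body arrives as a JSON string, not a parsed object. A minimal sketch of unwrapping such an event (the event shown is a simplified subset of the real proxy format, and the helper name is my own):

```python
import json

def extract_operation(event):
    """Unwrap an API Gateway AWS_PROXY event: the HTTP body is a
    JSON *string* under 'body' and must be parsed explicitly."""
    body = json.loads(event.get("body") or "{}")
    return body.get("operation"), body.get("payload")

# Simplified stand-in for what API Gateway sends the function:
sample_event = {
    "httpMethod": "POST",
    "path": "/DynamoDBManager",
    "body": json.dumps({"operation": "read", "payload": {"Key": {"id": "a8hd8dfGH"}}}),
}

operation, payload = extract_operation(sample_event)
# operation == "read", payload == {"Key": {"id": "a8hd8dfGH"}}
```

Forgetting this extra layer of parsing is exactly what caused the first issue I describe below.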

DynamoDB and deployment

To store and manage data, I created a DynamoDB table with a simple schema. This table serves as the backend for the Lambda function’s database operations.

resource "aws_dynamodb_table" "lambda_apigateway" {
  name           = "lambda-apigateway"
  billing_mode   = "PROVISIONED"
  read_capacity  = 1
  write_capacity = 1
  hash_key       = "id"

  attribute {
    name = "id"
    type = "S"
  }
}

Next, I granted the Lambda function permission to be invoked by API Gateway. Finally, I defined a deployment for the API Gateway to make the changes live and accessible over the internet, specifying a stage name for the deployment environment.

resource "aws_lambda_permission" "allow_apigateway" {
  statement_id  = "AllowExecutionFromAPIGateway"
  action        = "lambda:InvokeFunction"
  function_name = aws_lambda_function.lambda_function_over_https.function_name
  principal     = "apigateway.amazonaws.com"
  source_arn    = "${aws_api_gateway_rest_api.DynamoDBOperations.execution_arn}/*/POST/DynamoDBManager"
}

resource "aws_api_gateway_deployment" "v1_deployment" {
  depends_on = [
    aws_api_gateway_integration.LambdaIntegration
  ]

  rest_api_id = aws_api_gateway_rest_api.DynamoDBOperations.id
  stage_name  = "v1"
  lifecycle {
    create_before_destroy = true
  }
}

Troubleshooting

  1. When I first tested the API Gateway endpoint, I encountered an Internal Server Error.
Response body

{"message": "Internal server error"}

Response headers

  "x-amzn-ErrorType": "InternalServerErrorException"

The code provided in the tutorial was not parsing the request body from the event. I had to modify the Lambda function code to extract the body from the event and then read the operation and payload from it. I also adjusted the CRUD operations slightly to return appropriate responses and handle errors gracefully.

import json

def handler(event, context):
    # Attempt to parse the stringified 'body' from the event
    try:
        body = json.loads(event.get('body', '{}'))
    except json.JSONDecodeError:
        print("Body parsing failed")
        return {
            'statusCode': 400,
            'body': json.dumps({'error': "Could not decode the request body"})
        }

    # Extract 'operation' and 'payload' from the parsed body
    operation = body.get('operation')
    payload = body.get('payload')
    if not operation or not payload:
        return {
            'statusCode': 400,
            'body': json.dumps({'error': "Missing 'operation' or 'payload' keys"})
        }
  2. The second issue I faced was that Python's json module does not recognize the Decimal type returned by DynamoDB, so I had to create a custom JSON encoder to handle it. The error I received:

{"error": "Object of type Decimal is not JSON serializable"}

To fix the issue, I needed to convert Decimal types to a format that can be JSON-serialized, such as converting numeric values to int or float.

from decimal import Decimal

import boto3

# Table name as created in the Terraform configuration above
dynamo = boto3.resource('dynamodb').Table('lambda-apigateway')

def ddb_read(x):
    response = dynamo.get_item(**x)
    item = response.get('Item', {})
    # Convert all Decimal values to int or float
    for key, value in item.items():
        if isinstance(value, Decimal):
            item[key] = int(value) if value % 1 == 0 else float(value)
    return {'Item': item}
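An alternative to converting fields one by one, and one way to implement the custom JSON encoder mentioned above, is to subclass json.JSONEncoder (a sketch; the class name is my own):

```python
import json
from decimal import Decimal

class DecimalEncoder(json.JSONEncoder):
    """Serialize DynamoDB's Decimal values as int or float."""
    def default(self, o):
        if isinstance(o, Decimal):
            # Whole numbers become ints, everything else floats
            return int(o) if o % 1 == 0 else float(o)
        return super().default(o)

item = {"id": "a8hd8dfGH", "number": Decimal("200")}
encoded = json.dumps(item, cls=DecimalEncoder)
# encoded == '{"id": "a8hd8dfGH", "number": 200}'
```

The advantage of the encoder is that it handles Decimal values at any nesting depth, without having to walk each item by hand.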


Testing the endpoint

In the commands below, $API_URL stands for the invoke URL of the v1 stage, i.e. https://<api-id>.execute-api.<region>.amazonaws.com/v1/DynamoDBManager.

  1. Create an item in the DynamoDB table
curl -X POST \
  -d '{"operation": "create", "payload": {"Item": {"id": "a8hd8dfGH", "number": 187}}}' \
  "$API_URL"
{"message": "Item created successfully", "id": "a8hd8dfGH"}
  2. Update the item in the DynamoDB table
curl -X POST \
  -d '{"operation": "update", "payload": {"Key": {"id": "a8hd8dfGH"}, "UpdateExpression": "set #num = :number", "ExpressionAttributeNames": {"#num": "number"}, "ExpressionAttributeValues": {":number": 200}}}' \
  "$API_URL"
{"message": "Item updated successfully"}
  3. Read the item from the DynamoDB table
curl -X POST \
  -d '{"operation": "read", "payload": {"Key": {"id": "a8hd8dfGH"}}}' \
  "$API_URL"
{"Item": {"id": "a8hd8dfGH", "number": 200}}
  4. Delete the item from the DynamoDB table
curl -X POST \
  -d '{"operation": "delete", "payload": {"Key": {"id": "a8hd8dfGH"}}}' \
  "$API_URL"
{"message": "Item deleted successfully"}


You can peruse the full Terraform code for this setup in the GitHub repository. This hands-on experience helped me understand the intricacies of setting up a Lambda function with API Gateway using Terraform. I encountered and resolved common issues faced during such setups, such as handling request payloads and JSON serialization. This exercise was a great learning experience, and I hope this blog post helps you in your Lambda and API Gateway journey.
