Writing Lightweight APIs with Python, AWS API Gateway, Lambda, and Terraform
by Gary Worthington, More Than Monkeys

When you’re building products at pace, you don’t always need a full Django or FastAPI backend. Sometimes you just need an endpoint that responds quickly, scales elastically, and costs pennies until it grows.
This is where AWS API Gateway + Lambda shines. Add Terraform to the mix, and you’ve got a reproducible, lightweight API platform that can be stood up in minutes.
Why Lightweight APIs?
Traditional backends give you control and flexibility, but they also come with:
- Servers to patch and scale
- Framework overheads
- Higher base costs
That overhead is hard to justify for use cases like:
- Mobile app backends
- Event-driven features
- Prototypes or MVPs
- Simple JSON APIs
For workloads like these, especially in the startup world, it’s often more effective to build a thin API surface on AWS’ serverless stack.
The Architecture
- AWS API Gateway: Handles HTTP requests, routing, and authentication.
- AWS Lambda (Python): Executes your logic on demand.
- Terraform: Defines everything as code so the stack is reproducible.
- LocalStack: Lets you run AWS services locally for development and testing.
Requests flow like this:
[Client] → [API Gateway] → [Lambda Function] → [Python Code] → [Response]
With Lambda, there are no servers to run or pay for while idle; you pay only for the time your function is actually executing.
Example: Hello API
Python Lambda (lambda_function.py)
Note: the module is named lambda_function.py rather than lambda.py, because lambda is a reserved word in Python and a module with that name can’t be imported in tests.
import json
from typing import Any, Dict


def handler(event: Dict[str, Any], context: Any) -> Dict[str, Any]:
    """
    Simple Lambda handler for API Gateway.

    Args:
        event: The event payload from API Gateway.
        context: Lambda runtime context.

    Returns:
        A JSON-serialisable dictionary containing the API response.
    """
    # queryStringParameters is absent (or None) when the request has no query string.
    params = event.get("queryStringParameters") or {}
    name = params.get("name", "World")
    return {
        "statusCode": 200,
        "headers": {"Content-Type": "application/json"},
        "body": json.dumps({"message": f"Hello, {name}!"}),
    }
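Because the handler is plain Python, you can sanity-check it before any infrastructure exists. A minimal sketch (the quick_check.py file name is just illustrative):
# quick_check.py - invoke the handler locally; no AWS needed.
from lambda_function import handler

if __name__ == "__main__":
    fake_event = {"queryStringParameters": {"name": "Gary"}}
    print(handler(fake_event, None))
    # The printed dict's "body" field is '{"message": "Hello, Gary!"}'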
Terraform, resource by resource
1) Environment toggle and provider
variable "environment" {
  description = "Where to deploy: local or aws"
  type        = string
  default     = "local"
}

provider "aws" {
  region = "eu-west-1"

  access_key = var.environment == "local" ? "test" : null
  secret_key = var.environment == "local" ? "test" : null

  skip_credentials_validation = var.environment == "local"
  skip_metadata_api_check     = var.environment == "local"
  skip_requesting_account_id  = var.environment == "local"

  endpoints = var.environment == "local" ? {
    lambda       = "http://localhost:4566"
    apigatewayv2 = "http://localhost:4566"
    iam          = "http://localhost:4566"
    logs         = "http://localhost:4566"
  } : {}
}
This block lets you switch between LocalStack and AWS with -var="environment=local|aws".
Terraform will talk to LocalStack on localhost:4566 when local, or to real AWS when deployed.
2) IAM role the Lambda assumes
resource "aws_iam_role" "lambda_exec" {
  name = "lambda-exec-role"

  assume_role_policy = jsonencode({
    Version = "2012-10-17",
    Statement = [{
      Action    = "sts:AssumeRole",
      Effect    = "Allow",
      Principal = { Service = "lambda.amazonaws.com" }
    }]
  })
}
This block creates the execution role that the Lambda function assumes. Without this, Lambda has no identity or permissions.
3) Basic logging permissions for the role
resource "aws_iam_role_policy_attachment" "lambda_basic_logs" {
  role       = aws_iam_role.lambda_exec.name
  policy_arn = "arn:aws:iam::aws:policy/service-role/AWSLambdaBasicExecutionRole"
}
This block attaches AWS’s managed policy that allows Lambda to write logs to CloudWatch. This is essential for debugging and general observability.
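In practice, that means anything the function writes through Python’s standard logging module (or plain print) lands in the function’s CloudWatch log group. A minimal sketch of how you might wire that up in the handler module (illustrative, not part of the example above):
import logging
from typing import Any, Dict

logger = logging.getLogger()
logger.setLevel(logging.INFO)


def handler(event: Dict[str, Any], context: Any) -> Dict[str, Any]:
    # This line ends up in the function's CloudWatch log group,
    # courtesy of the AWSLambdaBasicExecutionRole policy attached above.
    logger.info("Received request for route: %s", event.get("routeKey"))
    return {"statusCode": 200, "headers": {"Content-Type": "application/json"}, "body": "{}"}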
4) Package the Lambda from lambda_function.py
data "archive_file" "lambda_zip" {
  type        = "zip"
  source_file = "${path.module}/lambda_function.py"
  output_path = "${path.module}/build/hello.zip"
}
This block uses the archive provider’s archive_file data source to build a zip of lambda_function.py, which Terraform uploads to Lambda automatically.
5) The Lambda function
resource "aws_lambda_function" "hello" {
  function_name    = "hello-api"
  handler          = "lambda_function.handler"
  runtime          = "python3.12"
  role             = aws_iam_role.lambda_exec.arn
  filename         = data.archive_file.lambda_zip.output_path
  source_code_hash = data.archive_file.lambda_zip.output_base64sha256
}
This block defines the Lambda itself. handler points to the handler function inside lambda_function.py.
source_code_hash ensures redeployments only happen when the code changes.
6) The HTTP API (API Gateway v2)
resource "aws_apigatewayv2_api" "http_api" {
  name          = "hello-http-api"
  protocol_type = "HTTP"
}
This block creates a lightweight HTTP API (API Gateway v2). This is the public entry point your clients will hit.
7) Integration between API Gateway and Lambda
resource "aws_apigatewayv2_integration" "lambda" {
  api_id                 = aws_apigatewayv2_api.http_api.id
  integration_type       = "AWS_PROXY"
  integration_uri        = aws_lambda_function.hello.invoke_arn
  payload_format_version = "2.0"
}
This block connects the API Gateway to your Lambda using proxy integration. Requests are passed through directly.
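To make “passed through directly” concrete, here is a trimmed sketch of the payload format 2.0 event your handler receives. Values are illustrative and most fields are omitted:
# Abbreviated API Gateway HTTP API event (payload format 2.0).
# Values are illustrative; headers, requestContext details, etc. are omitted.
event = {
    "version": "2.0",
    "routeKey": "GET /hello",
    "rawPath": "/dev/hello",
    "rawQueryString": "name=Gary",
    "queryStringParameters": {"name": "Gary"},
    "requestContext": {"http": {"method": "GET", "path": "/dev/hello"}},
}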
8) Route that maps an HTTP method and path
resource "aws_apigatewayv2_route" "hello_route" {
  api_id    = aws_apigatewayv2_api.http_api.id
  route_key = "GET /hello"
  target    = "integrations/${aws_apigatewayv2_integration.lambda.id}"
}
This block defines a GET /hello route and points it to your Lambda integration.
9) Stage to deploy the API
resource "aws_apigatewayv2_stage" "dev" {
  api_id      = aws_apigatewayv2_api.http_api.id
  name        = "dev"
  auto_deploy = true
}
This block deploys the API under the stage dev. Auto-deploy means changes go live instantly.
10) Permission for API Gateway to invoke the Lambda
resource "aws_lambda_permission" "allow_apigw_invoke" {
  statement_id  = "AllowInvokeFromHttpApi"
  action        = "lambda:InvokeFunction"
  function_name = aws_lambda_function.hello.function_name
  principal     = "apigateway.amazonaws.com"
  source_arn    = "${aws_apigatewayv2_api.http_api.execution_arn}/*/*"
}
This block adds a resource-based policy to the Lambda so API Gateway is allowed to call it.
Switching Between LocalStack and AWS
Deploy to LocalStack:
terraform apply -var="environment=local"
Deploy to AWS:
terraform apply -var="environment=aws"
Best practices:
- Use separate Terraform state for LocalStack and AWS (different workspaces; see the commands after this list).
- Keep environment-specific variables (names, tags, log retention).
- Always validate IAM policies in AWS — LocalStack doesn’t enforce them.
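For the first point, a possible workflow with separate workspaces looks like this (the workspace names are illustrative):
terraform workspace new localstack
terraform workspace select localstack
terraform apply -var="environment=local"

terraform workspace new aws
terraform workspace select aws
terraform apply -var="environment=aws"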
Local Development with LocalStack
Run LocalStack via pip:
pip install localstack
localstack start
Or Docker:
docker run -it -p 4566:4566 localstack/localstack
Terraform will create your resources in LocalStack. LocalStack serves HTTP APIs (API Gateway v2) on an execute-api hostname, so you can call the API like this, substituting the API ID from the Terraform output shown later:
curl "http://<api_id>.execute-api.localhost.localstack.cloud:4566/dev/hello?name=Gary"
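If everything is wired correctly, the response body is the JSON the handler builds:
{"message": "Hello, Gary!"}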
Unit Testing with Pytest
import json

from lambda_function import handler


def test_handler() -> None:
    """Test the Lambda handler with a query parameter."""
    event = {"queryStringParameters": {"name": "Test"}}

    response = handler(event, None)

    assert response["statusCode"] == 200
    body = json.loads(response["body"])
    assert body["message"] == "Hello, Test!"


def test_handler_no_name_param() -> None:
    """Test the Lambda handler without a query parameter."""
    event = {"queryStringParameters": {}}

    response = handler(event, None)

    assert response["statusCode"] == 200
    body = json.loads(response["body"])
    assert body["message"] == "Hello, World!"
Unit Testing AWS SDK Calls with Moto
Sometimes you want to test code that calls AWS services via boto3 without spinning up LocalStack or hitting the real cloud. Moto is a Python library that mocks out AWS services in memory.
For example, imagine you add an S3 dependency to your Lambda. Here’s a minimal function that writes to a bucket:
from typing import Any, Dict

import boto3

s3 = boto3.client("s3", region_name="eu-west-1")


def save_message(bucket: str, key: str, content: str) -> Dict[str, Any]:
    """
    Save a message to an S3 object.

    Args:
        bucket: S3 bucket name.
        key: Object key.
        content: Content to save.

    Returns:
        S3 put_object response metadata.
    """
    return s3.put_object(Bucket=bucket, Key=key, Body=content)
Testing with Moto
import boto3
from moto import mock_aws

from lambda_with_s3 import save_message


@mock_aws
def test_save_message() -> None:
    """Test writing an object to S3 using Moto to mock AWS."""
    s3 = boto3.client("s3", region_name="eu-west-1")
    # Buckets outside us-east-1 need an explicit location constraint.
    s3.create_bucket(
        Bucket="test-bucket",
        CreateBucketConfiguration={"LocationConstraint": "eu-west-1"},
    )

    response = save_message("test-bucket", "hello.txt", "Hello, Moto!")
    assert response["ResponseMetadata"]["HTTPStatusCode"] == 200

    # Validate the object exists
    obj = s3.get_object(Bucket="test-bucket", Key="hello.txt")
    body = obj["Body"].read().decode()
    assert body == "Hello, Moto!"
What this shows:
- Moto intercepts boto3 calls and simulates AWS behaviour in memory.
- No Docker or AWS account needed.
- Ideal for CI pipelines or rapid TDD loops.
End-to-End Tests
While unit tests are useful for Lambda handlers in isolation, it’s important to validate the entire flow, from API Gateway through to your Lambda code. With LocalStack running, you can do this using requests.
Example test_e2e.py
import os

import requests

# LocalStack serves HTTP APIs on an execute-api hostname (default edge port 4566).
LOCALSTACK_HOST = os.getenv("LOCALSTACK_HOST", "localhost.localstack.cloud:4566")
API_ID = os.getenv("API_ID")  # Set this from Terraform output


def test_hello_endpoint() -> None:
    """
    End-to-end test hitting the LocalStack API Gateway endpoint.
    """
    url = f"http://{API_ID}.execute-api.{LOCALSTACK_HOST}/dev/hello?name=Gary"

    response = requests.get(url, timeout=10)
    assert response.status_code == 200

    payload = response.json()
    assert payload["message"] == "Hello, Gary!"
Terraform Output for API ID
Add this to outputs.tf so your tests know the API ID:
output "api_id" {
  value = aws_apigatewayv2_api.http_api.id
}
Running the Test
Start LocalStack (localstack start or via Docker).
Apply Terraform:
terraform apply -var="environment=local"
Export the API ID:
export API_ID=$(terraform output -raw api_id)
Run the test:
pytest test_e2e.py
This ensures your infrastructure and Python code work together as expected before deploying to AWS.
Local Testing Strategies Compared
Local development is one of the trickiest parts of serverless APIs. Three main tools cover most use cases:
LocalStack
- Best for: End-to-end simulation (API Gateway → Lambda → AWS Services)
- Strengths: High fidelity, works with Terraform, supports many AWS services
- Weaknesses: Requires Docker, slower than mocks
AWS SAM CLI
- Best for: Rapid Lambda development with API Gateway simulation
- Strengths: Backed by AWS, supports hot reloading of functions
- Weaknesses: Limited service coverage, tied closely to AWS tooling
Moto
- Best for: Unit testing Python code that calls boto3
- Strengths: Lightweight, very fast, no Docker required
- Weaknesses: Only mocks boto3, does not cover API Gateway events or full infra
Rule of thumb:
- Use Moto for fast unit tests.
- Use SAM CLI when working with a single Lambda + API Gateway.
- Use LocalStack when validating full stacks defined in Terraform.
Scaling Beyond “Hello”
Patterns that scale well:
- One Lambda per route once functions grow.
- Shared logging/middleware layers (a minimal sketch follows this list).
- Separate environments (dev, staging, prod) with Terraform modules.
- LocalStack for rapid iteration, AWS for production validation.
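As a sketch of the shared logging/middleware idea, here is one way to wrap handlers in a reusable decorator. The with_logging name and behaviour are illustrative; in a real project you would likely ship it as a shared module or Lambda layer rather than copy it into every function:
import json
import logging
from functools import wraps
from typing import Any, Callable, Dict

logger = logging.getLogger()
logger.setLevel(logging.INFO)


def with_logging(func: Callable[..., Dict[str, Any]]) -> Callable[..., Dict[str, Any]]:
    """Log each request/response and convert unhandled errors into a 500."""

    @wraps(func)
    def wrapper(event: Dict[str, Any], context: Any) -> Dict[str, Any]:
        logger.info("Handling %s", event.get("routeKey", "unknown route"))
        try:
            response = func(event, context)
        except Exception:
            logger.exception("Unhandled error")
            return {
                "statusCode": 500,
                "headers": {"Content-Type": "application/json"},
                "body": json.dumps({"message": "Internal server error"}),
            }
        logger.info("Returning status %s", response.get("statusCode"))
        return response

    return wrapper


@with_logging
def handler(event: Dict[str, Any], context: Any) -> Dict[str, Any]:
    name = (event.get("queryStringParameters") or {}).get("name", "World")
    return {
        "statusCode": 200,
        "headers": {"Content-Type": "application/json"},
        "body": json.dumps({"message": f"Hello, {name}!"}),
    }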
Cost and Performance
- Lambda + API Gateway costs scale with usage. For many startups, it’s single-digit dollars per month.
- Latency is usually 50–150ms per request for warm invocations; cold starts add a few hundred milliseconds but are infrequent under steady traffic. For most apps, that’s acceptable.
Final Thoughts
Lightweight APIs aren’t the answer to every problem, but they’re a powerful option when speed, simplicity, and cost-effectiveness matter. By combining AWS API Gateway, Lambda, Terraform, and LocalStack, you can build, test, and deploy robust APIs with confidence, switching seamlessly from laptop to cloud.
Gary Worthington is a software engineer, delivery consultant, and agile coach who helps teams move fast, learn faster, and scale when it matters. He writes about modern engineering, product thinking, and helping teams ship things that matter.
Through his consultancy, More Than Monkeys, Gary helps startups and scaleups improve how they build software — from tech strategy and agile delivery to product validation and team development.
Visit morethanmonkeys.co.uk to learn how we can help you build better, faster.
Follow Gary on LinkedIn for practical insights into engineering leadership, agile delivery, and team performance.