Thursday, May 8, 2025

Local AWS API Gateway development with Python

I’m going to take a break from writing posts about the goblinfish-testing-pact package for a bit — the work on it is still going on in the background, but it’s going slowly because of constraints on my time (and, if I’m being honest, because I’m not looking forward to trudging through the remaining unit tests there). I needed to change things up a bit, and write about something different in order to have a post ready for my Monday/Thursday publishing schedule.

What I opted to write about is the first iteration of another package that I came up with over the course of a technical challenge for a recent job prospect. I won’t go too deeply into the specifics of the challenge — the company in question might pose it to other candidates — but my solution for it got me to thinking about how, at my previous position, we handled developing APIs using AWS’ API Gateway, Lambda Functions, and the Lambda Proxy Integration between the two. We defined our infrastructure using AWS SAM, and testing it locally was not really an option without using the sam local command. By the time I was part of the team where local development and testing would have been most useful, other ways of handling it had been devised that did not involve using sam local. I wasn’t part of the discussions that led to the approach that team used, but I would guess that the decision was made to avoid using sam local because it was slow. When I looked into sam local for my own ends, it appeared to build and spin up a local Docker container for every Lambda Function on every API request, even if one had already been created.

That, then, got me to thinking about how to provide a better way. Essentially, what I was aiming for was a way to set up a local API that would accept real HTTP requests, hand them off to the same Lambda handler functions that would run in AWS, and do so without the build-and-container overhead of sam local.

The Python ecosystem is not lacking for packages that provide locally-executable HTTP API functionality. Setting aside Django, which is more an application-development environment (though there is an add-on, the Django REST Framework that provides REST API functionality), the two that seem to be the most popular are Flask and FastAPI.

Flask

Flask is a lightweight WSGI web application framework. It is designed to make getting started quick and easy, with the ability to scale up to complex applications. It began as a simple wrapper around Werkzeug and Jinja, and has become one of the most popular Python web application frameworks.

Flask offers suggestions, but doesn’t enforce any dependencies or project layout. It is up to the developer to choose the tools and libraries they want to use. There are many extensions provided by the community that make adding new functionality easy.

FastAPI

FastAPI is a modern, fast (high-performance), web framework for building APIs with Python based on standard Python type hints.

The key features are:

  • Fast: Very high performance, on par with NodeJS and Go (thanks to Starlette and Pydantic). One of the fastest Python frameworks available.
  • Fast to code: Increase the speed to develop features by about 200% to 300%.
  • Fewer bugs: Reduce about 40% of human (developer) induced errors.
  • Intuitive: Great editor support. Completion everywhere. Less time debugging.
  • Easy: Designed to be easy to use and learn. Less time reading docs.
  • Short: Minimize code duplication. Multiple features from each parameter declaration. Fewer bugs.
  • Robust: Get production-ready code. With automatic interactive documentation.
  • Standards-based: Based on (and fully compatible with) the open standards for APIs: OpenAPI (previously known as Swagger) and JSON Schema.

Both offer a fairly simple decorator-based approach to providing API endpoints: write a function, apply the appropriate decorator, and that’s all that the function needs to handle an incoming request and return a response. Both also offer a local development server, allowing someone working on the API code to run and debug it locally. Both of those local servers can also watch at least some of the local project files, restarting the server when a relevant file changes. Even in cases where a change to a Lambda Function file is not picked up automatically, restarting the local API is much faster than waiting for the sam build and sam local processes to complete, and the resolution of a local API request, assuming that it can simply call the relevant Lambda handler function, is immediate, with no Docker container needing to spin up first.

There are trade-offs, to be sure. The SAM CLI presumably supports other Serverless Application Model resources that may not have local-API equivalents. In particular, GraphQLApi, SimpleTable and StateMachine resources, if they are needed by an application, are likely to need special handling from a local development and testing perspective. All of the other API types, though, can be represented at a very basic level, accepting requests and returning responses, and Lambda Layers are just a code-import problem to be solved. I cannot speak to the remaining SAM resource types, having never needed to use them, and any additional resources defined in a SAM template with standard CloudFormation almost certainly cannot be represented in a local API implementation.

For the sake of this post, I’m going to start with Flask as the API provider, not because it’s necessarily better, but because I’m more familiar with it than any of the other options. A “toy” project layout will be helpful in describing what I’m trying to accomplish:

toy-project/
├─ Pipfile
│  │  # Packages managed in categories
│  ├─ [api-person-rest]
│  │  └─ ...
│  └─ [local-api]
│     └─ Flask
├─ Pipfile.lock
├─ .env
├─ src/
│  │  # The modules that define the Lambda Handler functions
│  └─ api_person_lambdas.py
│     │  # The functions that handle {HTTP-verb} requests
│     ├─ ::get_person(event, context)
│     ├─ ::post_person(event, context)
│     ├─ ::put_person(event, context)
│     ├─ ::patch_person(event, context)
│     └─ ::delete_person(event, context)
├─ local-api/
│  └─ api_person_rest.py
│     │  # The Flask() application object; its 'route' decorator
│     │  # accepts a path and methods (e.g., 'GET', 'POST').
│     ├─ ::app
│     │  # These are decorated with app.route('path', methods=[]),
│     │  # and Flask provides a request object that may be used
│     │  # in each.
│     ├─ ::api_get_person()
│     ├─ ::api_post_person()
│     ├─ ::api_put_person()
│     ├─ ::api_patch_person()
│     └─ ::api_delete_person()
└─ tests/

In this project, the Flask application lives entirely under the local-api directory, and its api_person_rest module defines a fairly typical set of CRUD operation functions for HTTP GET, POST, PUT, PATCH and DELETE requests. Each of those functions is decorated according to Flask standards; the bare bones of the code in api_person_rest.py would start with something like this, assuming a common /person route, and no other parameters defined at this point:

from flask import Flask, request

app = Flask(__name__)

@app.route('/person', methods=['GET'])
def api_get_person():
    """Handles GET /person requests"""
    # Needs to call get_person(event, context)
    ...

@app.route('/person', methods=['POST'])
def api_post_person():
    """Handles POST /person requests"""
    # Needs to call post_person(event, context)
    ...

@app.route('/person', methods=['PUT'])
def api_put_person():
    """Handles PUT /person requests"""
    # Needs to call put_person(event, context)
    ...

@app.route('/person', methods=['PATCH'])
def api_patch_person():
    """Handles PATCH /person requests"""
    # Needs to call patch_person(event, context)
    ...

@app.route('/person', methods=['DELETE'])
def api_delete_person():
    """Handles DELETE /person requests"""
    # Needs to call delete_person(event, context)
    ...
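
One small, optional addition (purely an assumption on my part about how the local API might be launched, not a requirement of the eventual package) is to make the module directly runnable with Flask’s debug mode enabled, which turns on the auto-reloader mentioned earlier:

if __name__ == '__main__':
    # Debug mode enables Flask's auto-reloader, so the local API restarts
    # when watched project files change.
    app.run(debug=True, port=5000)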

When the local API is actually running, requests to the /person route would be dispatched to the appropriate endpoint function based on the HTTP verb involved. From there, what needs to happen is a series of steps that is simple to describe, but whose implementation may be quite a bit more complex:

  • The API function needs to know to call the appropriate function from src/api_person_lambdas. For example, if a GET /person request is received by the API, the routing defined will tell the API to call the api_get_person function, and that function will need to call the api_person_lambdas::get_person function.
  • Before actually making that function-call, the incoming request needs to be converted into a Lambda Proxy Integration input data-structure. The Lambda Powertools Parser package could be installed and leveraged to provide a pre-defined data-model, complete with validation of the data types, to that end.
  • Since the Lambda handler also has a context argument, which may or may not be used by the handler, a Lambda context object also needs to be created.
  • Once the event and context have been created, the API function can call the Lambda handler: api_person_lambdas::get_person(event, context).
  • The Lambda handler is expected, at least in this case, to return a Lambda Proxy Integration output (which may also be represented in the Lambda Powertools models, though the naming of those models isn’t clear enough for me to say with any certainty whether that is the case).
  • The response from the Lambda handler will need to be converted to a Flask Response object, possibly using the make_response helper-function that Flask provides.
  • That response will be returned through normal Flask response processes; a rough sketch of the whole sequence, using the GET handler as an example, follows this list.
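
To make those steps a bit more concrete, here is a minimal sketch of what that sequence might look like for the GET handler. The _request_to_lpi_event and _fake_lambda_context helpers are placeholder names of my own, the event dictionary only carries a subset of the real Lambda Proxy Integration fields, and the import of api_person_lambdas assumes that src/ is on the import path (for example, via a PYTHONPATH entry in the project’s .env file):

from types import SimpleNamespace

from flask import Flask, request, make_response

import api_person_lambdas

# Restates the application object from the skeleton above
app = Flask(__name__)

def _request_to_lpi_event() -> dict:
    """Builds a partial Lambda Proxy Integration event from the Flask request."""
    return {
        'resource': '/person',
        'path': request.path,
        'httpMethod': request.method,
        'headers': dict(request.headers),
        'queryStringParameters': request.args.to_dict() or None,
        'pathParameters': None,
        'body': request.get_data(as_text=True) or None,
        'isBase64Encoded': False,
    }

def _fake_lambda_context():
    """A stand-in for a LambdaContext, with only commonly-used attributes."""
    return SimpleNamespace(
        function_name='api_person_lambdas',
        memory_limit_in_mb=128,
        invoked_function_arn='arn:aws:lambda:local:000000000000:function:local',
        aws_request_id='local-request',
    )

@app.route('/person', methods=['GET'])
def api_get_person():
    """Handles GET /person requests by calling the Lambda handler directly."""
    result = api_person_lambdas.get_person(
        _request_to_lpi_event(), _fake_lambda_context()
    )
    # The handler is expected to return a Lambda Proxy Integration output dict
    return make_response(
        result.get('body') or '',
        result.get('statusCode', 200),
        result.get('headers') or {},
    )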

With those in mind, the to-do list for this package effort boils down to these items, I think:

  • Figure out how to map API (Flask) endpoint-function calls to their corresponding Lambda Function handlers.
  • Figure out how to convert an API request into a Lambda Proxy Integration event structure.
    • Take a deeper look at the Lambda Powertools parser extra to see if it provides both input/request and output/response models for that integration.
  • Figure out how to generate a meaningful, realistic LambdaContext object from an API request.
    • If it’s not possible, or not a realistic expectation for complete LambdaContext objects to be populated, define a minimum acceptable basis for creating one.
  • Determine the best approach for having a route-decorated API function call the appropriate Lambda handler (a sketch of the first option appears after this list). Some possibilities to explore include:
    • An additional decorator between the app.route decorator provided by Flask and the target function.
    • Extending the Flask Application object to add a new route-equivalent decorator that handles the process.
    • Overriding the existing decorator to handle the process.
    • Manually dealing with it in some fashion is acceptable as a starting-point, but not where it should end up by the time the package reaches a Development Status :: 5 - Production/Stable / v.1.0.0 release.
  • Figure out how to convert a Flask Response object to a Lambda Proxy Integration output object.
  • Implement anything that wasn’t implemented as part of the discovery above.
  • Test everything that wasn’t tested during previous implementation.
  • Release v.1.0.0
  • Figure out how to read a SAM Template file to automate the creation of endpoint-function to Lambda-handler function processes, and implement it.
  • Release v.1.1.0
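
To make the first of those decorator possibilities a bit more concrete, here is a rough sketch of what an additional decorator sitting between Flask’s app.route decorator and the endpoint function might look like. The lambda_proxy name is hypothetical, and the app object, _request_to_lpi_event and _fake_lambda_context are assumed to be the placeholder pieces sketched earlier in this post; none of this is part of an existing package:

import functools

from flask import make_response

import api_person_lambdas

def lambda_proxy(handler):
    """Decorator factory: wires a Flask endpoint to a Lambda handler function."""
    def decorator(endpoint_func):
        @functools.wraps(endpoint_func)
        def wrapper(*args, **kwargs):
            # Build the event/context using the placeholder helpers sketched
            # earlier, call the Lambda handler, and convert its output to a
            # Flask response. The endpoint function itself is never called;
            # it exists only to carry the route registration.
            result = handler(_request_to_lpi_event(), _fake_lambda_context())
            return make_response(
                result.get('body') or '',
                result.get('statusCode', 200),
                result.get('headers') or {},
            )
        return wrapper
    return decorator

@app.route('/person', methods=['GET'])
@lambda_proxy(api_person_lambdas.get_person)
def api_get_person():
    """Handles GET /person requests; all of the work happens in the decorator."""
    ...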

And with that roadmap all defined, at least for now, this post is done, I think. As I write more on this idea, and get the package moving and released, the relevant posts will all be tagged with “local.lpi_apis” (Lambda Proxy Integration APIs) for ease of following my stream of thoughts and work on it.
