
Support for runtime: provided without requiring useDocker #1792

Open
wants to merge 10 commits into base: master
Conversation

@cnuss (Contributor) commented May 27, 2024

Description

This PR adds support for runtime: provided in serverless.yml when useDocker is unset or false in custom.serverless-offline. Previously, that configuration failed with:

GET /dev/hello (λ: hello)
× Unsupported runtime
× Uncaught exception
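For reference, a minimal serverless.yml along these lines would previously have hit that error (the service name, handler, and route below are hypothetical, not taken from the PR):

```yaml
service: my-service            # hypothetical service name

provider:
  name: aws
  runtime: provided            # custom runtime; previously required useDocker

plugins:
  - serverless-offline

custom:
  serverless-offline: {}       # note: no useDocker flag set

functions:
  hello:
    handler: bootstrap         # compiled binary named "bootstrap"
    events:
      - httpApi:
          path: /hello
          method: get
```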

With this change, serverless-offline uses the execa library to execute the bootstrap script locally:

GET /dev/hello (λ: hello)

(λ: hello) RequestId: b440ae80-b0ca-4d0d-ab6f-ea3226cb9d1a  Duration: 55.80 ms  Billed Duration: 56 ms

Motivation and Context

  • I have a project where I compile a Go application into a binary
  • The compiled binary is simple enough that it doesn't need Docker as an abstraction layer for execution
  • This change allows direct execution of the bootstrap script without launching it inside a Docker container
  • Direct execution makes local runs of compiled binaries faster
  • It also lets people connect to debug ports for their compiled binaries (such as launching with delve)

How this works:

Notes

  • Design choice: one RuntimeServer per invocation:
    • We could, possibly, create a single runtime server and share it between multiple invocations. I considered that a premature optimization, so I opted to create a brand-new RuntimeServer for each invocation, for a more Lambda-like experience.
    • A single RuntimeServer could be shared by setting the AWS_LAMBDA_RUNTIME_API environment variable to http://localhost:${some-non-random-port}/function-name instead of http://localhost:${some-random-port}
    • I'd love your feedback on whether you want this or not.
  • This doesn't support Lambda Layers yet, although I presume it could
    • For now it throws an error if useDocker is false or undefined and the function has layers

How Has This Been Tested?

  • I copied ./tests/integration/docker/provided to ./tests/lambda-run-mode/provided
    • I removed useDocker: true and iterated until it worked

cnuss added 5 commits May 14, 2024 06:30
* master:
  refactor: use provided log utils (dherault#1784)
  fix: skip adding authorizer to event if no authorizer is configured (dherault#1786)
  Update README.md
* master:
  fix: Support httpApi authorizer with different config and function names (dherault#1763)
  fix: Support ALB Event response headers (dherault#1756)
  v13.6.0
  fix: treat application/octet-stream as a binary encoding (dherault#1587)
  feat: add support for provided.al2023 (dherault#1788)
  v13.5.1
@cnuss (Contributor, Author) commented May 27, 2024

Hi @dherault and @DorianMazur this PR might stir up some debate, so let me know what you think!

Long story short, I never liked requiring Docker to run serverless applications that ship a compiled binary. It also caused long-ish first runs of a Lambda function, as it downloads and runs the base container and layers.

I found myself wishing Docker wasn't involved in the execution of the bootstrap script, so I decided to make this PR.

@cnuss (Contributor, Author) commented Jun 4, 2024

Hi @dherault and @DorianMazur, it's been about a week. Have you been able to take a look at this?

@DorianMazur (Collaborator) commented
Thanks for the PR @cnuss, but I'd stick with docker in this case. Let's wait for @dherault's response.
