The World's Smallest Lambda Function

A 6KB deployment package serving verified computation at sub-millisecond latency

I deployed a Lambda function yesterday. The zip file was 6KB. The cold start was 4.5 milliseconds. Warm invocations run in under a millisecond. AWS bills me the minimum: 1ms per request.

The function serves a complete website — 7 HTML pages with routing, content-type handling, and favicon support — plus API endpoints that compute GCD, primality testing, and factorial using verified machine code.

Every computation in the binary was synthesized from a natural language description, validated against test cases, and compiled to LLVM IR. The binary contains zero unverified computation. The only C code is 9 syscall wrappers that talk to the kernel.

The Numbers

For context:                                                                                                             

  • Cold start: 4.5ms (vs 200-500ms Node.js)
  • Warm invoke: 0.9ms (vs 5-20ms Node.js)
  • Zip size: 6KB (vs 50MB+ Node.js)
  • Memory: 12MB (vs 128MB+ Node.js)
  • Dependencies: 0 (vs hundreds)                                                                                          
  • Cost per request: $0.0000000021 (vs 50-100x more)
  • Verified computation: 100% (vs 0%)

The cold start is 3.4ms for init + 1.1ms for the first invocation. After that, every request completes in under a millisecond. AWS rounds up to 1ms — the minimum billable unit. You cannot pay less for compute on Lambda.
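The per-request figure checks out against Lambda's list pricing. A quick sanity check in C, assuming the smallest 128MB memory configuration and us-east-1 x86 pricing of $0.0000166667 per GB-second (these specific prices are my assumption, not stated in the article), and ignoring the separate $0.20-per-million request fee:

```c
/* Compute-only cost of one Lambda invocation, in dollars.
 * Assumed list price: $0.0000166667 per GB-second (us-east-1, x86).
 * Excludes the flat $0.20-per-million request charge. */
static double lambda_compute_cost(double mem_mb, double billed_ms) {
    const double price_per_gb_s = 0.0000166667;
    return price_per_gb_s * (mem_mb / 1024.0) * (billed_ms / 1000.0);
}

/* lambda_compute_cost(128.0, 1.0) comes out to roughly 2.1e-9 dollars,
 * i.e. the $0.0000000021 per request quoted above. */
```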

How It Works

The binary is a custom Lambda runtime — it implements the Lambda Runtime API directly:

  1. Read AWS_LAMBDA_RUNTIME_API from the environment
  2. GET /2018-06-01/runtime/invocation/next — poll for events
  3. Extract the request path from the JSON event
  4. Route dispatch via verified hash_string (FNV-1a, O(1))
  5. Compute the result using verified capabilities
  6. POST /2018-06-01/runtime/invocation/{id}/response — return result
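Step 3 is small enough to sketch whole. A plain-C illustration of pulling a string value like "rawPath" out of the event JSON — the function name and scanning approach are my illustration, not the verified extractor itself:

```c
#include <stdio.h>
#include <string.h>

/* Find "key":"value" in the event JSON and copy the value into out.
 * Returns 0 on success, -1 if the key is absent. Illustrative only:
 * assumes no escaped quotes inside the value. */
static int extract_json_string(const char *json, const char *key,
                               char *out, size_t out_len) {
    char pattern[64];
    snprintf(pattern, sizeof pattern, "\"%s\":\"", key);
    const char *p = strstr(json, pattern);
    if (p == NULL)
        return -1;
    p += strlen(pattern);
    size_t i = 0;
    while (p[i] != '\0' && p[i] != '"' && i + 1 < out_len) {
        out[i] = p[i];
        i++;
    }
    out[i] = '\0';
    return 0;
}

/* The surrounding loop (network I/O elided):
 *   for (;;) {
 *       GET  $AWS_LAMBDA_RUNTIME_API/2018-06-01/runtime/invocation/next
 *       extract_json_string(event, "rawPath", path, sizeof path);
 *       ... dispatch on the path hash, compute, format the body ...
 *       POST $AWS_LAMBDA_RUNTIME_API/2018-06-01/runtime/invocation/{id}/response
 *   }
 */
```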

Every step except the HTTP I/O is a verified function — synthesized from intent, tested against cases, compiled to native machine code. The route dispatch uses a verified hash function. The JSON parsing uses a verified key extractor. The string operations are all verified.
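FNV-1a itself fits in a few lines. A plain-C version using the published 32-bit constants (the article doesn't say whether the verified hash_string is the 32- or 64-bit variant, so this shows the 32-bit one):

```c
#include <stdint.h>

/* 32-bit FNV-1a over a NUL-terminated string, using the published
 * FNV offset basis and prime. */
static uint32_t fnv1a(const char *s) {
    uint32_t h = 2166136261u;          /* FNV offset basis */
    while (*s != '\0') {
        h ^= (uint8_t)*s++;
        h *= 16777619u;                /* FNV prime */
    }
    return h;
}

/* Dispatch then compares fnv1a(path) against precomputed route hashes:
 * one integer compare per route, no string compares on the hot path. */
```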

What's In The Binary

The binary is 17KB stripped, statically linked, no libc. It contains:

  • 20 verified capabilities — synthesized from English descriptions, each with test evidence
  • 9 syscall wrappers — the complete OS interface (open, read, write, close, socket, connect, bind, listen, accept)
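For a sense of scale, a libc-free syscall wrapper is just inline assembly around the syscall instruction. A sketch for x86-64 Linux (the target architecture is my assumption; Lambda also offers arm64, which uses a different convention):

```c
#include <sys/syscall.h>   /* SYS_write */

/* x86-64 Linux convention: syscall number in rax, arguments in
 * rdi/rsi/rdx; the syscall instruction clobbers rcx and r11. */
static long sys_write(long fd, const void *buf, unsigned long len) {
    long ret;
    __asm__ volatile (
        "syscall"
        : "=a"(ret)
        : "a"(SYS_write), "D"(fd), "S"(buf), "d"(len)
        : "rcx", "r11", "memory"
    );
    return ret;   /* bytes written, or -errno on failure */
}
```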
  • Compiled Forth — route definitions compiled from Forth to LLVM IR at build time
  • Nothing else — no runtime, no interpreter, no framework, no garbage collector

The entire attack surface is:

  • 20 capabilities declared in catalog.conf
  • 9 syscall wrappers
  • A FROM scratch mentality — if it's not in the catalog, it doesn't exist in the binary

The Cost Story

At $2.10 per million requests for verified computation vs $100+ per million for an equivalent Node.js Lambda with unverified dependencies — the Semcom binary is cheaper AND proven correct.

For a typical enterprise API handling 10 million requests/month:

  • Semcom: $21/month compute
  • Node.js: $1,000+/month compute
  • The Node.js app: 1,200 dependencies, any of which could be compromised
  • The Semcom binary: zero dependencies, every function has test evidence

Why This Matters

The software industry has accepted two things as normal:

  1. Binaries are large because runtimes are large
  2. Dependencies are unauditable because there are too many

Both of these are choices, not constraints. A Lambda function that computes GCD doesn't need Node.js. It doesn't need npm. It doesn't need 50MB of code to add two numbers and check a remainder.

What it needs is:

  • A function that computes GCD (29 bytes of machine code)
  • A way to receive the request (syscall wrappers)
  • A way to send the response (syscall wrappers)
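The capabilities are tiny because the algorithms are tiny. Plain-C equivalents of the three endpoint computations — Euclid's algorithm, trial division, iterative factorial — illustrative stand-ins, not the synthesized machine code:

```c
#include <stdint.h>

/* Euclid's algorithm: the loop that compiles to ~29 bytes. */
static uint64_t gcd(uint64_t a, uint64_t b) {
    while (b != 0) {
        uint64_t t = a % b;
        a = b;
        b = t;
    }
    return a;
}

/* Trial division up to sqrt(n); returns 1 if prime, 0 otherwise. */
static int is_prime(uint64_t n) {
    if (n < 2) return 0;
    for (uint64_t d = 2; d * d <= n; d++)
        if (n % d == 0) return 0;
    return 1;
}

/* Iterative factorial; overflows uint64_t past n = 20. */
static uint64_t factorial(uint64_t n) {
    uint64_t r = 1;
    for (uint64_t i = 2; i <= n; i++)
        r *= i;
    return r;
}
```

These reproduce the live endpoint outputs shown under "Try It": gcd(48, 18) is 6, is_prime(97) is 1, factorial(10) is 3628800.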

Everything else is weight. We removed the weight. What's left zips to 6KB, and it's verified.

Try It

The Lambda is live:

curl https://bl2jmkxkns6pweupcmqjcbu54u0bxgje.lambda-url.us-east-1.on.aws/api/gcd/48/18
→ 6

curl https://bl2jmkxkns6pweupcmqjcbu54u0bxgje.lambda-url.us-east-1.on.aws/api/prime/97
→ 1

curl https://bl2jmkxkns6pweupcmqjcbu54u0bxgje.lambda-url.us-east-1.on.aws/api/factorial/10
→ 3628800

Each response is a verified computation. Each capability was synthesized from intent, validated against test cases, and compiled to native machine code. The deployment package that ships it all is 6KB.

The Semantic Compiler is patent pending. If you're building systems where correctness matters — defense, medical devices, financial services, critical infrastructure — I'd like to talk.

Lane Thompson, CEO, Apilify Inc. · lane@apilify.com
