Run and Test¶
Verified status as of March 28, 2026. Runtime note: FastFN resolves dependencies and build steps per function: Python uses `requirements.txt`, Node uses `package.json`, PHP installs from `composer.json` when present, and Rust handlers are built with `cargo`. Host runtimes/tools are required in `fastfn dev --native`, while `fastfn dev` depends on a running Docker daemon.
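The manifest-to-runtime mapping above can be illustrated with a small detection sketch. The directory layout and the `detect_runtime` helper are hypothetical; `Cargo.toml` is simply the standard `cargo` manifest, and the real FastFN inference logic may differ:

```shell
#!/usr/bin/env bash
# Sketch: infer a function's runtime from its dependency manifest,
# mirroring the mapping described above. The fns layout is made up.
set -euo pipefail

detect_runtime() {
  local dir="$1"
  if   [ -f "$dir/requirements.txt" ]; then echo "python"
  elif [ -f "$dir/package.json" ];     then echo "node"
  elif [ -f "$dir/composer.json" ];    then echo "php"
  elif [ -f "$dir/Cargo.toml" ];       then echo "rust"
  else echo "unknown"
  fi
}

# Demo with a throwaway layout:
tmp="$(mktemp -d)"
mkdir -p "$tmp/hello-py" "$tmp/hello-rs"
touch "$tmp/hello-py/requirements.txt" "$tmp/hello-rs/Cargo.toml"
detect_runtime "$tmp/hello-py"   # prints: python
detect_runtime "$tmp/hello-rs"   # prints: rust
```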
Quick View¶
- Complexity: Intermediate
- Typical time: 20-40 minutes
- Use this when: you need full platform validation locally or in CI
- Outcome: health, routing, OpenAPI and test gates are verified
This guide is the practical validation flow for teams treating FastFN as a serious FaaS platform.
It is organized as a staged checklist so you can run it locally and in CI with the same criteria.
Validation scope¶
This flow validates:
- runtime boot and health
- route discovery and request handling
- OpenAPI parity with discovered routes
- route conflict behavior
- regression test suites (unit + integration)
If you need architecture background first, see the Architecture page under Related links.
Prerequisites¶
Required:
- `./bin/fastfn` available
- `curl`, `jq`
Mode-specific:
- Docker mode (`fastfn dev`): Docker CLI + daemon running
- Native mode (`fastfn dev --native`): OpenResty in PATH, plus runtime binaries you plan to use
Stage 1: build and start¶
Build:
Start stack (docker default):
Start stack (native):
Acceptance criteria:
- process starts without fatal errors
- `/_fn/health` reaches HTTP `200`
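A start-then-wait sketch for this stage. The start commands come from this guide; `wait_for_health` is a hypothetical helper, and the port matches the `127.0.0.1:8080` examples used later on this page:

```shell
#!/usr/bin/env bash
# Stage 1 sketch: start the stack, then poll /_fn/health until HTTP 200.
# Start commands (run manually; shown here as comments):
#   ./bin/fastfn dev            # docker mode (default)
#   ./bin/fastfn dev --native   # native mode

wait_for_health() {
  local url="$1" tries="${2:-30}"
  local i
  for i in $(seq "$tries"); do
    if [ "$(curl -s -o /dev/null -w '%{http_code}' "$url")" = "200" ]; then
      return 0
    fi
    sleep 1
  done
  return 1
}

# wait_for_health "http://127.0.0.1:8080/_fn/health" 30 && echo "stack is up"
```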
Stage 2: health and core endpoint checks¶
Health:
Public endpoint smoke:
Acceptance criteria:
- health payload reports enabled runtimes as up
- endpoint calls return HTTP `200`
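The "runtimes as up" criterion can be asserted mechanically with `jq`. The payload below is a hypothetical shape; verify the field names against your actual `/_fn/health` response before wiring this into CI:

```shell
# Hypothetical /_fn/health payload; field names are illustrative only.
payload='{"status":"ok","runtimes":{"python":"up","node":"up","php":"up"}}'

# Exit nonzero unless the stack is ok and every enabled runtime is "up".
echo "$payload" | jq -e '.status == "ok" and (.runtimes | all(. == "up"))'

# Live variant (requires the dev server from Stage 1):
# curl -fsS http://127.0.0.1:8080/_fn/health | \
#   jq -e '.status == "ok" and (.runtimes | all(. == "up"))'
```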
Stage 3: OpenAPI and routing parity checks¶
OpenAPI paths:
Catalog routes and conflict view:
Acceptance criteria:
- all expected public routes appear in OpenAPI
- `mapped_route_conflicts` is empty in normal operation
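Parity can be checked as a set difference between OpenAPI paths and catalog routes. Both JSON samples below are hypothetical shapes; substitute real `/_fn/openapi.json` output and your catalog response:

```shell
# Hypothetical shapes for the OpenAPI document and the route catalog.
openapi='{"paths":{"/hello":{},"/orders":{}}}'
catalog='{"routes":["/hello","/orders"],"mapped_route_conflicts":[]}'

# Routes present in the catalog but missing from OpenAPI paths:
missing="$(comm -13 \
  <(echo "$openapi" | jq -r '.paths | keys[]' | sort) \
  <(echo "$catalog" | jq -r '.routes[]' | sort))"
[ -z "$missing" ] && echo "parity: OK"

# Conflict view must be empty in normal operation:
echo "$catalog" | jq -e '.mapped_route_conflicts == []'
```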
Related behavior:
Stage 4: explicit conflict behavior check¶
FastFN should not silently override mapped URLs by default.
Verify conflict handling policy:
- create two functions claiming the same route
- call catalog and confirm conflict exposure
- ensure collision returns deterministic error behavior
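The duplicate-claim part of this check can be sketched offline. The function-to-route mapping below is a made-up example with one deliberate collision:

```shell
# Made-up function -> route mapping with a deliberate collision.
mappings='fn-a /hello
fn-b /hello
fn-c /orders'

# Any route claimed more than once is a conflict; FastFN should surface
# these rather than silently override one mapping.
dupes="$(echo "$mappings" | awk '{print $2}' | sort | uniq -d)"
[ -n "$dupes" ] && echo "route conflict(s): $dupes"   # prints: route conflict(s): /hello
```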
Policy references:
Stage 5: run full regression suites¶
Core CI-like pipeline:
Focused suites:
bash tests/integration/test-openapi-system.sh
bash tests/integration/test-openapi-native.sh
bash tests/integration/test-api.sh
bash tests/integration/test-home-routing.sh
bash tests/integration/test-auto-install-inference.sh
bash tests/integration/test-platform-equivalents.sh
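A minimal fail-fast wrapper for these suites might look like the following; `run_suites` is a hypothetical helper for local use, not a FastFN command:

```shell
#!/usr/bin/env bash
# Fail-fast wrapper over the suites above; stops at the first failure.
set -euo pipefail

run_suites() {
  local suite
  for suite in "$@"; do
    echo "== $suite"
    bash "$suite" || { echo "FAILED: $suite" >&2; return 1; }
  done
}

# run_suites tests/integration/test-openapi-system.sh \
#            tests/integration/test-api.sh
```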
Acceptance criteria:
- no skipped mandatory checks in CI
- OpenAPI parity tests pass in both docker and native where required
Stage 6: publish-quality tracking checklist¶
Use this checklist before merge/release:
- [ ] `/_fn/health` returns 200 and runtimes up
- [ ] representative public routes return expected status/body
- [ ] `/_fn/openapi.json` includes mapped public routes
- [ ] `mapped_route_conflicts` is empty (or intentionally documented)
- [ ] `test-openapi-system.sh` passes
- [ ] `test-openapi-native.sh` passes (or explicitly justified for local non-native environments)
- [ ] `test-home-routing.sh` passes (root/override + folder home alias via `fn.config.json`)
- [ ] `test-auto-install-inference.sh` passes (strict inference + metadata visibility)
- [ ] `test-platform-equivalents.sh` passes (advanced auth/webhook/jobs/order examples)
- [ ] docs links added/updated for changed behavior
For production hardening next, see Deploy to production and the Security confidence checklist under Related links.
Flow Diagram¶
flowchart LR
A["Client request"] --> B["Route discovery"]
B --> C["Policy and method validation"]
C --> D["Runtime handler execution"]
D --> E["HTTP response + OpenAPI parity"]
Objective¶
Validate FastFN end to end (health, routing, OpenAPI parity, regression suites) with the same criteria locally and in CI. Intended for teams treating FastFN as a production FaaS platform.
Validation Checklist¶
- Command examples execute with expected status codes
- Routes appear in OpenAPI where applicable
- References at the end are reachable
Troubleshooting¶
- If a runtime is down, verify host dependencies and the health endpoint
- If routes are missing, re-run discovery and check folder layout
See also¶
Unit and integration quick recipes¶
Use unit tests for function logic and integration scripts for routing/runtime parity.
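A minimal integration recipe, assuming the dev server from the earlier stages is listening on `127.0.0.1:8080`; `expect_status` is a hypothetical helper:

```shell
# Assert that a route returns the expected HTTP status code.
expect_status() {
  local url="$1" want="$2" got
  got="$(curl -s -o /dev/null -w '%{http_code}' "$url")"
  if [ "$got" != "$want" ]; then
    echo "FAIL $url: want $want, got $got" >&2
    return 1
  fi
}

# Against a running stack:
# expect_status http://127.0.0.1:8080/_fn/health 200
```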
Testing seams and mocking strategy¶
Prefer seams at:
- external HTTP clients
- database adapters
- clock/time providers
- process/signal boundaries in native mode
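As one example, the external-HTTP-client seam can be expressed in shell by routing calls through an overridable variable; `HTTP_GET` and `fetch_upstream` are hypothetical names, not part of FastFN:

```shell
# Test seam for the external HTTP client: the handler calls through an
# overridable variable instead of invoking curl directly.
HTTP_GET="${HTTP_GET:-curl -fsS}"

fetch_upstream() {
  # Intentionally unquoted so the command and its flags word-split.
  $HTTP_GET "$1"
}

# In a unit test, stub the seam instead of touching the network:
HTTP_GET="echo STUB:"
fetch_upstream "https://example.com/api"   # prints: STUB: https://example.com/api
```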
Debug checklist¶
- confirm route discovery output
- confirm `/_fn/health` status
- test endpoint with verbose curl (`-i`, `-v`)
- inspect runtime logs by language
- isolate minimal repro function
Read handler debug output¶
For a function like:
use these rules:
- `fastfn dev` terminal logs: full stdout/stderr, with function prefixes
- `X-Fn-Stdout` / `X-Fn-Stderr`: useful from external clients, but truncated
- `/_fn/invoke`: full stdout/stderr in JSON for console/admin flows
- `fastfn logs --native --file runtime`: full local runtime log stream in native mode
- `/_fn/logs?file=runtime...`: full filtered runtime log stream for admin/internal clients
Example runtime log line:
Local native tail:
From another app or admin tool, read the internal log tail endpoint with an admin token:
curl -sS 'http://127.0.0.1:8080/_fn/logs?file=runtime&format=json&runtime=python&fn=hello&version=default&stream=stdout&lines=50' \
-H 'x-fn-admin-token: my-secret-token'
Use headers for a quick preview and runtime logs for the full stream.
Next step¶
Continue with Deploy to production after this checklist is consistently green in local and CI.
Related links¶
- Deploy to production
- Security confidence checklist
- Zero-config routing
- Platform runtime plumbing
- HTTP API reference
- Function specification
- Architecture
- Get help