Serverless Functions
Deploy individual functions that execute on demand without managing servers, scaling automatically from zero to thousands of concurrent invocations.
Description
Serverless functions (Function-as-a-Service) allow deploying individual functions that run in response to events -- HTTP requests, queue messages, database changes, scheduled timers -- without provisioning or managing servers. The cloud provider handles all infrastructure, scaling instances from zero to thousands automatically based on demand, and billing is based on actual execution time (measured in milliseconds) and memory allocation rather than reserved capacity.
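The event-plus-context handler model described above can be sketched as follows. This is a minimal illustration assuming the shape of an AWS Lambda HTTP event; the interfaces are simplified stand-ins, not the full types from the AWS SDK.

```typescript
// Simplified stand-ins for the Lambda HTTP event and context shapes.
interface HttpEvent {
  rawPath: string;
  body?: string;
}

interface LambdaContext {
  awsRequestId: string;
  getRemainingTimeInMillis: () => number;
}

interface HttpResponse {
  statusCode: number;
  body: string;
}

// The platform invokes this once per event; billing covers only the
// milliseconds spent inside the handler, not idle time between events.
export async function handler(
  event: HttpEvent,
  context: LambdaContext
): Promise<HttpResponse> {
  const payload = event.body ? JSON.parse(event.body) : {};
  return {
    statusCode: 200,
    body: JSON.stringify({
      path: event.rawPath,
      received: payload,
      requestId: context.awsRequestId,
    }),
  };
}
```

The handler itself keeps no state between invocations, matching the stateless model: everything it needs arrives in the event, and anything worth persisting would go to an external store.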
AWS Lambda, Google Cloud Functions, Azure Functions, Cloudflare Workers, and Vercel Functions are the major platforms. Each function receives an event payload and a context object, executes its logic, and returns a response. Functions should be stateless and idempotent, storing any persistent state in external services (databases, S3, Redis). Cold starts -- the latency penalty when a new execution environment is initialized -- are a key concern, mitigated by keeping deployment packages small, using provisioned concurrency for latency-sensitive endpoints, and choosing runtimes with fast startup (Node.js, Python, Go over Java/C#).
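One common cold-start mitigation follows directly from the execution model: work done at module scope runs once when the execution environment initializes, and warm invocations reuse it. The sketch below demonstrates that pattern; `createClient` is a hypothetical stand-in for any expensive setup (a database client, SDK instantiation, config parsing), not a real API.

```typescript
// Counts how many times the expensive setup actually runs.
let initCount = 0;

// Hypothetical expensive initialization (stands in for a DB client, etc.).
function createClient(): { query: (sql: string) => string } {
  initCount++; // incremented only on cold start
  return { query: (sql) => `result of: ${sql}` };
}

// Module scope executes during the cold start, before the first invocation.
// Every warm invocation in this execution environment reuses `client`.
const client = createClient();

export async function handler(event: { sql: string }): Promise<string> {
  // No per-invocation setup: the handler only does request-specific work.
  return client.query(event.sql);
}

export function coldStarts(): number {
  return initCount;
}
```

Invoking the handler repeatedly in the same environment leaves `coldStarts()` at 1: the setup cost is paid once per environment, not once per request. Note that this reuse is best-effort; the platform may recycle environments at any time, which is why persistent state still belongs in external services.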
Serverless functions excel for event-driven workloads (webhooks, image processing, ETL pipelines), APIs with variable traffic (scaling to zero during idle periods), and extending existing applications with isolated microservices. They are less suited for long-running tasks (most platforms cap execution at around 15 minutes), workloads requiring persistent connections (WebSockets; database connection pooling needs RDS Proxy or similar), and latency-sensitive endpoints where cold starts are unacceptable. The Serverless Framework, SST, and SAM provide tooling for managing serverless applications across environments.
Prompt Snippet
Deploy serverless functions on AWS Lambda with Node.js 20 runtime using SST (or Serverless Framework). Configure memory at 256MB with 30-second timeout for API handlers and 512MB with 5-minute timeout for background processing. Use Lambda layers for shared dependencies to reduce cold start times. Set provisioned concurrency of 5 for latency-sensitive endpoints. Connect to RDS via RDS Proxy to manage database connection pooling. Use Lambda Powertools for structured logging, tracing (X-Ray), and idempotency middleware with a DynamoDB-backed idempotency store. Define API Gateway v2 (HTTP API) routes with request validation and JWT authorizer.
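The memory/timeout split in the prompt above could be expressed in an SST config roughly like this. The construct names and property formats follow the SST v2 API as best understood; treat it as a sketch and verify property names against current SST documentation (provisioned concurrency, layers, Powertools, and the JWT authorizer from the prompt are omitted for brevity).

```typescript
// sst.config.ts -- sketch of the deployment described in the prompt.
import { SSTConfig } from "sst";
import { Api, Function } from "sst/constructs";

export default {
  config() {
    return { name: "my-app", region: "us-east-1" };
  },
  stacks(app) {
    // Defaults for API handlers: 256 MB, 30-second timeout.
    app.setDefaultFunctionProps({
      runtime: "nodejs20.x",
      memorySize: "256 MB",
      timeout: "30 seconds",
    });

    app.stack(function ApiStack({ stack }) {
      // HTTP API routes backed by Lambda handlers.
      new Api(stack, "api", {
        routes: {
          "POST /orders": "packages/functions/src/orders.handler",
        },
      });

      // Background processor: more memory, longer timeout.
      new Function(stack, "worker", {
        handler: "packages/functions/src/worker.handler",
        memorySize: "512 MB",
        timeout: "5 minutes",
      });
    });
  },
} satisfies SSTConfig;
```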
Related Terms
Edge Computing
Execute application logic at edge locations worldwide, minimizing latency by running code geographically close to users.
Auto-Scaling
Automatically adjust the number of running application instances based on real-time demand metrics.
12-Factor App Methodology
A set of twelve principles for building modern, scalable, maintainable software-as-a-service applications.
Environment Variable Management
Externalize application configuration into environment variables to separate config from code across environments.