Functions should always execute close to where your data source is to reduce latency. By default, functions using the Node.js runtime execute in Washington, D.C., USA (`iad1`), a common location for external data sources. You can set a new default region through your project's settings on Vercel.
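If you prefer configuration in code, the region can also be set in `vercel.json`; a minimal sketch, where `sfo1` is purely an illustrative region ID:

```json
{
  "regions": ["sfo1"]
}
```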
Edge Functions execute in the region closest to the user, which could result in longer response times when the function relies on a database located far away. For example, if a visitor triggers an Edge Function in Japan, but it depends on a database in San Francisco, the Function will have to send requests to and wait for a response from San Francisco for each call.
To avoid these long round trips, you can limit your Edge Functions to regions near your database, or you can use a globally distributed database. Vercel's storage options allow you to determine the best location for your database.
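For example, in a Next.js App Router project an Edge Function's region can be constrained with the `preferredRegion` segment option; a minimal sketch, where the route path and the `sfo1` region (standing in for a database near San Francisco) are assumptions:

```typescript
// app/api/region/route.ts
export const runtime = 'edge';

// Pin this Edge Function near a hypothetical database in San Francisco (sfo1);
// without this, it runs in the region closest to the visitor.
export const preferredRegion = 'sfo1';

export async function GET(): Promise<Response> {
  // A real handler would query the nearby database here.
  return Response.json({ region: process.env.VERCEL_REGION ?? 'unknown' });
}
```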
“Compute” is an encompassing term used to describe the actions taken by a computer. In the context of web development and Vercel, we use compute to describe actions such as (but not limited to) building and rendering, the essential operations needed to turn your code into a site that appears for users. It's often used to describe the work that functions do to create dynamic content on your page.
Serverless Functions usually execute in one specified region (although this can be configured), and allow you to write small chunks of code that provide additional functionality in your application, such as handling authentication, streaming data, and making database queries.
When a user makes a request to your site, a serverless function will run on-demand, without you needing to manage the infrastructure, provision servers, or upgrade hardware.
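As a concrete illustration, here is a minimal serverless function written as a Next.js Route Handler; the route path and response shape are assumptions for the sake of the example:

```typescript
// app/api/hello/route.ts — runs on demand for each request, with no server to manage.
export async function GET(request: Request): Promise<Response> {
  const name = new URL(request.url).searchParams.get('name') ?? 'world';
  return Response.json({ greeting: `Hello, ${name}!` });
}
```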
On Vercel, each incoming request to a serverless function triggers a new invocation.
If a request is received shortly after a function is executed, Vercel optimizes performance by reusing that function for the subsequent invocation. Over time, only as many functions as necessary are kept active to accommodate incoming traffic.
In the absence of additional incoming traffic, functions on Vercel will scale down to zero.
A cold boot refers to a function starting from scratch. In contrast, a warm boot implies reusing a function, in which the underlying container that hosts it does not get discarded. State, such as temporary files, memory caches, and sub-processes, is preserved. This empowers the developer not just to minimize the time spent in the booting process, but also to take advantage of caching data (in memory or on the filesystem) and memoizing expensive computations.
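A minimal sketch of exploiting warm boots: module-scope state survives when the container is reused, so expensive results can be memoized across invocations (the `expensiveComputation` helper and route path are hypothetical):

```typescript
// app/api/report/route.ts
// Module-scope state persists across warm invocations of the same container,
// but is lost whenever a cold boot creates a fresh container.
const cache = new Map<string, number>();

function expensiveComputation(key: string): number {
  // Stand-in for work you would rather not repeat on every invocation.
  let total = 0;
  for (let i = 0; i < 1_000_000; i++) total += i % key.length || 1;
  return total;
}

export async function GET(request: Request): Promise<Response> {
  const key = new URL(request.url).searchParams.get('key') ?? 'default';
  if (!cache.has(key)) {
    cache.set(key, expensiveComputation(key));
  }
  return Response.json({ key, value: cache.get(key) });
}
```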
It's crucial to note that functions must not leave tasks running after the response is returned, even while warm. If a sub-process is still running by the time the response is returned, the entire container is frozen. When a new invocation happens and the container is reused, it is unfrozen, which allows sub-processes to continue running.
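A hedged sketch of what that implies in practice: `await` any background work before returning the response, since a fire-and-forget promise may be frozen along with the container (the analytics helper and endpoint below are placeholders):

```typescript
// app/api/log/route.ts
async function recordAnalytics(event: string): Promise<void> {
  // Placeholder for a call to a hypothetical logging endpoint.
  await fetch('https://example.com/collect', {
    method: 'POST',
    body: JSON.stringify({ event }),
  });
}

export async function POST(): Promise<Response> {
  // Risky pattern: calling recordAnalytics('visit') without await could leave
  // the request frozen mid-flight once the response below is sent.
  // Awaiting it keeps the work inside the request lifecycle.
  await recordAnalytics('visit');
  return new Response('ok');
}
```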
For an advanced configuration, you can create a `vercel.json` file to use Runtimes and other customizations. To learn more about the properties you can customize, see Configuring Functions and Project config with vercel.json.
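For instance, the `functions` property can assign a community Runtime to a specific file; this is only a sketch, and the runtime package name and version shown are assumptions to adapt to your project:

```json
{
  "functions": {
    "api/test.php": {
      "runtime": "vercel-php@0.6.0"
    }
  }
}
```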
If your use case requires that you work asynchronously with the results of a function invocation, you may need to consider a queuing, pooling, or streaming approach because of how serverless functions are created on Vercel.
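Of these, streaming is the simplest to show inside a single handler: the function returns a ReadableStream so partial results reach the client while the invocation is still running (the chunked payload and route path are purely illustrative):

```typescript
// app/api/stream/route.ts
export async function GET(): Promise<Response> {
  const encoder = new TextEncoder();
  const stream = new ReadableStream<Uint8Array>({
    async start(controller) {
      // Emit partial results as they become available instead of waiting
      // for the whole computation to finish.
      for (const chunk of ['partial ', 'results ', 'arrive ', 'early']) {
        controller.enqueue(encoder.encode(chunk));
        await new Promise((resolve) => setTimeout(resolve, 100));
      }
      controller.close();
    },
  });
  return new Response(stream, {
    headers: { 'Content-Type': 'text/plain; charset=utf-8' },
  });
}
```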
The following suggestions will help you ensure optimal performance of your Vercel Functions:
- Choose the correct region for your functions: All customers can change the default region for their functions in their project settings. Choose a region that's closest to your data source for optimal performance. See Functions and your data source for more information
- Choose smaller dependencies inside your functions: Cold start times are correlated with function size, which often comes mostly from external dependencies. If you have large dependencies, parsing and evaluating JavaScript code can take 3-5 seconds or longer. Review your bundle and try to eliminate larger dependencies using a bundle analyzer
- Use proper caching headers: Function responses can be cached using Cache-Control headers. This helps ensure optimal performance for repeat visitors, and Vercel's Edge cache even supports stale-while-revalidate headers. Note that cache misses will still need to request data from your origin (e.g. a database) rather than being served directly from the faster Edge cache; see the example below this list
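A minimal sketch of that caching suggestion, with a hypothetical route and header values that are assumptions rather than recommendations:

```typescript
// app/api/products/route.ts
export async function GET(): Promise<Response> {
  // A real handler would fetch this from the origin (e.g. a database).
  const products = [{ id: 1, name: 'example' }];
  return Response.json(products, {
    headers: {
      // Serve from Vercel's Edge cache for 60 seconds, then allow stale
      // responses for up to 5 minutes while revalidating in the background.
      'Cache-Control': 's-maxage=60, stale-while-revalidate=300',
    },
  });
}
```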
For more information see How can I improve serverless function cold start performance on Vercel?
Sometimes, you need to place extra code files, such as `utils.js` or `my-types.d.ts`, inside the `/api` folder. To avoid turning these files into functions, Vercel ignores files that follow these naming conventions:
- Files that start with an underscore (`_`)
- Files that start with a period (`.`)
- Files that end with `.d.ts`
If your file uses any of the above, it will not be turned into a function.
In order to optimize resources, Vercel uses a process to bundle as many routes as possible into a single Serverless Function.
To provide more control over the bundling process, you can use the `functions` property in your `vercel.json` file to define the configuration for a route. If a configuration is present, Vercel will bundle functions based on the configuration first. Vercel will then bundle together the remaining routes, optimizing for how many functions are created.
This bundling process is currently only enabled for Next.js, but it will be enabled in other scenarios in the future.
In the following example, `app/api/hello/route.ts` will be bundled separately from `app/api/another/route.ts` since each has a different configuration:
```json
{
  "functions": {
    "app/api/hello/route.ts": {
      "memory": 3009,
      "maxDuration": 60
    },
    "app/api/another/route.ts": {
      "memory": 1024,
      "maxDuration": 30
    }
  }
}
```