
How to Manage Sudden Traffic Surges and Server Overload

Frances Fedoriska
Published July 14, 2021

For many retailers, the COVID‑19 pandemic changed where and how their customers shop for the holidays. Locked out of brick-and-mortar stores, shoppers flocked to online retailers more than ever before, spiking e‑commerce sales to $189 billion in 2020, a 33% increase over 2019. With fewer than six months standing between retailers and this year’s holiday shoppers, we’re revealing several ways you can leverage NGINX to prepare for sudden traffic spikes.

Step 1: Know Where You Stand Against Your Competition

Before making changes to your stack, it helps to know just how much you need to maximize your “Cyber Season” performance to thrive instead of just survive. Start by collecting data about the performance of competing websites. Time to load a page is probably the most important metric – many of today’s impatient users abandon a site if it doesn’t load within three seconds. There are many tools available for measuring load time, lots of them free. An easy place to start looking is a recent Geekflare review of 11 testing tools.

Consider measuring load time for these pages:

  • Home page
  • Product search results
  • Product page details
  • Confirmation page after you hit the “Buy” button

Testing key pages for half a dozen competitors along with your own site takes just a few hours. (Be sure to bypass your browser cache, for example with Shift+Refresh, before measuring download times.) Armed with results, this is what you do next:

  • Index total performance. How long does the “soup to nuts” process (visit, search, and purchase) take on different sites?
  • Identify strengths and weaknesses. Find the specific areas where your site’s performance is ahead, competitive, or lagging.
  • Note feature differences. How do competitors add value to the shopping experience through extra features? What does your site offer to differentiate the user experience?
  • Create a plan. If you’re well behind competitors in one or more areas of site performance, aim to meet their average response times; if you’re already competitive, work to become #1.

Step 2: Run NGINX, Like the World’s Busiest Websites

Earlier this year NGINX became the #1 web server on the Internet. We’re honored so many sites trust us to deliver their websites and apps, and hope you will too. But NGINX is more than just a web server. It’s an all-in-one software reverse proxy, load balancer, cache, and API gateway.

One of NGINX’s most important benefits is how it optimizes the flow of traffic into your site. Think of NGINX as a doorkeeper, managing traffic at the front of your store. It gently queues up and admits each shopper (HTTP request), transforming the chaotic scrum on the sidewalk into a smooth, orderly procession in the store. Shoppers are directed to the specific location of items on their wish lists, ensuring that traffic is distributed evenly and all resources are equally used.

NGINX primarily employs two out-of-the-box techniques to achieve this:

  • HTTP offload using keepalive connections, to buffer HTTP requests that arrive slowly from clients and forward them to backend servers only once they are complete. Transactions complete much more quickly when they originate from NGINX (on the fast local network) than when they originate from a distant client.
  • Sophisticated load balancing with multiple algorithms to choose from, to optimize traffic distribution and use server resources as efficiently as possible.
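
Both techniques take only a few lines of configuration. Here’s a minimal sketch, with a hypothetical upstream name and hostnames: the keepalive directive maintains a pool of idle connections to the backends, and least_conn is one of several load‑balancing algorithms you can choose from.

    # Minimal sketch -- the upstream name and hostnames are hypothetical.
    upstream shop_backend {
        least_conn;                        # send each request to the server with the fewest active connections
        server app1.internal.example:8080;
        server app2.internal.example:8080;
        keepalive 32;                      # keep up to 32 idle connections open to the backends
    }

    server {
        listen 80;

        location / {
            proxy_pass http://shop_backend;
            proxy_http_version 1.1;          # HTTP/1.1 is required for upstream keepalive connections
            proxy_set_header Connection "";  # clear the Connection header so keepalives are reused
        }
    }

Swapping in another algorithm, such as ip_hash or the default Round Robin, is a one‑directive change.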

In addition to optimizing traffic flow, here are four more ways you can leverage NGINX to optimize your site and prevent server overload during a traffic surge.

Read on for details on:

  • Improving Web Page Response Times with Caching
  • Managing Visitor Traffic with Connection, Rate, and Bandwidth Controls
  • Avoiding Server Overload with Elastic Scaling
  • Protecting Customer Data with Built-In Security

Improving Web Page Response Times with Caching

Click-and-collect (online ordering for in‑store pickup) and flexible customer payment options increase the likelihood of a successful transaction. Content caching with NGINX has a similar effect for web traffic. With caching enabled, NGINX stores the responses it sends to clients and serves subsequent requests for the same resource directly from the cache. Caching not only gets responses to users faster, it reduces the load on your upstream servers because they don’t have to process the same requests over and over from scratch. Depending on your application, content caching can reduce the volume of internal traffic by a factor of up to 100, reducing the hardware capacity needed to serve your app.
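
As a rough illustration, here is a minimal caching sketch; the cache path, shared‑memory zone name, and upstream name are hypothetical, and the posts linked below cover tuning in much more depth.

    # Minimal sketch -- path, zone name, and upstream are hypothetical.
    proxy_cache_path /var/cache/nginx keys_zone=shop_cache:10m max_size=1g inactive=60m;

    server {
        listen 80;

        location / {
            proxy_cache shop_cache;
            proxy_cache_valid 200 301 10m;                     # cache successful responses for 10 minutes
            proxy_cache_use_stale error timeout updating;      # serve stale content if the backend struggles
            add_header X-Cache-Status $upstream_cache_status;  # expose HIT/MISS for easier debugging
            proxy_pass http://shop_backend;
        }
    }

Even microcaching, where dynamic responses are cached for just a second or two, can absorb a flood of identical requests during a flash sale.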

For more details about caching and sample configurations, see Cache and Microcache Your Site Content and A Guide to Caching with NGINX and NGINX Plus on our blog.

Managing Visitor Traffic with Connection, Rate, and Bandwidth Controls

At the busiest times, the doorkeeper for your store might need to limit the number of shoppers coming in. This might be for safety reasons (avoiding overcrowding) or preferential treatment of valued customers (VIP hours, invitation‑only promotions, and so on). Web apps need to take similar measures. You can prevent server overload by limiting the amount of traffic entering your site, ensuring that clients get timely access to the required resources. NGINX (and in container environments, NGINX Ingress Controller) offer a range of methods for limiting incoming traffic, including:

  • Concurrency limits – Restrict the number of concurrent requests forwarded to each server, to match the limited number of worker threads or processes in each
  • Request rate limits – Apply a per‑second or per‑minute restriction on requests from each client, which prevents server overload for services such as a payment gateway or complex search (for details, see Rate Limiting with NGINX and NGINX Plus on our blog)
  • Bandwidth limits – Control the amount of data a client can download on each connection

You can differentiate between different types of clients if necessary. Perhaps the delivery area for your store does not extend to Asia, or you want to prioritize users who have items in their shopping baskets. You can leverage cookies, geolocation data, and other parameters to control how NGINX applies traffic limits.
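
For example, here’s a minimal sketch combining rate, concurrency, and bandwidth limits; the zone names, locations, and limit values are hypothetical and need tuning for your own traffic patterns.

    # Minimal sketch -- zone names, locations, and limit values are hypothetical.
    limit_req_zone  $binary_remote_addr zone=per_ip:10m rate=10r/s;   # 10 requests per second per client IP
    limit_conn_zone $binary_remote_addr zone=conn_per_ip:10m;

    server {
        listen 80;

        location /search/ {
            limit_req zone=per_ip burst=20 nodelay;   # absorb short bursts, reject sustained excess
            proxy_pass http://shop_backend;
        }

        location /downloads/ {
            limit_conn conn_per_ip 5;                 # at most 5 concurrent connections per client
            limit_rate 500k;                          # cap each connection at 500 KB per second
            proxy_pass http://shop_backend;
        }
    }

The geo and map modules can feed these directives, so, for instance, clients outside your delivery area get a stricter rate limit than signed‑in shoppers with items in their baskets.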

Avoiding Server Overload with Elastic Scaling

Traffic surges can easily cause server overload if you don’t have adequate infrastructure in place. NGINX’s lightweight, event‑driven architecture maximizes app delivery performance with the infrastructure you already have. Our sizing guides for NGINX Ingress Controller and for NGINX Plus on bare metal and virtualized environments help you determine accurate operating expenses for the performance and scale you are preparing for.

There are additional NGINX features DevOps teams can leverage to effectively scale for traffic spikes:

  • Deploy NGINX and your apps in cloud environments. NGINX is available in the marketplaces of major cloud environments like AWS, Google Cloud Platform, and Microsoft Azure. Each cloud provider supports autoscaling to adjust the number of app instances in response to changes in demand. For more information, see the autoscaling documentation on AWS, GCP, and Azure.
  • Deploy containers in a Kubernetes environment. NGINX Ingress Controller and NGINX Service Mesh include multiple features that boost the resilience of Kubernetes apps. You can scale your application pods horizontally based on user demand, with almost no added latency for real‑time users.
  • Use the NGINX Plus API to dynamically scale the backend servers being load balanced by NGINX Plus.
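
As a sketch of that last point, the configuration below (NGINX Plus only; names, addresses, and networks are hypothetical) enables the read/write API and gives the upstream group a shared‑memory zone so its membership can be changed at run time without a reload.

    # Minimal NGINX Plus sketch -- names, addresses, and networks are hypothetical.
    upstream shop_backend {
        zone shop_backend 64k;             # shared-memory zone, required for run-time changes
        server app1.internal.example:8080;
    }

    server {
        listen 8080;

        location /api {
            api write=on;                  # enable the read/write NGINX Plus API
            allow 10.0.0.0/8;              # restrict API access to trusted networks
            deny all;
        }
    }

    # A new backend can then be added on the fly, for example:
    #   curl -X POST -d '{"server":"app3.internal.example:8080"}' \
    #        http://localhost:8080/api/7/http/upstreams/shop_backend/servers
    # (the version segment in the URL depends on your NGINX Plus release)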

Protecting Customer Data with Built-In Security

Making transactions secure is table stakes for any website, but especially for online retailers who handle credit card information. As unlucky retailers such as Target know firsthand, a breach can mean a tarnished brand and lawsuits. Target addressed its 2013 data breach with enhancements that improved visibility and tightened security. Stand out from the competition by offering one of the most secure shopping experiences on the market, using NGINX’s built-in security capabilities such as SSL/TLS termination and web application firewall (WAF) protection with NGINX App Protect.
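
Terminating TLS at NGINX is one of the basic building blocks; here’s a minimal sketch, with a hypothetical server name and certificate paths.

    # Minimal TLS termination sketch -- server name and certificate paths are hypothetical.
    server {
        listen 443 ssl;
        server_name shop.example.com;

        ssl_certificate     /etc/nginx/ssl/shop.example.com.crt;
        ssl_certificate_key /etc/nginx/ssl/shop.example.com.key;
        ssl_protocols       TLSv1.2 TLSv1.3;   # disable older, less secure protocol versions

        location / {
            proxy_pass http://shop_backend;
        }
    }

    # Redirect all plain-HTTP traffic to HTTPS.
    server {
        listen 80;
        server_name shop.example.com;
        return 301 https://$host$request_uri;
    }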

Now You’re Ready

Arming your web properties with our enterprise-grade solutions now means you can rest assured you’re ready for whatever the year‑end shopping season has in store (or online!) in 2021.

Free 30-day trials are available for all of our commercial solutions, including NGINX Plus and NGINX App Protect.

Or get started with our free and open source offerings, such as NGINX Open Source and NGINX Unit.

This blog includes contributions from Owen Garrett and Floyd Smith.


"This blog post may reference products that are no longer available and/or no longer supported. For the most current information about available F5 NGINX products and solutions, explore our NGINX product family. NGINX is now part of F5. All previous NGINX.com links will redirect to similar NGINX content on F5.com."