How We Helped a Magento Store Quickly Handle Thousands of Orders

One of our ecommerce clients was recently featured on NBC’s Today Show in the segment “Jill’s Steals and Deals”. The segment highlights a few products at very aggressive discounts and drives tens of thousands of visitors to the featured stores in just a few minutes.

This particular client runs their store on Magento, which presents unique challenges during a high traffic event like this: Magento consumes far more resources (CPU and memory) than most other ecommerce solutions.

We’ve had other clients featured in events like this. The easiest to manage are websites running ShopSite, which scales really well, handling thousands of orders without consuming many resources. That is not the case with Magento.

So, we had our work cut out for us…

What we started with

The client already had a managed dedicated server with us, so that was a good start. We already had APC enabled and tuned; APC caches compiled PHP opcode, which lowers server load and speeds up page loads.
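An APC setup of this kind is typically a small ini file. The values below are a sketch, not the client’s actual configuration — the shared memory size in particular should be sized to the site’s codebase:

```ini
; /etc/php.d/apc.ini -- illustrative values only
extension = apc.so
apc.enabled = 1
apc.shm_size = 256M      ; shared memory for the opcode cache
apc.stat = 0             ; skip per-request file-mtime checks (flush cache on deploys)
apc.num_files_hint = 10000
```

With `apc.stat = 0`, APC never re-checks files on disk, so the cache must be cleared whenever code is deployed.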

The server had 8 GB of RAM. The store was running Magento Community edition, and had a number of custom modules and a mobile site as well.

What we changed

We’ve done other events like this with Magento, and we’ve learned a lot over the years. We put that knowledge to good use…

  • We increased the RAM from 8 GB to 32 GB to handle the spike in traffic.
  • We enabled a Content Delivery Network (CDN) in Magento itself to offload many static files from the server.
  • We installed a Full Page Cache Module for Magento so page loads would be fast and use very few resources.
  • We tweaked the Apache webserver to handle more traffic, and adjusted KeepAlives for optimal resource management.
  • The “splash page” that NBC would link to from the Today Show website was hosted 100% on the CDN. This is key so the initial surge never hits the server: every image, CSS file, JS file, and even the page itself must use the CDN URL so that no resources load from the origin server.
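In Magento Community Edition 1.x, enabling the CDN means pointing the media, skin, and JS base URLs at the CDN hostname (System &gt; Configuration &gt; General &gt; Web in the admin). As a sketch, the equivalent database change looks like this, with `cdn.example.com` as a hypothetical hostname — these rows exist in `core_config_data` once the values have been saved at least once in the admin:

```sql
-- Hypothetical CDN hostname; adjust to your own
UPDATE core_config_data SET value = 'http://cdn.example.com/media/'
 WHERE path = 'web/unsecure/base_media_url';
UPDATE core_config_data SET value = 'http://cdn.example.com/skin/'
 WHERE path = 'web/unsecure/base_skin_url';
UPDATE core_config_data SET value = 'http://cdn.example.com/js/'
 WHERE path = 'web/unsecure/base_js_url';
```

After changing these, the Magento configuration cache must be refreshed for the new URLs to take effect.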

We opted not to add an additional server just for MySQL; budget and time constraints made this option unavailable. Ideally, though, this dual-server setup would be the best way to handle high levels of traffic.

Side note: the idea of “putting the site on a cloud solution” often comes up around events like these. In our experience it’s not a great idea, as cloud hosting has major issues with I/O load (disk speed). Sure, you can spin up additional resources or add containers, but that adds complexity and custom programming to handle effectively. A properly configured dedicated server will outperform cloud hosting in most cases.

The morning of the event arrives

At about 8:48 AM ET, the segment aired on television. Just a minute later, we saw the CDN light up as the Today Show website was updated with the order links.

Twenty to thirty seconds after that, items were being placed in the cart. Thousands of visitors hit the site and cart, and we watched Apache handle 500-600 simultaneous connections. The load on the server shot up quite high (greater than 150) right away. Even with this high load, the server was still very responsive and doing well. Memory consumption was only around 20 GB, so no issues there.

However, MySQL was set to a maximum of 400 connections. That limit was quickly reached, at which point things started to back up and the load sky-rocketed to over 280. We raised MySQL to 600 connections, and with Apache already set for 800 maximum simultaneous clients, we were back in business after just a one-to-two-minute slowdown.
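A connection-limit change like this can be applied to a running MySQL server without a restart, roughly as follows (the value shown matches the figure above; your ceiling depends on available RAM per connection):

```sql
-- Raise the ceiling immediately on the live server
SET GLOBAL max_connections = 600;

-- Persist it in /etc/my.cnf so it survives a restart:
--   [mysqld]
--   max_connections = 600
```

`SET GLOBAL` only affects the running instance, which is why the same value also needs to go into `my.cnf`.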

How much traffic was there?

In the first 45 minutes, over 10 GB of data was transferred out from the server and CDN combined. That is quite a lot. The site saw over 100,000 page views in just a few short hours.

(Chart: hourly traffic during the event)

In terms of orders, we saw a peak rate of 7 orders per second! The site received thousands of orders on the day the show aired. It was quite impressive to see this level of traffic and orders streaming in without any major issues.

The event was a success!

Key takeaways

The more RAM, the better

Having 32 GB of RAM for this event gave us plenty of room to handle the traffic spike. This seemed to be a good number for Magento.

Proper use of a CDN is crucial

Enabling the CDN in Magento is important, but constructing the initial splash page to run entirely on the CDN is the key to keeping the entry page up 100% of the time. This frees the server to handle the cart and checkout without being burdened by the thousands of visitors hitting the initial linked page.
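In practice, a fully-CDN splash page means every asset reference on the page is an absolute CDN URL, and only the “shop” link points back at the store. A minimal sketch, with `cdn.example.com` and `www.example-store.com` as hypothetical hostnames:

```html
<!-- Every asset loads from the CDN, not the origin server -->
<link rel="stylesheet" href="http://cdn.example.com/css/splash.css">
<script src="http://cdn.example.com/js/splash.js"></script>
<img src="http://cdn.example.com/images/deal.jpg" alt="Today's deal">

<!-- The only request that reaches the origin is the click-through -->
<a href="http://www.example-store.com/checkout/cart/">Shop Now</a>
```

The page itself is also served from the CDN, so the origin server sees zero traffic until a visitor actually clicks through to shop.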

Server config must be tuned and monitored

Having Apache preset to handle 700-800 maximum children, having MySQL set to allow 500 or more concurrent connections, and allowing the load on the server to spike above 200 are all needed to avoid bottlenecks or failing services.
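On the Apache side (2.2-era prefork MPM, which is what a typical dedicated LAMP server of this vintage runs), that tuning might look like the following sketch — the exact numbers depend on how much RAM each child consumes:

```apache
# Illustrative values in the range described above
<IfModule prefork.c>
    ServerLimit          800
    MaxClients           800
    MaxRequestsPerChild  4000
</IfModule>

# Short KeepAlive so idle browsers don't pin worker slots during the surge
KeepAlive            On
KeepAliveTimeout     2
MaxKeepAliveRequests 100
```

The short `KeepAliveTimeout` is the trade-off mentioned earlier: persistent connections speed up repeat requests, but each open connection occupies a child, so during a spike they must be released quickly.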

Along with this, your host must be vigilant and able to adjust parameters as needed to keep things running as smoothly as possible. Expect a slowdown when the initial surge hits.

If possible, split MySQL on to its own server

When heavy traffic hits a Magento site, the web server and database server compete for CPU resources. The load rises and everything slows down.

Ideally, having MySQL on a separate server connected via a Gigabit private network would be the best solution, allowing each piece of software to use the full resources of its own machine.
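In Magento CE 1.x, pointing the store at a remote database is a one-line change in `app/etc/local.xml` (under `<global><resources>`). A sketch, with `192.168.10.2` as a hypothetical private-network address and placeholder credentials:

```xml
<!-- app/etc/local.xml fragment: connect over the private network -->
<default_setup>
    <connection>
        <host><![CDATA[192.168.10.2]]></host>
        <username><![CDATA[magento]]></username>
        <password><![CDATA[secret]]></password>
        <dbname><![CDATA[magento]]></dbname>
        <active>1</active>
    </connection>
</default_setup>
```

Keeping that traffic on a private interface also means MySQL never has to be exposed on the public network.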

Planning and being able to react quickly are key

Most of all, make sure you have a plan beforehand that your host understands and supports. Your host must also be able to react when issues arise to keep things running.

If your host isn’t acting as your partner for a high traffic event, it’s probably time to find a new host.

Looking for a web host that understands ecommerce and business hosting?
Check us out today!
