How to evaluate the performance of a web server?

Time: 2022-11-25 11:26:04

I'm planning to deploy a Django-powered site, but I feel confused about the choice of web servers, which includes Apache, lighttpd, nginx and others.

I've read some articles about the performance of each of these choices, but it seems no one agrees. So I'm wondering: why not test the performance myself?

I can't find information about the best approach to performance testing web servers. So my questions are:

  1. Is there any easy approach to testing the performance without the production site?

  2. Or is there a way to simulate heavy traffic so that the test is fair?

  3. How can I keep my test fair and close to the production situation?

After the test, I want to figure out:

  1. Why some people say nginx performs better when serving static files.

  2. The CPU and memory needs of each web server.

  3. Which one is my best choice.

3 Solutions

#1 (3 votes)

Tools like ab are commonly used to test how much load you can take from a barrage of requests at once; alongside Cacti, Munin, or your system-monitoring tool of choice, you can generate data on system load and requests per second. The problem with this is that many people who benchmark don't realise they need to make a lot of different kinds of requests, since different parts of your code take varying amounts of time to execute. Profiling and benchmarking the code, and not just the requests, is also important; plenty of folk have already done so for Django, and benchrun is not a bad tool either.
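
To illustrate the point about mixed requests, here is a minimal sketch using only the Python standard library; the host and paths are placeholders for whatever your test deployment exposes, not anything from a real site. It fires concurrent requests at several different URLs and reports per-path latency plus an overall requests-per-second figure, roughly the kind of number ab would give you for the whole mix.

```python
import time
from concurrent.futures import ThreadPoolExecutor
from statistics import mean
from urllib.request import urlopen

# Placeholder test deployment and a mix of different request types,
# so that slow code paths are not hidden behind one cheap URL.
BASE = "http://127.0.0.1:8000"
PATHS = ["/", "/blog/", "/blog/some-post/", "/static/css/site.css"]
REQUESTS_PER_PATH = 50
CONCURRENCY = 10

def timed_get(path):
    """Fetch one URL and return (path, elapsed seconds)."""
    start = time.perf_counter()
    with urlopen(BASE + path) as resp:
        resp.read()
    return path, time.perf_counter() - start

start = time.perf_counter()
with ThreadPoolExecutor(max_workers=CONCURRENCY) as pool:
    results = list(pool.map(timed_get, PATHS * REQUESTS_PER_PATH))
elapsed = time.perf_counter() - start

# Per-path latency shows which code paths are slow; the aggregate
# figure is the overall throughput for this particular request mix.
for path in PATHS:
    times = [t for p, t in results if p == path]
    print(f"{path:30s} mean {mean(times) * 1000:6.1f} ms  max {max(times) * 1000:6.1f} ms")
print(f"overall: {len(results) / elapsed:.1f} requests/sec")
```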

The other issue is how many HTTP requests each page view takes. Keeping the number of requests down, and processing them quickly, is the key to a website that can sustain a high amount of traffic: the quicker you can finish and close connections, the quicker you can allocate resources to new ones.
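
To get a quick feel for that number, something like the sketch below (standard library only, with a placeholder URL) counts the extra asset requests a single page view implies by scanning the returned HTML for scripts, stylesheets and images.

```python
from html.parser import HTMLParser
from urllib.request import urlopen

PAGE = "http://127.0.0.1:8000/"  # placeholder test page

class AssetCounter(HTMLParser):
    """Collect the sub-requests implied by a page: scripts, stylesheets, images."""
    def __init__(self):
        super().__init__()
        self.assets = []

    def handle_starttag(self, tag, attrs):
        attrs = dict(attrs)
        if tag in ("script", "img") and attrs.get("src"):
            self.assets.append(attrs["src"])
        elif tag == "link" and attrs.get("rel") == "stylesheet" and attrs.get("href"):
            self.assets.append(attrs["href"])

with urlopen(PAGE) as resp:
    html = resp.read().decode(resp.headers.get_content_charset() or "utf-8")

counter = AssetCounter()
counter.feed(html)
print(f"1 page view = 1 HTML request + {len(counter.assets)} asset requests")
```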

In terms of the general speed of web servers, it goes without saying that a proxy server (running in reverse at your end) will always perform faster than the web server serving static content itself. As for Apache vs nginx with regard to your Django app, it seems that mod_python is indeed faster than nginx/lighty + FastCGI, but that's no surprise, because CGI, regardless of any speed-ups, is still slow. Executing and caching code at the web server and letting it manage it is always faster (mod_perl vs CGI, mod_php vs CGI, etc.) if you do it right.

#2 (2 votes)

Apache JMeter is an excellent tool for stress-testing web applications. It can be used with any web server, not just Apache.

#3 (1 vote)

You need to set up the web server + website of your choice on a machine somewhere, preferably a physical machine with hardware specs similar to those of the one you will eventually be deploying to.

You then need to use a load-testing framework, for example The Grinder (free), to simulate many users hitting your site at the same time.
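
This is not a Grinder script, but as a rough stand-in the sketch below uses only the Python standard library to simulate a number of concurrent users, each clicking through a short session with think time between page views; the host, paths, user count and duration are all placeholder values.

```python
import random
import threading
import time
from urllib.request import urlopen

BASE = "http://127.0.0.1:8000"                 # placeholder test deployment
SESSION = ["/", "/blog/", "/blog/some-post/"]  # pages one "user" clicks through
USERS = 25                                     # simulated concurrent users
DURATION = 30                                  # seconds to keep the load running

completed = []                                 # per-request latencies (list.append is thread-safe in CPython)
stop_at = time.monotonic() + DURATION

def user():
    """One simulated user: loop through a short session with think time."""
    while time.monotonic() < stop_at:
        for path in SESSION:
            start = time.perf_counter()
            with urlopen(BASE + path) as resp:
                resp.read()
            completed.append(time.perf_counter() - start)
            time.sleep(random.uniform(0.5, 2.0))   # think time between clicks

threads = [threading.Thread(target=user) for _ in range(USERS)]
for t in threads:
    t.start()
for t in threads:
    t.join()

if completed:
    print(f"{len(completed)} page views in {DURATION}s "
          f"({len(completed) / DURATION:.1f}/s), "
          f"mean latency {sum(completed) / len(completed) * 1000:.0f} ms")
```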

The load testing framework should be on separate machine(s) and you should monitor the network and CPU usage of those machines as well to make sure that the limiting factor of your testing is in fact the web server and not your load injectors.
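
A simple way to check that is a small sampling loop on each injector machine (and on the web server) while the test runs; the sketch below assumes the third-party psutil package is installed and just prints CPU, memory and network throughput once per second.

```python
import psutil  # third-party: pip install psutil

# Run this alongside the test to confirm the load injectors themselves
# are not the bottleneck (CPU pegged, network saturated, etc.).
prev = psutil.net_io_counters()
while True:
    cpu = psutil.cpu_percent(interval=1.0)        # measured over ~1 second
    mem = psutil.virtual_memory().percent
    now = psutil.net_io_counters()
    sent_kib = (now.bytes_sent - prev.bytes_sent) / 1024.0
    recv_kib = (now.bytes_recv - prev.bytes_recv) / 1024.0
    prev = now
    print(f"cpu {cpu:5.1f}%  mem {mem:5.1f}%  "
          f"net {sent_kib:8.1f} KiB/s out  {recv_kib:8.1f} KiB/s in")
```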

Other than that, it's just a matter of altering the content and monitoring response times, throughput, memory and CPU use, etc., to see how they change depending on which web server you use and what sort of content you are hosting.
