This is a read-only archive. Find the latest Linux articles, documentation, and answers at the new Linux.com!

Linux.com

Feature: Internet & WWW

Using free software for HTTP load testing

By Leslie P. Polzer on August 12, 2008 (4:00:00 PM)


A good way to see how your Web applications and server will behave under high load is to test them with a simulated load. We tested several free software tools that perform such testing to see which work best for which kinds of sites.

If you leave out the load-testing packages that are no longer maintained, non-free, or fail the installation process in some obscure way, you are left with five candidates: curl-loader, httperf, Siege, Tsung, and Apache JMeter.

Daniel Rubio already covered JMeter in detail, so I will not go into it again here, but I will include it in the evaluation at the end of the article.

curl-loader

The purpose of curl-loader is to "deliver a powerful and flexible open source testing solution as a real alternative to Spirent Avalanche and IXIA IxLoad." It relies on the mature and flexible cURL library to manage requests, authentication, and sessions.

Building the application is straightforward: download, untar, and make the code inside its directory. I had to add an #include <limits.h> to the file ip_secondary.c to make it build, probably due to some recent changes in the glibc headers. You also need to install the OpenSSL libraries and headers to compile and run curl-loader.

Getting started takes a bit more effort. curl-loader's configuration interface is split into two parts. The first part is a configuration file that contains the parameters for a specific scenario. Its simple format consists of newline-delimited variable assignments of the form VAR=VALUE. You can find a bunch of self-explanatory examples in the directory conf-examples in the curl-loader source tree. Because curl-loader can use multiple IP addresses to realistically simulate requests from different clients, you need to adjust the values for INTERFACE, CLIENTS_NUM_MAX, NETMASK, IP_ADDR_MIN, and IP_ADDR_MAX to suit your network environment. The maximum number of parallel clients must, of course, fit within the IP address range you specify.

The second part is the command-line interface of the curl-loader binary. It offers one essential parameter, -f, to which you're expected to pass the location of the scenario file you wish to use. The other arguments let you fine-tune the test run. For example, by default curl-loader will issue its requests from a single thread. While this helps conserve system resources and increase performance, it's advisable to use the -t option to add one thread per additional CPU core you wish to utilize.
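Putting the two halves together, a scenario fragment with the network parameters described above, plus an invocation, might look like the sketch below. All values are illustrative and must be adapted to your own network; a complete scenario file also needs request settings, as the files in conf-examples show.

```shell
# Write a scenario fragment with illustrative network parameters.
# A real scenario file also needs request settings; see the
# conf-examples directory in the source tree for complete files.
cat > scenario.conf <<'EOF'
INTERFACE=eth0
CLIENTS_NUM_MAX=100
NETMASK=24
IP_ADDR_MIN=192.168.1.10
IP_ADDR_MAX=192.168.1.110
EOF

# -f selects the scenario file; -t adds worker threads,
# one per additional CPU core you want to use.
if command -v curl-loader >/dev/null; then
    curl-loader -f scenario.conf -t 3
else
    echo "curl-loader not installed; scenario.conf written for later use"
fi
```

Note that the range 192.168.1.10 through 192.168.1.110 provides 101 addresses, enough to cover the 100 clients requested by CLIENTS_NUM_MAX.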

A screen-oriented result display shows a synopsis of the test results, updated at regular intervals, and more detailed results are available in the log files curl-loader generates. The file with the .log extension contains information about any errors, while the .ctx file shows per-client statistics and the .txt file shows statistics over time.

If you're looking for a utility similar to curl-loader but written in Python, check out Pylot. It comes with a GUI and uses XML as its configuration format.

httperf

httperf is a command-line single-thread load tester developed by Hewlett-Packard Labs.

You set the bulk of httperf's configuration via command-line parameters. Configuration files play an auxiliary role if you wish to specify a session scenario.

A sample invocation for a total of 5,000 connections, each of which should issue 50 requests (250,000 requests in all), looks like this:

httperf --server=localhost --uri=/ --num-conns=5000 --num-calls=50

The first line of output echoes the full invocation, including the arguments that were assigned their default values because you didn't specify them:

httperf --client=0/1 --server=localhost --port=80 --uri=/ \
    --send-buffer=4096 --recv-buffer=16384 \
    --num-conns=5000 --num-calls=50

Unlike curl-loader, httperf doesn't keep you updated about the status of the test run, nor does it write detailed log files. It only shows a summary of the test results at the end of the test run. There's a debugging switch that helps you see what it's currently doing, but it has to be enabled at compile time.

I like the fact that httperf lets you specify all parameters on the command line. This lets you prototype your load test quickly and then put the finished invocation into a shell script. Running different tests sequentially or in parallel is also a no-brainer.
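As an illustration, a small wrapper script can launch several httperf runs in parallel and collect each run's summary in its own file. The hostnames below are placeholders for your own servers.

```shell
#!/bin/sh
# Launch one httperf run per host in the background, then wait for all.
# app1/app2.example.com are hypothetical hosts; adjust to taste.
: > run.log
for host in app1.example.com app2.example.com; do
    # record what we launch, so the test run is reproducible
    echo "httperf --server=$host --uri=/ --num-conns=1000 --num-calls=10" >> run.log
    if command -v httperf >/dev/null; then
        httperf --server="$host" --uri=/ --num-conns=1000 --num-calls=10 > "$host.out" &
    fi
done
wait    # returns once all background runs have finished
```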

httperf could be a little smarter in interpreting its command-line arguments, though. For example, splitting the target URI into server and path parts seems unnecessary, and the path is specified by the --uri option, a misnomer, since a valid URI may contain the server name as well.

Try http_load if you want a simpler tool in the style of httperf.

Siege

Siege is similar to httperf in that it can be configured almost fully with command-line arguments. But Siege uses multiple concurrent threads to send its requests, has fewer low-level options than httperf, and is built to work with a list of URLs. It's also easier to use than httperf because its options are named in a more straightforward way.

A Siege run with default parameters can be as simple as

siege localhost

However, this doesn't tap the full power of Siege, which is to test a list of URLs in a largely unpredictable order, the way a real user would. To gather URLs, Siege's author offers the auxiliary program Sproxy. After installing it, run it like this:

sproxy -v -o urls.txt

Sproxy will keep the terminal open and list all recorded URLs, plus write them to the file urls.txt for Siege.
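You can also skip Sproxy and write the file by hand, since the format Siege expects is simply one URL per line. The paths below are placeholders:

```shell
# urls.txt: one URL per line, exactly what Siege's --file option reads.
cat > urls.txt <<'EOF'
http://localhost/
http://localhost/about
http://localhost/contact
EOF
# siege -v --internet --file=urls.txt    # run once a server is listening
```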

Configure your browser to use localhost:9001 as an HTTP proxy. Then it's time to start browsing your site, thereby letting Sproxy record information about the URLs for Siege.

When you have gathered some URLs you would like to test, start the next test run like this:

siege -v --internet --file=urls.txt

The --internet argument instructs Siege to hit the URLs in a random fashion, like a horde of surfers would.

Tsung

Tsung works much like Siege with a URL file, but it offers more elaborate features, such as random user agents, session simulation, and dynamic data feedback. It also performs better, thanks to Erlang's lightweight green threads.

This comes at a price, though: Tsung doesn't offer the ad-hoc command-line invocation we know from Siege, curl-loader, and httperf. You must either manually create a scenario file in ~/.tsung/tsung.xml or use the recording mode of Tsung, which works like Siege's Sproxy:

  1. Start the Tsung recording proxy with tsung recorder, then visit the target URLs with localhost:8090 as proxy server.
  2. Open the newly created session record in ~/.tsung and edit the details of your scenario.
  3. Save it as ~/.tsung/tsung.xml.
  4. Use tsung start to start the test.

Step two, the crafting of the configuration file, is the most difficult part. If you want to use the advanced features of Tsung you will definitely need to get acquainted with its format.
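For orientation, a minimal tsung.xml for a plain HTTP test might look roughly like the sketch below. The overall structure follows the Tsung documentation, but the DTD path and all values are illustrative and vary by installation.

```xml
<?xml version="1.0"?>
<!-- the DTD location varies by distribution -->
<!DOCTYPE tsung SYSTEM "/usr/share/tsung/tsung-1.0.dtd">
<tsung loglevel="notice">
  <clients>
    <client host="localhost" use_controller_vm="true"/>
  </clients>
  <servers>
    <server host="localhost" port="80" type="tcp"/>
  </servers>
  <load>
    <!-- start five new users per second for one minute -->
    <arrivalphase phase="1" duration="1" unit="minute">
      <users arrival_rate="5" unit="second"/>
    </arrivalphase>
  </load>
  <sessions>
    <session name="front-page" probability="100" type="ts_http">
      <request><http url="/" method="GET"/></request>
    </session>
  </sessions>
</tsung>
```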

As a bonus, Tsung can also put PostgreSQL and Jabber servers under load.

Conclusion

Each of these tools has its advantages and disadvantages. All are documented well, offer a painless installation, and run reliably.

Here's a possible decision path for finding the right tool for your job: start out simple with Siege and see whether its modest feature set is enough. If not, try httperf, which has a slightly larger feature set and runs faster. If you need to set up more complex scenarios, move on to curl-loader and Tsung, which have the largest feature sets and the best performance, though Tsung in particular takes time to get used to.

Apache JMeter is the only GUI-based application in the crowd. Its feature set is pretty impressive, offering some unique things like content pre- and postprocessors. You should give it a try if you prefer GUI apps.

Leslie P. Polzer is an independent professional specializing in the development of dynamic Web sites.


Comments


Using free software for HTTP load testing

Posted by: Anonymous [ip: 66.151.227.20] on August 12, 2008 04:54 PM
also check out Pylot www.pylot.org


Using free software for HTTP load testing

Posted by: Anonymous [ip: 24.34.95.112] on August 13, 2008 08:12 AM
No standard `ab` ?


Using free software for HTTP load testing

Posted by: Anonymous [ip: 78.86.116.238] on August 13, 2008 02:08 PM
Or you could use "The Grinder" which is far more flexible than Apache JMeter, and supports clustered slaves for intensive load testing.

Which you forgot to try...



And where's Apache Benchmark?

Posted by: Anonymous [ip: 201.76.153.10] on August 13, 2008 02:40 PM
I can't believe you've missed "ab"...
http://httpd.apache.org/docs/2.0/programs/ab.html


Using free software for HTTP load testing

Posted by: Anonymous [ip: 66.151.227.20] on August 13, 2008 02:57 PM
also include OpenSTA, IMO the best tool out there.


Using free software for HTTP load testing

Posted by: Anonymous [ip: 157.166.167.129] on August 13, 2008 03:11 PM
I found that Microsoft’s Web Application Stress (WAS) tool worked the best for me. I needed to test a huge list of URLs that represented several hours of real traffic and maintaining user session information. Using it I was able to run nearly 400 concurrent connections on a single test box (more memory would allow me to do more).

http://www.microsoft.com/technet/archive/itsolutions/intranet/downloads/webstres.mspx?mfr=true


Using free software for HTTP load testing

Posted by: Anonymous [ip: 158.169.131.14] on August 13, 2008 03:18 PM
I use The Grinder in a commercial setting, stress testing at very high levels of load, and I find it is very lightweight, incredibly flexible and easy to build a set of tools around - Java/Python based. I've tried a lot of the tools you've mentioned and I've never found anything that could compete.

Paul


Using free software for HTTP load testing

Posted by: Anonymous [ip: 80.144.47.168] on August 13, 2008 04:11 PM
Why didn't you test ab?


Using free software for HTTP load testing

Posted by: Anonymous [ip: 146.197.246.30] on August 13, 2008 06:21 PM
How about sun's Faban(http://faban.sunsource.net/docs/guide/harness/install.html) ?
some notes on benchmarking at :
1) http://weblogs.java.net/blog/sdo/archive/2007/04/simple_benchmar.html
2) http://blogs.sun.com/shanti/entry/http_load_generator

It would be nice if someone did a comparison of Grinder and Faban -- both seem to be fairly good.

BR,
~A


Using free software for HTTP load testing

Posted by: Anonymous [ip: 195.92.101.11] on August 14, 2008 10:40 AM
I'd second the Grinder recommendation, in the past I could record and scale quite complicated applications using Grinder. Also having a lot of Java developers around me, it was incredibly easy to hand out complicated tasks to be programmed. JMeter was never good enough - it might have caught up lately.

I haven't tried Sun's application yet, I'll have a look at it.


Using free software for HTTP load testing

Posted by: Anonymous [ip: 121.241.114.14] on August 22, 2008 12:37 PM
Try Selenium also


