junitperf-1.9.1.orig/docs/JUnitPerf.html
Summary
JUnitPerf is a collection of JUnit test decorators used to measure the performance and scalability of functionality contained within existing JUnit tests.
If you like this kind of automation, you'll love my book, Pragmatic Project Automation.
The two-day, on-site Test-Driven Development with JUnit Workshop is an excellent way to learn JUnit and test-driven development through lecture and a series of hands-on exercises guided by Mike Clark.
JUnitPerf contains the following JUnit test decorators:
TimedTest
A TimedTest is a test decorator that runs a test and measures the elapsed time of the test. A TimedTest is constructed with a specified maximum elapsed time. By default, a TimedTest will wait for the completion of its decorated test and then fail if the maximum elapsed time was exceeded. Alternatively, a TimedTest can be constructed to immediately signal a failure when the maximum elapsed time of its decorated test is exceeded.
LoadTest
A LoadTest is a test decorator that runs a test with a simulated number of concurrent users and iterations.
JUnitPerf tests transparently decorate existing JUnit tests. This decoration-based design allows performance testing to be dynamically added to an existing JUnit test without affecting the use of the JUnit test independent of its performance. By decorating existing JUnit tests, it's quick and easy to compose a set of performance tests into a performance test suite.
The performance test suite can then be run automatically and independent of your other JUnit tests. In fact, you generally want to avoid grouping your JUnitPerf tests with your other JUnit tests so that you can run the test suites independently and at different frequencies. Long-running performance tests will slow you down and undoubtedly tempt you to abandon unit testing altogether, so try to schedule them to run at times when they won't interfere with your refactoring pace.
JUnitPerf tests are intended to be used specifically in situations where you have quantitative performance and/or scalability requirements that you'd like to keep in check while refactoring code. For example, you might write a JUnitPerf test to ensure that refactoring an algorithm didn't incur undesirable performance overhead in a performance-critical code section. You might also write a JUnitPerf test to ensure that refactoring a resource pool didn't adversely affect the scalability of the pool under load.
It's important to maintain a pragmatic approach when writing JUnitPerf tests to maximize the return on your testing investment. Traditional performance profiling tools and techniques should be employed first to identify which areas of code exhibit the highest potential for performance and scalability problems. JUnitPerf tests can then be written to automatically test and check that requirements are being met now and in the future.
Here's an example usage scenario:
You've built a well-factored chunk of software, complete with the necessary suite of JUnit tests to validate the software. At this point in the process you've gained as much knowledge about the design as possible.
You then use a performance profiling tool to isolate where the software is spending most of its time. Based on your knowledge of the design you're better equipped to make realistic estimates of the desired performance and scalability. And, since your refactorings have formed clear and succinct methods, your profiler is able to point you towards smaller sections of code to tune.
You then write a JUnitPerf test with the desired performance and scalability tolerances for the code to be tuned. Without making any changes to the code, the JUnitPerf test should fail, proving that the test is written properly. You then make the tuning changes in small steps.
After each step you compile and rerun the JUnitPerf test. If you've improved performance to the expected degree, the test passes. If you haven't improved performance to the expected degree, the test fails and you continue the tuning process until the test passes. In the future, when the code is again refactored, you re-run the test. If the test fails, the previously defined performance limits have been exceeded, so you back out the change and continue refactoring until the test passes.
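As an illustration of this workflow (a sketch only: SearchTestCase and its testQuery() method are hypothetical stand-ins for an existing JUnit test of the code being tuned, and the 500 millisecond budget is an assumed target), the tuning target might be captured as:

import com.clarkware.junitperf.*;
import junit.framework.Test;

public class SearchQueryTimedTest {

    public static Test suite() {
        // Performance budget for the code being tuned; this test should
        // fail before tuning and pass once the budget is met.
        long maxElapsedTime = 500;

        // SearchTestCase and testQuery() are hypothetical stand-ins for
        // an existing JUnit test of the code being tuned.
        Test testCase = new SearchTestCase("testQuery");
        return new TimedTest(testCase, maxElapsedTime);
    }

    public static void main(String[] args) {
        junit.textui.TestRunner.run(suite());
    }
}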
JUnitPerf 1.9 is the latest major release and includes all of the minor version changes.
This version requires Java 2 and JUnit 3.5 (or higher).
The distribution contains a JAR file, source code, sample tests, API documentation, and this document.
Windows
To install JUnitPerf, follow these steps:
Unzip the junitperf-<version>.zip distribution file to a directory referred to as %JUNITPERF_HOME%.
Add JUnitPerf to the classpath:
set CLASSPATH=%CLASSPATH%;%JUNITPERF_HOME%\lib\junitperf-<version>.jar
Unix (bash)
To install JUnitPerf, follow these steps:
Unzip the junitperf-<version>.zip distribution file to a directory referred to as $JUNITPERF_HOME.
Change file permissions:
chmod -R a+x $JUNITPERF_HOME
Add JUnitPerf to the classpath:
export CLASSPATH=$CLASSPATH:$JUNITPERF_HOME/lib/junitperf-<version>.jar
The JUnitPerf distribution includes the pre-built classes in the $JUNITPERF_HOME/lib/junitperf-<version>.jar file.
Building
An Ant build file is included in $JUNITPERF_HOME/build.xml to build the $JUNITPERF_HOME/dist/junitperf-<version>.jar file from the included source code.
To build JUnitPerf, use:
cd $JUNITPERF_HOME
ant jar
Testing
The JUnitPerf distribution includes JUnit test cases to validate the integrity of JUnitPerf.
To test JUnitPerf, use:
cd $JUNITPERF_HOME
ant test
The easiest way to describe how to use JUnitPerf is to show examples of each type of test decorator.
The $JUNITPERF_HOME/samples directory contains the set of example JUnitPerf tests described in this section.
TimedTest
A TimedTest test decorator is constructed with an existing JUnit test and a maximum elapsed time in milliseconds. For example, to create a timed test that waits for the completion of the ExampleTestCase.testOneSecondResponse() method and then fails if the elapsed time exceeds 1 second, use:
long maxElapsedTime = 1000;
Test testCase = new ExampleTestCase("testOneSecondResponse");
Test timedTest = new TimedTest(testCase, maxElapsedTime);
Alternatively, to create a timed test that fails immediately when the elapsed time of the ExampleTestCase.testOneSecondResponse() test method exceeds 1 second, use:
long maxElapsedTime = 1000;
Test testCase = new ExampleTestCase("testOneSecondResponse");
Test timedTest = new TimedTest(testCase, maxElapsedTime, false);
The following is an example test that creates a TimedTest to test the performance of the functionality being unit tested in the ExampleTestCase.testOneSecondResponse() method. The timed test waits for the method under test to complete, and then fails if the elapsed time exceeds 1 second.
import com.clarkware.junitperf.*;
import junit.framework.Test;

public class ExampleTimedTest {

    public static Test suite() {
        long maxElapsedTime = 1000;
        Test testCase = new ExampleTestCase("testOneSecondResponse");
        Test timedTest = new TimedTest(testCase, maxElapsedTime);
        return timedTest;
    }

    public static void main(String[] args) {
        junit.textui.TestRunner.run(suite());
    }
}
The granularity of the test decoration design offered by JUnit, and used by JUnitPerf, imposes some limitations. The elapsed time measured by a TimedTest decorating a single testXXX() method of a TestCase includes the total time of the setUp(), testXXX(), and tearDown() methods. The elapsed time measured by a TimedTest decorating a TestSuite includes the total time of all setUp(), testXXX(), and tearDown() methods for all the Test instances in the TestSuite. Therefore, the expected elapsed time measurements should be adjusted accordingly to account for the set-up and tear-down costs of the decorated test.
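One way to account for this is to fold an explicit allowance for the fixture cost into the maximum elapsed time. The following is a minimal sketch, assuming a roughly known set-up/tear-down overhead of 200 milliseconds:

// Budget for setUp()/tearDown() explicitly, since the TimedTest
// measures them along with the test method itself.
long expectedTestTime = 1000; // time allowed for the test method
long fixtureOverhead = 200;   // assumed setUp() + tearDown() cost
long maxElapsedTime = expectedTestTime + fixtureOverhead;
Test testCase = new ExampleTestCase("testOneSecondResponse");
Test timedTest = new TimedTest(testCase, maxElapsedTime);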
LoadTest
A LoadTest is a test decorator that runs a test with a simulated number of concurrent users and iterations. In its simplest form, a LoadTest is constructed with a test to decorate and the number of concurrent users. By default, each user runs one iteration of the test. For example, to create a load test of 10 concurrent users with each user running the ExampleTestCase.testOneSecondResponse() method once and all users starting simultaneously, use:
int users = 10;
Test testCase = new ExampleTestCase("testOneSecondResponse");
Test loadTest = new LoadTest(testCase, users);
The load can be ramped by specifying a pluggable Timer instance that prescribes the delay between the addition of each concurrent user. A ConstantTimer has a constant delay, with a zero value indicating that all users will be started simultaneously. A RandomTimer has a random delay with a uniformly distributed variation. For example, to create a load test of 10 concurrent users with each user running the ExampleTestCase.testOneSecondResponse() method once and with a 1 second delay between the addition of users, use:
int users = 10;
Timer timer = new ConstantTimer(1000);
Test testCase = new ExampleTestCase("testOneSecondResponse");
Test loadTest = new LoadTest(testCase, users, timer);
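To ramp the load with a randomized delay instead, use a RandomTimer. For example, assuming a delay of roughly 1 second plus up to 500 milliseconds of uniformly distributed variation between the addition of users:

int users = 10;
Timer timer = new RandomTimer(1000, 500);
Test testCase = new ExampleTestCase("testOneSecondResponse");
Test loadTest = new LoadTest(testCase, users, timer);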
In order to simulate each concurrent user running a test for a specified number of iterations, a LoadTest can be constructed to decorate a RepeatedTest. Alternatively, a LoadTest convenience constructor specifying the number of iterations is provided which creates a RepeatedTest. For example, to create a load test of 10 concurrent users with each user running the ExampleTestCase.testOneSecondResponse() method for 20 iterations, and with a 1 second delay between the addition of users, use:
int users = 10;
int iterations = 20;
Timer timer = new ConstantTimer(1000);
Test testCase = new ExampleTestCase("testOneSecondResponse");
Test repeatedTest = new RepeatedTest(testCase, iterations);
Test loadTest = new LoadTest(repeatedTest, users, timer);
or, alternatively, use:
int users = 10;
int iterations = 20;
Timer timer = new ConstantTimer(1000);
Test testCase = new ExampleTestCase("testOneSecondResponse");
Test loadTest = new LoadTest(testCase, users, iterations, timer);
If a test case intended to be decorated as a LoadTest contains test-specific state in the setUp() method, then the TestFactory should be used to ensure that each concurrent user thread uses a thread-local instance of the test. For example, to create a load test of 10 concurrent users with each user running a thread-local instance of ExampleStatefulTest, use:
int users = 10;
Test factory = new TestFactory(ExampleStatefulTest.class);
Test loadTest = new LoadTest(factory, users);
or, to load test a single test method, use:
int users = 10;
Test factory = new TestMethodFactory(ExampleStatefulTest.class, "testSomething");
Test loadTest = new LoadTest(factory, users);
The following is an example test that creates a LoadTest to test the scalability of the functionality being unit tested in the ExampleTestCase.testOneSecondResponse() test method. The LoadTest adds 10 concurrent users without delay, with each user running the test method once. The LoadTest itself is decorated with a TimedTest to test the throughput of the ExampleTestCase.testOneSecondResponse() test method under load. The test will fail if the total elapsed time of the entire load test exceeds 1.5 seconds.
import com.clarkware.junitperf.*;
import junit.framework.Test;

public class ExampleThroughputUnderLoadTest {

    public static Test suite() {
        int maxUsers = 10;
        long maxElapsedTime = 1500;
        Test testCase = new ExampleTestCase("testOneSecondResponse");
        Test loadTest = new LoadTest(testCase, maxUsers);
        Test timedTest = new TimedTest(loadTest, maxElapsedTime);
        return timedTest;
    }

    public static void main(String[] args) {
        junit.textui.TestRunner.run(suite());
    }
}
In the following example, the order of test decoration is reversed. The TimedTest measures the elapsed time of the ExampleTestCase.testOneSecondResponse() method. The LoadTest then decorates the TimedTest to simulate a 10-user load on the ExampleTestCase.testOneSecondResponse() method. The test will fail if any user's response time exceeds 1 second.
import com.clarkware.junitperf.*;
import junit.framework.Test;

public class ExampleResponseTimeUnderLoadTest {

    public static Test suite() {
        int maxUsers = 10;
        long maxElapsedTime = 1000;
        Test testCase = new ExampleTestCase("testOneSecondResponse");
        Test timedTest = new TimedTest(testCase, maxElapsedTime);
        Test loadTest = new LoadTest(timedTest, maxUsers);
        return loadTest;
    }

    public static void main(String[] args) {
        junit.textui.TestRunner.run(suite());
    }
}
Performance Test Suite
The following is an example TestCase that combines the ExampleTimedTest and ExampleLoadTest into a single test suite that can be run automatically to exercise all performance-related tests:
import junit.framework.Test;
import junit.framework.TestSuite;

public class ExamplePerfTestSuite {

    public static Test suite() {
        TestSuite suite = new TestSuite();
        suite.addTest(ExampleTimedTest.suite());
        suite.addTest(ExampleLoadTest.suite());
        return suite;
    }

    public static void main(String[] args) {
        junit.textui.TestRunner.run(suite());
    }
}
Timed Tests
Waiting Timed Tests
By default, a TimedTest will wait for the completion of its decorated test and then fail if the maximum elapsed time was exceeded. This type of waiting timed test always allows its decorated test to accumulate all test results until test completion and check the accumulated test results.
If the test decorated by a waiting timed test spawns threads, either directly or indirectly, then the decorated test must wait for those threads to run to completion and return control to the timed test. Otherwise, the timed test will wait indefinitely. As a general rule, unit tests should always wait for spawned threads to run to completion, using Thread.join() for example, in order to accurately assert test results.
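For example, a test method decorated by a waiting timed test might join its worker thread before asserting. The following is a minimal sketch only; the background "work" shown is just a placeholder:

public void testBackgroundWork() throws Exception {
    final boolean[] done = new boolean[1];
    Thread worker = new Thread(new Runnable() {
        public void run() {
            done[0] = true; // placeholder for real background work
        }
    });
    worker.start();
    worker.join(); // wait for the spawned thread before asserting
    assertTrue(done[0]);
}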
Non-Waiting Timed Tests
Alternatively, a TimedTest can be constructed to immediately signal a failure when the maximum elapsed time of its decorated test is exceeded. This type of non-waiting timed test will not wait for its decorated test to run to completion if the maximum elapsed time is exceeded. Non-waiting timed tests are more efficient than waiting timed tests in that non-waiting timed tests don't waste time waiting for the decorated test to complete only then to signal a failure, if necessary. However, unlike waiting timed tests, test results from a decorated test will not be accumulated after the expiration of the maximum elapsed time in a non-waiting timed test.
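For example, to fail the throughput-under-load test shown earlier as soon as its 1.5 second budget expires, rather than waiting for all users to finish, construct the outer TimedTest in non-waiting mode:

int users = 10;
long maxElapsedTime = 1500;
Test testCase = new ExampleTestCase("testOneSecondResponse");
Test loadTest = new LoadTest(testCase, users);
Test timedTest = new TimedTest(loadTest, maxElapsedTime, false);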
Load Tests
Non-Atomic Load Tests
By default, a LoadTest does not enforce test atomicity (as defined in transaction processing) if its decorated test spawns threads, either directly or indirectly. This type of non-atomic load test assumes that its decorated test is transactionally complete when control is returned. For example, if the decorated test spawns threads and then returns control without waiting for its spawned threads to complete, then the decorated test is assumed to be transactionally complete.
As a general rule, unit tests should always wait for spawned threads to run to completion, using Thread.join() for example, in order to accurately assert test results. However, in certain environments this isn't always possible. For example, as a result of a distributed lookup of an Enterprise JavaBean (EJB), an application server may spawn a new thread to handle the request. If the new thread belongs to the same ThreadGroup as the thread running the decorated test (the default), then a non-atomic load test will simply wait for the completion of all threads spawned directly by the load test and the new (rogue) thread is ignored.
To summarize, non-atomic load tests only wait for the completion of threads spawned directly by the load test to simulate more than one concurrent user.
Atomic Load Tests
If threads are integral to the successful completion of a decorated test, meaning that the decorated test should not be treated as complete until all of its threads run to completion, then setEnforceTestAtomicity(true) should be invoked to enforce test atomicity (as defined in transaction processing). This effectively causes the atomic load test to wait for the completion of all threads belonging to the same ThreadGroup as the thread running the decorated test. Atomic load tests also treat any premature thread exit as a test failure. If a thread dies abruptly, then all other threads belonging to the same ThreadGroup as the thread running the decorated test will be interrupted immediately.
If a decorated test spawns threads belonging to the same ThreadGroup as the thread running the decorated test (the default), then the atomic load test will wait indefinitely for the spawned threads to complete.
To summarize, atomic load tests wait for the completion of all threads belonging to the same ThreadGroup as the threads spawned directly by the load test to simulate more than one concurrent user.
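For example, to construct a load test that enforces atomicity (shown here decorating the sample ExampleTestCase purely for illustration), use:

int users = 10;
Test testCase = new ExampleTestCase("testOneSecondResponse");
LoadTest loadTest = new LoadTest(testCase, users);
// Wait for every thread in the decorated test's ThreadGroup and
// treat any premature thread exit as a test failure.
loadTest.setEnforceTestAtomicity(true);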
JUnitPerf has the following known limitations:
The elapsed time measured by a TimedTest decorating a single testXXX() method of a TestCase includes the total time of the setUp(), testXXX(), and tearDown() methods, as this is the granularity offered by decorating any Test instance. The expected elapsed time measurements should be adjusted accordingly to account for the set-up and tear-down costs of the decorated test.
JUnitPerf is not intended to be a full-fledged load testing or performance profiling tool, nor is it intended to replace the use of these tools. JUnitPerf should be used to write localized performance unit tests to help developers refactor responsibly.
The performance of your tests can degrade significantly if too many concurrent users are cooperating in a load test. The actual threshold number is JVM specific.
If you have any questions, comments, enhancement requests, success stories, or bug reports regarding JUnitPerf, or if you want to be notified when new versions of JUnitPerf are available, please email mike@clarkware.com. Your information will be kept private.
A mailing list is also available to discuss JUnitPerf or to be notified when new versions of JUnitPerf are available.
Please support the ongoing development of JUnitPerf by purchasing a copy of the book Pragmatic Project Automation.
Thanks in advance!
Reduce defects and improve design and code quality with a two-day, on-site Test-Driven Development with JUnit Workshop that quickly spreads the testing bug throughout your team.
I also offer JUnit mentoring to help you keep the testing momentum.
Contact me for more details.
JUnitPerf is licensed under the BSD License.
Many thanks to Ervin Varga for improving thread safety and test atomicity by suggesting the use of a ThreadGroup to catch and handle thread exceptions. Ervin also proposed the idea and provided the implementation for the TimedTest that signals a failure immediately when the maximum time is exceeded, as well as for the TestFactory. His review of JUnitPerf and his invaluable contributions are much appreciated!
JUnit Primer
Mike Clark, Clarkware Consulting, Inc.
This article demonstrates how to write and run simple test cases and test suites using the JUnit testing framework.
Continuous Performance Testing With JUnitPerf
Mike Clark (JavaProNews, 2003)
This article demonstrates how to write JUnitPerf tests that continually keep performance and scalability requirements in check.
Test-Driven Development: A Practical Guide
David Astels (Prentice Hall, 2003)
Includes a section written by yours truly on how to use JUnitPerf for continuous performance testing.
Java Extreme Programming Cookbook
Eric Burke, Brian Coyner (O'Reilly & Associates, 2003)
Includes a chapter of JUnitPerf recipes.
Java Tools for Extreme Programming: Mastering Open Source Tools Including Ant, JUnit, and Cactus
Richard Hightower, Nicholas Lesiecki (John Wiley & Sons, 2001)
Includes a chapter describing how to use JUnitPerf with HttpUnit.
junitperf-1.9.1.orig/samples/com/clarkware/junitperf/ExampleLoadTest.java

package com.clarkware.junitperf;
import junit.extensions.RepeatedTest;
import junit.framework.Test;
import junit.framework.TestSuite;
/**
* The ExampleLoadTest demonstrates how to
* decorate a Test as a LoadTest.
*
* @author Mike Clark
* @author Clarkware Consulting, Inc.
*
* @see com.clarkware.junitperf.LoadTest
* @see com.clarkware.junitperf.TimedTest
*/
public class ExampleLoadTest {
public static final long toleranceInMillis = 100;
public static Test suite() {
TestSuite suite = new TestSuite();
//
// Pick any example tests below.
//
suite.addTest(makeStateful10UserLoadTest());
suite.addTest(makeStateful10UserLoadTestMethod());
suite.addTest(make1SecondResponse10UserLoad1SecondDelayIterationTest());
//suite.addTest(make1SecondResponse1UserLoadTest());
//suite.addTest(make1SecondResponse2UserLoadTest());
//suite.addTest(make1SecondResponse1UserLoadIterationTest());
//suite.addTest(make1SecondResponse1UserLoadRepeatedTest());
//suite.addTest(make1SecondResponse2UserLoad2SecondDelayTest());
return suite;
}
/**
* Decorates a stateful test as a 10 user load test,
* providing each user with a different test instance
* to ensure thread safety.
*
* @return Test.
*/
protected static Test makeStateful10UserLoadTest() {
int users = 10;
int iterations = 1;
Test factory = new TestFactory(ExampleStatefulTestCase.class);
Test loadTest = new LoadTest(factory, users, iterations);
return loadTest;
}
/**
* Decorates a stateful test method as a 10 user load test,
* providing each user with a different test instance
* to ensure thread safety.
*
* @return Test.
*/
protected static Test makeStateful10UserLoadTestMethod() {
int users = 10;
int iterations = 1;
Test factory =
new TestMethodFactory(ExampleStatefulTestCase.class, "testState");
Test loadTest = new LoadTest(factory, users, iterations);
return loadTest;
}
/**
* Decorates a one second response time test as a one
* user load test with a maximum elapsed time of 1 second
* and a 0 second delay between users.
*
* @return Test.
*/
protected static Test make1SecondResponse1UserLoadTest() {
int users = 1;
long maxElapsedTimeInMillis = 1000 + toleranceInMillis;
Test testCase = new ExampleTestCase("testOneSecondResponse");
Test loadTest = new LoadTest(testCase, users);
Test timedTest = new TimedTest(loadTest, maxElapsedTimeInMillis);
return timedTest;
}
/**
* Decorates a one second response time test as a two
* user load test with a maximum elapsed time of 1.5
* seconds and a 0 second delay between users.
*
* @return Test.
*/
protected static Test make1SecondResponse2UserLoadTest() {
int users = 2;
long maxElapsedTimeInMillis = 1500 + toleranceInMillis;
Test testCase = new ExampleTestCase("testOneSecondResponse");
Test loadTest = new LoadTest(testCase, users);
Test timedTest = new TimedTest(loadTest, maxElapsedTimeInMillis);
return timedTest;
}
/**
* Decorates a one second response time test as a one
* user load test with 10 iterations per user, a maximum
* elapsed time of 10 seconds, and a 0 second delay
* between users.
*
* @see #make1SecondResponse1UserLoadRepeatedTest
* @return Test.
*/
protected static Test make1SecondResponse1UserLoadIterationTest() {
int users = 1;
int iterations = 10;
long maxElapsedTimeInMillis = 10000 + toleranceInMillis;
Test testCase = new ExampleTestCase("testOneSecondResponse");
Test loadTest = new LoadTest(testCase, users, iterations);
Test timedTest = new TimedTest(loadTest, maxElapsedTimeInMillis);
return timedTest;
}
/**
* Decorates a one second response time test as a one
* user load test with 10 iterations per user, a maximum
* elapsed time of 10 seconds, and a 0 second delay
* between users.
*
* @see #make1SecondResponse1UserLoadIterationTest
* @return Test.
*/
protected static Test make1SecondResponse1UserLoadRepeatedTest() {
int users = 1;
int iterations = 10;
long maxElapsedTimeInMillis = 10000 + toleranceInMillis;
Test testCase = new ExampleTestCase("testOneSecondResponse");
Test repeatedTest = new RepeatedTest(testCase, iterations);
Test loadTest = new LoadTest(repeatedTest, users);
Test timedTest = new TimedTest(loadTest, maxElapsedTimeInMillis);
return timedTest;
}
/**
* Decorates a one second response time test as a two
* user load test with a maximum elapsed time of 4 seconds
* and a 2 second delay between users.
*
* @return Test.
*/
protected static Test make1SecondResponse2UserLoad2SecondDelayTest() {
int users = 2;
Timer timer = new ConstantTimer(2000);
long maxElapsedTimeInMillis = 4000 + toleranceInMillis;
Test testCase = new ExampleTestCase("testOneSecondResponse");
Test loadTest = new LoadTest(testCase, users, timer);
Test timedTest = new TimedTest(loadTest, maxElapsedTimeInMillis);
return timedTest;
}
/**
* Decorates a one second response time test as a 10
* user load test with 10 iterations per user, a maximum
* elapsed time of 20 seconds, and a 1 second delay
* between users.
*
* @return Test.
*/
protected static Test
make1SecondResponse10UserLoad1SecondDelayIterationTest() {
int users = 10;
int iterations = 10;
Timer timer = new ConstantTimer(1000);
long maxElapsedTimeInMillis = 20000 + toleranceInMillis;
Test testCase = new ExampleTestCase("testOneSecondResponse");
Test loadTest = new LoadTest(testCase, users, iterations, timer);
Test timedTest = new TimedTest(loadTest, maxElapsedTimeInMillis);
return timedTest;
}
public static void main(String args[]) {
junit.textui.TestRunner.run(suite());
}
}
junitperf-1.9.1.orig/samples/com/clarkware/junitperf/ExamplePerfTestSuite.java

package com.clarkware.junitperf;
import junit.framework.Test;
import junit.framework.TestSuite;
/**
* The ExamplePerfTestSuite demonstrates how to
* assemble a test suite containing performance-related tests.
*
* @author Mike Clark
* @author Clarkware Consulting, Inc.
*/
public class ExamplePerfTestSuite {
public static Test suite() {
TestSuite suite = new TestSuite();
suite.addTest(ExampleTimedTest.suite());
suite.addTest(ExampleLoadTest.suite());
// Add more performance tests here.
return suite;
}
public static void main(String args[]) {
junit.textui.TestRunner.run(suite());
}
}

junitperf-1.9.1.orig/samples/com/clarkware/junitperf/ExampleResponseTimeUnderLoadTest.java

package com.clarkware.junitperf;
import junit.framework.Test;
/**
* The ExampleResponseTimeUnderLoadTest demonstrates
* how to decorate a TimedTest as a LoadTest
* to measure response time under load.
*
* @author Mike Clark
* @author Clarkware Consulting, Inc.
*
* @see com.clarkware.junitperf.LoadTest
* @see com.clarkware.junitperf.TimedTest
*/
public class ExampleResponseTimeUnderLoadTest {
public static Test suite() {
int maxUsers = 10;
long maxElapsedTime = 1050;
Test testCase = new ExampleTestCase("testOneSecondResponse");
Test timedTest = new TimedTest(testCase, maxElapsedTime);
Test loadTest = new LoadTest(timedTest, maxUsers);
return loadTest;
}
public static void main(String args[]) {
junit.textui.TestRunner.run(suite());
}
}
junitperf-1.9.1.orig/samples/com/clarkware/junitperf/ExampleStatefulTestCase.java

package com.clarkware.junitperf;
import junit.framework.Test;
import junit.framework.TestCase;
import junit.framework.TestSuite;
/**
* The ExampleStatefulTestCase is an example
* stateful TestCase.
*
* If the testState() test is run without a
* TestMethodFactory, then all threads of a
* LoadTest will share the objects in the
* test fixture and may cause some tests to fail.
*
* To ensure that each thread running the test method has a
* thread-local test fixture, use:
*
*     Test factory =
*         new TestMethodFactory(ExampleStatefulTestCase.class, "testState");
*     LoadTest test = new LoadTest(factory, numberOfUsers, ...);
*     ...
*
* @author Mike Clark
* @author Clarkware Consulting, Inc.
*/
public class ExampleStatefulTestCase extends TestCase {
private boolean _flag;
private int _data;
public ExampleStatefulTestCase(String name) {
super(name);
}
protected void setUp() {
_flag = true;
_data = 1;
}
protected void tearDown() {
_flag = false;
_data = 0;
}
/**
* This test may fail in a LoadTest if run
* without a TestMethodFactory.
*/
public void testState() throws Exception {
assertEquals(true, _flag);
Thread.yield();
assertEquals(1, _data);
}
public static Test suite() {
return new TestSuite(ExampleStatefulTestCase.class);
}
public static void main(String args[]) {
junit.textui.TestRunner.run(suite());
}
}
junitperf-1.9.1.orig/samples/com/clarkware/junitperf/ExampleTestCase.java

package com.clarkware.junitperf;
import junit.framework.Test;
import junit.framework.TestCase;
import junit.framework.TestSuite;
/**
* The ExampleTestCase is an example
* stateless TestCase.
*
* @author Mike Clark
* @author Clarkware Consulting, Inc.
*/
public class ExampleTestCase extends TestCase {
public ExampleTestCase(String name) {
super(name);
}
protected void setUp() {
}
protected void tearDown() {
}
public void testOneSecondResponse() throws Exception {
Thread.sleep(1000);
}
public static Test suite() {
return new TestSuite(ExampleTestCase.class);
}
public static void main(String args[]) {
junit.textui.TestRunner.run(suite());
}
}
junitperf-1.9.1.orig/samples/com/clarkware/junitperf/ExampleThroughputUnderLoadTest.java

package com.clarkware.junitperf;
import junit.framework.Test;
/**
* The ExampleThroughputUnderLoadTest demonstrates how to
* decorate a LoadTest as a TimedTest
* to measure throughput under load.
*
* @author Mike Clark
* @author Clarkware Consulting, Inc.
*
* @see com.clarkware.junitperf.LoadTest
* @see com.clarkware.junitperf.TimedTest
*/
public class ExampleThroughputUnderLoadTest {
public static Test suite() {
int maxUsers = 10;
long maxElapsedTime = 1500;
Test testCase = new ExampleTestCase("testOneSecondResponse");
Test loadTest = new LoadTest(testCase, maxUsers);
Test timedTest = new TimedTest(loadTest, maxElapsedTime);
return timedTest;
}
public static void main(String args[]) {
junit.textui.TestRunner.run(suite());
}
}
junitperf-1.9.1.orig/samples/com/clarkware/junitperf/ExampleTimedTest.java

package com.clarkware.junitperf;
import junit.framework.Test;
/**
* The ExampleTimedTest demonstrates how to
* decorate a Test as a TimedTest.
*
* @author Mike Clark
* @author Clarkware Consulting, Inc.
*
* @see com.clarkware.junitperf.TimedTest
*/
public class ExampleTimedTest {
public static final long toleranceInMillis = 100;
public static Test suite() {
long maxElapsedTimeInMillis = 1000 + toleranceInMillis;
Test testCase = new ExampleTestCase("testOneSecondResponse");
Test timedTest = new TimedTest(testCase, maxElapsedTimeInMillis);
return timedTest;
}
public static void main(String args[]) {
junit.textui.TestRunner.run(suite());
}
}
junitperf-1.9.1.orig/src/com/clarkware/junitperf/ConstantTimer.java

package com.clarkware.junitperf;
/**
* The ConstantTimer is a Timer
* with a constant delay.
*
* @author Mike Clark
* @author Clarkware Consulting, Inc.
*
* @see com.clarkware.junitperf.Timer
*/
public class ConstantTimer implements Timer {
private final long delay;
/**
* Constructs a ConstantTimer with the
* specified delay.
*
* @param delay Delay (in milliseconds).
*/
public ConstantTimer(long delay) {
this.delay = delay;
}
/**
* Returns the timer delay.
*
* @return Delay (in milliseconds).
*/
public long getDelay() {
return delay;
}
}
junitperf-1.9.1.orig/src/com/clarkware/junitperf/LoadTest.java

package com.clarkware.junitperf;
import junit.framework.Test;
import junit.framework.TestResult;
import junit.extensions.RepeatedTest;
/**
* The LoadTest is a test decorator that runs
* a test with a simulated number of concurrent users and
* iterations.
*
* In its simplest form, a LoadTest is constructed
* with a test to decorate and the number of concurrent users.
* with a test to decorate and the number of concurrent users.
*
* For example, to create a load test of 10 concurrent users
* with each user running ExampleTest once and
* all users started simultaneously, use:
*
*     Test loadTest = new LoadTest(new TestSuite(ExampleTest.class), 10);
*
* or, to load test a single test method, use:
*
*     Test loadTest = new LoadTest(new ExampleTest("testSomething"), 10);
*
* The load can be ramped by specifying a pluggable
* Timer instance which prescribes the delay
* between the addition of each concurrent user. A
* ConstantTimer has a constant delay, with
* a zero value indicating that all users will be started
* simultaneously. A RandomTimer has a random
* delay with a uniformly distributed variation.
*
* For example, to create a load test of 10 concurrent users
* with each user running ExampleTest.testSomething() once and
* with a one second delay between the addition of users, use:
*
*     Timer timer = new ConstantTimer(1000);
*     Test loadTest = new LoadTest(new ExampleTest("testSomething"), 10, timer);
*
* In order to simulate each concurrent user running a test for a
* specified number of iterations, a LoadTest can be
* constructed to decorate a RepeatedTest.
* Alternatively, a LoadTest convenience constructor
* specifying the number of iterations is provided which creates a
* RepeatedTest.
*
* For example, to create a load test of 10 concurrent users
* with each user running ExampleTest.testSomething() for 20 iterations,
* and with a one second delay between the addition of users, use:
*
*     Timer timer = new ConstantTimer(1000);
*     Test repeatedTest = new RepeatedTest(new ExampleTest("testSomething"), 20);
*     Test loadTest = new LoadTest(repeatedTest, 10, timer);
*
* or, alternatively, use:
*
*     Timer timer = new ConstantTimer(1000);
*     Test loadTest = new LoadTest(new ExampleTest("testSomething"), 10, 20, timer);
*
* A LoadTest can be decorated as a TimedTest
* to test the elapsed time of the load test. For example, to decorate
* the load test constructed above as a timed test with a maximum elapsed
* time of 2 seconds, use:
*
*     Test timedTest = new TimedTest(loadTest, 2000);
*
* By default, a LoadTest does not enforce test
* atomicity (as defined in transaction processing) if its decorated
* test spawns threads, either directly or indirectly. In other words,
* if a decorated test spawns a thread and then returns control without
* waiting for its spawned thread to complete, then the test is assumed
* to be transactionally complete.
*
* If threads are integral to the successful completion of
* a decorated test, meaning that the decorated test should not be
* treated as complete until all of its threads complete, then
* setEnforceTestAtomicity(true) should be invoked to
* enforce test atomicity. This effectively causes the load test to
* wait for the completion of all threads belonging to the same
* ThreadGroup as the thread running the decorated test.
*/
public class LoadTest implements Test {
private final int users;
private final Timer timer;
private final ThreadBarrier barrier;
private final ThreadedTestGroup group;
private final Test test;
private boolean enforceTestAtomicity;
/**
* Constructs a LoadTest to decorate
* the specified test using the specified number
* of concurrent users starting simultaneously.
*
* @param test Test to decorate.
* @param users Number of concurrent users.
*/
public LoadTest(Test test, int users) {
this(test, users, new ConstantTimer(0));
}
/**
* Constructs a LoadTest to decorate
* the specified test using the specified number
* of concurrent users starting simultaneously and
* the number of iterations per user.
*
* @param test Test to decorate.
* @param users Number of concurrent users.
* @param iterations Number of iterations per user.
*/
public LoadTest(Test test, int users, int iterations) {
this(test, users, iterations, new ConstantTimer(0));
}
/**
* Constructs a LoadTest to decorate
* the specified test using the specified number
* of concurrent users, number of iterations per
* user, and delay timer.
*
* @param test Test to decorate.
* @param users Number of concurrent users.
* @param iterations Number of iterations per user.
* @param timer Delay timer.
*/
public LoadTest(Test test, int users, int iterations, Timer timer) {
this(new RepeatedTest(test, iterations), users, timer);
}
/**
* Constructs a LoadTest to decorate
* the specified test using the specified number
* of concurrent users and delay timer.
*
* @param test Test to decorate.
* @param users Number of concurrent users.
* @param timer Delay timer.
*/
public LoadTest(Test test, int users, Timer timer) {
if (users < 1) {
throw new IllegalArgumentException("Number of users must be > 0");
} else if (timer == null) {
throw new IllegalArgumentException("Delay timer is null");
} else if (test == null) {
throw new IllegalArgumentException("Decorated test is null");
}
this.users = users;
this.timer = timer;
setEnforceTestAtomicity(false);
this.barrier = new ThreadBarrier(users);
this.group = new ThreadedTestGroup(this);
this.test = new ThreadedTest(test, group, barrier);
}
/**
* Indicates whether test atomicity should be enforced.
*
* If threads are integral to the successful completion of
* a decorated test, meaning that the decorated test should not be
* treated as complete until all of its threads complete, then
* setEnforceTestAtomicity(true) should be invoked to
* enforce test atomicity. This effectively causes the load test to
* wait for the completion of all threads belonging to the same
* ThreadGroup as the thread running the decorated test.
*
* @param isAtomic true to enforce test atomicity;
* false otherwise.
*/
public void setEnforceTestAtomicity(boolean isAtomic) {
enforceTestAtomicity = isAtomic;
}
/**
* Returns the number of tests in this load test.
*
* @return Number of tests.
*/
public int countTestCases() {
return test.countTestCases() * users;
}
/**
* Runs the test.
*
* @param result Test result.
*/
public void run(TestResult result) {
group.setTestResult(result);
for (int i=0; i < users; i++) {
if (result.shouldStop()) {
barrier.cancelThreads(users - i);
break;
}
test.run(result);
sleep(getDelay());
}
waitForTestCompletion();
cleanup();
}
protected void waitForTestCompletion() {
//
// TODO: May require a strategy pattern
// if other algorithms emerge.
//
if (enforceTestAtomicity) {
waitForAllThreadsToComplete();
} else {
waitForThreadedTestThreadsToComplete();
}
}
protected void waitForThreadedTestThreadsToComplete() {
while (!barrier.isReached()) {
sleep(50);
}
}
protected void waitForAllThreadsToComplete() {
while (group.activeCount() > 0) {
sleep(50);
}
}
protected void sleep(long time) {
try {
Thread.sleep(time);
} catch(Exception ignored) { }
}
protected void cleanup() {
try {
group.destroy();
} catch (Throwable ignored) { }
}
public String toString() {
if (enforceTestAtomicity) {
return "LoadTest (ATOMIC): " + test.toString();
} else {
return "LoadTest (NON-ATOMIC): " + test.toString();
}
}
protected long getDelay() {
return timer.getDelay();
}
}
junitperf-1.9.1.orig/src/com/clarkware/junitperf/RandomTimer.java

package com.clarkware.junitperf;
import java.util.Random;
/**
* The RandomTimer is a Timer
* with a random delay and a uniformly distributed variation.
*
* @author Mike Clark
* @author Clarkware Consulting, Inc.
*
* @see com.clarkware.junitperf.Timer
*/
public class RandomTimer implements Timer {
private final Random random;
private final long delay;
private final double variation;
/**
* Constructs a RandomTimer with the
* specified minimum delay and variation.
*
* @param delay Minimum delay (ms).
* @param variation Variation (ms).
*/
public RandomTimer(long delay, double variation) {
this.delay = delay;
this.variation = variation;
this.random = new Random();
}
/**
* Returns the timer delay.
*
* @return Delay (ms).
*/
public long getDelay() {
return (long) Math.abs((random.nextDouble() * variation) + delay);
}
}
junitperf-1.9.1.orig/src/com/clarkware/junitperf/TestFactory.java

package com.clarkware.junitperf;
import junit.framework.Test;
import junit.framework.TestCase;
import junit.framework.TestResult;
import junit.framework.TestSuite;
/**
* The TestFactory class creates thread-local
* TestSuite instances.
*
* This factory class should be used in cases when a stateful test
* is intended to be decorated by a LoadTest. A stateful
* test is defined as any test that defines test-specific state in
* its setUp() method.
*
* Use of the TestFactory ensures that each thread spawned
* by a LoadTest contains its own TestSuite
* instance containing all tests defined in the specified
* TestCase class.
*
* A typical usage scenario is as follows:
*
*     Test factory = new TestFactory(YourTestCase.class);
*     LoadTest test = new LoadTest(factory, numberOfUsers, ...);
*     ...
*
* Of course, static variables cannot be protected externally, so tests
* intended to be run in a multi-threaded environment should ensure
* that the use of static variables is thread-safe.
*
* This class is dependent on Java 2. For earlier platforms a
* local cache implementation should be changed to use, for example,
* a HashMap to track thread-local information.
*
* @author Mike Clark
* @author Clarkware Consulting, Inc.
* @author Ervin Varga
*
* @see com.clarkware.junitperf.LoadTest
*/
public class TestFactory implements Test {
protected final Class testClass;
private TestSuite suite;
private final TestCache testCache;
/**
* Constructs a TestFactory instance.
*
* @param testClass The TestCase class to load test.
public TestFactory(Class testClass) {
if (!(TestCase.class.isAssignableFrom(testClass))) {
throw new IllegalArgumentException("TestFactory must " +
"be constructed with a TestCase class.");
}
this.testClass = testClass;
this.testCache = new TestCache();
}
/**
* Runs an instance of the Test class and
* collects its result in the specified TestResult.
*
* Each invocation of this method triggers the creation of a
* new Test class instance as specified in the
* construction of this TestFactory.
*
* @param result Test result.
*/
public void run(TestResult result) {
getTest().run(result);
}
/**
* Returns the number of tests in this test.
*
* @return Number of tests.
*/
public int countTestCases() {
return getTestSuite().countTestCases();
}
/**
* Returns the test description.
*
* @return Description.
*/
public String toString() {
return "TestFactory: " + getTestSuite().toString();
}
protected Test getTest() {
return testCache.getTest();
}
protected TestSuite getTestSuite() {
if (suite == null) {
suite = makeTestSuite();
}
return suite;
}
protected TestSuite makeTestSuite() {
return new TestSuite(testClass);
}
/*
* The TestCache class provides thread-local
* instances of a TestSuite class containing
* tests defined in the TestCase class
* specified in the TestFactory.
*/
private final class TestCache {
private final ThreadLocal _localCache = new ThreadLocal() {
protected Object initialValue() {
return makeTestSuite();
}
};
/*
* Returns the Test instance local to the
* calling thread.
*
* @return Thread-local Test instance.
*/
Test getTest() {
return (Test)_localCache.get();
}
}
}
junitperf-1.9.1.orig/src/com/clarkware/junitperf/TestMethodFactory.java

package com.clarkware.junitperf;
import java.io.*;
import java.lang.reflect.*;
import junit.framework.Test;
import junit.framework.TestCase;
import junit.framework.TestSuite;
/**
* The TestMethodFactory class is a TestFactory
* that creates thread-local TestSuite instances containing
* a specific test method of a TestCase.
*
* A typical usage scenario is as follows:
*
*     Test factory = new TestMethodFactory(YourTestCase.class, "testSomething");
*     LoadTest test = new LoadTest(factory, numberOfUsers, ...);
*     ...
*
* @author Mike Clark
* @author Clarkware Consulting, Inc.
*
* @see com.clarkware.junitperf.TestFactory
* @see com.clarkware.junitperf.LoadTest
*/
public class TestMethodFactory extends TestFactory {
private final String testMethodName;
/**
* Constructs a TestMethodFactory instance.
*
* @param testClass The TestCase class to load test.
* @param testMethodName The name of the test method to load test.
*/
public TestMethodFactory(Class testClass, String testMethodName) {
super(testClass);
this.testMethodName = testMethodName;
}
protected TestSuite makeTestSuite() {
TestSuite suite = new TestSuite();
Constructor constructor = null;
try {
constructor = getConstructor(testClass);
} catch (NoSuchMethodException e) {
suite.addTest(warning("Class " + testClass.getName() +
" has no public constructor TestCase(String name)"));
return suite;
}
if (!Modifier.isPublic(testClass.getModifiers())) {
suite.addTest(warning("Class " + testClass.getName() +
" is not public"));
return suite;
}
addTestMethod(suite, constructor, testMethodName);
if (suite.testCount() == 0) {
suite.addTest(warning("No tests found in " + testClass.getName()));
}
return suite;
}
private void
addTestMethod(TestSuite suite, Constructor constructor, String methodName) {
Object[] args = new Object[] { methodName };
try {
suite.addTest((Test)constructor.newInstance(args));
} catch (InstantiationException ie) {
suite.addTest(warning("Cannot instantiate test case: " +
methodName + " (" + toString(ie) + ")"));
} catch (InvocationTargetException ite) {
suite.addTest(warning("Exception in constructor: " +
methodName + " (" + toString(ite.getTargetException()) + ")"));
} catch (IllegalAccessException iae) {
suite.addTest(warning("Cannot access test case: " +
methodName + " (" + toString(iae) + ")"));
}
}
private Constructor getConstructor(Class theClass)
throws NoSuchMethodException {
Class[] args = { String.class };
return theClass.getConstructor(args);
}
private Test warning(final String message) {
return new TestCase("warning") {
protected void runTest() {
fail(message);
}
};
}
private String toString(Throwable t) {
StringWriter stringWriter = new StringWriter();
PrintWriter writer = new PrintWriter(stringWriter);
t.printStackTrace(writer);
return stringWriter.toString();
}
}
junitperf-1.9.1.orig/src/com/clarkware/junitperf/ThreadBarrier.java

package com.clarkware.junitperf;
/**
* The ThreadBarrier class provides a callback
* method for threads to signal their completion.
*
* @author Mike Clark
* @author Clarkware Consulting, Inc.
*/
public class ThreadBarrier {
public int returnedCount;
public final int dispatchedCount;
/**
* Constructs a ThreadBarrier with the
* specified number of threads to wait for.
*
* @param numDispatched Number of threads dispatched.
*/
public ThreadBarrier(int numDispatched) {
returnedCount = 0;
dispatchedCount = numDispatched;
}
/**
* Called when the specified thread is complete.
*
* @param t Completed thread.
*/
public synchronized void onCompletion(Thread t) {
returnedCount++;
}
/**
* Determines whether the thread barrier has been reached -
* when all dispatched threads have returned.
*
* @return true if the barrier has been reached;
* false otherwise.
*/
public boolean isReached() {
return (returnedCount >= dispatchedCount);
}
/**
* Cancels the specified number of threads.
*
* @param threadCount Number of threads to cancel.
*/
public synchronized void cancelThreads(int threadCount) {
returnedCount += threadCount;
}
}
junitperf-1.9.1.orig/src/com/clarkware/junitperf/ThreadedTest.java

package com.clarkware.junitperf;
import junit.framework.Test;
import junit.framework.TestResult;
/**
* The ThreadedTest is a test decorator that
* runs a test in a separate thread.
*
* @author Mike Clark
* @author Clarkware Consulting, Inc.
*/
public class ThreadedTest implements Test {
private final Test test;
private final ThreadGroup group;
private final ThreadBarrier barrier;
/**
* Constructs a ThreadedTest to decorate the
* specified test using the same thread group as the
* current thread.
*
* @param test Test to decorate.
*/
public ThreadedTest(Test test) {
this(test, null, new ThreadBarrier(1));
}
/**
* Constructs a ThreadedTest to decorate the
* specified test using the specified thread group and
* thread barrier.
*
* @param test Test to decorate.
* @param group Thread group.
* @param barrier Thread barrier.
*/
public ThreadedTest(Test test, ThreadGroup group, ThreadBarrier barrier) {
this.test = test;
this.group = group;
this.barrier = barrier;
}
/**
* Returns the number of test cases in this threaded test.
*
* @return Number of test cases.
*/
public int countTestCases() {
return test.countTestCases();
}
/**
* Runs this test.
*
* @param result Test result.
*/
public void run(TestResult result) {
Thread t = new Thread(group, new TestRunner(result));
t.start();
}
class TestRunner implements Runnable {
private TestResult result;
public TestRunner(TestResult result) {
this.result = result;
}
public void run() {
test.run(result);
barrier.onCompletion(Thread.currentThread());
}
}
/**
* Returns the test description.
*
* @return Description.
*/
public String toString() {
return "ThreadedTest: " + test.toString();
}
}
junitperf-1.9.1.orig/src/com/clarkware/junitperf/ThreadedTestGroup.java

package com.clarkware.junitperf;
import junit.framework.AssertionFailedError;
import junit.framework.Test;
import junit.framework.TestResult;
/**
* The ThreadedTestGroup is a ThreadGroup
* that catches and handles exceptions thrown by threads created
* and started by ThreadedTest instances.
*
* If a thread managed by a ThreadedTestGroup throws
* an uncaught exception, then the exception is added to the current
* test's results and all other threads are immediately interrupted.
*
*/
public class ThreadedTestGroup extends ThreadGroup {
private final Test test;
private TestResult testResult;
/**
* Constructs a ThreadedTestGroup for the
* specified test.
*
* @param test Current test.
*/
public ThreadedTestGroup(Test test) {
super("ThreadedTestGroup");
this.test = test;
}
/**
* Sets the current test result.
*
* @param result Test result.
*/
public void setTestResult(TestResult result) {
testResult = result;
}
/**
* Called when a thread in this thread group stops because of
* an uncaught exception.
*
* If the uncaught exception is a ThreadDeath,
* then it is ignored. If the uncaught exception is an
* AssertionFailedError, then a failure
* is added to the current test's result. Otherwise, an
* error is added to the current test's result.
*
* @param t Originating thread.
* @param e Uncaught exception.
*/
public void uncaughtException(Thread t, Throwable e) {
if (e instanceof ThreadDeath) {
return;
}
if (e instanceof AssertionFailedError) {
testResult.addFailure(test, (AssertionFailedError)e);
} else {
testResult.addError(test, e);
}
super.interrupt();
}
}

junitperf-1.9.1.orig/src/com/clarkware/junitperf/TimedTest.java

package com.clarkware.junitperf;
import junit.framework.AssertionFailedError;
import junit.framework.Test;
import junit.framework.TestResult;
import junit.extensions.TestDecorator;
/**
* The TimedTest is a test decorator that
* runs a test and measures the elapsed time of the test.
*
* A TimedTest is constructed with a specified
* maximum elapsed time. By default, a TimedTest
* will wait for the completion of its decorated test and
* then fail if the maximum elapsed time was exceeded.
* Alternatively, a TimedTest can be constructed
* to immediately signal a failure when the maximum elapsed time
* of its decorated test is exceeded. In other words, the
* TimedTest will not wait for its decorated
* test to run to completion if the maximum elapsed time is
* exceeded.
*
* For example, to decorate the ExampleTest
* as a TimedTest that waits for the
* ExampleTest test case to run
* to completion and then fails if the maximum elapsed
* time of 2 seconds is exceeded, use:
*
*     Test timedTest = new TimedTest(new TestSuite(ExampleTest.class), 2000);
*
* or, to time a single test method, use:
*
*     Test timedTest = new TimedTest(new ExampleTest("testSomething"), 2000);
*
* Alternatively, to decorate the ExampleTest.testSomething()
* test as a TimedTest that fails immediately when
* the maximum elapsed time of 2 seconds is exceeded, use:
*
*     Test timedTest = new TimedTest(new ExampleTest("testSomething"), 2000, false);
*
* @author Mike Clark
* @author Clarkware Consulting, Inc.
* @author Ervin Varga
*/
public class TimedTest extends TestDecorator {
private final long maxElapsedTime;
private final boolean waitForCompletion;
private boolean maxElapsedTimeExceeded;
private boolean isQuiet;
/**
* Constructs a TimedTest to decorate the
* specified test with the specified maximum elapsed time.
*
* The TimedTest will wait for the completion
* of its decorated test and then fail if the maximum elapsed
* time was exceeded.
*
* @param test Test to decorate.
* @param maxElapsedTime Maximum elapsed time (ms).
*/
public TimedTest(Test test, long maxElapsedTime) {
this(test, maxElapsedTime, true);
}
/**
* Constructs a TimedTest to decorate the
* specified test with the specified maximum elapsed time.
*
* @param test Test to decorate.
* @param maxElapsedTime Maximum elapsed time (ms).
* @param waitForCompletion true (default) to
* indicate that the TimedTest should wait
* for its decorated test to run to completion and then
* fail if the maximum elapsed time was exceeded;
* false to indicate that the
* TimedTest should immediately signal
* a failure when the maximum elapsed time is exceeded.
*/
public TimedTest(Test test, long maxElapsedTime, boolean waitForCompletion) {
super(test);
this.maxElapsedTime = maxElapsedTime;
this.waitForCompletion = waitForCompletion;
maxElapsedTimeExceeded = false;
isQuiet = false;
}
/**
* Disables the output of the test's elapsed time.
*/
public void setQuiet() {
isQuiet = true;
}
/**
* Returns the number of tests in this timed test.
*
* @return Number of tests.
*/
public int countTestCases() {
return super.countTestCases();
}
/**
* Determines whether the maximum elapsed time of
* the test was exceeded.
*
* @return true if the max elapsed time
* was exceeded; false otherwise.
*/
public boolean outOfTime() {
return maxElapsedTimeExceeded;
}
/**
* Runs the test.
*
* @param result Test result.
*/
public void run(TestResult result) {
//
// TODO: May require a strategy pattern
// if other algorithms emerge.
//
if (waitForCompletion) {
runUntilTestCompletion(result);
} else {
runUntilTimeExpires(result);
}
}
/**
* Runs the test until test completion and then signals
* a failure if the maximum elapsed time was exceeded.
*
* @param result Test result.
*/
protected void runUntilTestCompletion(TestResult result) {
long beginTime = System.currentTimeMillis();
super.run(result);
long elapsedTime = getElapsedTime(beginTime);
printElapsedTime(elapsedTime);
if (elapsedTime > maxElapsedTime) {
maxElapsedTimeExceeded = true;
result.addFailure(getTest(),
new AssertionFailedError("Maximum elapsed time exceeded!" +
" Expected " + maxElapsedTime + "ms, but was " +
elapsedTime + "ms."));
result.endTest(getTest());
}
}
/**
* Runs the test and immediately signals a failure
* when the maximum elapsed time is exceeded.
*
* @param result Test result.
*/
protected void runUntilTimeExpires(final TestResult result) {
Thread t = new Thread(new Runnable() {
public void run() {
TimedTest.super.run(result);
// IBM's JVM prefers this instead:
// run(result);
}
});
long beginTime = System.currentTimeMillis();
t.start();
try {
t.join(maxElapsedTime);
} catch(InterruptedException ignored) {}
printElapsedTime(getElapsedTime(beginTime));
if (t.isAlive()) {
maxElapsedTimeExceeded = true;
result.addFailure(getTest(),
new AssertionFailedError("Maximum elapsed time (" + maxElapsedTime + " ms) exceeded!"));
result.endTest(getTest());
}
}
protected long getElapsedTime(long beginTime) {
long endTime = System.currentTimeMillis();
return endTime - beginTime;
}
protected void printElapsedTime(long elapsedTime) {
if (!isQuiet) {
System.out.println(toString() + ": " + elapsedTime + " ms");
System.out.flush();
}
}
/**
* Returns the test description.
*
* @return Description.
*/
public String toString() {
if (waitForCompletion) {
return "TimedTest (WAITING): " + super.toString();
} else {
return "TimedTest (NON-WAITING): " + super.toString();
}
}
}
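For orientation, here is a minimal sketch (not part of the distribution) of how the two constructor modes documented above might be combined in a JUnit 3 suite. The example class name, test method, and the 300 ms budget are illustrative assumptions; only the TimedTest constructors and the textui runner usage are taken from the surrounding sources.
package com.clarkware.junitperf;
import junit.framework.Test;
import junit.framework.TestCase;
import junit.framework.TestSuite;
// Hypothetical example class; the 300 ms time budget is arbitrary.
public class TimedTestModesExample extends TestCase {
    public TimedTestModesExample(String name) {
        super(name);
    }
    public void testQuickOperation() throws Exception {
        Thread.sleep(50); // stand-in for the code under test
    }
    public static Test suite() {
        TestSuite suite = new TestSuite();
        // Waiting mode (default): run to completion, then fail if over 300 ms.
        suite.addTest(new TimedTest(new TimedTestModesExample("testQuickOperation"), 300));
        // Non-waiting mode: signal a failure as soon as 300 ms have elapsed.
        suite.addTest(new TimedTest(new TimedTestModesExample("testQuickOperation"), 300, false));
        return suite;
    }
    public static void main(String[] args) {
        junit.textui.TestRunner.run(suite());
    }
}
Unless setQuiet() is called on a TimedTest, each decorated run prints its elapsed time via printElapsedTime(), as the class above shows.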
junitperf-1.9.1.orig/src/com/clarkware/junitperf/Timer.java
package com.clarkware.junitperf;
/**
* The Timer interface defines the common interface
* implemented by all classes whose instances serve as pluggable timers.
*
* @author Mike Clark
* @author Clarkware Consulting, Inc.
*/
public interface Timer {
/**
* Returns the timer delay.
*
* @return Delay (in milliseconds).
*/
public long getDelay();
}
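As a rough illustration of the interface above (this is a sketch only, not the shipped ConstantTimer source, and the class and field names are assumptions), a fixed-delay timer such as the ConstantTimer used by the tests below could look like this:
package com.clarkware.junitperf;
// Hypothetical fixed-delay Timer: always returns the same delay, in milliseconds.
public class FixedDelayTimer implements Timer {
    private final long delay;
    public FixedDelayTimer(long delay) {
        this.delay = delay;
    }
    public long getDelay() {
        return delay;
    }
}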
junitperf-1.9.1.orig/test/com/clarkware/junitperf/AllTests.java
package com.clarkware.junitperf;
import junit.framework.Test;
import junit.framework.TestSuite;
/**
* The AllTests class is the test suite
* for all JUnitPerf tests.
*
* @author Mike Clark
* @author Clarkware Consulting, Inc.
*/
public class AllTests {
public static Test suite() {
TestSuite suite = new TestSuite();
suite.addTest(LoadTestTest.suite());
suite.addTest(TimedTestTest.suite());
suite.addTest(TestFactoryTest.suite());
return suite;
}
public static void main(String[] args) {
junit.textui.TestRunner.run(suite());
}
}
junitperf-1.9.1.orig/test/com/clarkware/junitperf/LoadTestTest.java
package com.clarkware.junitperf;
import junit.framework.*;
import junit.extensions.RepeatedTest;
/**
* The LoadTestTest is a TestCase
* for the LoadTest class.
*
* @author Mike Clark
* @author Clarkware Consulting, Inc.
* @author Ervin Varga
*
* @see junit.framework.TestCase
*/
public class LoadTestTest extends TestCase {
private TestSuite _successSuite;
private TestSuite _rogueThreadSuite;
private TestSuite _failureSuite;
private TestSuite _errorSuite;
public static final long tolerance = 100;
public LoadTestTest(String name) {
super(name);
_successSuite = new TestSuite();
_failureSuite = new TestSuite();
_rogueThreadSuite = new TestSuite();
_errorSuite = new TestSuite();
_successSuite.addTest(new MockTest("testSuccess"));
_successSuite.addTest(new MockTest("testSuccess"));
_rogueThreadSuite.addTest(new MockTest("testRogueThread"));
_failureSuite.addTest(new MockTest("testFailure"));
_errorSuite.addTest(new MockTest("testError"));
}
public void testOneUser() {
Test test = new LoadTest(_successSuite, 1);
assertEquals(2, test.countTestCases());
TestResult result = new TestResult();
test.run(result);
assertEquals(2, result.runCount());
assertEquals(0, result.errorCount());
assertEquals(0, result.failureCount());
}
/**
* This test will succeed properly only when the load
* test does not enforce thread atomicity.
* Otherwise, if the load test enforces thread atomicity,
* this test will hang indefinitely.
*/
public void testOneUserRogueThread() {
LoadTest test = new LoadTest(_rogueThreadSuite, 1);
//test.setEnforceThreadAtomicity(true);
assertEquals(1, test.countTestCases());
TestResult result = new TestResult();
test.run(result);
assertEquals(1, result.runCount());
assertEquals(0, result.errorCount());
assertEquals(0, result.failureCount());
}
public void testMultiUser() {
Test test = new LoadTest(_successSuite, 3);
assertEquals(6, test.countTestCases());
TestResult result = new TestResult();
test.run(result);
assertEquals(6, result.runCount());
assertEquals(0, result.errorCount());
assertEquals(0, result.failureCount());
}
public void testMultiUserWithIterations() {
Test test = new LoadTest(_successSuite, 3, 10);
assertEquals(60, test.countTestCases());
TestResult result = new TestResult();
test.run(result);
assertEquals(60, result.runCount());
assertEquals(0, result.errorCount());
assertEquals(0, result.failureCount());
}
public void testMultiUserWithRepeatedTest() {
RepeatedTest repeat = new RepeatedTest(_successSuite, 10);
Test test = new LoadTest(repeat, 3);
assertEquals(60, test.countTestCases());
TestResult result = new TestResult();
test.run(result);
assertEquals(60, result.runCount());
assertEquals(0, result.errorCount());
assertEquals(0, result.failureCount());
}
public void testMultiUserWithDelay() {
Test test = new LoadTest(_successSuite, 3, new ConstantTimer(0));
assertEquals(6, test.countTestCases());
TestResult result = new TestResult();
test.run(result);
assertEquals(6, result.runCount());
assertEquals(0, result.errorCount());
assertEquals(0, result.failureCount());
}
public void testMultiUserWithFailure() {
Test test = new LoadTest(_failureSuite, 3);
assertEquals(3, test.countTestCases());
TestResult result = new TestResult();
test.run(result);
assertEquals(3, result.runCount());
assertEquals(0, result.errorCount());
assertEquals(3, result.failureCount());
}
public void testMultiUserWithError() {
Test test = new LoadTest(_errorSuite, 3);
assertEquals(3, test.countTestCases());
TestResult result = new TestResult();
test.run(result);
assertEquals(3, result.runCount());
assertEquals(3, result.errorCount());
assertEquals(0, result.failureCount());
}
public void testMultiUserWithStop() {
Test test = new LoadTest(_failureSuite, 2);
assertEquals(2, test.countTestCases());
TestResult result = new TestResult();
result.stop();
test.run(result);
assertEquals(0, result.runCount());
assertEquals(0, result.errorCount());
assertEquals(0, result.failureCount());
}
public void testNonPositiveUser() {
// Each non-positive user count is checked in its own try block;
// otherwise the first expected exception would skip the second case.
try {
Test test1 = new LoadTest(_successSuite, 0);
fail("Should throw an IllegalArgumentException");
} catch (IllegalArgumentException success) {
}
try {
Test test2 = new LoadTest(_successSuite, -1);
fail("Should throw an IllegalArgumentException");
} catch (IllegalArgumentException success) {
}
}
public void testNullTimer() {
try {
Test test = new LoadTest(_successSuite, 1, null);
fail("Should throw an IllegalArgumentException");
} catch (IllegalArgumentException success) {
return;
}
}
public void testAtomic2SecondResponse() {
Test mockTest = new MockTest("testAtomic2SecondResponseWithWorkerThread");
Test loadTest = new LoadTest(mockTest, 1);
Test test = new TimedTest(loadTest, 1000 + tolerance);
assertEquals(1, test.countTestCases());
TestResult result = new TestResult();
test.run(result);
assertEquals(1, result.runCount());
assertEquals(0, result.errorCount());
assertEquals(0, result.failureCount());
}
public void testAtomic2SecondResponseEnforceTestAtomicity() {
Test mockTest = new MockTest("testAtomic2SecondResponseWithWorkerThread");
LoadTest loadTest = new LoadTest(mockTest, 1);
loadTest.setEnforceTestAtomicity(true);
Test test = new TimedTest(loadTest, 1000 + tolerance);
assertEquals(1, test.countTestCases());
TestResult result = new TestResult();
test.run(result);
assertEquals(1, result.runCount());
assertEquals(0, result.errorCount());
assertEquals(1, result.failureCount());
}
public void testNonAtomic2SecondResponse() {
Test mockTest = new MockTest("testNonAtomic2SecondResponseWithWorkerThread");
LoadTest loadTest = new LoadTest(mockTest, 1);
Test test = new TimedTest(loadTest, 1000 + tolerance);
assertEquals(1, test.countTestCases());
TestResult result = new TestResult();
test.run(result);
assertEquals(1, result.runCount());
assertEquals(0, result.errorCount());
assertEquals(1, result.failureCount());
}
public void testNonAtomic2SecondResponseEnforceTestAtomicity() {
Test mockTest = new MockTest("testNonAtomic2SecondResponseWithWorkerThread");
LoadTest loadTest = new LoadTest(mockTest, 1);
loadTest.setEnforceTestAtomicity(true);
Test test = new TimedTest(loadTest, 1000 + tolerance);
assertEquals(1, test.countTestCases());
TestResult result = new TestResult();
test.run(result);
assertEquals(1, result.runCount());
assertEquals(0, result.errorCount());
assertEquals(1, result.failureCount());
}
public void testTestStateConsistencyFailure() {
Test mockTest = new MockTestWithState("testInvariant");
LoadTest test = new LoadTest(mockTest, 10, 2);
test.setEnforceTestAtomicity(true);
assertEquals(20, test.countTestCases());
TestResult result = new TestResult();
test.run(result);
assertEquals(20, result.runCount());
assertEquals(0, result.errorCount());
assertTrue(result.failureCount() > 0);
}
public void testTestStateConsistencyWithTestFactory() {
Test testFactory = new TestFactory(MockTestWithState.class);
LoadTest test = new LoadTest(testFactory, 10, 2);
test.setEnforceTestAtomicity(true);
assertEquals(20, test.countTestCases());
TestResult result = new TestResult();
test.run(result);
assertEquals(20, result.runCount());
assertEquals(0, result.errorCount());
assertEquals(0, result.failureCount());
}
public static Test suite() {
return new TestSuite(LoadTestTest.class);
}
public static void main(String args[]) {
junit.textui.TestRunner.run(suite());
}
}
junitperf-1.9.1.orig/test/com/clarkware/junitperf/MockTest.java
package com.clarkware.junitperf;
import junit.framework.TestCase;
public class MockTest extends TestCase {
public MockTest(String name) {
super(name);
}
public void testSuccess() {
}
public void testFailure() {
fail();
}
public void testError() {
throw new RuntimeException();
}
public void testOneSecondExecutionTime() throws Exception {
Thread.sleep(1000);
}
public void testOneSecondExecutionTimeWithFailure() throws Exception {
Thread.sleep(1000);
fail();
}
public void testInfiniteExecutionTime() {
while (true) {
}
}
public void testLongExecutionTime() {
try {
Thread.sleep(60000);
} catch (InterruptedException ignored) {}
}
public void testAtomic2SecondResponseWithWorkerThread() {
Thread t = new Thread(new Runnable() {
public void run() {
try {
Thread.sleep(2000);
} catch (InterruptedException ignored) {}
}
});
t.start();
try {
Thread.sleep(1000);
// don't wait for worker thread to finish
} catch (InterruptedException ignored) {}
}
public void testNonAtomic2SecondResponseWithWorkerThread() {
Thread t = new Thread(new Runnable() {
public void run() {
try {
Thread.sleep(2000);
} catch (InterruptedException ignored) {}
}
});
t.start();
try {
Thread.sleep(1000);
// wait for worker thread to finish
t.join();
} catch (InterruptedException ignored) {}
}
public void testRogueThread() {
Thread t = new Thread(new Runnable() {
public void run() {
while (true) {
try {
Thread.sleep(100);
} catch (Exception ignored) {}
}
}
});
t.start();
assertTrue(true);
}
}
junitperf-1.9.1.orig/test/com/clarkware/junitperf/MockTestFactoryTest.java
package com.clarkware.junitperf;
import junit.framework.TestCase;
public class MockTestFactoryTest extends TestCase {
public MockTestFactoryTest(String name) {
super(name);
}
public void testSuccess() {
}
public void testFailure() {
fail();
}
}
junitperf-1.9.1.orig/test/com/clarkware/junitperf/MockTestWithState.java
package com.clarkware.junitperf;
import junit.framework.TestCase;
public class MockTestWithState extends TestCase {
private boolean _flag;
private int _data;
public MockTestWithState(String name) {
super(name);
}
protected void setUp() {
_flag = true;
_data = 1;
}
protected void tearDown() {
_flag = false;
_data = 0;
}
public void testInvariant() {
assertEquals(true, _flag);
taskSwitch();
assertEquals(1, _data);
}
protected void taskSwitch() {
try {
Thread.yield();
Thread.sleep(10);
} catch(Exception ignore) {}
}
}
junitperf-1.9.1.orig/test/com/clarkware/junitperf/TestFactoryTest.java
package com.clarkware.junitperf;
import junit.framework.*;
/**
* The TestFactoryTest is a TestCase
* for the TestFactory and TestMethodFactory
* classes.
*
* @author Mike Clark
* @author Clarkware Consulting, Inc.
*
* @see junit.framework.TestCase
*/
public class TestFactoryTest extends TestCase {
private TestSuite _allMethodsTestSuite;
private TestSuite _oneMethodTestSuite;
private Class _testClass;
public TestFactoryTest(String name) {
super(name);
_testClass = com.clarkware.junitperf.MockTestFactoryTest.class;
_allMethodsTestSuite = new TestSuite(_testClass);
_oneMethodTestSuite = new TestSuite();
_oneMethodTestSuite.addTest(new MockTestFactoryTest("testSuccess"));
}
public void testAllTestMethods() {
TestFactory testFactory = new TestFactory(_testClass);
assertEquals(2, testFactory.countTestCases());
assertEqualsTestSuite(_allMethodsTestSuite, testFactory.getTest());
TestResult result = new TestResult();
testFactory.run(result);
assertEquals(2, result.runCount());
assertEquals(0, result.errorCount());
assertEquals(1, result.failureCount());
}
public void testAllTestMethodsSameSuiteForSameThread() {
TestFactory testFactory = new TestFactory(_testClass);
Test test1 = testFactory.getTest();
Test test2 = testFactory.getTest();
assertSame(test1, test2);
assertEqualsTestSuite(_allMethodsTestSuite, test1);
assertEqualsTestSuite(_allMethodsTestSuite, test2);
}
public void testAllTestMethodsDifferentSuiteForDifferentThread() {
TestFactory testFactory = new TestFactory(_testClass);
MockRunnable runner1 = new MockRunnable(testFactory);
MockRunnable runner2 = new MockRunnable(testFactory);
Thread thread1 = new Thread(runner1);
Thread thread2 = new Thread(runner2);
thread1.start();
thread2.start();
sleep();
Test test1 = runner1.getTest();
Test test2 = runner2.getTest();
assertTrue(!(test1 == test2));
assertEqualsTestSuite(_allMethodsTestSuite, test1);
assertEqualsTestSuite(_allMethodsTestSuite, test2);
}
public void testOneTestMethodSuccess() {
TestMethodFactory testFactory =
new TestMethodFactory(_testClass, "testSuccess");
assertEquals(1, testFactory.countTestCases());
assertEqualsTestSuite(_oneMethodTestSuite, testFactory.getTest());
TestResult result = new TestResult();
testFactory.run(result);
assertEquals(1, result.runCount());
assertEquals(0, result.errorCount());
assertEquals(0, result.failureCount());
}
public void testOneTestMethodFailure() {
TestMethodFactory testFactory =
new TestMethodFactory(_testClass, "testFailure");
assertEquals(1, testFactory.countTestCases());
assertEqualsTestSuite(_oneMethodTestSuite, testFactory.getTest());
TestResult result = new TestResult();
testFactory.run(result);
assertEquals(1, result.runCount());
assertEquals(0, result.errorCount());
assertEquals(1, result.failureCount());
}
public void testOneTestMethodNoSuchMethod() {
TestMethodFactory testFactory =
new TestMethodFactory(_testClass, "testFoo");
TestResult result = new TestResult();
testFactory.run(result);
assertEquals(1, result.runCount());
assertEquals(0, result.errorCount());
assertEquals(1, result.failureCount());
}
public void testOneTestMethodSameSuiteForSameThread() {
TestFactory testFactory =
new TestMethodFactory(_testClass, "testSuccess");
Test test1 = testFactory.getTest();
Test test2 = testFactory.getTest();
assertSame(test1, test2);
assertEqualsTestSuite(_oneMethodTestSuite, test1);
assertEqualsTestSuite(_oneMethodTestSuite, test2);
}
public void testOneTestMethodDifferentSuiteForDifferentThread() {
TestFactory testFactory =
new TestMethodFactory(_testClass, "testSuccess");
MockRunnable runner1 = new MockRunnable(testFactory);
MockRunnable runner2 = new MockRunnable(testFactory);
Thread thread1 = new Thread(runner1);
Thread thread2 = new Thread(runner2);
thread1.start();
thread2.start();
sleep();
Test test1 = runner1.getTest();
Test test2 = runner2.getTest();
assertTrue(!(test1 == test2));
assertEqualsTestSuite(_oneMethodTestSuite, test1);
assertEqualsTestSuite(_oneMethodTestSuite, test2);
}
public void testClassNotATestCase() {
// Each factory is checked in its own try block; otherwise the first
// expected exception would skip the TestMethodFactory case.
try {
TestFactory testFactory = new TestFactory(java.lang.String.class);
fail("Class not assignable to TestCase!");
} catch(IllegalArgumentException success) {
}
try {
TestFactory testFactory = new TestMethodFactory(java.lang.String.class, "");
fail("Class not assignable to TestCase!");
} catch(IllegalArgumentException success) {
}
}
protected void sleep() {
try {
Thread.sleep(250);
} catch (InterruptedException ignored) {
}
}
protected void assertEqualsTestSuite(Test t1, Test t2) {
assertTrue(t1 instanceof TestSuite);
assertTrue(t2 instanceof TestSuite);
TestSuite suite1 = (TestSuite)t1;
TestSuite suite2 = (TestSuite)t2;
assertEquals(suite1.countTestCases(), suite2.countTestCases());
assertEquals(suite1.getName(), suite2.getName());
}
private static class MockRunnable implements Runnable {
private TestFactory _factory;
private Test _test;
public MockRunnable(TestFactory factory) {
_factory = factory;
}
public void run() {
_test = _factory.getTest();
}
public Test getTest() {
return _test;
}
}
public static Test suite() {
return new TestSuite(TestFactoryTest.class);
}
public static void main(String args[]) {
junit.textui.TestRunner.run(suite());
}
}
junitperf-1.9.1.orig/test/com/clarkware/junitperf/TimedTestTest.java
package com.clarkware.junitperf;
import junit.framework.*;
/**
* The TimedTestTest is a TestCase
* for the TimedTest class.
*
* @author Mike Clark
* @author Clarkware Consulting, Inc.
* @author Ervin Varga
*
* @see junit.framework.TestCase
*/
public class TimedTestTest extends TestCase {
private Test _oneSecondTest;
private Test _oneSecondFailedTest;
private Timer _twoSecondDelayTimer;
public static final long tolerance = 100;
public TimedTestTest(String name) {
super(name);
_oneSecondTest =
new MockTest("testOneSecondExecutionTime");
_oneSecondFailedTest =
new MockTest("testOneSecondExecutionTimeWithFailure");
_twoSecondDelayTimer = new ConstantTimer(2000);
}
public void testOneSecondResponseDefault() {
Test test = new TimedTest(_oneSecondTest, 1000 + tolerance);
assertEquals(1, test.countTestCases());
TestResult result = new TestResult();
test.run(result);
assertEquals(1, result.runCount());
assertEquals(0, result.errorCount());
assertEquals(0, result.failureCount());
}
public void testOneSecondResponseNoWaitForCompletion() {
Test test = new TimedTest(_oneSecondTest, 1000 + tolerance, false);
assertEquals(1, test.countTestCases());
TestResult result = new TestResult();
test.run(result);
assertEquals(1, result.runCount());
assertEquals(0, result.errorCount());
assertEquals(0, result.failureCount());
}
public void testOneSecondResponseWaitForCompletion() {
Test test = new TimedTest(_oneSecondTest, 1000 + tolerance, true);
assertEquals(1, test.countTestCases());
TestResult result = new TestResult();
test.run(result);
assertEquals(1, result.runCount());
assertEquals(0, result.errorCount());
assertEquals(0, result.failureCount());
}
public void testOneSecondResponseFailure() {
Test test = new TimedTest(_oneSecondTest, 900);
assertEquals(1, test.countTestCases());
TestResult result = new TestResult();
test.run(result);
assertEquals(1, result.runCount());
assertEquals(0, result.errorCount());
assertEquals(1, result.failureCount());
}
public void testOneSecondResponseOneUserLoadSuccess() {
Test loadTest = new LoadTest(_oneSecondTest, 1);
Test test = new TimedTest(loadTest, 1000 + tolerance);
assertEquals(1, test.countTestCases());
TestResult result = new TestResult();
test.run(result);
assertEquals(1, result.runCount());
assertEquals(0, result.errorCount());
assertEquals(0, result.failureCount());
}
public void testOneSecondResponseOneUserLoadFailure() {
Test loadTest = new LoadTest(_oneSecondTest, 1);
Test test = new TimedTest(loadTest, 900);
assertEquals(1, test.countTestCases());
TestResult result = new TestResult();
test.run(result);
assertEquals(1, result.runCount());
assertEquals(0, result.errorCount());
assertEquals(1, result.failureCount());
}
public void testOneSecondResponseMultiUserLoadSuccess() {
Test loadTest = new LoadTest(_oneSecondTest, 2);
Test test = new TimedTest(loadTest, 1500);
assertEquals(2, test.countTestCases());
TestResult result = new TestResult();
test.run(result);
assertEquals(2, result.runCount());
assertEquals(0, result.errorCount());
assertEquals(0, result.failureCount());
}
public void testOneSecondResponseMultiUserLoadFailure() {
Test loadTest = new LoadTest(_oneSecondTest, 2);
Test test = new TimedTest(loadTest, 1000);
assertEquals(2, test.countTestCases());
TestResult result = new TestResult();
test.run(result);
assertEquals(2, result.runCount());
assertEquals(0, result.errorCount());
assertEquals(1, result.failureCount());
}
public void testOneSecondResponseMultiUserLoadTwoSecondDelaySuccess() {
Test loadTest = new LoadTest(_oneSecondTest, 2, _twoSecondDelayTimer);
Test test = new TimedTest(loadTest, 4000 + tolerance);
assertEquals(2, test.countTestCases());
TestResult result = new TestResult();
test.run(result);
assertEquals(2, result.runCount());
assertEquals(0, result.errorCount());
assertEquals(0, result.failureCount());
}
public void testOneSecondResponseMultiUserLoadTwoSecondDelayFailure() {
Test loadTest = new LoadTest(_oneSecondTest, 2, _twoSecondDelayTimer);
Test test = new TimedTest(loadTest, 3700 + tolerance);
assertEquals(2, test.countTestCases());
TestResult result = new TestResult();
test.run(result);
assertEquals(2, result.runCount());
assertEquals(0, result.errorCount());
assertEquals(1, result.failureCount());
}
/**
* This test will succeed properly only when the timed
* test does not wait for the decorated test completion.
* Otherwise, if the timed test waits for the decorated
* test completion, this test will hang indefinitely.
*/
public void testInfiniteNoWaitForCompletion() {
Test mockTest = new MockTest("testInfiniteExecutionTime");
Test test = new TimedTest(mockTest, 1000 + tolerance, false);
assertEquals(1, test.countTestCases());
TestResult result = new TestResult();
test.run(result);
assertEquals(1, result.runCount());
assertEquals(0, result.errorCount());
assertEquals(1, result.failureCount());
}
/**
* This test will succeed properly regardless of whether the
* timed test waits for the decorated test completion or
* not. However, when the timed test does not wait for the
* decorated test completion, it doesn't need to waste
* time waiting for the test to complete only to then
* signal a failure.
*/
public void testLongResponseNoWaitForCompletion() {
Test mockTest = new MockTest("testLongExecutionTime");
Test test = new TimedTest(mockTest, 2000 + tolerance, false);
assertEquals(1, test.countTestCases());
TestResult result = new TestResult();
test.run(result);
assertEquals(1, result.runCount());
assertEquals(0, result.errorCount());
assertEquals(1, result.failureCount());
}
/**
* This test will cause the test to hang indefinitely.
* The test will not properly fail after expiration
* of the specified maximum elapsed time.
*/
/*
public void testInfiniteWaitForCompletion() {
Test mockTest = new MockTest("testInfiniteExecutionTime");
Test test = new TimedTest(mockTest, 1000 + tolerance, true);
assertEquals(1, test.countTestCases());
TestResult result = new TestResult();
test.run(result);
assertEquals(1, result.runCount());
assertEquals(0, result.errorCount());
assertEquals(1, result.failureCount());
}
*/
public void testOneSecondResponseSuccessWaiting() {
Test test = new TimedTest(_oneSecondFailedTest, 1000 + tolerance, true);
assertEquals(1, test.countTestCases());
TestResult result = new TestResult();
test.run(result);
assertEquals(1, result.runCount());
assertEquals(0, result.errorCount());
assertEquals(1, result.failureCount());
}
public void testOneSecondResponseSuccessNonWaiting() {
Test test = new TimedTest(_oneSecondFailedTest, 1000 + tolerance, false);
assertEquals(1, test.countTestCases());
TestResult result = new TestResult();
test.run(result);
assertEquals(1, result.runCount());
assertEquals(0, result.errorCount());
assertEquals(1, result.failureCount());
}
/**
* When the timed test waits for completion, the effects of
* the decorated test are still traced and recorded.
*/
public void testOneSecondResponseFailureWaiting() {
Test test = new TimedTest(_oneSecondFailedTest, 900, true);
assertEquals(1, test.countTestCases());
TestResult result = new TestResult();
test.run(result);
assertEquals(1, result.runCount());
assertEquals(0, result.errorCount());
assertEquals(2, result.failureCount());
}
/**
* Failure(s) from a decorated test will not be detected
* after the expiration of the max elapsed time in a non-waiting
* timed test. This can cause ambiguities in the test,
* especially when a decorated test has a varying execution time.
*
* For example, if the decorated test finishes its execution in
* 800 ms, then failureCount() will return 1 because the decorated
* test itself has failed. However, if timing causes a failure
* (the decorated test needed more than 900 ms to complete its execution),
* failureCount() will again be 1. In that case, the root cause of the
* failure is ambiguous.
*/
public void testOneSecondResponseNonWaitingWithAmbiguousFailure() {
Test test = new TimedTest(_oneSecondFailedTest, 900, false);
assertEquals(1, test.countTestCases());
TestResult result = new TestResult();
test.run(result);
assertEquals(1, result.runCount());
assertEquals(0, result.errorCount());
assertEquals(1, result.failureCount());
}
/**
* Differentiates the cause of a non-waiting timed test's
* failure as resulting from either the max elapsed time being
* exceeded or the test itself failing.
*/
public void testOneSecondResponseNonWaitingWithTimeFailure() {
TimedTest test =
new TimedTest(_oneSecondFailedTest, 900, false);
assertEquals(1, test.countTestCases());
TestResult result = new TestResult();
test.run(result);
if (test.outOfTime()) {
// success
} else {
fail("Max elapsed time exceeded!");
}
}
/**
* Differentiates the cause of a non-waiting timed test's
* failure as resulting from either the max elapsed time being
* exceeded or the test itself failing.
*/
public void testOneSecondResponseNonWaitingWithTestFailure() {
TimedTest test =
new TimedTest(_oneSecondFailedTest, 1100, false);
assertEquals(1, test.countTestCases());
TestResult result = new TestResult();
test.run(result);
if (test.outOfTime()) {
fail("Should never get here!");
} else {
//
// Failed due to invalid test state.
//
assertEquals(1, result.runCount());
assertEquals(0, result.errorCount());
assertEquals(1, result.failureCount());
}
}
public static Test suite() {
return new TestSuite(TimedTestTest.class);
}
public static void main(String args[]) {
junit.textui.TestRunner.run(suite());
}
}
junitperf-1.9.1.orig/CHANGES
JUnitPerf Change Log
Version 1.9 - 2/16/04
----------------------
- When using the Swing test runner, the progress bar now turns red
when a TimedTest fails. In prior releases, when a test failed, a
failure message was printed but the progress bar stayed green.
Thanks to those folks who do use the graphical runner for
performance tests for pointing out this bug!
Version 1.8 - 9/3/02
--------------------
- Documentation edits and additions.
- Added the ExampleThroughputUnderLoadTest and
ExampleResponseTimeUnderLoadTest to the samples.
- Added the TimedTest.setQuiet() method to optionally disable output
of the test's elapsed time.
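  A small usage fragment, assuming a TimedTest wrapped around the MockTest
  class from the test sources (the 1000 ms limit is arbitrary):
  // Illustrative fragment: suppress the "TimedTest ...: N ms" console output.
  TimedTest timedTest = new TimedTest(new MockTest("testSuccess"), 1000);
  timedTest.setQuiet();
  timedTest.run(new junit.framework.TestResult());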
Version 1.7 - 2/26/02
---------------------
- The TestMethodFactory can be used to load test a single test method
while ensuring that each concurrent user thread uses a thread-local
instance of the test.
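  A rough sketch of that usage, mirroring the constructors exercised in
  TestFactoryTest and LoadTestTest (the user count of 10 and the target
  method are illustrative):
  // Illustrative sketch: 10 concurrent users each run "testSuccess" on a
  // thread-local MockTest instance supplied by the factory.
  Test factory = new TestMethodFactory(MockTest.class, "testSuccess");
  Test loadTest = new LoadTest(factory, 10);
  loadTest.run(new junit.framework.TestResult());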
Version 1.6 - 11/23/01
----------------------
- If a threaded test in a load test has been stopped, either by using
the "Stop" button of the Swing UI or using the haltonfailure="yes"
attribute of a JUnit Ant task, the stopped or failed test is
cancelled and the currently active threaded tests of the load test
are allowed to complete. Prior to this upgrade, if a threaded test
was stopped, the load test would hang while waiting for the threaded
test to report its completion.
Version 1.5 - 9/8/01
---------------------
- Added the TestFactory class to allow stateful tests to be decorated
as LoadTest instances. Use of a TestFactory ensures that each
LoadTest thread uses its own decorated test instance.
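  A hedged sketch of that pattern, mirroring testTestStateConsistencyWithTestFactory
  in LoadTestTest (the 5 users and 2 iterations are arbitrary):
  // Illustrative sketch: every LoadTest thread receives its own
  // MockTestWithState instance, so per-test state is never shared.
  Test factory = new TestFactory(MockTestWithState.class);
  Test loadTest = new LoadTest(factory, 5, 2);
  loadTest.run(new junit.framework.TestResult());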
Version 1.4 - 6/12/01
---------------------
- A TimedTest can now be constructed to fail immediately if the
maximum elapsed time of the decorated test is exceeded. In other
words, the TimedTest will not wait for the decorated test to run to
completion if the maximum elapsed time is exceeded.
- The TimedTest.outOfTime() method was added to unambiguously
determine whether the test failed due to the maximum elapsed time
being exceeded or the test itself failing.
- The LoadTest class now supports enforcing test atomicity using the
setEnforceTestAtomicity() method. By default, test atomicity is not
enforced for test cases that spawn threads, either directly or
indirectly.
- The TimedTest.toString() method now includes an indication of
whether the timed test will wait for test completion (WAITING) or
wait for the maximum elapsed time to expire (NON-WAITING).
- The LoadTest.toString() method now includes an indication of whether
the load test will enforce test atomicity by waiting for control to
return (ATOMIC) or waiting for all threaded tests to complete
(NON-ATOMIC).
Version 1.3 - 5/11/01
---------------------
- The LoadTest class now employs a ThreadBarrier to allow threads
spawned directly by a load test to properly signal their
completion. Threads spawned by decorated tests, either directly or
indirectly, without a specified thread group are added to the
ThreadedTestGroup by default. This was causing the active count of
the thread group to never fall to 0, thereby causing the load test
to hang indefinitely.
Version 1.2 - 4/23/01
---------------------
- Replaced the ThreadBarrier with a ThreadedTestGroup to catch and
handle uncaught exceptions thrown by threads spawned by
ThreadedTest. This improves thread safety and supports test
atomicity (as defined by transaction processing) when enabled.
- Added several variants of LoadTest constructors for convenience and
extensibility.
- Updated JUnitPerf.html and ExampleLoadTest.java to include more
examples for constructing LoadTest instances with various
constructors.
Version 1.1 - 3/3/01
--------------------
- Initial public release
junitperf-1.9.1.orig/LICENSE
Copyright (C) 2001 Clarkware Consulting, Inc.
All Rights Reserved.
Redistribution and use in source and binary forms, with or without
modification, are permitted provided that the following conditions
are met:
1. Redistributions of source code must retain the above copyright
notice, this list of conditions and the following disclaimer.
2. Redistributions in binary form must reproduce the above copyright
notice, this list of conditions and the following disclaimer in the
documentation and/or other materials provided with the distribution.
3. Neither the name of Clarkware Consulting, Inc. nor the names of its
contributors may be used to endorse or promote products derived
from this software without prior written permission. For written
permission, please contact clarkware@clarkware.com.
THIS SOFTWARE IS PROVIDED ``AS IS'' AND ANY EXPRESSED OR IMPLIED WARRANTIES,
INCLUDING, BUT NOT LIMITED TO, THE IMPLIED WARRANTIES OF MERCHANTABILITY AND
FITNESS FOR A PARTICULAR PURPOSE ARE DISCLAIMED. IN NO EVENT SHALL
CLARKWARE CONSULTING OR ITS CONTRIBUTORS BE LIABLE FOR ANY DIRECT, INDIRECT,
INCIDENTAL, SPECIAL, EXEMPLARY, OR CONSEQUENTIAL DAMAGES (INCLUDING, BUT NOT
LIMITED TO, PROCUREMENT OF SUBSTITUTE GOODS OR SERVICES; LOSS OF USE, DATA,
OR PROFITS; OR BUSINESS INTERRUPTION) HOWEVER CAUSED AND ON ANY THEORY OF
LIABILITY, WHETHER IN CONTRACT, STRICT LIABILITY, OR TORT (INCLUDING
NEGLIGENCE OR OTHERWISE) ARISING IN ANY WAY OUT OF THE USE OF THIS SOFTWARE,
EVEN IF ADVISED OF THE POSSIBILITY OF SUCH DAMAGE.
junitperf-1.9.1.orig/README
J U N I T P E R F
What is it?
-----------
JUnitPerf is a collection of JUnit test extensions for performance and
scalability testing.
The Latest Version
------------------
The latest version of JUnitPerf is available at
http://www.clarkware.com/software/junitperf.zip.
Documentation
-------------
Documentation is available in HTML format, in the docs/ directory.
For the installation and user manual, see docs/JUnitPerf.html.
For the API documentation, see docs/api/index.html.
Support
-------
If you have any questions, comments, enhancement requests, success
stories, or bug reports regarding JUnitPerf, or if you want to be
notified when new versions of JUnitPerf are available, please email
mike@clarkware.com.
Licensing
---------
This software is licensed under the terms you may find in the file
named "LICENSE" in this directory.
Thanks for using JUnitPerf!
junitperf-1.9.1.orig/build.xml