Import your JMeter test¶
PFLB Platform allows you to use Apache JMeter scripts for load testing.
Examples of using JMeter scripts in the PFLB Platform:
You can import and run a single script as a regular test.
You can create multiple copies of the container with the imported script to generate a high stress load.
You can import different scripts into a single test and monitor the complete application test in a single Grafana dashboard.
For each container, you can specify the region in which the script runs to simulate a geographically distributed load.
To create a test based on a JMeter script, follow these steps:
To upload tests from PFLB Platform and edit them in Apache JMeter, you can use a special plugin.
For more information about limitations of PFLB Platform when working with JMeter scripts, see Supported plugins and elements of JMX scripts.
Record a script¶
To create a script in JMeter, use the Apache JMeter documentation.
Import a script¶
In the upper menu, click + New Test.
In the Import data pane, click Import from JMeter. The dialog opens:
To upload a JMX file, click and select it, or drag and drop the file from a local folder to the JMX file upload area. The file size shouldn’t exceed 100 MB.
Upload any supporting files that the script uses, as you did in the previous step.
Click Upload. The Platform creates a container for the imported script. The test overview opens:
Note
If the script contains unsupported plugins, then the script isn’t imported. For more information, see Supported plugins and elements of JMX scripts.
Optional: Add SLAs.
Optional: Link a settings set.
Optional: Add a container using one of the following methods:
To copy the current container with the imported script, click the corresponding icon.
To create a new container, click + and import another script and supporting files.
In the container, review the imported files, thread groups, and load profile. If necessary, to replace the JMX script and supporting files, click Re-upload.
Note
For JMX scripts that use the WebSocket protocol, the thread group view shows the WEBSOCKET label and the samplers used.
The WebSocket Samplers by Peter Doornbosch plugin:
OPEN CONNECTION: WebSocket Open Connection Sampler
PING PONG: WebSocket Ping/Pong Sampler
SINGLE READ: WebSocket Single Read Sampler
SINGLE WRITE: WebSocket Single Write Sampler
REQUEST RESPONSE: WebSocket request-response Sampler
CLOSE CONNECTION: WebSocket Close Connection Sampler
The WebSocket Samplers by Maciej Zaleski plugin:
MESSAGES: WebSocket Sampler
Optional: Edit a load profile.
Optional: Configure the profile graph display. Follow these steps:
To merge the graphs of several steps, click the Merge steps toggle. The toggle becomes available when there are several steps in the thread group.
Select the testing time that you want to consider in detail on the graph.
In the Generators location drop-down list, select the region in which you run the test:
AWS Asia Pacific (Mumbai);
AWS Asia Pacific (Tokyo);
AWS Europe (Frankfurt);
AWS Europe (Ireland);
AWS US East (N. Virginia);
AWS US West (N. California).
Note
Only the AWS US East (N. Virginia) region is available for free users.
Add SLA¶
In the Overall settings area, click SLA.
Go to the required tab.
Click Add SLA, and configure the metrics:
Test
Select one of the metrics for the test:
Average response time. The average system response time to a request or transaction.
Error rate. Errors are calculated across requests only, excluding transactions.
Percentile 95. The response time value that is greater than 95% of all response times in the test.
Request per second. The number of requests sent per second.
Select one of the conditions: <= or >.
Enter a threshold of the metric (SLA).
Optional: In the fields Start and End, enter the period for which PFLB Platform calculates the metric at the end of the test.
Transaction
Select the controller and container.
Select the transaction and use case.
Select one of the metrics:
Average response time. The average system response time to a transaction.
Error rate. Errors are calculated for transactions only, excluding requests.
Percentile 95. The response time value that is greater than 95% of all response times for the transaction.
Transaction per second. The number of completed transactions per second.
Select one of the conditions: <= or >.
Enter a threshold of the metric (SLA).
Optional: In the fields Start and End, enter the period for which PFLB Platform calculates the metric at the end of the test.
Request
Select the sampler and container.
Select the request, transaction, and use case of the test.
Select one of the metrics:
Average response time. The average system response time to a request.
Error rate. Errors are calculated only when executing the request.
Percentile 95. The response time value that is greater than 95% of all response times for the request.
Request per second. The number of requests sent per second.
Select one of the conditions: <= or >.
Enter a threshold of the metric (SLA).
Optional: In the fields Start and End, enter the period for which PFLB Platform calculates the metric at the end of the test.
System metrics
Note
Before adding an SLA with system metrics, follow these steps:
To add an SLA with system metrics, follow these steps:
Enter the host name of the testing system.
Select one of the metrics:
Average CPU usage,
Average memory usage.
Select one of the conditions: <= or >.
Enter a threshold of the metric (SLA).
Optional: In the fields Start and End, enter the period for which PFLB Platform calculates the metric at the end of the test.
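As a hypothetical illustration of how such a check could work (the function and the sample format are assumptions, not the Platform's actual logic), an average-usage SLA over a Start/End window reduces to averaging the monitoring samples in the window and applying the chosen condition to the threshold:

```python
def system_metric_sla(samples, threshold, condition="<=", start_s=0, end_s=float("inf")):
    """Evaluate an average-usage SLA from (timestamp_s, value_pct) monitoring
    samples of the testing system, restricted to the Start/End window."""
    window = [v for t, v in samples if start_s <= t <= end_s]
    avg = sum(window) / len(window)  # average CPU or memory usage in the window
    passed = avg <= threshold if condition == "<=" else avg > threshold
    return avg, passed
```

For example, with CPU samples of 40%, 60%, and 80% and a threshold of 70%, the SLA `<= 70` passes over the whole test but fails if the Start field excludes the first sample only when the remaining average exceeds the threshold.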
Click Go back to test overview.
Link a settings set¶
Settings sets allow you to run tests with different settings. You can reuse any settings set in new tests.
Note
For JMX tests, only the monitoring and webhook settings apply.
To link a settings set to a test, follow these steps:
In the Overall settings area, click Settings set. The sidebar opens:
Select the settings set that you want to link to the test.
Click Link.
For more information, see Add a settings set.
Edit a load profile¶
In the container, in the Thread groups and workload area, click Edit. The page opens:
Optional: Configure the profile graph display. Follow these steps:
To merge the graphs of several steps, click the Merge steps toggle. The toggle becomes available when there are several steps in the thread group.
Select the testing time that you want to consider in detail on the graph.
Disable the thread groups you don’t want to use in the test.
For thread groups of the jp@gc - Ultimate Thread Group type, edit the load profile for each step:
Start delay (min). Delay before starting testing. Specified in minutes.
Duration (min). Test duration at maximum load with all VUsers running. Specified in minutes.
Ramp-up time (min). Time allocated to start all VUsers. If set to 0, all VUsers will start simultaneously. Specified in minutes.
Number of VUsers. The number of load threads. The load intensity depends on the number of virtual users, the timers, and the response time of the testing system.
Ramp-down time (min). Time allocated to stop all VUsers. If set to 0, all VUsers will stop simultaneously. Specified in minutes.
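Together, these five parameters define a trapezoid-shaped load step. The sketch below models the number of active VUsers at minute t under a linear-ramp assumption (an approximation of jp@gc - Ultimate Thread Group behavior, not PFLB Platform internals):

```python
def active_vusers(t, start_delay, ramp_up, duration, ramp_down, vusers):
    """Approximate number of running VUsers at minute t of one load step."""
    if t < start_delay:
        return 0                              # waiting out the start delay
    t -= start_delay
    if t < ramp_up:                           # VUsers start linearly
        return int(vusers * t / ramp_up)
    t -= ramp_up
    if t < duration:                          # steady state: all VUsers running
        return vusers
    t -= duration
    if t < ramp_down:                         # VUsers stop linearly
        return vusers - int(vusers * t / ramp_down)
    return 0                                  # step finished
```

With a 1-minute start delay, 2-minute ramp-up, 5-minute duration, and 2-minute ramp-down for 100 VUsers, the step holds all 100 VUsers between minutes 3 and 8 and reaches 0 again by minute 10.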
Add the required number of steps to the thread group and customize the load profile for each step, as you did in the previous step of this guide.
Click Go back to test overview.
Run a test¶
Run the test immediately after creating it, or later on the Tests page. For more information, see Run test.
To run the test, follow these steps:
Click Create test run. The sidebar opens:
Optional: Click Add label and enter the name of the label for the test. To display labels in the Test runs page, select the Copy labels from test checkbox. For more information, see Add labels to the test.
Optional: For Description, enter a test run comment. You will see this information in the test run details, in the list of all test runs, and in trend reports when selecting a specific test run.
Optional: To run a debugging test, click the Debug run toggle. For more information, see Run a debugging test.
Click Run test.
The test starts after a short delay.