Create a test#

Create a test in PFLB Platform in a few steps:

  1. Use cases & requests

  2. Load profile

  3. SLA

  4. Test overview

Use cases and requests#

To create a test, in the upper menu, click + New Test. The Use cases & requests step opens:

../_images/um_new_test.cloud.png

In this step, create use cases, add requests to them, add parameters, and change the default test settings.

Create a use case#

A use case is an atomic, hierarchically ordered set of elements that simulates user behavior. A use case can include transactions, HTTP requests, and logical elements such as conditions and loops. Logical elements let you vary the test scenario; transactions send their HTTP requests in the specified order during test execution.
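
For intuition, this hierarchy can be pictured as nested data. The Python sketch below is purely illustrative; the element names (transaction, loop, http_request) and the structure are assumptions, not PFLB Platform's internal format:

    # Hypothetical sketch of a use case hierarchy (not PFLB Platform's
    # internal format): a use case holds transactions, requests, and
    # logical elements in a fixed order.
    use_case = {
        "name": "Checkout flow",
        "elements": [
            {"type": "http_request", "method": "GET", "url": "/catalog"},
            {
                "type": "transaction",   # groups requests, sent in order
                "name": "Place order",
                "elements": [
                    {"type": "http_request", "method": "POST", "url": "/cart"},
                    {"type": "http_request", "method": "POST", "url": "/order"},
                ],
            },
            {   # logical element: repeat the nested request three times
                "type": "loop",
                "count": 3,
                "elements": [
                    {"type": "http_request", "method": "GET", "url": "/status"},
                ],
            },
        ],
    }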

You can create use cases manually, or automatically by importing data or copying previously created use cases.

Create a use case manually#

  1. Click + Add a use case.

  2. Go to the use case or click edit_button. The editor opens:

    ../_images/um_editor.en.png
  3. Add an HTTP or gRPC request.

  4. Optional: Use drag and drop to add transactions. You can place a transaction at the top level or inside another transaction.

  5. Click Save.

Import data#

In PFLB Platform, you can import data from several supported sources.

To import data into a test, in the Import data pane, click one of the buttons:

../_images/um_import_data_project.cloud.png

To import data into a specific use case, go to the use case and, in the Import data pane, click one of the buttons:

../_images/um_import_data_group.en.png

Copy use cases#

You can create a use case based on another one.

To copy a previously created use case, click copy_set_button.

The new use case will be called “copy <Name of the original use case>”.

Add parameters#

In PFLB Platform, you can add literal and CSV parameters. To add parameters, in the Parameters area, click CSV file or Literal:

../_images/um_add_parameters.cloud.png

For more information, see Add and use parameters.
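
For a rough idea of what a CSV parameter source can look like, the sketch below generates a sample file. The column names username and password are arbitrary examples, and the exact rules for parameter files are described in Add and use parameters:

    # Generate a sample CSV parameter file; the header row names the
    # parameters, and each following row supplies one set of values.
    import csv

    rows = [
        {"username": "user1", "password": "secret1"},
        {"username": "user2", "password": "secret2"},
    ]
    with open("users.csv", "w", newline="") as f:
        writer = csv.DictWriter(f, fieldnames=["username", "password"])
        writer.writeheader()
        writer.writerows(rows)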

Set the default test settings#

In PFLB Platform, you can set the default settings for cookies, headers, requests, and timers for the test. To set the default settings, in the Default settings pane, click one of the buttons:

../_images/um_default_settings.cloud.png

For more information, see Default settings.

Load profile#

In the lower-right corner, click Load profile. The next step in creating a test opens:

../_images/um_load_profile.cloud.png

In the Load profile step, select the profile distribution mode, design the test, and disable created use cases if needed.

Set profile distribution#

PFLB Platform has two modes for distributing virtual users. Percent mode makes a test easy to configure, while Users mode lets you design the load independently for each use case.

Select the virtual user distribution mode:

  • Percent. The distribution of virtual users between use cases is set as a percentage. Use one of the following methods:

    • Enter the percentage of users for each use case. The sum must equal 100% (see the sketch after this list).

    • Click the Distribute users automatically toggle. If enabled, users are evenly distributed between use cases.

  • Users. Design the test independently for each use case.
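
The sketch referenced above shows how a Percent distribution maps onto concrete numbers of virtual users; the total and the use case names are made-up examples:

    # Map a Percent distribution onto VUsers (example values only).
    total_vusers = 500
    percentages = {"login": 20, "search": 50, "checkout": 30}

    assert sum(percentages.values()) == 100, "the sum must equal 100%"
    vusers_per_case = {name: total_vusers * pct // 100
                       for name, pct in percentages.items()}
    print(vusers_per_case)  # {'login': 100, 'search': 250, 'checkout': 150}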

Note

If you configure the load for several use cases in Users mode and then switch the test to Percent mode, the per-use-case settings will be lost.

Select test type#

In Percent mode, select the test type and design the test.

A stable test runs a constant load for a given time after a gradual ramp-up. The test reaches the specified load parameters within the specified ramp-up time (see the timeline sketch after this list):

../_images/um_test_parametres1.en.png
  • Duration (min). Test duration at maximum load with all VUsers running. Specified in minutes.

  • Ramp-up time (min). Time allocated to start all VUsers. If set to 0, all VUsers will start simultaneously. Specified in minutes.

  • Number of VUsers. The number of load threads. The load intensity depends on the number of virtual users, the timers, and the response time of the system under test.

  • Ramp-down time (min). Time allocated to stop all VUsers. If set to 0, all VUsers will stop simultaneously. Specified in minutes.
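
The timeline sketch mentioned above, with arbitrary example values (not platform defaults):

    # Stable test timeline arithmetic (example values, not defaults).
    ramp_up_min = 5     # Ramp-up time (min): all VUsers start in this window
    duration_min = 30   # Duration (min): time at maximum load
    ramp_down_min = 5   # Ramp-down time (min): all VUsers stop in this window
    vusers = 200        # Number of VUsers

    total_min = ramp_up_min + duration_min + ramp_down_min
    print(f"{vusers} VUsers at peak, total test time: {total_min} min")
    # 200 VUsers at peak, total test time: 40 min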

A scalability test finds the system's maximum capacity by gradually increasing the load at each step (see the arithmetic sketch after this list):

../_images/um_test_parametres2.en.png
  • Number of steps. Number of steps before the test reaches maximum load.

  • Step VUsers increment. Number of VUsers started at each step.

  • Step duration (min). Duration of one step. Specified in minutes.

  • Duration at max load (min). Test duration at maximum load when all VUsers are running.

  • Ramp-up time for step (min). Time allocated to ramp up the VUsers started within the step. If set to 0, all VUsers will start simultaneously. Specified in minutes.

  • Ramp-down time (min). Time allocated to stop all VUsers. If set to 0, all VUsers will stop simultaneously. Specified in minutes.
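
The arithmetic sketch mentioned above; it assumes the per-step ramp-up happens inside the step duration, and all numbers are arbitrary examples:

    # Scalability test arithmetic (example values; assumes each step's
    # ramp-up is contained within the step duration).
    steps = 5
    step_increment = 50       # Step VUsers increment
    step_duration_min = 10    # Step duration (min)
    duration_max_min = 15     # Duration at max load (min)
    ramp_down_min = 5         # Ramp-down time (min)

    max_vusers = steps * step_increment
    total_min = steps * step_duration_min + duration_max_min + ramp_down_min
    print(f"Peak: {max_vusers} VUsers, total test time: {total_min} min")
    # Peak: 250 VUsers, total test time: 70 min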

To change the maximum value of a parameter, enter the value manually; the maximum will then increase by 10%.

In Users mode, configure the load profile for each use case:

../_images/um_test_parametres3.cloud.png
  • Number of steps. Number of steps before the test reaches maximum load.

  • Step VUsers increment. Number of VUsers started at each step.

  • Step duration (min). Duration of one step. Specified in minutes.

  • Duration at max load (min). Test duration at maximum load when all VUsers are running.

  • Ramp-up time for step (min). Time allocated to ramp up the VUsers started within the step. If set to 0, all VUsers will start simultaneously. Specified in minutes.

  • Ramp-down time (min). Time allocated to stop all VUsers. If set to 0, all VUsers will stop simultaneously. Specified in minutes.

Enable and disable use cases#

After creating use cases, you can enable or disable them:

  • Each use case separately:

    ../_images/um_group_toggle.png
  • All use cases at the same time. Click Actions ⋮ and select Enable all use cases or Disable all use cases.

SLA#

In the lower-right corner, click SLA. The next step in creating a test opens:

../_images/um_add_sla.en.png

In the SLA step, add SLAs for different scopes: tests, transactions, requests, and system metrics.

An SLA (service-level agreement) is a non-functional requirement that defines the success criteria for the tests performed.

To add an SLA, follow these steps:

  1. Go to the required tab.

  2. Click + Add SLA, and configure the metrics:

    • Test
      ../_images/um_sla_test.en.png
      1. Select one of the metrics for the test:

        • Average response time. Average system response time to a request or transaction.

        • Error rate. The error rate is calculated over all requests, excluding transactions.

        • Percentile 95. The value that is greater than 95% of the response times in the test (see the evaluation sketch after this procedure).

        • Request per second. Number of requests sent per second.

      2. Select one of the conditions: <= or >.

      3. Enter the threshold for the metric (the SLA value).

      4. Optional: In the Start and End fields, enter the period over which PFLB Platform calculates the metric at the end of the test.

    • Transaction
      ../_images/um_sla_transaction.en.png
      1. Select the transaction and use case.

      2. Select one of the metrics:

        • Average response time. Average system response time to the transaction.

        • Error rate. The error rate is calculated over the transaction only, excluding individual requests.

        • Percentile 95. The value that is greater than 95% of the response times for the transaction.

        • Request per second. Number of requests sent per second.

      3. Select one of the conditions: <= or >.

      4. Enter the threshold for the metric (the SLA value).

      5. Optional: In the Start and End fields, enter the period over which PFLB Platform calculates the metric at the end of the test.

    • Request
      ../_images/um_sla_request.en.png
      1. Select the request, transaction, and use case of the test.

      2. Select one of the metrics:

        • Average response time. Average system response time to a request.

        • Error rate. The error rate is calculated for the request only.

        • Percentile 95. The value that is greater than 95% of the response times for the request.

        • Request per second. Number of requests sent per second.

      3. Select one of the conditions: <= or >.

      4. Enter the threshold for the metric (the SLA value).

      5. Optional: In the Start and End fields, enter the period over which PFLB Platform calculates the metric at the end of the test.

    • System metrics

      Note

      Before adding an SLA with system metrics, follow these steps:

      1. Create a settings set with system metrics for monitoring.

      2. Link the settings set to the test.

      ../_images/um_sla_system_metrics.en.png

      To add an SLA with system metrics, follow these steps:

      1. Enter the host name of the system under test.

      2. Select one of the metrics:

        • Average CPU usage

        • Average memory usage

      3. Select one of the conditions: <= or >.

      4. Enter the threshold for the metric (the SLA value).

      5. Optional: In the Start and End fields, enter the period over which PFLB Platform calculates the metric at the end of the test.

  3. Click Save or Test overview.
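
The evaluation sketch referenced in the metric descriptions above: a minimal illustration of how a Percentile 95 threshold can be judged, using the common nearest-rank method. The response times are sample data, and PFLB Platform's exact percentile formula may differ:

    # Evaluate a "Percentile 95 <= threshold" SLA (nearest-rank method).
    import math

    response_times_ms = sorted([120, 135, 150, 160, 180, 210,
                                250, 300, 420, 900])
    threshold_ms = 500

    rank = math.ceil(0.95 * len(response_times_ms))  # 1-based rank
    p95 = response_times_ms[rank - 1]
    print(f"p95 = {p95} ms; SLA {'met' if p95 <= threshold_ms else 'violated'}")
    # p95 = 900 ms; SLA violated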

Test overview#

In the lower-right corner, click Test overview. The last step in creating a test opens:

../_images/um_test_overview.cloud.png

In the Test overview step, check the test's use cases, load profile, and SLAs before running. Edit them if needed, and link a settings set.

Optional: Configure the profile graph display:

  • To merge the graphs of use cases, click the Merge steps toggle.

  • To view the graph in detail, select the test time.

Save the test and its version#

A test version is created:

  • each time you save changes to the test,

  • when you move on to the next step of test creation.

To edit or run a specific version of the test, in the upper-right corner, click the drop-down list and select the version:

../_images/um_project_version.png

To rename the version of the test, follow these steps:

  1. In the upper-right corner, click the versions drop-down list.

  2. Hover over the required version and click edit_button:

    ../_images/um_project_version_comment.png
  3. Enter the name of the version and click ok_version_button.

To copy the version of the current test into a new test, follow these steps:

  1. Select the version of the test and click copy_ver_button. The dialog opens:

    ../_images/um_copy_version.en.png
  2. Enter a name for the test and click Create test.

Run a test#

Run the test immediately after creating it, or later from the Tests page. For more information, see Run test.

To run the test, follow these steps:

  1. Click Create test run. The sidebar opens:

    ../_images/um_run_test.en.png
  2. Optional: Click Add label and enter the name of the label for the test. To display labels in the Test runs page, select the Copy labels from test checkbox. For more information, see Add labels to the test.

  3. Optional: For Description, enter a test run comment. You can see this information in the test run details, in the list of all test runs, and in trend reports when selecting a specific test run.

  4. In the drop-down list, select the region in which to run the test:

    • automatic. Automatic region selection.

    • AWS Asia Pacific (Mumbai)

    • AWS Asia Pacific (Tokyo)

    • AWS Europe (Frankfurt)

    • AWS Europe (Ireland)

    • AWS US East (N. Virginia)

    • AWS US West (N. California)

    Note

    Only the automatic and AWS US East (N. Virginia) values are available for free users.

  5. Optional: To run a debugging test, click the Debug run toggle. For more information, see Run a debugging test.

  6. Click Run test.

The test will start after a short delay.