Reporting DSL

Architecture

The Reporting DSL is built around the ability to inject a reporting client into the main test code with just a few annotations. The client can then be used to report the progress of the test.

As opposed to the concept of remote execution, where proxy agents can be used to call the systems under test, here the reporting client lives directly in the runtime where the tests are run, usually a pipeline or localhost. That runtime therefore needs direct access to the testing tool.

Reporting depends heavily on the tool used for the test catalogue and test run reporting. Today Pumpo#5 uses an abstraction that might be extended or modified as new tools are added to the list.

A key part is configuring the mapping of the test to the right test case. This is done using the @ReportTo annotation with its value set to the test case id.

There are two main ways to use the reporting DSL. The one that is easier to implement is to add the reporting client as another parameter of the test method. The parameter type should be the Reporter interface from Pumpo#5, or an interface extending it. The methods provided by that interface can then be used.

Example:


@Capability(key = "browserName", value = "chrome")
public interface MySimpleWebApp extends WebApplication {
    default LoginPage openLoginPage() {
        return this.open(LoginPage.class);
    }
}

@Navigate("url.com/login")
public interface LoginPage {

    @SetValue("email")
    LoginPage fillOutEmail(String email);

    @Click("loginButton")
    LoginPage clickLogin();

    @AssertElementContent("loggedInDiv")
    LoginPage verifyThatLoggedIn(String expectedText);

}

class TestClass {

    @Test
    @ReportTo("TA-305")
    public void sampleTest(
            @Capability(key = "browserName", value = "pn5-chrome")
            MySimpleWebApp myApp,
            Reporter reporter) {

        reporter.startTestcase();
        LoginPage loginPage = myApp.openLoginPage();
        reporter.completeStepPassed();
        loginPage.fillOutEmail("test@test.com");
        reporter.completeStepPassed();
        loginPage.clickLogin();
        reporter.completeStepPassed();
        loginPage.verifyThatLoggedIn("You are logged in!");
        reporter.completeTestcasePassed();
    }
}

In the background, Pumpo#5 initialises the reporter client based on the necessary configuration provided in the configuration files and reports the progress of the test to the right test case (id TA-305 in this case) in the right test plan / pack. If there is any failure (exception), it automatically reports the failure, with additional details, on the test step where the failure occurred.

The issue with the previous example is that the fluent interface breaks every time the success of a test step needs to be reported: additional statements are required to comply with Java syntax. To overcome this, there is a more advanced way to use the Reporter interface: extending the Reportable<> interface in all objects where reporting will be used:

@Capability(key = "browserName", value = "chrome")
public interface MyWebApp extends WebApplication, Reportable<MyWebApp> {
    default LoginPage openLoginPage() {
        return this.open(LoginPage.class);
    }
}

@Navigate("url.com/login")
public interface LoginPage extends Reportable<LoginPage> {

    @SetValue("email")
    LoginPage fillOutEmail(String email);

    @Click("loginButton")
    LoginPage clickLogin();

    @AssertElementContent("loggedInDiv")
    LoginPage verifyThatLoggedIn(String expectedText);

}

The flow of the test then becomes:

class TestClass {

    @Test
    @ReportTo("TA-305")
    public void sampleTest(
            @Capability(key = "browserName", value = "pn5-chrome")
            MyWebApp myApp,
            Reporter reporter) {

        myApp
            .report().startTestcase()
            .openLoginPage()
            .report().completeStepPassed()
            .fillOutEmail("test@test.com")
            .report().completeStepPassed()
            .clickLogin()
            .report().completeStepPassed()
            .verifyThatLoggedIn("You are logged in!")
            .report().completeTestcasePassed();
    }
}

Reportable<MyClass> uses Java generics, where MyClass is the type returned once a method of the Reporter interface has been used (startTestcase, completeStepPassed, etc.).

Both approaches produce the same result.

Method reference

Reporter::startTestcase

Starts the test case that is mapped to the test via the @ReportTo annotation by setting it to in progress.

Reporter::setResult

Parameters:
  result (TestcaseResult): the result of the test case

Sets the result of the test case to a specific value of the TestcaseResult enum: PASSED_UNTIL_FAILURE, PASSED or FAILED. The default is PASSED_UNTIL_FAILURE, but under some conditions it may be useful to set the final result already while continuing to report at step level.
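
For example, a minimal sketch of recording the final result up front while step-level reporting continues (reusing the reporter and loginPage objects from the earlier examples):

reporter.setResult(TestcaseResult.FAILED); // the final result is already known
loginPage.clickLogin();                    // remaining steps are still reported
reporter.completeStepPassed();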

Reporter::getResult

Returns the result of the test as a value of the TestcaseResult enum.

Reporter::getException

Returns the Throwable that has been previously recorded as the cause of the test case failure. Returns null if no such Throwable was set.
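
A minimal sketch of inspecting the recorded outcome at the end of a flow (the logging is illustrative):

if (reporter.getResult() == TestcaseResult.FAILED) {
    Throwable cause = reporter.getException(); // null when no failure was recorded
    System.err.println("Recorded failure: " + cause);
}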

Reporter::setStepFailureBehaviour

Parameters:
  behaviour (StepFailureBehaviour): the intended behaviour

Configures the current step failure behaviour to one of the values of the StepFailureBehaviour enum: STOP_AS_FAILED, CONTINUE_AS_PASSED_UNTIL_FAILURE, CONTINUE_AS_FAILED or CONTINUE_AS_PASSED. STOP_AS_FAILED is the default and causes the test to halt if a Throwable is thrown. If the test should continue, the other values define the result captured for the whole test case.

Note: Calling this method alone with a value such as CONTINUE_AS_FAILED is not sufficient to continue the test when an exception is thrown. The exception needs to be caught and passed to completeStepFailed, which uses the configured step failure behaviour to decide whether the exception is rethrown or swallowed so the test flow can continue (a sketch of this manual pattern follows below). Instead of manually catching the exception and passing it to completeStepFailed, we advise using the more concise doAsSingleStep method, which does all of this in one call.
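
A minimal sketch of that manual pattern, reusing the loginPage object from the earlier examples:

reporter.setStepFailureBehaviour(StepFailureBehaviour.CONTINUE_AS_FAILED);
try {
    loginPage.clickLogin();
    reporter.completeStepPassed();
} catch (Throwable t) {
    // With CONTINUE_AS_FAILED, completeStepFailed swallows the Throwable,
    // marks the step as failed and lets the test flow continue.
    reporter.completeStepFailed(t);
}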

Reporter::setStepFailureBehaviourDefault

Parameters:
  behaviour (StepFailureBehaviour): the intended behaviour

Configures the default step failure behaviour for the next steps. The current step failure behaviour is not affected. See setStepFailureBehaviour for details on how the value is used.
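
A short sketch of the distinction between the default and the current behaviour (the values are illustrative):

// Applies to all following steps, but not to the step currently in progress.
reporter.setStepFailureBehaviourDefault(StepFailureBehaviour.CONTINUE_AS_PASSED_UNTIL_FAILURE);
// Applies to the current step only.
reporter.setStepFailureBehaviour(StepFailureBehaviour.STOP_AS_FAILED);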

Reporter::skipStep()

Marks the current step as skipped. The status of the step might differ based on the actual implementation, e.g. instead of skipped the value might be todo.

Reporter::skipStep(String)

Parameters:
  comment (String): comment for the current step

Adds a comment to the current step and marks it as skipped. The status of the step might differ based on the actual implementation, e.g. instead of skipped the value might be todo.
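
A minimal sketch of skipping a step conditionally; the smsProviderAvailable flag is purely illustrative:

if (!smsProviderAvailable) {
    // The step cannot run in this environment; record why it was skipped.
    reporter.skipStep("SMS provider not available on this environment");
} else {
    loginPage.clickLogin();
    reporter.completeStepPassed();
}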

Reporter::commentStep

Parameters:
  comment (String): comment for the current step

Adds a comment to the current step without any other action. Based on the specific implementation any previous comment might be replaced if the reporting only allows one comment per step.

Reporter::completeStepPassed()

Marks the current step as passed and moves to the next step, if there is one.

Note: If there is no next step, the pointer stays at the current step and the next call to any of the methods setting a step result will override the step result again.

Reporter::completeStepPassed(String)

Parameters:
  comment (String): comment for the current step

Adds a comment to the current step, marks it as passed and moves to the next step, if there is one.

Note: If there is no next step, the pointer stays at the current step and the next call to any of the methods setting a step result will override the step result again.

Reporter::completeStepFailed()

Marks the current step as failed. If the failure behaviour is set to continue the test, it simply moves to the next step, if there is one. Otherwise, it throws a Throwable with an empty message to make JUnit stop the test immediately and report the failure.

Note: We advise using one of the other variants of this method instead and specifying the cause of the step failure.

Note: If there is no next step, the pointer stays at the current step and the next call to any of the methods setting a step result will override the step result again.

Reporter::completeStepFailed(String)

Parameters:
  comment (String): comment for the current step

Adds a comment to the current step and marks it as failed. If the failure behaviour is set to continue the test, it simply moves to the next step, if there is one. Otherwise, it throws a Throwable with the provided message to make JUnit stop the test immediately and report the failure.

Note: If there is no next step, the pointer stays at the current step and the next call to any of the methods setting a step result will override the step result again.

Reporter::completeStepFailed(Throwable)

Parameters:
  throwable (Throwable): the Throwable that caused the failure

Adds the stack trace of the provided Throwable as a comment to the current step and marks it as failed. If the failure behaviour is set to continue the test, it simply moves to the next step, if there is one. Otherwise, it rethrows the provided Throwable to make JUnit stop the test immediately and report the failure.

Note: If there is no next step, the pointer stays at the current step and the next call to any of the methods setting a step result will override the step result again.

Reporter::doAsSingleStep

Parameters:
  behaviour (StepFailureBehaviour): behaviour in case the step fails
  flow (Flow): the flow to execute; Flow is any lambda taking no arguments and returning nothing

First sets the current step failure behaviour to the provided value (STOP_AS_FAILED, CONTINUE_AS_PASSED_UNTIL_FAILURE, CONTINUE_AS_FAILED or CONTINUE_AS_PASSED), then executes the provided lambda while catching any exception and treating it as the cause of the step failure. This is equivalent to calling setStepFailureBehaviour with the provided behaviour and then try-catching the lambda: if an exception is thrown, completeStepFailed is called with the exception; otherwise completeStepPassed is called.

The resulting code is as concise as this:

reporter.startTestcase();
reporter.doAsSingleStep(CONTINUE_AS_FAILED, () -> {
    subStep1();
    subStep2();
});
reporter.completeTestcase();

Note: If there is no next step, the pointer stays at the current step and the next call to any of the methods setting a step result will override the step result again.

Reporter::completeTestcasePassed

Completes the test case with result PASSED irrespective of what result was considered until now.

Reporter::completeTestcaseFailed

Completes the test case with result FAILED irrespective of what result was considered until now.

Reporter::completeTestcase

Completes the test case with a result based on what was considered until now. Both PASSED_UNTIL_FAILURE and PASSED resolve to PASSED.

Generic configuration

Configuration files

reporting.enabled (boolean): Enables or disables reporting.
jira.username (String): Username to authenticate with JIRA in case the reporting tool is a JIRA plugin.
jira.password (String): Password to authenticate with JIRA in case the reporting tool is a JIRA plugin.
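
A sketch of these keys, assuming the same key=value properties format as the junit-platform.properties file shown at the end of this page (all values are placeholders):

reporting.enabled=true
jira.username=automation-user
jira.password=********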

Testing tools supported

TestFlo for JIRA

TestFlo is a JIRA plugin; it is supported and continuously updated as new versions of TestFlo are released. TestFlo for JIRA uses the following data model: test plans, test plan iterations, test case templates and test cases.

Configuration in configuration files

testflo.testplan.key (String): The key of the test plan to report progress to.
testflo.iteration.autostart (boolean): Whether a new iteration should be started when a new JUnit run is initiated. Default is false.
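
A sketch in the same assumed properties format (the test plan key is a placeholder):

testflo.testplan.key=TP-42
testflo.iteration.autostart=true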

Specification

  • The test plan to be used needs to be specified in the configuration files, and the @ReportTo annotation is used to specify the test case or test case template.
  • In case a test case template is used, Pumpo#5 will automatically find the test case that comes from the specified template and that is in the specified test plan.
  • In case the test plan does not contain the specified test case template or test case, an error is thrown and the test fails without any reporting.
  • A new iteration of the provided test plan can be started automatically when this is enabled in the configuration files. This is done only once for the whole JUnit run, even if several unrelated tests in various classes are run. A challenge in TestFlo is knowing when the new iteration has been created, because it is an asynchronous process and TestFlo currently provides no interface to check the status of that asynchronous task. Pumpo#5 therefore checks the test plan page every 15 seconds, where a progress bar shows how far the preparation of the new iteration has progressed. There is a currently hardcoded timeout of 10 minutes. In case the new iteration is not yet created, Pumpo#5 will nevertheless proceed with the tests; some of them might end up reported to the previous iteration as a consequence of the new iteration taking too long to create.
  • Reporter::commentStep will store the string comment on the current test step. This comment will be archived as part of the current iteration, as per standard TestFlo behaviour.

Xray for JIRA (Cloud version)

Xray is a JIRA plugin. Xray for JIRA uses the following data model: test executions and test cases.

Configuration in configuration files

xray.base.url (String): URL of the Xray cloud instance, e.g. https://xray.cloud.getxray.app.
xray.api.suffix (String): API suffix of the Xray cloud instance, e.g. /api/v2.
xray.client.id (String): Client ID for the Xray cloud instance.
xray.client.secret (String): Client secret for the Xray cloud instance.
xray.jira.base.url (String): Your Jira base URL.
xray.jira.api.suffix (String): Your Jira API suffix (e.g. /rest/api/3).
xray.jira.username (String): Jira username.
xray.jira.token (String): Jira token.
xray.test.exec.key (String): The key of the test execution to report progress to (e.g. XRAY-123).
xray.que.reporting.url (String, optional): If set, overrides the other Xray parameters and sends the requests to this URL of a custom service instead.
xray.que.reporting.path (String, optional): Path on the custom service to send the requests to.
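
A sketch of a minimal Xray configuration in the same assumed properties format (all values are placeholders):

xray.base.url=https://xray.cloud.getxray.app
xray.api.suffix=/api/v2
xray.client.id=<client-id>
xray.client.secret=<client-secret>
xray.jira.base.url=https://yourcompany.atlassian.net
xray.jira.api.suffix=/rest/api/3
xray.jira.username=automation-user
xray.jira.token=<api-token>
xray.test.exec.key=XRAY-123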

Specification

  • The test execution to be used needs to be specified in the configuration files, and the @ReportTo annotation is used to specify the test case; the value can also be an array of multiple test cases for a single test, as sketched below.
  • If a test case does not exist in the test execution, it is added automatically.
  • Every run overrides the previous run in the test execution.
  • Only PASS and FAIL statuses are reported, based on the test result.
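
A minimal sketch of reporting one test to multiple test cases, assuming @ReportTo accepts an array of keys as described above (the keys and the test body are illustrative):

@Test
@ReportTo({"XRAY-101", "XRAY-102"}) // both test cases receive the result of this single test
public void sampleTest(Reporter reporter) {
    reporter.startTestcase();
    // ... test steps reported as shown earlier ...
    reporter.completeTestcase();
}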

Prometheus

In case the user wishes to report a few specific metrics related to automated test execution to Prometheus, Pumpo#5 provides a dedicated reporter for that purpose.

The Prometheus Reporter is configured to collect metrics during a test execution, using JUnit extensions, and actively push those metrics to a Prometheus Pushgateway endpoint that is expected to be run and maintained on the user side.

Those metrics can then be used in any way the user wants, such as being scraped and visualized in a Grafana dashboard, for example.

The metrics currently reported are:

Suite metrics

suite_execution_total: Total number of tests in the test suite.
  Base labels:
    suite_name: a specific identifier for the test suite
    suite_url: any URL the user wants the test suite to point to, e.g. a CI/CD pipeline
    branch: the branch in the version control system from which the tests were run
suite_execution_failed: Total number of failing tests in the test suite. Same labels as above.
suite_execution_finish_time: Timestamp of when the last execution finished. Same labels as above.
suite_execution_duration_seconds: How long the suite took to finish, in seconds. Same labels as above.

Test case metrics

test_execution_duration_seconds: How long the test case took to finish, in seconds. Same labels as above, plus:
  test_case_name: the name of the test case
test_execution_finish_time: Timestamp of when the last test case execution finished. Same labels as above.
test_execution_total: Records a test case execution result in a label. Same labels as above, plus:
  result: the outcome of the test case, either passed or failed

Configuration in configuration files

pn5.reporting.prometheus.enabled (boolean): Enables the collection and pushing of metrics.
pn5.reporting.prometheus.testcases.track (boolean): Enables metrics for test cases. If disabled, only suite metrics will be created.
pn5.reporting.prometheus.endpoint (String): URL of the Pushgateway endpoint to which metrics should be pushed.
pn5.reporting.prometheus.suite (String): Name to be used for the suite_name label in metrics. If empty, a class name will be used as default, which can be confusing.
pn5.reporting.prometheus.suite_url (String): URL to be used as the value of the suite_url label in metrics.
pn5.reporting.prometheus.branch (String): The current development branch in the version control system. Used as the value of the branch label in metrics.
pn5.reporting.prometheus.job (String): Name to be given to the job when pushing metrics, and used for the job label. If empty, the suite name is used as default.
pn5.reporting.prometheus.labels (String): Custom labels and values to be pushed with the metrics, together with the base labels listed in the tables above. Should be in the format label1=value1,label2=value2,...
pn5.reporting.prometheus.username (String): The username, in case the endpoint requires basic authentication.
pn5.reporting.prometheus.password (String): The password, in case the endpoint requires basic authentication.
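
A sketch in the same assumed properties format (the endpoint, names and labels are placeholders):

pn5.reporting.prometheus.enabled=true
pn5.reporting.prometheus.testcases.track=true
pn5.reporting.prometheus.endpoint=https://pushgateway.example.com
pn5.reporting.prometheus.suite=checkout-regression
pn5.reporting.prometheus.suite_url=https://ci.example.com/job/checkout-regression
pn5.reporting.prometheus.branch=main
pn5.reporting.prometheus.labels=team=payments,stage=nightly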

How to add the Prometheus Reporter to your tests

First, add the core and reporting-prometheus dependencies to your pom.xml:

<dependencies>
  ...
  <dependency>
    <groupId>dev.pumpo5</groupId>
    <artifactId>core</artifactId>
    <version>...</version>
  </dependency>
  <dependency>
    <groupId>dev.pumpo5</groupId>
    <artifactId>reporting-prometheus</artifactId>
    <version>...</version>
  </dependency>
  ...
</dependencies>

Second and last, make sure that, in your junit-platform.properties configuration file, you have

junit.jupiter.extensions.autodetection.enabled=true

so that JUnit looks for, and automatically loads and sets up, the Prometheus reporter before test execution.

The Prometheus Reporter will then be picked up, and you should see messages in the logs about metrics being created and pushed while your tests are executed.