
Exercise 3: Mutation Testing with Pitest (PIT Mutation Testing)

In this exercise, you will practice mutation testing and mutation analysis on a fork of Spring Petclinic REST. You will integrate the PIT Mutation Testing (Pitest) plugin into the Maven build, generate mutants automatically, interpret the mutation coverage metrics, analyze killed and surviving mutants, and design at least one new JUnit test that kills a previously surviving mutant.

Getting Started

Similar to previous exercises, you will work on a fork of the Spring Petclinic REST project. Clone the forked repository to your local development machine, and pull the latest version if you have merged the branches from previous exercises into the main branch. Then, create a working branch for this exercise:

git switch -c task/3-mutation-testing

Task 1: Configure Pitest in Maven

You will add Pitest to the Maven build configuration of Spring Petclinic REST. We recommend using the Pitest Maven Plugin version 1.20 or newer.

  1. Open pom.xml and add the properties required for configuring Pitest to the <properties> section:
    <properties>
    <!-- Omitted for brevity -->
       <maven.pitest-plugin.version>1.20.3</maven.pitest-plugin.version>
       <maven.pitest-junit5-plugin.version>1.2.3</maven.pitest-junit5-plugin.version>
       <!--
       You can switch the mutation engine as needed: either `gregor` (the default, mandatory for this exercise) or `descartes` (extreme mutation operators).
       You can change it directly in this pom.xml or via a CLI argument, e.g., -Dpitest.mutationEngine=[engineName]
       -->
       <pitest.mutationEngine>gregor</pitest.mutationEngine>
       <descartes.version>1.3.4</descartes.version>
    </properties>
    
  2. In the <reporting><plugins> section, add the Pitest reporting configuration:
    <reporting>
      <!-- Omitted for brevity -->
      <plugins>
        <plugin>
          <groupId>org.pitest</groupId>
          <artifactId>pitest-maven</artifactId>
          <version>${maven.pitest-plugin.version}</version>
          <reportSets>
            <reportSet>
              <reports>
                <report>report</report>
              </reports>
            </reportSet>
          </reportSets>
        </plugin>
      </plugins>
    </reporting>
    
  3. In the <build><plugins> section, add the Pitest plugin configuration:
    <build>
      <!-- Omitted for brevity -->
      <plugins>
        <plugin>
          <groupId>org.pitest</groupId>
          <artifactId>pitest-maven</artifactId>
          <version>${maven.pitest-plugin.version}</version>
          <configuration>
            <!--
            Use the Gregor engine by default, as defined in the properties.
            You can also experiment with the alternative engine, Descartes, which implements "extreme mutation operators".
            To switch engines, override the corresponding property.
            -->
            <mutationEngine>${pitest.mutationEngine}</mutationEngine>
            <targetClasses>
              <!--
              Mutate the significant application classes in the service, model, mapper, and controller layers.
              It is also possible to mutate all classes, at the cost of a longer execution time and a larger number of mutants; to do so, remove the entire <targetClasses> section.
              One participant reported that mutating all classes took about 1.5 minutes on a gaming laptop.
              -->
              <param>org.springframework.samples.petclinic.service*</param>
              <param>org.springframework.samples.petclinic.model*</param>
              <param>org.springframework.samples.petclinic.mapper*</param>
              <param>org.springframework.samples.petclinic.rest.controller*</param>
            </targetClasses>
          </configuration>
          <dependencies>
            <dependency>
              <groupId>org.pitest</groupId>
              <artifactId>pitest-junit5-plugin</artifactId>
              <version>${maven.pitest-junit5-plugin.version}</version>
            </dependency>
            <dependency>
              <groupId>eu.stamp-project</groupId>
              <artifactId>descartes</artifactId>
              <version>${descartes.version}</version>
            </dependency>
          </dependencies>
        </plugin>
      </plugins>
    </build>
    
  4. Commit the changes to your working branch.

Task 2: Run Pitest and Generate the Report

  1. Run Pitest from the project root using the Maven wrapper (mvnw):
    ./mvnw test-compile org.pitest:pitest-maven:mutationCoverage
    
  2. After a successful run, open the HTML report located at target/pit-reports/index.html.
  3. Skim the top-level dashboard. You will see information such as the following (a small worked example of the coverage metrics follows this list):
      • Line coverage
      • Mutation coverage (number of killed mutants / total number of generated mutants)
      • Test strength (number of killed mutants / number of mutants covered by tests)
      • The list of packages containing mutated classes, with their metrics
  4. You can also drill down into the per-class reports to see the metrics for each method and the list of mutants generated from each method's implementation. Note that Pitest not only reports the mutants but also highlights uncovered statements, i.e., statements with no tests that could kill the mutants generated from them.
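For instance, in a hypothetical run that generates 400 mutants, of which 350 are covered by at least one test and 300 are killed, the dashboard would show a mutation coverage of 300/400 = 75% and a test strength of 300/350 ≈ 86%. These numbers are made up purely to illustrate how the two metrics differ.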

Notes:

In Java, Pitest mutates the compiled bytecode (i.e., the intermediate representation of Java programs) and executes your test suite against the mutants. A mutant is "killed" when at least one test fails under mutation. A mutant "survives" if all tests pass under mutation (indicating a testing gap). Some mutants are "equivalent" to the original program (behaviorally indistinguishable) and may be flagged as survivors despite adequate testing. You can further verify the killed and surviving mutants manually by applying the RIPR (Reachability, Infection, Propagation, Revealability) model.
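To make these terms concrete, here is a tiny, self-contained illustration that is not taken from Petclinic; the class, method, and test names are all made up:

```java
// Standalone illustration of killed vs. equivalent mutants (not Petclinic code).
class MathUtil {
    static int max(int a, int b) {
        // Original condition: a > b
        // Killed mutant:     PIT negates the conditional (a > b -> a <= b);
        //                    max(2, 1) then returns 1 and the test below fails.
        // Equivalent mutant: a conditional-boundary mutation (a > b -> a >= b)
        //                    only changes behaviour when a == b, where both
        //                    branches return the same value, so no test can
        //                    ever observe a difference.
        return a > b ? a : b;
    }
}

class MathUtilTest {
    @org.junit.jupiter.api.Test
    void maxReturnsTheLargerArgument() {
        org.junit.jupiter.api.Assertions.assertEquals(2, MathUtil.max(2, 1));
    }
}
```

Under PIT's default mutators, the negated conditional would be reported as killed, while the conditional-boundary mutant would keep surviving no matter how many tests you add, because it is equivalent to the original program.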

Task 3: Analyze and Kill a Surviving Mutant

Let's see the mutation analysis for the getPets() method in the Owner class:

[Screenshot: PIT per-class report showing the two mutants generated from Owner.getPets()]

Note that two mutants were generated from the getPets() method. The first mutant, at line 95, was produced by the EMPTY_RETURNS mutation operator, which replaces the original return statement with return Collections.emptyList();. It is easy to see why this mutant is killed: the outputs of the original method and the mutant differ for the same test cases, so at least one assertion fails.

Meanwhile, the second mutant, at line 94, was produced by the VOID_METHOD_CALLS mutation operator, which removes the statement, essentially skipping the sort() call. This mutant survives because there is no observable difference between the output of the original method and the mutant for the existing test cases.
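If you have not opened the class yet, the sketch below approximates, at source level, where the two mutants strike. Keep in mind that PIT mutates bytecode and that this method body is paraphrased from the upstream Spring Petclinic Owner class (with Pet reduced to a minimal stub), so it may differ slightly from the code in your fork:

```java
import java.util.ArrayList;
import java.util.Collections;
import java.util.HashSet;
import java.util.List;
import java.util.Set;

import org.springframework.beans.support.MutableSortDefinition;
import org.springframework.beans.support.PropertyComparator;

// Simplified stand-in for org.springframework.samples.petclinic.model.Owner,
// reduced to the method under discussion.
class OwnerSketch {

    private final Set<PetStub> pets = new HashSet<>();

    public List<PetStub> getPets() {
        List<PetStub> sortedPets = new ArrayList<>(pets);
        // VOID_METHOD_CALLS mutant (line 94 in the report): this sort(...) call is
        // removed, so the returned list keeps the arbitrary iteration order of the set.
        PropertyComparator.sort(sortedPets, new MutableSortDefinition("name", true, true));
        // EMPTY_RETURNS mutant (line 95 in the report): this statement is replaced
        // with `return Collections.emptyList();`, discarding all pets.
        return Collections.unmodifiableList(sortedPets);
    }
}

// Minimal stub so the sketch compiles on its own; the real Pet class is richer.
class PetStub {
    private String name;

    public String getName() {
        return name;
    }

    public void setName(String name) {
        this.name = name;
    }
}
```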

To figure out the reason why the second mutant is surviving, let's look at the covering tests:

[Screenshot: PIT report listing the tests covering the surviving mutant]

Several tests exercise the mutated code. Based on the current mutation analysis, you can conclude that none of those tests checks the order of the pets in the list: their outcomes are the same with or without the sort() call, even though the list is named sortedPets.

To kill (or "unalive", in Gen-Z slang) the mutant, you can add a new test that actually checks the order of the pets in the list, or modify an existing test to perform an order check. The most relevant test case to modify is shouldInsertPetIntoDatabaseAndGenerateId() in AbstractClinicServiceTests. The current test implementation adds a new pet named bowser to the database. Moreover, if you look at the SQL files used for database seeding, the owner with ID 6 already has two pets, Max and Samantha. So the final order of the pets should be bowser, Max, and Samantha.

Let's update the test by adding a check that ensures the first pet in the list is bowser and the last pet is Samantha:

assertThat(owner6.getPets().get(0).getName()).isEqualTo("bowser");
assertThat(owner6.getPets().get(2).getName()).isEqualTo("Samantha");
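If you prefer the first option instead (adding a brand-new test), a unit-test sketch such as the one below would also kill the mutant by pinning down the ordering contract of getPets() directly. Treat it as a sketch only: the test class name is hypothetical, and it assumes Owner.addPet(Pet) and Pet.setName(String) are available as in the upstream Spring Petclinic model, with JUnit 5 and AssertJ already on the test classpath.

```java
import static org.assertj.core.api.Assertions.assertThat;

import org.junit.jupiter.api.Test;
import org.springframework.samples.petclinic.model.Owner;
import org.springframework.samples.petclinic.model.Pet;

// Hypothetical test class; place it next to the other model tests.
class OwnerGetPetsOrderTest {

    private Pet petNamed(String name) {
        Pet pet = new Pet();
        pet.setName(name);
        return pet;
    }

    @Test
    void getPetsShouldReturnPetsSortedByName() {
        Owner owner = new Owner();
        // Add the pets deliberately out of order so that only the sort() call
        // inside getPets() can produce the expected sorted output.
        owner.addPet(petNamed("Samantha"));
        owner.addPet(petNamed("Max"));
        owner.addPet(petNamed("bowser"));

        assertThat(owner.getPets())
            .extracting(Pet::getName)
            .containsExactly("bowser", "Max", "Samantha");
    }
}
```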

Then, re-run Pitest and check the mutation analysis for the getPets() method again. You will see that the second mutant is now killed:

[Screenshot: PIT report showing both getPets() mutants killed]

Do not forget to save and commit your updated test. Then, push the latest commit to GitLab.

Task 4: DIY - Kill Another Surviving Mutant

There is a mutant in Pet.java that survives for a reason similar to the one in the previous section: a call to the sort() method has no effect that the current tests can observe. Your task is to kill that mutant by adding a new test or modifying an existing test. Document your steps in the written report and update the test suite.

Deliverables

At the end of this exercise, prepare the following artefacts in your fork:

  • The updated pom.xml with the Pitest plugin configuration.
  • The updated test suite that kills both mutants (one from Owner, one from Pet).
  • A written report named EXERCISE_3_REPORT.md including:
      • A summary of the mutation testing report before and after killing the mutants in this exercise.
      • Your write-up on how you killed the mutant in the Pet class.
      • An AI assistance log (if any).

Ensure the updated codebase and the written report are committed and pushed to your fork on GitLab CSUI.

The due date of this exercise is: Friday, 10 October 2025, 23:55 UTC+7. Submit the URL of your fork repository to the designated submission slot on SCELE. Please ensure all updates to the fork repository related to this exercise are made and pushed before the due date.

Generative AI Usage Policy

You may use generative AI tools for this exercise, but keep in mind that you are the one who is supposed to learn and practice the subject, not the AI. AI may produce incorrect results; review its outputs critically.

If you do use generative AI, you must agree to the following:

Allowed Uses

  • Code generation (e.g., suggest test cases or refactorings)
  • Explanations (e.g., clarify mutation operators, PIT configuration)
  • Debugging help (e.g., interpret failing tests or timeouts)

For every AI-assisted change, document it in the report under AI Assistance Log.

Prohibited Uses

  • Direct submission of AI-generated content without understanding or testing it
  • Bypassing learning by outsourcing the entire report or analysis

Report Template

You can use the following Markdown template as a starting point:

# Exercise 3 Report

- **Name:** <NAME>
- **NPM:** <NPM>

## Mutation Testing Summary

| Metric         | Value (Before) | Value (After) |
| -------------- | -------------- | ------------- |
| Line Coverage  |                |               |
| Mutation Score |                |               |
| Test Strength  |                |               |

## Write-up on Killing Mutant in `Pet` Class

<Your write-up here>

## AI Assistance Log

| Task | Prompt Used | Summary of AI Response | Your Evaluation | Final Implementation |
| ---- | ----------- | ---------------------- | --------------- | -------------------- |
|      |             |                        |                 |                      |


Generative AI Use Disclosure

GitHub Copilot with GPT-5 was used to draft and proofread this document.

