Project 2: Model-Based Test Design (IDM + CFG)

This assignment is a continuation of the second exercise and contributes to the project grade component.

You and your team members will select an existing, medium-sized codebase (e.g., a project from PPL/Propensi or another open-source project besides Spring Petclinic REST) and apply model-based testing techniques:

  • Build Input Domain Models (IDMs) using Pair-Wise Coverage (PWC) to derive test values (a sketch follows this list).
  • Build Control Flow Graphs (CFGs) using Edge-Pair Coverage (EPC) to derive test paths.
  • Map the derived values/paths to the project's test suite, adding tests where necessary.
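
As a brief illustration of how PWC-derived values can end up in the test suite, the sketch below uses a hypothetical method and hypothetical value blocks (the name searchOwners, its parameters, and the chosen values are invented for illustration, not taken from any particular codebase). Each row is picked so that every pair of blocks appears together at least once, which is what pairwise coverage requires instead of the full cross product.

```java
import static org.junit.jupiter.api.Assertions.assertNotNull;

import org.junit.jupiter.params.ParameterizedTest;
import org.junit.jupiter.params.provider.CsvSource;

// Hypothetical sketch: the method, parameters, and value blocks are illustrative only.
class OwnerSearchPairwiseTest {

    @ParameterizedTest
    @CsvSource({
        // lastName, includeInactive, pageSize
        "'Davis', true,  1",   // non-empty name, true,  minimum page size
        "'Davis', false, 25",  // non-empty name, false, typical page size
        "'',      true,  25",  // empty name,     true,  typical page size
        "'',      false, 1"    // empty name,     false, minimum page size
    })
    void acceptsPairwiseDerivedCombinations(String lastName,
                                            boolean includeInactive,
                                            int pageSize) {
        // Replace this placeholder with a call to the method under test in your
        // codebase, e.g. ownerService.searchOwners(lastName, includeInactive, pageSize),
        // followed by assertions on the expected behaviour for that combination.
        assertNotNull(lastName);
    }
}
```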

Codebase Requirements

Choose a codebase that:

  • Has a runnable test suite (unit and/or integration tests).
  • Allows adding/maintaining tests using a standard build tool (e.g., Maven/Gradle for Java projects).
  • Contains multiple non-trivial methods with decisions/branches suitable for CFG modeling.
  • Has CI/CD or can be set up to build, test, and report coverage on GitLab CSUI.

You may reuse the codebase from Project 1 if it meets the criteria above.

Tasks

Once you have decided on the codebase, complete the following tasks:

  1. Fork the original codebase into GitLab CSUI. If the original is outside GitLab CSUI, mirror/push the repository to GitLab CSUI.
  2. Clone the forked codebase to your local development machines. Ensure every team member can run the project and its tests locally.
  3. Set up or update GitLab CI/CD so it automatically builds, runs tests, and reports coverage on your fork. Fix any failing pipeline stages.
  4. Create a new SonarQube project on SonarQube CSUI and integrate Sonar Scanner into the CI pipeline so that analysis runs after the test and coverage stages.
  5. As a team, select a set of non-trivial methods across the codebase for modeling. Each team member must model at least one non-trivial method. Prefer methods that take more than one input parameter and/or have complex control flow (e.g., loops, conditionals, exceptions).
  6. For each selected method, perform the following work items (aligning with Exercise 2); a CFG/EPC sketch follows this list:
      • IDM (Pair-Wise Coverage): identify input characteristics and partitions; define constraints; enumerate PWC requirements; select concrete test values.
      • CFG (Edge-Pair Coverage): draw the CFG; enumerate EPC requirements; select concrete test paths.
      • Validation: map at least one IDM-derived value and one CFG-derived path to existing tests; if no existing test satisfies a requirement, add a new test that does.
  7. Ensure the CI/CD pipelines pass on every Merge Request (MR) and on the main branch of your fork.
  8. Conduct code reviews within the team for all MRs related to modeling and test additions. Address review feedback before merging.
  9. Document the models, requirements, mapping to tests, and justifications in a project report (see Deliverables).
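
To make the CFG/EPC work item concrete, here is a minimal, hypothetical sketch: the method, its CFG node labels, and the expected values are all invented for illustration and are not part of any particular codebase. It shows how one EPC-derived test path for a small method with a loop and a branch can be mapped to a JUnit 5 test.

```java
import static org.junit.jupiter.api.Assertions.assertEquals;

import org.junit.jupiter.api.Test;

// Hypothetical method under test; the node labels (n1..n6) refer to a CFG you
// would draw for your own selected method.
class DiscountCalculator {

    // CFG sketch: n1 (init) -> n2 (loop guard) -> n3 (loop body) -> n2 ...
    //             n2 -> n4 (threshold check) -> n5 (apply discount) -> n6 (return)
    //             n4 -> n6 (no discount)
    static double totalWithDiscount(double[] prices, double threshold) {
        double total = 0.0;                 // n1
        for (double price : prices) {       // n2 (guard) / n3 (body)
            total += price;
        }
        if (total >= threshold) {           // n4
            total *= 0.9;                   // n5: apply a 10% discount
        }
        return total;                       // n6
    }
}

class DiscountCalculatorEdgePairTest {

    // Drives the test path [n1, n2, n3, n2, n4, n5, n6], which covers the edge
    // pairs [n1,n2,n3], [n2,n3,n2], [n3,n2,n4], [n2,n4,n5], and [n4,n5,n6].
    @Test
    void appliesDiscountWhenTotalReachesThreshold() {
        double total = DiscountCalculator.totalWithDiscount(
                new double[] {120.0}, 100.0);
        assertEquals(108.0, total, 1e-9); // 120.0 * 0.9
    }
}
```

In your report, the analogous mapping for your own methods should name the actual test class and method and justify which edge pairs each selected test path covers.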

Deliverables

At the end of this project, you are required to prepare the following artefacts:

  • A forked repository of the chosen project in a namespace on GitLab CSUI. You can either create a new group to contain the fork or host the fork under your own namespace and invite your team members to it.
  • An updated codebase that adds tests and documentation derived from IDM (PWC) and CFG (EPC) work.
  • CFG diagram files (e.g., screenshots or exports in PNG) stored in the repository and referenced by the report.
  • A written report in a Markdown file named PROJECT_2_REPORT.md in the repository. Use a structure consistent with Exercise 2 and include, for each modeled method:
      • The name of the team member who modeled the method
      • The IDM (characteristics, partitions) and constraints
      • The PWC test requirements and the selected test values
      • The CFG, the EPC test requirements, and the selected test paths
      • The mapping from test values/paths to concrete test method(s), with brief justifications
      • AI Assistance Log entries (if any)

The due date of this project is Friday, 26 September 2025, 23:55 UTC+7. Submit the URL of your fork repository to the designated submission slot on SCELE. Only one representative from the group needs to submit the URL. Ensure that all updates to the fork related to this project are pushed before the due date.

Generative AI Usage Policy

You may use generative AI tools for this project, but keep in mind that the goal is for you to learn and practice the subject, not to have the AI do it for you. Generative AI may produce incorrect results, so review its output critically.

If you do use generative AI, you must agree to the following constraints:

Allowed Uses

  • Code generation: suggesting test cases, scaffolding test data, or refactoring test code.
  • Explanations: clarifying partitioning decisions, coverage criteria (PWC/EPC), or build/CI errors.
  • Debugging help: interpreting failing tests or pipeline outputs.

For every AI-assisted change, document it in the written report under a section called AI Assistance Log.

Prohibited Uses

  • Direct submission of AI-generated models/tests without understanding or validation.
  • Bypassing learning by outsourcing the entire modeling/reporting to AI.

Generative AI Use Disclosure

GitHub Copilot with GPT-5 was used to draft and proofread this document.


Last update: 2025-10-02 02:03:52
Created: 2025-09-17 13:10:05