
Midterm Exam - Ansible to Docker DevOps

Some projects in Pusilkom UI have automated provisioning and deployment implemented using Ansible. The application and environment configuration are parameterised using Ansible variables that are written in Jinja syntax and stored in YAML files. Since some of the variables might contain sensitive information such as database passwords or API keys, Ansible Vault is also used to separate sensitive variables into a new file and encrypt it for storage in the version control system (Git). The variables are only decrypted during the provisioning and deployment process.
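
For illustration, the split between the two files might look like the following sketch. The file contents and variable names here are hypothetical; parsing them with PyYAML shows that the Jinja placeholder survives as an ordinary string:

# Hypothetical contents of an Ansible vars/vault pair, for illustration only.
from yaml import safe_load

# group_vars/all/vars.yml: kept in plain text; sensitive values are only
# referenced indirectly through Jinja variables.
vars_yml = """
app_name: example
db_user: bangtoyib
db_pass: "{{ vault_db_pass }}"
"""

# group_vars/all/vault.yml: encrypted by Ansible Vault before it is committed.
vault_yml = """
vault_db_pass: cepatpulang
"""

print(safe_load(vars_yml))
# {'app_name': 'example', 'db_user': 'bangtoyib', 'db_pass': '{{ vault_db_pass }}'}
print(safe_load(vault_yml))
# {'vault_db_pass': 'cepatpulang'}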

Recently, there have been requests to prepare a Docker-based deployment using Docker Compose. One way to configure an application or a component running in a container is to provide the configuration through environment variables. To make it easier for DevOps engineers or system administrators to provide the environment variables, they usually write the variables into a text-based file called .env that Docker Compose subsequently reads.

The problem is how to keep the existing Ansible-related artefacts in place while making them usable for Docker-based deployment as well. Since most of the configuration is written in YAML files, there must be a way to reuse the configuration variables in those YAML files and transform them into a .env file. In addition, the values for some of the variables might also be passed through environment variables when running in a CI environment.

The following example is a program that reads YAML files, merges them into a single structure, and prints the content as a string that can be written into a .env file:

#!/usr/bin/env python
# File: create_dotenv.py
import os
import re
import sys

try:
    from yaml import safe_load
    from yaml.scanner import ScannerError
except ImportError:
    print("Cannot find PyYAML module in the environment.", file=sys.stderr)
    print("Please install it first using `pip` or other means", file=sys.stderr)
    sys.exit(1)

def is_valid_yaml_files(files: list) -> bool:
    '''
    Checks whether all of the given file paths point to valid YAML files.

    Suppose that we have two valid YML files. If the paths to both files are
    passed to the function, then the function should return True.

    >>> import tempfile
    >>> with tempfile.NamedTemporaryFile(mode="w", newline="\\n", suffix=".yml") as file1:
    ...     with tempfile.NamedTemporaryFile(mode="w", newline="\\n", suffix=".yml") as file2:
    ...         file1.writelines(["---", "a: 1", "b: 2"])
    ...         file2.writelines(["d: 3", "e: 4"])
    ...         is_valid_yaml_files([file1.name, file2.name])
    True

    Otherwise, the function should return False.

    >>> import tempfile
    >>> with tempfile.NamedTemporaryFile(mode="w", newline="\\n", suffix=".yml") as file1:
    ...     with tempfile.NamedTemporaryFile(mode="w", newline="\\n", suffix=".yml") as file2:
    ...         file1.writelines(["{a: 1", "b: 2"])
    ...         file2.writelines(["d: 3}", "e: 4"])
    ...         is_valid_yaml_files([file1.name, file2.name])
    False

    TODO: Fix doctest when run under Windows
    '''
    if len(files) == 0:
        return False

    try:
        for file in files:
            with open(file, 'r') as yaml_file:
                safe_load(yaml_file)
    except OSError as os_error:
        print("Cannot open the given file", file=sys.stderr)
        print(os_error, file=sys.stderr)
        return False
    except ScannerError as scanner_error:
        print("Unable to parse the given YAML files correctly", file=sys.stderr)
        print(scanner_error, file=sys.stderr)
        return False

    return True

def parse_decrypted_vault(main_yaml: dict, vault_yaml: dict) -> dict:
    '''
    Replaces every occurrence of a Jinja variable with the corresponding value
    from a decrypted vault file.

    For example, suppose main_yaml and vault_yaml are illustrated as follows:

    >>> main_yaml = {'app_name': 'example', 'db_user': 'bangtoyib', 'db_pass': '{{ vault_db_pass }}'}
    >>> vault_yaml = {'vault_db_pass': 'cepatpulang'}
    >>> parse_decrypted_vault(main_yaml, vault_yaml)
    {'app_name': 'example', 'db_user': 'bangtoyib', 'db_pass': 'cepatpulang'}

    The function should return the dictionary whose keys are the same as the
    original dictionary and all of the values with Jinja variables have been
    replaced.

    TODO: Handle YAML structure with > 1 level (nested)
    '''
    from copy import deepcopy
    result_yaml = deepcopy(main_yaml)

    for key, value in main_yaml.items():
        if isinstance(value, str) and re.match(r'{{ vault_(\w)+ }}', value):
            result_yaml[key] = vault_yaml[f'vault_{key}']

    return result_yaml

def main() -> None:
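    # The first CLI argument is expected to be the plain vars file and the
    # second the decrypted vault file (see the running example below).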
    if is_valid_yaml_files(sys.argv[1:]):
        with open(sys.argv[1], 'r') as vars_file:
            with open(sys.argv[2], 'r') as vault_file:
                yaml = parse_decrypted_vault(
                    safe_load(vars_file),
                    safe_load(vault_file)
                )

                for key, value in yaml.items():
                    if isinstance(value, str):
                        contain_variable = re.match(r"{{ (?P<variable>\w+) }}", value)

                        if contain_variable:
                            # Assume environment variable is written in all uppercase
                            os_variable_name = contain_variable.group('variable').upper()
                            print(f'{key.upper()}={os.getenv(os_variable_name, "")}')
                        else:
                            print(f'{key.upper()}={value}')
                    else:
                        print(f'{key.upper()}={value}')
    else:
        print("One or more YAML files are invalid", file=sys.stderr)
        sys.exit(1)

if __name__ == '__main__': main()

An actual (but redacted) running example:

$ python create_dotenv.py \
  ../ansible/inventories/develop/group_vars/all/vars.yml \
  ../ansible/inventories/develop/group_vars/all/vault.yml
APP_DIR=/opt/unhan/euis
APP_USER=unhan
APP_THEME=unhan
APP_SOURCE=
APP_SUPERPASS=bangtoyib!
DATABASE_HOST=192.168.0.162
DATABASE_PORT=5432
DATABASE_NAME=euis_demo2
DATABASE_USERNAME=euis_demo2
DATABASE_PASSWORD=cepatpulang!
EMAIL_HOST=192.168.0.174
EMAIL_PORT=25
EMAIL_USERNAME=no-reply@pusilkom.com
EMAIL_PASSWORD=None
EMAIL_ENCRYPTION=None
PHP_VERSION=7.0

The example above reads two YAML files named vars.yml and vault.yml. The program replaces all occurrences of Jinja variables in vars.yml with the values read from vault.yml. If some Jinja variables are still left in vars.yml afterwards, the program reads the operating system's environment variables to obtain their values. Any Jinja variables that remain unresolved are assigned empty values, and the result is printed to the standard output.
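
This fallback to environment variables is what produces the empty APP_SOURCE= line above. In isolation, the lookup performed in main() amounts to the following sketch (the app_source value is taken from the redacted example; the rest is only an illustration):

import os
import re

# A value that is still a Jinja variable after the vault substitution,
# e.g. app_source: "{{ app_source }}" in vars.yml.
value = "{{ app_source }}"

match = re.match(r"{{ (?P<variable>\w+) }}", value)
if match:
    # Look up APP_SOURCE in the process environment; default to an empty string.
    resolved = os.getenv(match.group("variable").upper(), "")
else:
    resolved = value

print(f"APP_SOURCE={resolved}")  # prints "APP_SOURCE=" unless APP_SOURCE is exported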

SQA Problem - Input Space Partitioning

Estimated time: 5 minutes

You are asked to design an input space model for the program by following a functionality-based approach. The information required to develop the model can be derived by reading the code snippet above.

Your tasks are as follows:

  1. Determine the characteristics and the partition for a function chosen by the proctor.

    Possible functions:

    • is_valid_yaml_files()
    • main()
    • The whole script, i.e. create_dotenv.py
  2. Based on the input space model that you have created, create the test requirements and the test cases based on a coverage criterion chosen by the proctor.

Possible coverage criteria choices:

  • All Combinations Coverage (ACoC)
  • Each Choice Coverage (ECC)
  • Pair-Wise Coverage (PWC)
  • Base Choice Coverage (BCC)

Note: You do not have to write all test cases due to the time limit. However, make sure you can justify that your subset of test cases satisfies the chosen coverage criterion!
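
As a rough, non-authoritative illustration of how the criteria differ in size (the characteristics and blocks below are hypothetical and are not the expected answer), the test requirements can be enumerated mechanically once a model exists:

from itertools import product

# Hypothetical characteristics and blocks for is_valid_yaml_files();
# an actual input space model may look different.
characteristics = {
    "number of file paths": ["zero", "one", "more than one"],
    "file accessibility": ["all files readable", "at least one unreadable"],
    "YAML syntax": ["all files valid", "at least one invalid"],
}

# All Combinations Coverage (ACoC): take the cross product of all blocks.
acoc_requirements = list(product(*characteristics.values()))
print(len(acoc_requirements))  # 3 * 2 * 2 = 12 test requirements

# Each Choice Coverage (ECC): every block must appear in at least one test
# requirement, so max(3, 2, 2) = 3 requirements already suffice here.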

Write your answer in a sheet of paper or Microsoft Word/Google Docs document. You may include illustrations in your answer. Please prepare to present your answer remotely via Zoom/Google Hangouts during discussion time.

SQA Problem - Graph Coverage

Estimated time: 5 minutes

You are asked to design a control flow graph (CFG), prepare the test requirements, and create the test paths.

Your tasks are as follows:

  1. Create the CFG for a function chosen by the proctor.

    Possible functions:

    • is_valid_yaml_files()
    • main()
    • The whole script, i.e. create_dotenv.py
  2. Based on the CFG that you have created, create the test requirements and the test paths based on a coverage criterion chosen by the proctor.

    Possible coverage criteria choices:

    • Node Coverage (NC)
    • Edge Coverage (EC)
    • Edge-Pair Coverage (EPC)

    Note: You do not have to write all test paths due to the time limit. However, make sure you can justify that your subset of test paths satisfies the chosen coverage criterion!

Write your answer in a sheet of paper or Microsoft Word/Google Docs document. You may include illustrations in your answer. Please prepare to present your answer remotely via Zoom/Google Hangouts during discussion time.
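
As a rough, non-authoritative illustration only (the node numbering below is made up and deliberately simplified, not the expected CFG), a graph can be written down as an adjacency list and test requirements derived from it, here for Edge Coverage:

# Hypothetical, simplified CFG for is_valid_yaml_files(); node numbering is
# illustrative only. 1: entry and length check, 2: return False (empty list),
# 3: loop condition, 4: open and parse a file, 5: exception handler/return False,
# 6: return True.
cfg = {
    1: [2, 3],
    2: [],
    3: [4, 6],
    4: [3, 5],
    5: [],
    6: [],
}

# Edge Coverage: every edge of the CFG must appear in at least one test path.
edges = [(src, dst) for src, dsts in cfg.items() for dst in dsts]
print(edges)  # [(1, 2), (1, 3), (3, 4), (3, 6), (4, 3), (4, 5)]

# One possible set of test paths that tours all of the edges above:
test_paths = [
    [1, 2],           # empty list of files
    [1, 3, 4, 5],     # a file that cannot be opened or parsed
    [1, 3, 4, 3, 6],  # one valid file, loop exits normally
]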

SQA Problem - Discussion

Estimated time: 10 minutes

You are asked to present your answers to the given problems and also to have a one-on-one interview with the proctor during the discussion time.

The list of topics that might be discussed is as follows:

  • Code coverage (line coverage)
  • Test-Driven Development (TDD)
  • Test isolation
  • Writing test cases in Java (JUnit)/Python (unittest and Django)/PHP (PHPUnit); a minimal Python sketch follows after this list
  • Your experience in conducting SQA activities in academics and/or work environment
  • The ideas of mutation testing
  • And many more topics that may still be related to SQA
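
For the unittest topic above, a minimal sketch of what a test case against parse_decrypted_vault() could look like (the file name and test names are made up for illustration):

# File: test_create_dotenv.py (hypothetical name, for illustration only)
import unittest

from create_dotenv import parse_decrypted_vault


class ParseDecryptedVaultTest(unittest.TestCase):
    def test_vault_variable_is_substituted(self):
        main_yaml = {'db_pass': '{{ vault_db_pass }}'}
        vault_yaml = {'vault_db_pass': 'cepatpulang'}
        result = parse_decrypted_vault(main_yaml, vault_yaml)
        self.assertEqual(result['db_pass'], 'cepatpulang')

    def test_non_vault_values_are_left_untouched(self):
        main_yaml = {'app_name': 'example'}
        result = parse_decrypted_vault(main_yaml, {})
        self.assertEqual(result, main_yaml)


if __name__ == '__main__':
    unittest.main()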

Last update: 2021-11-10 11:48:10
Created: 2021-11-10 11:48:10