
test_automater

TestAutomater(from_branch=None, into_branch='main', regression=True, unit=True, save=False, plot=False)

Runs through the test suite, taking into account only the tests relevant to the modified files.

Parameters:

| Name | Type | Description | Default |
| ---- | ---- | ----------- | ------- |
| `from_branch` | `str` | Git branch that you want to test. Defaults to the currently active branch. | `None` |
| `into_branch` | `str` | Git branch that `from_branch` is being compared to. Defaults to `main`. | `'main'` |
| `regression` | `bool` | Flag for running regression tests. Defaults to `True`. | `True` |
| `unit` | `bool` | Flag for running unit tests. Defaults to `True`. | `True` |
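
The `regression` and `unit` flags gate which suites are run. A minimal sketch of that selection logic (the helper name and suite labels here are illustrative, not part of the actual class):

```python
# Illustrative sketch only: select which suites to run based on
# TestAutomater-style boolean flags. Names are hypothetical.
def select_suites(regression=True, unit=True):
    """Return the list of suite names enabled by the flags."""
    suites = []
    if regression:
        suites.append("regression")
    if unit:
        suites.append("unit")
    return suites
```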

compare_meshes(old_json, new_json)

Runs the mesh comparator on the old and new meshes extracted from the test output file.

Parameters:

| Name | Type | Description | Default |
| ---- | ---- | ----------- | ------- |
| `old_json` | `dict` | Mesh with old 'ground truth' values. | required |
| `new_json` | `dict` | Newly generated mesh to compare against `old_json`. | required |

Returns:

| Type | Description |
| ---- | ----------- |
| `dict` | Mesh comparison dataframes indexed by human-readable labels. |
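
As a rough illustration of the comparison step, the sketch below groups cellboxes by how they differ between the two meshes. The `'cellboxes'`/`'id'` schema and the grouping labels are assumptions of this sketch, a simplification of whatever the real comparator does:

```python
def diff_cellboxes(old_json, new_json):
    """Group cellboxes that were added, removed, or changed between the
    old 'ground truth' mesh and the new mesh (illustrative sketch only;
    the 'cellboxes' and 'id' keys are assumed, not documented)."""
    old = {cb["id"]: cb for cb in old_json.get("cellboxes", [])}
    new = {cb["id"]: cb for cb in new_json.get("cellboxes", [])}
    return {
        "added cellboxes":   [new[i] for i in new.keys() - old.keys()],
        "removed cellboxes": [old[i] for i in old.keys() - new.keys()],
        "changed cellboxes": [new[i] for i in old.keys() & new.keys()
                              if old[i] != new[i]],
    }
```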

extract_test_meshes(test_output_file) staticmethod

Reads the test output file and extracts two meshes: the old 'ground truth' mesh and the newly generated mesh.

Parameters:

| Name | Type | Description | Default |
| ---- | ---- | ----------- | ------- |
| `test_output_file` | `str` | Filename of the JSON file holding both meshes. | required |

Returns:

| Type | Description |
| ---- | ----------- |
| `dict` | Ground truth 'old' JSON mesh. |
| `dict` | Updated 'new' JSON mesh. |

get_base_dir() staticmethod

Get base folder for repo.
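
One common way to locate a repo base is to walk up the directory tree looking for a `.git` entry. A sketch of that approach (not necessarily how `get_base_dir` is actually implemented):

```python
from pathlib import Path

def find_repo_base(start="."):
    """Walk up from 'start' until a directory containing '.git' is found.
    Illustrative sketch; the real method may locate the base differently."""
    path = Path(start).resolve()
    for candidate in [path, *path.parents]:
        if (candidate / ".git").exists():
            return candidate
    raise FileNotFoundError(f"No .git directory found above {path}")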

get_diff_filenames(from_branch=None, into_branch=None) staticmethod

Gets a list of files that have changed between 'from_branch' and 'into_branch'.

Parameters:

| Name | Type | Description | Default |
| ---- | ---- | ----------- | ------- |
| `from_branch` | `str` | Test branch to compare. Defaults to the current working branch. | `None` |
| `into_branch` | `str` | 'Ground truth' branch to compare `from_branch` to. Defaults to `main`. | `None` |

Returns:

| Type | Description |
| ---- | ----------- |
| `list[str]` | List of files that are different between `from_branch` and `into_branch`. |
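
The diff list itself presumably comes from something like `git diff --name-only into_branch...from_branch`. A sketch of turning that raw command output into the returned list (the helper name is hypothetical):

```python
def parse_diff_output(stdout):
    """Turn the raw stdout of 'git diff --name-only' into a list of
    changed file paths (parsing sketch only)."""
    return [line.strip() for line in stdout.splitlines() if line.strip()]
```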

get_relevant_tests(diff_file, test_dict)

Determines the relevant tests to run for a modified file.

Parameters:

| Name | Type | Description | Default |
| ---- | ---- | ----------- | ------- |
| `diff_file` | `str` | File that is modified relative to `into_branch`. | required |
| `test_dict` | `dict` | Mapping of modified files to relevant tests. | required |

Returns:

| Type | Description |
| ---- | ----------- |
| `list[str]` | List of tests that need to be run because `diff_file` differs from the ground truth. |
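
Conceptually this is a lookup in the file-to-tests mapping. A minimal sketch (the helper name and the exact shape of `test_dict` are assumptions):

```python
def relevant_tests(diff_file, test_dict):
    """Look up which tests cover a modified file. 'test_dict' is assumed
    to map a source filename to a list of test identifiers (sketch)."""
    return test_dict.get(diff_file, [])
```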

parse_pytest_stdout(stdout) staticmethod

Turns stdout of Pytest into TestInfo objects

Parameters:

| Name | Type | Description | Default |
| ---- | ---- | ----------- | ------- |
| `stdout` | `str` | Minimal output of Pytest. | required |

Returns:

| Type | Description |
| ---- | ----------- |
| `tuple` | Lists of TestInfo objects, organised into pass, fail, and error. |
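
A sketch of how such a parser might sort result lines into the three lists. The `TestInfo` stand-in and the `PASSED test_file.py::test_name` line format are assumptions of this sketch, not the documented behaviour:

```python
from collections import namedtuple

# Hypothetical stand-in for the TestInfo objects the real parser builds.
TestInfo = namedtuple("TestInfo", ["name", "status"])

def parse_minimal_stdout(stdout):
    """Sort lines like 'PASSED test_a.py::test_x' into pass, fail, and
    error lists of TestInfo objects (line format is an assumption)."""
    passes, fails, errors = [], [], []
    buckets = {"PASSED": passes, "FAILED": fails, "ERROR": errors}
    for line in stdout.splitlines():
        parts = line.split(maxsplit=1)
        if len(parts) == 2 and parts[0] in buckets:
            buckets[parts[0]].append(TestInfo(parts[1], parts[0]))
    return passes, fails, errors
```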

plot_test(test_output, save_to=None)

Creates a plot of the differences between the newly generated mesh and the ground truth mesh. The mesh displayed is the new mesh, with cellboxes that differ from the ground truth mesh highlighted in a unique colour depending on the type of difference.

Saves the image to the current working directory under './pytest_meshiphi/.svg'

run_regression_tests(diff_files, save_to=None, plot=False)

Runs relevant regression tests for files within 'diff_files'

Parameters:

| Name | Type | Description | Default |
| ---- | ---- | ----------- | ------- |
| `diff_files` | `list` | List of files that are different from the comparison branch. | required |

run_unit_tests(diff_files, save_to=None)

Runs relevant unit tests for files within 'diff_files'

Parameters:

| Name | Type | Description | Default |
| ---- | ---- | ----------- | ------- |
| `diff_files` | `list` | List of files that are different from the comparison branch. | required |

save_tests(tmp_dir, output_folder, passes=False, fails=True, errors=True)

Saves a copy of the newly generated test meshes to the 'pytest_meshiphi' folder in the current working directory. Meshes will be saved as './pytest_meshiphi/.json'

Parameters:

| Name | Type | Description | Default |
| ---- | ---- | ----------- | ------- |
| `passes` | `bool` | Choice to save tests that pass. Defaults to `False`. | `False` |
| `fails` | `bool` | Choice to save tests that fail. Defaults to `True`. | `True` |
| `errors` | `bool` | Choice to save tests that error. Defaults to `True`. | `True` |

summarise_reg_tests(test_output)

Writes out a summary of the differences in cellboxes to the terminal.

Parameters:

| Name | Type | Description | Default |
| ---- | ---- | ----------- | ------- |
| `test_output` | `str` | Filename of the saved test mesh. | required |

summarise_test_stats()

Summarise statistics about the tests and print them to the terminal.

Example: `10 / 12 tests passed for test_boundary.py`
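
A sketch of producing such summary lines from per-file counts (the input shape, a mapping of test file to `(passed, total)`, is an assumption of this sketch):

```python
def summarise_stats(results):
    """Build per-file summary lines from a mapping of test filename ->
    (passed, total) counts (input shape assumed for this sketch)."""
    return [f"{passed} / {total} tests passed for {name}"
            for name, (passed, total) in results.items()]
```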