ushnisha/jobshop-minimal

Create suitable output files for test plans (for regression testing)

Closed this issue · 4 comments

Come up with a suitable format for the test plan output files used in regression testing of functionality. This should include:

(1) Sorting order of output
(2) Metrics that will be compared
(3) How to check for multiple/alternate solutions
(4) Test success/failure reporting

This is the first of a series of ongoing changes/commits that are required to address this issue. Commit: fd26680 modifies the output format for various objects for better regression testing (for expect/output file generation). Other changes and commits will continue to follow.
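As a rough illustration of the kind of deterministic output this change aims for, a sketch is shown below. The class name, field names, and line layout here are hypothetical and do not mirror the actual objects touched by commit fd26680; the point is only that a fixed field order, a fixed date format, and a stable sort key keep expect files diffable.

```java
import java.time.LocalDateTime;
import java.time.format.DateTimeFormatter;

// Hypothetical sketch of a regression-friendly output line for a taskplan.
// Names and layout are illustrative only, not the repo's actual classes.
class TaskPlanLine implements Comparable<TaskPlanLine> {
    private static final DateTimeFormatter FMT =
        DateTimeFormatter.ofPattern("yyyy-MM-dd'T'HH:mm:ss");

    String demandName;
    String taskName;
    String workcenterName;
    LocalDateTime start;
    LocalDateTime end;
    long quantity;

    @Override
    public String toString() {
        // Fixed field order and a fixed date format keep expect files stable.
        return String.join(",",
            demandName, taskName, workcenterName,
            start.format(FMT), end.format(FMT),
            Long.toString(quantity));
    }

    @Override
    public int compareTo(TaskPlanLine other) {
        // Stable sort key so output order never depends on hash/iteration order.
        int c = demandName.compareTo(other.demandName);
        if (c != 0) return c;
        c = start.compareTo(other.start);
        if (c != 0) return c;
        return taskName.compareTo(other.taskName);
    }
}
```

Printed in sorted order, lines like these can be compared directly against an expect file.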

One minor change to the Demand class: a new computed field, planqty, has been added to print the final planned quantity (which will be needed when we add test cases for the demand shortages enhancement).
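A minimal sketch of what such a computed field might look like is below; the taskPlans list, the getQuantity() accessor, and the other field names are assumptions, not the actual Demand implementation.

```java
import java.util.List;

// Hypothetical sketch: planqty as a computed field on Demand.
// The taskPlans list and getQuantity() accessor are assumed names.
class Demand {
    String name;
    int priority;
    long requestedQuantity;
    List<TaskPlan> taskPlans;   // plans created for this demand

    // planqty: total quantity actually planned; if it falls short of
    // requestedQuantity, the gap is the demand shortage.
    long getPlanQty() {
        return taskPlans.stream()
                        .mapToLong(TaskPlan::getQuantity)
                        .sum();
    }
}

class TaskPlan {
    private final long quantity;
    TaskPlan(long quantity) { this.quantity = quantity; }
    long getQuantity() { return quantity; }
}
```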

Commit: c05c12f improves the output for the Demand class. It includes the demand priority in the printed information and sorts demands by priority rather than by name, since we plan the demands one by one based on their priority.

Commit: 4e17871 sorts demands based on priority before printing. This should have been included with commit: c05c12f, but the file was missed during git add/commit.
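For illustration only, priority-first printing could look roughly like the sketch below (it reuses the hypothetical Demand fields sketched above and is not the actual code from these commits). The demand name breaks priority ties so the printed order stays deterministic for regression tests; lower priority value is assumed to mean "plan first".

```java
import java.util.Comparator;
import java.util.List;

// Hypothetical sketch: print demands in planning order (priority, then name)
// and include the priority and planned quantity in the printed line.
class DemandPrinter {
    static void printDemands(List<Demand> demands) {
        demands.stream()
               .sorted(Comparator
                   .comparingInt((Demand d) -> d.priority)
                   .thenComparing(d -> d.name))
               .forEach(d -> System.out.println(
                   d.name + "," + d.priority + "," + d.getPlanQty()));
    }
}
```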

Commit: 01b8c6c has been made to address the following issues:

(1) Added the tests/expects subdirectory with expected output files.

(2) Makefile modified to add an option to run tests ("make tests").
This will run all of the tests one by one, compare the expect files in
the above expects directory against the corresponding output files, and
print a pass/fail message (see the comparison sketch after this list).

(3) A single testplan can be run using "make TESTNAME=testXXXX tests", which runs only the test in the directory tests/testXXXX.
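The actual pass/fail check lives in the Makefile; the sketch below just illustrates the same line-by-line comparison idea in Java. The expect/output file names are assumptions based on the directory layout described above, not the repo's actual conventions.

```java
import java.io.IOException;
import java.nio.file.Files;
import java.nio.file.Path;
import java.util.List;

// Hypothetical illustration of the comparison "make tests" performs:
// read the expect file and the generated output, compare line by line,
// and print PASS or FAIL for the test.
class ExpectChecker {
    static boolean runTest(String testName) throws IOException {
        Path expect = Path.of("tests", "expects", testName + ".expect"); // assumed naming
        Path output = Path.of("tests", testName, "output");              // assumed naming

        List<String> expected = Files.readAllLines(expect);
        List<String> actual = Files.readAllLines(output);

        boolean pass = expected.equals(actual);
        System.out.println(testName + ": " + (pass ? "PASS" : "FAIL"));
        return pass;
    }
}
```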

As of now, the status is as follows:

(1) Sorting of taskplans and output lines is consistent.

(2) The list of TaskPlans, with details of plan dates, quantity planned, and workcenter loaded, should match exactly (these are the KPIs).

(3) Testplans so far have unique results, so alternate solutions are not addressed; this must be handled through a separate issue.

(4) An automatic pass/fail message is printed on running tests and can be used for regression testing.

For now, this issue can be closed. Additional issues can be opened to address other enhancements to the testing logic, such as alternate solutions, improved KPIs, exact matches of TaskPlans vs. other metrics, fuzzy matches, matches within a margin of error, etc.
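One possible shape for the "margin of error" idea mentioned above, purely as a sketch of a future enhancement and not anything implemented in this repo:

```java
// Hypothetical sketch of a fuzzy quantity comparison: treat two planned
// quantities as matching when they differ by at most a relative tolerance,
// instead of requiring an exact match.
class FuzzyMatch {
    static boolean quantitiesMatch(double expected, double actual, double relTol) {
        double allowed = relTol * Math.max(Math.abs(expected), Math.abs(actual));
        return Math.abs(expected - actual) <= allowed;
    }
}
```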