Running the test_modules.py script results in an error?
Running the test_modules.py script results in the following output:
(py36) [dav@ncc1701d test]$ python test_modules.py -v
Opening file MCM_APINENE.eqn.txt for parsing
Calculating total number of equations = 836
Parsing each equation
Total number of species = 305
Saving all equation information to dictionaries
Mapping species names to SMILES and Pybel objects
No SMILES entry for species NO3
No SMILES entry for species CO
No SMILES entry for species H2
No SMILES entry for species HNO3
No SMILES entry for species NO
No SMILES entry for species NO2
No SMILES entry for species SO2
No SMILES entry for species SO3
No SMILES entry for species H2O2
Traceback (most recent call last):
File "test_modules.py", line 446, in <module>
setup(filename)
File "test_modules.py", line 111, in setup
reaction_dict=outputdict['reaction_dict']
KeyError: 'reaction_dict'
Many thanks for pointing this out. I'd left a component dictionary in the script that is used for testing purposes. This has now been removed.
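For anyone who runs the old version before pulling the fix, here is a minimal, hypothetical sketch of a more defensive lookup. Only the key name 'reaction_dict' is taken from the traceback above; the dictionary contents and the message are invented for illustration.

# Hypothetical sketch of a defensive lookup on the parser output.
# Only the key name 'reaction_dict' comes from the traceback above;
# the dictionary contents below are invented for illustration.
outputdict = {'equation_dict': {}, 'species_dict': {}}

# outputdict['reaction_dict'] would raise KeyError, as in the report.
reaction_dict = outputdict.get('reaction_dict')
if reaction_dict is None:
    print("parser output has no 'reaction_dict' entry; "
          "check for leftover test-only configuration")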
Thanks @loftytopping - the tests run now - but I get two failures:
test_reactants_numba (__main__.TestParsing) ... ok
test_size_array (__main__.TestParsing) ... ok
======================================================================
FAIL: test_dy_dt_calc (__main__.TestParsing)
----------------------------------------------------------------------
Traceback (most recent call last):
File "test_modules.py", line 449, in test_dy_dt_calc
npt.assert_almost_equal(dy_dt_calc_base, dy_dt_calc, decimal=5)
File "/home/dav/miniconda2/envs/py36/lib/python3.6/site-packages/numpy/testing/nose_tools/utils.py", line 565, in assert_almost_equal
return assert_array_almost_equal(actual, desired, decimal, err_msg)
File "/home/dav/miniconda2/envs/py36/lib/python3.6/site-packages/numpy/testing/nose_tools/utils.py", line 963, in assert_array_almost_equal
precision=decimal)
File "/home/dav/miniconda2/envs/py36/lib/python3.6/site-packages/numpy/testing/nose_tools/utils.py", line 779, in assert_array_compare
raise AssertionError(msg)
AssertionError:
Arrays are not almost equal to 5 decimals
(mismatch 19.787849566055925%)
x: array([ 1.09978e+13, 4.55964e+13, -4.90689e+10, ..., 9.60977e+08,
-3.43927e+09, 9.46911e+08])
y: array([ 1.09978e+13, 4.55964e+13, -4.90689e+10, ..., 9.60977e+08,
-3.43927e+09, 9.46911e+08])
======================================================================
FAIL: test_dydt_fortran (__main__.TestParsing)
----------------------------------------------------------------------
Traceback (most recent call last):
File "test_modules.py", line 394, in test_dydt_fortran
npt.assert_almost_equal(dydt_fortran_base, dydt_fortran, decimal=5)
File "/home/dav/miniconda2/envs/py36/lib/python3.6/site-packages/numpy/testing/nose_tools/utils.py", line 565, in assert_almost_equal
return assert_array_almost_equal(actual, desired, decimal, err_msg)
File "/home/dav/miniconda2/envs/py36/lib/python3.6/site-packages/numpy/testing/nose_tools/utils.py", line 963, in assert_array_almost_equal
precision=decimal)
File "/home/dav/miniconda2/envs/py36/lib/python3.6/site-packages/numpy/testing/nose_tools/utils.py", line 779, in assert_array_compare
raise AssertionError(msg)
AssertionError:
Arrays are not almost equal to 5 decimals
(mismatch 16.06557377049181%)
x: array([-6.18233e+13, -1.84997e+14, -2.52450e+13, -4.67138e+13,
2.57350e+13, -1.00000e+18, -1.00000e+18, 7.42982e+19,
7.80177e+12, -7.53942e+13, 7.07570e+13, 4.80382e+19,...
y: array([-6.18233e+13, -1.84997e+14, -2.52450e+13, -4.67138e+13,
2.57350e+13, -1.00000e+18, -1.00000e+18, 7.42982e+19,
7.80177e+12, -7.53942e+13, 7.07570e+13, 4.80382e+19,...
----------------------------------------------------------------------
Ran 14 tests in 0.118s
Looks like it is only a minor difference in some of the array values, though - maybe that is expected on a different machine, or not?
Ah, yes. Given the huge range of component concentrations, I will need to refactor the error checking. Apologies. I will try to get round to checking this before mid-afternoon today. Thanks.
Thanks for noticing this. I should have been using the numpy.allclose function with a defined relative error. I didn't realise that numpy.testing.assert_almost_equal flags errors using only an absolute tolerance. This will be a general problem for any model with a wide range of numeric values. I have made the change to numpy.allclose.
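For reference, a minimal sketch of the difference between the two checks, using invented values at the same magnitudes as the arrays reported above:

import numpy as np
import numpy.testing as npt

# Two concentrations that agree to about 8 significant figures but,
# at these magnitudes, differ by up to ~1e5 in absolute terms.
base = np.array([1.09978e+13, 9.46911e+08])
calc = base * (1.0 + 1e-8)

# assert_almost_equal(decimal=5) requires abs(x - y) < 1.5e-5, a purely
# absolute criterion, so large values fail despite close agreement.
try:
    npt.assert_almost_equal(base, calc, decimal=5)
except AssertionError:
    print("absolute-tolerance check fails at these magnitudes")

# np.allclose applies a relative tolerance, abs(x - y) <= atol + rtol*abs(y),
# which suits arrays spanning many orders of magnitude.
print(np.allclose(base, calc, rtol=1e-5))  # True

The switch to numpy.allclose (or numpy.testing.assert_allclose, which additionally reports a diagnostic on failure) makes the pass/fail criterion scale with the size of each value.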