activations.cpp testing in python w/ SWIG
etash47 opened this issue · 2 comments
Ran 8 tests (7 failed)
(the test output is attached below)
- exponential (input integers from -100 to 100): most tests failed due to decimal imprecision; tests with inputs larger than 88 failed by returning 'inf' from the SWIG wrapper (see the float32 overflow sketch below)
- hard sigmoid (input integers from -100 to 100): most tests passed; tests with inputs -2, -1, 1, and 2 failed due to decimal imprecision
- hyperbolic tangent (input integers from -100 to 100): some tests passed; all tests with inputs between -19 and 19 failed due to decimal imprecision
- relu (input integers from -100 to 100): all tests passed
- sigmoid (input integers from -100 to 100): some tests passed; all tests with inputs between -10 and 36 failed due to decimal imprecision; tests with inputs larger than 88 failed by returning 'nan'
- softmax (input integers from -100 to 100): most tests failed due to decimal imprecision; tests with inputs larger than 88 failed by returning '0'
- softplus (input integers from -100 to 100): some tests passed; all tests with inputs between -10 and 33 failed due to decimal imprecision; tests with inputs larger than 88 failed by returning 'inf'
- softsign (input integers from -100 to 100): most tests failed due to decimal imprecision
All test cases are attached
testing_module_errors.txt
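
The inputs-larger-than-88 failures are consistent with single-precision overflow: `exp(x)` exceeds `FLT_MAX` (about 3.4e38) once x passes roughly 88.72, so the result saturates to `inf`, and any formula that then divides `inf` by `inf` yields `nan`. The sketch below is only my assumption about the failure mode, not the actual activations.cpp code; `naive_sigmoid32` is a hypothetical illustration.

```python
# Minimal sketch of float32 exp overflow (assumption about the failure mode,
# not the project's implementation). ln(FLT_MAX) ~= 88.72, so exp saturates
# to inf just above 88.
import numpy as np

print(np.exp(np.float32(88.0)))  # ~1.65e38, still representable in float32
print(np.exp(np.float32(89.0)))  # inf (overflow warning): exceeds FLT_MAX

def naive_sigmoid32(x):
    # Hypothetical single-precision sigmoid, shown only to illustrate the cascade.
    e = np.exp(np.float32(x))
    return e / (np.float32(1.0) + e)  # inf / inf -> nan once exp overflows

print(naive_sigmoid32(100.0))  # nan
```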
The relative tolerance used when comparing against TensorFlow is 1e-5.
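
For reference, a hedged sketch of how such a tolerance check could be written on the Python side (1e-5 is also `np.allclose`'s default `rtol`). `cpp_activations` and `check` are placeholder names, not the project's actual SWIG module or test harness.

```python
# Hedged sketch of a tolerance-based comparison against TensorFlow;
# cpp_activations is a placeholder for the SWIG-generated module name.
import numpy as np
import tensorflow as tf
# import cpp_activations  # hypothetical SWIG wrapper

def check(name, cpp_fn, tf_fn, rtol=1e-5):
    xs = np.arange(-100, 101, dtype=np.float32)
    expected = tf_fn(xs).numpy()
    actual = np.array([cpp_fn(float(x)) for x in xs], dtype=np.float32)
    # np.allclose passes when |actual - expected| <= atol + rtol * |expected|
    ok = np.allclose(actual, expected, rtol=rtol, atol=0.0)
    print(f"{name}: {'pass' if ok else 'fail'} at rtol={rtol}")

# Example usage, assuming a scalar-valued wrapped function:
# check("sigmoid", cpp_activations.sigmoid, tf.keras.activations.sigmoid)
```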
Refactored, debugged, and tested all layers; every layer type passes at a relative tolerance of 1e-5, and softmax passes at a tolerance of 1e-3.