anguita

A fast approximation of tanh with acceptable accuracy.

Primary language: Python. License: MIT.

Efficient tanh approximation

An efficient tanh implementation based on the approximation of the hyperbolic tangent described in:

"Speed Improvement of the Back-Propagation on Current Generation Workstations", D. Anguita, G. Parodi and R. Zunino. Proceedings of the World Congress on Neural Networking, 1993.
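The paper replaces tanh with a piecewise quadratic that saturates outside a fixed interval. A minimal sketch, assuming the constants 0.96016, 0.26037 and 1.92033 from the paper (they reproduce the table below exactly; the name `tanh_approx` is just illustrative, the library itself exposes the function as `anguita.tanh`):

```python
def tanh_approx(x):
    """Piecewise-quadratic tanh approximation (Anguita et al., 1993).

    Assumed constants: the curve saturates at +/-0.96016 and follows
    0.96016 - 0.26037 * (x - 1.92033)**2 on [0, 1.92033], mirrored
    for negative inputs.
    """
    if x > 1.92033:
        return 0.96016
    if x >= 0.0:
        return 0.96016 - 0.26037 * (x - 1.92033) ** 2
    if x > -1.92033:
        return 0.26037 * (x + 1.92033) ** 2 - 0.96016
    return -0.96016

print(tanh_approx(0.5))  # ~0.434905884882, the table row for x = 0.5
```

Avoiding the exponential makes this cheap on hardware where `exp` is slow, at the cost of the error shown in the table below.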

Error of the approximation

The approximation is less accurate than math.tanh, but much faster. The table below shows the error (delta = math.tanh − anguita.tanh) at sample points:

| x    | math.tanh       | anguita.tanh    | delta             |
|------|-----------------|-----------------|-------------------|
| -0.9 | -0.716297870199 | -0.689095742562 | -0.0272021276373  |
| -0.8 | -0.664036770268 | -0.633359378142 | -0.0306773921261  |
| -0.7 | -0.604367777117 | -0.572415613722 | -0.0319521633955  |
| -0.6 | -0.537049566998 | -0.506264449302 | -0.0307851176963  |
| -0.5 | -0.46211715726  | -0.434905884882 | -0.0272112723783  |
| -0.4 | -0.379948962255 | -0.358339920462 | -0.0216090417935  |
| -0.3 | -0.291312612452 | -0.276566556042 | -0.0147460564099  |
| -0.2 | -0.197375320225 | -0.189585791622 | -0.0077895286032  |
| -0.1 | -0.099667994625 | -0.0973976272017| -0.00227036742325 |
| 0.1  | 0.099667994625  | 0.0973976272017 | 0.00227036742325  |
| 0.2  | 0.197375320225  | 0.189585791622  | 0.0077895286032   |
| 0.3  | 0.291312612452  | 0.276566556042  | 0.0147460564099   |
| 0.4  | 0.379948962255  | 0.358339920462  | 0.0216090417935   |
| 0.5  | 0.46211715726   | 0.434905884882  | 0.0272112723783   |
| 0.6  | 0.537049566998  | 0.506264449302  | 0.0307851176963   |
| 0.7  | 0.604367777117  | 0.572415613722  | 0.0319521633955   |
| 0.8  | 0.664036770268  | 0.633359378142  | 0.0306773921261   |
| 0.9  | 0.716297870199  | 0.689095742562  | 0.0272021276373   |
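The delta column can be reproduced with a short script. The approximation is restated inline here (with the constants assumed from the paper) so the snippet is self-contained; on the sampled range the worst error falls at x = ±0.7:

```python
import math

def tanh_approx(x):
    # Piecewise-quadratic tanh approximation; constants assumed
    # from Anguita et al., 1993.
    if x > 1.92033:
        return 0.96016
    if x >= 0.0:
        return 0.96016 - 0.26037 * (x - 1.92033) ** 2
    if x > -1.92033:
        return 0.26037 * (x + 1.92033) ** 2 - 0.96016
    return -0.96016

# Same sample points as the table (x = -0.9 .. 0.9, excluding 0).
xs = [i / 10 for i in range(-9, 10) if i != 0]
deltas = [abs(math.tanh(x) - tanh_approx(x)) for x in xs]
print(max(deltas))  # ~0.0319521633955, the delta at x = +/-0.7
```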