🚀 XAD: Powerful Automatic Differentiation for C++ & Python

Licensed under the GNU Affero General Public License v3.0 (AGPL-3.0).

XAD is a comprehensive automatic differentiation library that combines ease of use with high performance. It is designed to differentiate complex applications with speed and precision, whether you are optimizing neural networks, solving scientific problems, or performing financial risk analysis.


🌟 Why XAD?

XAD is trusted by professionals for its speed, flexibility, and scalability across various fields:

  • Machine Learning & Deep Learning: Accelerate neural network training and model optimization.
  • Optimization in Engineering & Finance: Solve complex problems with high precision.
  • Numerical Analysis: Improve methods for solving differential equations efficiently.
  • Scientific Computing: Simulate physical systems and processes with precision.
  • Risk Management & Quantitative Finance: Assess and hedge risks in sophisticated financial models.
  • Computer Graphics: Optimize rendering algorithms for high-quality graphics.
  • Robotics: Enhance control and simulation for robotic systems.
  • Meteorology: Improve accuracy in weather prediction models.
  • Biotechnology: Model complex biological processes effectively.

Key Features

  • Forward & Adjoint Mode: Derivatives of any order via operator overloading.
  • Checkpointing Support: Efficient tape-memory management for large-scale applications.
  • External Function Interface: Connects seamlessly with external libraries.
  • Thread-Safe Tape: Safe concurrent recording across threads.
  • Exception-Safe: Formal exception-safety guarantees for stability and error handling.
  • High Performance: Optimized for speed and efficiency.
  • Proven in Production: Battle-tested in large-scale, mission-critical systems.
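To illustrate what forward mode means in practice, here is a minimal, self-contained sketch of the idea using dual numbers in Python. This is a conceptual illustration only, not XAD's implementation (XAD uses operator overloading on active types in C++ and Python); the class and function names here are hypothetical.

```python
# Conceptual sketch of forward-mode AD with dual numbers.
# NOT XAD's API -- just an illustration of the technique.
class Dual:
    def __init__(self, value, deriv=0.0):
        self.value = value   # primal value
        self.deriv = deriv   # derivative w.r.t. the chosen input

    def __add__(self, other):
        other = other if isinstance(other, Dual) else Dual(other)
        return Dual(self.value + other.value, self.deriv + other.deriv)

    def __mul__(self, other):
        # product rule: (uv)' = u'v + uv'
        other = other if isinstance(other, Dual) else Dual(other)
        return Dual(self.value * other.value,
                    self.deriv * other.value + self.value * other.deriv)

def f(x0, x1):
    return x0 * x1 + x0

# Differentiate w.r.t. x0 by seeding its derivative to 1.0
x0 = Dual(1.3, 1.0)
x1 = Dual(5.2, 0.0)
y = f(x0, x1)
print(y.deriv)  # dy/dx0 = x1 + 1 = 6.2
```

Forward mode propagates one derivative direction alongside the values in a single pass; adjoint (reverse) mode, shown in the example below, records operations on a tape and rolls back all input derivatives from a single output seed.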

💻 Example

The snippet below computes first-order derivatives of a user-defined function func with two inputs and one output, using XAD in adjoint mode:

#include <XAD/XAD.h>
#include <iostream>

using mode = xad::adj<double>;         // first-order adjoint mode
using Adouble = mode::active_type;     // active data type

mode::tape_type tape;          // tape that records operations

Adouble x0 = 1.3;              // initialise inputs
Adouble x1 = 5.2;
tape.registerInput(x0);        // register independent variables
tape.registerInput(x1);        // with the tape
tape.newRecording();           // start recording derivatives
Adouble y = func(x0, x1);      // run the user-defined function
tape.registerOutput(y);        // register the output variable
derivative(y) = 1.0;           // seed the output adjoint to 1.0
tape.computeAdjoints();        // roll back adjoints to the inputs
std::cout << "dy/dx0=" << derivative(x0) << "\n"
          << "dy/dx1=" << derivative(x1) << "\n";

🚀 Getting Started

git clone https://github.com/auto-differentiation/xad.git
cd xad
mkdir build
cd build
cmake ..
make

For more detailed guides, refer to our Installation Guide and explore Tutorials.
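Once XAD is installed, it can be consumed from a downstream CMake project. The sketch below assumes the package exports the XAD::xad imported target (see the Installation Guide for the exact target name and install paths):

```cmake
# Minimal sketch of a downstream project using an installed XAD
# (assumes the package exports the XAD::xad target).
cmake_minimum_required(VERSION 3.15)
project(my_ad_app CXX)

find_package(XAD REQUIRED)                        # locate the installed package

add_executable(my_ad_app main.cpp)
target_link_libraries(my_ad_app PRIVATE XAD::xad) # link against XAD
```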

🤝 Contributing

Want to get involved? We welcome contributors from all backgrounds! Check out our Contributing Guide and join the conversation in our Discussions.

πŸ› Found a Bug?

Please report any issues through our Issue Tracker.


📦 Related Projects