AmeyaJagtap
Assistant Professor (Tenure-Track), Aerospace Engineering Department, Worcester Polytechnic Institute, Worcester, MA 01609, USA
Pinned Repositories
Activation-functions-in-regression-and-classification
How important are activation functions in regression and classification? A survey, performance comparison, and future directions
Adaptive_Activation_Functions
We propose simple adaptive activation functions for deep neural networks. The proposed method is simple and easy to implement in any neural network architecture.
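A minimal PyTorch sketch of the idea, assuming a tanh base activation and a fixed scale factor `n` multiplying a single trainable slope `a` (names and defaults here are illustrative, not the repository's reference implementation):

```python
import torch
import torch.nn as nn

class AdaptiveTanh(nn.Module):
    """Global adaptive activation tanh(n * a * x): one trainable slope
    parameter `a` shared across the whole network, scaled by a fixed
    factor `n`. A minimal sketch; names and defaults are illustrative."""
    def __init__(self, n: float = 10.0):
        super().__init__()
        self.n = n
        # Initialize so that n * a = 1, recovering plain tanh at start.
        self.a = nn.Parameter(torch.tensor(1.0 / n))

    def forward(self, x: torch.Tensor) -> torch.Tensor:
        return torch.tanh(self.n * self.a * x)
```

Dropping such a module in place of `nn.Tanh()` is all that is needed, which is what makes the method easy to adopt in existing architectures.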
Augmented_PINNs_-APINNs-
Conservative_PINNs
We propose a conservative physics-informed neural network (cPINN) on decomposed domains for nonlinear conservation laws. The conservation property of cPINN is obtained by enforcing flux continuity in the strong form along the sub-domain interfaces.
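A hedged sketch of the interface terms in PyTorch, using the 1D Burgers' flux f(u) = u²/2 as an illustrative conservation law (the function and variable names are placeholders):

```python
import torch

def cpinn_interface_loss(u1: torch.Tensor, u2: torch.Tensor) -> torch.Tensor:
    """Interface terms for two sub-networks meeting at shared interface
    points, illustrated with the 1D Burgers' flux f(u) = u^2 / 2."""
    flux1, flux2 = 0.5 * u1 ** 2, 0.5 * u2 ** 2
    # Flux continuity enforced in the strong form along the interface.
    loss_flux = torch.mean((flux1 - flux2) ** 2)
    # Solution continuity through the interface average.
    u_avg = 0.5 * (u1 + u2)
    loss_u = torch.mean((u1 - u_avg) ** 2) + torch.mean((u2 - u_avg) ** 2)
    return loss_flux + loss_u
```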
Error_estimates_PINN_and_XPINN_NonlinearPDEs
We give the first comprehensive theoretical analysis of PINNs (and XPINNs) for a prototypical nonlinear PDE, the Navier-Stokes equations.
fourier_neural_operator
Use Fourier transform to learn operators in differential equations.
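A minimal sketch of one Fourier layer in PyTorch, simplified from the repository's implementation (shapes, initialization, and names are assumptions of this sketch):

```python
import torch
import torch.nn as nn

class SpectralConv1d(nn.Module):
    """One 1D Fourier layer: FFT, a learned linear map on the lowest
    `modes` frequencies, inverse FFT. Simplified sketch; shapes and
    initialization are assumptions."""
    def __init__(self, channels: int, modes: int):
        super().__init__()
        self.modes = modes
        scale = 1.0 / channels
        self.weights = nn.Parameter(
            scale * torch.randn(channels, channels, modes, dtype=torch.cfloat)
        )

    def forward(self, x: torch.Tensor) -> torch.Tensor:
        # x: (batch, channels, grid points)
        x_ft = torch.fft.rfft(x)
        out_ft = torch.zeros_like(x_ft)
        # Mix channels on the retained low-frequency modes only.
        out_ft[:, :, :self.modes] = torch.einsum(
            "bim,iom->bom", x_ft[:, :, :self.modes], self.weights
        )
        return torch.fft.irfft(out_ft, n=x.size(-1))
```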
Locally-Adaptive-Activation-Functions-Neural-Networks-
Python code for Locally Adaptive Activation Functions (LAAF) used in deep neural networks. Please cite this work as "A. D. Jagtap, K. Kawaguchi, G. E. Karniadakis, Locally adaptive activation functions with slope recovery for deep and physics-informed neural networks, Proceedings of the Royal Society A: Mathematical, Physical and Engineering Sciences, 20200334, 2020. (http://dx.doi.org/10.1098/rspa.2020.0334)".
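A short PyTorch sketch of the layer-wise variant, assuming tanh activations; the slope-recovery regularizer below is a simplified stand-in for the paper's exact term, and the layer sizes are illustrative:

```python
import torch
import torch.nn as nn

class LAAFNet(nn.Module):
    """Layer-wise LAAF: each hidden layer k has its own trainable slope
    a_k, applied as tanh(n * a_k * z). Layer sizes are illustrative."""
    def __init__(self, layers=(1, 50, 50, 1), n: float = 10.0):
        super().__init__()
        self.n = n
        self.linears = nn.ModuleList(
            [nn.Linear(i, o) for i, o in zip(layers[:-1], layers[1:])]
        )
        # One slope per hidden layer, initialized so n * a_k = 1.
        self.a = nn.Parameter(torch.full((len(layers) - 2,), 1.0 / n))

    def forward(self, x: torch.Tensor) -> torch.Tensor:
        for k, lin in enumerate(self.linears[:-1]):
            x = torch.tanh(self.n * self.a[k] * lin(x))
        return self.linears[-1](x)

    def slope_recovery(self) -> torch.Tensor:
        # Simplified stand-in for the paper's slope-recovery term: the
        # reciprocal of the mean exponential of the slopes, which decays
        # as the slopes grow and hence rewards steeper activations.
        return 1.0 / torch.mean(torch.exp(self.a))
```

The regularizer is added to the training loss, so increasing the slopes a_k lowers the objective; the paper reports that this accelerates convergence.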
Rowdy_Activation_Functions
We propose the Deep Kronecker Neural Network, a general framework for neural networks with adaptive activation functions. In particular, we propose Rowdy activation functions, which inject sinusoidal fluctuations, thereby allowing the optimizer to explore the loss landscape more effectively and train the network faster. Various test cases, ranging from function approximation and inferring PDE solutions to standard deep learning benchmarks such as MNIST, CIFAR-10, CIFAR-100, and SVHN, demonstrate the efficacy of the proposed activation functions.
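An illustrative PyTorch sketch of a Rowdy-style activation, assuming a tanh base and frequencies (k + 1)·n for the injected terms (both choices are assumptions of this sketch, not necessarily the paper's exact form):

```python
import torch
import torch.nn as nn

class RowdyTanh(nn.Module):
    """Rowdy-style activation: a base tanh plus K trainable sinusoidal
    perturbations with amplitudes a_k, initialized at zero so training
    starts from plain tanh. Frequencies (k + 1) * n are an assumption
    of this sketch."""
    def __init__(self, K: int = 3, n: float = 10.0):
        super().__init__()
        self.n = n
        self.K = K
        self.a = nn.Parameter(torch.zeros(K))

    def forward(self, x: torch.Tensor) -> torch.Tensor:
        out = torch.tanh(x)
        for k in range(self.K):
            # Injected sinusoidal fluctuation: a_k * sin((k + 1) * n * x).
            out = out + self.a[k] * torch.sin((k + 1) * self.n * x)
        return out
```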
XPINNs
Extended Physics-Informed Neural Networks (XPINNs): A Generalized Space-Time Domain Decomposition Based Deep Learning Framework for Nonlinear Partial Differential Equations
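A brief PyTorch sketch of the XPINN interface losses between two sub-domains, assuming the sub-networks expose their solution values and PDE residuals at shared interface points (all names here are illustrative):

```python
import torch

def xpinn_interface_loss(u1, u2, r1, r2):
    """Interface terms between two space-time sub-domains: continuity of
    the solution through its interface average, plus continuity of the
    PDE residual. u1/u2 are sub-network predictions and r1/r2 their PDE
    residuals at shared interface points (names are illustrative)."""
    u_avg = 0.5 * (u1 + u2)
    loss_u = torch.mean((u1 - u_avg) ** 2) + torch.mean((u2 - u_avg) ** 2)
    # Residual continuity replaces cPINN's flux continuity, so the
    # decomposition applies to general PDEs, not only conservation laws.
    loss_r = torch.mean((r1 - r2) ** 2)
    return loss_u + loss_r
```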
XPINNs_TensorFlow-2
XPINN code written in TensorFlow 2
AmeyaJagtap's Repositories
AmeyaJagtap/locally-adaptive-activation-functions
Simplified implementation of locally adaptive activation functions (LAAF) with slope recovery for deep and physics-informed neural networks (PINNs) in PyTorch.
AmeyaJagtap/Physics_Informed_Deep_Learning
Short course on physics-informed deep learning
AmeyaJagtap/POD-PINN
POD-PINN code and manuscript
AmeyaJagtap/DeepHPMs
Deep Hidden Physics Models: Deep Learning of Nonlinear Partial Differential Equations
AmeyaJagtap/PINNs
Physics Informed Deep Learning: Data-driven Solutions and Discovery of Nonlinear Partial Differential Equations
AmeyaJagtap/UQPINNs