This GitHub repository provides additional details and the code for our paper accepted at the 2023 NeurIPS Workshop on AI for Scientific Discovery: From Theory to Practice.
The paper can be accessed here: [A Transformer Model for Symbolic Regression towards Scientific Discovery (OpenReview)](https://openreview.net/forum?id=AIfqWNHKjo)
- `best_model_weights/`: directory with the weights of our pretrained models
- `datasets/`: directory with the code to generate the training datasets
- `model/`: directory with the Transformer architecture
- `.gitignore`
- `README.md`: this file
- `evaluate_model.py`: Python script used to test our best Transformer model on the SRSD datasets
- `requirements.txt`: dependencies
- `train_transformer.py`: Python script used (on GPUs) to train on our synthetic datasets
- `tutorial.ipynb`: Jupyter Notebook with a demonstration of our best Transformer. You might want to start here!
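As a minimal illustration of the evaluation task (a sketch, not the repository's actual code): symbolic regression models are typically scored by how well a predicted expression fits held-out data points, for instance with the coefficient of determination. The function and the example law below are hypothetical.

```python
import math

def r2_score(y_true, y_pred):
    # Coefficient of determination between ground-truth and predicted values.
    # R^2 = 1 means a perfect fit; R^2 <= 0 means worse than the mean predictor.
    mean_y = sum(y_true) / len(y_true)
    ss_res = sum((t - p) ** 2 for t, p in zip(y_true, y_pred))
    ss_tot = sum((t - mean_y) ** 2 for t in y_true)
    return 1.0 - ss_res / ss_tot

# Hypothetical ground-truth law and a candidate expression returned by a model
xs = [0.1 * i for i in range(1, 50)]
y_true = [math.sin(x) + 0.5 * x for x in xs]
y_pred = [math.sin(x) + 0.5 * x for x in xs]  # a perfect recovery

print(round(r2_score(y_true, y_pred), 4))
```

A perfect symbolic recovery scores R^2 = 1.0, while a poorly fitting expression drives the score toward (or below) zero.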
Install the dependencies (we used Python 3.11.3):

```
pip install -r requirements.txt
```
If you find this work useful, please cite our paper:

```bibtex
@inproceedings{lalande2023,
  title = {A Transformer Model for Symbolic Regression towards Scientific Discovery},
  author = {Florian Lalande and Yoshitomo Matsubara and Naoya Chiba and Tatsunori Taniai and Ryo Igarashi and Yoshitaka Ushiku},
  booktitle = {NeurIPS 2023 AI for Science Workshop},
  year = {2023},
  url = {https://openreview.net/forum?id=AIfqWNHKjo},
}
```