Sangam: A Confluence of Knowledge Streams

Integration of Neural Network-Based Symbolic Regression in Deep Learning for Scientific Discovery


dc.creator Kim, Samuel
dc.creator Lu, Peter Y
dc.creator Mukherjee, Srijon
dc.creator Gilbert, Michael
dc.creator Jing, Li
dc.creator Ceperic, Vladimir
dc.creator Soljacic, Marin
dc.date 2022-05-02T15:20:13Z
dc.date 2020
dc.date 2022-05-02T14:19:46Z
dc.date.accessioned 2023-03-01T18:07:44Z
dc.date.available 2023-03-01T18:07:44Z
dc.identifier https://hdl.handle.net/1721.1/142227
dc.identifier Kim, Samuel, Lu, Peter Y, Mukherjee, Srijon, Gilbert, Michael, Jing, Li et al. 2020. "Integration of Neural Network-Based Symbolic Regression in Deep Learning for Scientific Discovery." IEEE Transactions on Neural Networks and Learning Systems, 32 (9).
dc.identifier.uri http://localhost:8080/xmlui/handle/CUHPOERS/278856
dc.description Symbolic regression is a powerful technique to discover analytic equations that describe data, which can lead to explainable models and the ability to predict unseen data. In contrast, neural networks have achieved amazing levels of accuracy on image recognition and natural language processing tasks, but they are often seen as black-box models that are difficult to interpret and typically extrapolate poorly. In this article, we use a neural network-based architecture for symbolic regression called the equation learner (EQL) network and integrate it with other deep learning architectures such that the whole system can be trained end-to-end through backpropagation. To demonstrate the power of such systems, we study their performance on several substantially different tasks. First, we show that the neural network can perform symbolic regression and learn the form of several functions. Next, we present an MNIST arithmetic task where a convolutional network extracts the digits. Finally, we demonstrate the prediction of dynamical systems where an unknown parameter is extracted through an encoder. We find that the EQL-based architecture can extrapolate quite well outside of the training data set compared with a standard neural network-based architecture, paving the way for deep learning to be applied in scientific exploration and discovery.
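The abstract refers to the equation learner (EQL) network, a feed-forward network whose units apply symbolic primitives (identity, sine, cosine, products, and so on) so that a sparse trained network can be read off as an analytic expression and composed with other differentiable modules (e.g. a convolutional digit extractor or an encoder). The following is a minimal illustrative sketch of one such layer, assuming PyTorch; the primitive set, layer sizes, and the EQLLayer name are assumptions for illustration, not the authors' exact implementation.

# Minimal sketch of an equation learner (EQL) layer, assuming PyTorch.
# Primitive set and sizes are illustrative, not the paper's exact configuration.
import torch
import torch.nn as nn

class EQLLayer(nn.Module):
    """Linear map followed by symbolic activation units (hypothetical layout)."""

    def __init__(self, in_dim, n_units=4):
        super().__init__()
        # Each unary primitive (identity, sin, cos) receives n_units inputs;
        # the product units consume 2 * n_units inputs.
        self.n_units = n_units
        self.linear = nn.Linear(in_dim, 5 * n_units)

    def forward(self, x):
        z = self.linear(x)
        u = self.n_units
        ident = z[:, 0 * u:1 * u]
        sin = torch.sin(z[:, 1 * u:2 * u])
        cos = torch.cos(z[:, 2 * u:3 * u])
        prod = z[:, 3 * u:4 * u] * z[:, 4 * u:5 * u]  # pairwise product units
        return torch.cat([ident, sin, cos, prod], dim=1)

# Stacking two EQL layers with a linear readout; the whole stack is differentiable,
# so it can be trained end-to-end by backpropagation together with upstream networks.
# A sparsity penalty on the weights (as in the EQL literature) would be added to the
# training loss so the fitted network reduces to a compact analytic equation.
model = nn.Sequential(EQLLayer(in_dim=2), EQLLayer(in_dim=16), nn.Linear(16, 1))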
dc.format application/pdf
dc.language en
dc.publisher Institute of Electrical and Electronics Engineers (IEEE)
dc.relation 10.1109/TNNLS.2020.3017010
dc.relation IEEE Transactions on Neural Networks and Learning Systems
dc.rights Creative Commons Attribution 4.0 International License
dc.rights https://creativecommons.org/licenses/by/4.0
dc.source IEEE
dc.title Integration of Neural Network-Based Symbolic Regression in Deep Learning for Scientific Discovery
dc.type Article
dc.type http://purl.org/eprint/type/JournalArticle


Files in this item

Files Size Format
Integration_of_ ... r_Scientific_Discovery.pdf 2.348Mb application/pdf
